WorldWideScience

Sample records for hierarchical bayesian approaches

  1. A Comparison of Hierarchical and Non-Hierarchical Bayesian Approaches for Fitting Allometric Larch (Larix spp.) Biomass Equations

    Directory of Open Access Journals (Sweden)

    Dongsheng Chen

    2016-01-01

    Full Text Available Accurate biomass estimations are important for assessing and monitoring forest carbon storage. Bayesian theory has been widely applied to tree biomass models. Recently, a hierarchical Bayesian approach has received increasing attention for improving biomass models. In this study, tree biomass data were obtained by sampling 310 trees from 209 permanent sample plots from larch plantations in six regions across China. Non-hierarchical and hierarchical Bayesian approaches were used to model allometric biomass equations. We found that the total, root, stem wood, stem bark, branch and foliage biomass model relationships were statistically significant (p-values < 0.001) for both the non-hierarchical and hierarchical Bayesian approaches, but the hierarchical Bayesian approach increased the goodness-of-fit statistics over the non-hierarchical Bayesian approach. The R2 values of the hierarchical approach were higher than those of the non-hierarchical approach by 0.008, 0.018, 0.020, 0.003, 0.088 and 0.116 for the total tree, root, stem wood, stem bark, branch and foliage models, respectively. The hierarchical Bayesian approach significantly improved the accuracy of the biomass model (except for the stem bark) and can reflect regional differences by using random parameters to improve the regional scale model accuracy.
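
    To make the "random parameters by region" idea concrete, the sketch below fits a region-varying allometric model on simulated data. It is a hypothetical illustration only: the diameter predictor (DBH), priors, and all numbers are assumptions, not values from the study.

    ```python
    # Minimal sketch (not the authors' code): hierarchical allometric biomass model
    # ln(biomass) = a[region] + b[region] * ln(DBH), with region-level parameters
    # partially pooled toward population means. All data below are simulated.
    import numpy as np
    import pymc as pm

    rng = np.random.default_rng(42)
    n_regions, n_trees = 6, 50
    region = np.repeat(np.arange(n_regions), n_trees)
    log_dbh = rng.normal(3.0, 0.4, size=region.size)        # fake log diameters
    true_a = rng.normal(-2.0, 0.2, n_regions)                # regional intercepts
    true_b = rng.normal(2.4, 0.1, n_regions)                 # regional slopes
    log_biomass = (true_a[region] + true_b[region] * log_dbh
                   + rng.normal(0.0, 0.15, size=region.size))

    with pm.Model() as hb_allometric:
        mu_a = pm.Normal("mu_a", 0.0, 10.0)                  # population-level intercept
        mu_b = pm.Normal("mu_b", 2.0, 5.0)                   # population-level slope
        sd_a = pm.HalfNormal("sd_a", 1.0)                    # between-region spread
        sd_b = pm.HalfNormal("sd_b", 1.0)
        a = pm.Normal("a", mu_a, sd_a, shape=n_regions)      # random intercepts
        b = pm.Normal("b", mu_b, sd_b, shape=n_regions)      # random slopes
        sigma = pm.HalfNormal("sigma", 1.0)                  # residual SD on log scale
        mu = a[region] + b[region] * log_dbh
        pm.Normal("log_biomass", mu, sigma, observed=log_biomass)
        idata = pm.sample(1000, tune=1000, chains=2, progressbar=False)

    print(float(idata.posterior["mu_b"].mean()))             # pooled allometric exponent
    ```

    Partial pooling of the region-level intercepts and slopes is what lets sparsely sampled regions borrow strength from the population-level distribution, which is the mechanism behind the improved regional-scale accuracy reported in the abstract.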

  2. Poor-data and data-poor species stock assessment using a Bayesian hierarchical approach.

    Science.gov (United States)

    Jiao, Yan; Cortés, Enric; Andrews, Kate; Guo, Feng

    2011-10-01

    Appropriate inference for stocks or species with low-quality data (poor data) or limited data (data poor) is extremely important. Hierarchical Bayesian methods are especially applicable to small-area, small-sample-size estimation problems because they allow poor-data species to borrow strength from species with good-quality data. We used a hammerhead shark complex as an example to investigate the advantages of using hierarchical Bayesian models in assessing the status of poor-data and data-poor exploited species. The hammerhead shark complex (Sphyrna spp.) along the Atlantic and Gulf of Mexico coasts of the United States is composed of three species: the scalloped hammerhead (S. lewini), the great hammerhead (S. mokarran), and the smooth hammerhead (S. zygaena) sharks. The scalloped hammerhead comprises 70-80% of the catch and has catch and relative abundance data of good quality, whereas great and smooth hammerheads have relative abundance indices that are both limited and of low quality presumably because of low stock density and limited sampling. Four hierarchical Bayesian state-space surplus production models were developed to simulate variability in population growth rates, carrying capacity, and catchability of the species. The results from the hierarchical Bayesian models were considerably more robust than those of the nonhierarchical models. The hierarchical Bayesian approach represents an intermediate strategy between traditional models that assume different population parameters for each species and those that assume all species share identical parameters. Use of the hierarchical Bayesian approach is suggested for future hammerhead shark stock assessments and for modeling fish complexes with species-specific data, because the poor-data species can borrow strength from the species with good data, making the estimation more stable and robust.

  3. Prediction of road accidents: A Bayesian hierarchical approach

    DEFF Research Database (Denmark)

    Deublein, Markus; Schubert, Matthias; Adey, Bryan T.

    2013-01-01

    In this paper a novel methodology for the prediction of the occurrence of road accidents is presented. The methodology utilizes a combination of three statistical methods: (1) gamma-updating of the occurrence rates of injury accidents and injured road users, (2) hierarchical multivariate Poisson...... of the observed frequencies of the model response variables, e.g. the occurrence of an accident, and observed values of the risk indicating variables, e.g. degree of road curvature. Subsequently, parameter learning is done using updating algorithms, to determine the posterior predictive probability distributions...... of the model response variables, conditional on the values of the risk indicating variables. The methodology is illustrated through a case study using data of the Austrian rural motorway network. In the case study, on randomly selected road segments the methodology is used to produce a model to predict......

  4. Simultaneous EEG Source and Forward Model Reconstruction (SOFOMORE) using a Hierarchical Bayesian Approach

    DEFF Research Database (Denmark)

    Stahlhut, Carsten; Mørup, Morten; Winther, Ole

    2011-01-01

    We present an approach to handle forward model uncertainty for EEG source reconstruction. A stochastic forward model representation is motivated by the many random contributions to the path from sources to measurements, including the tissue conductivity distribution, the geometry of the cortical surface, and electrode positions. We first present a hierarchical Bayesian framework for EEG source localization that jointly performs source and forward model reconstruction (SOFOMORE). Secondly, we evaluate the SOFOMORE approach by comparison with source reconstruction methods that use fixed forward models. Analysis of simulated and real EEG data provide evidence that reconstruction of the forward model leads to improved source estimates.

  5. An approach based on Hierarchical Bayesian Graphical Models for measurement interpretation under uncertainty

    Science.gov (United States)

    Skataric, Maja; Bose, Sandip; Zeroug, Smaine; Tilke, Peter

    2017-02-01

    It is not uncommon in the field of non-destructive evaluation that multiple measurements encompassing a variety of modalities are available for analysis and interpretation for determining the underlying states of nature of the materials or parts being tested. Despite and sometimes due to the richness of data, significant challenges arise in the interpretation manifested as ambiguities and inconsistencies due to various uncertain factors in the physical properties (inputs), environment, measurement device properties, human errors, and the measurement data (outputs). Most of these uncertainties cannot be described by any rigorous mathematical means, and modeling of all possibilities is usually infeasible for many real time applications. In this work, we will discuss an approach based on Hierarchical Bayesian Graphical Models (HBGM) for the improved interpretation of complex (multi-dimensional) problems with parametric uncertainties that lack usable physical models. In this setting, the input space of the physical properties is specified through prior distributions based on domain knowledge and expertise, which are represented as Gaussian mixtures to model the various possible scenarios of interest for non-destructive testing applications. Forward models are then used offline to generate the expected distribution of the proposed measurements which are used to train a hierarchical Bayesian network. In Bayesian analysis, all model parameters are treated as random variables, and inference of the parameters is made on the basis of posterior distribution given the observed data. Learned parameters of the posterior distribution obtained after the training can therefore be used to build an efficient classifier for differentiating new observed data in real time on the basis of pre-trained models. We will illustrate the implementation of the HBGM approach to ultrasonic measurements used for cement evaluation of cased wells in the oil industry.
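
    The workflow sketched in this abstract (priors over states of nature, offline forward simulations, then posterior-based classification of new measurements) can be caricatured in a few lines. The example below is a deliberately simplified, hypothetical stand-in: two invented states, a toy forward model, Gaussian class-conditional densities learned from the simulations, and Bayes' rule for classifying a new observation.

    ```python
    # Hypothetical simplification of the HBGM workflow: learn measurement
    # distributions per state of nature from forward-model simulations, then
    # classify new observations via the posterior p(state | y).
    import numpy as np

    rng = np.random.default_rng(0)

    def forward_model(state, n, rng):
        """Toy stand-in for an offline physics simulation of the measurement."""
        if state == "good_bond":                        # hypothetical states of nature
            return rng.normal(loc=1.0, scale=0.2, size=n)
        return rng.normal(loc=0.4, scale=0.3, size=n)   # "poor_bond"

    states = ["good_bond", "poor_bond"]
    prior = {"good_bond": 0.5, "poor_bond": 0.5}        # domain-knowledge prior

    # Offline training: summarize the simulated measurement distribution per state.
    params = {}
    for s in states:
        sims = forward_model(s, 5000, rng)
        params[s] = (sims.mean(), sims.std())

    def log_gauss(y, mu, sd):
        return -0.5 * ((y - mu) / sd) ** 2 - np.log(sd * np.sqrt(2 * np.pi))

    def classify(y):
        """Posterior over states for a new measurement y (Bayes' rule)."""
        log_post = np.array([log_gauss(y, *params[s]) + np.log(prior[s]) for s in states])
        post = np.exp(log_post - log_post.max())
        return dict(zip(states, post / post.sum()))

    print(classify(0.9))   # classify a new observation in "real time"
    ```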

  6. Poor-data and data-poor species stock assessment using a Bayesian hierarchical approach

    OpenAIRE

    Jiao, Y.; Cortes, E; Andrews, K.; Guo, F.

    2011-01-01

    Appropriate inference for stocks or species with low-quality data (poor data) or limited data (data poor) is extremely important. Hierarchical Bayesian methods are especially applicable to small-area, small-sample-size estimation problems because they allow poor-data species to borrow strength from species with good-quality data. We used a hammerhead shark complex as an example to investigate the advantages of using hierarchical Bayesian models in assessing the status of poor-data and data-po...

  7. Multimethod, multistate Bayesian hierarchical modeling approach for use in regional monitoring of wolves.

    Science.gov (United States)

    Jiménez, José; García, Emilio J; Llaneza, Luis; Palacios, Vicente; González, Luis Mariano; García-Domínguez, Francisco; Múñoz-Igualada, Jaime; López-Bao, José Vicente

    2016-08-01

    In many cases, the first step in large-carnivore management is to obtain objective, reliable, and cost-effective estimates of population parameters through procedures that are reproducible over time. However, monitoring predators over large areas is difficult, and the data have a high level of uncertainty. We devised a practical multimethod and multistate modeling approach based on Bayesian hierarchical site-occupancy models that combined multiple survey methods to estimate different population states for use in monitoring large predators at a regional scale. We used wolves (Canis lupus) as our model species and generated reliable estimates of the number of sites with wolf reproduction (presence of pups). We used 2 wolf data sets from Spain (Western Galicia in 2013 and Asturias in 2004) to test the approach. Based on howling surveys, the naïve estimation (i.e., estimate based only on observations) of the number of sites with reproduction was 9 and 25 sites in Western Galicia and Asturias, respectively. Our model showed 33.4 (SD 9.6) and 34.4 (3.9) sites with wolf reproduction, respectively. The estimated proportion of occupied sites with wolf reproduction was 0.67 (SD 0.19) and 0.76 (0.11), respectively. This approach can be used to design more cost-effective monitoring programs (i.e., to define the sampling effort needed per site). Our approach should inspire well-coordinated surveys across multiple administrative borders and populations and lead to improved decision making for management of large carnivores on a landscape level. The use of this Bayesian framework provides a simple way to visualize the degree of uncertainty around population-parameter estimates and thus provides managers and stakeholders an intuitive approach to interpreting monitoring results. Our approach can be widely applied to large spatial scales in wildlife monitoring where detection probabilities differ between population states and where several methods are being used to estimate different population

  8. A Hierarchical Multivariate Bayesian Approach to Ensemble Model Output Statistics in Atmospheric Prediction

    Science.gov (United States)

    2017-09-01

    EMOS models also use multiple linear regression to characterize the sensitivity of a univariate weather quantity ... classical least-squares approach to multivariate multiple linear regression using both measures-oriented and distributions-oriented scoring rules. Subject terms: ensemble model output statistics, statistical post-processing, multivariate multiple linear regression, Bayesian data analysis

  9. Hierarchical Bayesian approach for estimating physical properties in spiral galaxies: Age Maps for M74

    Science.gov (United States)

    Sánchez Gil, M. Carmen; Berihuete, Angel; Alfaro, Emilio J.; Pérez, Enrique; Sarro, Luis M.

    2015-09-01

    One of the fundamental goals of modern Astronomy is to estimate the physical parameters of galaxies from images in different spectral bands. We present a hierarchical Bayesian model for obtaining age maps from images in the Hα line (taken with the Taurus Tunable Filter (TTF)), the ultraviolet band (far UV or FUV, from GALEX) and infrared bands (24, 70 and 160 microns (μm), from Spitzer). As shown in [1], we present the burst ages for young stellar populations in the nearby and nearly face-on galaxy M74. As shown in that previous work, the Hα to FUV flux ratio gives a good relative indicator of very recent star formation history (SFH). As a nascent star-forming region evolves, the Hα line emission declines earlier than the UV continuum, leading to a decrease in the Hα/FUV ratio. Through a specific star-forming galaxy model (Starburst 99, SB99), we can obtain the corresponding theoretical Hα/FUV ratio to compare with our observed flux ratios, and thus estimate the ages of the observed regions. Due to the nature of the problem, it is necessary to propose a model of high complexity that takes into account the mean uncertainties and the interrelationships between parameters when the Hα/FUV flux ratio is obtained. To address this complexity, we propose a Bayesian hierarchical model, in which a joint probability distribution is defined to determine the parameters (age, metallicity, IMF) from the observed data, in this case the observed Hα/FUV flux ratios. The joint posterior distribution of the parameters is approximated by an i.i.d. (independent and identically distributed) sample generated through MCMC (Markov Chain Monte Carlo) techniques.
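
    To make the inference step concrete, the toy sketch below inverts a hypothetical, monotonically decreasing calibration curve ratio(age), standing in for the SB99 predictions (which are not reproduced here), to get a posterior over burst age from one observed Hα/FUV ratio with measurement error. It uses a simple grid approximation rather than the paper's full hierarchical MCMC, and every number is invented.

    ```python
    # Toy sketch only: posterior over burst age from an observed Halpha/FUV ratio.
    # The exponential "calibration curve" is a hypothetical stand-in for the
    # Starburst99 predictions; the real analysis uses a hierarchical model and MCMC.
    import numpy as np

    def model_ratio(age_myr):
        """Hypothetical monotone decline of Halpha/FUV with age (arbitrary units)."""
        return 2.0 * np.exp(-age_myr / 5.0)

    age_grid = np.linspace(0.1, 30.0, 600)          # candidate ages in Myr
    prior = np.ones_like(age_grid)                  # flat prior on the grid
    prior /= prior.sum()

    obs_ratio, obs_sigma = 0.5, 0.1                 # one observed ratio +/- error
    log_like = -0.5 * ((obs_ratio - model_ratio(age_grid)) / obs_sigma) ** 2
    post = np.exp(log_like - log_like.max()) * prior
    post /= post.sum()

    mean_age = np.sum(age_grid * post)
    lo, hi = age_grid[np.searchsorted(np.cumsum(post), [0.16, 0.84])]
    print(f"posterior mean age ~ {mean_age:.1f} Myr, 68% interval ~ [{lo:.1f}, {hi:.1f}] Myr")
    ```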

  10. Modeling age and nest-specific survival using a hierarchical Bayesian approach.

    Science.gov (United States)

    Cao, Jing; He, Chong Z; Suedkamp Wells, Kimberly M; Millspaugh, Joshua J; Ryan, Mark R

    2009-12-01

    Recent studies have shown that grassland birds are declining more rapidly than any other group of terrestrial birds. Current methods of estimating avian age-specific nest survival rates require knowing the ages of nests, assuming homogeneous nests in terms of nest survival rates, or treating the hazard function as a piecewise step function. In this article, we propose a Bayesian hierarchical model with nest-specific covariates to estimate age-specific daily survival probabilities without the above requirements. The model provides a smooth estimate of the nest survival curve and identifies the factors that are related to the nest survival. The model can handle irregular visiting schedules and it has the least restrictive assumptions compared to existing methods. Without assuming proportional hazards, we use a multinomial semiparametric logit model to specify a direct relation between age-specific nest failure probability and nest-specific covariates. An intrinsic autoregressive prior is employed for the nest age effect. This nonparametric prior provides a more flexible alternative to the parametric assumptions. The Bayesian computation is efficient because the full conditional posterior distributions either have closed forms or are log concave. We use the method to analyze a Missouri dickcissel dataset and find that (1) nest survival is not homogeneous during the nesting period, and it reaches its lowest at the transition from incubation to nestling; and (2) nest survival is related to grass cover and vegetation height in the study area.
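
    The key modelling idea here, a smooth nonparametric age effect via an intrinsic autoregressive prior, can be illustrated with a short simulation: draws from a first-order random walk on the logit scale give smooth daily survival curves, and the probability of a nest surviving the whole period follows by multiplication. This is a hypothetical sketch of the prior's behaviour, not the authors' model code, and the numbers are illustrative rather than dickcissel estimates.

    ```python
    # Sketch: a first-order random-walk (intrinsic autoregressive) prior on the
    # logit of daily nest survival yields smooth, flexible survival curves.
    import numpy as np

    rng = np.random.default_rng(7)
    n_days = 25                                   # length of the nesting period (illustrative)
    tau = 0.15                                    # random-walk innovation SD

    def draw_survival_curve():
        logit_s = np.empty(n_days)
        logit_s[0] = rng.normal(3.0, 0.5)         # day-1 daily survival around 0.95
        for t in range(1, n_days):                # RW1: each day stays close to the previous
            logit_s[t] = rng.normal(logit_s[t - 1], tau)
        return 1.0 / (1.0 + np.exp(-logit_s))     # back to the probability scale

    curves = np.array([draw_survival_curve() for _ in range(4)])
    period_survival = curves.prod(axis=1)         # P(nest survives all days)
    print(np.round(curves[0], 3))
    print("whole-period survival for each draw:", np.round(period_survival, 3))
    ```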

  11. A new hierarchical Bayesian approach to analyse environmental and climatic influences on debris flow occurrence

    Science.gov (United States)

    Jomelli, Vincent; Pavlova, Irina; Eckert, Nicolas; Grancher, Delphine; Brunstein, Daniel

    2015-12-01

    How can debris flow occurrences be modelled at regional scale while taking both environmental and climatic conditions into account? And, of the two, which has the most influence on debris flow activity? In this paper, we try to answer these questions with an innovative Bayesian hierarchical probabilistic model that simultaneously accounts for how debris flows respond to environmental and climatic variables. In it, full decomposition of space and time effects in occurrence probabilities is assumed, revealing an environmental and a climatic trend shared by all years/catchments, respectively, clearly distinguished from residual "random" effects. The resulting regional and annual occurrence probabilities evaluated as functions of the covariates make it possible to weight the respective contribution of the different terms and, more generally, to check the model performances at different spatio-temporal scales. After suitable validation, the model can be used to make predictions at undocumented sites and could be used in further studies for predictions under future climate conditions. Also, the Bayesian paradigm easily copes with missing data, thus making it possible to account for events that may have been missed during surveys. As a case study, we extract 124 debris flow events triggered between 1970 and 2005 in 27 catchments located in the French Alps from the French national natural hazard survey and model their variability of occurrence considering environmental and climatic predictors at the same time. We document the environmental characteristics of each debris flow catchment (morphometry, lithology, land cover, and the presence of permafrost). We also compute 15 climate variables including mean temperature and precipitation between May and October and the number of rainy days with daily cumulative rainfall greater than 10/15/20/25/30/40 mm day-1. Application of our model shows that the combination of environmental and climatic predictors explained 77% of the overall

  12. MEG source localization of spatially extended generators of epileptic activity: comparing entropic and hierarchical bayesian approaches.

    Directory of Open Access Journals (Sweden)

    Rasheda Arman Chowdhury

    Full Text Available Localizing the generators of epileptic activity in the brain using Electro-EncephaloGraphy (EEG) or Magneto-EncephaloGraphy (MEG) signals is of particular interest during the pre-surgical investigation of epilepsy. Epileptic discharges can be detectable from background brain activity, provided they are associated with spatially extended generators. Using realistic simulations of epileptic activity, this study evaluates the ability of distributed source localization methods to accurately estimate the location of the generators and their sensitivity to the spatial extent of such generators when using MEG data. Source localization methods based on two types of realistic models have been investigated: (i) brain activity may be modeled using cortical parcels and (ii) brain activity is assumed to be locally smooth within each parcel. A Data Driven Parcellization (DDP) method was used to segment the cortical surface into non-overlapping parcels and diffusion-based spatial priors were used to model local spatial smoothness within parcels. These models were implemented within the Maximum Entropy on the Mean (MEM) and the Hierarchical Bayesian (HB) source localization frameworks. We proposed new methods in this context and compared them with other standard ones using Monte Carlo simulations of realistic MEG data involving sources of several spatial extents and depths. Detection accuracy of each method was quantified using Receiver Operating Characteristic (ROC) analysis and localization error metrics. Our results showed that methods implemented within the MEM framework were sensitive to all spatial extents of the sources ranging from 3 cm2 to 30 cm2, whatever the number and size of the parcels defining the model. To reach a similar level of accuracy within the HB framework, a model using parcels larger than the size of the sources should be considered.

  13. An Approach to Structure Determination and Estimation of Hierarchical Archimedean Copulas and its Application to Bayesian Classification

    Czech Academy of Sciences Publication Activity Database

    Górecki, J.; Hofert, M.; Holeňa, Martin

    2016-01-01

    Roč. 46, č. 1 (2016), s. 21-59 ISSN 0925-9902 R&D Projects: GA ČR GA13-17187S Grant - others:Slezská univerzita v Opavě(CZ) SGS/21/2014 Institutional support: RVO:67985807 Keywords : Copula * Hierarchical archimedean copula * Copula estimation * Structure determination * Kendall’s tau * Bayesian classification Subject RIV: IN - Informatics, Computer Science Impact factor: 1.294, year: 2016

  14. Estimation of Mental Disorders Prevalence in High School Students Using Small Area Methods: A Hierarchical Bayesian Approach

    Directory of Open Access Journals (Sweden)

    Ali Reza Soltanian

    2016-08-01

    Full Text Available Background: Adolescence is one of the most important periods in the course of human development, yet the prevalence of mental disorders among adolescents in different regions of Iran, especially southern Iran, has not been well documented. Objectives: This study was conducted to determine the prevalence of mental disorders among high school students in Bushehr province, south of Iran. Methods: In this cross-sectional study, 286 high school students were recruited by multi-stage random sampling in Bushehr province in 2015. The general health questionnaire (GHQ-28) was used to assess mental disorders. The small area method, under the hierarchical Bayesian approach, was used to determine the prevalence of mental disorders and for data analysis. Results: Of the 286 questionnaires, only 182 were completely filled in and evaluated (the response rate was 70.5%). Of the students, 58.79% and 41.21% were male and female, respectively. The prevalence of mental disorders in Bushehr, Dayyer, Deylam, Kangan, Dashtestan, Tangestan, Genaveh, and Dashty was 0.48, 0.42, 0.45, 0.52, 0.41, 0.47, 0.42, and 0.43, respectively. Conclusions: Based on this study, the prevalence of mental disorders among adolescents was increasing in the counties of Bushehr Province. The lack of a national policy in this area is a serious obstacle to access to mental health and wellbeing services.

  15. Differential Gene Expression (DEX) and Alternative Splicing Events (ASE) for Temporal Dynamic Processes Using HMMs and Hierarchical Bayesian Modeling Approaches.

    Science.gov (United States)

    Oh, Sunghee; Song, Seongho

    2017-01-01

    In gene expression profiling, the data analysis pipeline is categorized into four major downstream tasks, i.e., (1) identification of differential expression; (2) clustering of co-expression patterns; (3) classification of subtypes of samples; and (4) detection of genetic regulatory networks, which are performed after preprocessing procedures such as normalization. More specifically, temporal dynamic gene expression data have an inherent feature: two neighboring time points (previous and current state) are highly correlated with each other, in contrast to static expression data, in which samples are assumed to be independent individuals. In this chapter, we demonstrate how HMMs and hierarchical Bayesian modeling methods capture the horizontal time-dependency structures in time series expression profiles, focusing on the identification of differential expression. In addition, the differentially expressed genes and transcript variant isoforms detected over time in these core prerequisite steps can generally be further applied to the detection of genetic regulatory networks, to comprehensively uncover dynamic repertoires from a systems biology perspective within the coupled framework.

  16. A Hierarchical Bayesian Approach for Combining Pharmacokinetic/Pharmacodynamic Modeling and Phase IIa Trial Design in Orphan Drugs: Treating Adrenoleukodystrophy with Lorenzo’s Oil

    Science.gov (United States)

    Basu, Cynthia; Ahmed, Mariam A.; Kartha, Reena V.; Brundage, Richard C.; Raymond, Gerald V.; Cloyd, James C.; Carlin, Bradley P.

    2017-01-01

    X-linked adrenoleukodystrophy (X-ALD) is a rare, progressive and typically fatal neurodegenerative disease. Lorenzo’s Oil (LO) is one of the few X-ALD treatments available, but little has been done to establish its clinical efficacy or indications for its use. In this paper, we analyze data on 116 male asymptomatic pediatric patients who were administered LO. We offer a hierarchical Bayesian statistical approach to understanding LO pharmacokinetics (PK) and pharmacodynamics (PD) resulting from an accumulation of very long chain fatty acids. We experiment with individual- and observational-level errors, various choices of prior distributions, and deal with the limitation of having just one observation per administration of the drug, as opposed to the more usual multiple observations per administration. We link LO dose to the plasma erucic acid concentrations by PK modeling, and then link this concentration to a biomarker (C26, a very long chain fatty acid) by PD modeling. Next, we design a Bayesian Phase IIa study to estimate precisely what improvements in the biomarker can arise from various LO doses, while simultaneously modeling a binary toxicity endpoint. Our Bayesian adaptive algorithm emerges as reasonably robust and efficient while still retaining good classical (frequentist) operating characteristics. Future work looks toward using the results of this trial to design a Phase III study linking LO dose to actual improvements in health status, as measured by the appearance of brain lesions observed via magnetic resonance imaging. PMID:27547896

  17. A hierarchical Bayesian approach for combining pharmacokinetic/pharmacodynamic modeling and Phase IIa trial design in orphan drugs: Treating adrenoleukodystrophy with Lorenzo's oil.

    Science.gov (United States)

    Basu, Cynthia; Ahmed, Mariam A; Kartha, Reena V; Brundage, Richard C; Raymond, Gerald V; Cloyd, James C; Carlin, Bradley P

    2016-01-01

    X-linked adrenoleukodystrophy (X-ALD) is a rare, progressive, and typically fatal neurodegenerative disease. Lorenzo's oil (LO) is one of the few X-ALD treatments available, but little has been done to establish its clinical efficacy or indications for its use. In this article, we analyze data on 116 male asymptomatic pediatric patients who were administered LO. We offer a hierarchical Bayesian statistical approach to understand LO pharmacokinetics (PK) and pharmacodynamics (PD) resulting from an accumulation of very long-chain fatty acids. We experiment with individual- and observational-level errors and various choices of prior distributions and deal with the limitation of having just one observation per administration of the drug, as opposed to the more usual multiple observations per administration. We link LO dose to the plasma erucic acid concentrations by PK modeling, and then link this concentration to a biomarker (C26, a very long-chain fatty acid) by PD modeling. Next, we design a Bayesian Phase IIa study to estimate precisely what improvements in the biomarker can arise from various LO doses while simultaneously modeling a binary toxicity endpoint. Our Bayesian adaptive algorithm emerges as reasonably robust and efficient while still retaining good classical (frequentist) operating characteristics. Future work looks toward using the results of this trial to design a Phase III study linking LO dose to actual improvements in health status, as measured by the appearance of brain lesions observed via magnetic resonance imaging.

  18. A Bayesian hierarchical model for climate change detection and attribution

    Science.gov (United States)

    Katzfuss, Matthias; Hammerling, Dorit; Smith, Richard L.

    2017-06-01

    Regression-based detection and attribution methods continue to take a central role in the study of climate change and its causes. Here we propose a novel Bayesian hierarchical approach to this problem, which allows us to address several open methodological questions. Specifically, we take into account the uncertainties in the true temperature change due to imperfect measurements, the uncertainty in the true climate signal under different forcing scenarios due to the availability of only a small number of climate model simulations, and the uncertainty associated with estimating the climate variability covariance matrix, including the truncation of the number of empirical orthogonal functions (EOFs) in this covariance matrix. We apply Bayesian model averaging to assign optimal probabilistic weights to different possible truncations and incorporate all uncertainties into the inference on the regression coefficients. We provide an efficient implementation of our method in a software package and illustrate its use with a realistic application.
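
    One ingredient of this approach that can be shown compactly is the Bayesian model averaging step: given (approximate) log marginal likelihoods for models that differ only in the number of retained EOFs, posterior model weights follow from Bayes' rule, and quantities of interest are averaged with those weights. The numbers below are made up purely to show the arithmetic; they are not from the paper or its software.

    ```python
    # Bayesian model averaging over candidate EOF truncations (illustrative numbers).
    # w_k is proportional to p(y | M_k) * p(M_k); regression coefficients are then
    # averaged across truncations with these weights.
    import numpy as np

    truncations = np.array([5, 10, 15, 20])                     # candidate numbers of EOFs
    log_marg_lik = np.array([-120.4, -118.1, -117.6, -119.0])   # hypothetical values
    log_prior = np.log(np.full(truncations.size, 0.25))         # equal prior model odds

    log_w = log_marg_lik + log_prior
    weights = np.exp(log_w - log_w.max())
    weights /= weights.sum()

    beta_hat = np.array([0.92, 1.01, 0.97, 0.88])   # scaling-factor estimate per truncation
    beta_bma = np.sum(weights * beta_hat)           # model-averaged regression coefficient
    print(dict(zip(truncations.tolist(), np.round(weights, 3))))
    print("BMA estimate of the regression coefficient:", round(beta_bma, 3))
    ```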

  19. A novel approach to quantifying the sensitivity of current and future cosmological datasets to the neutrino mass ordering through Bayesian hierarchical modeling

    Science.gov (United States)

    Gerbino, Martina; Lattanzi, Massimiliano; Mena, Olga; Freese, Katherine

    2017-12-01

    We present a novel approach to derive constraints on neutrino masses, as well as on other cosmological parameters, from cosmological data, while taking into account our ignorance of the neutrino mass ordering. We derive constraints from a combination of current as well as future cosmological datasets on the total neutrino mass Mν and on the mass fractions fν,i = mi/Mν (where the index i = 1, 2, 3 indicates the three mass eigenstates) carried by each of the mass eigenstates mi, after marginalizing over the (unknown) neutrino mass ordering, either normal ordering (NH) or inverted ordering (IH). The bounds on all the cosmological parameters, including those on the total neutrino mass, therefore take into account the uncertainty related to our ignorance of the mass hierarchy that is actually realized in nature. This novel approach is carried out in the framework of Bayesian analysis of a typical hierarchical problem, where the distribution of the parameters of the model depends on further parameters, the hyperparameters. In this context, the choice of the neutrino mass ordering is modeled via the discrete hyperparameter h_type, which we introduce in the usual Markov chain analysis. The preference from cosmological data for either the NH or the IH scenario is then simply encoded in the posterior distribution of the hyperparameter itself. Current cosmic microwave background (CMB) measurements assign equal odds to the two hierarchies, and are thus unable to distinguish between them. However, after the addition of baryon acoustic oscillation (BAO) measurements, a weak preference for the normal hierarchical scenario appears, with odds of 4 : 3 from Planck temperature and large-scale polarization in combination with BAO (3 : 2 if small-scale polarization is also included). Concerning next-generation cosmological experiments, forecasts suggest that the combination of upcoming CMB (COrE) and BAO surveys (DESI) may determine the neutrino mass hierarchy at a high statistical
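
    The mechanism of a discrete hyperparameter selecting between orderings can be demonstrated on a deliberately tiny toy problem: a single observable informs a mass-like parameter whose prior differs under NH and IH, and a Gibbs sampler alternates between drawing the parameter and drawing the ordering indicator; the posterior frequency of the indicator then encodes the data's preference. Every number below is invented for illustration and has no cosmological meaning.

    ```python
    # Toy Gibbs sampler with a discrete hyperparameter h in {NH, IH} choosing between
    # two priors on a mass-like parameter m; the posterior frequency of h encodes the
    # data's preference for one ordering. All values are invented for illustration.
    import numpy as np

    rng = np.random.default_rng(3)
    y, sigma_y = 0.07, 0.020                                 # hypothetical observation of m
    priors = {"NH": (0.060, 0.015), "IH": (0.100, 0.015)}    # (mean, sd) of m under each h
    prior_h = {"NH": 0.5, "IH": 0.5}                         # equal prior odds

    def norm_logpdf(x, mu, sd):
        return -0.5 * ((x - mu) / sd) ** 2 - np.log(sd)

    h, m = "NH", 0.08
    counts = {"NH": 0, "IH": 0}
    for it in range(20000):
        mu_h, sd_h = priors[h]
        prec = 1.0 / sigma_y**2 + 1.0 / sd_h**2              # draw m | h, y (conjugate normal)
        mean = (y / sigma_y**2 + mu_h / sd_h**2) / prec
        m = rng.normal(mean, 1.0 / np.sqrt(prec))
        logp = {k: norm_logpdf(m, *priors[k]) + np.log(prior_h[k]) for k in priors}
        p_nh = 1.0 / (1.0 + np.exp(logp["IH"] - logp["NH"])) # draw h | m
        h = "NH" if rng.random() < p_nh else "IH"
        if it >= 2000:                                       # discard burn-in
            counts[h] += 1

    total = sum(counts.values())
    print("posterior P(NH) ~", round(counts["NH"] / total, 3))
    ```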

  20. Understanding uncertainties in non-linear population trajectories: a Bayesian semi-parametric hierarchical approach to large-scale surveys of coral cover.

    Directory of Open Access Journals (Sweden)

    Julie Vercelloni

    Full Text Available Recently, attempts to improve decision making in species management have focussed on uncertainties associated with modelling temporal fluctuations in populations. Reducing model uncertainty is challenging; while larger samples improve estimation of species trajectories and reduce statistical errors, they typically amplify variability in observed trajectories. In particular, traditional modelling approaches aimed at estimating population trajectories usually do not account well for nonlinearities and uncertainties associated with multi-scale observations characteristic of large spatio-temporal surveys. We present a Bayesian semi-parametric hierarchical model for simultaneously quantifying uncertainties associated with model structure and parameters, and scale-specific variability over time. We estimate uncertainty across a four-tiered spatial hierarchy of coral cover from the Great Barrier Reef. Coral variability is well described; however, our results show that, in the absence of additional model specifications, conclusions regarding coral trajectories become highly uncertain when considering multiple reefs, suggesting that management should focus more at the scale of individual reefs. The approach presented facilitates the description and estimation of population trajectories and associated uncertainties when variability cannot be attributed to specific causes and origins. We argue that our model can unlock value contained in large-scale datasets, provide guidance for understanding sources of uncertainty, and support better informed decision making.

  1. Chemical purity using quantitative 1H-nuclear magnetic resonance: a hierarchical Bayesian approach for traceable calibrations.

    Science.gov (United States)

    Toman, Blaza; Nelson, Michael A; Lippa, Katrice A

    2016-01-01

    Chemical purity assessment using quantitative 1H-nuclear magnetic resonance spectroscopy is a method based on ratio references of mass and signal intensity of the analyte species to that of chemical standards of known purity. As such, it is an example of a calculation using a known measurement equation with multiple inputs. Though multiple samples are often analyzed during purity evaluations in order to assess measurement repeatability, the uncertainty evaluation must also account for contributions from inputs to the measurement equation. Furthermore, there may be other uncertainty components inherent in the experimental design, such as independent implementation of multiple calibration standards. As such, the uncertainty evaluation is not purely bottom up (based on the measurement equation) or top down (based on the experimental design), but inherently contains elements of both. This hybrid form of uncertainty analysis is readily implemented with Bayesian statistical analysis. In this article we describe this type of analysis in detail and illustrate it using data from an evaluation of chemical purity and its uncertainty for a folic acid material.
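
    The ratio-based measurement equation referred to above is commonly written as P_x = (I_x/I_s)(N_s/N_x)(M_x/M_s)(m_s/m_x)P_s. The sketch below propagates input uncertainties through that equation by Monte Carlo; all input values and uncertainties are invented for illustration, and the paper's hierarchical Bayesian treatment additionally pools repeated samples and multiple independently prepared calibration standards.

    ```python
    # Monte Carlo propagation of uncertainty through the standard qHNMR purity
    # equation P_x = (I_x/I_s) * (N_s/N_x) * (M_x/M_s) * (m_s/m_x) * P_s.
    # All numbers below are hypothetical.
    import numpy as np

    rng = np.random.default_rng(11)
    n = 100_000

    I_ratio = rng.normal(0.561, 0.004, n)    # analyte/standard signal integral ratio
    N_x, N_s = 2, 4                          # protons giving rise to each quantified signal
    M_x, M_s = 180.16, 204.22                # molar masses, g/mol (hypothetical pair)
    m_x = rng.normal(10.12, 0.02, n)         # weighed mass of analyte sample, mg
    m_s = rng.normal(9.87, 0.02, n)          # weighed mass of standard, mg
    P_s = rng.normal(0.9995, 0.0005, n)      # purity of the calibration standard

    P_x = I_ratio * (N_s / N_x) * (M_x / M_s) * (m_s / m_x) * P_s

    print(f"purity = {P_x.mean():.4f} +/- {P_x.std():.4f} (mass fraction)")
    ```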

  2. Bayesian Hierarchical Grouping: perceptual grouping as mixture estimation

    Science.gov (United States)

    Froyen, Vicky; Feldman, Jacob; Singh, Manish

    2015-01-01

    We propose a novel framework for perceptual grouping based on the idea of mixture models, called Bayesian Hierarchical Grouping (BHG). In BHG we assume that the configuration of image elements is generated by a mixture of distinct objects, each of which generates image elements according to some generative assumptions. Grouping, in this framework, means estimating the number and the parameters of the mixture components that generated the image, including estimating which image elements are “owned” by which objects. We present a tractable implementation of the framework, based on the hierarchical clustering approach of Heller and Ghahramani (2005). We illustrate it with examples drawn from a number of classical perceptual grouping problems, including dot clustering, contour integration, and part decomposition. Our approach yields an intuitive hierarchical representation of image elements, giving an explicit decomposition of the image into mixture components, along with estimates of the probability of various candidate decompositions. We show that BHG accounts well for a diverse range of empirical data drawn from the literature. Because BHG provides a principled quantification of the plausibility of grouping interpretations over a wide range of grouping problems, we argue that it provides an appealing unifying account of the elusive Gestalt notion of Prägnanz. PMID:26322548

  3. Ozone and childhood respiratory disease in three US cities: evaluation of effect measure modification by neighborhood socioeconomic status using a Bayesian hierarchical approach.

    Science.gov (United States)

    O'Lenick, Cassandra R; Chang, Howard H; Kramer, Michael R; Winquist, Andrea; Mulholland, James A; Friberg, Mariel D; Sarnat, Stefanie Ebelt

    2017-04-05

    Ground-level ozone is a potent airway irritant and a determinant of respiratory morbidity. Susceptibility to the health effects of ambient ozone may be influenced by both intrinsic and extrinsic factors, such as neighborhood socioeconomic status (SES). Questions remain regarding the manner and extent that factors such as SES influence ozone-related health effects, particularly across different study areas. Using a 2-stage modeling approach we evaluated neighborhood SES as a modifier of ozone-related pediatric respiratory morbidity in Atlanta, Dallas, & St. Louis. We acquired multi-year data on emergency department (ED) visits among 5-18 year olds with a primary diagnosis of respiratory disease in each city. Daily concentrations of 8-h maximum ambient ozone were estimated for all ZIP Code Tabulation Areas (ZCTA) in each city by fusing observed concentration data from available network monitors with simulations from an emissions-based chemical transport model. In the first stage, we used conditional logistic regression to estimate ZCTA-specific odds ratios (OR) between ozone and respiratory ED visits, controlling for temporal trends and meteorology. In the second stage, we combined ZCTA-level estimates in a Bayesian hierarchical model to assess overall associations and effect modification by neighborhood SES considering categorical and continuous SES indicators (e.g., ZCTA-specific levels of poverty). We estimated ORs and 95% posterior intervals (PI) for a 25 ppb increase in ozone. The hierarchical model combined effect estimates from 179 ZCTAs in Atlanta, 205 ZCTAs in Dallas, and 151 ZCTAs in St. Louis. The strongest overall association of ozone and pediatric respiratory disease was in Atlanta (OR = 1.08, 95% PI: 1.06, 1.11), followed by Dallas (OR = 1.04, 95% PI: 1.01, 1.07) and St. Louis (OR = 1.03, 95% PI: 0.99, 1.07). Patterns of association across levels of neighborhood SES in each city suggested stronger ORs in low compared to high SES areas, with
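
    The second-stage pooling can be sketched as a normal-normal hierarchical model: each ZCTA contributes a first-stage log odds ratio and its standard error, and the hierarchical model shrinks these toward a city-wide mean whose posterior summarizes the overall association. The sketch below is a generic random-effects Gibbs sampler on simulated inputs; it is not the covariate-modified model of the paper, and the "per 25 ppb" label is only carried over from the abstract for context.

    ```python
    # Generic second-stage sketch: pool ZCTA-level log odds ratios (with known
    # first-stage standard errors) in a normal-normal hierarchical model via Gibbs
    # sampling. Inputs are simulated; the paper's model also includes SES modifiers.
    import numpy as np

    rng = np.random.default_rng(5)
    K = 180                                          # number of ZCTAs (illustrative)
    true_mu, true_tau = np.log(1.08), 0.05           # "true" city-wide log OR and spread
    beta_true = rng.normal(true_mu, true_tau, K)
    se = rng.uniform(0.05, 0.20, K)                  # first-stage standard errors
    beta_hat = rng.normal(beta_true, se)             # simulated first-stage estimates

    n_iter, burn = 5000, 1000
    mu, tau2 = 0.0, 0.1**2
    a0, b0 = 2.0, 0.01                               # weak inverse-gamma prior on tau^2
    mu_draws = []
    for it in range(n_iter):
        prec = 1.0 / se**2 + 1.0 / tau2              # ZCTA effects | mu, tau2
        beta = rng.normal((beta_hat / se**2 + mu / tau2) / prec, 1.0 / np.sqrt(prec))
        mu = rng.normal(beta.mean(), np.sqrt(tau2 / K))      # mu | beta, tau2 (flat prior)
        b_post = b0 + 0.5 * np.sum((beta - mu) ** 2)         # tau2 | beta, mu
        tau2 = 1.0 / rng.gamma(a0 + K / 2, 1.0 / b_post)
        if it >= burn:
            mu_draws.append(mu)

    or_draws = np.exp(mu_draws)
    print(f"overall OR per 25 ppb ~ {np.mean(or_draws):.3f} "
          f"(95% PI {np.percentile(or_draws, 2.5):.3f}-{np.percentile(or_draws, 97.5):.3f})")
    ```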

  4. Dynamic networks from hierarchical bayesian graph clustering.

    Directory of Open Access Journals (Sweden)

    Yongjin Park

    Full Text Available Biological networks change dynamically as protein components are synthesized and degraded. Understanding the time-dependence and, in a multicellular organism, tissue-dependence of a network leads to insight beyond a view that collapses time-varying interactions into a single static map. Conventional algorithms are limited to analyzing evolving networks by reducing them to a series of unrelated snapshots. Here we introduce an approach that groups proteins according to shared interaction patterns through a dynamical hierarchical stochastic block model. Protein membership in a block is permitted to evolve as interaction patterns shift over time and space, representing the spatial organization of cell types in a multicellular organism. The spatiotemporal evolution of the protein components is inferred from transcript profiles, using Arabidopsis root development (5 tissues, 3 temporal stages) as an example. The new model requires essentially no parameter tuning, out-performs existing snapshot-based methods, identifies protein modules recruited to specific cell types and developmental stages, and could have broad application to social networks and other similar dynamic systems.

  5. Inferring land use and land cover impact on stream water quality using a Bayesian hierarchical modeling approach in the Xitiaoxi River Watershed, China.

    Science.gov (United States)

    Wan, Rongrong; Cai, Shanshan; Li, Hengpeng; Yang, Guishan; Li, Zhaofu; Nie, Xiaofei

    2014-01-15

    Lake eutrophication has become a very serious environmental problem in China. If water pollution is to be controlled and ultimately eliminated, it is essential to understand how human activities affect surface water quality. A recently developed technique using the Bayesian hierarchical linear regression model revealed the effects of land use and land cover (LULC) on stream water quality at a watershed scale. Six LULC categories combined with watershed characteristics, including size, slope, and permeability were the variables that were studied. The pollutants of concern were nutrient concentrations of total nitrogen (TN) and total phosphorus (TP), common pollutants found in eutrophication. The monthly monitoring data at 41 sites in the Xitiaoxi Watershed, China during 2009-2010 were used for model demonstration. The results showed that the relationships between LULC and stream water quality are so complicated that the effects are varied over large areas. The models suggested that urban and agricultural land are important sources of TN and TP concentrations, while rural residential land is one of the major sources of TN. Certain agricultural practices (excessive fertilizer application) result in greater concentrations of nutrients in paddy fields, artificial grasslands, and artificial woodlands. This study suggests that Bayesian hierarchical modeling is a powerful tool for examining the complicated relationships between land use and water quality on different scales, and for developing land use and water management policies. Copyright © 2013 Elsevier Ltd. All rights reserved.

  6. Hierarchical Bayesian modeling of the space - time diffusion patterns of cholera epidemic in Kumasi, Ghana

    NARCIS (Netherlands)

    Osei, Frank B.; Osei, F.B.; Duker, Alfred A.; Stein, A.

    2011-01-01

    This study analyses the joint effects of the two transmission routes of cholera on the space-time diffusion dynamics. Statistical models are developed and presented to investigate the transmission network routes of cholera diffusion. A hierarchical Bayesian modelling approach is employed for a joint

  7. Application of Bayesian Hierarchical Prior Modeling to Sparse Channel Estimation

    DEFF Research Database (Denmark)

    Pedersen, Niels Lovmand; Manchón, Carles Navarro; Shutin, Dmitriy

    2012-01-01

    Existing methods for sparse channel estimation typically provide an estimate computed as the solution maximizing an objective function defined as the sum of the log-likelihood function and a penalization term proportional to the l1-norm of the parameter of interest. However, other penalization terms have proven to have strong sparsity-inducing properties. In this work, we design pilot assisted channel estimators for OFDM wireless receivers within the framework of sparse Bayesian learning by defining hierarchical Bayesian prior models that lead to sparsity-inducing penalization terms. The estimators result as an application of the variational message-passing algorithm on the factor graph representing the signal model extended with the hierarchical prior models. Numerical results demonstrate the superior performance of our channel estimators as compared to traditional and state...
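
    A compact way to see how a hierarchical prior induces sparsity is the classic sparse-Bayesian-learning (relevance vector) update, in which each coefficient gets its own Gaussian prior precision re-estimated from the data; precisions of irrelevant coefficients diverge and their estimates are driven to zero. The sketch below applies those standard fixed-point updates to a simulated sparse-channel-style regression; it is illustrative only and is not the variational message-passing estimator of the paper.

    ```python
    # Sketch of sparse Bayesian learning (ARD) on a simulated sparse regression:
    # y = Phi @ w + noise with only a few nonzero taps. Hierarchical per-coefficient
    # precisions alpha_i are re-estimated; a large alpha_i prunes coefficient i.
    import numpy as np

    rng = np.random.default_rng(2)
    n_obs, n_taps = 60, 40
    Phi = rng.normal(size=(n_obs, n_taps))          # known "pilot" regressors
    w_true = np.zeros(n_taps)
    w_true[[3, 11, 27]] = [1.0, -0.7, 0.5]          # sparse channel taps
    sigma2 = 0.01                                   # noise variance, assumed known here
    y = Phi @ w_true + rng.normal(0.0, np.sqrt(sigma2), n_obs)

    alpha = np.ones(n_taps)                         # hierarchical prior precisions
    for _ in range(500):
        Sigma = np.linalg.inv(Phi.T @ Phi / sigma2 + np.diag(alpha))
        mu = Sigma @ Phi.T @ y / sigma2             # posterior mean of the taps
        gamma = 1.0 - alpha * np.diag(Sigma)        # "well-determined" measure per tap
        alpha = gamma / (mu**2 + 1e-12)             # fixed-point update of precisions

    estimate = np.where(alpha > 1e3, 0.0, mu)       # prune effectively removed taps
    print("recovered nonzero taps:", np.flatnonzero(estimate))
    print("estimates:", np.round(estimate[np.flatnonzero(estimate)], 2))
    ```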

  8. Topics in Bayesian Hierarchical Modeling and its Monte Carlo Computations

    Science.gov (United States)

    Tak, Hyung Suk

    The first chapter addresses a Beta-Binomial-Logit model that is a Beta-Binomial conjugate hierarchical model with covariate information incorporated via a logistic regression. Various researchers in the literature have unknowingly used improper posterior distributions or have given incorrect statements about posterior propriety because checking posterior propriety can be challenging due to the complicated functional form of a Beta-Binomial-Logit model. We derive data-dependent necessary and sufficient conditions for posterior propriety within a class of hyper-prior distributions that encompass those used in previous studies. Frequency coverage properties of several hyper-prior distributions are also investigated to see when and whether Bayesian interval estimates of random effects meet their nominal confidence levels. The second chapter deals with a time delay estimation problem in astrophysics. When the gravitational field of an intervening galaxy between a quasar and the Earth is strong enough to split light into two or more images, the time delay is defined as the difference between their travel times. The time delay can be used to constrain cosmological parameters and can be inferred from the time series of brightness data of each image. To estimate the time delay, we construct a Gaussian hierarchical model based on a state-space representation for irregularly observed time series generated by a latent continuous-time Ornstein-Uhlenbeck process. Our Bayesian approach jointly infers model parameters via a Gibbs sampler. We also introduce a profile likelihood of the time delay as an approximation of its marginal posterior distribution. The last chapter specifies a repelling-attracting Metropolis algorithm, a new Markov chain Monte Carlo method to explore multi-modal distributions in a simple and fast manner. This algorithm is essentially a Metropolis-Hastings algorithm with a proposal that consists of a downhill move in density that aims to make local modes

  9. A novel Bayesian hierarchical model for road safety hotspot prediction.

    Science.gov (United States)

    Fawcett, Lee; Thorpe, Neil; Matthews, Joseph; Kremer, Karsten

    2017-02-01

    In this paper, we propose a Bayesian hierarchical model for predicting accident counts in future years at sites within a pool of potential road safety hotspots. The aim is to inform road safety practitioners of the location of likely future hotspots to enable a proactive, rather than reactive, approach to road safety scheme implementation. A feature of our model is the ability to rank sites according to their potential to exceed, in some future time period, a threshold accident count which may be used as a criterion for scheme implementation. Our model specification enables the classical empirical Bayes formulation - commonly used in before-and-after studies, wherein accident counts from a single before period are used to estimate counterfactual counts in the after period - to be extended to incorporate counts from multiple time periods. This allows site-specific variations in historical accident counts (e.g. locally-observed trends) to offset estimates of safety generated by a global accident prediction model (APM), which itself is used to help account for the effects of global trend and regression-to-mean (RTM). The Bayesian posterior predictive distribution is exploited to formulate predictions and to properly quantify our uncertainty in these predictions. The main contributions of our model include (i) the ability to allow accident counts from multiple time-points to inform predictions, with counts in more recent years lending more weight to predictions than counts from time-points further in the past; (ii) where appropriate, the ability to offset global estimates of trend by variations in accident counts observed locally, at a site-specific level; and (iii) the ability to account for unknown/unobserved site-specific factors which may affect accident counts. We illustrate our model with an application to accident counts at 734 potential hotspots in the German city of Halle; we also propose some simple diagnostics to validate the predictive capability of our
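
    A stripped-down version of the ranking idea, ignoring the global accident prediction model and the trend and regression-to-mean offsets described above, is a Poisson-gamma model per site: combine several years of counts into a gamma posterior for the site's accident rate, then rank sites by the posterior predictive probability that next year's count exceeds the scheme-implementation threshold. The counts, prior, and threshold below are invented for illustration.

    ```python
    # Simplified sketch of hotspot ranking: per-site gamma posterior for the accident
    # rate from multiple years of counts, then the posterior predictive (negative
    # binomial) probability that next year's count exceeds a threshold.
    import numpy as np
    from math import lgamma

    counts = {                      # hypothetical accident counts over 3 years
        "site_A": [4, 6, 5],
        "site_B": [1, 2, 1],
        "site_C": [7, 9, 6],
    }
    a0, b0 = 1.0, 0.5               # weak gamma(shape, rate) prior on the yearly rate
    threshold = 6                   # scheme-implementation criterion for next year

    def exceedance_prob(site_counts):
        a = a0 + sum(site_counts)                   # gamma posterior: shape a, rate b
        b = b0 + len(site_counts)
        p = b / (b + 1.0)                           # negative binomial success probability

        def log_pmf(k):                             # posterior predictive pmf at count k
            return (lgamma(k + a) - lgamma(a) - lgamma(k + 1)
                    + a * np.log(p) + k * np.log(1.0 - p))

        below = sum(np.exp(log_pmf(k)) for k in range(threshold + 1))
        return 1.0 - below                          # P(next year's count > threshold)

    ranking = sorted(counts, key=lambda s: exceedance_prob(counts[s]), reverse=True)
    for s in ranking:
        print(s, round(exceedance_prob(counts[s]), 3))
    ```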

  10. Introduction to Hierarchical Bayesian Modeling for Ecological Data

    CERN Document Server

    Parent, Eric

    2012-01-01

    Making statistical modeling and inference more accessible to ecologists and related scientists, Introduction to Hierarchical Bayesian Modeling for Ecological Data gives readers a flexible and effective framework to learn about complex ecological processes from various sources of data. It also helps readers get started on building their own statistical models. The text begins with simple models that progressively become more complex and realistic through explanatory covariates and intermediate hidden states variables. When fitting the models to data, the authors gradually present the concepts a

  11. A fully Bayesian strategy for high-dimensional hierarchical modeling using massively parallel computing

    OpenAIRE

    Landau, Will; Niemi, Jarad

    2016-01-01

    Markov chain Monte Carlo (MCMC) is the predominant tool used in Bayesian parameter estimation for hierarchical models. When the model expands due to an increasing number of hierarchical levels, number of groups at a particular level, or number of observations in each group, a fully Bayesian analysis via MCMC can easily become computationally demanding, even intractable. We illustrate how the steps in an MCMC for hierarchical models are predominantly one of two types: conditionally independent...

  12. Inferring on the intentions of others by hierarchical Bayesian learning.

    Directory of Open Access Journals (Sweden)

    Andreea O Diaconescu

    2014-09-01

    Full Text Available Inferring on others' (potentially time-varying) intentions is a fundamental problem during many social transactions. To investigate the underlying mechanisms, we applied computational modeling to behavioral data from an economic game in which 16 pairs of volunteers (randomly assigned to "player" or "adviser" roles) interacted. The player performed a probabilistic reinforcement learning task, receiving information about a binary lottery from a visual pie chart. The adviser, who received more predictive information, issued an additional recommendation. Critically, the game was structured such that the adviser's incentives to provide helpful or misleading information varied in time. Using a meta-Bayesian modeling framework, we found that the players' behavior was best explained by the deployment of hierarchical learning: they inferred upon the volatility of the advisers' intentions in order to optimize their predictions about the validity of their advice. Beyond learning, volatility estimates also affected the trial-by-trial variability of decisions: participants were more likely to rely on their estimates of advice accuracy for making choices when they believed that the adviser's intentions were presently stable. Finally, our model of the players' inference predicted the players' interpersonal reactivity index (IRI) scores, explicit ratings of the advisers' helpfulness and the advisers' self-reports on their chosen strategy. Overall, our results suggest that humans (i) employ hierarchical generative models to infer on the changing intentions of others, (ii) use volatility estimates to inform decision-making in social interactions, and (iii) integrate estimates of advice accuracy with non-social sources of information. The Bayesian framework presented here can quantify individual differences in these mechanisms from simple behavioral readouts and may prove useful in future clinical studies of maladaptive social cognition.

  13. Bayesian Hierarchical Scale Mixtures of Log-Normal Models for Inference in Reliability with Stochastic Constraint

    Directory of Open Access Journals (Sweden)

    Hea-Jung Kim

    2017-06-01

    Full Text Available This paper develops Bayesian inference in reliability of a class of scale mixtures of log-normal failure time (SMLNFT) models with stochastic (or uncertain) constraint in their reliability measures. The class is comprehensive and includes existing failure time (FT) models (such as log-normal, log-Cauchy, and log-logistic FT models) as well as new models that are robust in terms of heavy-tailed FT observations. Since classical frequency approaches to reliability analysis based on the SMLNFT model with stochastic constraint are intractable, the Bayesian method is pursued utilizing a Markov chain Monte Carlo (MCMC) sampling-based approach. This paper introduces a two-stage maximum entropy (MaxEnt) prior, which elicits a priori uncertain constraint, and develops a Bayesian hierarchical SMLNFT model by using the prior. The paper also proposes an MCMC method for Bayesian inference in the SMLNFT model reliability and calls attention to properties of the MaxEnt prior that are useful for method development. Finally, two data sets are used to illustrate how the proposed methodology works.

  14. Bayesian Uncertainty Quantification for Subsurface Inversion Using a Multiscale Hierarchical Model

    KAUST Repository

    Mondal, Anirban

    2014-07-03

    We consider a Bayesian approach to nonlinear inverse problems in which the unknown quantity is a random field (spatial or temporal). The Bayesian approach contains a natural mechanism for regularization in the form of prior information, can incorporate information from heterogeneous sources and provide a quantitative assessment of uncertainty in the inverse solution. The Bayesian setting casts the inverse solution as a posterior probability distribution over the model parameters. The Karhunen-Loeve expansion is used for dimension reduction of the random field. Furthermore, we use a hierarchical Bayes model to inject multiscale data in the modeling framework. In this Bayesian framework, we show that this inverse problem is well-posed by proving that the posterior measure is Lipschitz continuous with respect to the data in total variation norm. Computational challenges in this construction arise from the need for repeated evaluations of the forward model (e.g., in the context of MCMC) and are compounded by high dimensionality of the posterior. We develop two-stage reversible jump MCMC that has the ability to screen the bad proposals in the first inexpensive stage. Numerical results are presented by analyzing simulated as well as real data from hydrocarbon reservoir. This article has supplementary material available online. © 2014 American Statistical Association and the American Society for Quality.

  15. A hierarchical bayesian model to quantify uncertainty of stream water temperature forecasts.

    Directory of Open Access Journals (Sweden)

    Guillaume Bal

    Full Text Available Providing generic and cost effective modelling approaches to reconstruct and forecast freshwater temperature using predictors as air temperature and water discharge is a prerequisite to understanding ecological processes underlying the impact of water temperature and of global warming on continental aquatic ecosystems. Using air temperature as a simple linear predictor of water temperature can lead to significant bias in forecasts as it does not disentangle seasonality and long term trends in the signal. Here, we develop an alternative approach based on hierarchical Bayesian statistical time series modelling of water temperature, air temperature and water discharge using seasonal sinusoidal periodic signals and time varying means and amplitudes. Fitting and forecasting performances of this approach are compared with that of simple linear regression between water and air temperatures using (i) an emotive simulated example, (ii) application to three French coastal streams with contrasting bio-geographical conditions and sizes. The time series modelling approach better fit data and does not exhibit forecasting bias in long term trends contrary to the linear regression. This new model also allows for more accurate forecasts of water temperature than linear regression together with a fair assessment of the uncertainty around forecasting. Warming of water temperature forecast by our hierarchical Bayesian model was slower and more uncertain than that expected with the classical regression approach. These new forecasts are in a form that is readily usable in further ecological analyses and will allow weighting of outcomes from different scenarios to manage climate change impacts on freshwater wildlife.

  16. A hierarchical bayesian model to quantify uncertainty of stream water temperature forecasts.

    Science.gov (United States)

    Bal, Guillaume; Rivot, Etienne; Baglinière, Jean-Luc; White, Jonathan; Prévost, Etienne

    2014-01-01

    Providing generic and cost effective modelling approaches to reconstruct and forecast freshwater temperature using predictors as air temperature and water discharge is a prerequisite to understanding ecological processes underlying the impact of water temperature and of global warming on continental aquatic ecosystems. Using air temperature as a simple linear predictor of water temperature can lead to significant bias in forecasts as it does not disentangle seasonality and long term trends in the signal. Here, we develop an alternative approach based on hierarchical Bayesian statistical time series modelling of water temperature, air temperature and water discharge using seasonal sinusoidal periodic signals and time varying means and amplitudes. Fitting and forecasting performances of this approach are compared with that of simple linear regression between water and air temperatures using i) an emotive simulated example, ii) application to three French coastal streams with contrasting bio-geographical conditions and sizes. The time series modelling approach better fit data and does not exhibit forecasting bias in long term trends contrary to the linear regression. This new model also allows for more accurate forecasts of water temperature than linear regression together with a fair assessment of the uncertainty around forecasting. Warming of water temperature forecast by our hierarchical Bayesian model was slower and more uncertain than that expected with the classical regression approach. These new forecasts are in a form that is readily usable in further ecological analyses and will allow weighting of outcomes from different scenarios to manage climate change impacts on freshwater wildlife.

  17. Hierarchical Bayesian Model for Simultaneous EEG Source and Forward Model Reconstruction (SOFOMORE)

    DEFF Research Database (Denmark)

    Stahlhut, Carsten; Mørup, Morten; Winther, Ole

    2009-01-01

    In this paper we propose an approach to handle forward model uncertainty for EEG source reconstruction. A stochastic forward model is motivated by the many uncertain contributions that form the forward propagation model, including the tissue conductivity distribution, the cortical surface, and electrode positions. We first present a hierarchical Bayesian framework for EEG source localization that jointly performs source and forward model reconstruction (SOFOMORE). Secondly, we evaluate the SOFOMORE model by comparison with source reconstruction methods that use fixed forward models. Simulated and real EEG data demonstrate that invoking a stochastic forward model leads to improved source estimates.

  18. Nutrient pathways and neural tube defects: a semi-Bayesian hierarchical analysis.

    Science.gov (United States)

    Carmichael, Suzan L; Witte, John S; Shaw, Gary M

    2009-01-01

    We used conventional and hierarchical logistic regression to examine the association of neural tube defects (NTDs) with intake of 26 nutrients that contribute to the mechanistic pathways of methylation, glycemic control, and oxidative stress, all of which have been implicated in NTD etiology. The hierarchical approach produces more plausible, more stable estimates than the conventional approach, while adjusting for potential confounding by other nutrients. Analyses included 386 cases and 408 nonmalformed controls with complete data on nutrients and potential confounders (race/ethnicity, education, obesity, and intake of vitamin supplements) from a population-based case-control study of deliveries in California from 1989 to 1991. Nutrients were specified as continuous, and their units were standardized to have a mean of zero and standard deviation (SD) of 1 for comparability of units across pathways. ORs reflect a 1-SD increase in the corresponding nutrient. Among women who took vitamin supplements, semi-Bayesian hierarchical modeling results suggested no associations between nutrient intake and NTDs. Among women who did not take supplements, both conventional and hierarchical models (HM) suggested an inverse association between lutein intake and NTD risk (HM odds ratio [OR] = 0.6; 95% confidence interval = 0.5-0.9) and a positive association with sucrose (HM OR 1.4; 1.1-1.8) and glycemic index (HM OR 1.3; 1.0-1.6). Our findings for lutein, glycemic index, and sucrose suggest that further study of NTDs and the glycemic control and oxidative stress pathways is warranted.

  19. Determining the Bayesian optimal sampling strategy in a hierarchical system.

    Energy Technology Data Exchange (ETDEWEB)

    Grace, Matthew D.; Ringland, James T.; Boggs, Paul T.; Pebay, Philippe Pierre

    2010-09-01

    Consider a classic hierarchy tree as a basic model of a 'system-of-systems' network, where each node represents a component system (which may itself consist of a set of sub-systems). For this general composite system, we present a technique for computing the optimal testing strategy, which is based on Bayesian decision analysis. In previous work, we developed a Bayesian approach for computing the distribution of the reliability of a system-of-systems structure that uses test data and prior information. This allows for the determination of both an estimate of the reliability and a quantification of confidence in the estimate. Improving the accuracy of the reliability estimate and increasing the corresponding confidence require the collection of additional data. However, testing all possible sub-systems may not be cost-effective, feasible, or even necessary to achieve an improvement in the reliability estimate. To address this sampling issue, we formulate a Bayesian methodology that systematically determines the optimal sampling strategy under specified constraints and costs that will maximally improve the reliability estimate of the composite system, e.g., by reducing the variance of the reliability distribution. This methodology involves calculating the 'Bayes risk of a decision rule' for each available sampling strategy, where risk quantifies the relative effect that each sampling strategy could have on the reliability estimate. A general numerical algorithm is developed and tested using an example multicomponent system. The results show that the procedure scales linearly with the number of components available for testing.
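
    As an illustrative aside (not the report's algorithm), the following Python sketch scores candidate sampling strategies in the spirit described above: for a two-component series system with Beta priors on component reliabilities, the expected posterior variance of the system reliability is estimated by preposterior Monte Carlo for each way of splitting a fixed test budget. Priors, budget and all names are assumptions made here for the example.

      # Illustrative sketch: preposterior Monte Carlo comparison of test-allocation
      # strategies for a two-component series system. Priors and budget are assumed.
      import numpy as np

      rng = np.random.default_rng(0)
      prior = {"A": (8, 2), "B": (4, 2)}        # Beta(alpha, beta) priors on reliabilities
      budget = 10                               # total number of tests available

      def expected_posterior_var(n_a, n_b, n_sim=1000, n_post=1000):
          """Average posterior variance of system reliability R = R_A * R_B,
          averaged over data sets simulated from the prior predictive."""
          out = np.empty(n_sim)
          for i in range(n_sim):
              ra = rng.beta(*prior["A"])
              rb = rng.beta(*prior["B"])
              xa = rng.binomial(n_a, ra) if n_a else 0
              xb = rng.binomial(n_b, rb) if n_b else 0
              post_a = rng.beta(prior["A"][0] + xa, prior["A"][1] + n_a - xa, n_post)
              post_b = rng.beta(prior["B"][0] + xb, prior["B"][1] + n_b - xb, n_post)
              out[i] = np.var(post_a * post_b)
          return out.mean()

      for n_a in range(0, budget + 1, 2):       # candidate strategies: split the budget
          risk = expected_posterior_var(n_a, budget - n_a)
          print(f"tests on A: {n_a:2d}, on B: {budget - n_a:2d} -> expected posterior var {risk:.5f}")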

  20. Comparison of Bayesian and frequentist approaches

    OpenAIRE

    Ageyeva, Anna

    2010-01-01

    The thesis deals with the Bayesian approach to statistics and its comparison to the frequentist approach. The main aim of the thesis is to compare the frequentist and Bayesian approaches to statistics by analyzing statistical inferences and examining the question of subjectivity and objectivity in statistics. Another goal of the thesis is to draw attention to the importance and necessity of teaching Bayesian statistics at our University in more depth. The thesis includes three chapters. The first chapter prese...

  1. Modelling the dynamics of an experimental host-pathogen microcosm within a hierarchical Bayesian framework.

    Directory of Open Access Journals (Sweden)

    David Lunn

    Full Text Available The advantages of Bayesian statistical approaches, such as flexibility and the ability to acknowledge uncertainty in all parameters, have made them the prevailing method for analysing the spread of infectious diseases in human or animal populations. We introduce a Bayesian approach to experimental host-pathogen systems that shares these attractive features. Since uncertainty in all parameters is acknowledged, existing information can be accounted for through prior distributions, rather than through fixing some parameter values. The non-linear dynamics, multi-factorial design, multiple measurements of responses over time and sampling error that are typical features of experimental host-pathogen systems can also be naturally incorporated. We analyse the dynamics of the free-living protozoan Paramecium caudatum and its specialist bacterial parasite Holospora undulata. Our analysis provides strong evidence for a saturable infection function, and we were able to reproduce the two waves of infection apparent in the data by separating the initial inoculum from the parasites released after the first cycle of infection. In addition, the parameter estimates from the hierarchical model can be combined to infer variations in the parasite's basic reproductive ratio across experimental groups, enabling us to make predictions about the effect of resources and host genotype on the ability of the parasite to spread. Even though the high level of variability between replicates limited the resolution of the results, this Bayesian framework has strong potential to be used more widely in experimental ecology.
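
    As an illustrative aside (not the authors' model or code), the sketch below integrates a toy host-pathogen system with the saturable infection function that the abstract identifies as the best-supported form. Parameter values and the remaining dynamics are assumptions for demonstration only, and the hierarchical Bayesian layer (priors and replicate-level effects) is not shown.

      # Illustrative sketch: susceptible hosts S, infected hosts I and free-living
      # parasites P with a saturable infection function beta*S*P/(h + P).
      # Parameter values are assumptions for the example.
      import numpy as np
      from scipy.integrate import solve_ivp

      def host_pathogen(t, y, r, K, beta, h, sigma, m):
          S, I, P = y
          infection = beta * S * P / (h + P)          # saturable infection function
          dS = r * S * (1 - (S + I) / K) - infection  # logistic host growth minus infection
          dI = infection - m * I                      # infected hosts die at rate m
          dP = sigma * m * I - 0.5 * P - infection    # parasites released on host death
          return [dS, dI, dP]

      pars = dict(r=1.2, K=1000.0, beta=0.8, h=200.0, sigma=50.0, m=0.3)
      sol = solve_ivp(host_pathogen, (0, 60), [500.0, 0.0, 100.0],
                      args=tuple(pars.values()), dense_output=True)

      t = np.linspace(0, 60, 7)
      S, I, P = sol.sol(t)
      for ti, ii in zip(t, I):
          print(f"day {ti:4.0f}: infected hosts ~ {ii:8.1f}")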

  2. A Bayesian Hierarchical Model for Reconstructing Sea Levels: From Raw Data to Rates of Change

    CERN Document Server

    Cahill, Niamh; Horton, Benjamin P; Parnell, Andrew C

    2015-01-01

    We present a holistic Bayesian hierarchical model for reconstructing the continuous and dynamic evolution of relative sea-level (RSL) change with fully quantified uncertainty. The reconstruction is produced from biological (foraminifera) and geochemical (δ13C) sea-level indicators preserved in dated cores of salt-marsh sediment. Our model comprises three modules: (1) a Bayesian transfer function for the calibration of foraminifera into tidal elevation, which is flexible enough to formally accommodate additional proxies (in this case bulk-sediment δ13C values); (2) a chronology developed from an existing Bchron age-depth model, and (3) an existing errors-in-variables integrated Gaussian process (EIV-IGP) model for estimating rates of sea-level change. We illustrate our approach using a case study of Common Era sea-level variability from New Jersey, U.S.A. We develop a new Bayesian transfer function (B-TF), with and without the δ13C proxy and compare our results to those from a widely...

  3. Comparison between Fisherian and Bayesian approach to ...

    African Journals Online (AJOL)

    ... the Bayesian approach assigns an observed unit to the group with the greatest posterior probability. Fisher's linear discriminant analysis, though the most widely used method of classification because of its simplicity and optimality properties, is normally used for two-group cases. However, the Bayesian approach is found to ...

  4. A Bayesian Hierarchical Modeling Scheme for Estimating Erosion Rates Under Current Climate Conditions

    Science.gov (United States)

    Lowman, L.; Barros, A. P.

    2014-12-01

    Computational modeling of surface erosion processes is inherently difficult because of the four-dimensional nature of the problem and the multiple temporal and spatial scales that govern individual mechanisms. Landscapes are modified via surface and fluvial erosion and exhumation, each of which takes place over a range of time scales. Traditional field measurements of erosion/exhumation rates are scale dependent, often valid for a single point-wise location or averaging over large areal extents and periods with intense and mild erosion. We present a method of remotely estimating erosion rates using a Bayesian hierarchical model based upon the stream power erosion law (SPEL). A Bayesian approach allows for estimating erosion rates using the deterministic relationship given by the SPEL and data on channel slopes and precipitation at the basin and sub-basin scale. The spatial scale associated with this framework is the elevation class, where each class is characterized by distinct morphologic behavior observed through different modes in the distribution of basin outlet elevations. Interestingly, the distributions of first-order outlets are similar in shape and extent to the distribution of precipitation events (i.e. individual storms) over a 14-year period between 1998 and 2011. We demonstrate an application of the Bayesian hierarchical modeling framework for five basins and one intermontane basin located in the central Andes between 5°S and 20°S. Using remotely sensed data of current annual precipitation rates from the Tropical Rainfall Measuring Mission (TRMM) and topography from a high resolution (3 arc-seconds) digital elevation map (DEM), our erosion rate estimates are consistent with decadal-scale estimates based on landslide mapping and sediment flux observations and 1-2 orders of magnitude larger than most millennial and million year timescale estimates from thermochronology and cosmogenic nuclides.
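
    As an illustrative aside (not the authors' framework), the sketch below fits the log-transformed stream power erosion law, ln E = ln K + m ln P + n ln S, by conjugate Bayesian linear regression on synthetic data. The hierarchical grouping by elevation class described in the abstract would add a further level of random coefficients and is not shown; data and hyperparameters are assumptions.

      # Illustrative sketch: Bayesian linear regression for the log-transformed stream
      # power erosion law ln E = ln K + m*ln P + n*ln S, with a Gaussian prior on the
      # coefficients and known observation noise. Data are synthetic.
      import numpy as np

      rng = np.random.default_rng(1)
      n = 200
      lnP = rng.normal(7.0, 0.5, n)              # log annual precipitation
      lnS = rng.normal(-2.0, 0.7, n)             # log channel slope
      true = np.array([-9.0, 1.2, 0.9])          # ln K, m, n used to simulate "data"
      sigma = 0.4
      lnE = true[0] + true[1] * lnP + true[2] * lnS + rng.normal(0, sigma, n)

      X = np.column_stack([np.ones(n), lnP, lnS])
      tau2 = 10.0 ** 2                           # weak Gaussian prior variance
      post_cov = np.linalg.inv(X.T @ X / sigma ** 2 + np.eye(3) / tau2)
      post_mean = post_cov @ (X.T @ lnE) / sigma ** 2

      for name, mu, sd in zip(["ln K", "m", "n"], post_mean, np.sqrt(np.diag(post_cov))):
          print(f"{name}: posterior mean {mu:6.2f}, sd {sd:5.2f}")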

  5. A full-capture Hierarchical Bayesian model of Pollock's Closed Robust Design and application to dolphins

    Directory of Open Access Journals (Sweden)

    Robert William Rankin

    2016-03-01

    Full Text Available We present a Hierarchical Bayesian version of Pollock's Closed Robust Design for studying the survival, temporary-migration, and abundance of marked animals. Through simulations and analyses of a bottlenose dolphin photo-identification dataset, we compare several estimation frameworks, including Maximum Likelihood estimation (ML), model-averaging by AICc, as well as Bayesian and Hierarchical Bayesian (HB) procedures. Our results demonstrate a number of advantages of the Bayesian framework over other popular methods. First, for simple fixed-effect models, we show the near-equivalence of Bayesian and ML point-estimates and confidence/credibility intervals. Second, we demonstrate how there is an inherent correlation among temporary-migration and survival parameter estimates in the PCRD, and while this can lead to serious convergence issues and singularities among MLEs, we show that the Bayesian estimates were more reliable. Third, we demonstrate that a Hierarchical Bayesian model with carefully thought-out hyperpriors can lead to similar parameter estimates and conclusions as multi-model inference by AICc model-averaging. This latter point is especially interesting for mark-recapture practitioners, for whom model-uncertainty and multi-model inference have become a major preoccupation. Lastly, we extend the Hierarchical Bayesian PCRD to include full-capture histories (i.e., by modelling a recruitment process) and individual-level heterogeneity in detection probabilities, which can have important consequences for the range of phenomena studied by the PCRD, as well as lead to large differences in abundance estimates. For example, we estimate 8%-24% more bottlenose dolphins in the western gulf of Shark Bay than previously estimated by ML and AICc-based model-averaging. Other important extensions are discussed. Our Bayesian PCRD models are written in the BUGS-like JAGS language for easy dissemination and customization by the community of capture

  6. Particle identification in ALICE: a Bayesian approach

    NARCIS (Netherlands)

    Adam, J.; Adamova, D.; Aggarwal, M. M.; Rinella, G. Aglieri; Agnello, M.; Agrawal, N.; Ahammed, Z.; Ahn, S. U.; Aiola, S.; Akindinov, A.; Alam, S. N.; Albuquerque, D. S. D.; Aleksandrov, D.; Alessandro, B.; Alexandre, D.; Alfaro Molina, R.; Alici, A.; Alkin, A.; Almaraz, J. R. M.; Alme, J.; Alt, T.; Altinpinar, S.; Altsybeev, I.; Alves Garcia Prado, C.; Andrei, C.; Andronic, A.; Anguelov, V.; Anticic, T.; Antinori, F.; Antonioli, P.; Aphecetche, L.; Appelshaeuser, H.; Arcelli, S.; Arnaldi, R.; Arnold, O. W.; Arsene, I. C.; Arslandok, M.; Audurier, B.; Augustinus, A.; Averbeck, R.; Azmi, M. D.; Badala, A.; Baek, Y. W.; Bagnasco, S.; Bailhache, R.; Bala, R.; Balasubramanian, S.; Baldisseri, A.; Baral, R. C.; Barbano, A. M.; Barbera, R.; Barile, F.; Barnafoeldi, G. G.; Barnby, L. S.; Barret, V.; Bartalini, P.; Barth, K.; Bartke, J.; Bartsch, E.; Basile, M.; Bastid, N.; Bathen, B.; Batigne, G.; Camejo, A. Batista; Batyunya, B.; Batzing, P. C.; Bearden, I. G.; Beck, H.; Bedda, C.; Behera, N. K.; Belikov, I.; Bellini, F.; Bello Martinez, H.; Bellwied, R.; Belmont, R.; Belmont-Moreno, E.; Belyaev, V.; Benacek, P.; Bencedi, G.; Beole, S.; Berceanu, I.; Bercuci, A.; Berdnikov, Y.; Berenyi, D.; Bertens, R. A.; Berzano, D.; Betev, L.; Bhasin, A.; Bhat, I. R.; Bhati, A. K.; Bhattacharjee, B.; Bhom, J.; Bianchi, L.; Bianchi, N.; Bianchin, C.; Bielcik, J.; Bielcikova, J.; Bilandzic, A.; Biro, G.; Biswas, R.; Biswas, S.; Bjelogrlic, S.; Blair, J. T.; Blau, D.; Blume, C.; Bock, F.; Bogdanov, A.; Boggild, H.; Boldizsar, L.; Bombara, M.; Book, J.; Borel, H.; Borissov, A.; Borri, M.; Bossu, F.; Botta, E.; Bourjau, C.; Braun-Munzinger, P.; Bregant, M.; Breitner, T.; Broker, T. A.; Browning, T. A.; Broz, M.; Brucken, E. J.; Bruna, E.; Bruno, G. E.; Budnikov, D.; Buesching, H.; Bufalino, S.; Buncic, P.; Busch, O.; Buthelezi, Z.; Butt, J. B.; Buxton, J. T.; Cabala, J.; Caffarri, D.; Cai, X.; Caines, H.; Diaz, L. Calero; Caliva, A.; Calvo Villar, E.; Camerini, P.; Carena, F.; Carena, W.; Carnesecchi, F.; Castellanos, J. Castillo; Castro, A. J.; Casula, E. A. R.; Sanchez, C. Ceballos; Cepila, J.; Cerello, P.; Cerkala, J.; Chang, B.; Chapeland, S.; Chartier, M.; Charvet, J. L.; Chattopadhyay, S.; Chattopadhyay, S.; Chauvin, A.; Chelnokov, V.; Cherney, M.; Cheshkov, C.; Cheynis, B.; Barroso, V. Chibante; Chinellato, D. D.; Cho, S.; Chochula, P.; Choi, K.; Chojnacki, M.; Choudhury, S.; Christakoglou, P.; Christensen, C. H.; Christiansen, P.; Chujo, T.; Cicalo, C.; Cifarelli, L.; Cindolo, F.; Cleymans, J.; Colamaria, F.; Colella, D.; Collu, A.; Colocci, M.; Balbastre, G. Conesa; del Valle, Z. Conesa; Connors, M. E.; Contreras, J. G.; Cormier, T. M.; Morales, Y. Corrales; Cortes Maldonado, I.; Cortese, P.; Cosentino, M. R.; Costa, F.; Crochet, P.; Cruz Albino, R.; Cuautle, E.; Cunqueiro, L.; Dahms, T.; Dainese, A.; Danisch, M. C.; Danu, A.; Das, I.; Das, S.; Dash, A.; Dash, S.; De, S.; De Caro, A.; de Cataldo, G.; de Conti, C.; de Cuveland, J.; De Falco, A.; De Gruttola, D.; De Marco, N.; De Pasquale, S.; Deisting, A.; Deloff, A.; Denes, E.; Deplano, C.; Dhankher, P.; Di Bari, D.; Di Mauro, A.; Di Nezza, P.; Corchero, M. A. Diaz; Dietel, T.; Dillenseger, P.; Divia, R.; Djuvsland, O.; Dobrin, A.; Gimenez, D. Domenicis; Doenigus, B.; Dordic, O.; Drozhzhova, T.; Dubey, A. K.; Dubla, A.; Ducroux, L.; Dupieux, P.; Ehlers, R. 
J.; Elia, D.; Endress, E.; Engel, H.; Epple, E.; Erazmus, B.; Erdemir, I.; Erhardt, F.; Espagnon, B.; Estienne, M.; Esumi, S.; Eum, J.; Evans, D.; Evdokimov, S.; Eyyubova, G.; Fabbietti, L.; Fabris, D.; Faivre, J.; Fantoni, A.; Fasel, M.; Feldkamp, L.; Feliciello, A.; Feofilov, G.; Ferencei, J.; Fernandez Tellez, A.; Ferreiro, E. G.; Ferretti, A.; Festanti, A.; Feuillard, V. J. G.; Figiel, J.; Figueredo, M. A. S.; Filchagin, S.; Finogeev, D.; Fionda, F. M.; Fiore, E. M.; Fleck, M. G.; Floris, M.; Foertsch, S.; Foka, P.; Fokin, S.; Fragiacomo, E.; Francescon, A.; Frankenfeld, U.; Fronze, G. G.; Fuchs, U.; Furget, C.; Furs, A.; Girard, M. Fusco; Gaardhoje, J. J.; Gagliardi, M.; Gago, A. M.; Gallio, M.; Gangadharan, D. R.; Ganoti, P.; Gao, C.; Garabatos, C.; Garcia-Solis, E.; Gargiulo, C.; Gasik, P.; Gauger, E. F.; Germain, M.; Gheata, A.; Gheata, M.; Gianotti, P.; Giubellino, P.; Giubilato, P.; Gladysz-Dziadus, E.; Glaessel, P.; Gomez Coral, D. M.; Ramirez, A. Gomez; Gonzalez, A. S.; Gonzalez, V.; Gonzalez-Zamora, P.; Gorbunov, S.; Goerlich, L.; Gotovac, S.; Grabski, V.; Grachov, O. A.; Graczykowski, L. K.; Graham, K. L.; Grelli, A.; Grigoras, A.; Grigoras, C.; Grigoriev, V.; Grigoryan, A.; Grigoryan, S.; Grinyov, B.; Grion, N.; Gronefeld, J. M.; Grosse-Oetringhaus, J. F.; Grosso, R.; Guber, F.; Guernane, R.; Guerzoni, B.; Gulbrandsen, K.; Gunji, T.; Gupta, A.; Haake, R.; Haaland, O.; Hadjidakis, C.; Haiduc, M.; Hamagaki, H.; Hamar, G.; Hamon, J. C.; Harris, J. W.; Harton, A.; Hatzifotiadou, D.; Hayashi, S.; Heckel, S. T.; Hellbaer, E.; Helstrup, H.; Herghelegiu, A.; Herrera Corral, G.; Hess, B. A.; Hetland, K. F.; Hillemanns, H.; Hippolyte, B.; Horak, D.; Hosokawa, R.; Hristov, P.; Humanic, T. J.; Hussain, N.; Hussain, T.; Hutter, D.; Hwang, D. S.; Ilkaev, R.; Inaba, M.; Incani, E.; Ippolitov, M.; Irfan, M.; Ivanov, M.; Ivanov, V.; Izucheev, V.; Jacazio, N.; Jadhav, M. B.; Jadlovska, S.; Jadlovsky, J.; Jahnke, C.; Jakubowska, M. J.; Jang, H. J.; Janik, M. A.; Jayarathna, P. H. S. Y.; Jena, C.; Jena, S.; Bustamante, R. T. Jimenez; Jones, P. G.; Jusko, A.; Kalinak, P.; Kalweit, A.; Kamin, J.; Kaplin, V.; Kar, S.; Uysal, A. Karasu; Karavichev, O.; Karavicheva, T.; Karayan, L.; Karpechev, E.; Kebschull, U.; Keidel, R.; Keijdener, D. L. D.; Keil, M.; Khan, M. Mohisin; Khan, P.; Khan, S. A.; Khanzadeev, A.; Kharlov, Y.; Kileng, B.; Kim, D. W.; Kim, D. J.; Kim, D.; Kim, J. S.; Kim, M.; Kim, T.; Kirsch, S.; Kisel, I.; Kiselev, S.; Kisiel, A.; Kiss, G.; Klay, J. L.; Klein, C.; Klein-Boesing, C.; Klewin, S.; Kluge, A.; Knichel, M. L.; Knospe, A. G.; Kobdaj, C.; Kofarago, M.; Kollegger, T.; Kolojvari, A.; Kondratiev, V.; Kondratyeva, N.; Kondratyuk, E.; Konevskikh, A.; Kopcik, M.; Kostarakis, P.; Kour, M.; Kouzinopoulos, C.; Kovalenko, O.; Kovalenko, V.; Kowalski, M.; Meethaleveedu, G. Koyithatta; Kralik, I.; Kravcakova, A.; Krivda, M.; Krizek, F.; Kryshen, E.; Krzewicki, M.; Kubera, A. M.; Kucera, V.; Kuijer, P. G.; Kumar, J.; Kumar, L.; Kumar, S.; Kurashvili, P.; Kurepin, A.; Kurepin, A. B.; Kuryakin, A.; Kweon, M. J.; Kwon, Y.; La Pointe, S. L.; La Rocca, P.; Ladron de Guevara, P.; Lagana Fernandes, C.; Lakomov, I.; Langoy, R.; Lara, C.; Lardeux, A.; Lattuca, A.; Laudi, E.; Lea, R.; Leardini, L.; Lee, G. R.; Lee, S.; Lehas, F.; Lemmon, R. C.; Lenti, V.; Leogrande, E.; Monzon, I. Leon; Leon Vargas, H.; Leoncino, M.; Levai, P.; Lien, J.; Lietava, R.; Lindal, S.; Lindenstruth, V.; Lippmann, C.; Lisa, M. A.; Ljunggren, H. M.; Lodato, D. F.; Loenne, P. 
I.; Loginov, V.; Loizides, C.; Lopez, X.; Torres, E. Lopez; Lowe, A.; Luettig, P.; Lunardon, M.; Luparello, G.; Lutz, T. H.; Maevskaya, A.; Mager, M.; Mahajan, S.; Mahmood, S. M.; Maire, A.; Majka, R. D.; Malaev, M.; Maldonado Cervantes, I.; Malinina, L.; Mal'Kevich, D.; Malzacher, P.; Mamonov, A.; Manko, V.; Manso, F.; Manzari, V.; Marchisone, M.; Mares, J.; Margagliotti, G. V.; Margotti, A.; Margutti, J.; Marin, A.; Markert, C.; Marquard, M.; Martin, N. A.; Blanco, J. Martin; Martinengo, P.; Martinez, M. I.; Garcia, G. Martinez; Pedreira, M. Martinez; Mas, A.; Masciocchi, S.; Masera, M.; Masoni, A.; Mastroserio, A.; Matyja, A.; Mayer, C.; Mazer, J.; Mazzoni, M. A.; Mcdonald, D.; Meddi, F.; Melikyan, Y.; Menchaca-Rocha, A.; Meninno, E.; Perez, J. Mercado; Meres, M.; Miake, Y.; Mieskolainen, M. M.; Mikhaylov, K.; Milano, L.; Milosevic, J.; Mischke, A.; Mishra, A. N.; Miskowiec, D.; Mitra, J.; Mitu, C. M.; Mohammadi, N.; Mohanty, B.; Molnar, L.; Montano Zetina, L.; Montes, E.; De Godoy, D. A. Moreira; Moreno, L. A. P.; Moretto, S.; Morreale, A.; Morsch, A.; Muccifora, V.; Mudnic, E.; Muehlheim, D.; Muhuri, S.; Mukherjee, M.; Mulligan, J. D.; Munhoz, M. G.; Munzer, R. H.; Murakami, H.; Murray, S.; Musa, L.; Musinsky, J.; Naik, B.; Nair, R.; Nandi, B. K.; Nania, R.; Nappi, E.; Naru, M. U.; Natal da Luz, H.; Nattrass, C.; Navarro, S. R.; Nayak, K.; Nayak, R.; Nayak, T. K.; Nazarenko, S.; Nedosekin, A.; Nellen, L.; Ng, F.; Nicassio, M.; Niculescu, M.; Niedziela, J.; Nielsen, B. S.; Nikolaev, S.; Nikulin, S.; Nikulin, V.; Noferini, F.; Nomokonov, P.; Nooren, G.; Noris, J. C. C.; Norman, J.; Nyanin, A.; Nystrand, J.; Oeschler, H.; Oh, S.; Oh, S. K.; Ohlson, A.; Okatan, A.; Okubo, T.; Olah, L.; Oleniacz, J.; Oliveira Da Silva, A. C.; Oliver, M. H.; Onderwaater, J.; Oppedisano, C.; Orava, R.; Oravec, M.; Ortiz Velasquez, A.; Oskarsson, A.; Otwinowski, J.; Oyama, K.; Ozdemir, M.; Pachmayer, Y.; Pagano, D.; Pagano, P.; Paic, G.; Pal, S. K.; Pan, J.; Papikyan, V.; Pappalardo, G. S.; Pareek, P.; Park, W. J.; Parmar, S.; Passfeld, A.; Paticchio, V.; Patra, R. N.; Paul, B.; Pei, H.; Peitzmann, T.; Da Costa, H. Pereira; Peresunko, D.; Lara, C. E. Perez; Lezama, E. Perez; Peskov, V.; Pestov, Y.; Petracek, V.; Petrov, V.; Petrovici, M.; Petta, C.; Piano, S.; Pikna, M.; Pillot, P.; Pimentel, L. O. D. L.; Pinazza, O.; Pinsky, L.; Piyarathna, D. B.; Ploskon, M.; Planinic, M.; Pluta, J.; Pochybova, S.; Podesta-Lerma, P. L. M.; Poghosyan, M. G.; Polichtchouk, B.; Poljak, N.; Poonsawat, W.; Pop, A.; Porteboeuf-Houssais, S.; Porter, J.; Pospisil, J.; Prasad, S. K.; Preghenella, R.; Prino, F.; Pruneau, C. A.; Pshenichnov, I.; Puccio, M.; Puddu, G.; Pujahari, P.; Punin, V.; Putschke, J.; Qvigstad, H.; Rachevski, A.; Raha, S.; Rajput, S.; Rak, J.; Rakotozafindrabe, A.; Ramello, L.; Rami, F.; Raniwala, R.; Raniwala, S.; Raesaenen, S. S.; Rascanu, B. T.; Rathee, D.; Read, K. F.; Redlich, K.; Reed, R. J.; Reichelt, P.; Reidt, F.; Ren, X.; Renfordt, R.; Reolon, A. R.; Reshetin, A.; Reygers, K.; Riabov, V.; Ricci, R. A.; Richert, T.; Richter, M.; Riedler, P.; Riegler, W.; Riggi, F.; Ristea, C.; Rocco, E.; Rodriguez Cahuantzi, M.; Manso, A. Rodriguez; Roed, K.; Rogochaya, E.; Rohr, D.; Roehrich, D.; Ronchetti, F.; Ronflette, L.; Rosnet, P.; Rossi, A.; Roukoutakis, F.; Roy, A.; Roy, C.; Roy, P.; Montero, A. J. Rubio; Rui, R.; Russo, R.; Ryabinkin, E.; Ryabov, Y.; Rybicki, A.; Saarinen, S.; Sadhu, S.; Sadovsky, S.; Safarik, K.; Sahlmuller, B.; Sahoo, P.; Sahoo, R.; Sahoo, S.; Sahu, P. K.; Saini, J.; Sakai, S.; Saleh, M. 
A.; Salzwedel, J.; Sambyal, S.; Samsonov, V.; Sandor, L.; Sandoval, A.; Sano, M.; Sarkar, D.; Sarkar, N.; Sarma, P.; Scapparone, E.; Scarlassara, F.; Schiaua, C.; Schicker, R.; Schmidt, C.; Schmidt, H. R.; Schuchmann, S.; Schukraft, J.; Schulc, M.; Schutz, Y.; Schwarz, K.; Schweda, K.; Scioli, G.; Scomparin, E.; Scott, R.; Sefcik, M.; Seger, J. E.; Sekiguchi, Y.; Sekihata, D.; Selyuzhenkov, I.; Senosi, K.; Senyukov, S.; Serradilla, E.; Sevcenco, A.; Shabanov, A.; Shabetai, A.; Shadura, O.; Shahoyan, R.; Shahzad, M. I.; Shangaraev, A.; Sharma, M.; Sharma, M.; Sharma, N.; Sheikh, A. I.; Shigaki, K.; Shou, Q.; Shtejer, K.; Sibiriak, Y.; Siddhanta, S.; Sielewicz, K. M.; Siemiarczuk, T.; Silvermyr, D.; Silvestre, C.; Simatovic, G.; Simonetti, G.; Singaraju, R.; Singh, R.; Singha, S.; Singhal, V.; Sinha, B. C.; Sinha, T.; Sitar, B.; Sitta, M.; Skaali, T. B.; Slupecki, M.; Smirnov, N.; Snellings, R. J. M.; Snellman, T. W.; Song, J.; Song, M.; Song, Z.; Soramel, F.; Sorensen, S.; de Souza, R. D.; Sozzi, F.; Spacek, M.; Spiriti, E.; Sputowska, I.; Spyropoulou-Stassinaki, M.; Stachel, J.; Stan, I.; Stankus, P.; Stenlund, E.; Steyn, G.; Stiller, J. H.; Stocco, D.; Strmen, P.; Suaide, A. A. P.; Sugitate, T.; Suire, C.; Suleymanov, M.; Suljic, M.; Sultanov, R.; Sumbera, M.; Sumowidagdo, S.; Szabo, A.; Szanto de Toledo, A.; Szarka, I.; Szczepankiewicz, A.; Szymanski, M.; Tabassam, U.; Takahashi, J.; Tambave, G. J.; Tanaka, N.; Tarhini, M.; Tariq, M.; Tarzila, M. G.; Tauro, A.; Tejeda Munoz, G.; Telesca, A.; Terasaki, K.; Terrevoli, C.; Teyssier, B.; Thaeder, J.; Thakur, D.; Thomas, D.; Tieulent, R.; Timmins, A. R.; Toia, A.; Trogolo, S.; Trombetta, G.; Trubnikov, V.; Trzaska, W. H.; Tsuji, T.; Tumkin, A.; Turrisi, R.; Tveter, T. S.; Ullaland, K.; Uras, A.; Usai, G. L.; Utrobicic, A.; Vala, M.; Palomo, L. Valencia; Vallero, S.; Van Der Maarel, J.; Van Hoorne, J. W.; van Leeuwen, M.; Vanat, T.; Vyvre, P. Vande; Varga, D.; Vargas, A.; Vargyas, M.; Varma, R.; Vasileiou, M.; Vasiliev, A.; Vauthier, A.; Vechernin, V.; Veen, A. M.; Veldhoen, M.; Velure, A.; Vercellin, E.; Vergara Limon, S.; Vernet, R.; Verweij, M.; Vickovic, L.; Viesti, G.; Viinikainen, J.; Vilakazi, Z.; Baillie, O. Villalobos; Villatoro Tello, A.; Vinogradov, A.; Vinogradov, L.; Vinogradov, Y.; Virgili, T.; Vislavicius, V.; Viyogi, Y. P.; Vodopyanov, A.; Voelkl, M. A.; Voloshin, K.; Voloshin, S. A.; Volpe, G.; von Haller, B.; Vorobyev, I.; Vranic, D.; Vrlakova, J.; Vulpescu, B.; Wagner, B.; Wagner, J.; Wang, H.; Watanabe, D.; Watanabe, Y.; Weiser, D. F.; Westerhoff, U.; Whitehead, A. M.; Wiechula, J.; Wikne, J.; Wilk, G.; Wilkinson, J.; Williams, M. C. S.; Windelband, B.; Winn, M.; Yang, H.; Yano, S.; Yasin, Z.; Yokoyama, H.; Yoo, I. -K.; Yoon, J. H.; Yurchenko, V.; Yushmanov, I.; Zaborowska, A.; Zaccolo, V.; Zaman, A.; Zampolli, C.; Zanoli, H. J. C.; Zaporozhets, S.; Zardoshti, N.; Zarochentsev, A.; Zavada, P.; Zaviyalov, N.; Zbroszczyk, H.; Zgura, I. S.; Zhalov, M.; Zhang, C.; Zhao, C.; Zhigareva, N.; Zhou, Y.; Zhou, Z.; Zhu, H.; Zichichi, A.; Zimmermann, A.; Zimmermann, M. B.; Zinovjev, G.; Zyzak, M.; Collaboration, ALICE

    2016-01-01

    We present a Bayesian approach to particle identification (PID) within the ALICE experiment. The aim is to more effectively combine the particle identification capabilities of its various detectors. After a brief explanation of the adopted methodology and formalism, the performance of the Bayesian

  7. Sparse Event Modeling with Hierarchical Bayesian Kernel Methods

    Science.gov (United States)

    2016-01-05

    Beyond fitting data, it is equally important to analyze the prediction power of a statistical model if it is going to be used for forecasting purposes. The reported work is associated with the manuscript "Poisson Bayesian Kernel Methods for Modeling Count Data" (Computational Statistics and Data Analysis, 2016). Conclusions: Bayesian kernel methods are powerful tools for forecasting data.

  8. Hierarchical structure of the Sicilian goats revealed by Bayesian analyses of microsatellite information.

    Science.gov (United States)

    Siwek, M; Finocchiaro, R; Curik, I; Portolano, B

    2011-02-01

    Genetic structure and relationship amongst the main goat populations in Sicily (Girgentana, Derivata di Siria, Maltese and Messinese) were analysed using information from 19 microsatellite markers genotyped on 173 individuals. A posterior Bayesian approach implemented in the program STRUCTURE revealed a hierarchical structure with two clusters at the first level (Girgentana vs. Messinese, Derivata di Siria and Maltese), explaining 4.8% of variation (AMOVA Φ(ST) estimate). Seven clusters nested within these first two clusters (further differentiations of Girgentana, Derivata di Siria and Maltese), explaining 8.5% of variation (AMOVA Φ(SC) estimate). The analyses and methods applied in this study indicate their power to detect subtle population structure. © 2010 The Authors, Animal Genetics © 2010 Stichting International Foundation for Animal Genetics.

  9. A Hierarchical Bayesian M/EEG Imaging Method Correcting for Incomplete Spatio-Temporal Priors

    DEFF Research Database (Denmark)

    Stahlhut, Carsten; Attias, Hagai T.; Sekihara, Kensuke

    2013-01-01

    In this paper we present a hierarchical Bayesian model to tackle the highly ill-posed problem that follows with MEG and EEG source imaging. Our model promotes spatiotemporal patterns through the use of both spatial and temporal basis functions. While in contrast to most previous spatio-temporal ...

  10. Technical Note: Probabilistically constraining proxy age–depth models within a Bayesian hierarchical reconstruction model

    Directory of Open Access Journals (Sweden)

    J. P. Werner

    2015-03-01

    Full Text Available Reconstructions of the late-Holocene climate rely heavily upon proxies that are assumed to be accurately dated by layer counting, such as measurements of tree rings, ice cores, and varved lake sediments. Considerable advances could be achieved if time-uncertain proxies were able to be included within these multiproxy reconstructions, and if time uncertainties were recognized and correctly modeled for proxies commonly treated as free of age model errors. Current approaches for accounting for time uncertainty are generally limited to repeating the reconstruction using each one of an ensemble of age models, thereby inflating the final estimated uncertainty – in effect, each possible age model is given equal weighting. Uncertainties can be reduced by exploiting the inferred space–time covariance structure of the climate to re-weight the possible age models. Here, we demonstrate how Bayesian hierarchical climate reconstruction models can be augmented to account for time-uncertain proxies. Critically, although a priori all age models are given equal probability of being correct, the probabilities associated with the age models are formally updated within the Bayesian framework, thereby reducing uncertainties. Numerical experiments show that updating the age model probabilities decreases uncertainty in the resulting reconstructions, as compared with the current de facto standard of sampling over all age models, provided there is sufficient information from other data sources in the spatial region of the time-uncertain proxy. This approach can readily be generalized to non-layer-counted proxies, such as those derived from marine sediments.
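
    As an illustrative aside (not the paper's model), the following sketch shows the core idea of formally updating age-model probabilities: an ensemble of candidate age models (here, simple time shifts) starts with equal prior weight, and each weight is updated by the likelihood of the shifted proxy against an independent, well-dated reference series. Series, error model and shift range are synthetic assumptions.

      # Illustrative sketch: Bayesian re-weighting of candidate age models for a
      # time-uncertain proxy, using a well-dated reference series. Data are synthetic.
      import numpy as np

      rng = np.random.default_rng(3)
      T = 300
      climate = np.cumsum(rng.normal(0, 0.3, T))             # "true" climate signal
      reference = climate + rng.normal(0, 0.4, T)            # well-dated proxy
      true_lag = 7                                           # time-uncertain proxy is misdated
      proxy = np.roll(climate, true_lag) + rng.normal(0, 0.4, T)

      lags = np.arange(0, 16)                                # candidate age models (shifts)
      log_like = np.array([
          -0.5 * np.sum((np.roll(proxy, -lag) - reference) ** 2) / 0.4 ** 2
          for lag in lags
      ])
      weights = np.exp(log_like - log_like.max())
      weights /= weights.sum()                               # a priori equal, updated here

      print("posterior weight of each candidate age model:")
      for lag, w in zip(lags, weights):
          print(f"  shift {lag:2d} years: {w:.3f}")
      print("posterior mean shift:", float(np.sum(lags * weights)))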

  11. A Hierarchical Bayesian Model to Predict Self-Thinning Line for Chinese Fir in Southern China.

    Directory of Open Access Journals (Sweden)

    Xiongqing Zhang

    Full Text Available Self-thinning is a dynamic equilibrium between forest growth and mortality at full site occupancy. Parameters of the self-thinning lines are often confounded by differences across various stand and site conditions. To overcome the problem of hierarchical and repeated measures, we used a hierarchical Bayesian method to estimate the self-thinning line. The results showed that the self-thinning line for Chinese fir (Cunninghamia lanceolata (Lamb.) Hook.) plantations was not sensitive to the initial planting density. The uncertainty of model predictions was mostly due to within-subject variability. The simulation precision of the hierarchical Bayesian method was better than that of the stochastic frontier function (SFF). The hierarchical Bayesian method provided a reasonable explanation of the impact of other variables (site quality, soil type, aspect, etc.) on the self-thinning line, and gave us the posterior distribution of the parameters of the self-thinning line. Research on the self-thinning relationship could benefit from the use of the hierarchical Bayesian method.
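
    As an illustrative aside (not the authors' model), the sketch below shows the borrowing of strength that motivates the hierarchical fit: plot-level self-thinning slopes estimated by ordinary least squares are shrunk toward the across-plot mean using a crude empirical-Bayes weighting. The synthetic data, the between-plot variance estimator and all names are assumptions for demonstration.

      # Illustrative sketch: empirical-Bayes shrinkage of plot-level self-thinning
      # slopes (ln stem number vs. ln quadratic mean diameter). Data are synthetic.
      import numpy as np

      rng = np.random.default_rng(7)
      n_plots, n_obs = 12, 8
      true_mean_slope, plot_sd = -1.6, 0.15                 # classic slope near -3/2

      raw_slopes, slope_se2 = [], []
      for _ in range(n_plots):
          slope = rng.normal(true_mean_slope, plot_sd)
          ln_d = np.sort(rng.uniform(2.0, 3.5, n_obs))      # ln quadratic mean diameter
          ln_n = 10.0 + slope * ln_d + rng.normal(0, 0.10, n_obs)
          b, a = np.polyfit(ln_d, ln_n, 1)                  # per-plot OLS slope b
          resid = ln_n - (a + b * ln_d)
          se2 = resid.var(ddof=2) / np.sum((ln_d - ln_d.mean()) ** 2)
          raw_slopes.append(b)
          slope_se2.append(se2)

      raw_slopes = np.array(raw_slopes)
      slope_se2 = np.array(slope_se2)
      tau2 = max(raw_slopes.var(ddof=1) - slope_se2.mean(), 1e-4)  # between-plot variance
      weights = tau2 / (tau2 + slope_se2)                   # shrinkage factors
      shrunk = weights * raw_slopes + (1 - weights) * raw_slopes.mean()

      print("plot-level OLS slopes   :", np.round(raw_slopes, 2))
      print("shrunken (pooled) slopes:", np.round(shrunk, 2))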

  12. DM-BLD: differential methylation detection using a hierarchical Bayesian model exploiting local dependency.

    Science.gov (United States)

    Wang, Xiao; Gu, Jinghua; Hilakivi-Clarke, Leena; Clarke, Robert; Xuan, Jianhua

    2017-01-15

    The advent of high-throughput DNA methylation profiling techniques has enabled the possibility of accurate identification of differentially methylated genes for cancer research. The large number of measured loci facilitates whole genome methylation study, yet poses great challenges for differential methylation detection due to the high variability in tumor samples. We have developed a novel probabilistic approach, Differential Methylation detection using a hierarchical Bayesian model exploiting Local Dependency (DM-BLD), to detect differentially methylated genes based on a Bayesian framework. The DM-BLD approach features a joint model to capture both the local dependency of measured loci and the dependency of methylation change in samples. Specifically, the local dependency is modeled by Leroux conditional autoregressive structure; the dependency of methylation changes is modeled by a discrete Markov random field. A hierarchical Bayesian model is developed to fully take into account the local dependency for differential analysis, in which differential states are embedded as hidden variables. Simulation studies demonstrate that DM-BLD outperforms existing methods for differential methylation detection, particularly when the methylation change is moderate and the variability of methylation in samples is high. DM-BLD has been applied to breast cancer data to identify important methylated genes (such as polycomb target genes and genes involved in transcription factor activity) associated with breast cancer recurrence. A Matlab package of DM-BLD is available at http://www.cbil.ece.vt.edu/software.htm. Contact: Xuan@vt.edu. Supplementary information: Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  13. A Bayesian Network Approach to Ontology Mapping

    National Research Council Canada - National Science Library

    Pan, Rong; Ding, Zhongli; Yu, Yang; Peng, Yun

    2005-01-01

    .... In this approach, the source and target ontologies are first translated into Bayesian networks (BN); the concept mapping between the two ontologies are treated as evidential reasoning between the two translated BNs...

  14. A Hierarchical Bayesian Setting for an Inverse Problem in Linear Parabolic PDEs with Noisy Boundary Conditions

    KAUST Repository

    Ruggeri, Fabrizio

    2016-05-12

    In this work we develop a Bayesian setting to infer unknown parameters in initial-boundary value problems related to linear parabolic partial differential equations. We realistically assume that the boundary data are noisy, for a given prescribed initial condition. We show how to derive the joint likelihood function for the forward problem, given some measurements of the solution field subject to Gaussian noise. Given Gaussian priors for the time-dependent Dirichlet boundary values, we analytically marginalize the joint likelihood using the linearity of the equation. Our hierarchical Bayesian approach is fully implemented in an example that involves the heat equation. In this example, the thermal diffusivity is the unknown parameter. We assume that the thermal diffusivity parameter can be modeled a priori through a lognormal random variable or by means of a space-dependent stationary lognormal random field. Synthetic data are used to test the inference. We exploit the behavior of the non-normalized log posterior distribution of the thermal diffusivity. Then, we use the Laplace method to obtain an approximated Gaussian posterior and therefore avoid costly Markov Chain Monte Carlo computations. Expected information gains and predictive posterior densities for observable quantities are numerically estimated using Laplace approximation for different experimental setups.
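
    As an illustrative aside (not the paper's implementation), the sketch below applies the Laplace method mentioned in the abstract to a much-reduced version of the problem: the posterior of a log thermal diffusivity is approximated by a Gaussian centred at the MAP estimate, with covariance from the inverse Hessian. The analytic decay of the first Fourier mode of the 1-D heat equation serves as a stand-in forward model; data and priors are synthetic assumptions.

      # Illustrative sketch: Laplace approximation to the posterior of log(kappa),
      # with forward model u(t) = u0 * exp(-kappa * (pi/L)^2 * t). Data are synthetic.
      import numpy as np
      from scipy.optimize import minimize

      rng = np.random.default_rng(5)
      L, u0, noise = 1.0, 1.0, 0.02
      kappa_true = 0.15
      t = np.linspace(0.1, 2.0, 25)
      obs = u0 * np.exp(-kappa_true * (np.pi / L) ** 2 * t) + rng.normal(0, noise, t.size)

      def neg_log_post(theta):
          """Negative log posterior of theta = log(kappa), lognormal prior on kappa."""
          kappa = np.exp(theta[0])
          pred = u0 * np.exp(-kappa * (np.pi / L) ** 2 * t)
          log_lik = -0.5 * np.sum((obs - pred) ** 2) / noise ** 2
          log_prior = -0.5 * (theta[0] - np.log(0.1)) ** 2 / 1.0 ** 2
          return -(log_lik + log_prior)

      res = minimize(neg_log_post, x0=[np.log(0.1)], method="BFGS")
      map_log_kappa = res.x[0]
      sd_log_kappa = np.sqrt(res.hess_inv[0, 0])   # Laplace: Gaussian with inverse-Hessian cov

      print(f"MAP kappa ~ {np.exp(map_log_kappa):.3f} (true {kappa_true})")
      print(f"approx. 95% interval: {np.exp(map_log_kappa - 1.96 * sd_log_kappa):.3f} "
            f"to {np.exp(map_log_kappa + 1.96 * sd_log_kappa):.3f}")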

  15. Probabilistic inference: Task dependency and individual differences of probability weighting revealed by hierarchical Bayesian modelling

    Directory of Open Access Journals (Sweden)

    Moritz eBoos

    2016-05-01

    Full Text Available Cognitive determinants of probabilistic inference were examined using hierarchical Bayesian modelling techniques. A classic urn-ball paradigm served as experimental strategy, involving a factorial two (prior probabilities) by two (likelihoods) design. Five computational models of cognitive processes were compared with the observed behaviour. Parameter-free Bayesian posterior probabilities and parameter-free base rate neglect provided inadequate models of probabilistic inference. The introduction of distorted subjective probabilities yielded more robust and generalizable results. A general class of (inverted) S-shaped probability weighting functions had been proposed; however, the possibility of large differences in probability distortions not only across experimental conditions, but also across individuals, seems critical for the model’s success. It also seems advantageous to consider individual differences in parameters of probability weighting as being sampled from weakly informative prior distributions of individual parameter values. Thus, the results from hierarchical Bayesian modelling converge with previous results in revealing that probability weighting parameters show considerable task dependency and individual differences. Methodologically, this work exemplifies the usefulness of hierarchical Bayesian modelling techniques for cognitive psychology. Theoretically, human probabilistic inference might be best described as the application of individualized strategic policies for Bayesian belief revision.

  16. Probabilistic Inference: Task Dependency and Individual Differences of Probability Weighting Revealed by Hierarchical Bayesian Modeling.

    Science.gov (United States)

    Boos, Moritz; Seer, Caroline; Lange, Florian; Kopp, Bruno

    2016-01-01

    Cognitive determinants of probabilistic inference were examined using hierarchical Bayesian modeling techniques. A classic urn-ball paradigm served as experimental strategy, involving a factorial two (prior probabilities) by two (likelihoods) design. Five computational models of cognitive processes were compared with the observed behavior. Parameter-free Bayesian posterior probabilities and parameter-free base rate neglect provided inadequate models of probabilistic inference. The introduction of distorted subjective probabilities yielded more robust and generalizable results. A general class of (inverted) S-shaped probability weighting functions had been proposed; however, the possibility of large differences in probability distortions not only across experimental conditions, but also across individuals, seems critical for the model's success. It also seems advantageous to consider individual differences in parameters of probability weighting as being sampled from weakly informative prior distributions of individual parameter values. Thus, the results from hierarchical Bayesian modeling converge with previous results in revealing that probability weighting parameters show considerable task dependency and individual differences. Methodologically, this work exemplifies the usefulness of hierarchical Bayesian modeling techniques for cognitive psychology. Theoretically, human probabilistic inference might be best described as the application of individualized strategic policies for Bayesian belief revision.
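
    As an illustrative aside (not the authors' model), the sketch below shows what a distorted-probability observer looks like in a two-urn task: an (inverted) S-shaped weighting function is applied to the prior and the likelihoods before Bayesian belief revision. The functional form, parameter values and task probabilities are assumptions; the hierarchical layer (individual-level weighting parameters drawn from population priors) is not shown.

      # Illustrative sketch: an (inverted) S-shaped probability weighting function
      # applied to prior and likelihoods in a two-urn task, giving a distorted posterior.
      import numpy as np

      def weight(p, gamma):
          """Tversky-Kahneman style probability weighting function."""
          return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

      def distorted_posterior(prior, like_given_h, like_given_not_h, g_prior, g_like):
          """Posterior for hypothesis H after distorting prior and likelihoods."""
          wp = weight(prior, g_prior)
          wl1 = weight(like_given_h, g_like)
          wl0 = weight(like_given_not_h, g_like)
          return wp * wl1 / (wp * wl1 + (1 - wp) * wl0)

      prior, like_h, like_not_h = 0.7, 0.8, 0.3      # urn-ball style task probabilities
      ideal = prior * like_h / (prior * like_h + (1 - prior) * like_not_h)
      print("ideal Bayesian posterior      :", round(ideal, 3))
      for g in (1.0, 0.6):                            # gamma = 1 recovers the ideal observer
          print(f"distorted posterior (gamma={g}):",
                round(distorted_posterior(prior, like_h, like_not_h, g, g), 3))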

  17. Hierarchical Bayesian Markov switching models with application to predicting spawning success of shovelnose sturgeon

    Science.gov (United States)

    Holan, S.H.; Davis, G.M.; Wildhaber, M.L.; DeLonay, A.J.; Papoulias, D.M.

    2009-01-01

    The timing of spawning in fish is tightly linked to environmental factors; however, these factors are not very well understood for many species. Specifically, little information is available to guide recruitment efforts for endangered species such as the sturgeon. Therefore, we propose a Bayesian hierarchical model for predicting the success of spawning of the shovelnose sturgeon which uses both biological and behavioural (longitudinal) data. In particular, we use data that were produced from a tracking study that was conducted in the Lower Missouri River. The data that were produced from this study consist of biological variables associated with readiness to spawn along with longitudinal behavioural data collected by using telemetry and archival data storage tags. These high frequency data are complex both biologically and in the underlying behavioural process. To accommodate such complexity we developed a hierarchical linear regression model that uses an eigenvalue predictor, derived from the transition probability matrix of a two-state Markov switching model with generalized auto-regressive conditional heteroscedastic dynamics. Finally, to minimize the computational burden that is associated with estimation of this model, a parallel computing approach is proposed. © Journal compilation 2009 Royal Statistical Society.
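
    As an illustrative aside (not the authors' model), the sketch below computes the kind of summary the abstract refers to: scalar predictors extracted from the transition probability matrix of a two-state Markov switching model, namely its stationary distribution and its second eigenvalue. The transition probabilities are assumed values; the regression with GARCH dynamics in which such an eigenvalue serves as a predictor is not shown.

      # Illustrative sketch: predictors derived from a two-state Markov transition matrix.
      import numpy as np

      P = np.array([[0.92, 0.08],     # P[i, j] = Pr(next state = j | current state = i)
                    [0.25, 0.75]])

      eigvals, eigvecs = np.linalg.eig(P.T)
      # Stationary distribution: left eigenvector of P for eigenvalue 1.
      idx = np.argmin(np.abs(eigvals - 1.0))
      stationary = np.real(eigvecs[:, idx])
      stationary = stationary / stationary.sum()

      second_eig = np.sort(np.abs(np.real(eigvals)))[0]   # the non-unit eigenvalue of a 2x2 chain
      print("stationary distribution over behavioural states:", np.round(stationary, 3))
      print("second eigenvalue (persistence predictor):", round(second_eig, 3))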

  18. Using time-varying asymptotic length and body condition of top piscivores to indicate ecosystem regime shift in the main basin of Lake Huron: a Bayesian hierarchical modeling approach

    Science.gov (United States)

    He, Ji X.; Bence, James R.; Roseman, Edward F.; Fielder, David G.; Ebener, Mark P.

    2015-01-01

    We evaluated the ecosystem regime shift in the main basin of Lake Huron that was indicated by the 2003 collapse of alewives, and dramatic declines in Chinook salmon abundance thereafter. We found that the period of 1995-2002 should be considered as the early phase of the final regime shift. We developed two Bayesian hierarchical models to describe time-varying growth based on the von Bertalanffy growth function and the length-mass relationship. We used asymptotic length as an index of growth potential, and predicted body mass at a given length as an index of body condition. Modeling fits to length and body mass at age of lake trout, Chinook salmon, and walleye were excellent. Based on posterior distributions, we evaluated the shifts in among-year geometric means of the growth potential and body condition. For a given top piscivore, one of the two indices responded to the regime shift much earlier than the 2003 collapse of alewives, the other corresponded to the 2003 changes, and which index provided the early signal differed among the three top piscivores.
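
    As an illustrative aside (not the authors' models), the sketch below fits the two indices used in the abstract to synthetic data for a single year: a von Bertalanffy asymptotic length as a growth-potential index, and predicted mass at a reference length as a body-condition index. The hierarchical Bayesian structure (year-specific, time-varying parameters with shared priors) is not shown.

      # Illustrative sketch: von Bertalanffy growth and a length-mass condition index,
      # fitted to synthetic length- and mass-at-age data.
      import numpy as np
      from scipy.optimize import curve_fit

      rng = np.random.default_rng(11)

      def von_bertalanffy(age, L_inf, k, t0):
          return L_inf * (1.0 - np.exp(-k * (age - t0)))

      ages = np.repeat(np.arange(1, 11), 5)
      lengths = von_bertalanffy(ages, 850.0, 0.25, -0.5) + rng.normal(0, 25, ages.size)
      masses = 1e-5 * lengths ** 3.05 * np.exp(rng.normal(0, 0.05, ages.size))

      (L_inf, k, t0), _ = curve_fit(von_bertalanffy, ages, lengths, p0=[800, 0.2, 0])
      b, log_a = np.polyfit(np.log(lengths), np.log(masses), 1)
      length_ref = 500.0
      condition = np.exp(log_a) * length_ref ** b          # predicted mass at a reference length

      print(f"asymptotic length L_inf ~ {L_inf:.0f} mm (growth potential index)")
      print(f"predicted mass at {length_ref:.0f} mm ~ {condition:.0f} g (body condition index)")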

  19. Application of hierarchical Bayesian unmixing models in river sediment source apportionment

    Science.gov (United States)

    Blake, Will; Smith, Hugh; Navas, Ana; Bodé, Samuel; Goddard, Rupert; Zou Kuzyk, Zou; Lennard, Amy; Lobb, David; Owens, Phil; Palazon, Leticia; Petticrew, Ellen; Gaspar, Leticia; Stock, Brian; Boeckx, Pacsal; Semmens, Brice

    2016-04-01

    Fingerprinting and unmixing concepts are used widely across environmental disciplines for forensic evaluation of pollutant sources. In aquatic and marine systems, this includes tracking the source of organic and inorganic pollutants in water and linking problem sediment to soil erosion and land use sources. It is, however, the particular complexity of ecological systems that has driven creation of the most sophisticated mixing models, primarily to (i) evaluate diet composition in complex ecological food webs, (ii) inform population structure and (iii) explore animal movement. In the context of the new hierarchical Bayesian unmixing model, MIXSIAR, developed to characterise intra-population niche variation in ecological systems, we evaluate the linkage between ecological 'prey' and 'consumer' concepts and river basin sediment 'source' and sediment 'mixtures' to exemplify the value of ecological modelling tools to river basin science. Recent studies have outlined advantages presented by Bayesian unmixing approaches in handling complex source and mixture datasets while dealing appropriately with uncertainty in parameter probability distributions. MixSIAR is unique in that it allows individual fixed and random effects associated with mixture hierarchy, i.e. factors that might exert an influence on model outcome for mixture groups, to be explored within the source-receptor framework. This offers new and powerful ways of interpreting river basin apportionment data. In this contribution, key components of the model are evaluated in the context of common experimental designs for sediment fingerprinting studies namely simple, nested and distributed catchment sampling programmes. Illustrative examples using geochemical and compound specific stable isotope datasets are presented and used to discuss best practice with specific attention to (1) the tracer selection process, (2) incorporation of fixed effects relating to sample timeframe and sediment type in the modelling

  20. Use of a Bayesian hierarchical model to study the allometric scaling of the fetoplacental weight ratio

    Directory of Open Access Journals (Sweden)

    Fidel Ernesto Castro Morales

    2016-03-01

    Full Text Available Abstract Objectives: to propose the use of a Bayesian hierarchical model to study the allometric scaling of the fetoplacental weight ratio, including possible confounders. Methods: data from 26 singleton pregnancies with gestational age at birth between 37 and 42 weeks were analyzed. The placentas were collected immediately after delivery and stored under refrigeration until the time of analysis, which occurred within up to 12 hours. Maternal data were collected from medical records. A Bayesian hierarchical model was proposed and Markov chain Monte Carlo simulation methods were used to obtain samples from the posterior distribution. Results: the model developed showed a reasonable fit, even allowing for the incorporation of variables and a priori information on the parameters used. Conclusions: new variables can be added to the model from the available code, allowing many possibilities for data analysis and indicating the potential for use in research on the subject.

  1. A sow replacement model using Bayesian updating in a three-level hierarchic Markov process. I. Biological model

    DEFF Research Database (Denmark)

    Kristensen, Anders Ringgaard; Søllested, Thomas Algot

    2004-01-01

    Several replacement models have been presented in the literature. In other application areas, like dairy cow replacement, various methodological improvements like hierarchical Markov processes and Bayesian updating have been implemented, but not in sow models. Furthermore, there are methodological i...

  2. Sequential Bayesian technique: An alternative approach for ...

    Indian Academy of Sciences (India)

    This paper proposes a sequential Bayesian approach similar to the Kalman filter for estimating reliability growth or decay of software. The main advantage of the proposed method is that it shows the variation of the parameter over time as new failure data become available. The usefulness of the method is demonstrated with ...

  3. Sequential Bayesian technique: An alternative approach for ...

    Indian Academy of Sciences (India)

    MS received 8 October 2007; revised 15 July 2008. Abstract. This paper proposes a sequential Bayesian approach similar to the Kalman filter for estimating reliability growth or decay of software. The main advantage of the proposed method is that it shows the variation of the parameter over time as new failure data become ...
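
    As an illustrative aside (not the paper's method), the sketch below runs a scalar Kalman-style sequential Bayesian update over synthetic failure counts, so that the estimated failure intensity is revised as each new testing interval arrives; the observation and process variances are assumed values.

      # Illustrative sketch: sequential Bayesian (Kalman-style) tracking of a software
      # failure intensity across testing intervals. Counts and variances are assumed.
      import numpy as np

      failures = np.array([9, 7, 8, 6, 5, 5, 3, 4, 2, 2, 1, 1])  # failures per test interval
      obs_var, proc_var = 2.0, 0.3       # observation and random-walk (process) variances

      mean, var = 8.0, 4.0               # prior belief about the initial failure intensity
      for t, y in enumerate(failures, start=1):
          # predict: intensity drifts as a random walk between intervals
          var += proc_var
          # update: combine prediction with the new failure count
          gain = var / (var + obs_var)
          mean += gain * (y - mean)
          var *= (1 - gain)
          print(f"interval {t:2d}: posterior intensity {mean:5.2f} +/- {np.sqrt(var):.2f}")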

  4. A Bayesian Concept Learning Approach to Crowdsourcing

    DEFF Research Database (Denmark)

    Viappiani, Paolo Renato; Zilles, Sandra; Hamilton, Howard J.

    2011-01-01

    We develop a Bayesian approach to concept learning for crowdsourcing applications. A probabilistic belief over possible concept definitions is maintained and updated according to (noisy) observations from experts, whose behaviors are modeled using discrete types. We propose recommendation techniques, inference methods, and query selection strategies to assist a user charged with choosing a configuration that satisfies some (partially known) concept. Our model is able to simultaneously learn the concept definition and the types of the experts. We evaluate our model with simulations, showing that our Bayesian strategies are effective even in large concept spaces with many uninformative experts.

  5. Estimating mono- and bi-phasic regression parameters using a mixture piecewise linear Bayesian hierarchical model.

    Science.gov (United States)

    Zhao, Rui; Catalano, Paul; DeGruttola, Victor G; Michor, Franziska

    2017-01-01

    The dynamics of tumor burden, secreted proteins or other biomarkers over time, is often used to evaluate the effectiveness of therapy and to predict outcomes for patients. Many methods have been proposed to investigate longitudinal trends to better characterize patients and to understand disease progression. However, most approaches assume a homogeneous patient population and a uniform response trajectory over time and across patients. Here, we present a mixture piecewise linear Bayesian hierarchical model, which takes into account both population heterogeneity and nonlinear relationships between biomarkers and time. Simulation results show that our method was able to classify subjects according to their patterns of treatment response with greater than 80% accuracy in the three scenarios tested. We then applied our model to a large randomized controlled phase III clinical trial of multiple myeloma patients. Analysis results suggest that the longitudinal tumor burden trajectories in multiple myeloma patients are heterogeneous and nonlinear, even among patients assigned to the same treatment cohort. In addition, between cohorts, there are distinct differences in terms of the regression parameters and the distributions among categories in the mixture. Those results imply that longitudinal data from clinical trials may harbor unobserved subgroups and nonlinear relationships; accounting for both may be important for analyzing longitudinal data.

  6. Hierarchical Bayesian models to assess between- and within-batch variability of pathogen contamination in food.

    Science.gov (United States)

    Commeau, Natalie; Cornu, Marie; Albert, Isabelle; Denis, Jean-Baptiste; Parent, Eric

    2012-03-01

    Assessing within-batch and between-batch variability is of major interest for risk assessors and risk managers in the context of microbiological contamination of food. For example, the ratio between the within-batch variability and the between-batch variability has a large impact on the results of a sampling plan. Here, we designed hierarchical Bayesian models to represent such variability. Compatible priors were built mathematically to obtain sound model comparisons. A numeric criterion is proposed to assess the contamination structure, comparing the ability of the models to replicate grouped data at the batch level using a posterior predictive loss approach. Models were applied to two case studies: contamination by Listeria monocytogenes of pork breast used to produce diced bacon and contamination by the same microorganism on cold smoked salmon at the end of the process. In the first case study, a contamination structure clearly exists and is located at the batch level, that is, between-batch variability is relatively strong, whereas in the second case a structure also exists but is less marked. © 2012 Society for Risk Analysis.
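
    As an illustrative aside (not the authors' models), the sketch below is a minimal Gibbs sampler for a normal hierarchical model that separates between-batch from within-batch variability in log contamination; the synthetic data, vague priors and all names are assumptions made for the example.

      # Illustrative sketch: Gibbs sampler for y_ij = mu + batch_i + error_ij,
      # separating between-batch variance tau^2 from within-batch variance sigma^2.
      import numpy as np

      rng = np.random.default_rng(13)
      I, J = 15, 6                                  # batches, samples per batch
      true_mu, true_tau, true_sigma = 2.0, 0.6, 0.3
      batch_eff = rng.normal(0, true_tau, I)
      y = true_mu + batch_eff[:, None] + rng.normal(0, true_sigma, (I, J))

      a0, b0 = 0.01, 0.01                           # vague inverse-gamma hyperparameters
      mu, tau2, sigma2 = y.mean(), 1.0, 1.0
      keep = {"tau2": [], "sigma2": []}

      for it in range(4000):
          prec = J / sigma2 + 1.0 / tau2            # batch effects given everything else
          b_mean = (J / sigma2) * (y.mean(axis=1) - mu) / prec
          b = rng.normal(b_mean, np.sqrt(1.0 / prec))
          mu = rng.normal((y - b[:, None]).mean(), np.sqrt(sigma2 / (I * J)))
          sse = np.sum((y - mu - b[:, None]) ** 2)
          sigma2 = 1.0 / rng.gamma(a0 + I * J / 2, 1.0 / (b0 + sse / 2))
          tau2 = 1.0 / rng.gamma(a0 + I / 2, 1.0 / (b0 + np.sum(b ** 2) / 2))
          if it >= 2000:
              keep["tau2"].append(tau2)
              keep["sigma2"].append(sigma2)

      print("posterior estimate of between-batch sd:", round(np.sqrt(np.mean(keep["tau2"])), 2))
      print("posterior estimate of within-batch sd :", round(np.sqrt(np.mean(keep["sigma2"])), 2))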

  7. Modeling inter-subject variability in fMRI activation location: A Bayesian hierarchical spatial model

    Science.gov (United States)

    Xu, Lei; Johnson, Timothy D.; Nichols, Thomas E.; Nee, Derek E.

    2010-01-01

    The aim of this work is to develop a spatial model for multi-subject fMRI data. There has been extensive work on univariate modeling of each voxel for single and multi-subject data, some work on spatial modeling of single-subject data, and some recent work on spatial modeling of multi-subject data. However, there has been no work on spatial models that explicitly account for inter-subject variability in activation locations. In this work, we use the idea of activation centers and model the inter-subject variability in activation locations directly. Our model is specified in a Bayesian hierarchical framework, which allows us to draw inferences at all levels: the population level, the individual level and the voxel level. We use Gaussian mixtures for the probability that an individual has a particular activation. This helps answer an important question which is not addressed by any of the previous methods: what proportion of subjects had significant activity in a given region? Our approach incorporates the unknown number of mixture components into the model as a parameter whose posterior distribution is estimated by reversible jump Markov Chain Monte Carlo. We demonstrate our method with an fMRI study of resolving proactive interference and show dramatically better precision of localization with our method relative to the standard mass-univariate method. Although we are motivated by fMRI data, this model could easily be modified to handle other types of imaging data. PMID:19210732

  8. An economic growth model based on financial credits distribution to the government economy priority sectors of each regency in Indonesia using hierarchical Bayesian method

    Science.gov (United States)

    Yasmirullah, Septia Devi Prihastuti; Iriawan, Nur; Sipayung, Feronika Rosalinda

    2017-11-01

    The success of regional economic development can be measured by economic growth. Since Act No. 32 of 2004 was implemented, economic imbalance among the regencies of Indonesia has been increasing. This condition runs contrary to the government's goal of building societal welfare through the development of economic activity in each region. This research aims to examine economic growth through the distribution of bank credits to each of Indonesia's regencies. The data analyzed in this research are hierarchically structured and follow a normal distribution at the first level. Two modeling approaches are employed, a global one-level Bayesian approach and a two-level hierarchical Bayesian approach. The results show that the hierarchical Bayesian approach achieves better estimation than the global one-level Bayesian approach. This indicates that differences in economic growth across provinces are significantly influenced by variations in micro-level characteristics within each province. These variations are in turn significantly affected by city and province characteristics at the second level.

  9. Bayesian statistical approaches to evaluating cognitive models.

    Science.gov (United States)

    Annis, Jeffrey; Palmeri, Thomas J

    2017-11-28

    Cognitive models aim to explain complex human behavior in terms of hypothesized mechanisms of the mind. These mechanisms can be formalized in terms of mathematical structures containing parameters that are theoretically meaningful. For example, in the case of perceptual decision making, model parameters might correspond to theoretical constructs like response bias, evidence quality, response caution, and the like. Formal cognitive models go beyond verbal models in that cognitive mechanisms are instantiated in terms of mathematics and they go beyond statistical models in that cognitive model parameters are psychologically interpretable. We explore three key elements used to formally evaluate cognitive models: parameter estimation, model prediction, and model selection. We compare and contrast traditional approaches with Bayesian statistical approaches to performing each of these three elements. Traditional approaches rely on an array of seemingly ad hoc techniques, whereas Bayesian statistical approaches rely on a single, principled, internally consistent system. We illustrate the Bayesian statistical approach to evaluating cognitive models using a running example of the Linear Ballistic Accumulator model of decision making (Brown SD, Heathcote A. The simplest complete model of choice response time: linear ballistic accumulation. Cogn Psychol 2008, 57:153-178). This article is categorized under: Neuroscience > Computation Psychology > Reasoning and Decision Making Psychology > Theory and Methods. © 2017 Wiley Periodicals, Inc.

  10. Comparison of the Bayesian and Frequentist Approach to the Statistics

    OpenAIRE

    Hakala, Michal

    2015-01-01

    The thesis provides an introduction to Bayesian statistics and compares the Bayesian approach with the frequentist approach to statistics. Bayesian statistics is a modern branch of statistics which provides an alternative, comprehensive theory to the frequentist approach. Bayesian concepts provide solutions for problems that cannot be solved within frequentist theory. The thesis compares definitions, concepts and the quality of statistical inference. The main interest is focused on point estimation, an in...

  11. Risk Assessment for Mobile Systems Through a Multilayered Hierarchical Bayesian Network.

    Science.gov (United States)

    Li, Shancang; Tryfonas, Theo; Russell, Gordon; Andriotis, Panagiotis

    2016-08-01

    Mobile systems are facing a number of application vulnerabilities that can be combined and exploited to penetrate systems with devastating impact. When assessing the overall security of a mobile system, it is important to assess the security risks posed by each mobile application (app), thus gaining a stronger understanding of any vulnerabilities present. This paper aims at developing a three-layer framework that assesses the potential risks which apps introduce within Android mobile systems. A Bayesian risk graphical model is proposed to evaluate risk propagation in a layered risk architecture. By integrating static analysis, dynamic analysis, and behavior analysis in a hierarchical framework, the risks and their propagation through each layer are well modeled by the Bayesian risk graph, which can quantitatively analyze the risks faced by both apps and mobile systems. The proposed hierarchical Bayesian risk graph model offers a novel way to investigate the security risks in a mobile environment and enables users and administrators to evaluate the potential risks. This strategy makes it possible to strengthen both app security and the security of the entire system.
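
    A toy sketch of layered risk propagation in this spirit (not the authors' model): per-app risk scores from static, dynamic and behaviour analysis are combined with a noisy-OR rule and then propagated to a system-level risk. The app names, probabilities and the independence assumption are all made up for illustration.

        import numpy as np

        # Hypothetical per-app risk probabilities from three analysis layers:
        # columns = (static, dynamic, behaviour)
        apps = {
            "app_mail":    np.array([0.10, 0.05, 0.02]),
            "app_game":    np.array([0.40, 0.30, 0.20]),
            "app_unknown": np.array([0.60, 0.50, 0.35]),
        }

        def noisy_or(probs):
            """Probability that at least one independent risk source fires."""
            return 1.0 - np.prod(1.0 - np.asarray(probs))

        app_risk = {name: noisy_or(p) for name, p in apps.items()}

        # System-level risk: noisy-OR over the per-app risks (independence assumed).
        system_risk = noisy_or(list(app_risk.values()))

        for name, r in app_risk.items():
            print(f"{name}: layered risk = {r:.3f}")
        print(f"system-level risk = {system_risk:.3f}")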

  12. Particle identification in ALICE: a Bayesian approach

    CERN Document Server

    Adam, Jaroslav; Aggarwal, Madan Mohan; Aglieri Rinella, Gianluca; Agnello, Michelangelo; Agrawal, Neelima; Ahammed, Zubayer; Ahmad, Shakeel; Ahn, Sang Un; Aiola, Salvatore; Akindinov, Alexander; Alam, Sk Noor; Silva De Albuquerque, Danilo; Aleksandrov, Dmitry; Alessandro, Bruno; Alexandre, Didier; Alfaro Molina, Jose Ruben; Alici, Andrea; Alkin, Anton; Millan Almaraz, Jesus Roberto; Alme, Johan; Alt, Torsten; Altinpinar, Sedat; Altsybeev, Igor; Alves Garcia Prado, Caio; Andrei, Cristian; Andronic, Anton; Anguelov, Venelin; Anticic, Tome; Antinori, Federico; Antonioli, Pietro; Aphecetche, Laurent Bernard; Appelshaeuser, Harald; Arcelli, Silvia; Arnaldi, Roberta; Arnold, Oliver Werner; Arsene, Ionut Cristian; Arslandok, Mesut; Audurier, Benjamin; Augustinus, Andre; Averbeck, Ralf Peter; Azmi, Mohd Danish; Badala, Angela; Baek, Yong Wook; Bagnasco, Stefano; Bailhache, Raphaelle Marie; Bala, Renu; Balasubramanian, Supraja; Baldisseri, Alberto; Baral, Rama Chandra; Barbano, Anastasia Maria; Barbera, Roberto; Barile, Francesco; Barnafoldi, Gergely Gabor; Barnby, Lee Stuart; Ramillien Barret, Valerie; Bartalini, Paolo; Barth, Klaus; Bartke, Jerzy Gustaw; Bartsch, Esther; Basile, Maurizio; Bastid, Nicole; Basu, Sumit; Bathen, Bastian; Batigne, Guillaume; Batista Camejo, Arianna; Batyunya, Boris; Batzing, Paul Christoph; Bearden, Ian Gardner; Beck, Hans; Bedda, Cristina; Behera, Nirbhay Kumar; Belikov, Iouri; Bellini, Francesca; Bello Martinez, Hector; Bellwied, Rene; Belmont Iii, Ronald John; Belmont Moreno, Ernesto; Belyaev, Vladimir; Benacek, Pavel; Bencedi, Gyula; Beole, Stefania; Berceanu, Ionela; Bercuci, Alexandru; Berdnikov, Yaroslav; Berenyi, Daniel; Bertens, Redmer Alexander; Berzano, Dario; Betev, Latchezar; Bhasin, Anju; Bhat, Inayat Rasool; Bhati, Ashok Kumar; Bhattacharjee, Buddhadeb; Bhom, Jihyun; Bianchi, Livio; Bianchi, Nicola; Bianchin, Chiara; Bielcik, Jaroslav; Bielcikova, Jana; Bilandzic, Ante; Biro, Gabor; Biswas, Rathijit; Biswas, Saikat; Bjelogrlic, Sandro; Blair, Justin Thomas; Blau, Dmitry; Blume, Christoph; Bock, Friederike; Bogdanov, Alexey; Boggild, Hans; Boldizsar, Laszlo; Bombara, Marek; Book, Julian Heinz; Borel, Herve; Borissov, Alexander; Borri, Marcello; Bossu, Francesco; Botta, Elena; Bourjau, Christian; Braun-Munzinger, Peter; Bregant, Marco; Breitner, Timo Gunther; Broker, Theo Alexander; Browning, Tyler Allen; Broz, Michal; Brucken, Erik Jens; Bruna, Elena; Bruno, Giuseppe Eugenio; Budnikov, Dmitry; Buesching, Henner; Bufalino, Stefania; Buncic, Predrag; Busch, Oliver; Buthelezi, Edith Zinhle; Bashir Butt, Jamila; Buxton, Jesse Thomas; Cabala, Jan; Caffarri, Davide; Cai, Xu; Caines, Helen Louise; Calero Diaz, Liliet; Caliva, Alberto; Calvo Villar, Ernesto; Camerini, Paolo; Carena, Francesco; Carena, Wisla; Carnesecchi, Francesca; Castillo Castellanos, Javier Ernesto; Castro, Andrew John; Casula, Ester Anna Rita; Ceballos Sanchez, Cesar; Cepila, Jan; Cerello, Piergiorgio; Cerkala, Jakub; Chang, Beomsu; Chapeland, Sylvain; Chartier, Marielle; Charvet, Jean-Luc Fernand; Chattopadhyay, Subhasis; Chattopadhyay, Sukalyan; Chauvin, Alex; Chelnokov, Volodymyr; Cherney, Michael Gerard; Cheshkov, Cvetan Valeriev; Cheynis, Brigitte; Chibante Barroso, Vasco Miguel; Dobrigkeit Chinellato, David; Cho, Soyeon; Chochula, Peter; Choi, Kyungeon; Chojnacki, Marek; Choudhury, Subikash; Christakoglou, Panagiotis; Christensen, Christian Holm; Christiansen, Peter; Chujo, Tatsuya; Chung, Suh-Urk; Cicalo, Corrado; Cifarelli, Luisa; Cindolo, Federico; Cleymans, Jean Willy Andre; 
Colamaria, Fabio Filippo; Colella, Domenico; Collu, Alberto; Colocci, Manuel; Conesa Balbastre, Gustavo; Conesa Del Valle, Zaida; Connors, Megan Elizabeth; Contreras Nuno, Jesus Guillermo; Cormier, Thomas Michael; Corrales Morales, Yasser; Cortes Maldonado, Ismael; Cortese, Pietro; Cosentino, Mauro Rogerio; Costa, Filippo; Crochet, Philippe; Cruz Albino, Rigoberto; Cuautle Flores, Eleazar; Cunqueiro Mendez, Leticia; Dahms, Torsten; Dainese, Andrea; Danisch, Meike Charlotte; Danu, Andrea; Das, Debasish; Das, Indranil; Das, Supriya; Dash, Ajay Kumar; Dash, Sadhana; De, Sudipan; De Caro, Annalisa; De Cataldo, Giacinto; De Conti, Camila; De Cuveland, Jan; De Falco, Alessandro; De Gruttola, Daniele; De Marco, Nora; De Pasquale, Salvatore; Deisting, Alexander; Deloff, Andrzej; Denes, Ervin Sandor; Deplano, Caterina; Dhankher, Preeti; Di Bari, Domenico; Di Mauro, Antonio; Di Nezza, Pasquale; Diaz Corchero, Miguel Angel; Dietel, Thomas; Dillenseger, Pascal; Divia, Roberto; Djuvsland, Oeystein; Dobrin, Alexandru Florin; Domenicis Gimenez, Diogenes; Donigus, Benjamin; Dordic, Olja; Drozhzhova, Tatiana; Dubey, Anand Kumar; Dubla, Andrea; Ducroux, Laurent; Dupieux, Pascal; Ehlers Iii, Raymond James; Elia, Domenico; Endress, Eric; Engel, Heiko; Epple, Eliane; Erazmus, Barbara Ewa; Erdemir, Irem; Erhardt, Filip; Espagnon, Bruno; Estienne, Magali Danielle; Esumi, Shinichi; Eum, Jongsik; Evans, David; Evdokimov, Sergey; Eyyubova, Gyulnara; Fabbietti, Laura; Fabris, Daniela; Faivre, Julien; Fantoni, Alessandra; Fasel, Markus; Feldkamp, Linus; Feliciello, Alessandro; Feofilov, Grigorii; Ferencei, Jozef; Fernandez Tellez, Arturo; Gonzalez Ferreiro, Elena; Ferretti, Alessandro; Festanti, Andrea; Feuillard, Victor Jose Gaston; Figiel, Jan; Araujo Silva Figueredo, Marcel; Filchagin, Sergey; Finogeev, Dmitry; Fionda, Fiorella; Fiore, Enrichetta Maria; Fleck, Martin Gabriel; Floris, Michele; Foertsch, Siegfried Valentin; Foka, Panagiota; Fokin, Sergey; Fragiacomo, Enrico; Francescon, Andrea; Frankenfeld, Ulrich Michael; Fronze, Gabriele Gaetano; Fuchs, Ulrich; Furget, Christophe; Furs, Artur; Fusco Girard, Mario; Gaardhoeje, Jens Joergen; Gagliardi, Martino; Gago Medina, Alberto Martin; Gallio, Mauro; Gangadharan, Dhevan Raja; Ganoti, Paraskevi; Gao, Chaosong; Garabatos Cuadrado, Jose; Garcia-Solis, Edmundo Javier; Gargiulo, Corrado; Gasik, Piotr Jan; Gauger, Erin Frances; Germain, Marie; Gheata, Andrei George; Gheata, Mihaela; Ghosh, Premomoy; Ghosh, Sanjay Kumar; Gianotti, Paola; Giubellino, Paolo; Giubilato, Piero; Gladysz-Dziadus, Ewa; Glassel, Peter; Gomez Coral, Diego Mauricio; Gomez Ramirez, Andres; Sanchez Gonzalez, Andres; Gonzalez, Victor; Gonzalez Zamora, Pedro; Gorbunov, Sergey; Gorlich, Lidia Maria; Gotovac, Sven; Grabski, Varlen; Grachov, Oleg Anatolievich; Graczykowski, Lukasz Kamil; Graham, Katie Leanne; Grelli, Alessandro; Grigoras, Alina Gabriela; Grigoras, Costin; Grigoryev, Vladislav; Grigoryan, Ara; Grigoryan, Smbat; Grynyov, Borys; Grion, Nevio; Gronefeld, Julius Maximilian; Grosse-Oetringhaus, Jan Fiete; Grosso, Raffaele; Guber, Fedor; Guernane, Rachid; Guerzoni, Barbara; Gulbrandsen, Kristjan Herlache; Gunji, Taku; Gupta, Anik; Gupta, Ramni; Haake, Rudiger; Haaland, Oystein Senneset; Hadjidakis, Cynthia Marie; Haiduc, Maria; Hamagaki, Hideki; Hamar, Gergoe; Hamon, Julien Charles; Harris, John William; Harton, Austin Vincent; Hatzifotiadou, Despina; Hayashi, Shinichi; Heckel, Stefan Thomas; Hellbar, Ernst; Helstrup, Haavard; Herghelegiu, Andrei Ionut; Herrera Corral, Gerardo Antonio; Hess, 
Benjamin Andreas; Hetland, Kristin Fanebust; Hillemanns, Hartmut; Hippolyte, Boris; Horak, David; Hosokawa, Ritsuya; Hristov, Peter Zahariev; Humanic, Thomas; Hussain, Nur; Hussain, Tahir; Hutter, Dirk; Hwang, Dae Sung; Ilkaev, Radiy; Inaba, Motoi; Incani, Elisa; Ippolitov, Mikhail; Irfan, Muhammad; Ivanov, Marian; Ivanov, Vladimir; Izucheev, Vladimir; Jacazio, Nicolo; Jacobs, Peter Martin; Jadhav, Manoj Bhanudas; Jadlovska, Slavka; Jadlovsky, Jan; Jahnke, Cristiane; Jakubowska, Monika Joanna; Jang, Haeng Jin; Janik, Malgorzata Anna; Pahula Hewage, Sandun; Jena, Chitrasen; Jena, Satyajit; Jimenez Bustamante, Raul Tonatiuh; Jones, Peter Graham; Jusko, Anton; Kalinak, Peter; Kalweit, Alexander Philipp; Kamin, Jason Adrian; Kang, Ju Hwan; Kaplin, Vladimir; Kar, Somnath; Karasu Uysal, Ayben; Karavichev, Oleg; Karavicheva, Tatiana; Karayan, Lilit; Karpechev, Evgeny; Kebschull, Udo Wolfgang; Keidel, Ralf; Keijdener, Darius Laurens; Keil, Markus; Khan, Mohammed Mohisin; Khan, Palash; Khan, Shuaib Ahmad; Khanzadeev, Alexei; Kharlov, Yury; Kileng, Bjarte; Kim, Do Won; Kim, Dong Jo; Kim, Daehyeok; Kim, Hyeonjoong; Kim, Jinsook; Kim, Minwoo; Kim, Se Yong; Kim, Taesoo; Kirsch, Stefan; Kisel, Ivan; Kiselev, Sergey; Kisiel, Adam Ryszard; Kiss, Gabor; Klay, Jennifer Lynn; Klein, Carsten; Klein, Jochen; Klein-Boesing, Christian; Klewin, Sebastian; Kluge, Alexander; Knichel, Michael Linus; Knospe, Anders Garritt; Kobdaj, Chinorat; Kofarago, Monika; Kollegger, Thorsten; Kolozhvari, Anatoly; Kondratev, Valerii; Kondratyeva, Natalia; Kondratyuk, Evgeny; Konevskikh, Artem; Kopcik, Michal; Kostarakis, Panagiotis; Kour, Mandeep; Kouzinopoulos, Charalampos; Kovalenko, Oleksandr; Kovalenko, Vladimir; Kowalski, Marek; Koyithatta Meethaleveedu, Greeshma; Kralik, Ivan; Kravcakova, Adela; Krivda, Marian; Krizek, Filip; Kryshen, Evgeny; Krzewicki, Mikolaj; Kubera, Andrew Michael; Kucera, Vit; Kuhn, Christian Claude; Kuijer, Paulus Gerardus; Kumar, Ajay; Kumar, Jitendra; Kumar, Lokesh; Kumar, Shyam; Kurashvili, Podist; Kurepin, Alexander; Kurepin, Alexey; Kuryakin, Alexey; Kweon, Min Jung; Kwon, Youngil; La Pointe, Sarah Louise; La Rocca, Paola; Ladron De Guevara, Pedro; Lagana Fernandes, Caio; Lakomov, Igor; Langoy, Rune; Lara Martinez, Camilo Ernesto; Lardeux, Antoine Xavier; Lattuca, Alessandra; Laudi, Elisa; Lea, Ramona; Leardini, Lucia; Lee, Graham Richard; Lee, Seongjoo; Lehas, Fatiha; Lemmon, Roy Crawford; Lenti, Vito; Leogrande, Emilia; Leon Monzon, Ildefonso; Leon Vargas, Hermes; Leoncino, Marco; Levai, Peter; Li, Shuang; Li, Xiaomei; Lien, Jorgen Andre; Lietava, Roman; Lindal, Svein; Lindenstruth, Volker; Lippmann, Christian; Lisa, Michael Annan; Ljunggren, Hans Martin; Lodato, Davide Francesco; Lonne, Per-Ivar; Loginov, Vitaly; Loizides, Constantinos; Lopez, Xavier Bernard; Lopez Torres, Ernesto; Lowe, Andrew John; Luettig, Philipp Johannes; Lunardon, Marcello; Luparello, Grazia; Lutz, Tyler Harrison; Maevskaya, Alla; Mager, Magnus; Mahajan, Sanjay; Mahmood, Sohail Musa; Maire, Antonin; Majka, Richard Daniel; Malaev, Mikhail; Maldonado Cervantes, Ivonne Alicia; Malinina, Liudmila; Mal'Kevich, Dmitry; Malzacher, Peter; Mamonov, Alexander; Manko, Vladislav; Manso, Franck; Manzari, Vito; Marchisone, Massimiliano; Mares, Jiri; Margagliotti, Giacomo Vito; Margotti, Anselmo; Margutti, Jacopo; Marin, Ana Maria; Markert, Christina; Marquard, Marco; Martin, Nicole Alice; Martin Blanco, Javier; Martinengo, Paolo; Martinez Hernandez, Mario Ivan; Martinez-Garcia, Gines; Martinez Pedreira, Miguel; Mas, Alexis Jean-Michel; 
Masciocchi, Silvia; Masera, Massimo; Masoni, Alberto; Mastroserio, Annalisa; Matyja, Adam Tomasz; Mayer, Christoph; Mazer, Joel Anthony; Mazzoni, Alessandra Maria; Mcdonald, Daniel; Meddi, Franco; Melikyan, Yuri; Menchaca-Rocha, Arturo Alejandro; Meninno, Elisa; Mercado-Perez, Jorge; Meres, Michal; Miake, Yasuo; Mieskolainen, Matti Mikael; Mikhaylov, Konstantin; Milano, Leonardo; Milosevic, Jovan; Mischke, Andre; Mishra, Aditya Nath; Miskowiec, Dariusz Czeslaw; Mitra, Jubin; Mitu, Ciprian Mihai; Mohammadi, Naghmeh; Mohanty, Bedangadas; Molnar, Levente; Montano Zetina, Luis Manuel; Montes Prado, Esther; Moreira De Godoy, Denise Aparecida; Perez Moreno, Luis Alberto; Moretto, Sandra; Morreale, Astrid; Morsch, Andreas; Muccifora, Valeria; Mudnic, Eugen; Muhlheim, Daniel Michael; Muhuri, Sanjib; Mukherjee, Maitreyee; Mulligan, James Declan; Gameiro Munhoz, Marcelo; Munzer, Robert Helmut; Murakami, Hikari; Murray, Sean; Musa, Luciano; Musinsky, Jan; Naik, Bharati; Nair, Rahul; Nandi, Basanta Kumar; Nania, Rosario; Nappi, Eugenio; Naru, Muhammad Umair; Ferreira Natal Da Luz, Pedro Hugo; Nattrass, Christine; Rosado Navarro, Sebastian; Nayak, Kishora; Nayak, Ranjit; Nayak, Tapan Kumar; Nazarenko, Sergey; Nedosekin, Alexander; Nellen, Lukas; Ng, Fabian; Nicassio, Maria; Niculescu, Mihai; Niedziela, Jeremi; Nielsen, Borge Svane; Nikolaev, Sergey; Nikulin, Sergey; Nikulin, Vladimir; Noferini, Francesco; Nomokonov, Petr; Nooren, Gerardus; Cabanillas Noris, Juan Carlos; Norman, Jaime; Nyanin, Alexander; Nystrand, Joakim Ingemar; Oeschler, Helmut Oskar; Oh, Saehanseul; Oh, Sun Kun; Ohlson, Alice Elisabeth; Okatan, Ali; Okubo, Tsubasa; Olah, Laszlo; Oleniacz, Janusz; Oliveira Da Silva, Antonio Carlos; Oliver, Michael Henry; Onderwaater, Jacobus; Oppedisano, Chiara; Orava, Risto; Oravec, Matej; Ortiz Velasquez, Antonio; Oskarsson, Anders Nils Erik; Otwinowski, Jacek Tomasz; Oyama, Ken; Ozdemir, Mahmut; Pachmayer, Yvonne Chiara; Pagano, Davide; Pagano, Paola; Paic, Guy; Pal, Susanta Kumar; Pan, Jinjin; Pandey, Ashutosh Kumar; Papikyan, Vardanush; Pappalardo, Giuseppe; Pareek, Pooja; Park, Woojin; Parmar, Sonia; Passfeld, Annika; Paticchio, Vincenzo; Patra, Rajendra Nath; Paul, Biswarup; Pei, Hua; Peitzmann, Thomas; Pereira Da Costa, Hugo Denis Antonio; Peresunko, Dmitry Yurevich; Perez Lara, Carlos Eugenio; Perez Lezama, Edgar; Peskov, Vladimir; Pestov, Yury; Petracek, Vojtech; Petrov, Viacheslav; Petrovici, Mihai; Petta, Catia; Piano, Stefano; Pikna, Miroslav; Pillot, Philippe; Ozelin De Lima Pimentel, Lais; Pinazza, Ombretta; Pinsky, Lawrence; Piyarathna, Danthasinghe; Ploskon, Mateusz Andrzej; Planinic, Mirko; Pluta, Jan Marian; Pochybova, Sona; Podesta Lerma, Pedro Luis Manuel; Poghosyan, Martin; Polishchuk, Boris; Poljak, Nikola; Poonsawat, Wanchaloem; Pop, Amalia; Porteboeuf, Sarah Julie; Porter, R Jefferson; Pospisil, Jan; Prasad, Sidharth Kumar; Preghenella, Roberto; Prino, Francesco; Pruneau, Claude Andre; Pshenichnov, Igor; Puccio, Maximiliano; Puddu, Giovanna; Pujahari, Prabhat Ranjan; Punin, Valery; Putschke, Jorn Henning; Qvigstad, Henrik; Rachevski, Alexandre; Raha, Sibaji; Rajput, Sonia; Rak, Jan; Rakotozafindrabe, Andry Malala; Ramello, Luciano; Rami, Fouad; Raniwala, Rashmi; Raniwala, Sudhir; Rasanen, Sami Sakari; Rascanu, Bogdan Theodor; Rathee, Deepika; Read, Kenneth Francis; Redlich, Krzysztof; Reed, Rosi Jan; Rehman, Attiq Ur; Reichelt, Patrick Simon; Reidt, Felix; Ren, Xiaowen; Renfordt, Rainer Arno Ernst; Reolon, Anna Rita; Reshetin, Andrey; Reygers, Klaus Johannes; Riabov, Viktor; 
Ricci, Renato Angelo; Richert, Tuva Ora Herenui; Richter, Matthias Rudolph; Riedler, Petra; Riegler, Werner; Riggi, Francesco; Ristea, Catalin-Lucian; Rocco, Elena; Rodriguez Cahuantzi, Mario; Rodriguez Manso, Alis; Roeed, Ketil; Rogochaya, Elena; Rohr, David Michael; Roehrich, Dieter; Ronchetti, Federico; Ronflette, Lucile; Rosnet, Philippe; Rossi, Andrea; Roukoutakis, Filimon; Roy, Ankhi; Roy, Christelle Sophie; Roy, Pradip Kumar; Rubio Montero, Antonio Juan; Rui, Rinaldo; Russo, Riccardo; Ryabinkin, Evgeny; Ryabov, Yury; Rybicki, Andrzej; Saarinen, Sampo; Sadhu, Samrangy; Sadovskiy, Sergey; Safarik, Karel; Sahlmuller, Baldo; Sahoo, Pragati; Sahoo, Raghunath; Sahoo, Sarita; Sahu, Pradip Kumar; Saini, Jogender; Sakai, Shingo; Saleh, Mohammad Ahmad; Salzwedel, Jai Samuel Nielsen; Sambyal, Sanjeev Singh; Samsonov, Vladimir; Sandor, Ladislav; Sandoval, Andres; Sano, Masato; Sarkar, Debojit; Sarkar, Nachiketa; Sarma, Pranjal; Scapparone, Eugenio; Scarlassara, Fernando; Schiaua, Claudiu Cornel; Schicker, Rainer Martin; Schmidt, Christian Joachim; Schmidt, Hans Rudolf; Schuchmann, Simone; Schukraft, Jurgen; Schulc, Martin; Schutz, Yves Roland; Schwarz, Kilian Eberhard; Schweda, Kai Oliver; Scioli, Gilda; Scomparin, Enrico; Scott, Rebecca Michelle; Sefcik, Michal; Seger, Janet Elizabeth; Sekiguchi, Yuko; Sekihata, Daiki; Selyuzhenkov, Ilya; Senosi, Kgotlaesele; Senyukov, Serhiy; Serradilla Rodriguez, Eulogio; Sevcenco, Adrian; Shabanov, Arseniy; Shabetai, Alexandre; Shadura, Oksana; Shahoyan, Ruben; Shahzad, Muhammed Ikram; Shangaraev, Artem; Sharma, Ankita; Sharma, Mona; Sharma, Monika; Sharma, Natasha; Sheikh, Ashik Ikbal; Shigaki, Kenta; Shou, Qiye; Shtejer Diaz, Katherin; Sibiryak, Yury; Siddhanta, Sabyasachi; Sielewicz, Krzysztof Marek; Siemiarczuk, Teodor; Silvermyr, David Olle Rickard; Silvestre, Catherine Micaela; Simatovic, Goran; Simonetti, Giuseppe; Singaraju, Rama Narayana; Singh, Ranbir; Singha, Subhash; Singhal, Vikas; Sinha, Bikash; Sarkar - Sinha, Tinku; Sitar, Branislav; Sitta, Mario; Skaali, Bernhard; Slupecki, Maciej; Smirnov, Nikolai; Snellings, Raimond; Snellman, Tomas Wilhelm; Song, Jihye; Song, Myunggeun; Song, Zixuan; Soramel, Francesca; Sorensen, Soren Pontoppidan; Derradi De Souza, Rafael; Sozzi, Federica; Spacek, Michal; Spiriti, Eleuterio; Sputowska, Iwona Anna; Spyropoulou-Stassinaki, Martha; Stachel, Johanna; Stan, Ionel; Stankus, Paul; Stenlund, Evert Anders; Steyn, Gideon Francois; Stiller, Johannes Hendrik; Stocco, Diego; Strmen, Peter; Alarcon Do Passo Suaide, Alexandre; Sugitate, Toru; Suire, Christophe Pierre; Suleymanov, Mais Kazim Oglu; Suljic, Miljenko; Sultanov, Rishat; Sumbera, Michal; Sumowidagdo, Suharyo; Szabo, Alexander; Szanto De Toledo, Alejandro; Szarka, Imrich; Szczepankiewicz, Adam; Szymanski, Maciej Pawel; Tabassam, Uzma; Takahashi, Jun; Tambave, Ganesh Jagannath; Tanaka, Naoto; Tarhini, Mohamad; Tariq, Mohammad; Tarzila, Madalina-Gabriela; Tauro, Arturo; Tejeda Munoz, Guillermo; Telesca, Adriana; Terasaki, Kohei; Terrevoli, Cristina; Teyssier, Boris; Thaeder, Jochen Mathias; Thakur, Dhananjaya; Thomas, Deepa; Tieulent, Raphael Noel; Timmins, Anthony Robert; Toia, Alberica; Trogolo, Stefano; Trombetta, Giuseppe; Trubnikov, Victor; Trzaska, Wladyslaw Henryk; Tsuji, Tomoya; Tumkin, Alexandr; Turrisi, Rosario; Tveter, Trine Spedstad; Ullaland, Kjetil; Uras, Antonio; Usai, Gianluca; Utrobicic, Antonija; Vala, Martin; Valencia Palomo, Lizardo; Vallero, Sara; Van Der Maarel, Jasper; Van Hoorne, Jacobus Willem; Van Leeuwen, Marco; Vanat, Tomas; Vande 
Vyvre, Pierre; Varga, Dezso; Diozcora Vargas Trevino, Aurora; Vargyas, Marton; Varma, Raghava; Vasileiou, Maria; Vasiliev, Andrey; Vauthier, Astrid; Vechernin, Vladimir; Veen, Annelies Marianne; Veldhoen, Misha; Velure, Arild; Vercellin, Ermanno; Vergara Limon, Sergio; Vernet, Renaud; Verweij, Marta; Vickovic, Linda; Viesti, Giuseppe; Viinikainen, Jussi Samuli; Vilakazi, Zabulon; Villalobos Baillie, Orlando; Villatoro Tello, Abraham; Vinogradov, Alexander; Vinogradov, Leonid; Vinogradov, Yury; Virgili, Tiziano; Vislavicius, Vytautas; Viyogi, Yogendra; Vodopyanov, Alexander; Volkl, Martin Andreas; Voloshin, Kirill; Voloshin, Sergey; Volpe, Giacomo; Von Haller, Barthelemy; Vorobyev, Ivan; Vranic, Danilo; Vrlakova, Janka; Vulpescu, Bogdan; Wagner, Boris; Wagner, Jan; Wang, Hongkai; Wang, Mengliang; Watanabe, Daisuke; Watanabe, Yosuke; Weber, Michael; Weber, Steffen Georg; Weiser, Dennis Franz; Wessels, Johannes Peter; Westerhoff, Uwe; Whitehead, Andile Mothegi; Wiechula, Jens; Wikne, Jon; Wilk, Grzegorz Andrzej; Wilkinson, Jeremy John; Williams, Crispin; Windelband, Bernd Stefan; Winn, Michael Andreas; Yang, Hongyan; Yang, Ping; Yano, Satoshi; Yasin, Zafar; Yin, Zhongbao; Yokoyama, Hiroki; Yoo, In-Kwon; Yoon, Jin Hee; Yurchenko, Volodymyr; Yushmanov, Igor; Zaborowska, Anna; Zaccolo, Valentina; Zaman, Ali; Zampolli, Chiara; Correia Zanoli, Henrique Jose; Zaporozhets, Sergey; Zardoshti, Nima; Zarochentsev, Andrey; Zavada, Petr; Zavyalov, Nikolay; Zbroszczyk, Hanna Paulina; Zgura, Sorin Ion; Zhalov, Mikhail; Zhang, Haitao; Zhang, Xiaoming; Zhang, Yonghong; Chunhui, Zhang; Zhang, Zuman; Zhao, Chengxin; Zhigareva, Natalia; Zhou, Daicui; Zhou, You; Zhou, Zhuo; Zhu, Hongsheng; Zhu, Jianhui; Zichichi, Antonino; Zimmermann, Alice; Zimmermann, Markus Bernhard; Zinovjev, Gennady; Zyzak, Maksym

    2016-05-25

    We present a Bayesian approach to particle identification (PID) within the ALICE experiment. The aim is to more effectively combine the particle identification capabilities of its various detectors. After a brief explanation of the adopted methodology and formalism, the performance of the Bayesian PID approach for charged pions, kaons and protons in the central barrel of ALICE is studied. PID is performed via measurements of specific energy loss (dE/dx) and time-of-flight. PID efficiencies and misidentification probabilities are extracted and compared with Monte Carlo simulations using high purity samples of identified particles in the decay channels ${\\rm K}_{\\rm S}^{\\rm 0}\\rightarrow \\pi^+\\pi^-$, $\\phi\\rightarrow {\\rm K}^-{\\rm K}^+$ and $\\Lambda\\rightarrow{\\rm p}\\pi^-$ in p–Pb collisions at $\\sqrt{s_{\\rm NN}}= 5.02$TeV. In order to thoroughly assess the validity of the Bayesian approach, this methodology was used to obtain corrected $p_{\\rm T}$ spectra of pions, kaons, protons, and D$^0$ mesons in pp coll...
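
    The core of such a Bayesian PID combination can be illustrated with a short sketch (not ALICE code): detector responses are modelled as Gaussians around the expected signal for each species, and the per-detector likelihoods are multiplied with priors to give posterior species probabilities. All expected signals, resolutions and priors below are invented.

        import numpy as np
        from scipy.stats import norm

        species = ["pion", "kaon", "proton"]
        priors = np.array([0.7, 0.2, 0.1])           # hypothetical relative abundances

        # Hypothetical expected detector signals and resolutions at a given momentum
        expected_dedx = np.array([1.0, 1.6, 2.4])    # arbitrary units
        sigma_dedx = 0.15
        expected_tof = np.array([10.0, 10.4, 11.1])  # ns, arbitrary
        sigma_tof = 0.12

        measured_dedx, measured_tof = 1.55, 10.45    # one hypothetical track

        # Per-detector likelihoods, combined under conditional independence
        like = (norm.pdf(measured_dedx, expected_dedx, sigma_dedx)
                * norm.pdf(measured_tof, expected_tof, sigma_tof))
        posterior = priors * like
        posterior /= posterior.sum()

        for s, p in zip(species, posterior):
            print(f"P({s} | dE/dx, TOF) = {p:.3f}")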

  13. Bayesian and variational Bayesian approaches for flows in heterogeneous random media

    Science.gov (United States)

    Yang, Keren; Guha, Nilabja; Efendiev, Yalchin; Mallick, Bani K.

    2017-09-01

    In this paper, we study porous media flows in heterogeneous stochastic media. We propose an efficient forward simulation technique that is tailored for variational Bayesian inversion. As a starting point, the proposed forward simulation technique decomposes the solution into the sum of separable functions (with respect to randomness and the space), where each term is calculated based on a variational approach. This is similar to Proper Generalized Decomposition (PGD). Next, we apply a multiscale technique to solve for each term (as in [1]) and, further, decompose the random function into 1D fields. As a result, our proposed method provides an approximation hierarchy for the solution as we increase the number of terms in the expansion and, also, increase the spatial resolution of each term. We use the hierarchical solution distributions in a variational Bayesian approximation to perform uncertainty quantification in the inverse problem. We conduct a detailed numerical study to explore the performance of the proposed uncertainty quantification technique and show the theoretical posterior concentration.

  14. HDDM: Hierarchical Bayesian estimation of the Drift-Diffusion Model in Python

    Directory of Open Access Journals (Sweden)

    Thomas V Wiecki

    2013-08-01

    Full Text Available The diffusion model is a commonly used tool to infer latent psychological processes underlying decision making, and to link them to neural mechanisms based on reaction times. Although efficient open source software has been made available to quantitatively fit the model to data, current estimation methods require an abundance of reaction time measurements to recover meaningful parameters, and only provide point estimates of each parameter. In contrast, hierarchical Bayesian parameter estimation methods are useful for enhancing statistical power, allowing for simultaneous estimation of individual subject parameters and the group distribution that they are drawn from, while also providing measures of uncertainty in these parameters in the posterior distribution. Here, we present a novel Python-based toolbox called HDDM (hierarchical drift diffusion model), which allows fast and flexible estimation of the drift-diffusion model and the related linear ballistic accumulator model. HDDM requires fewer data per subject/condition than non-hierarchical methods, allows for full Bayesian data analysis, and can handle outliers in the data. Finally, HDDM supports the estimation of how trial-by-trial measurements (e.g. fMRI) influence decision-making parameters. This paper will first describe the theoretical background of the drift-diffusion model and Bayesian inference. We then illustrate usage of the toolbox on a real-world data set from our lab. Finally, parameter recovery studies show that HDDM beats alternative fitting methods like the chi-quantile method as well as maximum likelihood estimation. The software and documentation can be downloaded at: http://ski.clps.brown.edu/hddm_docs
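
    A minimal usage sketch based on HDDM's documented interface (details are version-dependent; the CSV file name and the "stim" condition column are hypothetical, while "rt", "response" and "subj_idx" are the toolbox's conventional column names):

        import pandas as pd
        import hddm

        # Data are expected as one row per trial with columns:
        #   rt (s), response (0/1), subj_idx, plus any condition columns.
        data = pd.read_csv("experiment_data.csv")

        # Hierarchical drift-diffusion model; drift rate v allowed to vary by stimulus.
        model = hddm.HDDM(data, depends_on={"v": "stim"})
        model.sample(2000, burn=500)      # MCMC sampling of the joint posterior
        model.print_stats()               # group- and subject-level parameter summaries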

  15. Top-down feedback in an HMAX-like cortical model of object perception based on hierarchical Bayesian networks and belief propagation

    National Research Council Canada - National Science Library

    Dura-Bernal, Salvador; Wennekers, Thomas; Denham, Susan L

    2012-01-01

    Hierarchical generative models, such as Bayesian networks, and belief propagation have been shown to provide a theoretical framework that can account for perceptual processes, including feedforward...

  16. Top-Down Feedback in an HMAX-Like Cortical Model of Object Perception Based on Hierarchical Bayesian Networks and Belief Propagation: e48216

    National Research Council Canada - National Science Library

    Salvador Dura-Bernal; Thomas Wennekers; Susan L Denham

    2012-01-01

      Hierarchical generative models, such as Bayesian networks, and belief propagation have been shown to provide a theoretical framework that can account for perceptual processes, including feedforward...

  17. Inter-rater reliability of pressure ulcer staging: ordinal probit Bayesian hierarchical model that allows for uncertain rater response.

    Science.gov (United States)

    Gajewski, Byron J; Hart, Sara; Bergquist-Beringer, Sandra; Dunton, Nancy

    2007-11-10

    This article describes a method for estimating the inter-rater reliability of pressure ulcer (PU) staging (stages I-IV) from raters in National Database of Nursing Quality Indicators (NDNQI) participating hospitals. The method models ordinal staging data using an ordinal probit Bayesian hierarchical model (BHM) across several hospitals in which raters monitor patients' PUs. An ulcer that cannot be accurately assessed because the base of the wound cannot be seen is defined as unstageable. Our novel approach allows for an unstageable PU rating to be included in the analysis. We compare the ordinal probit BHM to an approximate random-effects model (the standard approach in the literature) that assumes that the raw ordinal data are continuous. Copyright 2007 John Wiley & Sons, Ltd.
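
    To show the ordinal probit mechanics behind such a model (a simplified sketch, not the authors' specification): given a latent continuous severity and a set of cutpoints, the probability of each observed stage is the standard normal mass between adjacent cutpoints, and an extra "unstageable" response can be handled by mixing with a rater-specific uncertainty probability. All cutpoints and probabilities below are illustrative.

        import numpy as np
        from scipy.stats import norm

        # Illustrative cutpoints separating latent severity into stages I-IV
        cuts = np.array([-1.0, 0.0, 1.0])          # 3 cutpoints -> 4 ordered stages
        eta = 0.4                                  # latent severity for one wound/rater
        p_unstageable = 0.15                       # chance the rater cannot stage it

        # Ordinal probit probabilities for stages I..IV
        upper = np.append(cuts, np.inf)
        lower = np.insert(cuts, 0, -np.inf)
        p_stage = norm.cdf(upper - eta) - norm.cdf(lower - eta)

        # Observed-category probabilities once "unstageable" is allowed
        p_obs = np.append((1 - p_unstageable) * p_stage, p_unstageable)
        labels = ["I", "II", "III", "IV", "unstageable"]
        for lab, p in zip(labels, p_obs):
            print(f"P(rating = {lab}) = {p:.3f}")
        print("total =", p_obs.sum().round(6))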

  18. Bayesian approach to avoiding track seduction

    Science.gov (United States)

    Salmond, David J.; Everett, Nicholas O.

    2002-08-01

    The problem of maintaining track on a primary target in the presence of spurious objects is addressed. Recursive and batch filtering approaches are developed. For the recursive approach, a Bayesian track splitting filter is derived which spawns candidate tracks if there is a possibility of measurement misassociation. The filter evaluates the probability of each candidate track being associated with the primary target. The batch filter is a Markov-chain Monte Carlo (MCMC) algorithm which fits the observed data sequence to models of target dynamics and measurement-track association. Simulation results are presented.

  19. Prion Amplification and Hierarchical Bayesian Modeling Refine Detection of Prion Infection

    Science.gov (United States)

    Wyckoff, A. Christy; Galloway, Nathan; Meyerett-Reid, Crystal; Powers, Jenny; Spraker, Terry; Monello, Ryan J.; Pulford, Bruce; Wild, Margaret; Antolin, Michael; Vercauteren, Kurt; Zabel, Mark

    2015-02-01

    Prions are unique infectious agents that replicate without a genome and cause neurodegenerative diseases that include chronic wasting disease (CWD) of cervids. Immunohistochemistry (IHC) is currently considered the gold standard for diagnosis of a prion infection but may be insensitive to early or sub-clinical CWD infections that are important to understanding CWD transmission and ecology. We assessed the potential of serial protein misfolding cyclic amplification (sPMCA) to improve detection of CWD prior to the onset of clinical signs. We analyzed tissue samples from free-ranging Rocky Mountain elk (Cervus elaphus nelsoni) and used hierarchical Bayesian analysis to estimate the specificity and sensitivity of IHC and sPMCA conditional on simultaneously estimated disease states. Sensitivity estimates were higher for sPMCA (99.51%, credible interval (CI) 97.15-100%) than IHC of obex (brain stem, 76.56%, CI 57.00-91.46%) or retropharyngeal lymph node (90.06%, CI 74.13-98.70%) tissues, or both (98.99%, CI 90.01-100%). Our hierarchical Bayesian model predicts the prevalence of prion infection in this elk population to be 18.90% (CI 15.50-32.72%), compared to previous estimates of 12.90%. Our data reveal a previously unidentified sub-clinical prion-positive portion of the elk population that could represent silent carriers capable of significantly impacting CWD ecology.
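
    A stripped-down version of the latent-state idea (not the authors' full model): with sensitivity and specificity treated as known for simplicity, the posterior for prevalence given the number of test-positive animals can be computed on a grid. The counts and test characteristics below are invented.

        import numpy as np
        from scipy.stats import binom

        # Hypothetical survey: n animals tested, y test-positive
        n, y = 200, 45
        se, sp = 0.90, 0.98          # assumed known sensitivity / specificity

        # Grid over true prevalence; apparent prevalence mixes Se and (1 - Sp)
        prev = np.linspace(0.001, 0.999, 999)
        p_apparent = se * prev + (1 - sp) * (1 - prev)

        # Flat prior on prevalence; posterior over the grid
        post = binom.pmf(y, n, p_apparent)
        post /= post.sum()

        mean_prev = np.sum(prev * post)
        cdf = np.cumsum(post)
        lo, hi = prev[np.searchsorted(cdf, 0.025)], prev[np.searchsorted(cdf, 0.975)]
        print(f"posterior mean prevalence = {mean_prev:.3f}, 95% interval ~ ({lo:.3f}, {hi:.3f})")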

  20. A Hierarchical Structure of Classification based on Trainable Bayesian Classifier for Logo Detection and Recognition in Document Image

    OpenAIRE

    Hossein Pourghassem

    2010-01-01

    The ever-increasing number of logos (trademarks) in official automation systems for information management, archiving and retrieval applications has created a greater demand for automatic logo detection and recognition. In this paper, a hierarchical classification structure based on a Bayesian classifier is proposed for logo detection and recognition. In this hierarchical structure, using two measures, false accept and false reject, a novel and straightforward training scheme is presented to extrac...

  1. A hierarchical Bayesian GEV model for improving local and regional flood quantile estimates

    Science.gov (United States)

    Lima, Carlos H. R.; Lall, Upmanu; Troy, Tara; Devineni, Naresh

    2016-10-01

    We estimate local and regional Generalized Extreme Value (GEV) distribution parameters for flood frequency analysis in a multilevel, hierarchical Bayesian framework, to explicitly model and reduce uncertainties. As prior information for the model, we assume that the GEV location and scale parameters for each site come from independent log-normal distributions, whose mean parameter scales with the drainage area. From empirical and theoretical arguments, the shape parameter for each site is shrunk towards a common mean. Non-informative prior distributions are assumed for the hyperparameters and the MCMC method is used to sample from the joint posterior distribution. The model is tested using annual maximum series from 20 streamflow gauges located in an 83,000 km2 flood-prone basin in Southeast Brazil. The results show a significant reduction in the uncertainty of flood quantile estimates over the traditional GEV model, particularly for sites with shorter records. For return periods within the range of the data (around 50 years), the Bayesian credible intervals for the flood quantiles tend to be narrower than the classical confidence limits based on the delta method. As the return period increases beyond the range of the data, the confidence limits from the delta method become unreliable and the Bayesian credible intervals provide a way to estimate satisfactory confidence bands for the flood quantiles considering parameter uncertainties and regional information. In order to evaluate the applicability of the proposed hierarchical Bayesian model for regional flood frequency analysis, we estimate flood quantiles for three randomly chosen out-of-sample sites and compare with classical estimates using the index flood method. The posterior distributions of the scaling law coefficients are used to define the predictive distributions of the GEV location and scale parameters for the out-of-sample sites given only their drainage areas and the posterior distribution of the
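
    For the at-site building block of such a model, the sketch below fits a GEV to synthetic annual maxima and reports a 50-year return level with scipy (note that scipy's shape parameter c is the negative of the usual GEV shape xi). The data are simulated, and the hierarchical pooling of parameters across sites is not reproduced here.

        import numpy as np
        from scipy.stats import genextreme

        rng = np.random.default_rng(42)

        # Synthetic annual-maximum floods (m^3/s) for one gauge
        true_c, loc, scale = -0.1, 800.0, 250.0      # scipy convention: c = -xi
        annual_max = genextreme.rvs(true_c, loc=loc, scale=scale, size=40, random_state=rng)

        # At-site maximum-likelihood GEV fit
        c_hat, loc_hat, scale_hat = genextreme.fit(annual_max)

        # T-year return level = quantile with non-exceedance probability 1 - 1/T
        T = 50
        q50 = genextreme.ppf(1 - 1 / T, c_hat, loc=loc_hat, scale=scale_hat)
        print(f"fitted shape c={c_hat:.3f}, location={loc_hat:.1f}, scale={scale_hat:.1f}")
        print(f"estimated {T}-year flood quantile: {q50:.1f} m^3/s")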

  2. Probabilistic daily ILI syndromic surveillance with a spatio-temporal Bayesian hierarchical model.

    Directory of Open Access Journals (Sweden)

    Ta-Chien Chan

    Full Text Available BACKGROUND: For daily syndromic surveillance to be effective, an efficient and sensible algorithm would be expected to detect aberrations in influenza illness, and alert public health workers prior to any impending epidemic. This detection or alert surely contains uncertainty, and thus should be evaluated with a proper probabilistic measure. However, traditional monitoring mechanisms simply provide a binary alert, failing to adequately address this uncertainty. METHODS AND FINDINGS: Based on the Bayesian posterior probability of influenza-like illness (ILI) visits, the intensity of an outbreak can be directly assessed. The numbers of daily emergency room ILI visits at five community hospitals in Taipei City during 2006-2007 were collected and fitted with a Bayesian hierarchical model containing meteorological factors such as temperature and vapor pressure, spatial interaction with conditional autoregressive structure, weekend and holiday effects, seasonality factors, and previous ILI visits. The proposed algorithm recommends an alert for action if the posterior probability is larger than 70%. External data from January to February of 2008 were retained for validation. The decision rule successfully detects the peak in the validation period. When comparing the posterior probability evaluation with the modified Cusum method, results show that the proposed method is able to detect the signals 1-2 days prior to the rise of ILI visits. CONCLUSIONS: This Bayesian hierarchical model not only constitutes a dynamic surveillance system but also constructs a stochastic evaluation of the need to call for an alert. The monitoring mechanism provides earlier detection as well as a complementary tool for current surveillance programs.
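
    The 70% alert rule can be illustrated with a much simpler surrogate (not the published spatio-temporal model): under a Gamma-Poisson model for daily ILI counts, the posterior probability that today's underlying rate is elevated relative to a recent baseline is computed by Monte Carlo, and an alert is raised when it exceeds 0.7. The counts, prior hyperparameters and the 30% elevation threshold are all invented.

        import numpy as np

        rng = np.random.default_rng(7)

        history = np.array([12, 15, 9, 14, 11, 13, 10])   # baseline daily ILI counts
        today = 24

        a0, b0 = 1.0, 0.1   # weak Gamma(shape, rate) prior on Poisson rates

        # Gamma posteriors for the baseline rate and for today's rate
        base_draws = rng.gamma(a0 + history.sum(), 1.0 / (b0 + len(history)), size=50_000)
        today_draws = rng.gamma(a0 + today, 1.0 / (b0 + 1.0), size=50_000)

        # Posterior probability that today's rate is elevated relative to baseline
        p_outbreak = np.mean(today_draws > 1.3 * base_draws)   # 30% excess, arbitrary
        print(f"posterior probability of elevated ILI = {p_outbreak:.3f}")
        print("ALERT" if p_outbreak > 0.7 else "no alert")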

  3. Using Bayesian hierarchical models to better understand nitrate sources and sinks in agricultural watersheds.

    Science.gov (United States)

    Xia, Yongqiu; Weller, Donald E; Williams, Meghan N; Jordan, Thomas E; Yan, Xiaoyuan

    2016-11-15

    Export coefficient models (ECMs) are often used to predict nutrient sources and sinks in watersheds because ECMs can flexibly incorporate processes and have minimal data requirements. However, ECMs do not quantify uncertainties in model structure, parameters, or predictions; nor do they account for spatial and temporal variability in land characteristics, weather, and management practices. We applied Bayesian hierarchical methods to address these problems in ECMs used to predict nitrate concentration in streams. We compared four model formulations, a basic ECM and three models with additional terms to represent competing hypotheses about the sources of error in ECMs and about spatial and temporal variability of coefficients: an ADditive Error Model (ADEM), a SpatioTemporal Parameter Model (STPM), and a Dynamic Parameter Model (DPM). The DPM incorporates a first-order random walk to represent spatial correlation among parameters and a dynamic linear model to accommodate temporal correlation. We tested the modeling approach in a proof of concept using watershed characteristics and nitrate export measurements from watersheds in the Coastal Plain physiographic province of the Chesapeake Bay drainage. Among the four models, the DPM was the best--it had the lowest mean error, explained the most variability (R(2) = 0.99), had the narrowest prediction intervals, and provided the most effective tradeoff between fit and complexity (its deviance information criterion, DIC, was 45.6 units lower than any other model, indicating overwhelming support for the DPM). The superiority of the DPM supports its underlying hypothesis that the main source of error in ECMs is their failure to account for parameter variability rather than structural error. Analysis of the fitted DPM coefficients for cropland export and instream retention revealed some of the factors controlling nitrate concentration: cropland nitrate exports were positively related to stream flow and watershed average slope

  4. A Bayesian meta-analytic approach for safety signal detection in randomized clinical trials.

    Science.gov (United States)

    Odani, Motoi; Fukimbara, Satoru; Sato, Tosiya

    2017-04-01

    Meta-analyses are frequently performed on adverse event data and are primarily used for improving statistical power to detect safety signals. However, in the evaluation of drug safety for New Drug Applications, simple pooling of adverse event data from multiple clinical trials is still commonly used. We sought to propose a new Bayesian hierarchical meta-analytic approach based on consideration of a hierarchical structure of reported individual adverse event data from multiple randomized clinical trials. To develop our meta-analysis model, we extended an existing three-stage Bayesian hierarchical model by including an additional stage of the clinical trial level in the hierarchical model; this generated a four-stage Bayesian hierarchical model. We applied the proposed Bayesian meta-analysis models to published adverse event data from three premarketing randomized clinical trials of tadalafil and to a simulation study motivated by the case example to evaluate the characteristics of three alternative models. Comparison of the results from the Bayesian meta-analysis model with those from Fisher's exact test after simple pooling showed that 6 out of 10 adverse events were the same within a top 10 ranking of individual adverse events with regard to association with treatment. However, more individual adverse events were detected in the Bayesian meta-analysis model than in Fisher's exact test under the body system "Musculoskeletal and connective tissue disorders." Moreover, comparison of the overall trend of estimates between the Bayesian model and the standard approach (odds ratios after simple pooling methods) revealed that the posterior median odds ratios for the Bayesian model for most adverse events shrank toward values for no association. Based on the simulation results, the Bayesian meta-analysis model could balance the false detection rate and power to a better extent than Fisher's exact test. For example, when the threshold value of the posterior probability for

  5. Estimating temporal trend in the presence of spatial complexity: a Bayesian hierarchical model for a wetland plant population undergoing restoration.

    Directory of Open Access Journals (Sweden)

    Thomas J Rodhouse

    Full Text Available Monitoring programs that evaluate restoration and inform adaptive management are important for addressing environmental degradation. These efforts may be well served by spatially explicit hierarchical approaches to modeling because of unavoidable spatial structure inherited from past land use patterns and other factors. We developed Bayesian hierarchical models to estimate trends from annual density counts observed in a spatially structured wetland forb (Camassia quamash [camas]) population following the cessation of grazing and mowing on the study area, and in a separate reference population of camas. The restoration site was bisected by roads and drainage ditches, resulting in distinct subpopulations ("zones") with different land use histories. We modeled this spatial structure by fitting zone-specific intercepts and slopes. We allowed spatial covariance parameters in the model to vary by zone, as in stratified kriging, accommodating anisotropy and improving computation and biological interpretation. Trend estimates provided evidence of a positive effect of passive restoration, and the strength of evidence was influenced by the amount of spatial structure in the model. Allowing trends to vary among zones and accounting for topographic heterogeneity increased precision of trend estimates. Accounting for spatial autocorrelation shifted parameter coefficients in ways that varied among zones depending on strength of statistical shrinkage, autocorrelation and topographic heterogeneity--a phenomenon not widely described. Spatially explicit estimates of trend from hierarchical models will generally be more useful to land managers than pooled regional estimates and provide more realistic assessments of uncertainty. The ability to grapple with historical contingency is an appealing benefit of this approach.

  6. A sow replacement model using Bayesian updating in a three-level hierarchic Markov process. II. Optimization model

    DEFF Research Database (Denmark)

    Kristensen, Anders Ringgaard; Søllested, Thomas Algot

    2004-01-01

    Recent methodological improvements in replacement models comprising multi-level hierarchical Markov processes and Bayesian updating have hardly been implemented in any replacement model and the aim of this study is to present a sow replacement model that really uses these methodological improveme...

  7. Use of hierarchical Bayesian framework in MTS studies to model different causes and novel possible forms of acquired MTS.

    Science.gov (United States)

    Ognibene, Dimitri; Giglia, Giuseppe

    2015-01-01

    An integrative account of MTS could be cast in terms of hierarchical Bayesian inference. It may help to highlight the central role that sensory (tactile) precision could play in MTS. We suggest that anosognosic patients, with anesthetic hemisoma, can also be interpreted as a form of acquired MTS, providing additional data for the model.

  8. Hierarchical Bayesian spatial models for alcohol availability, drug "hot spots" and violent crime

    Directory of Open Access Journals (Sweden)

    Horel Scott

    2006-12-01

    Full Text Available Abstract Background Ecologic studies have shown a relationship between alcohol outlet densities, illicit drug use and violence. The present study examined this relationship in the City of Houston, Texas, using a sample of 439 census tracts. Neighborhood sociostructural covariates, alcohol outlet density, drug crime density and violent crime data were collected for the year 2000, and analyzed using hierarchical Bayesian models. Model selection was accomplished by applying the Deviance Information Criterion. Results The counts of violent crime in each census tract were modelled as having a conditional Poisson distribution. Four neighbourhood explanatory variables were identified using principal component analysis. The best-fitting model was the one including both unstructured and spatially dependent random effects. The results showed that drug-law violation explained a greater amount of variance in violent crime rates than alcohol outlet densities. The relative risk for drug-law violation was 2.49 and that for alcohol outlet density was 1.16. Of the neighbourhood sociostructural covariates, males of age 15 to 24 showed an effect on violence, with a 16% decrease in relative risk for each one-standard-deviation increase. Both the unstructured heterogeneity random effect and the spatial dependence term needed to be included in the model. Conclusion The analysis presented suggests that activity around illicit drug markets is more strongly associated with violent crime than is alcohol outlet density. Unique among the ecological studies in this field, the present study not only shows the direction and magnitude of the impact of neighbourhood sociostructural covariates and of alcohol and illicit drug activities in a neighbourhood, it also reveals the importance of applying hierarchical Bayesian models in this research field, as both spatial dependence and heterogeneity random effects need to be considered simultaneously.

  9. Hierarchical approaches to analysis of natural textures

    Science.gov (United States)

    Lutsiv, Vadim R.; Malyshev, Igor A.; Novikova, Tatiana A.

    2004-09-01

    The surface textures of natural objects often have visible fractal-like properties. A similar texture pattern can be found when looking at forests in aerial photographs or at trees in outdoor scenes as the image spatial resolution is changed. Likewise, the texture patterns of villages in aerial photographs differ at different spatial resolution levels. This creates difficulties in image segmentation and object recognition because the levels of spatial resolution necessary to obtain homogeneously and correctly labeled texture regions differ for different types of landscape. For example, if the spatial resolution is sufficient for distinguishing between the textures of agricultural fields, water, and asphalt, the texture-labeled areas of forest or suburbs are heavily fragmented, because texture peculiarities corresponding to two stable levels of texture spatial resolution are visible in this case. A hierarchical texture analysis could solve this problem, and we did it in two different ways: we performed the texture segmentation simultaneously for several levels of image spatial resolution, or we subjected the texture-labeled image of the highest spatial resolution to recurrent texture segmentation using texture cells of larger sizes. Both approaches turned out to be rather fruitful for aerial photographs as well as for outdoor images. They generalize and support the hierarchical image analysis technique presented in another of our papers. Some of the methods applied were borrowed from living vision systems.

  10. Bayesian hierarchical models for cost-effectiveness analyses that use data from cluster randomized trials.

    Science.gov (United States)

    Grieve, Richard; Nixon, Richard; Thompson, Simon G

    2010-01-01

    Cost-effectiveness analyses (CEA) may be undertaken alongside cluster randomized trials (CRTs) where randomization is at the level of the cluster (for example, the hospital or primary care provider) rather than the individual. Costs (and outcomes) within clusters may be correlated so that the assumption made by standard bivariate regression models, that observations are independent, is incorrect. This study develops a flexible modeling framework to acknowledge the clustering in CEA that use CRTs. The authors extend previous Bayesian bivariate models for CEA of multicenter trials to recognize the specific form of clustering in CRTs. They develop new Bayesian hierarchical models (BHMs) that allow mean costs and outcomes, and also variances, to differ across clusters. They illustrate how each model can be applied using data from a large (1732 cases, 70 primary care providers) CRT evaluating alternative interventions for reducing postnatal depression. The analyses compare cost-effectiveness estimates from BHMs with standard bivariate regression models that ignore the data hierarchy. The BHMs show high levels of cost heterogeneity across clusters (intracluster correlation coefficient, 0.17). Compared with standard regression models, the BHMs yield substantially increased uncertainty surrounding the cost-effectiveness estimates, and altered point estimates. The authors conclude that ignoring clustering can lead to incorrect inferences. The BHMs that they present offer a flexible modeling framework that can be applied more generally to CEA that use CRTs.

  11. A Bayesian hierarchical model for discrete choice data in health care.

    Science.gov (United States)

    Antonio, Anna Liza M; Weiss, Robert E; Saigal, Christopher S; Dahan, Ely; Crespi, Catherine M

    2017-01-01

    In discrete choice experiments, patients are presented with sets of health states described by various attributes and asked to make choices from among them. Discrete choice experiments allow health care researchers to study the preferences of individual patients by eliciting trade-offs between different aspects of health-related quality of life. However, many discrete choice experiments yield data with incomplete ranking information and sparsity due to the limited number of choice sets presented to each patient, making it challenging to estimate patient preferences. Moreover, methods to identify outliers in discrete choice data are lacking. We develop a Bayesian hierarchical random effects rank-ordered multinomial logit model for discrete choice data. Missing ranks are accounted for by marginalizing over all possible permutations of unranked alternatives to estimate individual patient preferences, which are modeled as a function of patient covariates. We provide a Bayesian version of relative attribute importance, and adapt the use of the conditional predictive ordinate to identify outlying choice sets and outlying individuals with unusual preferences compared to the population. The model is applied to data from a study using a discrete choice experiment to estimate individual patient preferences for health states related to prostate cancer treatment.
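
    The rank-ordered ("exploded") logit likelihood at the heart of such a model can be written compactly (a generic sketch, not the authors' code): the probability of a full ranking is a product of softmax choices over the alternatives still unranked, and a partial ranking simply stops the product early, which is one way of handling missing ranks. The utilities below are arbitrary.

        import numpy as np

        def rank_ordered_logit_loglik(utilities, ranking):
            """Log-likelihood of an observed (possibly partial) ranking of alternatives.

            utilities : array of latent utilities, one per alternative
            ranking   : indices of alternatives in observed preference order;
                        alternatives not listed are treated as unranked.
            """
            remaining = list(range(len(utilities)))
            loglik = 0.0
            for chosen in ranking:
                u = np.asarray([utilities[j] for j in remaining])
                # softmax probability that `chosen` is preferred among those remaining
                p = np.exp(utilities[chosen] - u.max()) / np.exp(u - u.max()).sum()
                loglik += np.log(p)
                remaining.remove(chosen)
            return loglik

        # Four hypothetical health states with latent utilities
        util = np.array([1.2, 0.3, -0.5, 0.0])
        print(rank_ordered_logit_loglik(util, ranking=[0, 3]))   # partial ranking: 0 > 3 > {1, 2}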

  12. MERGING DIGITAL SURFACE MODELS IMPLEMENTING BAYESIAN APPROACHES

    Directory of Open Access Journals (Sweden)

    H. Sadeq

    2016-06-01

    Full Text Available In this research, DSMs from different sources have been merged. The merging is based on a probabilistic model using a Bayesian approach. The implemented data have been sourced from very high resolution satellite imagery sensors (e.g. WorldView-1 and Pleiades). It is deemed preferable to use a Bayesian approach when the data obtained from the sensors are limited and it is difficult or very costly to obtain many measurements; the problem of the lack of data can then be alleviated by introducing a priori estimates. To infer the prior data, the roofs of the buildings are assumed to be smooth, and for that purpose local entropy has been implemented. In addition to the a priori estimates, GNSS RTK measurements have been collected in the field, which are used as check points to assess the quality of the DSMs and to validate the merging result. The model has been applied in the West End of Glasgow, which contains different kinds of buildings, such as flat-roofed and hipped-roofed buildings. Both quantitative and qualitative methods have been employed to validate the merged DSM. The validation results have shown that the model was able to improve the quality of the DSMs and some of their characteristics, such as the roof surfaces, which consequently led to better representations. In addition, the developed model has been compared with the well-established Maximum Likelihood model and showed similar quantitative statistical results and better qualitative results. Although the proposed model has been applied to DSMs that were derived from satellite imagery, it can be applied to any other sourced DSMs.
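
    The pixel-wise core of such a probabilistic merge can be sketched as precision-weighted averaging of the input DSM heights with a prior height (a simplified stand-in for the paper's model; the grids, variances and prior are synthetic):

        import numpy as np

        rng = np.random.default_rng(3)

        # Two synthetic 4x4 DSM height grids (metres) of the same area
        dsm_a = 50.0 + rng.normal(0.0, 0.8, size=(4, 4))    # e.g. from one sensor
        dsm_b = 50.0 + rng.normal(0.0, 0.5, size=(4, 4))    # e.g. from another sensor
        var_a, var_b = 0.8**2, 0.5**2                       # assumed height variances

        # Prior height standing in for a smoothness assumption (here: overall mean)
        prior_mean = 0.5 * (dsm_a + dsm_b).mean()
        prior_var = 4.0

        # Bayesian fusion: posterior mean is the precision-weighted combination
        w = np.array([1 / var_a, 1 / var_b, 1 / prior_var])
        merged = (dsm_a / var_a + dsm_b / var_b + prior_mean / prior_var) / w.sum()
        merged_var = 1.0 / w.sum()

        print("merged DSM (m):\n", np.round(merged, 2))
        print(f"per-pixel posterior variance = {merged_var:.3f} m^2")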

  13. Top-down feedback in an HMAX-like cortical model of object perception based on hierarchical Bayesian networks and belief propagation.

    Directory of Open Access Journals (Sweden)

    Salvador Dura-Bernal

    Full Text Available Hierarchical generative models, such as Bayesian networks, and belief propagation have been shown to provide a theoretical framework that can account for perceptual processes, including feedforward recognition and feedback modulation. The framework explains both psychophysical and physiological experimental data and maps well onto the hierarchical distributed cortical anatomy. However, the complexity required to model cortical processes makes inference, even using approximate methods, very computationally expensive. Thus, existing object perception models based on this approach are typically limited to tree-structured networks with no loops, use small toy examples or fail to account for certain perceptual aspects such as invariance to transformations or feedback reconstruction. In this study we develop a Bayesian network with an architecture similar to that of HMAX, a biologically-inspired hierarchical model of object recognition, and use loopy belief propagation to approximate the model operations (selectivity and invariance). Crucially, the resulting Bayesian network extends the functionality of HMAX by including top-down recursive feedback. Thus, the proposed model not only achieves successful feedforward recognition invariant to noise, occlusions, and changes in position and size, but is also able to reproduce modulatory effects such as illusory contour completion and attention. Our novel and rigorous methodology covers key aspects such as learning using a layerwise greedy algorithm, combining feedback information from multiple parents and reducing the number of operations required. Overall, this work extends an established model of object recognition to include high-level feedback modulation, based on state-of-the-art probabilistic approaches. The methodology employed, consistent with evidence from the visual cortex, can be potentially generalized to build models of hierarchical perceptual organization that include top-down and bottom

  14. Evaluation of image registration spatial accuracy using a Bayesian hierarchical model.

    Science.gov (United States)

    Liu, Suyu; Yuan, Ying; Castillo, Richard; Guerrero, Thomas; Johnson, Valen E

    2014-06-01

    To evaluate the utility of automated deformable image registration (DIR) algorithms, it is necessary to evaluate both the registration accuracy of the DIR algorithm itself, as well as the registration accuracy of the human readers from whom the "gold standard" is obtained. We propose a Bayesian hierarchical model to evaluate the spatial accuracy of human readers and automatic DIR methods based on multiple image registration data generated by human readers and automatic DIR methods. To fully account for the locations of landmarks in all images, we treat the true locations of landmarks as latent variables and impose a hierarchical structure on the magnitude of registration errors observed across image pairs. DIR registration errors are modeled using Gaussian processes with reference prior densities on prior parameters that determine the associated covariance matrices. We develop a Gibbs sampling algorithm to efficiently fit our models to high-dimensional data, and apply the proposed method to analyze an image dataset obtained from a 4D thoracic CT study. © 2014, The International Biometric Society.

  15. Process adjustment by a Bayesian approach

    Directory of Open Access Journals (Sweden)

    Daniel Duret

    2015-12-01

    Full Text Available In a production or measurement situation, operators are required to make corrections to a process using the measurement of a sample. In both cases, it is always difficult to suggest a correction from a deviation. The correction is the result of two different deviations: one in set-up and the second in production. The latter is considered as noise. The objective of this paper is to propose an original approach to calculate the best correction using a Bayesian approach. A correction formula is given under three assumptions regarding the adjustment distribution: uniform, triangular and normal. This paper gives a graphical interpretation of these different assumptions and a discussion of the results. Based on these results, the paper proposes a practical rule for calculating the most likely maladjustment in the case of a normal distribution. This practical rule gives the best adjustment using a simple relation (Adjustment = K * sample mean), where K depends on the sample size, the ratio between the maladjustment and the short-term variability, and a Type I risk of large maladjustment.
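
    Under the normal-distribution assumption, a shrinkage factor of this kind has a simple closed form consistent with the rule described (this is the standard normal-normal posterior-mean result, not necessarily the paper's exact derivation): if the maladjustment prior is N(0, sigma_m^2) and the sample mean has standard error sigma/sqrt(n), the most likely maladjustment is K times the sample mean with K = n r^2 / (1 + n r^2), where r = sigma_m / sigma. The sample values and ratio below are illustrative.

        import numpy as np

        def adjustment_factor(n, ratio):
            """Shrinkage factor K for a normal maladjustment prior.

            n     : sample size
            ratio : sigma_maladjustment / sigma_short_term
            (Standard normal-normal posterior-mean result; illustrative only.)
            """
            return n * ratio**2 / (1.0 + n * ratio**2)

        sample = np.array([0.12, 0.05, 0.09, 0.15, 0.08])   # deviations from target
        ratio = 2.0                                          # maladjustment twice the noise
        K = adjustment_factor(len(sample), ratio)
        correction = -K * sample.mean()                      # move the process back by K * mean
        print(f"K = {K:.3f}, suggested correction = {correction:+.3f}")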

  16. Abrupt strategy change underlies gradual performance change: Bayesian hierarchical models of component and aggregate strategy use.

    Science.gov (United States)

    Wynton, Sarah K A; Anglim, Jeromy

    2017-10-01

    While researchers have often sought to understand the learning curve in terms of multiple component processes, few studies have measured and mathematically modeled these processes on a complex task. In particular, there remains a need to reconcile how abrupt changes in strategy use can co-occur with gradual changes in task completion time. Thus, the current study aimed to assess the degree to which strategy change was abrupt or gradual, and whether strategy aggregation could partially explain gradual performance change. It also aimed to show how Bayesian methods could be used to model the effect of practice on strategy use. To achieve these aims, 162 participants completed 15 blocks of practice on a complex computer-based task, the Wynton-Anglim booking (WAB) task. The task allowed for multiple component strategies (i.e., memory retrieval, information reduction, and insight) that could also be aggregated to a global measure of strategy use. Bayesian hierarchical models were used to compare abrupt and gradual functions of component and aggregate strategy use. Task completion time was well-modeled by a power function, and global strategy use explained substantial variance in performance. Change in component strategy use tended to be abrupt, whereas change in global strategy use was gradual and well-modeled by a power function. Thus, differential timing of component strategy shifts leads to gradual changes in overall strategy efficiency, and this provides one reason for why smooth learning curves can co-occur with abrupt changes in strategy use. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
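
    The power function of practice named in the abstract can be illustrated with a simple least-squares fit (the study itself used Bayesian hierarchical models); the data below are invented and the parameter names are placeholders.

```python
import numpy as np
from scipy.optimize import curve_fit

def power_law(block, asymptote, scale, rate):
    """T(block) = asymptote + scale * block**(-rate): the power function of practice."""
    return asymptote + scale * block ** (-rate)

blocks = np.arange(1, 16)                       # 15 practice blocks, as in the study design
# Illustrative mean completion times in seconds; not the WAB data.
times = 60 + 120 * blocks ** (-0.7) + np.random.default_rng(1).normal(0, 3, blocks.size)

params, _ = curve_fit(power_law, blocks, times, p0=[50.0, 100.0, 0.5])
print("asymptote=%.1f  scale=%.1f  rate=%.2f" % tuple(params))
```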

  17. A BAYESIAN HIERARCHICAL SPATIAL MODEL FOR DENTAL CARIES ASSESSMENT USING NON-GAUSSIAN MARKOV RANDOM FIELDS.

    Science.gov (United States)

    Jin, Ick Hoon; Yuan, Ying; Bandyopadhyay, Dipankar

    2016-01-01

    Research in dental caries generates data with two levels of hierarchy: that of a tooth overall and that of the different surfaces of the tooth. The outcomes often exhibit spatial referencing among neighboring teeth and surfaces, i.e., the disease status of a tooth or surface might be influenced by the status of a set of proximal teeth/surfaces. Assessments of dental caries (tooth decay) at the tooth level yield binary outcomes indicating the presence/absence of teeth, and trinary outcomes at the surface level indicating healthy, decayed, or filled surfaces. The presence of these mixed discrete responses complicates the data analysis under a unified framework. To mitigate complications, we develop a Bayesian two-level hierarchical model under suitable (spatial) Markov random field assumptions that accommodates the natural hierarchy within the mixed responses. At the first level, we utilize an autologistic model to accommodate the spatial dependence for the tooth-level binary outcomes. For the second level and conditioned on a tooth being non-missing, we utilize a Potts model to accommodate the spatial referencing for the surface-level trinary outcomes. The regression models at both levels control for plausible covariates (risk factors) of caries and remain connected through shared parameters. To tackle the computational challenges in our Bayesian estimation scheme caused by the doubly-intractable normalizing constant, we employ a double Metropolis-Hastings sampler. We compare our model's performance to that of the standard non-spatial (naive) model using a small simulation study, and illustrate the approach via an application to a clinical dataset on dental caries.

  18. Parameterization of aquatic ecosystem functioning and its natural variation: Hierarchical Bayesian modelling of plankton food web dynamics

    Science.gov (United States)

    Norros, Veera; Laine, Marko; Lignell, Risto; Thingstad, Frede

    2017-10-01

    Methods for extracting empirically and theoretically sound parameter values are urgently needed in aquatic ecosystem modelling to describe key flows and their variation in the system. Here, we compare three Bayesian formulations for mechanistic model parameterization that differ in their assumptions about the variation in parameter values between various datasets: 1) global analysis - no variation, 2) separate analysis - independent variation and 3) hierarchical analysis - variation arising from a shared distribution defined by hyperparameters. We tested these methods, using computer-generated and empirical data, coupled with simplified and reasonably realistic plankton food web models, respectively. While all methods were adequate, the simulated example demonstrated that a well-designed hierarchical analysis can result in the most accurate and precise parameter estimates and predictions, due to its ability to combine information across datasets. However, our results also highlighted sensitivity to hyperparameter prior distributions as an important caveat of hierarchical analysis. In the more complex empirical example, hierarchical analysis was able to combine precise identification of parameter values with reasonably good predictive performance, although the ranking of the methods was less straightforward. We conclude that hierarchical Bayesian analysis is a promising tool for identifying key ecosystem-functioning parameters and their variation from empirical datasets.
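
    A toy sketch contrasting the three formulations for a single rate parameter observed in several datasets, using a conjugate normal setup with the group mean and between-dataset spread fixed for illustration; the real analyses embed such parameters in mechanistic plankton food web models, which are omitted here. All values are invented.

```python
import numpy as np

# Illustrative per-dataset estimates of a rate parameter with known noise sd.
y = np.array([0.82, 1.10, 0.95, 1.30])   # e.g. grazing-rate estimates from 4 datasets
sigma = 0.15                              # assumed observation noise sd

# 1) Global analysis: one shared value, no between-dataset variation.
global_est = y.mean()

# 2) Separate analysis: each dataset keeps its own independent estimate.
separate_est = y.copy()

# 3) Hierarchical analysis: dataset values drawn from a shared distribution
#    N(mu, tau^2); with mu set to the sample mean and tau fixed here for
#    illustration, each dataset's posterior mean shrinks toward the group mean.
tau = 0.2
w = tau**2 / (tau**2 + sigma**2)          # weight kept on the dataset's own estimate
hier_est = w * y + (1 - w) * y.mean()

print("global      :", round(global_est, 3))
print("separate    :", separate_est)
print("hierarchical:", np.round(hier_est, 3))
```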

  19. Epigenetic change detection and pattern recognition via Bayesian hierarchical hidden Markov models.

    Science.gov (United States)

    Wang, Xinlei; Zang, Miao; Xiao, Guanghua

    2013-06-15

    Epigenetics is the study of changes to the genome that can switch genes on or off and determine which proteins are transcribed without altering the DNA sequence. Recently, epigenetic changes have been linked to the development and progression of diseases such as psychiatric disorders. High-throughput epigenetic experiments have enabled researchers to measure genome-wide epigenetic profiles and yield data consisting of intensity ratios of immunoprecipitation versus reference samples. The intensity ratios can provide a view of genomic regions where protein binding occurs under one experimental condition and further allow us to detect epigenetic alterations through comparison between two different conditions. However, such experiments can be expensive, with only a few replicates available. Moreover, epigenetic data are often spatially correlated with high noise levels. In this paper, we develop a Bayesian hierarchical model, combined with a four-state hidden Markov process for modeling spatial dependence, to detect genomic sites with epigenetic changes from two-sample experiments with paired internal control. One attractive feature of the proposed method is that the four states of the hidden Markov process have well-defined biological meanings and allow us to directly call the change patterns based on the corresponding posterior probabilities. In contrast, none of the existing methods offers this advantage. In addition, the proposed method offers great power in statistical inference by spatial smoothing (via hidden Markov modeling) and information pooling (via hierarchical modeling). Both simulation studies and real data analysis in a cocaine addiction study illustrate the reliability and success of this method. Copyright © 2012 John Wiley & Sons, Ltd.
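
    A compact sketch of the core computation such a model relies on: posterior state probabilities from a four-state hidden Markov chain with Gaussian emissions on intensity log-ratios, obtained with the forward-backward algorithm. The state labels, means and transition matrix below are illustrative stand-ins, not the fitted values from the paper.

```python
import numpy as np
from scipy.stats import norm

def forward_backward(obs, means, sds, trans, init):
    """Posterior probability of each hidden state at each genomic probe."""
    T, K = len(obs), len(means)
    emit = norm.pdf(obs[:, None], loc=means, scale=sds)       # (T, K) emission densities

    alpha = np.zeros((T, K)); beta = np.zeros((T, K))
    alpha[0] = init * emit[0]; alpha[0] /= alpha[0].sum()
    for t in range(1, T):                                      # scaled forward pass
        alpha[t] = emit[t] * (alpha[t - 1] @ trans)
        alpha[t] /= alpha[t].sum()
    beta[-1] = 1.0
    for t in range(T - 2, -1, -1):                             # scaled backward pass
        beta[t] = trans @ (emit[t + 1] * beta[t + 1])
        beta[t] /= beta[t].sum()
    post = alpha * beta
    return post / post.sum(axis=1, keepdims=True)

# Four illustrative states: no change, change in condition 1 only, in condition 2 only, in both.
means = np.array([0.0, 1.0, -1.0, 2.0])            # mean log-ratio per state (assumed)
sds   = np.array([0.5, 0.5, 0.5, 0.5])
trans = np.full((4, 4), 0.05) + np.eye(4) * 0.80   # sticky transitions -> spatial smoothing
init  = np.full(4, 0.25)

obs = np.concatenate([np.random.default_rng(3).normal(0, 0.5, 30),
                      np.random.default_rng(4).normal(1, 0.5, 10)])
posterior = forward_backward(obs, means, sds, trans, init)
print("called states:", posterior.argmax(axis=1))
```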

  20. Bayesian hierarchical models for regional climate reconstructions of the last glacial maximum

    Science.gov (United States)

    Weitzel, Nils; Hense, Andreas; Ohlwein, Christian

    2017-04-01

    Spatio-temporal reconstructions of past climate are important for understanding the long-term behavior of the climate system and its sensitivity to forcing changes. Unfortunately, they are subject to large uncertainties, must deal with a complex proxy-climate relationship, and require a physically reasonable interpolation between sparse proxy observations, which is difficult. Bayesian Hierarchical Models (BHMs) are a class of statistical models that is well suited for spatio-temporal reconstructions of past climate because they permit the inclusion of multiple sources of information (e.g. records from different proxy types, uncertain age information, output from climate simulations) and quantify uncertainties in a statistically rigorous way. BHMs in paleoclimatology typically consist of three stages which are modeled individually and are combined using Bayesian inference techniques. The data stage models the proxy-climate relation (often called the transfer function), the process stage models the spatio-temporal distribution of the climate variables of interest, and the prior stage consists of prior distributions of the model parameters. For our BHMs, we translate well-known proxy-climate transfer functions for pollen to a Bayesian framework. In addition, we can include Gaussian distributed local climate information from preprocessed proxy records. The process stage combines physically reasonable spatial structures from prior distributions with proxy records, which leads to a multivariate posterior probability distribution for the reconstructed climate variables. The prior distributions that constrain the possible spatial structure of the climate variables are calculated from climate simulation output. We present results from pseudoproxy tests as well as new regional reconstructions of temperatures for the last glacial maximum (LGM, ˜ 21,000 years BP). These reconstructions combine proxy data syntheses with information from climate simulations for the LGM that were

  1. Tools for predicting rainfall from lightning records: events identification and rain prediction using a Bayesian hierarchical model

    OpenAIRE

    Di Giuseppe, Edmondo; Lasinio, Giovanna Jona; Pasqui, Massimiliano; Esposito, Stanislao

    2015-01-01

    We propose a new statistical protocol for the estimation of precipitation using lightning data. We first identify rainy events using a scan statistic, then we estimate the Rainfall Lightning Ratio (RLR) to convert lightning counts into rain volume given the storm intensity. We then build a hierarchical Bayesian model aimed at the prediction of 15- and 30-minute cumulated precipitation at unobserved locations and times using information on lightning in the same area. More specifically, we build a...

  2. Corridor-level signalized intersection safety analysis in Shanghai, China using Bayesian hierarchical models.

    Science.gov (United States)

    Xie, Kun; Wang, Xuesong; Huang, Helai; Chen, Xiaohong

    2013-01-01

    Most traffic crashes in Chinese cities occur at signalized intersections. Research on the intersection safety problem in China is still in its early stage. The recent development of an advanced traffic information system in Shanghai enables in-depth intersection safety analyses using road design, traffic operation, and crash data. In Shanghai, the road network density is relatively high and the distance between signalized intersections is small, averaging about 200m. Adjacent signalized intersections located along the same corridor share similar traffic flows, and signals are usually coordinated. Therefore, when studying intersection safety in Shanghai, it is essential to account for intersection correlations within corridors. In this study, data for 195 signalized intersections along 22 corridors in the urban areas of Shanghai were collected. Mean speeds and speed variances of corridors were acquired from taxis equipped with Global Positioning Systems (GPS). Bayesian hierarchical models were applied to identify crash risk factors at both the intersection and the corridor levels. Results showed that intersections along corridors with lower mean speeds were associated with fewer crashes than those with higher speeds, and those intersections along two-way roads, under elevated roads, and in close proximity to each other, tended to have higher crash frequencies. Copyright © 2012 Elsevier Ltd. All rights reserved.
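
    A minimal sketch of the two-level structure described above: intersection crash counts modeled as Poisson with a log link, intersection-level covariates, and a corridor-level random intercept, written as an unnormalized log-posterior that could be passed to a generic MCMC sampler. All variable names, priors and toy numbers are assumptions for illustration.

```python
import numpy as np
from scipy.stats import norm, poisson

def log_posterior(params, crashes, X, corridor_idx):
    """Unnormalized log-posterior of a hierarchical Poisson crash-frequency model.

    params = [beta_0, ..., beta_{p-1}, log_sigma_u, u_1, ..., u_C]
    crashes      : crash count per intersection
    X            : intersection-level covariates (e.g. intercept, corridor mean speed)
    corridor_idx : corridor index of each intersection
    """
    p = X.shape[1]
    beta = params[:p]
    log_sigma_u = params[p]
    u = params[p + 1:]

    log_mu = X @ beta + u[corridor_idx]                     # log expected crash frequency
    lp = poisson.logpmf(crashes, np.exp(log_mu)).sum()      # Poisson likelihood
    lp += norm.logpdf(u, 0.0, np.exp(log_sigma_u)).sum()    # corridor random intercepts
    lp += norm.logpdf(beta, 0.0, 10.0).sum()                # vague priors on coefficients
    lp += norm.logpdf(log_sigma_u, 0.0, 1.0)                # prior on log sd of intercepts
    return lp

# Toy evaluation with 4 intersections on 2 corridors.
X = np.column_stack([np.ones(4), [40.0, 42.0, 55.0, 60.0]])   # intercept + mean speed (km/h)
crashes = np.array([3, 2, 7, 9])
corridor_idx = np.array([0, 0, 1, 1])
theta = np.concatenate([[0.5, 0.02], [np.log(0.3)], [0.0, 0.1]])
print(log_posterior(theta, crashes, X, corridor_idx))
```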

  3. Hierarchical Bayesian model averaging for hydrostratigraphic modeling: Uncertainty segregation and comparative evaluation

    Science.gov (United States)

    Tsai, Frank T.-C.; Elshall, Ahmed S.

    2013-09-01

    Analysts are often faced with competing propositions for each uncertain model component. How can we judge whether we have selected the correct proposition(s) for an uncertain model component from among numerous possibilities? We introduce the hierarchical Bayesian model averaging (HBMA) method as a multimodel framework for uncertainty analysis. The HBMA allows for segregating, prioritizing, and evaluating different sources of uncertainty and their corresponding competing propositions through a hierarchy of BMA models that forms a BMA tree. We apply the HBMA to conduct uncertainty analysis on the reconstructed hydrostratigraphic architectures of the Baton Rouge aquifer-fault system, Louisiana. Due to uncertainty in model data, structure, and parameters, multiple possible hydrostratigraphic models are produced and calibrated as base models. The study considers four sources of uncertainty. With respect to data uncertainty, the study considers two calibration data sets. With respect to model structure, the study considers three different variogram models, two geological stationarity assumptions, and two fault conceptualizations. The base models are produced following a combinatorial design to allow for uncertainty segregation. Thus, these four uncertain model components with their corresponding competing model propositions result in 24 base models. The results show that the systematic dissection of the uncertain model components along with their corresponding competing propositions allows for detecting the robust model propositions and the major sources of uncertainty.
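
    A sketch of how weights propagate through a BMA tree: each base model's weight is the product of the conditional weights of the propositions it is built from, and summing weights over one component integrates out that source of uncertainty. The components and weights below are invented, and the example is reduced to three components for brevity (the study used four, giving 24 base models).

```python
import itertools
import numpy as np

# Competing propositions for three uncertain model components (illustrative).
components = {
    "calibration_data": ["set_A", "set_B"],
    "variogram":        ["exponential", "spherical", "gaussian"],
    "fault_model":      ["conduit", "barrier"],
}
# Assumed conditional posterior weights of each proposition within its component
# (in HBMA these would be computed from the likelihoods of the calibrated models).
weights = {
    "calibration_data": {"set_A": 0.6, "set_B": 0.4},
    "variogram":        {"exponential": 0.5, "spherical": 0.3, "gaussian": 0.2},
    "fault_model":      {"conduit": 0.7, "barrier": 0.3},
}

# Base models follow the combinatorial design: one proposition per component,
# and each base model's BMA weight is the product of the component-level weights.
base_models = {}
for combo in itertools.product(*components.values()):
    w = np.prod([weights[c][p] for c, p in zip(components, combo)])
    base_models[combo] = w

assert abs(sum(base_models.values()) - 1.0) < 1e-12
# Integrating out everything except the fault conceptualization:
for fault in components["fault_model"]:
    w = sum(w for combo, w in base_models.items() if combo[2] == fault)
    print(f"P(fault_model = {fault}) = {w:.2f}")
```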

  4. Disease mapping of Leishmaniasis outbreak in Afghanistan: spatial hierarchical Bayesian analysis

    Directory of Open Access Journals (Sweden)

    Oyelola A. Adegboye

    2012-08-01

    Full Text Available Objective: To analyze the spatial pattern of Leishmaniasis disease in Afghanistan, using provincial level geo-referenced data. The disease is contracted through bites from sand flies and is the third most common vector-borne disease. Leishmaniasis is a serious health concern in Afghanistan with about 250,000 estimated new cases of cutaneous infection nationwide and 67,000 cases in Kabul. This makes Kabul the city with the largest incidence of the disease worldwide. Methods: We use a Bayesian hierarchical Poisson model to estimate the influence of hypothesized risk factors on the relative risk of the disease. We use random components to take into account the lack of independence of the risk between adjacent areas. Results: Statistical inference is carried out using Markov Chain Monte Carlo simulation. The final model specification includes altitude, two random components (intercept and slope) and utilizes a conditional autoregressive prior with a deviance information criterion of 247.761. Spatial scan statistics confirm disease clusters in the North-Eastern and South-Eastern regions of Afghanistan with a p-value of less than 0.0001. Conclusions: The study confirms disease clusters in the North-Eastern and South-Eastern regions of Afghanistan. Our findings are robust with respect to the specification of the prior distribution and give important insights into the spatial dynamics of Leishmaniasis in Afghanistan.

  5. Estimating the Term Structure With a Semiparametric Bayesian Hierarchical Model: An Application to Corporate Bonds

    Science.gov (United States)

    Cruz-Marcelo, Alejandro; Ensor, Katherine B.; Rosner, Gary L.

    2011-01-01

    The term structure of interest rates is used to price defaultable bonds and credit derivatives, as well as to infer the quality of bonds for risk management purposes. We introduce a model that jointly estimates term structures by means of a Bayesian hierarchical model with a prior probability model based on Dirichlet process mixtures. The modeling methodology borrows strength across term structures for purposes of estimation. The main advantage of our framework is its ability to produce reliable estimators at the company level even when there are only a few bonds per company. After describing the proposed model, we discuss an empirical application in which the term structure of 197 individual companies is estimated. The sample of 197 consists of 143 companies with only one or two bonds. In-sample and out-of-sample tests are used to quantify the improvement in accuracy that results from approximating the term structure of corporate bonds with estimators by company rather than by credit rating, the latter being a popular choice in the financial literature. A complete description of a Markov chain Monte Carlo (MCMC) scheme for the proposed model is available as Supplementary Material. PMID:21765566

  6. Mapping brucellosis increases relative to elk density using hierarchical Bayesian models

    Science.gov (United States)

    Cross, Paul C.; Heisey, Dennis M.; Scurlock, Brandon M.; Edwards, William H.; Brennan, Angela; Ebinger, Michael R.

    2010-01-01

    The relationship between host density and parasite transmission is central to the effectiveness of many disease management strategies. Few studies, however, have empirically estimated this relationship particularly in large mammals. We applied hierarchical Bayesian methods to a 19-year dataset of over 6400 brucellosis tests of adult female elk (Cervus elaphus) in northwestern Wyoming. Management captures that occurred from January to March were over two times more likely to be seropositive than hunted elk that were killed in September to December, while accounting for site and year effects. Areas with supplemental feeding grounds for elk had higher seroprevalence in 1991 than other regions, but by 2009 many areas distant from the feeding grounds were of comparable seroprevalence. The increases in brucellosis seroprevalence were correlated with elk densities at the elk management unit, or hunt area, scale (mean 2070 km2; range = [95–10237]). The data, however, could not differentiate among linear and non-linear effects of host density. Therefore, control efforts that focus on reducing elk densities at a broad spatial scale were only weakly supported. Additional research on how a few, large groups within a region may be driving disease dynamics is needed for more targeted and effective management interventions. Brucellosis appears to be expanding its range into new regions and elk populations, which is likely to further complicate the United States brucellosis eradication program. This study is an example of how the dynamics of host populations can affect their ability to serve as disease reservoirs.

  7. Mapping brucellosis increases relative to elk density using hierarchical Bayesian models.

    Directory of Open Access Journals (Sweden)

    Paul C Cross

    Full Text Available The relationship between host density and parasite transmission is central to the effectiveness of many disease management strategies. Few studies, however, have empirically estimated this relationship particularly in large mammals. We applied hierarchical Bayesian methods to a 19-year dataset of over 6400 brucellosis tests of adult female elk (Cervus elaphus) in northwestern Wyoming. Management captures that occurred from January to March were over two times more likely to be seropositive than hunted elk that were killed in September to December, while accounting for site and year effects. Areas with supplemental feeding grounds for elk had higher seroprevalence in 1991 than other regions, but by 2009 many areas distant from the feeding grounds were of comparable seroprevalence. The increases in brucellosis seroprevalence were correlated with elk densities at the elk management unit, or hunt area, scale (mean 2070 km²; range = [95-10237]). The data, however, could not differentiate among linear and non-linear effects of host density. Therefore, control efforts that focus on reducing elk densities at a broad spatial scale were only weakly supported. Additional research on how a few, large groups within a region may be driving disease dynamics is needed for more targeted and effective management interventions. Brucellosis appears to be expanding its range into new regions and elk populations, which is likely to further complicate the United States brucellosis eradication program. This study is an example of how the dynamics of host populations can affect their ability to serve as disease reservoirs.

  8. Customer Behavior in Electronic Commerce: A Bayesian Approach

    National Research Council Canada - National Science Library

    Silvana Dakduk; Enrique ter Horst; Zuleyma Santalla; German Molina; José Malavé

    2017-01-01

    .... The main objective of this study is to integrate the theory of planned behavior, the theory of reasoned action, and the technology acceptance model using a Bayesian approach to determine the key...

  9. Regularization of non-homogeneous dynamic Bayesian networks with global information-coupling based on hierarchical Bayesian models

    NARCIS (Netherlands)

    Grzegorczyk, Marco; Husmeier, Dirk

    To relax the homogeneity assumption of classical dynamic Bayesian networks (DBNs), various recent studies have combined DBNs with multiple changepoint processes. The underlying assumption is that the parameters associated with time series segments delimited by multiple changepoints are a priori

  10. COBRA: a Bayesian approach to pulsar searching

    Science.gov (United States)

    Lentati, L.; Champion, D. J.; Kramer, M.; Barr, E.; Torne, P.

    2018-02-01

    We introduce COBRA, a GPU-accelerated Bayesian analysis package for performing pulsar searching, that uses candidates from traditional search techniques to set the prior used for the periodicity of the source, and performs a blind search in all remaining parameters. COBRA incorporates models for both isolated and accelerated systems, as well as both Keplerian and relativistic binaries, and exploits pulse phase information to combine search epochs coherently, over time, frequency or across multiple telescopes. We demonstrate the efficacy of our approach in a series of simulations that challenge typical search techniques, including highly aliased signals, and relativistic binary systems. In the most extreme case, we simulate an 8 h observation containing 24 orbits of a pulsar in a binary with a 30 M⊙ companion. Even in this scenario we show that we can build up from an initial low-significance candidate, to fully recovering the signal. We also apply the method to survey data of three pulsars from the globular cluster 47Tuc: PSRs J0024-7204D, J0023-7203J and J0024-7204R. This final pulsar is in a 1.6 h binary, the shortest of any pulsar in 47Tuc, and additionally shows significant scintillation. By allowing the amplitude of the source to vary as a function of time, however, we show that we are able to obtain optimal combinations of such noisy data. We also demonstrate the ability of COBRA to perform high-precision pulsar timing directly on the single pulse survey data, and obtain a 95 per cent upper limit on the eccentricity of PSR J0024-7204R of εb < 0.0007.

  11. A Bayesian approach to particle identification in ALICE

    CERN Multimedia

    CERN. Geneva

    2016-01-01

    Among the LHC experiments, ALICE has unique particle identification (PID) capabilities exploiting different types of detectors. During Run 1, a Bayesian approach to PID was developed and intensively tested. It facilitates the combination of information from different sub-systems. The adopted methodology and formalism as well as the performance of the Bayesian PID approach for charged pions, kaons and protons in the central barrel of ALICE will be reviewed. Results are presented with PID performed via measurements of specific energy loss (dE/dx) and time-of-flight using information from the TPC and TOF detectors, respectively. Methods to extract priors from data and to compare PID efficiencies and misidentification probabilities in data and Monte Carlo using high-purity samples of identified particles will be presented. Bayesian PID results were found consistent with previous measurements published by ALICE. The Bayesian PID approach gives a higher signal-to-background ratio and a similar or larger statist...

  12. Identification of transmissivity fields using a Bayesian strategy and perturbative approach

    Science.gov (United States)

    Zanini, Andrea; Tanda, Maria Giovanna; Woodbury, Allan D.

    2017-10-01

    The paper deals with the crucial problem of groundwater parameter estimation, which is the basis for efficient modeling and reclamation activities. A hierarchical Bayesian approach is developed: it uses Akaike's Bayesian Information Criterion to estimate the hyperparameters (related to the chosen covariance model) and to quantify the unknown noise variance. The transmissivity identification proceeds in two steps: the first, called empirical Bayesian interpolation, uses Y* (Y = lnT) observations to interpolate Y values on a specified grid; the second, called empirical Bayesian update, improves the previous Y estimate through the addition of hydraulic head observations. The relationship between the head and lnT has been linearized through a perturbative solution of the flow equation. In order to test the proposed approach, synthetic aquifers from the literature were considered. The aquifers in question contain a variety of boundary conditions (both Dirichlet and Neumann type) and scales of heterogeneity (σY² = 1.0 and σY² = 5.3). The estimated transmissivity fields were compared to the true ones. Even if the variance of the strongly heterogeneous transmissivity field can be considered high for the application of the perturbative approach, the results show the same order of approximation as the non-linear methods proposed in the literature. The procedure allows computation of the posterior probability distribution of the target quantities and quantification of the uncertainty in the model prediction. Bayesian updating has advantages related to both Monte-Carlo (MC) and non-MC approaches: like MC methods, it computes the posterior probability distribution of the target quantities directly, and like non-MC methods it has computational times on the order of seconds.
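
    A sketch of the first step (empirical Bayesian interpolation), viewed as Gaussian conditioning of lnT on its observations under an assumed covariance model; the second step, which updates the field with head data through the linearized flow equation, is not shown. The covariance parameters and observations are invented.

```python
import numpy as np

def exp_cov(x1, x2, variance, length):
    """Exponential covariance between two sets of 1-D locations (assumed model)."""
    d = np.abs(x1[:, None] - x2[None, :])
    return variance * np.exp(-d / length)

# Invented lnT observations along a 1-D transect (locations in m, values in ln m^2/s).
x_obs = np.array([100.0, 400.0, 650.0, 900.0])
y_obs = np.array([-4.2, -3.1, -3.8, -2.9])
x_grid = np.linspace(0.0, 1000.0, 21)

mu, var, length, nugget = -3.5, 1.0, 300.0, 0.05   # assumed hyperparameters

K_oo = exp_cov(x_obs, x_obs, var, length) + nugget * np.eye(len(x_obs))
K_go = exp_cov(x_grid, x_obs, var, length)
K_gg = exp_cov(x_grid, x_grid, var, length)

# Conditional (posterior) mean and covariance of lnT on the grid.
post_mean = mu + K_go @ np.linalg.solve(K_oo, y_obs - mu)
post_cov = K_gg - K_go @ np.linalg.solve(K_oo, K_go.T)
print(np.round(post_mean, 2))
print(np.round(np.sqrt(np.diag(post_cov)), 2))
```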

  13. Daniel Goodman’s empirical approach to Bayesian statistics

    Science.gov (United States)

    Gerrodette, Tim; Ward, Eric; Taylor, Rebecca L.; Schwarz, Lisa K.; Eguchi, Tomoharu; Wade, Paul; Himes Boor, Gina

    2016-01-01

    Bayesian statistics, in contrast to classical statistics, uses probability to represent uncertainty about the state of knowledge. Bayesian statistics has often been associated with the idea that knowledge is subjective and that a probability distribution represents a personal degree of belief. Dr. Daniel Goodman considered this viewpoint problematic for issues of public policy. He sought to ground his Bayesian approach in data, and advocated the construction of a prior as an empirical histogram of “similar” cases. In this way, the posterior distribution that results from a Bayesian analysis combined comparable previous data with case-specific current data, using Bayes’ formula. Goodman championed such a data-based approach, but he acknowledged that it was difficult in practice. If based on a true representation of our knowledge and uncertainty, Goodman argued that risk assessment and decision-making could be an exact science, despite the uncertainties. In his view, Bayesian statistics is a critical component of this science because a Bayesian analysis produces the probabilities of future outcomes. Indeed, Goodman maintained that the Bayesian machinery, following the rules of conditional probability, offered the best legitimate inference from available data. We give an example of an informative prior in a recent study of Steller sea lion spatial use patterns in Alaska.

  14. msBayes: Pipeline for testing comparative phylogeographic histories using hierarchical approximate Bayesian computation

    Directory of Open Access Journals (Sweden)

    Takebayashi Naoki

    2007-07-01

    Full Text Available Abstract Background Although testing for simultaneous divergence (vicariance) across different population-pairs that span the same barrier to gene flow is of central importance to evolutionary biology, researchers often equate the gene tree and population/species tree thereby ignoring stochastic coalescent variance in their conclusions of temporal incongruence. In contrast to other available phylogeographic software packages, msBayes is the only one that analyses data from multiple species/population pairs under a hierarchical model. Results msBayes employs approximate Bayesian computation (ABC) under a hierarchical coalescent model to test for simultaneous divergence (TSD) in multiple co-distributed population-pairs. Simultaneous isolation is tested by estimating three hyper-parameters that characterize the degree of variability in divergence times across co-distributed population pairs while allowing for variation in various within population-pair demographic parameters (sub-parameters) that can affect the coalescent. msBayes is a software package consisting of several C and R programs that are run with a Perl "front-end". Conclusion The method reasonably distinguishes simultaneous isolation from temporal incongruence in the divergence of co-distributed population pairs, even with sparse sampling of individuals. Because the estimation step is decoupled from the simulation step, one can rapidly evaluate different ABC acceptance/rejection conditions and the choice of summary statistics. Given the complex and idiosyncratic nature of testing multi-species biogeographic hypotheses, we envision msBayes as a powerful and flexible tool for tackling a wide array of difficult research questions that use population genetic data from multiple co-distributed species. The msBayes pipeline is available for download at http://msbayes.sourceforge.net/ under an open source license (GNU Public License). The msBayes pipeline is comprised of several C and R programs that

  15. Tanzania: A Hierarchical Cluster Analysis Approach | Ngaruko ...

    African Journals Online (AJOL)

    Using survey data from Kibondo district, west Tanzania, we use hierarchical cluster analysis to classify borrower farmers according to their borrowing behaviour into four distinctive clusters. The appreciation of the existence of heterogeneous farmer clusters is vital in forging credit delivery policies that are not only ...

  16. Bayesian approach to magnetotelluric tensor decomposition

    Directory of Open Access Journals (Sweden)

    Michel Menvielle

    2010-05-01

    Magnetotelluric directional analysis and impedance tensor decomposition are basic tools to validate a local/regional composite electrical model of the underlying structure. Bayesian stochastic methods approach the problem of the parameter estimation and their uncertainty characterization in a fully probabilistic fashion, through the use of posterior model probabilities. We use the standard Groom-Bailey 3-D local/2-D regional composite model in our Bayesian approach. We assume that the experimental impedance estimates are contaminated with Gaussian noise and define the likelihood of a particular composite model with respect to the observed data. We use non-informative, flat priors over physically reasonable intervals for the standard Groom-Bailey decomposition parameters. We apply two numerical methods, the Markov chain Monte Carlo procedure based on the Gibbs sampler and a single-component adaptive Metropolis algorithm. From the posterior samples, we characterize the estimates and uncertainties of the individual decomposition parameters by using the respective marginal posterior probabilities. We conclude that the stochastic scheme performs reliably for a variety of models, including the multisite and multifrequency case with up to

  17. Personalized Audio Systems - a Bayesian Approach

    DEFF Research Database (Denmark)

    Nielsen, Jens Brehm; Jensen, Bjørn Sand; Hansen, Toke Jansen

    2013-01-01

    ...the present paper presents a general interactive framework for personalization of such audio systems. The framework builds on Bayesian Gaussian process regression, in which a model of the user's objective function is updated sequentially. The parameter setting to be evaluated in a given trial is selected... ...are optimized using the proposed framework. Twelve test subjects obtain a personalized setting with the framework, and these settings are significantly preferred to those obtained with random experimentation...

  18. A hierarchical Bayesian model for understanding the spatiotemporal dynamics of the intestinal epithelium.

    Science.gov (United States)

    Maclaren, Oliver J; Parker, Aimée; Pin, Carmen; Carding, Simon R; Watson, Alastair J M; Fletcher, Alexander G; Byrne, Helen M; Maini, Philip K

    2017-07-01

    Our work addresses two key challenges, one biological and one methodological. First, we aim to understand how proliferation and cell migration rates in the intestinal epithelium are related under healthy, damaged (Ara-C treated) and recovering conditions, and how these relations can be used to identify mechanisms of repair and regeneration. We analyse new data, presented in more detail in a companion paper, in which BrdU/IdU cell-labelling experiments were performed under these respective conditions. Second, in considering how to more rigorously process these data and interpret them using mathematical models, we use a probabilistic, hierarchical approach. This provides a best-practice approach for systematically modelling and understanding the uncertainties that can otherwise undermine the generation of reliable conclusions: uncertainties in experimental measurement and treatment, difficult-to-compare mathematical models of underlying mechanisms, and unknown or unobserved parameters. Both spatially discrete and continuous mechanistic models are considered and related via hierarchical conditional probability assumptions. We perform model checks on both in-sample and out-of-sample datasets and use them to show how to test possible model improvements and assess the robustness of our conclusions. We conclude, for the present set of experiments, that a primarily proliferation-driven model suffices to predict labelled cell dynamics over most time-scales.

  19. A hierarchical Bayesian model for understanding the spatiotemporal dynamics of the intestinal epithelium.

    Directory of Open Access Journals (Sweden)

    Oliver J Maclaren

    2017-07-01

    Full Text Available Our work addresses two key challenges, one biological and one methodological. First, we aim to understand how proliferation and cell migration rates in the intestinal epithelium are related under healthy, damaged (Ara-C treated) and recovering conditions, and how these relations can be used to identify mechanisms of repair and regeneration. We analyse new data, presented in more detail in a companion paper, in which BrdU/IdU cell-labelling experiments were performed under these respective conditions. Second, in considering how to more rigorously process these data and interpret them using mathematical models, we use a probabilistic, hierarchical approach. This provides a best-practice approach for systematically modelling and understanding the uncertainties that can otherwise undermine the generation of reliable conclusions: uncertainties in experimental measurement and treatment, difficult-to-compare mathematical models of underlying mechanisms, and unknown or unobserved parameters. Both spatially discrete and continuous mechanistic models are considered and related via hierarchical conditional probability assumptions. We perform model checks on both in-sample and out-of-sample datasets and use them to show how to test possible model improvements and assess the robustness of our conclusions. We conclude, for the present set of experiments, that a primarily proliferation-driven model suffices to predict labelled cell dynamics over most time-scales.

  20. Model-Based Assessment of Alternative Study Designs in Pediatric Trials. Part II: Bayesian Approaches.

    Science.gov (United States)

    Smania, G; Baiardi, P; Ceci, A; Cella, M; Magni, P

    2016-08-01

    This study presents a pharmacokinetic-pharmacodynamic based clinical trial simulation framework for evaluating the performance of a fixed-sample Bayesian design (BD) and two alternative Bayesian sequential designs (BSDs) (i.e., a non-hierarchical (NON-H) and a semi-hierarchical (SEMI-H) one). Prior information was elicited from adult trials and weighted based on the expected similarity of response to treatment between the pediatric and adult populations. Study designs were evaluated in terms of: type I and II errors, sample size per arm (SS), trial duration (TD), and estimate precision. No substantial differences were observed between NON-H and SEMI-H. BSDs require, on average, smaller SS and TD compared to the BD, which, on the other hand, guarantees higher estimate precision. When large differences between children and adults are expected, BSDs can return very large SS. Bayesian approaches appear to outperform their frequentist counterparts in the design of pediatric trials even when little weight is given to prior information from adults. © 2016 The Authors CPT: Pharmacometrics & Systems Pharmacology published by Wiley Periodicals, Inc. on behalf of American Society for Clinical Pharmacology and Therapeutics.

  1. Testing Adaptive Toolbox Models: A Bayesian Hierarchical Approach

    Science.gov (United States)

    Scheibehenne, Benjamin; Rieskamp, Jorg; Wagenmakers, Eric-Jan

    2013-01-01

    Many theories of human cognition postulate that people are equipped with a repertoire of strategies to solve the tasks they face. This theoretical framework of a cognitive toolbox provides a plausible account of intra- and interindividual differences in human behavior. Unfortunately, it is often unclear how to rigorously test the toolbox…

  2. Quantifying inter- and intra-population niche variability using hierarchical bayesian stable isotope mixing models.

    Science.gov (United States)

    Semmens, Brice X; Ward, Eric J; Moore, Jonathan W; Darimont, Chris T

    2009-07-09

    Variability in resource use defines the width of a trophic niche occupied by a population. Intra-population variability in resource use may occur across hierarchical levels of population structure from individuals to subpopulations. Understanding how levels of population organization contribute to population niche width is critical to ecology and evolution. Here we describe a hierarchical stable isotope mixing model that can simultaneously estimate both the prey composition of a consumer diet and the diet variability among individuals and across levels of population organization. By explicitly estimating variance components for multiple scales, the model can deconstruct the niche width of a consumer population into relevant levels of population structure. We apply this new approach to stable isotope data from a population of gray wolves from coastal British Columbia, and show support for extensive intra-population niche variability among individuals, social groups, and geographically isolated subpopulations. The analytic method we describe improves mixing models by accounting for diet variability, and improves isotope niche width analysis by quantitatively assessing the contribution of levels of organization to the niche width of a population.
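
    A simplified, non-hierarchical sketch of the underlying mixing model: consumer isotope signatures are a proportion-weighted mixture of source signatures plus residual noise, and diet proportions on the simplex can be estimated by importance sampling under a flat Dirichlet prior. Source values, consumer values and the residual standard deviation are invented.

```python
import numpy as np
from scipy.stats import norm, dirichlet

# Assumed source signatures (d13C, d15N) for three prey types.
source_mean = np.array([[-21.0, 16.0],   # e.g. salmon
                        [-24.5,  8.0],   # deer
                        [-18.0, 12.0]])  # marine invertebrates
resid_sd = 0.8                            # assumed residual sd per tracer

# Observed consumer signatures (one group of wolves, invented values).
consumer = np.array([[-22.0, 13.5], [-21.6, 14.2], [-22.4, 12.9]])

# Importance sampling over the diet-proportion simplex with a flat Dirichlet prior.
p_draws = dirichlet.rvs(np.ones(3), size=20000, random_state=7)
mix_mean = p_draws @ source_mean                         # predicted consumer signatures
log_w = norm.logpdf(consumer[None, :, :], mix_mean[:, None, :], resid_sd).sum(axis=(1, 2))
w = np.exp(log_w - log_w.max())
w /= w.sum()
print("posterior mean diet proportions:", np.round(w @ p_draws, 2))
```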

  3. Quantifying inter- and intra-population niche variability using hierarchical bayesian stable isotope mixing models.

    Directory of Open Access Journals (Sweden)

    Brice X Semmens

    Full Text Available Variability in resource use defines the width of a trophic niche occupied by a population. Intra-population variability in resource use may occur across hierarchical levels of population structure from individuals to subpopulations. Understanding how levels of population organization contribute to population niche width is critical to ecology and evolution. Here we describe a hierarchical stable isotope mixing model that can simultaneously estimate both the prey composition of a consumer diet and the diet variability among individuals and across levels of population organization. By explicitly estimating variance components for multiple scales, the model can deconstruct the niche width of a consumer population into relevant levels of population structure. We apply this new approach to stable isotope data from a population of gray wolves from coastal British Columbia, and show support for extensive intra-population niche variability among individuals, social groups, and geographically isolated subpopulations. The analytic method we describe improves mixing models by accounting for diet variability, and improves isotope niche width analysis by quantitatively assessing the contribution of levels of organization to the niche width of a population.

  4. Bayesian approach to inverse statistical mechanics

    Science.gov (United States)

    Habeck, Michael

    2014-05-01

    Inverse statistical mechanics aims to determine particle interactions from ensemble properties. This article looks at this inverse problem from a Bayesian perspective and discusses several statistical estimators to solve it. In addition, a sequential Monte Carlo algorithm is proposed that draws the interaction parameters from their posterior probability distribution. The posterior probability involves an intractable partition function that is estimated along with the interactions. The method is illustrated for inverse problems of varying complexity, including the estimation of a temperature, the inverse Ising problem, maximum entropy fitting, and the reconstruction of molecular interaction potentials.

  5. A Bayesian Networks approach to Operational Risk

    Science.gov (United States)

    Aquaro, V.; Bardoscia, M.; Bellotti, R.; Consiglio, A.; De Carlo, F.; Ferri, G.

    2010-04-01

    A system for Operational Risk management based on the computational paradigm of Bayesian Networks is presented. The algorithm allows the construction of a Bayesian Network targeted for each bank and takes into account in a simple and realistic way the correlations among different processes of the bank. The internal losses are averaged over a variable time horizon, so that the correlations at different times are removed, while the correlations at the same time are kept: the averaged losses are thus suitable for learning the network topology and parameters; since the main aim is to understand the role of the correlations among the losses, the assessments of domain experts are not used. The algorithm has been validated on synthetic time series. It should be stressed that the proposed algorithm has been designed for practical implementation in a mid- or small-sized bank, since it has a small impact on the organizational structure of a bank and requires an investment in human resources which is limited to the computational area.

  6. A non-parametric Bayesian approach to spike sorting.

    Science.gov (United States)

    Wood, Frank; Goldwater, Sharon; Black, Michael J

    2006-01-01

    In this work we present and apply infinite Gaussian mixture modeling, a non-parametric Bayesian method, to the problem of spike sorting. As this approach is Bayesian, it allows us to integrate prior knowledge about the problem in a principled way. Because it is non-parametric we are able to avoid model selection, a difficult problem that most current spike sorting methods do not address. We compare this approach to using penalized log likelihood to select the best from multiple finite mixture models trained by expectation maximization. We show favorable offline sorting results on real data and discuss ways to extend our model to online applications.

  7. Estimation of Coast-Wide Population Trends of Marbled Murrelets in Canada Using a Bayesian Hierarchical Model.

    Directory of Open Access Journals (Sweden)

    Douglas F Bertram

    Full Text Available Species at risk with secretive breeding behaviours, low densities, and wide geographic range pose a significant challenge to conservation actions because population trends are difficult to detect. Such is the case with the Marbled Murrelet (Brachyramphus marmoratus), a seabird listed as 'Threatened' by the Species at Risk Act in Canada largely due to the loss of its old growth forest nesting habitat. We report the first estimates of population trend of Marbled Murrelets in Canada derived from a monitoring program that uses marine radar to detect birds as they enter forest watersheds during 923 dawn surveys at 58 radar monitoring stations within the six Marbled Murrelet Conservation Regions on coastal British Columbia, Canada, 1996-2013. Temporal trends in radar counts were analyzed with a hierarchical Bayesian multivariate modeling approach that controlled for variation in tilt of the radar unit and day of year, included year-specific deviations from the overall trend ('year effects'), and allowed for trends to be estimated at three spatial scales. A negative overall trend of -1.6%/yr (95% credibility interval: -3.2%, 0.01%) indicated moderate evidence for a coast-wide decline, although trends varied strongly among the six conservation regions. Negative annual trends were detected in East Vancouver Island (-9%/yr) and South Mainland Coast (-3%/yr) Conservation Regions. Over a quarter of the year effects were significantly different from zero, and the estimated standard deviation in common-shared year effects between sites within each region was about 50% per year. This large common-shared interannual variation in counts may have been caused by regional movements of birds related to changes in marine conditions that affect the availability of prey.

  8. Global Trends and Factors Associated with the Illegal Killing of Elephants: A Hierarchical Bayesian Analysis of Carcass Encounter Data

    Science.gov (United States)

    Burn, Robert W.; Underwood, Fiona M.; Blanc, Julian

    2011-01-01

    Elephant poaching and the ivory trade remain high on the agenda at meetings of the Convention on International Trade in Endangered Species of Wild Fauna and Flora (CITES). Well-informed debates require robust estimates of trends, the spatial distribution of poaching, and drivers of poaching. We present an analysis of trends and drivers of an indicator of elephant poaching of all elephant species. The site-based monitoring system known as Monitoring the Illegal Killing of Elephants (MIKE), set up by the 10th Conference of the Parties of CITES in 1997, produces carcass encounter data reported mainly by anti-poaching patrols. Data analyzed were site by year totals of 6,337 carcasses from 66 sites in Africa and Asia from 2002–2009. Analysis of these observational data is a serious challenge to traditional statistical methods because of the opportunistic and non-random nature of patrols, and the heterogeneity across sites. Adopting a Bayesian hierarchical modeling approach, we used the proportion of carcasses that were illegally killed (PIKE) as a poaching index, to estimate the trend and the effects of site- and country-level factors associated with poaching. Important drivers of illegal killing that emerged at country level were poor governance and low levels of human development, and at site level, forest cover and area of the site in regions where human population density is low. After a drop from 2002, PIKE remained fairly constant from 2003 until 2006, after which it increased until 2008. The results for 2009 indicate a decline. Sites with PIKE ranging from the lowest to the highest were identified. The results of the analysis provide a sound information base for scientific evidence-based decision making in the CITES process. PMID:21912670
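
    A sketch of the basic shrinkage idea behind site-level PIKE estimation: with a shared Beta prior across sites, each site's estimated proportion of illegally killed carcasses is pulled toward the group-level distribution, so sites with few carcasses borrow strength from the rest. The counts and prior values below are invented, and the paper's full model additionally includes covariates and temporal trends.

```python
import numpy as np

# Invented site totals: illegally killed carcasses vs. total carcasses encountered.
illegal = np.array([3, 12, 0, 45, 7])
total   = np.array([10, 20, 4, 60, 30])

# Beta(a, b) prior on PIKE shared across sites (assumed values; in a full
# hierarchical model a and b would themselves receive priors and be estimated).
a, b = 2.0, 2.0

raw_pike = illegal / total
shrunk_pike = (illegal + a) / (total + a + b)   # posterior mean of PIKE per site

for site, (r, s) in enumerate(zip(raw_pike, shrunk_pike), 1):
    print(f"site {site}: raw PIKE = {r:.2f}, shrunk PIKE = {s:.2f}")
```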

  9. Inferring cetacean population densities from the absolute dynamic topography of the ocean in a hierarchical Bayesian framework.

    Directory of Open Access Journals (Sweden)

    Mario A Pardo

    Full Text Available We inferred the population densities of blue whales (Balaenoptera musculus) and short-beaked common dolphins (Delphinus delphis) in the Northeast Pacific Ocean as functions of the water-column's physical structure by implementing hierarchical models in a Bayesian framework. This approach allowed us to propagate the uncertainty of the field observations into the inference of species-habitat relationships and to generate spatially explicit population density predictions with reduced effects of sampling heterogeneity. Our hypothesis was that the large-scale spatial distributions of these two cetacean species respond primarily to ecological processes resulting from shoaling and outcropping of the pycnocline in regions of wind-forced upwelling and eddy-like circulation. Physically, these processes affect the thermodynamic balance of the water column, decreasing its volume and thus the height of the absolute dynamic topography (ADT). Biologically, they lead to elevated primary productivity and persistent aggregation of low-trophic-level prey. Unlike other remotely sensed variables, ADT provides information about the structure of the entire water column and it is also routinely measured at high spatial-temporal resolution by satellite altimeters with uniform global coverage. Our models provide spatially explicit population density predictions for both species, even in areas where the pycnocline shoals but does not outcrop (e.g. the Costa Rica Dome and the North Equatorial Countercurrent thermocline ridge). Interannual variations in distribution during El Niño anomalies suggest that the population density of both species decreases dramatically in the Equatorial Cold Tongue and the Costa Rica Dome, and that their distributions retract to particular areas that remain productive, such as the more oceanic waters in the central California Current System, the northern Gulf of California, the North Equatorial Countercurrent thermocline ridge, and the more

  10. Global trends and factors associated with the illegal killing of elephants: A hierarchical bayesian analysis of carcass encounter data.

    Directory of Open Access Journals (Sweden)

    Robert W Burn

    Full Text Available Elephant poaching and the ivory trade remain high on the agenda at meetings of the Convention on International Trade in Endangered Species of Wild Fauna and Flora (CITES). Well-informed debates require robust estimates of trends, the spatial distribution of poaching, and drivers of poaching. We present an analysis of trends and drivers of an indicator of elephant poaching of all elephant species. The site-based monitoring system known as Monitoring the Illegal Killing of Elephants (MIKE), set up by the 10th Conference of the Parties of CITES in 1997, produces carcass encounter data reported mainly by anti-poaching patrols. Data analyzed were site by year totals of 6,337 carcasses from 66 sites in Africa and Asia from 2002-2009. Analysis of these observational data is a serious challenge to traditional statistical methods because of the opportunistic and non-random nature of patrols, and the heterogeneity across sites. Adopting a Bayesian hierarchical modeling approach, we used the proportion of carcasses that were illegally killed (PIKE) as a poaching index, to estimate the trend and the effects of site- and country-level factors associated with poaching. Important drivers of illegal killing that emerged at country level were poor governance and low levels of human development, and at site level, forest cover and area of the site in regions where human population density is low. After a drop from 2002, PIKE remained fairly constant from 2003 until 2006, after which it increased until 2008. The results for 2009 indicate a decline. Sites with PIKE ranging from the lowest to the highest were identified. The results of the analysis provide a sound information base for scientific evidence-based decision making in the CITES process.

  11. Estimation of Coast-Wide Population Trends of Marbled Murrelets in Canada Using a Bayesian Hierarchical Model.

    Science.gov (United States)

    Bertram, Douglas F; Drever, Mark C; McAllister, Murdoch K; Schroeder, Bernard K; Lindsay, David J; Faust, Deborah A

    2015-01-01

    Species at risk with secretive breeding behaviours, low densities, and wide geographic range pose a significant challenge to conservation actions because population trends are difficult to detect. Such is the case with the Marbled Murrelet (Brachyramphus marmoratus), a seabird listed as 'Threatened' by the Species at Risk Act in Canada largely due to the loss of its old growth forest nesting habitat. We report the first estimates of population trend of Marbled Murrelets in Canada derived from a monitoring program that uses marine radar to detect birds as they enter forest watersheds during 923 dawn surveys at 58 radar monitoring stations within the six Marbled Murrelet Conservation Regions on coastal British Columbia, Canada, 1996-2013. Temporal trends in radar counts were analyzed with a hierarchical Bayesian multivariate modeling approach that controlled for variation in tilt of the radar unit and day of year, included year-specific deviations from the overall trend ('year effects'), and allowed for trends to be estimated at three spatial scales. A negative overall trend of -1.6%/yr (95% credibility interval: -3.2%, 0.01%) indicated moderate evidence for a coast-wide decline, although trends varied strongly among the six conservation regions. Negative annual trends were detected in East Vancouver Island (-9%/yr) and South Mainland Coast (-3%/yr) Conservation Regions. Over a quarter of the year effects were significantly different from zero, and the estimated standard deviation in common-shared year effects between sites within each region was about 50% per year. This large common-shared interannual variation in counts may have been caused by regional movements of birds related to changes in marine conditions that affect the availability of prey.

  12. Global trends and factors associated with the illegal killing of elephants: A hierarchical bayesian analysis of carcass encounter data.

    Science.gov (United States)

    Burn, Robert W; Underwood, Fiona M; Blanc, Julian

    2011-01-01

    Elephant poaching and the ivory trade remain high on the agenda at meetings of the Convention on International Trade in Endangered Species of Wild Fauna and Flora (CITES). Well-informed debates require robust estimates of trends, the spatial distribution of poaching, and drivers of poaching. We present an analysis of trends and drivers of an indicator of elephant poaching of all elephant species. The site-based monitoring system known as Monitoring the Illegal Killing of Elephants (MIKE), set up by the 10th Conference of the Parties of CITES in 1997, produces carcass encounter data reported mainly by anti-poaching patrols. Data analyzed were site by year totals of 6,337 carcasses from 66 sites in Africa and Asia from 2002-2009. Analysis of these observational data is a serious challenge to traditional statistical methods because of the opportunistic and non-random nature of patrols, and the heterogeneity across sites. Adopting a Bayesian hierarchical modeling approach, we used the proportion of carcasses that were illegally killed (PIKE) as a poaching index, to estimate the trend and the effects of site- and country-level factors associated with poaching. Important drivers of illegal killing that emerged at country level were poor governance and low levels of human development, and at site level, forest cover and area of the site in regions where human population density is low. After a drop from 2002, PIKE remained fairly constant from 2003 until 2006, after which it increased until 2008. The results for 2009 indicate a decline. Sites with PIKE ranging from the lowest to the highest were identified. The results of the analysis provide a sound information base for scientific evidence-based decision making in the CITES process.

  13. Internal cycling, not external loading, decides the nutrient limitation in eutrophic lake: A dynamic model with temporal Bayesian hierarchical inference.

    Science.gov (United States)

    Wu, Zhen; Liu, Yong; Liang, Zhongyao; Wu, Sifeng; Guo, Huaicheng

    2017-06-01

    Lake eutrophication is associated with excessive anthropogenic nutrients (mainly nitrogen (N) and phosphorus (P)) and unobserved internal nutrient cycling. Despite the advances in understanding the role of external loadings, the contribution of internal nutrient cycling is still an open question. A dynamic mass-balance model was developed to simulate and measure the contributions of internal cycling and external loading. It was based on a temporal Bayesian Hierarchical framework (BHM), with which we explored the seasonal patterns in the dynamics of nutrient cycling processes and the limitation of N and P on phytoplankton growth in hyper-eutrophic Lake Dianchi, China. The dynamic patterns of the five state variables (Chla, TP, ammonia, nitrate and organic N) were simulated with the model, and five parameters (algae growth rate, sediment exchange rates of N and P, nitrification rate and denitrification rate) were estimated with the BHM. The model provided a good fit to observations. Our model results highlighted the role of internal cycling of N and P in Lake Dianchi: the internal cycling processes contributed more than external loading to the N and P changes in the water column. Further insights from the nutrient limitation analysis indicated that the sediment exchange of P determined the P limitation. Allowing for the contribution of denitrification to N removal, N was the more limiting nutrient most of the time; however, P was the more important nutrient for eutrophication management. For Lake Dianchi, it would not be possible to recover solely by reducing the external watershed nutrient load; the mechanisms of internal cycling should also be considered as an approach to inhibit the release of nutrients from sediments and to enhance denitrification. Copyright © 2017 Elsevier Ltd. All rights reserved.

  14. The effect of suspended sediment on fertilization success in the urchin Evechinus chloroticus: analysis of experimental data using hierarchical Bayesian methods.

    Science.gov (United States)

    Miller, S L; Richardson, K; Edwards, P A

    2014-11-15

    Terrestrial sediments are a significant stressor on coastal ecosystems, with both suspended and deposited sediment having adverse effects on aquatic organisms. However, information on the effect of suspended sediments on fertilization success for urchin species is lacking. Using sediment levels similar to those encountered in situ, a controlled experiment was conducted to test whether suspended sediment affects fertilization success in the urchin Evechinus chloroticus. Analyses used generalized linear mixed models (GLMMs) and hierarchical Bayesian (HB) regression. Both approaches showed a significant decrease in fertilization success with increased suspended sediment levels. Uncertainties in estimates were narrower for HB models, suggesting that this approach has advantages over GLMMs for sparse data problems sometimes encountered in laboratory experiments. Given future global change scenarios, this work is important for predicting the effects of stressors such as sedimentation that may ultimately impact marine populations. Copyright © 2014 Elsevier Ltd. All rights reserved.

  15. A Predictive Likelihood Approach to Bayesian Averaging

    Directory of Open Access Journals (Sweden)

    Tomáš Jeřábek

    2015-01-01

    Full Text Available Multivariate time series forecasting is applied in a wide range of economic activities related to regional competitiveness and is the basis of almost all macroeconomic analysis. In this paper we combine multivariate density forecasts of GDP growth, inflation and real interest rates from four different models: two types of Bayesian vector autoregression (BVAR) models, a New Keynesian dynamic stochastic general equilibrium (DSGE) model of a small open economy, and a DSGE-VAR model. The performance of the models is evaluated on historical data for the domestic economy and for the foreign economy, which is represented by the countries of the Eurozone. Because the forecast accuracy of the models differs, weighting schemes based on the predictive likelihood, the trace of the past MSE matrix, and model ranks are used to combine the models. The equal-weight scheme is used as a simple combination benchmark. The results show that optimally combined densities are comparable to the best individual models.
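
    A minimal sketch of the predictive-likelihood weighting idea described above (all numbers are invented; the study also considers MSE-trace and rank-based schemes): model weights are proportional to each model's predictive likelihood over a hold-out window, and the combined forecast density is the weighted sum of the individual densities.

```python
# Minimal sketch: predictive-likelihood weights for combining density forecasts.
# Log predictive likelihoods and density values below are hypothetical.
import numpy as np

log_pred_lik = np.array([-152.3, -150.1, -155.8, -151.0])   # one per model (e.g. BVAR1, BVAR2, DSGE, DSGE-VAR)

w = np.exp(log_pred_lik - log_pred_lik.max())
w /= w.sum()                                  # predictive-likelihood weights
w_equal = np.full_like(w, 1.0 / w.size)       # equal-weight benchmark

density_grid = np.array([[0.10, 0.30, 0.25],  # each row: one model's forecast density
                         [0.12, 0.28, 0.26],  # evaluated on a common grid (made up)
                         [0.08, 0.33, 0.22],
                         [0.11, 0.29, 0.27]])

print("weights:", np.round(w, 3))
print("combined density:", np.round(w @ density_grid, 3))
print("equal-weight density:", np.round(w_equal @ density_grid, 3))
```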

  16. Bayesian approach in the power electric systems study of reliability ...

    African Journals Online (AJOL)

    This work aims to highlight prerogatives and advantages of the Bayesian approach in the reliability studies of the modern power electrical systems. The new organization of the electric energy sector and the consistent degree of technological innovation make the data more uncertain related to the operation of the electric ...

  17. Bayesian ensemble approach to error estimation of interatomic potentials

    DEFF Research Database (Denmark)

    Frederiksen, Søren Lund; Jacobsen, Karsten Wedel; Brown, K.S.

    2004-01-01

    Using a Bayesian approach a general method is developed to assess error bars on predictions made by models fitted to data. The error bars are estimated from fluctuations in ensembles of models sampling the model-parameter space with a probability density set by the minimum cost. The method is app...

  18. A Bayesian approach to the Japanese Black cattle carcass genetic ...

    African Journals Online (AJOL)

    Peer-reviewed paper: 10th World Conference on Animal Production. A Bayesian approach to the Japanese Black cattle carcass genetic evaluation. A. Arakawa, H. Iwaisaki (Graduate School of Agriculture, Division of Applied Biosciences, Kyoto University, Kyoto 606-8502, Japan) and K. Anada (Wagyu ...)

  19. A Bayesian approach to combining animal abundance and demographic data

    Directory of Open Access Journals (Sweden)

    Brooks, S. P.

    2004-06-01

    Full Text Available In studies of wild animals, one frequently encounters both count and mark-recapture-recovery data. Here, we consider an integrated Bayesian analysis of ring-recovery and count data using a state-space model. We then impose a Leslie-matrix-based model on the true population counts describing the natural birth-death and age transition processes. We focus upon the analysis of both count and recovery data collected on British lapwings (Vanellus vanellus) combined with records of the number of frost days each winter. We demonstrate how the combined analysis of these data provides a more robust inferential framework and discuss how the Bayesian approach using MCMC allows us to remove the potentially restrictive normality assumptions commonly assumed for analyses of this sort. It is shown how WinBUGS may be used to perform the Bayesian analysis. WinBUGS code is provided and its performance is critically discussed.

  20. Evaluating impacts using a BACI design, ratios, and a Bayesian approach with a focus on restoration.

    Science.gov (United States)

    Conner, Mary M; Saunders, W Carl; Bouwes, Nicolaas; Jordan, Chris

    2015-10-01

    Before-after-control-impact (BACI) designs are an effective method to evaluate natural and human-induced perturbations on ecological variables when treatment sites cannot be randomly chosen. While effect sizes of interest can be tested with frequentist methods, using Bayesian Markov chain Monte Carlo (MCMC) sampling methods, probabilities of effect sizes, such as a ≥20% increase in density after restoration, can be directly estimated. Although BACI and Bayesian methods are used widely for assessing natural and human-induced impacts in field experiments, the application of hierarchical Bayesian modeling with MCMC sampling to BACI designs is less common. Here, we combine these approaches and extend the typical presentation of results with an easy-to-interpret ratio, which provides an answer to the main study question: "How much impact did a management action or natural perturbation have?" As an example of this approach, we evaluate the impact of a restoration project, which implemented beaver dam analogs, on survival and density of juvenile steelhead. Results indicated the probabilities of a ≥30% increase were high for survival and density after the dams were installed, 0.88 and 0.99, respectively, while probabilities for a higher increase of ≥50% were variable, 0.17 and 0.82, respectively. This approach demonstrates a useful extension of Bayesian methods that can easily be generalized to other study designs, from simple (e.g., single-factor ANOVA, paired t test) to more complicated block designs (e.g., crossover, split-plot). This approach is valuable for estimating the probabilities of restoration impacts or other management actions.
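
    A minimal sketch of how effect-size probabilities like those quoted above follow directly from MCMC output (the draws below are simulated, not the study's posterior): form the after/before ratio from posterior samples of density, then report the share of draws exceeding each threshold.

```python
# Minimal sketch: probability of a >=20%, >=30%, or >=50% increase from posterior
# draws of density before and after restoration. Draws are simulated placeholders.
import numpy as np

rng = np.random.default_rng(1)
before = rng.lognormal(mean=np.log(2.0), sigma=0.15, size=50_000)
after = rng.lognormal(mean=np.log(2.7), sigma=0.15, size=50_000)

ratio = after / before
for threshold in (1.2, 1.3, 1.5):
    print(f"P(increase >= {100 * (threshold - 1):.0f}%) = {np.mean(ratio >= threshold):.2f}")
```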

  1. A hierarchical method for Bayesian inference of rate parameters from shock tube data: Application to the study of the reaction of hydroxyl with 2-methylfuran

    KAUST Repository

    Kim, Daesang

    2017-06-22

    We developed a novel two-step hierarchical method for the Bayesian inference of the rate parameters of a target reaction from time-resolved concentration measurements in shock tubes. The method was applied to the calibration of the parameters of the reaction of hydroxyl with 2-methylfuran, which is studied experimentally via absorption measurements of the OH radical's concentration following shock-heating. In the first step of the approach, each shock tube experiment is treated independently to infer the posterior distribution of the rate constant and error hyper-parameter that best explains the OH signal. In the second step, these posterior distributions are sampled to calibrate the parameters appearing in the Arrhenius reaction model for the rate constant. Furthermore, the second step is modified and repeated in order to explore alternative rate constant models and to assess the effect of uncertainties in the reflected shock's temperature. Comparisons of the estimates obtained via the proposed methodology against the common least squares approach are presented. The relative merits of the novel Bayesian framework are highlighted, especially with respect to the opportunity to utilize the posterior distributions of the parameters in future uncertainty quantification studies.
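
    A minimal sketch of the idea behind the second calibration step (this is the deterministic analogue with made-up numbers, not the authors' Bayesian sampler): per-experiment rate constants are fit to the Arrhenius form k(T) = A exp(-Ea/(R T)), which is linear in 1/T after taking logs; in the hierarchical method this fit is effectively repeated over posterior draws rather than point estimates.

```python
# Minimal sketch: least-squares Arrhenius fit, ln k = ln A - (Ea/R) * (1/T).
# Temperatures and rate constants below are hypothetical.
import numpy as np

R = 8.314                                               # J/(mol K)
T = np.array([900.0, 1000.0, 1100.0, 1200.0])           # shock temperatures (K)
k = np.array([2.1e12, 4.0e12, 6.8e12, 1.05e13])         # inferred rate constants

X = np.column_stack([np.ones_like(T), 1.0 / T])
coef, *_ = np.linalg.lstsq(X, np.log(k), rcond=None)
A, Ea = np.exp(coef[0]), -coef[1] * R
print(f"A = {A:.3e}, Ea = {Ea / 1000:.1f} kJ/mol")
```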

  2. Econometric Assessment of Research Programs: A Bayesian Approach

    OpenAIRE

    Qin, Lin; Buccola, Steven T.

    2012-01-01

    Effective research-project assessment typically is impeded by project variety. In particular, bibliometric approaches to science assessment tend to offer little information about the content of the projects examined. We introduce here a new approach, based on Bayesian theory, for econometrically evaluating the factors affecting scientific discovery, and use the method to assess a biological research program comprising numerous heterogeneous projects. Our knowledge metric not only flexibly ...

  3. A full Bayesian hierarchical mixture model for the variance of gene differential expression

    Directory of Open Access Journals (Sweden)

    Walls Rebecca E

    2007-04-01

    Full Text Available Abstract Background In many laboratory-based high throughput microarray experiments, there are very few replicates of gene expression levels. Thus, estimates of gene variances are inaccurate. Visual inspection of graphical summaries of these data usually reveals that heteroscedasticity is present, and the standard approach to address this is to take a log2 transformation. In such circumstances, it is then common to assume that gene variability is constant when an analysis of these data is undertaken. However, this is perhaps too stringent an assumption. More careful inspection reveals that the simple log2 transformation does not remove the problem of heteroscedasticity. An alternative strategy is to assume independent gene-specific variances; although again this is problematic, as variance estimates based on few replications are highly unstable. More meaningful and reliable comparisons of gene expression might be achieved, for different conditions or different tissue samples, where the test statistics are based on accurate estimates of gene variability; a crucial step in the identification of differentially expressed genes. Results We propose a Bayesian mixture model, which classifies genes according to similarity in their variance. The result is that genes in the same latent class share a similar variance, estimated from a larger number of replicates than are available per gene alone, i.e. the total of all replicates of all genes in the same latent class. An example dataset, consisting of 9216 genes with four replicates per condition, resulted in four latent classes based on similarity of variance. Conclusion The mixture variance model provides a realistic and flexible estimate for the variance of gene expression data under limited replicates. We believe that in using the latent class variances, estimated from a larger number of genes in each derived latent group, the p-values obtained are more robust than either using a constant gene or

  4. A BAYESIAN HIERARCHICAL SPATIAL POINT PROCESS MODEL FOR MULTI-TYPE NEUROIMAGING META-ANALYSIS.

    Science.gov (United States)

    Kang, Jian; Nichols, Thomas E; Wager, Tor D; Johnson, Timothy D

    2014-09-01

    Neuroimaging meta-analysis is an important tool for finding consistent effects over studies that each usually have 20 or fewer subjects. Interest in meta-analysis in brain mapping is also driven by a recent focus on so-called "reverse inference": whereas traditional "forward inference" identifies the regions of the brain involved in a task, a reverse inference identifies the cognitive processes that a task engages. Such reverse inferences, however, require a set of meta-analyses, one for each possible cognitive domain. However, existing methods for neuroimaging meta-analysis have significant limitations. Commonly used methods for neuroimaging meta-analysis are not model based, do not provide interpretable parameter estimates, and only produce null hypothesis inferences; further, they are generally designed for a single group of studies and cannot produce reverse inferences. In this work we address these limitations by adopting a non-parametric Bayesian approach for meta-analysis of data from multiple classes or types of studies. In particular, foci from each type of study are modeled as a cluster process driven by a random intensity function that is modeled as a kernel convolution of a gamma random field. The type-specific gamma random fields are linked and modeled as a realization of a common gamma random field, shared by all types, that induces correlation between study types and mimics the behavior of a univariate mixed effects model. We illustrate our model on simulation studies and a meta-analysis of five emotions from 219 studies and check model fit by a posterior predictive assessment. In addition, we implement reverse inference by using the model to predict study type from a newly presented study. We evaluate this predictive performance via leave-one-out cross validation that is efficiently implemented using importance sampling techniques.

  5. Accurate phenotyping: Reconciling approaches through Bayesian model averaging.

    Directory of Open Access Journals (Sweden)

    Carla Chia-Ming Chen

    Full Text Available Genetic research into complex diseases is frequently hindered by a lack of clear biomarkers for phenotype ascertainment. Phenotypes for such diseases are often identified on the basis of clinically defined criteria; however, such criteria may not be suitable for understanding the genetic composition of the diseases. Various statistical approaches have been proposed for phenotype definition; however, our previous studies have shown that differences in phenotypes estimated using different approaches have substantial impact on subsequent analyses. Instead of obtaining results based upon a single model, we propose a new method, using Bayesian model averaging to overcome problems associated with phenotype definition. Although Bayesian model averaging has been used in other fields of research, this is the first study that uses Bayesian model averaging to reconcile phenotypes obtained using multiple models. We illustrate the new method by applying it to simulated genetic and phenotypic data for Kofendred personality disorder, an imaginary disease with several sub-types. Two separate statistical methods were used to identify clusters of individuals with distinct phenotypes: latent class analysis and grade of membership. Bayesian model averaging was then used to combine the two clusterings for the purpose of subsequent linkage analyses. We found that causative genetic loci for the disease produced higher LOD scores using model averaging than under either individual model separately. We attribute this improvement to consolidation of the cores of phenotype clusters identified using each individual method.

  6. Stochastic model updating utilizing Bayesian approach and Gaussian process model

    Science.gov (United States)

    Wan, Hua-Ping; Ren, Wei-Xin

    2016-03-01

    Stochastic model updating (SMU) has been increasingly applied in quantifying structural parameter uncertainty from response variability. SMU for parameter uncertainty quantification refers to the problem of inverse uncertainty quantification (IUQ), which is a nontrivial task. Inverse problems solved with optimization usually bring about the issues of gradient computation, ill-conditioning, and non-uniqueness. Moreover, the uncertainty present in the response makes the inverse problem more complicated. In this study, a Bayesian approach is adopted in SMU for parameter uncertainty quantification. The prominent strength of the Bayesian approach for IUQ problems is that it solves the IUQ problem in a straightforward manner, which enables it to avoid the previous issues. However, when applied to engineering structures that are modeled with a high-resolution finite element model (FEM), the Bayesian approach is still computationally expensive, since the commonly used Markov chain Monte Carlo (MCMC) method for Bayesian inference requires a large number of model runs to guarantee convergence. Herein we reduce the computational cost in two ways. On the one hand, the fast-running Gaussian process model (GPM) is utilized to approximate the time-consuming high-resolution FEM. On the other hand, an advanced MCMC method using the delayed rejection adaptive Metropolis (DRAM) algorithm, which incorporates a local adaptive strategy with a global adaptive strategy, is employed for Bayesian inference. In addition, we propose the use of the powerful variance-based global sensitivity analysis (GSA) in parameter selection to exclude non-influential parameters from the calibration parameters, which yields a reduced-order model and thus further alleviates the computational burden. A simulated aluminum plate and a real-world complex cable-stayed pedestrian bridge are presented to illustrate the proposed framework and verify its feasibility.
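
    A minimal sketch of the surrogate-plus-MCMC idea described above, on a toy one-parameter problem (nothing here reproduces the paper's FEM, DRAM sampler, or GSA step): a Gaussian process is trained on a handful of "expensive" model runs and then stands in for the model inside a plain Metropolis sampler.

```python
# Minimal sketch: Gaussian process surrogate of an expensive model inside a
# Metropolis sampler. Toy model, toy data, uniform prior on [0, 2].
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def expensive_model(theta):                    # stand-in for a high-resolution FEM run
    return np.sin(3.0 * theta) + 0.5 * theta

rng = np.random.default_rng(2)
theta_design = np.linspace(0.0, 2.0, 12).reshape(-1, 1)
y_design = np.array([expensive_model(t[0]) for t in theta_design])
gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5)).fit(theta_design, y_design)

y_obs, sigma = expensive_model(1.3) + 0.05, 0.1          # one noisy "measurement"

def log_post(theta):
    if not 0.0 <= theta <= 2.0:
        return -np.inf
    pred = gp.predict(np.array([[theta]]))[0]
    return -0.5 * ((y_obs - pred) / sigma) ** 2

theta, samples = 1.0, []
for _ in range(3000):                                     # random-walk Metropolis
    prop = theta + 0.1 * rng.standard_normal()
    if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
        theta = prop
    samples.append(theta)
print("posterior mean of theta:", np.mean(samples[500:]))
```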

  7. Improving waterfowl population estimates using hierarchical models: A new approach

    OpenAIRE

    Barker, Nicole; Cumming, Steve; Darveau, Marcel

    2013-01-01

    Recommended citation: Barker, N. K. S., S. G. Cumming, and M. Darveau. 2013. Improving waterfowl population estimates using hierarchical models: A new approach. Poster, Ecology and Conservation of North American Waterfowl. Memphis, TN, USA. Retrieved from figshare: http://dx.doi.org/10.6084/m9.figshare.658776.

  8. SNP based heritability estimation using a Bayesian approach

    DEFF Research Database (Denmark)

    Krag, Kristian; Janss, Luc; Mahdi Shariati, Mohammad

    2013-01-01

    Heritability is a central element in quantitative genetics. New molecular markers to assess genetic variance and heritability are continually under development. The availability of molecular single nucleotide polymorphism (SNP) markers can be applied for estimation of variance components. ... of 0.05, all models had difficulties in estimating the true heritability. The two Bayesian models were compared with a restricted maximum likelihood (REML) approach using a genomic relationship matrix. The comparison showed that the Bayesian approaches performed as well as the REML approach. ... Differences in family structure were in general not found to influence the estimation of the heritability. For the sample sizes used in this study, a 10-fold increase of SNP density did not improve precision estimates compared with set-ups with a less dense distribution of SNPs. The methods used in this study ...

  9. A Bayesian approach to simultaneously quantify assignments and linguistic uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Chavez, Gregory M [Los Alamos National Laboratory]; Booker, Jane M [Booker Scientific, Fredericksburg]; Ross, Timothy J [UNM]

    2010-10-07

    Subject matter expert assessments can include both assignment and linguistic uncertainty. This paper examines assessments containing linguistic uncertainty associated with a qualitative description of a specific state of interest and the assignment uncertainty associated with assigning a qualitative value to that state. A Bayesian approach is examined to simultaneously quantify both assignment and linguistic uncertainty in the posterior probability. The approach is applied to a simplified damage assessment model involving both assignment and linguistic uncertainty. The utility of the approach and the conditions under which the approach is feasible are examined and identified.

  10. Sparse Estimation Using Bayesian Hierarchical Prior Modeling for Real and Complex Linear Models

    DEFF Research Database (Denmark)

    Pedersen, Niels Lovmand; Manchón, Carles Navarro; Badiu, Mihai Alin

    2015-01-01

    In sparse Bayesian learning (SBL), Gaussian scale mixtures (GSMs) have been used to model sparsity-inducing priors that realize a class of concave penalty functions for the regression task in real-valued signal models. Motivated by the relative scarcity of formal tools for SBL in complex-valued m...

  11. Source reconstruction accuracy of MEG and EEG Bayesian inversion approaches.

    Directory of Open Access Journals (Sweden)

    Paolo Belardinelli

    Full Text Available Electro- and magnetoencephalography allow for non-invasive investigation of human brain activation and corresponding networks with high temporal resolution. Still, no correct network detection is possible without reliable source localization. In this paper, we examine four different source localization schemes under a common Variational Bayesian framework. A Bayesian approach to the Minimum Norm Model (MNM), an Empirical Bayesian Beamformer (EBB) and two iterative Bayesian schemes, Automatic Relevance Determination (ARD) and Greedy Search (GS), are quantitatively compared. While EBB and MNM each use a single empirical prior, ARD and GS employ a library of anatomical priors that define possible source configurations. The localization performance was investigated as a function of (i) the number of sources (one vs. two vs. three), (ii) the signal-to-noise ratio (SNR; 5 levels) and (iii) the temporal correlation of source time courses (for the cases of two or three sources). We also tested whether the use of additional bilateral priors specifying source covariance for the ARD and GS algorithms improved performance. Our results show that MNM proves effective only with single-source configurations. EBB shows a spatial accuracy of a few millimeters with high SNRs and low correlation between sources. In contrast, ARD and GS are more robust to noise and less affected by temporal correlations between sources. However, the spatial accuracy of ARD and GS is generally limited to the order of one centimeter. We found that the use of correlated covariance priors made no difference to ARD/GS performance.

  12. Comparison of Bayesian and frequentist approaches in modelling risk of preterm birth near the Sydney Tar Ponds, Nova Scotia, Canada

    Directory of Open Access Journals (Sweden)

    Canty Angelo

    2007-09-01

    Full Text Available Abstract Background This study compares the Bayesian and frequentist (non-Bayesian) approaches in the modelling of the association between the risk of preterm birth and maternal proximity to hazardous waste and pollution from the Sydney Tar Pond site in Nova Scotia, Canada. Methods The data include 1604 observed cases of preterm birth out of a total population of 17559 at risk of preterm birth from 144 enumeration districts in the Cape Breton Regional Municipality. Other covariates include the distance from the Tar Pond; the rate of unemployment to population; the proportion of persons who are separated, divorced or widowed; the proportion of persons who have no high school diploma; the proportion of persons living alone; the proportion of single-parent families and average income. Bayesian hierarchical Poisson regression, quasi-likelihood Poisson regression and weighted linear regression models were fitted to the data. Results The results of the analyses were compared together with their limitations. Conclusion The results of the weighted linear regression and the quasi-likelihood Poisson regression agree with the results from the Bayesian hierarchical modelling, which incorporates the spatial effects.

  13. A Bayesian approach to optimizing cryopreservation protocols

    Directory of Open Access Journals (Sweden)

    Sammy Sambu

    2015-06-01

    Full Text Available Cryopreservation is beset with the challenge of protocol alignment across a wide range of cell types and process variables. By taking a cross-sectional assessment of previously published cryopreservation data (sample means and standard errors) as preliminary meta-data, a decision tree learning analysis (DTLA) was performed to develop an understanding of target survival using optimized pruning methods based on different approaches. Briefly, a clear direction on the decision process for selection of methods was developed, with key choices being the cooling rate and plunge temperature on the one hand, and biomaterial choice, use of composites (sugars and proteins) as additional constituents, loading procedure and cell location in 3D scaffolding on the other. Secondly, using machine learning and generalized approaches via the Naïve Bayes Classification (NBC) method, these metadata were used to develop posterior probabilities for combinatorial approaches that were implicitly recorded in the metadata. These latter results showed that newer protocol choices developed using probability elicitation techniques can unearth improved protocols consistent with multiple unidimensionally-optimized physical protocols. In conclusion, this article proposes the use of DTLA models and subsequently NBC for the improvement of modern cryopreservation techniques through an integrative approach.

  14. Pollen-Climate Calibration, Characterization of Statistical Uncertainty, and Forward Modeling for Integration Into Bayesian Hierarchical Climate Reconstruction

    Science.gov (United States)

    Wahl, E. R.

    2008-12-01

    A strict process model for pollen as a climate proxy is currently not approachable beyond localized spatial scales; more generally, the canonical model for vegetation-pollen registration itself requires assimilation of empirically-derived information. In this paper, a taxonomically "reduced-space" climate-pollen forward model is developed, based on the performance of a parallel inverse model. The goal is inclusion of the forward model in a Bayesian climate reconstruction framework, following a 4-step process. (1) Ratios of pollen types calibrated to temperature are examined to determine if they can equal or surpass the skill of multi-taxonomic calibrations using the modern analog technique (MAT) optimized with receiver operating characteristic (ROC) analysis. The first phase of this examination, using modern pollen data from SW N America, demonstrates that the ratio method can give calibrations as skillful as the MAT when vegetation representation (and associated climate gradients) are characterized by two dominant pollen taxa, in this case pine and oak. Paleotemperature reconstructions using the ratio method also compare well to MAT reconstructions, showing very minor differences. [Ratio values are defined as pine/(pine + oak), so they vary between 0 and 1.] (2) Uncertainty analysis is carried out in independent steps, which are combined to give overall probabilistic confidence ranges. Monte Carlo (MC) analysis utilizing Poisson distributions to model the inherent variability of pollen representation in relation to climate (assuming defined temperature normals at the modern calibration sites) allows independent statistical estimation of this component of uncertainty, for both the modern calibration and fossil pollen data sets. In turn, MC analysis utilizing normal distributions allows independent estimation of the addition to overall uncertainty from climate variation itself. (3) Because the quality tests in (1) indicate the ratio method has the capacity to carry
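
    A minimal illustration of the Monte Carlo step described in (2), with invented counts: Poisson variability in the pine and oak grain counts is propagated into the pine/(pine + oak) ratio to give an uncertainty range for the predictor.

```python
# Minimal sketch: Poisson Monte Carlo for the pine/(pine + oak) ratio.
# Expected grain counts are hypothetical.
import numpy as np

rng = np.random.default_rng(3)
expected_pine, expected_oak = 180.0, 120.0

pine = rng.poisson(expected_pine, size=100_000)
oak = rng.poisson(expected_oak, size=100_000)
ratio = pine / (pine + oak)

lo, med, hi = np.percentile(ratio, [2.5, 50, 97.5])
print(f"ratio = {med:.3f} (95% range {lo:.3f}-{hi:.3f})")
```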

  15. Data Assimilation using an Ensemble of Models: A hierarchical approach

    OpenAIRE

    Rayner, Peter

    2017-01-01

    One characteristic of biogeochemical models is uncertainty about their formulation. Data assimilation should take this uncertainty into account. A common approach is to use an ensemble of models. We must assign probabilities not only to the parameters of the models but the models themselves. The method of hierarchical modelling allows us to calculate these probabilities. This paper describes the approach, develops the algebra for the most common case then applies it to the TRANSCO...

  16. A Dynamic Bayesian Approach to Computational Laban Shape Quality Analysis

    Directory of Open Access Journals (Sweden)

    Dilip Swaminathan

    2009-01-01

    kinesiology. LMA (especially Effort/Shape) emphasizes how internal feelings and intentions govern the patterning of movement throughout the whole body. As we argue, a complex understanding of intention via LMA is necessary for human-computer interaction to become embodied in ways that resemble interaction in the physical world. We thus introduce a novel, flexible Bayesian fusion approach for identifying LMA Shape qualities from raw motion capture data in real time. The method uses a dynamic Bayesian network (DBN) to fuse movement features across the body and across time and, as we discuss, can be readily adapted for low-cost video. It has delivered excellent performance in preliminary studies comprising improvisatory movements. Our approach has been incorporated in Response, a mixed-reality environment where users interact via natural, full-body human movement and enhance their bodily-kinesthetic awareness through immersive sound and light feedback, with applications to kinesiology training, Parkinson's patient rehabilitation, interactive dance, and many other areas.

  17. A Bayesian sequential processor approach to spectroscopic portal system decisions

    Energy Technology Data Exchange (ETDEWEB)

    Sale, K; Candy, J; Breitfeller, E; Guidry, B; Manatt, D; Gosnell, T; Chambers, D

    2007-07-31

    The development of faster more reliable techniques to detect radioactive contraband in a portal type scenario is an extremely important problem especially in this era of constant terrorist threats. Towards this goal the development of a model-based, Bayesian sequential data processor for the detection problem is discussed. In the sequential processor each datum (detector energy deposit and pulse arrival time) is used to update the posterior probability distribution over the space of model parameters. The nature of the sequential processor approach is that a detection is produced as soon as it is statistically justified by the data rather than waiting for a fixed counting interval before any analysis is performed. In this paper the Bayesian model-based approach, physics and signal processing models and decision functions are discussed along with the first results of our research.

  18. A Bayesian inference approach to unveil supply curves in electricity markets

    DEFF Research Database (Denmark)

    Mitridati, Lesia Marie-Jeanne Mariane; Pinson, Pierre

    2017-01-01

    With increased competition in wholesale electricity markets, the need for new decision-making tools for strategic producers has arisen. Optimal bidding strategies have traditionally been modeled as stochastic profit maximization problems. However, for producers with non-negligible market power, modeling the interactions with rival participants is fundamental. This can be achieved through equilibrium and hierarchical optimization models. The efficiency of these methods relies on the strategic producer's ability to model rival participants' behavior and supply curve. But a substantial gap remains in the literature on modeling this uncertainty. In this study we introduce a Bayesian inference approach to reveal the aggregate supply curve in a day-ahead electricity market. The proposed algorithm relies on Markov Chain Monte Carlo and Sequential Monte Carlo methods. The major appeal of this approach...

  19. A Two-Step Bayesian Approach for Propensity Score Analysis: Simulations and Case Study

    Science.gov (United States)

    Kaplan, David; Chen, Jianshen

    2012-01-01

    A two-step Bayesian propensity score approach is introduced that incorporates prior information in the propensity score equation and outcome equation without the problems associated with simultaneous Bayesian propensity score approaches. The corresponding variance estimators are also provided. The two-step Bayesian propensity score is provided for…

  20. Linking bovine tuberculosis on cattle farms to white-tailed deer and environmental variables using Bayesian hierarchical analysis.

    Directory of Open Access Journals (Sweden)

    W David Walter

    Full Text Available Bovine tuberculosis is a bacterial disease caused by Mycobacterium bovis in livestock and wildlife, with hosts that include Eurasian badgers (Meles meles), brushtail possum (Trichosurus vulpecula), and white-tailed deer (Odocoileus virginianus). Risk-assessment efforts in Michigan have been initiated on farms to minimize interactions of cattle with wildlife hosts, but research on M. bovis on cattle farms has not investigated the spatial context of disease epidemiology. To incorporate spatially explicit data, initial likelihood of infection probabilities for cattle farms tested for M. bovis, prevalence of M. bovis in white-tailed deer, deer density, and environmental variables for each farm were modeled in a Bayesian hierarchical framework. We used geo-referenced locations of 762 cattle farms that have been tested for M. bovis, white-tailed deer prevalence, and several environmental variables that may lead to long-term survival and viability of M. bovis on farms and surrounding habitats (i.e., soil type, habitat type). Bayesian hierarchical analyses identified deer prevalence and proportion of sandy soil within our sampling grid as the most supported model. Analysis of cattle farms tested for M. bovis identified that every 1% increase in sandy soil resulted in a 4% increase in the odds of infection. Our analysis revealed that the influence of prevalence of M. bovis in white-tailed deer was still a concern even after considerable efforts to prevent cattle interactions with white-tailed deer through on-farm mitigation and reduction in the deer population. Cattle farms test positive for M. bovis annually in our study area, suggesting that an environmental source either on farms or in the surrounding landscape may be contributing to new infections or re-infections with M. bovis. Our research provides an initial assessment of potential environmental factors that could be incorporated into additional modeling efforts as more knowledge of deer herd
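
    For readers unfamiliar with the odds statement above, a one-line translation (the coefficient value is illustrative, not taken from the study): in a logistic model, a coefficient of about 0.039 per 1% sandy soil corresponds to an odds multiplier of exp(0.039), i.e. roughly a 4% increase in the odds of infection per 1% increase in sandy soil.

```python
# Minimal sketch: converting a logistic-regression coefficient into a percent
# change in odds. The coefficient value is hypothetical.
import numpy as np

beta_sand = 0.039                       # log-odds change per 1% sandy soil (illustrative)
odds_ratio = np.exp(beta_sand)
print(f"odds multiplier per 1% sandy soil: {odds_ratio:.3f} "
      f"(~{100 * (odds_ratio - 1):.0f}% higher odds)")
```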

  1. Analyzing large-scale conservation interventions with Bayesian hierarchical models: a case study of supplementing threatened Pacific salmon.

    Science.gov (United States)

    Scheuerell, Mark D; Buhle, Eric R; Semmens, Brice X; Ford, Michael J; Cooney, Tom; Carmichael, Richard W

    2015-05-01

    Myriad human activities increasingly threaten the existence of many species. A variety of conservation interventions such as habitat restoration, protected areas, and captive breeding have been used to prevent extinctions. Evaluating the effectiveness of these interventions requires appropriate statistical methods, given the quantity and quality of available data. Historically, analysis of variance has been used with some form of predetermined before-after control-impact design to estimate the effects of large-scale experiments or conservation interventions. However, ad hoc retrospective study designs or the presence of random effects at multiple scales may preclude the use of these tools. We evaluated the effects of a large-scale supplementation program on the density of adult Chinook salmon Oncorhynchus tshawytscha from the Snake River basin in the northwestern United States currently listed under the U.S. Endangered Species Act. We analyzed 43 years of data from 22 populations, accounting for random effects across time and space using a form of Bayesian hierarchical time-series model common in analyses of financial markets. We found that varying degrees of supplementation over a period of 25 years increased the density of natural-origin adults, on average, by 0-8% relative to nonsupplementation years. Thirty-nine of the 43 year effects were at least two times larger in magnitude than the mean supplementation effect, suggesting common environmental variables play a more important role in driving interannual variability in adult density. Additional residual variation in density varied considerably across the region, but there was no systematic difference between supplemented and reference populations. Our results demonstrate the power of hierarchical Bayesian models to detect the diffuse effects of management interventions and to quantitatively describe the variability of intervention success. Nevertheless, our study could not address whether ecological factors

  2. A bayesian approach to laboratory utilization management

    Directory of Open Access Journals (Sweden)

    Ronald G Hauser

    2015-01-01

    Full Text Available Background: Laboratory utilization management describes a process designed to increase healthcare value by altering requests for laboratory services. A typical approach to monitor and prioritize interventions involves audits of laboratory orders against specific criteria, defined as rule-based laboratory utilization management. This approach has inherent limitations. First, rules are inflexible. They adapt poorly to the ambiguity of medical decision-making. Second, rules judge the context of a decision instead of the patient outcome, allowing an order to simultaneously save a life and break a rule. Third, rules can threaten physician autonomy when used in a performance evaluation. Methods: We developed an alternative to rule-based laboratory utilization. The core idea comes from a formula used in epidemiology to estimate disease prevalence. The equation relates four terms: the prevalence of disease, the proportion of positive tests, test sensitivity and test specificity. When applied to a laboratory utilization audit, the formula estimates the prevalence of disease (pretest probability [PTP]) in the patients tested. The comparison of PTPs among different providers, provider groups, or patient cohorts produces an objective evaluation of laboratory requests. We demonstrate the model in a review of tests for enterovirus (EV) meningitis. Results: The model identified subpopulations within the cohort with a low prevalence of disease. These low-prevalence groups shared demographic and seasonal factors known to protect against EV meningitis. This suggests that too many orders occurred from patients at low risk for EV. Conclusion: We introduce a new method for laboratory utilization management programs to audit laboratory services.
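
    The four-term relationship mentioned above is the standard identity linking observed test positivity to prevalence: p_pos = PTP*Se + (1 - PTP)*(1 - Sp). A minimal sketch of inverting it to estimate the pretest probability of disease in a tested cohort (the positivity, sensitivity, and specificity values are hypothetical):

```python
# Minimal sketch: pretest probability from observed positivity, sensitivity,
# and specificity, PTP = (p_pos + Sp - 1) / (Se + Sp - 1). Inputs are hypothetical.
def pretest_probability(p_pos, sensitivity, specificity):
    return (p_pos + specificity - 1.0) / (sensitivity + specificity - 1.0)

ptp = pretest_probability(p_pos=0.06, sensitivity=0.95, specificity=0.99)
print(f"estimated pretest probability of EV meningitis in the tested cohort: {ptp:.3f}")
```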

  3. Bayesian approach to analyzing holograms of colloidal particles.

    Science.gov (United States)

    Dimiduk, Thomas G; Manoharan, Vinothan N

    2016-10-17

    We demonstrate a Bayesian approach to tracking and characterizing colloidal particles from in-line digital holograms. We model the formation of the hologram using Lorenz-Mie theory. We then use a tempered Markov-chain Monte Carlo method to sample the posterior probability distributions of the model parameters: particle position, size, and refractive index. Compared to least-squares fitting, our approach allows us to more easily incorporate prior information about the parameters and to obtain more accurate uncertainties, which are critical for both particle tracking and characterization experiments. Our approach also eliminates the need to supply accurate initial guesses for the parameters, so it requires little tuning.

  4. Hierarchical Bayesian analysis of outcome- and process-based social preferences and beliefs in Dictator Games and sequential Prisoner's Dilemmas.

    Science.gov (United States)

    Aksoy, Ozan; Weesie, Jeroen

    2014-05-01

    In this paper, using a within-subjects design, we estimate the utility weights that subjects attach to the outcome of their interaction partners in four decision situations: (1) binary Dictator Games (DG), second player's role in the sequential Prisoner's Dilemma (PD) after the first player (2) cooperated and (3) defected, and (4) first player's role in the sequential Prisoner's Dilemma game. We find that the average weights in these four decision situations have the following order: (1)>(2)>(4)>(3). Moreover, the average weight is positive in (1) but negative in (2), (3), and (4). Our findings indicate the existence of strong negative and small positive reciprocity for the average subject, but there is also high interpersonal variation in the weights in these four nodes. We conclude that the PD frame makes subjects more competitive than the DG frame. Using hierarchical Bayesian modeling, we simultaneously analyze beliefs of subjects about others' utility weights in the same four decision situations. We compare several alternative theoretical models on beliefs, e.g., rational beliefs (Bayesian-Nash equilibrium) and a consensus model. Our results on beliefs strongly support the consensus effect and refute rational beliefs: there is a strong relationship between own preferences and beliefs and this relationship is relatively stable across the four decision situations. Copyright © 2014 Elsevier Inc. All rights reserved.

  5. Hierarchical Bayesian Spatio-Temporal Analysis of Climatic and Socio-Economic Determinants of Rocky Mountain Spotted Fever.

    Directory of Open Access Journals (Sweden)

    Ram K Raghavan

    Full Text Available This study aims to examine the spatio-temporal dynamics of Rocky Mountain spotted fever (RMSF) prevalence in four contiguous states of the Midwestern United States, and to determine the impact of environmental and socio-economic factors associated with this disease. Bayesian hierarchical models were used to quantify space and time only trends and spatio-temporal interaction effect in the case reports submitted to the state health departments in the region. Various socio-economic, environmental and climatic covariates screened a priori in a bivariate procedure were added to a main-effects Bayesian model in progressive steps to evaluate important drivers of RMSF space-time patterns in the region. Our results show a steady increase in RMSF incidence over the study period to newer geographic areas, and the posterior probabilities of county-specific trends indicate clustering of high-risk counties in the central and southern parts of the study region. At the spatial scale of a county, the prevalence levels of RMSF are influenced by poverty status, average relative humidity, and average land surface temperature (>35°C) in the region, and the relevance of these factors in the context of climate-change impacts on tick-borne diseases is discussed.

  6. Hierarchical Bayesian Spatio-Temporal Analysis of Climatic and Socio-Economic Determinants of Rocky Mountain Spotted Fever.

    Science.gov (United States)

    Raghavan, Ram K; Goodin, Douglas G; Neises, Daniel; Anderson, Gary A; Ganta, Roman R

    2016-01-01

    This study aims to examine the spatio-temporal dynamics of Rocky Mountain spotted fever (RMSF) prevalence in four contiguous states of the Midwestern United States, and to determine the impact of environmental and socio-economic factors associated with this disease. Bayesian hierarchical models were used to quantify space and time only trends and spatio-temporal interaction effect in the case reports submitted to the state health departments in the region. Various socio-economic, environmental and climatic covariates screened a priori in a bivariate procedure were added to a main-effects Bayesian model in progressive steps to evaluate important drivers of RMSF space-time patterns in the region. Our results show a steady increase in RMSF incidence over the study period to newer geographic areas, and the posterior probabilities of county-specific trends indicate clustering of high-risk counties in the central and southern parts of the study region. At the spatial scale of a county, the prevalence levels of RMSF are influenced by poverty status, average relative humidity, and average land surface temperature (>35°C) in the region, and the relevance of these factors in the context of climate-change impacts on tick-borne diseases is discussed.

  7. Hierarchical Bayesian Data Analysis in Radiometric SAR System Calibration: A Case Study on Transponder Calibration with RADARSAT-2 Data

    Directory of Open Access Journals (Sweden)

    Björn J. Döring

    2013-12-01

    Full Text Available A synthetic aperture radar (SAR) system requires external absolute calibration so that radiometric measurements can be exploited in numerous scientific and commercial applications. Besides estimating a calibration factor, metrological standards also demand the derivation of a respective calibration uncertainty. This uncertainty is currently not systematically determined. Here, for the first time, it is proposed to use hierarchical modeling and Bayesian statistics as a consistent method for handling and analyzing the hierarchical data typically acquired during external calibration campaigns. Through the use of Markov chain Monte Carlo simulations, a joint posterior probability can be conveniently derived from measurement data despite the necessary grouping of data samples. The applicability of the method is demonstrated through a case study: The radar reflectivity of DLR’s new C-band Kalibri transponder is derived through a series of RADARSAT-2 acquisitions and a comparison with reference point targets (corner reflectors). The systematic derivation of calibration uncertainties is seen as an important step toward traceable radiometric calibration of synthetic aperture radars.

  8. A hierarchical Bayesian model for regionalized seasonal forecasts: Application to low flows in the northeastern United States

    Science.gov (United States)

    Ahn, Kuk-Hyun; Palmer, Richard; Steinschneider, Scott

    2017-01-01

    This study presents a regional, probabilistic framework for seasonal forecasts of extreme low summer flows in the northeastern United States conditioned on antecedent climate and hydrologic conditions. The model is developed to explore three innovations in hierarchical modeling for seasonal forecasting at ungaged sites: (1) predictive climate teleconnections are inferred directly from ocean fields instead of predefined climate indices, (2) a parsimonious modeling structure is introduced to allow climate teleconnections to vary spatially across streamflow gages, and (3) climate teleconnections and antecedent hydrologic conditions are considered jointly for regional forecast development. The proposed model is developed and calibrated in a hierarchical Bayesian framework to pool regional information across sites and enhance regionalization skill. The model is validated in a cross-validation framework along with five simpler nested formulations to test specific hypotheses embedded in the full model structure. Results indicate that each of the three innovations improve out-of-sample summer low-flow forecasts, with the greatest benefits derived from the spatially heterogeneous effect of climate teleconnections. We conclude with a discussion of possible model improvements from a better representation of antecedent hydrologic conditions at ungaged sites.

  9. A Bayesian Approach for Sensor Optimisation in Impact Identification.

    Science.gov (United States)

    Mallardo, Vincenzo; Sharif Khodaei, Zahra; Aliabadi, Ferri M H

    2016-11-22

    This paper presents a Bayesian approach for optimizing the position of sensors aimed at impact identification in composite structures under operational conditions. The uncertainty in the sensor data has been represented by statistical distributions of the recorded signals. An optimisation strategy based on the genetic algorithm is proposed to find the best sensor combination aimed at locating impacts on composite structures. A Bayesian-based objective function is adopted in the optimisation procedure as an indicator of the performance of meta-models developed for different sensor combinations to locate various impact events. To represent a real structure under operational load and to increase the reliability of the Structural Health Monitoring (SHM) system, the probability of malfunctioning sensors is included in the optimisation. The reliability and the robustness of the procedure are tested with experimental and numerical examples. Finally, the proposed optimisation algorithm is applied to a composite stiffened panel for both the uniform and non-uniform probability of impact occurrence.

  10. A Bayesian approach to estimating the prehepatic insulin secretion rate

    DEFF Research Database (Denmark)

    Andersen, Kim Emil; Højbjerre, Malene

    For the estimation of the prehepatic insulin secretion rate, we consider a stochastic differential equation model that combines both insulin and C-peptide concentrations in plasma. Previously this model has been analysed in an iterative deterministic set-up, where the time courses of insulin and C-peptide subsequently are used as known forcing functions. In this work we adopt a Bayesian graphical model to describe the unified model simultaneously. We develop a model that also accounts for both measurement error and process variability. The parameters are estimated by a Bayesian approach where efficient posterior sampling is made available through the use of Markov chain Monte Carlo methods. Hereby the ill-posed estimation problem inherited in the coupled differential equation model is regularized by the use of prior knowledge. The method is demonstrated on experimental ...

  11. Hierarchical structure of biological systems: a bioengineering approach.

    Science.gov (United States)

    Alcocer-Cuarón, Carlos; Rivera, Ana L; Castaño, Victor M

    2014-01-01

    A general theory of biological systems, based on a few fundamental propositions, allows a generalization of both the Wiener and Bertalanffy approaches to theoretical biology. Here, a biological system is defined as a set of self-organized, differentiated elements that interact pair-wise through various networks and media, isolated from other sets by boundaries. Their relation to other systems can be described as a closed loop in a steady state, which leads to a hierarchical structure and functioning of the biological system. Our thermodynamical approach of hierarchical character can be applied to biological systems of varying sizes through some general principles, based on the exchange of energy, information and/or mass from and within the systems.

  12. Remotely Sensed Monitoring of Small Reservoir Dynamics: A Bayesian Approach

    Directory of Open Access Journals (Sweden)

    Dirk Eilander

    2014-01-01

    Full Text Available Multipurpose small reservoirs are important for livelihoods in rural semi-arid regions. To manage and plan these reservoirs and to assess their hydrological impact at a river basin scale, it is important to monitor their water storage dynamics. This paper introduces a Bayesian approach for monitoring small reservoirs with radar satellite images. The newly developed growing Bayesian classifier has a high degree of automation, can readily be extended with auxiliary information and reduces the confusion error to the land-water boundary pixels. A case study has been performed in the Upper East Region of Ghana, based on Radarsat-2 data from November 2012 until April 2013. Results show that the growing Bayesian classifier can deal with the spatial and temporal variability in synthetic aperture radar (SAR backscatter intensities from small reservoirs. Due to its ability to incorporate auxiliary information, the algorithm is able to delineate open water from SAR imagery with a low land-water contrast in the case of wind-induced Bragg scattering or limited vegetation on the land surrounding a small reservoir.
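
    A minimal sketch of the Bayesian classification idea underlying such an approach (the class statistics, priors, and single-band Gaussian model are simplifying assumptions, not the paper's growing classifier): a backscatter value is assigned a posterior probability of water versus land from class-conditional densities and a prior that auxiliary information, such as a previous water mask, could supply.

```python
# Minimal sketch: Bayes rule for water/land from one SAR backscatter value (dB).
# Class means, standard deviations, and priors are made up.
from scipy.stats import norm

classes = {"water": (-17.0, 2.0, 0.3), "land": (-9.0, 3.0, 0.7)}  # (mean dB, sd dB, prior)

def posterior(sigma0_db):
    lik = {c: p * norm.pdf(sigma0_db, mu, sd) for c, (mu, sd, p) in classes.items()}
    z = sum(lik.values())
    return {c: v / z for c, v in lik.items()}

print(posterior(-15.0))   # a fairly dark pixel: mostly "water"
```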

  13. Assessment of CT image quality using a Bayesian approach

    Science.gov (United States)

    Reginatto, M.; Anton, M.; Elster, C.

    2017-08-01

    One of the most promising approaches for evaluating CT image quality is task-specific quality assessment. This involves a simplified version of a clinical task, e.g. deciding whether an image belongs to the class of images that contain the signature of a lesion or not. Task-specific quality assessment can be done by model observers, which are mathematical procedures that carry out the classification task. The most widely used figure of merit for CT image quality is the area under the ROC curve (AUC), a quantity which characterizes the performance of a given model observer. In order to estimate AUC from a finite sample of images, different approaches from classical statistics have been suggested. The goal of this paper is to introduce task-specific quality assessment of CT images to metrology and to propose a novel Bayesian estimation of AUC for the channelized Hotelling observer (CHO) applied to the task of detecting a lesion at a known image location. It is assumed that signal-present and signal-absent images follow multivariate normal distributions with the same covariance matrix. The Bayesian approach results in a posterior distribution for the AUC of the CHO which provides in addition a complete characterization of the uncertainty of this figure of merit. The approach is illustrated by its application to both simulated and experimental data.
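
    Under the stated equal-covariance Gaussian assumption, the AUC of a Hotelling-type observer follows from its detectability index, AUC = Phi(d'/sqrt(2)) with d'^2 = dmu' S^{-1} dmu. A minimal plug-in sketch on simulated channel outputs (this is not the paper's Bayesian estimator, which instead places a posterior distribution over these quantities):

```python
# Minimal sketch: plug-in AUC of a Hotelling-type observer from simulated
# channel outputs under the equal-covariance Gaussian model.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(4)
n, d = 200, 5                                       # images per class, number of channels
cov = np.eye(d) + 0.2                               # common channel covariance (made up)
mu0, mu1 = np.zeros(d), np.full(d, 0.4)             # signal-absent / signal-present means
x0 = rng.multivariate_normal(mu0, cov, size=n)
x1 = rng.multivariate_normal(mu1, cov, size=n)

dmu = x1.mean(axis=0) - x0.mean(axis=0)
pooled = 0.5 * (np.cov(x0, rowvar=False) + np.cov(x1, rowvar=False))
d_prime = np.sqrt(dmu @ np.linalg.solve(pooled, dmu))
print("plug-in AUC:", norm.cdf(d_prime / np.sqrt(2.0)))
```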

  14. A Bayesian experimental design approach to structural health monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Farrar, Charles [Los Alamos National Laboratory]; Flynn, Eric [UCSD]; Todd, Michael [UCSD]

    2010-01-01

    Optimal system design for SHM involves two primary challenges. The first is the derivation of a proper performance function for a given system design. The second is the development of an efficient optimization algorithm for choosing a design that maximizes, or nearly maximizes, the performance function. In this paper we will outline how an SHM practitioner can construct the proper performance function by casting the entire design problem into a framework of Bayesian experimental design. The approach demonstrates how the design problem necessarily ties together all steps of the SHM process.

  15. The subjectivity of scientists and the Bayesian approach

    CERN Document Server

    Press, James S

    2001-01-01

    Comparing and contrasting the reality of subjectivity in the work of history's great scientists and the modern Bayesian approach to statistical analysis. Scientists and researchers are taught to analyze their data from an objective point of view, allowing the data to speak for themselves rather than assigning them meaning based on expectations or opinions. But scientists have never behaved fully objectively. Throughout history, some of our greatest scientific minds have relied on intuition, hunches, and personal beliefs to make sense of empirical data-and these subjective influences have often a

  16. Linking bovine tuberculosis on cattle farms to white-tailed deer and environmental variables using Bayesian hierarchical analysis

    Science.gov (United States)

    Walter, William D.; Smith, Rick; Vanderklok, Mike; VerCauterren, Kurt C.

    2014-01-01

    Bovine tuberculosis is a bacterial disease caused by Mycobacterium bovis in livestock and wildlife with hosts that include Eurasian badgers (Meles meles), brushtail possum (Trichosurus vulpecula), and white-tailed deer (Odocoileus virginianus). Risk-assessment efforts in Michigan have been initiated on farms to minimize interactions of cattle with wildlife hosts, but research on M. bovis on cattle farms has not investigated the spatial context of disease epidemiology. To incorporate spatially explicit data, initial likelihood of infection probabilities for cattle farms tested for M. bovis, prevalence of M. bovis in white-tailed deer, deer density, and environmental variables for each farm were modeled in a Bayesian hierarchical framework. We used geo-referenced locations of 762 cattle farms that have been tested for M. bovis, white-tailed deer prevalence, and several environmental variables that may lead to long-term survival and viability of M. bovis on farms and surrounding habitats (i.e., soil type, habitat type). Bayesian hierarchical analyses identified deer prevalence and proportion of sandy soil within our sampling grid as the most supported model. Analysis of cattle farms tested for M. bovis identified that every 1% increase in sandy soil resulted in a 4% increase in the odds of infection. Our analysis revealed that the influence of prevalence of M. bovis in white-tailed deer was still a concern even after considerable efforts to prevent cattle interactions with white-tailed deer through on-farm mitigation and reduction in the deer population. Cattle farms test positive for M. bovis annually in our study area, suggesting that an environmental source either on farms or in the surrounding landscape may be contributing to new or re-infections with M. bovis. Our research provides an initial assessment of potential environmental factors that could be incorporated into additional modeling efforts as more knowledge of deer herd

  17. Improving the Calibration of the SN Ia Anchor Datasets with a Bayesian Hierarchical Model

    Science.gov (United States)

    Currie, Miles; Rubin, David

    2018-01-01

    Inter-survey calibration remains one of the largest systematic uncertainties in SN Ia cosmology today. Ideally, each survey would measure their system throughputs and observe well characterized spectrophotometric standard stars, but many important surveys have not done so. For these surveys, we calibrate using tertiary survey stars tied to SDSS and Pan-STARRS. We improve on previous efforts by taking the spatially variable response of each telescope/camera into account, and using improved color transformations in the surveys’ natural instrumental photometric system. We use a global hierarchical model of the data, automatically providing a covariance matrix of magnitude offsets and bandpass shifts which reduces the systematic uncertainty in inter-survey calibration, thereby providing better cosmological constraints.

  18. A Bayesian statistics approach to multiscale coarse graining

    Science.gov (United States)

    Liu, Pu; Shi, Qiang; Daumé, Hal; Voth, Gregory A.

    2008-12-01

    Coarse-grained (CG) modeling provides a promising way to investigate many important physical and biological phenomena over large spatial and temporal scales. The multiscale coarse-graining (MS-CG) method has been proven to be a thermodynamically consistent way to systematically derive a CG model from atomistic force information, as shown in a variety of systems, ranging from simple liquids to proteins embedded in lipid bilayers. In the present work, Bayes' theorem, an advanced statistical tool widely used in signal processing and pattern recognition, is adopted to further improve the MS-CG force field obtained from the CG modeling. This approach can regularize the linear equation resulting from the underlying force-matching methodology, therefore substantially improving the quality of the MS-CG force field, especially for the regions with limited sampling. Moreover, this Bayesian approach can naturally provide an error estimation for each force field parameter, from which one can know the extent to which the results can be trusted. The robustness and accuracy of the Bayesian MS-CG algorithm are demonstrated for three different systems, including simple liquid methanol, polyalanine peptide solvated in explicit water, and a much more complicated peptide assembly with 32 NNQQNY hexapeptides.
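
    The regularization and per-parameter uncertainty described above can be illustrated with a minimal Bayesian linear least-squares sketch; the design matrix, noise level, and Gaussian prior width below are illustrative assumptions standing in for the actual force-matching equations.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy "force-matching" system: observed forces f are a linear function of
# unknown CG force-field parameters phi, f = G @ phi + noise, with some
# poorly-sampled (nearly collinear) columns in G.
n_obs, n_par = 200, 8
G = rng.normal(size=(n_obs, n_par))
G[:, -1] = G[:, -2] + 0.01 * rng.normal(size=n_obs)   # limited sampling
phi_true = rng.normal(size=n_par)
sigma = 0.5                                           # assumed noise std
f = G @ phi_true + sigma * rng.normal(size=n_obs)

# Gaussian prior phi ~ N(0, tau^2 I) regularizes the normal equations.
tau = 1.0
A = G.T @ G / sigma**2 + np.eye(n_par) / tau**2       # posterior precision
post_cov = np.linalg.inv(A)
post_mean = post_cov @ (G.T @ f) / sigma**2

# Each parameter gets a point estimate and an error bar from the posterior.
for k, (m, s) in enumerate(zip(post_mean, np.sqrt(np.diag(post_cov)))):
    print(f"phi[{k}] = {m:+.3f} +/- {s:.3f}")
```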

  19. Manual hierarchical clustering of regional geochemical data using a Bayesian finite mixture model

    Science.gov (United States)

    Ellefsen, Karl J.; Smith, David

    2016-01-01

    Interpretation of regional scale, multivariate geochemical data is aided by a statistical technique called “clustering.” We investigate a particular clustering procedure by applying it to geochemical data collected in the State of Colorado, United States of America. The clustering procedure partitions the field samples for the entire survey area into two clusters. The field samples in each cluster are partitioned again to create two subclusters, and so on. This manual procedure generates a hierarchy of clusters, and the different levels of the hierarchy show geochemical and geological processes occurring at different spatial scales. Although there are many different clustering methods, we use Bayesian finite mixture modeling with two probability distributions, which yields two clusters. The model parameters are estimated with Hamiltonian Monte Carlo sampling of the posterior probability density function, which usually has multiple modes. Each mode has its own set of model parameters; each set is checked to ensure that it is consistent both with the data and with independent geologic knowledge. The set of model parameters that is most consistent with the independent geologic knowledge is selected for detailed interpretation and partitioning of the field samples.

  20. AzTEC Survey of the Central Molecular Zone: Modeling Dust SEDs and N-PDF with Hierarchical Bayesian Analysis

    Science.gov (United States)

    Tang, Yuping; Wang, Daniel; Wilson, Grant; Gutermuth, Robert; Heyer, Mark

    2018-01-01

    We present the AzTEC/LMT survey of dust continuum at 1.1 mm on the central ~200 pc (CMZ) of our Galaxy. A joint SED analysis of all existing dust continuum surveys on the CMZ is performed, from 160 µm to 1.1 mm. Our analysis follows an MCMC sampling strategy incorporating the knowledge of PSFs in different maps, which provides unprecedented spatial resolution for the distributions of dust temperature, column density and emissivity index. The dense clumps in the CMZ typically show low dust temperature (~20 K), with no significant sign of buried star formation, and a weak trend of higher emissivity index toward dense peaks. A new model is proposed, allowing for varying dust temperature inside a cloud and self-shielding of dust emission, which leads to similar conclusions on dust temperature and grain properties. We further apply a hierarchical Bayesian analysis to infer the column density probability distribution function (N-PDF), while simultaneously removing the Galactic foreground and background emission. The N-PDF shows a steep power-law profile with α > 3, indicating that the formation of dense structures is suppressed.

  1. Electroencephalography-based real-time cortical monitoring system that uses hierarchical Bayesian estimations for the brain-machine interface.

    Science.gov (United States)

    Choi, Kyuwan

    2014-06-01

    In this study, a real-time cortical activity monitoring system was constructed, which could estimate cortical activities every 125 milliseconds over 2,240 vertices from 64-channel electroencephalography signals through hierarchical Bayesian estimation that uses functional magnetic resonance imaging data as its prior information. Recently, functional magnetic resonance imaging has mostly been used in the neurofeedback field because it allows for high spatial resolution. However, in functional magnetic resonance imaging, the time for the neurofeedback information to reach the patient is delayed by several seconds because of its poor temporal resolution. Therefore, a number of problems need to be solved to effectively implement feedback training paradigms in patients. To address this issue, this study used a new cortical activity monitoring system that improved both spatial and temporal resolution by using functional magnetic resonance imaging data and electroencephalography signals in conjunction with one another. This system is advantageous as it can improve applications in the fields of real-time diagnosis, neurofeedback, and the brain-machine interface.

  2. Comparison of hierarchical Bayesian models for overdispersed count data using DIC and Bayes' factors.

    Science.gov (United States)

    Millar, Russell B

    2009-09-01

    When replicate count data are overdispersed, it is common practice to incorporate this extra-Poisson variability by including latent parameters at the observation level. For example, the negative binomial and Poisson-lognormal (PLN) models are obtained by using gamma and lognormal latent parameters, respectively. Several recent publications have employed the deviance information criterion (DIC) to choose between these two models, with the deviance defined using the Poisson likelihood that is obtained from conditioning on these latent parameters. The results herein show that this use of DIC is inappropriate. Instead, DIC was seen to perform well if calculated using likelihood that was marginalized at the group level by integrating out the observation-level latent parameters. This group-level marginalization is explicit in the case of the negative binomial, but requires numerical integration for the PLN model. Similarly, DIC performed well to judge whether zero inflation was required when calculated using the group-marginalized form of the zero-inflated likelihood. In the context of comparing multilevel hierarchical models, the top-level DIC was obtained using likelihood that was further marginalized by additional integration over the group-level latent parameters, and the marginal densities of the models were calculated for the purpose of providing Bayes' factors. The computational viability and interpretability of these different measures is considered.
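
    As a rough illustration of the group-level marginalization discussed above, the sketch below integrates out the observation-level lognormal latent parameters of a Poisson-lognormal model numerically and computes a DIC from posterior draws; the data, the quadrature choice, and the synthetic "posterior draws" standing in for MCMC output are illustrative assumptions, not the paper's implementation.

```python
import numpy as np
from scipy.stats import poisson

# Gauss-Hermite nodes/weights for integrating over the standard normal latent.
nodes, weights = np.polynomial.hermite.hermgauss(40)

def pln_marginal_loglik(y, mu, sigma):
    """Log-likelihood of counts y under Poisson-lognormal, with the
    observation-level N(0,1) latent integrated out by quadrature."""
    # E_e[ Poisson(y | exp(mu + sigma*e)) ],  e ~ N(0,1)
    lam = np.exp(mu + sigma * np.sqrt(2.0) * nodes)          # (40,)
    pmf = poisson.pmf(np.asarray(y)[:, None], lam[None, :])  # (n, 40)
    return np.log(pmf @ weights / np.sqrt(np.pi)).sum()

def dic(y, mu_draws, sigma_draws):
    """DIC based on the marginalized (group-level) likelihood."""
    devs = np.array([-2.0 * pln_marginal_loglik(y, m, s)
                     for m, s in zip(mu_draws, sigma_draws)])
    d_bar = devs.mean()
    d_at_mean = -2.0 * pln_marginal_loglik(y, mu_draws.mean(), sigma_draws.mean())
    p_d = d_bar - d_at_mean          # effective number of parameters
    return d_bar + p_d

# Illustration with synthetic data and synthetic "posterior draws"
# (in practice these draws would come from an MCMC sampler).
rng = np.random.default_rng(2)
y = rng.poisson(np.exp(1.0 + 0.6 * rng.normal(size=100)))
mu_draws = 1.0 + 0.05 * rng.normal(size=500)
sigma_draws = np.abs(0.6 + 0.05 * rng.normal(size=500))
print("marginalized DIC:", dic(y, mu_draws, sigma_draws))
```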

  3. Inventory control of spare parts using a Bayesian approach: a case study

    OpenAIRE

    Aronis, K-P.; Magou, I.; Dekker, R.; Tagaras, G.

    1999-01-01

    This paper presents a case study of applying a Bayesian approach to forecast demand and subsequently determine the appropriate parameter S of an (S-1,S) inventory system for controlling spare parts of electronic equipment. First, the problem and the current policy are described. Then, the basic elements of the Bayesian approach are introduced and the procedure for calculating the appropriate parameter S is illustrated. Finally, we present the results of applying the Bayesian appro...

  4. A Bayesian Approach for Sensor Optimisation in Impact Identification

    Directory of Open Access Journals (Sweden)

    Vincenzo Mallardo

    2016-11-01

    Full Text Available This paper presents a Bayesian approach for optimizing the position of sensors aimed at impact identification in composite structures under operational conditions. The uncertainty in the sensor data has been represented by statistical distributions of the recorded signals. An optimisation strategy based on the genetic algorithm is proposed to find the best sensor combination aimed at locating impacts on composite structures. A Bayesian-based objective function is adopted in the optimisation procedure as an indicator of the performance of meta-models developed for different sensor combinations to locate various impact events. To represent a real structure under operational load and to increase the reliability of the Structural Health Monitoring (SHM) system, the probability of malfunctioning sensors is included in the optimisation. The reliability and the robustness of the procedure are tested with experimental and numerical examples. Finally, the proposed optimisation algorithm is applied to a composite stiffened panel for both the uniform and non-uniform probability of impact occurrence.

  5. A hierarchical approach to reducing communication in parallel graph algorithms

    KAUST Repository

    Harshvardhan,

    2015-01-01

    Large-scale graph computing has become critical due to the ever-increasing size of data. However, distributed graph computations are limited in their scalability and performance due to the heavy communication inherent in such computations. This is exacerbated in scale-free networks, such as social and web graphs, which contain hub vertices that have large degrees and therefore send a large number of messages over the network. Furthermore, many graph algorithms and computations send the same data to each of the neighbors of a vertex. Our proposed approach recognizes this, and reduces communication performed by the algorithm without change to user-code, through a hierarchical machine model imposed upon the input graph. The hierarchical model takes advantage of locale information of the neighboring vertices to reduce communication, both in message volume and total number of bytes sent. It is also able to better exploit the machine hierarchy to further reduce the communication costs, by aggregating traffic between different levels of the machine hierarchy. Results of an implementation in the STAPL GL show improved scalability and performance over the traditional level-synchronous approach, with 2.5x to 8x improvement for a variety of graph algorithms at 12,000+ cores.

  6. A Bayesian hierarchical model for prediction of latent health states from multiple data sources with application to active surveillance of prostate cancer.

    Science.gov (United States)

    Coley, Rebecca Yates; Fisher, Aaron J; Mamawala, Mufaddal; Carter, Herbert Ballentine; Pienta, Kenneth J; Zeger, Scott L

    2017-06-01

    In this article, we present a Bayesian hierarchical model for predicting a latent health state from longitudinal clinical measurements. Model development is motivated by the need to integrate multiple sources of data to improve clinical decisions about whether to remove or irradiate a patient's prostate cancer. Existing modeling approaches are extended to accommodate measurement error in cancer state determinations based on biopsied tissue, clinical measurements possibly not missing at random, and informative partial observation of the true state. The proposed model enables estimation of whether an individual's underlying prostate cancer is aggressive, requiring surgery and/or radiation, or indolent, permitting continued surveillance. These individualized predictions can then be communicated to clinicians and patients to inform decision-making. We demonstrate the model with data from a cohort of low-risk prostate cancer patients at Johns Hopkins University and assess predictive accuracy among a subset for whom true cancer state is observed. Simulation studies confirm model performance and explore the impact of adjusting for informative missingness on true state predictions. R code is provided in an online supplement and at http://github.com/rycoley/prediction-prostate-surveillance. © 2016, The International Biometric Society.

  7. A Bayesian Approach to Real-Time Earthquake Phase Association

    Science.gov (United States)

    Benz, H.; Johnson, C. E.; Earle, P. S.; Patton, J. M.

    2014-12-01

    Real-time location of seismic events requires a robust and extremely efficient means of associating and identifying seismic phases with hypothetical sources. An association algorithm converts a series of phase arrival times into a catalog of earthquake hypocenters. The classical approach based on time-space stacking of the locus of possible hypocenters for each phase arrival using the principle of acoustic reciprocity has been in use now for many years. One of the most significant problems that has emerged over time with this approach is related to the extreme variations in seismic station density throughout the global seismic network. To address this problem we have developed a novel, Bayesian association algorithm, which looks at the association problem as a dynamically evolving complex system of "many to many relationships". While the end result must be an array of one to many relations (one earthquake, many phases), during the association process the situation is quite different. Both the evolving possible hypocenters and the relationships between phases and all nascent hypocenters are many to many (many earthquakes, many phases). The computational framework we are using to address this is a responsive, NoSQL graph database where the earthquake-phase associations are represented as intersecting Bayesian Learning Networks. The approach directly addresses the network inhomogeneity issue while at the same time allowing the inclusion of other kinds of data (e.g., seismic beams, station noise characteristics, priors on estimated location of the seismic source) by representing the locus of intersecting hypothetical loci for a given datum as joint probability density functions.

  8. An evaluation of the Bayesian approach to fitting the N-mixture model for use with pseudo-replicated count data

    Science.gov (United States)

    Toribo, S.G.; Gray, B.R.; Liang, S.

    2011-01-01

    The N-mixture model proposed by Royle in 2004 may be used to approximate the abundance and detection probability of animal species in a given region. In 2006, Royle and Dorazio discussed the advantages of using a Bayesian approach in modelling animal abundance and occurrence using a hierarchical N-mixture model. N-mixture models assume replication on sampling sites, an assumption that may be violated when the site is not closed to changes in abundance during the survey period or when nominal replicates are defined spatially. In this paper, we studied the robustness of a Bayesian approach to fitting the N-mixture model for pseudo-replicated count data. Our simulation results showed that the Bayesian estimates for abundance and detection probability are slightly biased when the actual detection probability is small and are sensitive to the presence of extra variability within local sites.
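
    A minimal sketch of a Bayesian fit of the basic N-mixture model (counts y_ij ~ Binomial(N_i, p), N_i ~ Poisson(lambda)), using a brute-force grid posterior over (lambda, p) rather than the MCMC machinery of the study; the data sizes, flat priors, and truncation of the latent abundance sum are illustrative assumptions.

```python
import numpy as np
from scipy.stats import binom, poisson

rng = np.random.default_rng(3)

# Simulate pseudo-replicated counts: R sites, T replicate visits per site.
R, T, lam_true, p_true = 40, 4, 5.0, 0.4
N = rng.poisson(lam_true, size=R)                    # latent abundances
y = rng.binomial(N[:, None], p_true, size=(R, T))    # observed counts

N_max = 40   # truncation of the infinite sum over the latent abundance

def site_loglik(y_i, lam, p):
    """log P(y_i | lam, p) with the latent N_i summed out."""
    n = np.arange(y_i.max(), N_max + 1)
    log_prior_n = poisson.logpmf(n, lam)
    log_obs = binom.logpmf(y_i[None, :], n[:, None], p).sum(axis=1)
    m = (log_prior_n + log_obs).max()
    return m + np.log(np.exp(log_prior_n + log_obs - m).sum())

# Brute-force grid posterior with flat priors on lambda and p.
lam_grid = np.linspace(1.0, 12.0, 40)
p_grid = np.linspace(0.05, 0.95, 40)
logpost = np.array([[sum(site_loglik(y[i], lam, p) for i in range(R))
                     for p in p_grid] for lam in lam_grid])
post = np.exp(logpost - logpost.max())
post /= post.sum()

print("posterior mean lambda:", (post.sum(axis=1) * lam_grid).sum())
print("posterior mean p     :", (post.sum(axis=0) * p_grid).sum())
```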

  9. Spatial variability of the effect of air pollution on term birth weight: evaluating influential factors using Bayesian hierarchical models.

    Science.gov (United States)

    Li, Lianfa; Laurent, Olivier; Wu, Jun

    2016-02-05

    Epidemiological studies suggest that air pollution is adversely associated with pregnancy outcomes. Such associations may be modified by spatially-varying factors including socio-demographic characteristics, land-use patterns and unaccounted exposures. Yet, few studies have systematically investigated the impact of these factors on spatial variability of the air pollution's effects. This study aimed to examine spatial variability of the effects of air pollution on term birth weight across Census tracts and the influence of tract-level factors on such variability. We obtained over 900,000 birth records from 2001 to 2008 in Los Angeles County, California, USA. Air pollution exposure was modeled at individual level for nitrogen dioxide (NO2) and nitrogen oxides (NOx) using spatiotemporal models. Two-stage Bayesian hierarchical non-linear models were developed to (1) quantify the associations between air pollution exposure and term birth weight within each tract; and (2) examine the socio-demographic, land-use, and exposure-related factors contributing to the between-tract variability of the associations between air pollution and term birth weight. Higher air pollution exposure was associated with lower term birth weight (average posterior effects: -14.7 (95 % CI: -19.8, -9.7) g per 10 ppb increment in NO2 and -6.9 (95 % CI: -12.9, -0.9) g per 10 ppb increment in NOx). The variation of the association across Census tracts was significantly influenced by the tract-level socio-demographic, exposure-related and land-use factors. Our models captured the complex non-linear relationship between these factors and the associations between air pollution and term birth weight: we observed the thresholds from which the influence of the tract-level factors was markedly exacerbated or attenuated. Exacerbating factors might reflect additional exposure to environmental insults or lower socio-economic status with higher vulnerability, whereas attenuating factors might indicate reduced

  10. A Robust Obstacle Avoidance for Service Robot Using Bayesian Approach

    Directory of Open Access Journals (Sweden)

    Widodo Budiharto

    2011-03-01

    Full Text Available The objective of this paper is to propose a robust obstacle avoidance method for a service robot in an indoor environment. The method for obstacle avoidance uses information about static obstacles on the landmark using edge detection. The speed and direction of a walking person (the moving obstacle) are obtained by a single camera using a tracking and recognition system, with distance measured by three ultrasonic sensors. A new geometrical model and maneuvering method for moving obstacle avoidance are introduced and combined with a Bayesian approach for state estimation. The obstacle avoidance problem is formulated using decision theory, prior and posterior distributions and a loss function to determine an optimal response based on inaccurate sensor data. Algorithms for the moving obstacle avoidance method are proposed, and experimental results from implementation on a service robot are also presented. Various experiments show that the proposed method is fast and robust, and it has been successfully implemented on the service robot Srikandi II, equipped with a 4-DOF arm and developed in our laboratory.

  11. A full bayesian approach for boolean genetic network inference.

    Directory of Open Access Journals (Sweden)

    Shengtong Han

    Full Text Available Boolean networks are a simple but efficient model for describing gene regulatory systems. A number of algorithms have been proposed to infer Boolean networks. However, these methods do not take full consideration of the effects of noise and model uncertainty. In this paper, we propose a full Bayesian approach to infer Boolean genetic networks. Markov chain Monte Carlo algorithms are used to obtain the posterior samples of both the network structure and the related parameters. In addition to regular link addition and removal moves, which can guarantee the irreducibility of the Markov chain for traversing the whole network space, carefully constructed mixture proposals are used to improve the Markov chain Monte Carlo convergence. Both simulations and a real application on cell-cycle data show that our method is more powerful than existing methods for the inference of both the topology and logic relations of the Boolean network from observed data.

  12. A Hierarchical Modeling Approach to Data Analysis and Study Design in a Multi-Site Experimental fMRI Study

    Science.gov (United States)

    Zhou, Bo; Konstorum, Anna; Duong, Thao; Tieu, Kinh H.; Wells, William M.; Brown, Gregory G.; Stern, Hal S.; Shahbaba, Babak

    2013-01-01

    We propose a hierarchical Bayesian model for analyzing multi-site experimental fMRI studies. Our method takes the hierarchical structure of the data (subjects are nested within sites, and there are multiple observations per subject) into account and allows for modeling between-site variation. Using posterior predictive model checking and model…

  13. Hierarchical Bayesian random intercept model-based cross-level interaction decomposition for truck driver injury severity investigations.

    Science.gov (United States)

    Chen, Cong; Zhang, Guohui; Tian, Zong; Bogus, Susan M; Yang, Yin

    2015-12-01

    Traffic crashes occurring on rural roadways induce more severe injuries and fatalities than those in urban areas, especially when there are trucks involved. Truck drivers are found to suffer higher potential of crash injuries compared with other occupational labors. Besides, unobserved heterogeneity in crash data analysis is a critical issue that needs to be carefully addressed. In this study, a hierarchical Bayesian random intercept model decomposing cross-level interaction effects as unobserved heterogeneity is developed to examine the posterior probabilities of truck driver injuries in rural truck-involved crashes. The interaction effects contributing to truck driver injury outcomes are investigated based on two-year rural truck-involved crashes in New Mexico from 2010 to 2011. The analysis results indicate that the cross-level interaction effects play an important role in predicting truck driver injury severities, and the proposed model produces comparable performance with the traditional random intercept model and the mixed logit model even after penalization by high model complexity. It is revealed that factors including road grade, number of vehicles involved in a crash, maximum vehicle damage in a crash, vehicle actions, driver age, seatbelt use, and driver under alcohol or drug influence, as well as a portion of their cross-level interaction effects with other variables are significantly associated with truck driver incapacitating injuries and fatalities. These findings are helpful to understand the respective or joint impacts of these attributes on truck driver injury patterns in rural truck-involved crashes. Copyright © 2015 Elsevier Ltd. All rights reserved.

  14. Effects of management intervention on post-disturbance community composition: an experimental analysis using Bayesian hierarchical models.

    Directory of Open Access Journals (Sweden)

    Jack Giovanini

    Full Text Available As human demand for ecosystem products increases, management intervention may become more frequent after environmental disturbances. Evaluations of ecological responses to cumulative effects of management interventions and natural disturbances provide critical decision-support tools for managers who strive to balance environmental conservation and economic development. We conducted an experiment to evaluate the effects of salvage logging on avian community composition in lodgepole pine (Pinus contorta) forests affected by beetle outbreaks in Oregon, USA, 1996-1998. Treatments consisted of the removal of lodgepole pine snags only, and live trees were not harvested. We used a Bayesian hierarchical model to quantify occupancy dynamics for 27 breeding species, while accounting for variation in the detection process. We examined how magnitude and precision of treatment effects varied when incorporating prior information from a separate intervention study that occurred in a similar ecological system. Regardless of which prior we evaluated, we found no evidence that the harvest treatment had a negative impact on species richness, with an estimated average of 0.2-2.2 more species in harvested stands than unharvested stands. Estimated average similarity between control and treatment stands ranged from 0.82-0.87 (1 indicating complete similarity between a pair of stands) and suggested that treatment stands did not contain novel assemblies of species responding to the harvesting prescription. Estimated treatment effects were positive for twenty-four (90%) of the species, although the credible intervals contained 0 in all cases. These results suggest that, unlike most post-fire salvage logging prescriptions, selective harvesting after beetle outbreaks may meet multiple management objectives, including the maintenance of avian community richness comparable to what is found in unharvested stands. Our results provide managers with prescription alternatives to

  15. The Type Ia Supernova Color-Magnitude Relation and Host Galaxy Dust: A Simple Hierarchical Bayesian Model

    Science.gov (United States)

    Mandel, Kaisey S.; Scolnic, Daniel M.; Shariff, Hikmatali; Foley, Ryan J.; Kirshner, Robert P.

    2017-06-01

    Conventional Type Ia supernova (SN Ia) cosmology analyses currently use a simplistic linear regression of magnitude versus color and light curve shape, which does not model intrinsic SN Ia variations and host galaxy dust as physically distinct effects, resulting in low color-magnitude slopes. We construct a probabilistic generative model for the dusty distribution of extinguished absolute magnitudes and apparent colors as the convolution of an intrinsic SN Ia color-magnitude distribution and a host galaxy dust reddening-extinction distribution. If the intrinsic color-magnitude (M_B versus B - V) slope β_int differs from the host galaxy dust law R_B, this convolution results in a specific curve of mean extinguished absolute magnitude versus apparent color. The derivative of this curve smoothly transitions from β_int in the blue tail to R_B in the red tail of the apparent color distribution. The conventional linear fit approximates this effective curve near the average apparent color, resulting in an apparent slope β_app between β_int and R_B. We incorporate these effects into a hierarchical Bayesian statistical model for SN Ia light curve measurements, and analyze a data set of SALT2 optical light curve fits of 248 nearby SNe Ia at z < 0.10. The conventional linear fit gives β_app ≈ 3. Our model finds β_int = 2.3 ± 0.3 and a distinct dust law of R_B = 3.8 ± 0.3, consistent with the average for Milky Way dust, while correcting a systematic distance bias of ~0.10 mag in the tails of the apparent color distribution. Finally, we extend our model to examine the SN Ia luminosity-host mass dependence in terms of intrinsic and dust components.
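
    A minimal simulation of the effect described above, assuming Gaussian intrinsic color scatter and an exponential dust reddening distribution; all numerical values below are illustrative assumptions, chosen only to show how the fitted linear slope β_app lands between β_int and R_B.

```python
import numpy as np

rng = np.random.default_rng(4)

beta_int, R_B = 2.2, 4.1          # intrinsic slope and dust law (illustrative)
n_sn = 5000

# Intrinsic SN Ia population: small color scatter around zero.
c_int = rng.normal(0.0, 0.06, n_sn)
M_int = beta_int * c_int + rng.normal(0.0, 0.10, n_sn)   # intrinsic magnitude

# Host-galaxy dust: non-negative reddening E(B-V) with an exponential tail.
E = rng.exponential(0.07, n_sn)

# Observed (extinguished) absolute magnitude and apparent color are the
# convolution of the intrinsic distribution with the dust distribution.
c_app = c_int + E
M_app = M_int + R_B * E

# A conventional single linear fit of magnitude vs. color recovers a slope
# between beta_int and R_B, as in the abstract's argument.
beta_app = np.polyfit(c_app, M_app, 1)[0]
print(f"beta_int = {beta_int}, R_B = {R_B}, fitted beta_app = {beta_app:.2f}")

# The local slope dM/dc transitions from beta_int (blue tail) to R_B (red tail).
for lo, hi in [(-0.15, 0.0), (0.2, 0.6)]:
    sel = (c_app > lo) & (c_app < hi)
    print(f"slope for {lo} < c < {hi}: {np.polyfit(c_app[sel], M_app[sel], 1)[0]:.2f}")
```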

  16. Using hierarchical Bayesian binary probit models to analyze crash injury severity on high speed facilities with real-time traffic data.

    Science.gov (United States)

    Yu, Rongjie; Abdel-Aty, Mohamed

    2014-01-01

    Severe crashes are causing serious social and economic loss, and because of this, reducing crash injury severity has become one of the key objectives of the high speed facilities' (freeway and expressway) management. Traditional crash injury severity analysis utilized data mainly from crash reports concerning the crash occurrence information, drivers' characteristics and roadway geometric related variables. In this study, real-time traffic and weather data were introduced to analyze the crash injury severity. The space mean speeds captured by the Automatic Vehicle Identification (AVI) system on the two roadways were used as explanatory variables in this study; and data from a mountainous freeway (I-70 in Colorado) and an urban expressway (State Road 408 in Orlando) have been used to identify the analysis result's consistence. Binary probit (BP) models were estimated to classify the non-severe (property damage only) crashes and severe (injury and fatality) crashes. Firstly, Bayesian BP models' results were compared to the results from Maximum Likelihood Estimation BP models and it was concluded that Bayesian inference was superior with more significant variables. Then different levels of hierarchical Bayesian BP models were developed with random effects accounting for the unobserved heterogeneity at segment level and crash individual level, respectively. Modeling results from both studied locations demonstrate that large variations of speed prior to the crash occurrence would increase the likelihood of severe crash occurrence. Moreover, with considering unobserved heterogeneity in the Bayesian BP models, the model goodness-of-fit has improved substantially. Finally, possible future applications of the model results and the hierarchical Bayesian probit models were discussed. Copyright © 2013 Elsevier Ltd. All rights reserved.
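
    To illustrate the Bayesian binary probit core of such models (without the hierarchical random effects or the real crash data), the following is a minimal Gibbs sampler using the standard Albert-Chib latent-variable augmentation; the covariates, prior variance, and sample sizes are illustrative assumptions.

```python
import numpy as np
from scipy.stats import truncnorm

rng = np.random.default_rng(5)

# Synthetic crash-severity-style data: binary outcome, a few covariates.
n, k = 500, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, k - 1))])
beta_true = np.array([-0.5, 0.8, -0.6])
y = (X @ beta_true + rng.normal(size=n) > 0).astype(int)

# Prior beta ~ N(0, tau^2 I); posterior explored by Gibbs sampling on (z, beta).
tau = 10.0
V = np.linalg.inv(X.T @ X + np.eye(k) / tau**2)   # fixed conditional covariance
beta = np.zeros(k)
draws = []
for it in range(3000):
    # 1) Latent z_i | beta, y_i: truncated normal, positive if y_i = 1.
    mu = X @ beta
    lo = np.where(y == 1, -mu, -np.inf)            # standardized lower bound
    hi = np.where(y == 1, np.inf, -mu)             # standardized upper bound
    z = truncnorm.rvs(lo, hi, loc=mu, scale=1.0, random_state=rng)
    # 2) beta | z: multivariate normal from the conjugate linear-model update.
    beta = rng.multivariate_normal(V @ (X.T @ z), V)
    if it >= 1000:
        draws.append(beta)

draws = np.array(draws)
print("posterior means:", draws.mean(axis=0))
print("true beta      :", beta_true)
```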

  17. A hierarchical Bayesian spatio-temporal model for extreme precipitation events

    KAUST Repository

    Ghosh, Souparno

    2011-03-01

    We propose a new approach to model a sequence of spatially distributed time series of extreme values. Unlike common practice, we incorporate spatial dependence directly in the likelihood and allow the temporal component to be captured at the second level of the hierarchy. Inferences about the parameters and spatio-temporal predictions are obtained via MCMC techniques. The model is fitted to a gridded precipitation data set collected over 99 years across the continental U.S. © 2010 John Wiley & Sons, Ltd.

  18. A hierarchical approach for speech-instrumental-song classification.

    Science.gov (United States)

    Ghosal, Arijit; Chakraborty, Rudrasis; Dhara, Bibhas Chandra; Saha, Sanjoy Kumar

    2013-01-01

    Audio classification acts as the fundamental step for many applications such as content-based audio retrieval and audio indexing. In this work, we have presented a novel scheme for classifying an audio signal into three categories, namely speech, music without voice (instrumental) and music with voice (song). A hierarchical approach has been adopted to classify the signals. At the first stage, signals are categorized as speech or music using audio texture derived from simple features like ZCR and STE. The proposed audio texture captures contextual information and summarizes the frame-level features. At the second stage, music is further classified as instrumental/song based on Mel frequency cepstral coefficients (MFCC). A classifier based on Random Sample and Consensus (RANSAC), capable of handling a wide variety of data, has been utilized. Experimental results indicate the effectiveness of the proposed scheme.

  19. Global Crop Monitoring: A Satellite-Based Hierarchical Approach

    Directory of Open Access Journals (Sweden)

    Bingfang Wu

    2015-04-01

    Full Text Available Taking advantage of multiple new remote sensing data sources, especially from Chinese satellites, the CropWatch system has expanded the scope of its international analyses through the development of new indicators and an upgraded operational methodology. The approach adopts a hierarchical system covering four spatial levels of detail: global, regional, national (thirty-one key countries including China) and “sub-countries” (for the nine largest countries). The thirty-one countries encompass more than 80% of both production and exports of maize, rice, soybean and wheat. The methodology draws on climatic and remote sensing indicators at different scales. The global patterns of crop environmental growing conditions are first analyzed with indicators for rainfall, temperature, photosynthetically active radiation (PAR) as well as potential biomass. At the regional scale, the indicators pay more attention to crops and include the Vegetation Health Index (VHI), Vegetation Condition Index (VCI), Cropped Arable Land Fraction (CALF) as well as Cropping Intensity (CI). Together, they characterize crop situation, farming intensity and stress. CropWatch carries out detailed crop condition analyses at the national scale with a comprehensive array of variables and indicators. The Normalized Difference Vegetation Index (NDVI), cropped areas and crop conditions are integrated to derive food production estimates. For the nine largest countries, CropWatch zooms into the sub-national units to acquire detailed information on crop condition and production by including new indicators (e.g., crop type proportion). Based on trend analysis, CropWatch also issues crop production supply outlooks, covering both long-term variations and short-term dynamic changes in key food exporters and importers. The hierarchical approach adopted by CropWatch is the basis of the climatic and crop condition assessments published in the quarterly “CropWatch bulletin” which

  20. An Intelligent Hierarchical Approach to Actuator Fault Diagnosis and Accommodation Project

    Data.gov (United States)

    National Aeronautics and Space Administration — This proposal presents a novel intelligent hierarchical approach to detection, isolation, and accommodation of primary aerodynamic actuator failures. The proposed...

  1. bayesQR: A Bayesian Approach to Quantile Regression

    Directory of Open Access Journals (Sweden)

    Dries F. Benoit

    2017-01-01

    Full Text Available After its introduction by Koenker and Bassett (1978), quantile regression has become an important and popular tool to investigate the conditional response distribution in regression. The R package bayesQR contains a number of routines to estimate quantile regression parameters using a Bayesian approach based on the asymmetric Laplace distribution. The package contains functions for the typical quantile regression with continuous dependent variable, but also supports quantile regression for binary dependent variables. For both types of dependent variables, variable selection using the adaptive lasso is provided. For the binary quantile regression model, the package also contains a routine that calculates the fitted probabilities for each vector of predictors. In addition, functions for summarizing the results, creating traceplots, posterior histograms and drawing quantile plots are included. This paper starts with a brief overview of the theoretical background of the models used in the bayesQR package. The main part of this paper discusses the computational problems that arise in the implementation of the procedure and illustrates the usefulness of the package through selected examples.

  2. Bayesian inference with ecological applications

    CERN Document Server

    Link, William A

    2009-01-01

    This text is written to provide a mathematically sound but accessible and engaging introduction to Bayesian inference specifically for environmental scientists, ecologists and wildlife biologists. It emphasizes the power and usefulness of Bayesian methods in an ecological context. The advent of fast personal computers and easily available software has simplified the use of Bayesian and hierarchical models. One obstacle remains for ecologists and wildlife biologists, namely the near absence of Bayesian texts written specifically for them. The book includes many relevant examples, is supported by software and examples on a companion website and will become an essential grounding in this approach for students and research ecologists. Engagingly written text specifically designed to demystify a complex subject; examples drawn from ecology and wildlife research; an essential grounding for graduate and research ecologists in the increasingly prevalent Bayesian approach to inference; companion website with analyt...

  3. Hierarchical Bayesian mixture modelling for antigen-specific T-cell subtyping in combinatorially encoded flow cytometry studies

    DEFF Research Database (Denmark)

    Lin, Lin; Chan, Cliburn; Hadrup, Sine R

    2013-01-01

    Novel uses of automated flow cytometry technology for measuring levels of protein markers on thousands to millions of cells are promoting increasing need for relevant, customized Bayesian mixture modelling approaches in many areas of biomedical research and application. In studies of immune...... in the ability to characterize variation in immune responses involving larger numbers of functionally differentiated cell subtypes. We describe novel classes of Markov chain Monte Carlo methods for model fitting that exploit distributed GPU (graphics processing unit) implementation. We discuss issues of cellular...... subtype identification in this novel, general model framework, and provide a detailed example using simulated data. We then describe application to a data set from an experimental study of antigen-specific T-cell subtyping using combinatorially encoded assays in human blood samples. Summary comments...

  4. Bayesian probabilistic network approach for managing earthquake risks of cities

    DEFF Research Database (Denmark)

    Bayraktarli, Yahya; Faber, Michael

    2011-01-01

    This paper considers the application of Bayesian probabilistic networks (BPNs) to large-scale risk based decision making in regard to earthquake risks. A recently developed risk management framework is outlined which utilises Bayesian probabilistic modelling, generic indicator based risk models...

  5. Reliability assessment using degradation models: bayesian and classical approaches

    Directory of Open Access Journals (Sweden)

    Marta Afonso Freitas

    2010-04-01

    Full Text Available Traditionally, reliability assessment of devices has been based on (accelerated) life tests. However, for highly reliable products, little information about reliability is provided by life tests in which few or no failures are typically observed. Since most failures arise from a degradation mechanism with characteristics that degrade over time, one alternative is to monitor the device for a period of time and assess its reliability from the changes in performance (degradation) observed during that period. The goal of this article is to illustrate how degradation data can be modeled and analyzed using "classical" and Bayesian approaches. Four methods of data analysis based on classical inference are presented. Next we show how Bayesian methods can also be used to provide a natural approach to analyzing degradation data. The approaches are applied to a real data set regarding train wheel degradation.

  6. A Bayesian approach to linear inverse problems in seismic tomography

    Science.gov (United States)

    Tian, Y.; Zhou, Y.; Chung, J.; Chung, M.; Ning, J.

    2014-12-01

    Seismic tomography is often an ill-posed linear inverse problem and regularization such as damping and smoothing has been widely applied to find an approximate solution to the inverse problem. The "optimal" solution is chosen based on the tradeoff between model norm (or model roughness) and data misfit. The main difficulty associated with this deterministic approach is in determining a balance between model uncertainty and data fit. This can make interpretation of tomographic structures subjective because models at the "corner" of the tradeoff curve often show large variability. In this study, we investigate a Bayesian approach to the linear inverse problem by minimizing an empirical Bayes risk function based on training dataset generated for the tomographic problem. We show that sample average approximation can be used to find optimal spectral filters to solve the linear tomographic problem based on singular value decomposition (SVD). We compare optimal truncated SVD, optimal Tikhonov filtering as well as independent optimal spectral filtering in finite-frequency tomography and ray theoretical tomography using a global dataset of surface-wave dispersion measurements.
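
    A minimal sketch of SVD-based spectral filtering for a linear tomographic system Gm = d, contrasting truncated-SVD and Tikhonov filter factors; the toy matrix, noise level, and filter settings are illustrative assumptions rather than the optimal, training-based filters of the study.

```python
import numpy as np

rng = np.random.default_rng(6)

# Toy ill-posed linear system G m = d (stand-in for a tomography kernel).
n_data, n_model = 80, 60
G = rng.normal(size=(n_data, n_model)) @ np.diag(0.9 ** np.arange(n_model))
m_true = np.sin(np.linspace(0, 3 * np.pi, n_model))
d = G @ m_true + 0.05 * rng.normal(size=n_data)

U, s, Vt = np.linalg.svd(G, full_matrices=False)
coeffs = U.T @ d                      # data projected onto left singular vectors

def filtered_solution(filters):
    """m = sum_i f_i * (u_i . d / s_i) * v_i for given spectral filter factors."""
    return Vt.T @ (filters * coeffs / s)

# Truncated SVD: keep the k largest singular values.
k = 25
f_tsvd = (np.arange(len(s)) < k).astype(float)

# Tikhonov: smooth filter factors s^2 / (s^2 + alpha^2).
alpha = 0.5
f_tik = s**2 / (s**2 + alpha**2)

for name, f in [("TSVD", f_tsvd), ("Tikhonov", f_tik)]:
    m_hat = filtered_solution(f)
    print(f"{name:8s} model error: {np.linalg.norm(m_hat - m_true):.3f}")
```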

  7. Coastal vulnerability systems-network using Fuzzy and Bayesian approaches

    Science.gov (United States)

    Taramelli, A.; Valentini, E.; Filipponi, F.; Nguyen Xuan, A.; Arosio, M.

    2016-12-01

    Marine drivers such as surge, in the context of SLR, are threatening low-lying coastal plains. In order to deal with disturbances, a deeper understanding of the benefits deriving from ecosystem service assessment, management and planning (e.g. the role of dune ridges in surge mitigation and climate adaptation) can enhance the resilience of coastal systems. In this frame, assessing vulnerability is a key concern of many SOS (social, ecological, institutional) systems, and it poses several challenges, such as the definition of Essential Variables (EVs) able to synthesize the required information, the assignment of weights to each considered variable, and the selection of a method for combining the relevant variables. At the same time, it is unclear how SLR, subsidence and erosion might affect coastal subsistence resources, because of highly complex interactions and because of the subjective system of weighting many variables and their interaction within the systems. In this contribution, making the best use of many EO products, in situ data and modelling, we propose a multidimensional surge vulnerability assessment that aims at combining geophysical and socioeconomic variables on the basis of two approaches: (1) fuzzy logic and (2) a Bayesian approach. The final goal is to provide insight into how to quantify regulating ecosystem services.

  8. Joint Modeling of Multiple Crimes: A Bayesian Spatial Approach

    Directory of Open Access Journals (Sweden)

    Hongqiang Liu

    2017-01-01

    Full Text Available A multivariate Bayesian spatial modeling approach was used to jointly model the counts of two types of crime, i.e., burglary and non-motor vehicle theft, and explore the geographic pattern of crime risks and relevant risk factors. In contrast to the univariate model, which assumes independence across outcomes, the multivariate approach takes into account potential correlations between crimes. Six independent variables are included in the model as potential risk factors. In order to fully present this method, both the multivariate model and its univariate counterpart are examined. We fitted the two models to the data and assessed them using the deviance information criterion. A comparison of the results from the two models indicates that the multivariate model was superior to the univariate model. Our results show that population density and bar density are clearly associated with both burglary and non-motor vehicle theft risks and indicate a close relationship between these two types of crime. The posterior means and 2.5% percentile of type-specific crime risks estimated by the multivariate model were mapped to uncover the geographic patterns. The implications, limitations and future work of the study are discussed in the concluding section.

  9. Modelling of population dynamics of red king crab using Bayesian approach

    Directory of Open Access Journals (Sweden)

    Bakanev Sergey ...

    2012-10-01

    Modeling population dynamics based on the Bayesian approach makes it possible to resolve the above issues successfully. The integration of data from various studies into a unified model based on Bayesian parameter estimation provides a much more detailed description of the processes occurring in the population.

  10. An information theoretic approach to verification of modular Bayesian fusion systems

    NARCIS (Netherlands)

    de Oude, P.; Pavlin, G.

    2008-01-01

    This paper introduces an information theoretic approach to verification of causal models in modular Bayesian fusion systems. We assume distributed fusion systems which are gradually extended by adding new modules, each having a limited domain knowledge captured in local Bayesian networks. However,

  11. A Bayesian approach to extracting meaning from system behavior

    Energy Technology Data Exchange (ETDEWEB)

    Dress, W.B.

    1998-08-01

    The modeling relation and its reformulation to include the semiotic hierarchy is essential for the understanding, control, and successful re-creation of natural systems. This presentation will argue for a careful application of Rosen's modeling relationship to the problems of intelligence and autonomy in natural and artificial systems. To this end, the authors discuss the essential need for a correct theory of induction, learning, and probability; and suggest that modern Bayesian probability theory, developed by Cox, Jaynes, and others, can adequately meet such demands, especially on the operational level of extracting meaning from observations. The methods of Bayesian and maximum-entropy parameter estimation have been applied to measurements of system observables to directly infer the underlying differential equations generating system behavior. This approach by-passes the usual method of parameter estimation based on assuming a functional form for the observable and then estimating the parameters that would lead to the particular observed behavior. The computational savings are great since only location parameters enter into the maximum-entropy calculations; this innovation finesses the need for nonlinear parameters altogether. Such an approach more directly extracts the semantics inherent in a given system by going to the root of system meaning as expressed by abstract form or shape, rather than in syntactic particulars, such as signal amplitude and phase. Examples will show how the form of a system can be followed while ignoring unnecessary details. In this sense, the authors are observing the meaning of the words rather than being concerned with their particular expression or language. For the present discussion, empirical models are embodied by the differential equations underlying, producing, or describing the behavior of a process as measured or tracked by a particular variable set--the observables. The a priori models are probability structures that

  12. Hierarchical Approaches to the Analysis of Genetic Diversity in ...

    African Journals Online (AJOL)

    Hierarchical analysis highlights the nature of relationship between and among type samples as outlined by standard descriptors. It produces an output called dendrogram, which depicts the hierarchical structure of genetic interaction in clusters/groups. Genetic diversity is the variation of heritable characteristics in a ...

  13. Hierarchical polynomial network approach to automated target recognition

    Science.gov (United States)

    Kim, Richard Y.; Drake, Keith C.; Kim, Tony Y.

    1994-02-01

    A hierarchical recognition methodology using abductive networks at several levels of object recognition is presented. Abductive networks, an innovative numeric modeling technology using networks of polynomial nodes, result from nearly three decades of application research and development in areas including statistical modeling, uncertainty management, genetic algorithms, and traditional neural networks. The system uses pixel-registered multisensor target imagery provided by the Tri-Service Laser Radar sensor. Several levels of recognition are performed using detection, classification, and identification, each providing more detailed object information. Advanced feature extraction algorithms are applied at each recognition level for target characterization. Abductive polynomial networks process feature information and situational data at each recognition level, providing input for the next level of processing. An expert system coordinates the activities of individual recognition modules and enables employment of heuristic knowledge to overcome the limitations of a purely numeric processing approach. The approach can potentially overcome limitations of current systems, such as catastrophic degradation during unanticipated operating conditions, while meeting strict processing requirements. These benefits result from implementation of robust feature extraction algorithms that do not take explicit advantage of peculiar characteristics of the sensor imagery, and the compact, real-time processing capability provided by abductive polynomial networks.

  14. Applied Bayesian hierarchical methods

    National Research Council Canada - National Science Library

    Congdon, P

    2010-01-01

    Table of contents (excerpt): 1.2 Posterior Inference from Bayes Formula; 1.3 Markov Chain Monte Carlo Sampling in Relation to Monte Carlo Methods: Obtaining Posterior...

  15. Bayesian adjustment for covariate measurement errors: a flexible parametric approach.

    Science.gov (United States)

    Hossain, Shahadut; Gustafson, Paul

    2009-05-15

    In most epidemiological investigations, the study units are people, the outcome variable (or the response) is a health-related event, and the explanatory variables are usually environmental and/or socio-demographic factors. The fundamental task in such investigations is to quantify the association between the explanatory variables (covariates/exposures) and the outcome variable through a suitable regression model. The accuracy of such quantification depends on how precisely the relevant covariates are measured. In many instances, we cannot measure some of the covariates accurately. Rather, we can measure noisy (mismeasured) versions of them. In statistical terminology, mismeasurement in continuous covariates is known as measurement errors or errors-in-variables. Regression analyses based on mismeasured covariates lead to biased inference about the true underlying response-covariate associations. In this paper, we suggest a flexible parametric approach for avoiding this bias when estimating the response-covariate relationship through a logistic regression model. More specifically, we consider the flexible generalized skew-normal and the flexible generalized skew-t distributions for modeling the unobserved true exposure. For inference and computational purposes, we use Bayesian Markov chain Monte Carlo techniques. We investigate the performance of the proposed flexible parametric approach in comparison with a common flexible parametric approach through extensive simulation studies. We also compare the proposed method with the competing flexible parametric method on a real-life data set. Though emphasis is put on the logistic regression model, the proposed method is unified and is applicable to the other generalized linear models, and to other types of non-linear regression models as well. (c) 2009 John Wiley & Sons, Ltd.

  16. A Bayesian decision approach to rainfall thresholds based flood warning

    Directory of Open Access Journals (Sweden)

    M. L. V. Martina

    2006-01-01

    Full Text Available Operational real time flood forecasting systems generally require a hydrological model to run in real time as well as a series of hydro-informatics tools to transform the flood forecast into relatively simple and clear messages to the decision makers involved in flood defense. The scope of this paper is to set forth the possibility of providing flood warnings at given river sections based on the direct comparison of the quantitative precipitation forecast with critical rainfall threshold values, without the need of an on-line real time forecasting system. This approach leads to an extremely simplified alert system to be used by non technical stakeholders and could also be used to supplement the traditional flood forecasting systems in case of system failures. The critical rainfall threshold values, incorporating the soil moisture initial conditions, result from statistical analyses using long hydrological time series combined with a Bayesian utility function minimization. In the paper, results of an application of the proposed methodology to the Sieve river, a tributary of the Arno river in Italy, are given to exemplify its practical applicability.
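
    A toy version of the decision rule implied above, assuming a known conditional flood probability given forecast rainfall and a simple asymmetric loss for false alarms versus missed floods; the logistic flood model and the cost values are illustrative assumptions used only to show how a rainfall warning threshold falls out of expected-loss minimization.

```python
import numpy as np

# P(flood | forecast rainfall r), an illustrative logistic stand-in for the
# statistics derived from long hydrological series in the paper.
def p_flood(r_mm):
    return 1.0 / (1.0 + np.exp(-(r_mm - 80.0) / 10.0))

# Asymmetric losses: a missed flood is far more costly than a false alarm.
COST_FALSE_ALARM = 1.0
COST_MISSED_FLOOD = 20.0

def expected_loss(warn, r_mm):
    p = p_flood(r_mm)
    if warn:
        return (1.0 - p) * COST_FALSE_ALARM   # alarm issued, no flood occurs
    return p * COST_MISSED_FLOOD              # no alarm, flood occurs

# The critical rainfall threshold is the smallest forecast rainfall for which
# warning has lower expected loss than not warning.
rain = np.linspace(0, 150, 1501)
warn_better = np.array([expected_loss(True, r) < expected_loss(False, r)
                        for r in rain])
threshold = rain[np.argmax(warn_better)]
print(f"issue a warning when forecast rainfall exceeds ~{threshold:.1f} mm")
```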

  17. A Bayesian Approach for Structural Learning with Hidden Markov Models

    Directory of Open Access Journals (Sweden)

    Cen Li

    2002-01-01

    Full Text Available Hidden Markov Models (HMMs) have proved to be a successful modeling paradigm for dynamic and spatial processes in many domains, such as speech recognition, genomics, and general sequence alignment. Typically, in these applications, the model structures are predefined by domain experts. Therefore, the HMM learning problem focuses on the learning of the parameter values of the model to fit the given data sequences. However, when one considers other domains, such as economics and physiology, a model structure capturing the system's dynamic behavior is not available. In order to successfully apply the HMM methodology in these domains, it is important that a mechanism is available for automatically deriving the model structure from the data. This paper presents an HMM learning procedure that simultaneously learns the model structure and the maximum likelihood parameter values of an HMM from data. The HMM model structures are derived based on the Bayesian model selection methodology. In addition, we introduce a new initialization procedure for HMM parameter value estimation based on the K-means clustering method. Experimental results with artificially generated data show the effectiveness of the approach.
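
    A minimal numpy sketch of the kind of K-means-based initialization mentioned above: cluster the observations and use cluster statistics as starting values for Gaussian emission parameters and a rough transition matrix (the EM/Baum-Welch refinement and the Bayesian structure selection that would follow are omitted); the data and the number of states are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(7)

# One-dimensional observation sequence (illustrative data).
obs = np.concatenate([rng.normal(0.0, 1.0, 300),
                      rng.normal(5.0, 1.5, 300),
                      rng.normal(10.0, 0.8, 300)])
n_states = 3

# Plain K-means on the observations.
centers = rng.choice(obs, n_states, replace=False)
for _ in range(50):
    labels = np.argmin(np.abs(obs[:, None] - centers[None, :]), axis=1)
    centers = np.array([obs[labels == k].mean() for k in range(n_states)])

# Cluster statistics become initial HMM emission parameters, and empirical
# label transitions give a rough initial transition matrix.
means = centers
stds = np.array([obs[labels == k].std() for k in range(n_states)])
trans = np.ones((n_states, n_states)) * 1e-3          # small floor
for a, b in zip(labels[:-1], labels[1:]):
    trans[a, b] += 1.0
trans /= trans.sum(axis=1, keepdims=True)

print("initial emission means:", np.round(means, 2))
print("initial emission stds :", np.round(stds, 2))
print("initial transition matrix:\n", np.round(trans, 3))
```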

  18. Defining statistical perceptions with an empirical Bayesian approach

    Science.gov (United States)

    Tajima, Satohiro

    2013-04-01

    Extracting statistical structures (including textures or contrasts) from a natural stimulus is a central challenge in both biological and engineering contexts. This study interprets the process of statistical recognition in terms of hyperparameter estimations and free-energy minimization procedures with an empirical Bayesian approach. This mathematical interpretation resulted in a framework for relating physiological insights in animal sensory systems to the functional properties of recognizing stimulus statistics. We applied the present theoretical framework to two typical models of natural images that are encoded by a population of simulated retinal neurons, and demonstrated that the resulting cognitive performances could be quantified with the Fisher information measure. The current enterprise yielded predictions about the properties of human texture perception, suggesting that the perceptual resolution of image statistics depends on visual field angles, internal noise, and neuronal information processing pathways, such as the magnocellular, parvocellular, and koniocellular systems. Furthermore, the two conceptually similar natural-image models were found to yield qualitatively different predictions, striking a note of warning against confusing the two models when describing a natural image.

  19. Bayesian approach to target tracking in the presence of glint

    Science.gov (United States)

    Gordon, Neil J.; Whitby, Angela

    1995-09-01

    When tracking targets with radar, changes in target aspect with respect to the observer can cause the apparent center of radar reflections to wander significantly. The resulting noisy angle errors are called target glint. Glint may severely affect the tracking accuracy, particularly when tracking large targets at short ranges (such as might occur in the final homing phase of a missile engagement). The effect of glint is to produce heavy-tailed, time correlated non-Gaussian disturbances on the observations. It is well known that the performance of the Kalman filter degrades severely in the presence of such disturbances. In this paper we propose a random sample based implementation of a Bayesian recursive filter. This filter is based on the Metropolis-Hastings algorithm and the Gaussian sum approach. The key advantage of the filter is that any nonlinear/non-Gaussian system and/or measurement models can be routinely implemented. Tracking performance of the filter is demonstrated in the presence of glint.

  20. Parameter Estimation of Structural Equation Modeling Using Bayesian Approach

    Directory of Open Access Journals (Sweden)

    Dewi Kurnia Sari

    2016-05-01

    Full Text Available Leadership is a process of influencing, directing, or setting an example for employees in order to achieve the objectives of the organization, and it is a key element in organizational effectiveness. In addition to leadership style, the success of an organization or company in achieving its objectives can also be influenced by organizational commitment, that is, the commitment each individual makes to the betterment of the organization. The purpose of this research is to obtain a model relating leadership style and organizational commitment to job satisfaction and employee performance, and to determine the factors that influence job satisfaction and employee performance, using SEM with a Bayesian approach. The research was conducted on the 15 employees of Statistics FNI in Malang. The results showed that in the measurement model all indicators significantly measure their respective latent variables. In the structural model, Leadership Style and Organizational Commitment have significant direct effects on Job Satisfaction, and Job Satisfaction has a significant effect on Employee Performance, whereas the direct effects of Leadership Style and Organizational Commitment on Employee Performance are not significant.

  1. Bayesian Population Physiologically-Based Pharmacokinetic (PBPK) Approach for a Physiologically Realistic Characterization of Interindividual Variability in Clinically Relevant Populations.

    Directory of Open Access Journals (Sweden)

    Markus Krauss

    Full Text Available Interindividual variability in anatomical and physiological properties results in significant differences in drug pharmacokinetics. The consideration of such pharmacokinetic variability supports optimal drug efficacy and safety for each single individual, e.g., by identification of individual-specific dosing. One clear objective in clinical drug development is therefore a thorough characterization of the physiological sources of interindividual variability. In this work, we present a Bayesian population physiologically-based pharmacokinetic (PBPK) approach for the mechanistically and physiologically realistic identification of interindividual variability. The consideration of a generic and highly detailed mechanistic PBPK model structure enables the integration of large amounts of prior physiological knowledge, which is then updated with new experimental data in a Bayesian framework. A covariate model integrates known relationships of physiological parameters to age, gender and body height. We further provide a framework for estimation of the a posteriori parameter dependency structure at the population level. The approach is demonstrated considering a cohort of healthy individuals and theophylline as an application example. The variability and co-variability of physiological parameters are specified within the population. Significant correlations are identified between population parameters and are applied for individual- and population-specific visual predictive checks of the pharmacokinetic behavior, which leads to improved results compared to present population approaches. In the future, the integration of a generic PBPK model into a hierarchical approach allows for extrapolations to other populations or drugs, while the Bayesian paradigm allows for an iterative application of the approach and thereby a continuous updating of physiological knowledge with new data. This will facilitate decision making e.g. from preclinical to

  2. A Bayesian approach for the categorization of radiology reports.

    Science.gov (United States)

    Pyrros, Ayis; Nikolaidis, Paul; Yaghmai, Vahid; Zivin, Steve; Tracy, Joseph I; Flanders, Adam

    2007-04-01

    We sought to develop a Bayesian filter that could distinguish positive radiology computed tomography (CT) reports of appendicitis from negative reports with no appendicitis. Standard unstructured electronic text radiology reports containing the keyword appendicitis were obtained using a Java-based text search engine from a hospital General Electric PACS system. A total of 500 randomly selected reports from multiple radiologists were then manually categorized and merged into two separate text files: 250 positive reports and 250 negative findings of appendicitis. The two text files were then processed by the freely available UNIX-based software dbacl 1.9, a digramic Bayesian classifier for text recognition, on a Linux-based Pentium 4 system. The software was then trained on the two separate merged text-file categories of positive and negative appendicitis. The ability of the Bayesian filter to discriminate between negative and positive reports of appendicitis was then tested on 100 randomly selected reports of appendicitis: 50 positive cases and 50 negative cases. The training time for the Bayesian filter was approximately 2 seconds. The Bayesian filter was subsequently able to categorize 50 of 50 positive reports of appendicitis and 50 of 50 reports of negative appendicitis, in less than 10 seconds. A Bayesian filter system can be used to quickly categorize radiology report findings and automatically determine after training, with a high degree of accuracy, whether the reports have text findings of a specific diagnosis. The Bayesian filter can potentially be applied to any type of radiologic report finding and any relevant category.
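
    dbacl is a command-line tool, so as a rough stand-in for the workflow described above, here is a scikit-learn sketch of a naive Bayes text classifier (word unigrams and bigrams, loosely echoing the digramic model); the training reports are invented:

        # Tiny naive Bayes report classifier; example reports are made up.
        from sklearn.feature_extraction.text import CountVectorizer
        from sklearn.naive_bayes import MultinomialNB
        from sklearn.pipeline import make_pipeline

        train_reports = [
            "dilated appendix with periappendiceal fat stranding, acute appendicitis",
            "normal appendix, no evidence of acute appendicitis",
            "findings consistent with acute appendicitis and small abscess",
            "appendix not visualized, no secondary signs of appendicitis",
        ]
        train_labels = ["positive", "negative", "positive", "negative"]

        clf = make_pipeline(CountVectorizer(ngram_range=(1, 2)), MultinomialNB())
        clf.fit(train_reports, train_labels)

        print(clf.predict(["enlarged appendix with wall thickening, acute appendicitis"]))
        print(clf.predict(["unremarkable appendix without inflammatory change"]))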

  3. A Bayesian Developmental Approach to Robotic Goal-Based Imitation Learning: e0141965

    National Research Council Canada - National Science Library

    Michael Jae-Yoon Chung; Abram L Friesen; Dieter Fox; Andrew N Meltzoff; Rajesh P N Rao

    2015-01-01

    .... We propose a new Bayesian approach to robotic learning by imitation inspired by the developmental hypothesis that children use self-experience to bootstrap the process of intention recognition and goal-based imitation...

  4. A Bayesian Developmental Approach to Robotic Goal-Based Imitation Learning

    National Research Council Canada - National Science Library

    Chung, Michael Jae-Yoon; Friesen, Abram L; Fox, Dieter; Meltzoff, Andrew N; Rao, Rajesh P N

    2015-01-01

    .... We propose a new Bayesian approach to robotic learning by imitation inspired by the developmental hypothesis that children use self-experience to bootstrap the process of intention recognition and goal-based imitation...

  5. When mechanism matters: Bayesian forecasting using models of ecological diffusion.

    Science.gov (United States)

    Hefley, Trevor J; Hooten, Mevin B; Russell, Robin E; Walsh, Daniel P; Powell, James A

    2017-05-01

    Ecological diffusion is a theory that can be used to understand and forecast spatio-temporal processes such as dispersal, invasion, and the spread of disease. Hierarchical Bayesian modelling provides a framework to make statistical inference and probabilistic forecasts, using mechanistic ecological models. To illustrate, we show how hierarchical Bayesian models of ecological diffusion can be implemented for large data sets that are distributed densely across space and time. The hierarchical Bayesian approach is used to understand and forecast the growth and geographic spread in the prevalence of chronic wasting disease in white-tailed deer (Odocoileus virginianus). We compare statistical inference and forecasts from our hierarchical Bayesian model to phenomenological regression-based methods that are commonly used to analyse spatial occurrence data. The mechanistic statistical model based on ecological diffusion led to important ecological insights, obviated a commonly ignored type of collinearity, and was the most accurate method for forecasting. © 2017 John Wiley & Sons Ltd/CNRS.

  6. When mechanism matters: Bayesian forecasting using models of ecological diffusion

    Science.gov (United States)

    Hefley, Trevor J.; Hooten, Mevin B.; Russell, Robin E.; Walsh, Daniel P.; Powell, James A.

    2017-01-01

    Ecological diffusion is a theory that can be used to understand and forecast spatio-temporal processes such as dispersal, invasion, and the spread of disease. Hierarchical Bayesian modelling provides a framework to make statistical inference and probabilistic forecasts, using mechanistic ecological models. To illustrate, we show how hierarchical Bayesian models of ecological diffusion can be implemented for large data sets that are distributed densely across space and time. The hierarchical Bayesian approach is used to understand and forecast the growth and geographic spread in the prevalence of chronic wasting disease in white-tailed deer (Odocoileus virginianus). We compare statistical inference and forecasts from our hierarchical Bayesian model to phenomenological regression-based methods that are commonly used to analyse spatial occurrence data. The mechanistic statistical model based on ecological diffusion led to important ecological insights, obviated a commonly ignored type of collinearity, and was the most accurate method for forecasting.

  7. Global, regional, and subregional classification of abortions by safety, 2010-14: estimates from a Bayesian hierarchical model.

    Science.gov (United States)

    Ganatra, Bela; Gerdts, Caitlin; Rossier, Clémentine; Johnson, Brooke Ronald; Tunçalp, Özge; Assifi, Anisa; Sedgh, Gilda; Singh, Susheela; Bankole, Akinrinola; Popinchalk, Anna; Bearak, Jonathan; Kang, Zhenning; Alkema, Leontine

    2017-11-25

    Global estimates of unsafe abortions have been produced for 1995, 2003, and 2008. However, reconceptualisation of the framework and methods for estimating abortion safety is needed owing to the increased availability of simple methods for safe abortion (eg, medical abortion), the increasingly widespread use of misoprostol outside formal health systems in contexts where abortion is legally restricted, and the need to account for the multiple factors that affect abortion safety. We used all available empirical data on abortion methods, providers, and settings, and factors affecting safety as covariates within a Bayesian hierarchical model to estimate the global, regional, and subregional distributions of abortion by safety categories. We used a three-tiered categorisation based on the WHO definition of unsafe abortion and WHO guidelines on safe abortion to categorise abortions as safe or unsafe and to further divide unsafe abortions into two categories of less safe and least safe. Of the 55·7 million abortions that occurred worldwide each year between 2010-14, we estimated that 30·6 million (54·9%, 90% uncertainty interval 49·9-59·4) were safe, 17·1 million (30·7%, 25·5-35·6) were less safe, and 8·0 million (14·4%, 11·5-18·1) were least safe. Thus, 25·1 million (45·1%, 40·6-50·1) abortions each year between 2010 and 2014 were unsafe, with 24·3 million (97%) of these in developing countries. The proportion of unsafe abortions was significantly higher in developing countries than developed countries (49·5% vs 12·5%). When grouped by the legal status of abortion, the proportion of unsafe abortions was significantly higher in countries with highly restrictive abortion laws than in those with less restrictive laws. Increased efforts are needed, especially in developing countries, to ensure access to safe abortion. The paucity of empirical data is a limitation of these findings. Improved in-country data for health services and innovative research to

  8. Nursing Home Care Quality: Insights from a Bayesian Network Approach

    Science.gov (United States)

    Goodson, Justin; Jang, Wooseung; Rantz, Marilyn

    2008-01-01

    Purpose: The purpose of this research is twofold. The first purpose is to utilize a new methodology (Bayesian networks) for aggregating various quality indicators to measure the overall quality of care in nursing homes. The second is to provide new insight into the relationships that exist among various measures of quality and how such measures…

  9. Predicting downturns in the US housing market: a Bayesian approach

    CSIR Research Space (South Africa)

    Gupta, R

    2010-10-01

    Full Text Available This paper estimates Bayesian Vector Autoregressive (BVAR) models, both spatial and non-spatial (univariate and multivariate), for the twenty largest states of the US economy, using quarterly data over the period 1976:Q1–1994:Q4; and then forecasts...

  10. Basics of Bayesian methods.

    Science.gov (United States)

    Ghosh, Sujit K

    2010-01-01

    Bayesian methods are rapidly becoming popular tools for making statistical inference in various fields of science including biology, engineering, finance, and genetics. One of the key aspects of Bayesian inferential method is its logical foundation that provides a coherent framework to utilize not only empirical but also scientific information available to a researcher. Prior knowledge arising from scientific background, expert judgment, or previously collected data is used to build a prior distribution which is then combined with current data via the likelihood function to characterize the current state of knowledge using the so-called posterior distribution. Bayesian methods allow the use of models of complex physical phenomena that were previously too difficult to estimate (e.g., using asymptotic approximations). Bayesian methods offer a means of more fully understanding issues that are central to many practical problems by allowing researchers to build integrated models based on hierarchical conditional distributions that can be estimated even with limited amounts of data. Furthermore, advances in numerical integration methods, particularly those based on Monte Carlo methods, have made it possible to compute the optimal Bayes estimators. However, there is a reasonably wide gap between the background of the empirically trained scientists and the full weight of Bayesian statistical inference. Hence, one of the goals of this chapter is to bridge the gap by offering elementary to advanced concepts that emphasize linkages between standard approaches and full probability modeling via Bayesian methods.
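
    A minimal numerical illustration of the prior-to-posterior updating described above, using a conjugate Beta prior for a binomial success probability; all numbers are made up:

        # Conjugate Beta-Binomial update: prior knowledge combined with new data.
        from scipy import stats

        a_prior, b_prior = 2.0, 8.0          # prior: Beta(2, 8), mean 0.2
        successes, failures = 14, 26         # new data: 14 successes in 40 trials

        a_post, b_post = a_prior + successes, b_prior + failures
        posterior = stats.beta(a_post, b_post)

        print("posterior mean:", posterior.mean())
        print("95% credible interval:", posterior.interval(0.95))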

  11. New Statistical Approach to the Analysis of Hierarchical Data

    Science.gov (United States)

    Neuman, S. P.; Guadagnini, A.; Riva, M.

    2014-12-01

    Many variables possess a hierarchical structure reflected in how their increments vary in space and/or time. Quite commonly the increments (a) fluctuate in a highly irregular manner; (b) possess symmetric, non-Gaussian frequency distributions characterized by heavy tails that often decay with separation distance or lag; (c) exhibit nonlinear power-law scaling of sample structure functions in a midrange of lags, with breakdown in such scaling at small and large lags; (d) show extended power-law scaling (ESS) at all lags; and (e) display nonlinear scaling of power-law exponent with order of sample structure function. Some interpret this to imply that the variables are multifractal, which explains neither breakdowns in power-law scaling nor ESS. We offer an alternative interpretation consistent with all above phenomena. It views data as samples from stationary, anisotropic sub-Gaussian random fields subordinated to truncated fractional Brownian motion (tfBm) or truncated fractional Gaussian noise (tfGn). The fields are scaled Gaussian mixtures with random variances. Truncation of fBm and fGn entails filtering out components below data measurement or resolution scale and above domain scale. Our novel interpretation of the data allows us to obtain maximum likelihood estimates of all parameters characterizing the underlying truncated sub-Gaussian fields. These parameters in turn make it possible to downscale or upscale all statistical moments to situations entailing smaller or larger measurement or resolution and sampling scales, respectively. They also allow one to perform conditional or unconditional Monte Carlo simulations of random field realizations corresponding to these scales. Aspects of our approach are illustrated on field and laboratory measured porous and fractured rock permeabilities, as well as soil texture characteristics and neural network estimates of unsaturated hydraulic parameters in a deep vadose zone near Phoenix, Arizona. We also use our approach

  12. A Bayesian Approach to Identifying New Risk Factors for Dementia

    Science.gov (United States)

    Wen, Yen-Hsia; Wu, Shihn-Sheng; Lin, Chun-Hung Richard; Tsai, Jui-Hsiu; Yang, Pinchen; Chang, Yang-Pei; Tseng, Kuan-Hua

    2016-01-01

    Abstract Dementia is one of the most disabling and burdensome health conditions worldwide. In this study, we identified new potential risk factors for dementia from nationwide longitudinal population-based data by using Bayesian statistics. We first tested the consistency of the results obtained using Bayesian statistics with those obtained using classical frequentist probability for 4 recognized risk factors for dementia, namely severe head injury, depression, diabetes mellitus, and vascular diseases. Then, we used Bayesian statistics to verify 2 new potential risk factors for dementia, namely hearing loss and senile cataract, determined from the Taiwan's National Health Insurance Research Database. We included a total of 6546 (6.0%) patients diagnosed with dementia. We observed older age, female sex, and lower income as independent risk factors for dementia. Moreover, we verified the 4 recognized risk factors for dementia in the older Taiwanese population; their odds ratios (ORs) ranged from 3.469 to 1.207. Furthermore, we observed that hearing loss (OR = 1.577) and senile cataract (OR = 1.549) were associated with an increased risk of dementia. We found that the results obtained using Bayesian statistics for assessing risk factors for dementia, such as head injury, depression, DM, and vascular diseases, were consistent with those obtained using classical frequentist probability. Moreover, hearing loss and senile cataract were found to be potential risk factors for dementia in the older Taiwanese population. Bayesian statistics could help clinicians explore other potential risk factors for dementia and for developing appropriate treatment strategies for these patients. PMID:27227925

  13. Genetic evaluation of popcorn families using a Bayesian approach via the independence chain algorithm

    Directory of Open Access Journals (Sweden)

    Marcos Rodovalho

    2014-11-01

    Full Text Available The objective of this study was to examine genetic parameters of popping expansion and grain yield in a trial of 169 half-sib families using a Bayesian approach. The independence chain algorithm with informative priors for the components of residual and family variance (inverse-gamma prior distribution) was used. Popping expansion was found to be moderately heritable, with a posterior mode of h2 of 0.34, and a 90% Bayesian confidence interval of 0.22 to 0.44. The heritability of grain yield (family level) was moderate (h2 = 0.4), with a Bayesian confidence interval of 0.28 to 0.49. The target population contains sufficient genetic variability for subsequent breeding cycles, and the Bayesian approach is a useful alternative for scientific inference in the genetic evaluation of popcorn.
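
    A sketch of the underlying variance-component model for half-sib family data: a one-way random-effects model with inverse-gamma priors, sampled here with a simple Gibbs sampler rather than the authors' independence-chain algorithm; the data are simulated and the hyperparameters are illustrative:

        # Gibbs sampler for y_ij = mu + u_i + e_ij with inverse-gamma priors on the
        # family and residual variances; half-sib heritability h2 = 4*s2_f/(s2_f+s2_e).
        import numpy as np

        rng = np.random.default_rng(42)
        n_fam, n_per = 169, 10
        fam_eff = rng.normal(0.0, 1.0, n_fam)          # true family variance 1
        y = fam_eff[:, None] + rng.normal(0.0, 3.0, (n_fam, n_per))  # residual variance 9, h2 = 0.4

        a_hyp, b_hyp = 2.0, 2.0                        # inverse-gamma hyperparameters
        mu, s2_f, s2_e = y.mean(), 1.0, 1.0
        h2_draws = []

        for it in range(4000):
            # family effects | rest
            prec = n_per / s2_e + 1.0 / s2_f
            mean = (y - mu).sum(axis=1) / s2_e / prec
            u = rng.normal(mean, np.sqrt(1.0 / prec))
            # overall mean | rest (flat prior)
            mu = rng.normal((y - u[:, None]).mean(), np.sqrt(s2_e / y.size))
            # variance components | rest (inverse-gamma full conditionals)
            s2_f = 1.0 / rng.gamma(a_hyp + n_fam / 2, 1.0 / (b_hyp + 0.5 * (u ** 2).sum()))
            sse = ((y - mu - u[:, None]) ** 2).sum()
            s2_e = 1.0 / rng.gamma(a_hyp + y.size / 2, 1.0 / (b_hyp + 0.5 * sse))
            if it >= 1000:
                h2_draws.append(4.0 * s2_f / (s2_f + s2_e))

        print("posterior mean h2:", np.mean(h2_draws))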

  14. Bayesian hierarchical model for transcriptional module discovery by jointly modeling gene expression and ChIP-chip data.

    Science.gov (United States)

    Liu, Xiangdong; Jessen, Walter J; Sivaganesan, Siva; Aronow, Bruce J; Medvedovic, Mario

    2007-08-03

    Transcriptional modules (TM) consist of groups of co-regulated genes and transcription factors (TF) regulating their expression. Two high-throughput (HT) experimental technologies, gene expression microarrays and Chromatin Immuno-Precipitation on Chip (ChIP-chip), are capable of producing data informative about expression regulatory mechanism on a genome scale. The optimal approach to joint modeling of data generated by these two complementary biological assays, with the goal of identifying and characterizing TMs, is an important open problem in computational biomedicine. We developed and validated a novel probabilistic model and related computational procedure for identifying TMs by jointly modeling gene expression and ChIP-chip binding data. We demonstrate an improved functional coherence of the TMs produced by the new method when compared to either analyzing expression or ChIP-chip data separately or to alternative approaches for joint analysis. We also demonstrate the ability of the new algorithm to identify novel regulatory relationships not revealed by ChIP-chip data alone. The new computational procedure can be used in more or less the same way as one would use simple hierarchical clustering without performing any special transformation of data prior to the analysis. The R and C-source code for implementing our algorithm is incorporated within the R package gimmR which is freely available at http://eh3.uc.edu/gimm. Our results indicate that, whenever available, ChIP-chip and expression data should be analyzed within the unified probabilistic modeling framework, which will likely result in improved clusters of co-regulated genes and improved ability to detect meaningful regulatory relationships. Given the good statistical properties and the ease of use, the new computational procedure offers a worthy new tool for reconstructing transcriptional regulatory networks.

  15. New approach using Bayesian Network to improve content based image classification systems

    OpenAIRE

    jayech, Khlifia; mahjoub, mohamed ali

    2012-01-01

    This paper proposes a new approach based on augmented naive Bayes for image classification. Initially, each image is cut into a set of blocks. For each block, we compute a vector of descriptors. Then, we classify the vectors of descriptors to build a vector of labels for each image. Finally, we propose three variants of Bayesian Networks, namely Naive Bayesian Network (NB), Tree Augmented Naive Bayes (TAN) and Forest Augmented Naive Bayes (FAN), to classify ...

  16. A Hierarchical Approach Using Machine Learning Methods in Solar Photovoltaic Energy Production Forecasting

    National Research Council Canada - National Science Library

    Zhaoxuan Li; SM Mahbobur Rahman; Rolando Vega; Bing Dong

    2016-01-01

    .... A hierarchical approach is proposed based on the machine learning algorithms tested. The production data used in this work corresponds to 15 min averaged power measurements collected from 2014...

  17. A Hierarchical Approach to the Classification of Digital Modulation Types in Multipath Environments

    National Research Council Canada - National Science Library

    Fargues, M

    2001-01-01

    ... propagation channel conditions. A hierarchical tree-based classification approach is selected as it leads to a relatively simple overall scheme with few parameters needed to differentiate between the various modulation types...

  18. A Hierarchical Aggregation Approach for Indicators Based on Data Envelopment Analysis and Analytic Hierarchy Process

    National Research Council Canada - National Science Library

    Mohammad Sadegh Pakkar

    2016-01-01

    ...) and Analytic Hierarchy Process (AHP) for indicators. The core logic of the proposed approach is to reflect the hierarchical structures of indicators and their relative priorities in constructing composite indicators (CIs), simultaneously...

  19. Bayesian penalized log-likelihood ratio approach for dose response clinical trial studies.

    Science.gov (United States)

    Tang, Yuanyuan; Cai, Chunyan; Sun, Liangrui; He, Jianghua

    2017-02-13

    In the literature, there are a few unified approaches to test proof of concept and estimate a target dose, including the multiple comparison procedure using a modeling approach and the permutation approach proposed by Klingenberg. We discuss and compare the operating characteristics of these unified approaches and further develop an alternative approach in a Bayesian framework based on the posterior distribution of a penalized log-likelihood ratio test statistic. Our Bayesian approach is much more flexible in handling linear or nonlinear dose-response relationships and is more efficient than the permutation approach. The operating characteristics of our Bayesian approach are comparable to and sometimes better than both approaches in a wide range of dose-response relationships. It yields credible intervals as well as a predictive distribution for the response rate at a specific dose level for the target dose estimation. Our Bayesian approach can be easily extended to continuous, categorical, and time-to-event responses. We illustrate the performance of our proposed method with extensive simulations and Phase II clinical trial data examples.

  20. Social Influence on Information Technology Adoption and Sustained Use in Healthcare: A Hierarchical Bayesian Learning Method Analysis

    Science.gov (United States)

    Hao, Haijing

    2013-01-01

    Information technology adoption and diffusion is currently a significant challenge in the healthcare delivery setting. This thesis includes three papers that explore social influence on information technology adoption and sustained use in the healthcare delivery environment using conventional regression models and novel hierarchical Bayesian…

  1. A General and Flexible Approach to Estimating the Social Relations Model Using Bayesian Methods

    Science.gov (United States)

    Ludtke, Oliver; Robitzsch, Alexander; Kenny, David A.; Trautwein, Ulrich

    2013-01-01

    The social relations model (SRM) is a conceptual, methodological, and analytical approach that is widely used to examine dyadic behaviors and interpersonal perception within groups. This article introduces a general and flexible approach to estimating the parameters of the SRM that is based on Bayesian methods using Markov chain Monte Carlo…

  2. Predicting forest insect flight activity: A Bayesian network approach.

    Science.gov (United States)

    Pawson, Stephen M; Marcot, Bruce G; Woodberry, Owen G

    2017-01-01

    Daily flight activity patterns of forest insects are influenced by temporal and meteorological conditions. Temperature and time of day are frequently cited as key drivers of activity; however, complex interactions between multiple contributing factors have also been proposed. Here, we report individual Bayesian network models to assess the probability of flight activity of three exotic insects, Hylurgus ligniperda, Hylastes ater, and Arhopalus ferus in a managed plantation forest context. Models were built from 7,144 individual hours of insect sampling, temperature, wind speed, relative humidity, photon flux density, and temporal data. Discretized meteorological and temporal variables were used to build tree-augmented naïve Bayes networks. Calibration results suggested that the H. ater and A. ferus Bayesian network models had the best fit for low Type I and overall errors, and H. ligniperda had the best fit for low Type II errors. Maximum hourly temperature and time since sunrise had the largest influence on H. ligniperda flight activity predictions, whereas time of day and year had the greatest influence on H. ater and A. ferus activity. Type II model errors for the prediction of no flight activity are reduced by increasing the model's predictive threshold. Improvements in model performance can be made by further sampling, increasing the sensitivity of the flight intercept traps, and replicating sampling in other regions. Predicting insect flight informs an assessment of the potential phytosanitary risks of wood exports. Quantifying this risk allows mitigation treatments to be targeted to prevent the spread of invasive species via international trade pathways.
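
    As a simplified stand-in for the models described above, the sketch below fits a plain naive Bayes classifier (scikit-learn's CategoricalNB) to discretized temperature and time-of-day bins, rather than the tree-augmented networks used in the paper; the data are simulated:

        # Naive Bayes on discretized predictors of insect flight activity (simulated data).
        import numpy as np
        from sklearn.naive_bayes import CategoricalNB

        rng = np.random.default_rng(3)
        n = 2000
        temp_bin = rng.integers(0, 4, n)        # 0 = cold ... 3 = hot
        hour_bin = rng.integers(0, 6, n)        # 4-hour blocks of the day
        # flight more likely when warm and in the evening block (bin 4)
        p_fly = 0.05 + 0.15 * temp_bin + 0.25 * (hour_bin == 4)
        flew = rng.random(n) < p_fly

        X = np.column_stack([temp_bin, hour_bin])
        clf = CategoricalNB().fit(X, flew)

        print(clf.predict_proba([[3, 4]])[0, 1])   # hot evening hour: higher flight probability
        print(clf.predict_proba([[0, 1]])[0, 1])   # cold morning hour: lower flight probability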

  3. Testing gender invariance of the hospital anxiety and depression scale using the classical approach and Bayesian approach.

    Science.gov (United States)

    Fong, Ted C T; Ho, Rainbow T H

    2014-06-01

    Measurement invariance is an important attribute for the Hospital Anxiety and Depression Scale (HADS). Most of the confirmatory factor analysis studies on the HADS adopt the classical maximum likelihood approach. The restrictive assumptions of exact-zero cross-loadings and residual correlations in the classical approach can lead to inadequate model fit and biased parameter estimates. The present study adopted both the classical approach and the alternative Bayesian approach to examine the measurement and structural invariance of the HADS across gender. A Chinese sample of 326 males and 427 females was used to examine the two-factor model of the HADS across gender. Configural and scalar invariance of the HADS were evaluated using the classical approach with the robust-weighted least-square estimator and the Bayesian approach with zero-mean, small-variance informative priors to cross-loadings and residual correlations. Acceptable and excellent model fits were found for the two-factor model under the classical and Bayesian approaches, respectively. The two-factor model displayed scalar invariance across gender using both approaches. In terms of structural invariance, females showed a significantly higher mean in the anxiety factor than males under both approaches. The HADS demonstrated measurement invariance across gender and appears to be a well-developed instrument for assessment of anxiety and depression. The Bayesian approach is an alternative and flexible tool that could be used in future invariance studies.

  4. Hierarchical Swarm Model: A New Approach to Optimization

    Directory of Open Access Journals (Sweden)

    Hanning Chen

    2010-01-01

    Full Text Available This paper presents a novel optimization model called hierarchical swarm optimization (HSO), which simulates the natural hierarchical complex system from which more complex intelligence can emerge for complex problem solving. This proposed model is intended to suggest ways that the performance of HSO-based algorithms on complex optimization problems can be significantly improved. This performance improvement is obtained by constructing the HSO hierarchies, which means that an agent in a higher-level swarm can be composed of swarms of other agents from a lower level, and different swarms of different levels evolve on different spatiotemporal scales. A novel optimization algorithm (named PS2O), based on the HSO model, is instantiated and tested to illustrate the ideas of the HSO model clearly. Experiments were conducted on a set of 17 benchmark optimization problems including both continuous and discrete cases. The results demonstrate remarkable performance of the PS2O algorithm on all chosen benchmark functions when compared to several successful swarm intelligence and evolutionary algorithms.

  5. Homicide or suicide? Gunshot wound interpretation: a Bayesian approach.

    Science.gov (United States)

    Cave, Rowena; DiMaio, Vincent J; Molina, D Kimberley

    2014-06-01

    Many studies have been published examining various features of fatal gunshot wounds such as type of firearm, range of fire, number of shots, and wound location as a way of determining between homicidal and suicidal deaths. Pathologists frequently have to give evidence in court, and may have their opinion about probable manner of death challenged or be questioned about how sure they can be. In the literature, the features are always discussed in isolation, but in practice, the pathologist has to consider such details in combination. Using pooled data from a systematic review to obtain large data sets, this study shows how Bayesian analysis can be applied to consideration of combined features and can thus provide a quantified degree of confidence to support the pathologist's opinion through the use of likelihood ratios. Case examples are provided to illustrate the impact of different features.
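
    The core calculation can be illustrated with a toy example: combine prior odds with feature likelihood ratios, assuming the features are conditionally independent given the manner of death. The numbers below are invented and are not taken from the paper's pooled data:

        # Posterior probability from prior odds and feature likelihood ratios
        # (toy numbers, conditional independence assumed).
        def posterior_prob_suicide(prior_prob, likelihood_ratios):
            """Likelihood ratios are P(feature | suicide) / P(feature | homicide)."""
            odds = prior_prob / (1.0 - prior_prob)
            for lr in likelihood_ratios:
                odds *= lr
            return odds / (1.0 + odds)

        # e.g. contact-range wound (LR 8), temporal entrance site (LR 3), single shot (LR 2)
        print(posterior_prob_suicide(0.5, [8.0, 3.0, 2.0]))   # ~0.98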

  6. Professional Growth Determinants--Comparing Bayesian and Linear Approaches to Classification.

    Science.gov (United States)

    Nokelainen, Petri; Ruohotie, Pekka; Tirri, Henry

    Bayesian and classical approaches to classification of vocational data were compared using an educational data set from a longitudinal study of professional growth and development in organizations (P. Ruohotie et al., 1994). Data were from 2,430 workers in companies in Finland who completed a questionnaire with behavior and background statements.…

  7. Pedestrian fatality and natural light: Evidence from South Africa using a Bayesian approach

    CSIR Research Space (South Africa)

    Das, Sonali

    2014-02-01

    Full Text Available In this paper we use a Bayesian approach to investigate the relationship between pedestrian fatality records from Tshwane and time of fatality. Time of fatality is used as a proxy to reflect the presence of effective lighting, not precluding...

  8. Inventory control of spare parts using a Bayesian approach: a case study

    NARCIS (Netherlands)

    K-P. Aronis; I. Magou (Ioulia); R. Dekker (Rommert); G. Tagaras (George)

    1999-01-01

    textabstractThis paper presents a case study of applying a Bayesian approach to forecast demand and subsequently determine the appropriate parameter S of an (S-1,S) inventory system for controlling spare parts of electronic equipment. First, the problem and the current policy are described. Then,

  9. A Bayesian Approach for Nonlinear Structural Equation Models with Dichotomous Variables Using Logit and Probit Links

    Science.gov (United States)

    Lee, Sik-Yum; Song, Xin-Yuan; Cai, Jing-Heng

    2010-01-01

    Analysis of ordered binary and unordered binary data has received considerable attention in social and psychological research. This article introduces a Bayesian approach, which has several nice features in practical applications, for analyzing nonlinear structural equation models with dichotomous data. We demonstrate how to use the software…

  10. A Bayesian Approach to Period Searching in Solar Coronal Loops

    Science.gov (United States)

    Scherrer, Bryan; McKenzie, David

    2017-03-01

    We have applied a Bayesian generalized Lomb-Scargle period searching algorithm to movies of coronal loop images obtained with the Hinode X-ray Telescope (XRT) to search for evidence of periodicities that would indicate resonant heating of the loops. The algorithm makes as its only assumption that there is a single sinusoidal signal within each light curve of the data. Both the amplitudes and noise are taken as free parameters. It is argued that this procedure should be used alongside Fourier and wavelet analyses to more accurately extract periodic intensity modulations in coronal loops. The data analyzed are from XRT Observation Program #129C: “MHD Wave Heating (Thin Filters),” which occurred during 2006 November 13 and focused on active region 10293, which included coronal loops. The first data set spans approximately 10 min with an average cadence of 2 s, 2″ per pixel resolution, and used the Al-mesh analysis filter. The second data set spans approximately 4 min with a 3 s average cadence, 1″ per pixel resolution, and used the Al-poly analysis filter. The final data set spans approximately 22 min at a 6 s average cadence, and used the Al-poly analysis filter. In total, 55 periods of sinusoidal coronal loop oscillations between 5.5 and 59.6 s are discussed, supporting proposals in the literature that resonant absorption of magnetic waves is a viable mechanism for depositing energy in the corona.
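
    As a rough stand-in for the Bayesian generalized Lomb-Scargle search described above, the sketch below runs astropy's standard (non-Bayesian) Lomb-Scargle periodogram on a synthetic, irregularly sampled light curve; none of the values refer to the Hinode/XRT data:

        # Standard Lomb-Scargle period search on a synthetic light curve (astropy).
        import numpy as np
        from astropy.timeseries import LombScargle

        rng = np.random.default_rng(7)
        t = np.sort(rng.uniform(0, 600, 300))                  # ~10 min of irregular samples (s)
        true_period = 40.0                                     # seconds
        y = 1.0 + 0.05 * np.sin(2 * np.pi * t / true_period) + rng.normal(0, 0.02, t.size)

        ls = LombScargle(t, y)
        frequency, power = ls.autopower(minimum_frequency=1 / 120, maximum_frequency=1 / 5)
        best_period = 1.0 / frequency[np.argmax(power)]

        print(f"best period: {best_period:.1f} s")
        print("false alarm probability:", ls.false_alarm_probability(power.max()))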

  11. A multiscale Bayesian data integration approach for mapping air dose rates around the Fukushima Daiichi Nuclear Power Plant.

    Science.gov (United States)

    Wainwright, Haruko M; Seki, Akiyuki; Chen, Jinsong; Saito, Kimiaki

    2017-02-01

    This paper presents a multiscale data integration method to estimate the spatial distribution of air dose rates in the regional scale around the Fukushima Daiichi Nuclear Power Plant. We integrate various types of datasets, such as ground-based walk and car surveys, and airborne surveys, all of which have different scales, resolutions, spatial coverage, and accuracy. This method is based on geostatistics to represent spatial heterogeneous structures, and also on Bayesian hierarchical models to integrate multiscale, multi-type datasets in a consistent manner. The Bayesian method allows us to quantify the uncertainty in the estimates, and to provide the confidence intervals that are critical for robust decision-making. Although this approach is primarily data-driven, it has great flexibility to include mechanistic models for representing radiation transport or other complex correlations. We demonstrate our approach using three types of datasets collected at the same time over Fukushima City in Japan: (1) coarse-resolution airborne surveys covering the entire area, (2) car surveys along major roads, and (3) walk surveys in multiple neighborhoods. Results show that the method can successfully integrate three types of datasets and create an integrated map (including the confidence intervals) of air dose rates over the domain in high resolution. Moreover, this study provides us with various insights into the characteristics of each dataset, as well as radiocaesium distribution. In particular, the urban areas show high heterogeneity in the contaminant distribution due to human activities as well as large discrepancy among different surveys due to such heterogeneity. Copyright © 2016 Elsevier Ltd. All rights reserved.
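
    A toy version of the data-fusion idea, stripped of the geostatistical (spatially correlated) structure: for a single grid cell, a coarse airborne estimate acts as a normal prior that is updated with more accurate ground-based observations by precision weighting. All values are invented:

        # Conjugate normal updating of a dose-rate estimate for one grid cell.
        import numpy as np

        def gaussian_update(prior_mean, prior_sd, obs, obs_sd):
            """Combine a normal prior with independent normal observations."""
            obs, obs_sd = np.asarray(obs), np.asarray(obs_sd)
            prec = 1.0 / prior_sd**2 + np.sum(1.0 / obs_sd**2)
            mean = (prior_mean / prior_sd**2 + np.sum(obs / obs_sd**2)) / prec
            return mean, np.sqrt(1.0 / prec)

        # coarse airborne estimate as the prior, refined by car and walk surveys
        mean, sd = gaussian_update(prior_mean=1.2, prior_sd=0.4,          # airborne (uSv/h)
                                   obs=[0.9, 0.85, 0.95], obs_sd=[0.15, 0.15, 0.05])
        print(f"posterior dose rate: {mean:.2f} +/- {sd:.2f} uSv/h (1-sigma)")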

  12. A hierarchical bayesian analysis of parasite prevalence and sociocultural outcomes: The role of structural racism and sanitation infrastructure.

    Science.gov (United States)

    Ross, Cody T; Winterhalder, Bruce

    2016-01-01

    We conduct a re-evaluation of the Thornhill and Fincher research project on parasites using finely-resolved geographic data on parasite prevalence, individual-level sociocultural data, and multilevel Bayesian modeling. In contrast to the evolutionary psychological mechanisms linking parasites to human behavior and cultural characteristics proposed by Thornhill and Fincher, we offer an alternative hypothesis that structural racism and differential access to sanitation systems drive both variation in parasite prevalence and differential behaviors and cultural characteristics. We adopt a Bayesian framework to estimate parasite prevalence rates in 51 districts in eight Latin American countries using the disease status of 170,220 individuals tested for infection with the intestinal roundworm Ascaris lumbricoides (Hürlimann et al., PLoS Negl Trop Dis 5:e1404). We then use district-level estimates of parasite prevalence and individual-level social data from 5,558 individuals in the same 51 districts (Latinobarómetro, 2008) to assess claims of causal associations between parasite prevalence and sociocultural characteristics. We find, contrary to Thornhill and Fincher, that parasite prevalence is positively associated with preferences for democracy, negatively associated with preferences for collectivism, and not associated with violent crime rates or gender inequality. A positive association between parasite prevalence and religiosity, as in Fincher and Thornhill (Behav Brain Sci 35:61-79), and a negative association between parasite prevalence and achieved education, as predicted by Eppig et al. (Proc R Soc B: Biol Sci 277:3801-3808), become negative and unreliable when reasonable controls are included in the model. We find support for all predictions derived from our hypothesis linking structural racism to both parasite prevalence and cultural outcomes. We conclude that best practices in biocultural modeling require examining more than one hypothesis, retaining

  13. A Bayesian approach to gene-gene and gene-environment interactions in chronic fatigue syndrome.

    Science.gov (United States)

    Lin, Eugene; Hsu, Sen-Yen

    2009-01-01

    In the study of genomics, it is essential to address gene-gene and gene-environment interactions for describing the complex traits that involves disease-related mechanisms. In this work, our goal is to detect gene-gene and gene-environment interactions resulting from the analysis of chronic fatigue syndrome patients' genetic and demographic factors including SNPs, age, gender and BMI. We employed the dataset that was original to the previous study by the Centers for Disease Control and Prevention Chronic Fatigue Syndrome Research Group. To investigate gene-gene and gene-environment interactions, we implemented a Bayesian based method for identifying significant interactions between factors. Here, we employed a two-stage Bayesian variable selection methodology based on Markov Chain Monte Carlo approaches. By applying our Bayesian based approach, NR3C1 was found in the significant two-locus gene-gene effect model, as well as in the significant two-factor gene-environment effect model. Furthermore, a significant gene-environment interaction was identified between NR3C1 and gender. These results support the hypothesis that NR3C1 and gender may play a role in biological mechanisms associated with chronic fatigue syndrome. We demonstrated that our Bayesian based approach is a promising method to assess the gene-gene and gene-environment interactions in chronic fatigue syndrome patients by using genetic factors, such as SNPs, and demographic factors such as age, gender and BMI.

  14. A Bayesian Network approach for flash flood risk assessment

    Science.gov (United States)

    Boutkhamouine, Brahim; Roux, Hélène; Pérès, François

    2017-04-01

    Climate change is contributing to the increase of natural disasters such as extreme weather events. Sometimes, these events lead to sudden flash floods causing devastating effects on life and property. Most recently, many regions of the French Mediterranean perimeter have endured such catastrophic flood events; Var (October 2015), Ardèche (November 2014), Nîmes (October 2014), Hérault, Gard and Languedoc (September 2014), and Pyrenees mountains (Jun 2013). Altogether, it resulted in dozens of victims and property damages amounting to millions of euros. With this heavy loss in mind, development of hydrological forecasting and warning systems is becoming an essential element in regional and national strategies. Flash flood forecasting but also monitoring is a difficult task because small ungauged catchments ( 10 km2) are often the most destructive ones as for the extreme flash flood event of September 2002 in the Cévennes region (France) (Ruin et al., 2008). The problem of measurement/prediction uncertainty is particularly crucial when attempting to develop operational flash-flood forecasting methods. Taking into account the uncertainty related to the model structure itself, to the model parametrization or to the model forcing (spatio-temporal rainfall, initial conditions) is crucial in hydrological modelling. Quantifying these uncertainties is of primary importance for risk assessment and decision making. Although significant improvements have been made in computational power and distributed hydrologic modelling, the issue dealing with integration of uncertainties into flood forecasting remains up-to-date and challenging. In order to develop a framework which could handle these uncertainties and explain their propagation through the model, we propose to explore the potential of graphical models (GMs) and, more precisely, Bayesian Networks (BNs). These networks are Directed Acyclic Graphs (DAGs) in which knowledge of a certain phenomenon is represented by

  15. A Hierarchical Approach to Persistent Scatterer Network Construction and Deformation Time Series Estimation

    Directory of Open Access Journals (Sweden)

    Rui Zhang

    2014-12-01

    Full Text Available This paper presents a hierarchical approach to network construction and time series estimation in persistent scatterer interferometry (PSI) for deformation analysis using the time series of high-resolution satellite SAR images. To balance between computational efficiency and solution accuracy, a divide-and-conquer algorithm (i.e., two levels of PS networking and solution) is proposed for extracting deformation rates of a study area. The algorithm has been tested using 40 high-resolution TerraSAR-X images collected between 2009 and 2010 over Tianjin in China for subsidence analysis, and validated by using the ground-based leveling measurements. The experimental results indicate that the hierarchical approach can remarkably reduce computing time and memory requirements, and the subsidence measurements derived from the hierarchical solution are in good agreement with the leveling data.

  16. A fuzzy Bayesian approach to flood frequency estimation with imprecise historical information

    OpenAIRE

    Salinas, José Luis; Kiss, Andrea; Viglione, Alberto; Viertl, Reinhard; Blöschl, Günter

    2016-01-01

    Abstract This paper presents a novel framework that links imprecision (through a fuzzy approach) and stochastic uncertainty (through a Bayesian approach) in estimating flood probabilities from historical flood information and systematic flood discharge data. The method exploits the linguistic characteristics of historical source material to construct membership functions, which may be wider or narrower, depending on the vagueness of the statements. The membership functions are either included...

  17. Quantitative Precipitation Estimation over Ocean Using Bayesian Approach from Microwave Observations during the Typhoon Season

    Directory of Open Access Journals (Sweden)

    Jen-Chi Hu

    2009-01-01

    Full Text Available We have developed a new Bayesian approach to retrieve oceanic rain rate from the Tropical Rainfall Measuring Mission (TRMM) Microwave Imager (TMI), with an emphasis on typhoon cases in the West Pacific. Retrieved rain rates are validated with measurements of rain gauges located on Japanese islands. To demonstrate improvement, retrievals are also compared with those from the TRMM/Precipitation Radar (PR), the Goddard Profiling Algorithm (GPROF), and a multi-channel linear regression statistical method (MLRS). We have found that qualitatively, all methods retrieved similar horizontal distributions in terms of locations of eyes and rain bands of typhoons. Quantitatively, our new Bayesian retrievals have the best linearity and the smallest root mean square (RMS) error against rain gauge data for 16 typhoon overpasses in 2004. The correlation coefficient and RMS of our retrievals are 0.95 and ~2 mm hr-1, respectively. In particular, at heavy rain rates, our Bayesian retrievals outperform those retrieved from GPROF and MLRS. Overall, the new Bayesian approach accurately retrieves surface rain rate for typhoon cases. Accurate rain rate estimates from this method can be assimilated in models to improve forecasts and prevent potential damage in Taiwan during typhoon seasons.
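
    The Bayesian retrieval idea can be sketched as a database-weighting scheme: the posterior-mean rain rate is an average over candidate database profiles, weighted by the Gaussian likelihood of the observed brightness temperatures. The database, observation, and noise level below are invented:

        # Posterior-mean rain rate as a likelihood-weighted database average (toy data).
        import numpy as np

        rng = np.random.default_rng(5)
        db_tb = rng.normal(250.0, 15.0, (1000, 4))       # database brightness temps, 4 channels (K)
        db_rain = np.abs(rng.normal(3.0, 4.0, 1000))     # matching surface rain rates (mm/h)
        obs_tb = db_tb[10] + rng.normal(0, 1.0, 4)       # an "observed" pixel
        sigma = 2.0                                      # assumed channel noise (K)

        log_w = -0.5 * np.sum((db_tb - obs_tb) ** 2, axis=1) / sigma**2
        w = np.exp(log_w - log_w.max())                  # stabilized likelihood weights
        rain_estimate = np.sum(w * db_rain) / np.sum(w)  # posterior mean under a flat prior on entries

        print(f"retrieved rain rate: {rain_estimate:.2f} mm/h (database entry 10: {db_rain[10]:.2f})")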

  18. Bayesian Selection for the ℓ_2-Potts Model Regularization Parameter: 1-D Piecewise Constant Signal Denoising

    Science.gov (United States)

    Frecon, Jordan; Pustelnik, Nelly; Dobigeon, Nicolas; Wendt, Herwig; Abry, Patrice

    2017-10-01

    Piecewise constant denoising can be solved either by deterministic optimization approaches, based on the Potts model, or by stochastic Bayesian procedures. The former lead to low computational time but require the selection of a regularization parameter, whose value significantly impacts the achieved solution, and whose automated selection remains an involved and challenging problem. Conversely, fully Bayesian formalisms encapsulate the regularization parameter selection into hierarchical models, at the price of high computational costs. This contribution proposes an operational strategy that combines hierarchical Bayesian and Potts model formulations, with the double aim of automatically tuning the regularization parameter and of maintaining computational efficiency. The proposed procedure relies on formally connecting a Bayesian framework to an ℓ2-Potts functional. Behaviors and performance for the proposed piecewise constant denoising and regularization parameter tuning techniques are studied qualitatively and assessed quantitatively, and shown to compare favorably against those of a fully Bayesian hierarchical procedure, both in accuracy and in computational load.
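
    For reference, the ℓ2-Potts functional itself can be minimized exactly in 1-D by dynamic programming; the sketch below does this for a fixed, hand-picked regularization parameter gamma, which is precisely the quantity the paper's hierarchical Bayesian strategy tunes automatically:

        # Exact dynamic-programming solver for the 1-D L2-Potts functional
        #   sum_i (x_i - y_i)^2 + gamma * (number of jumps in x),
        # with gamma hard-coded rather than tuned.
        import numpy as np

        def potts_denoise(y, gamma):
            n = len(y)
            cs = np.concatenate(([0.0], np.cumsum(y)))
            cs2 = np.concatenate(([0.0], np.cumsum(y ** 2)))

            def seg_cost(l, r):  # squared error of the best constant on y[l:r+1]
                s, s2, m = cs[r + 1] - cs[l], cs2[r + 1] - cs2[l], r - l + 1
                return s2 - s * s / m

            best = np.full(n + 1, np.inf)
            best[0] = -gamma                      # so each segment contributes one gamma
            last_jump = np.zeros(n + 1, dtype=int)
            for r in range(1, n + 1):
                for l in range(1, r + 1):
                    c = best[l - 1] + gamma + seg_cost(l - 1, r - 1)
                    if c < best[r]:
                        best[r], last_jump[r] = c, l - 1
            x, r = np.empty(n), n
            while r > 0:                          # backtrack and fill in segment means
                l = last_jump[r]
                x[l:r] = y[l:r].mean()
                r = l
            return x

        rng = np.random.default_rng(2)
        signal = np.repeat([0.0, 2.0, 1.0], 50)
        noisy = signal + rng.normal(0, 0.3, signal.size)
        print(np.round(np.unique(potts_denoise(noisy, gamma=2.0)), 2))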

  19. A Bayesian network approach to the database search problem in criminal proceedings

    Science.gov (United States)

    2012-01-01

    Background The ‘database search problem’, that is, the strengthening of a case - in terms of probative value - against an individual who is found as a result of a database search, has been approached during the last two decades with substantial mathematical analyses, accompanied by lively debate and centrally opposing conclusions. This represents a challenging obstacle in teaching but also hinders a balanced and coherent discussion of the topic within the wider scientific and legal community. This paper revisits and tracks the associated mathematical analyses in terms of Bayesian networks. Their derivation and discussion for capturing probabilistic arguments that explain the database search problem are outlined in detail. The resulting Bayesian networks offer a distinct view on the main debated issues, along with further clarity. Methods As a general framework for representing and analyzing formal arguments in probabilistic reasoning about uncertain target propositions (that is, whether or not a given individual is the source of a crime stain), this paper relies on graphical probability models, in particular, Bayesian networks. This graphical probability modeling approach is used to capture, within a single model, a series of key variables, such as the number of individuals in a database, the size of the population of potential crime stain sources, and the rarity of the corresponding analytical characteristics in a relevant population. Results This paper demonstrates the feasibility of deriving Bayesian network structures for analyzing, representing, and tracking the database search problem. The output of the proposed models can be shown to agree with existing but exclusively formulaic approaches. Conclusions The proposed Bayesian networks allow one to capture and analyze the currently most well-supported but reputedly counter-intuitive and difficult solution to the database search problem in a way that goes beyond the traditional, purely formulaic expressions

  20. The association between smoking or passive smoking and cardiovascular diseases using a Bayesian hierarchical model: based on the 2008-2013 Korea Community Health Survey.

    Science.gov (United States)

    Lee, Whanhee; Hwang, Sung-Hee; Choi, Hayoung; Kim, Ho

    2017-01-01

    Smoking and passive smoking have been extensively reported as risk factors of cardiovascular morbidity and mortality. Despite the biological mechanisms underlying the impact of hazardous chemical substances contained in tobacco in cardiovascular diseases (CVD), studies investigating the association between smoking and passive smoking with morbidity are at an inchoate stage in Korea. Therefore, this study aimed to estimate the risks of smoking and passive smoking on cardiovascular morbidity at the national and regional levels. This study calculated sex-standardized and age-standardized prevalence of CVD and smoking indices in 253 community health centers (si/gun/gu) in Korea using the 2008-2013 Korea Community Health Survey data. Furthermore, a Bayesian hierarchical model was used to estimate the association of smoking and passive smoking with the prevalence of CVD from the national and regional community health centers. At the national level, smoking was significantly associated with stroke (relative risk [RR], 1.060) and hypertension (RR, 1.016) prevalence, whilst passive smoking at home and work were also significantly associated with prevalence of stroke (RR, 1.037/1.013), angina (RR, 1.016/1.006), and hypertension (RR, 1.010/1.004). Furthermore, the effects of smoking and passive smoking were greater in urban-industrial areas than in rural areas. The findings of this study would provide grounds for national policies that limit smoking and passive smoking, as well as regionally serve as the basis for region-specific healthcare policies in populations with high CVD vulnerability.

  1. Using hierarchical Bayesian multi-species mixture models to estimate tandem hoop-net based habitat associations and detection probabilities of fishes in reservoirs

    Science.gov (United States)

    Stewart, David R.; Long, James M.

    2015-01-01

    Species distribution models are useful tools to evaluate habitat relationships of fishes. We used hierarchical Bayesian multispecies mixture models to evaluate the relationships of both detection and abundance with habitat of reservoir fishes caught using tandem hoop nets. A total of 7,212 fish from 12 species were captured, and the majority of the catch was composed of Channel Catfish Ictalurus punctatus (46%), Bluegill Lepomis macrochirus (25%), and White Crappie Pomoxis annularis (14%). Detection estimates ranged from 8% to 69%, and modeling results suggested that fishes were primarily influenced by reservoir size and context, water clarity and temperature, and land-use types. Species were differentially abundant within and among habitat types, and some fishes were found to be more abundant in turbid, less impacted (e.g., by urbanization and agriculture) reservoirs with longer shoreline lengths, whereas other species were found more often in clear, nutrient-rich impoundments that had generally shorter shoreline length and were surrounded by a higher percentage of agricultural land. Our results demonstrated that habitat and reservoir characteristics may differentially benefit species and assemblage structure. This study provides a useful framework for evaluating capture efficiency for not only hoop nets but also other gear types used to sample fishes in reservoirs.

  2. A Bayesian hierarchical mixture model for platelet derived growth factor receptor phosphorylation to improve estimation of progression-free survival in prostate cancer

    Science.gov (United States)

    Morita, Satoshi; Thall, Peter F.; Bekele, B. Nebiyou; Mathew, Paul

    2010-01-01

    Advances in understanding the biological underpinnings of many cancers have led increasingly to the use of molecularly targeted anti-cancer therapies. Because the platelet-derived growth factor receptor (PDGFR) has been implicated in the progression of prostate cancer bone metastases, it is of great interest to examine possible relationships between PDGFR inhibition and therapeutic outcomes. Here, we analyze the association between change in activated PDGFR (p-PDGFR) and progression-free survival (PFS) time based on large within-patient samples of cell-specific p-PDGFR values taken before and after treatment from each of 88 prostate cancer patients. To utilize these paired samples as covariate data in a regression model for PFS time, and because the p-PDGFR distributions are bimodal, we first employ a Bayesian hierarchical mixture model to obtain a deconvolution of the pre-treatment and post-treatment within-patient p-PDGFR distributions. We evaluate fits of the mixture model and a non-mixture model that ignores the bimodality by using a supnorm metric to compare the empirical distribution of each p-PDGFR data set with the corresponding fitted distribution under each model. Our results show that first using the mixture model to account for the bimodality of the within-patient p-PDGFR distributions, and then using the posterior within-patient component mean changes in p-PDGFR so obtained as covariates in the regression model for PFS time, provides improved estimation. PMID:20390057

  3. A new method for E-government procurement using collaborative filtering and Bayesian approach.

    Science.gov (United States)

    Zhang, Shuai; Xi, Chengyu; Wang, Yan; Zhang, Wenyu; Chen, Yanhong

    2013-01-01

    Nowadays, as the Internet services increase faster than ever before, government systems are reinvented as E-government services. Therefore, government procurement sectors have to face challenges brought by the explosion of service information. This paper presents a novel method for E-government procurement (eGP) to search for the optimal procurement scheme (OPS). Item-based collaborative filtering and Bayesian approach are used to evaluate and select the candidate services to get the top-M recommendations such that the involved computation load can be alleviated. A trapezoidal fuzzy number similarity algorithm is applied to support the item-based collaborative filtering and Bayesian approach, since some of the services' attributes can hardly be expressed as certain, static values and are more easily represented as fuzzy values. A prototype system is built and validated with an illustrative example from eGP to confirm the feasibility of our approach.
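
    The abstract does not reproduce the paper's exact similarity formula, so the sketch below is only a plausible illustration of the two ingredients it names: a trapezoidal fuzzy number (a, b, c, d) representing an imprecise service attribute, a similarity score derived from the distance between the defining points, and an item-based collaborative-filtering prediction weighted by those similarities. All function names, the distance-based formula, and the toy data are assumptions.

```python
import numpy as np

def trapezoid_similarity(x, y):
    """Similarity of two trapezoidal fuzzy numbers x = (a, b, c, d) and y.
    Assumed formula: 1 minus the mean absolute distance between the four
    defining points, clipped to [0, 1]; attributes are taken to be scaled to [0, 1]."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    return float(max(0.0, 1.0 - np.mean(np.abs(x - y))))

def item_based_score(ratings, candidate, rated_items, fuzzy_attrs):
    """Predict a preference score for `candidate` from already-rated items,
    weighting neighbour ratings by fuzzy-attribute similarity."""
    sims = np.array([trapezoid_similarity(fuzzy_attrs[candidate], fuzzy_attrs[i])
                     for i in rated_items])
    r = np.array([ratings[i] for i in rated_items], float)
    return float(np.dot(sims, r) / sims.sum()) if sims.sum() > 0 else float("nan")

# Toy example: three procurement services, one fuzzy attribute each (invented values).
fuzzy_attrs = {"s1": (0.20, 0.30, 0.40, 0.50),
               "s2": (0.25, 0.35, 0.45, 0.55),
               "s3": (0.70, 0.80, 0.90, 1.00)}
ratings = {"s1": 4.0, "s3": 2.0}
print(item_based_score(ratings, "s2", ["s1", "s3"], fuzzy_attrs))
```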

  4. A New Method for E-Government Procurement Using Collaborative Filtering and Bayesian Approach

    Directory of Open Access Journals (Sweden)

    Shuai Zhang

    2013-01-01

    Full Text Available Nowadays, as the Internet services increase faster than ever before, government systems are reinvented as E-government services. Therefore, government procurement sectors have to face challenges brought by the explosion of service information. This paper presents a novel method for E-government procurement (eGP) to search for the optimal procurement scheme (OPS). Item-based collaborative filtering and Bayesian approach are used to evaluate and select the candidate services to get the top-M recommendations such that the involved computation load can be alleviated. A trapezoidal fuzzy number similarity algorithm is applied to support the item-based collaborative filtering and Bayesian approach, since some of the services’ attributes can hardly be expressed as certain, static values and are more easily represented as fuzzy values. A prototype system is built and validated with an illustrative example from eGP to confirm the feasibility of our approach.

  5. Design of multimodal transport networks : A hierarchical approach

    NARCIS (Netherlands)

    Van Nes, R.

    2002-01-01

    Multimodal transport, that is, using two or more transport modes for a trip with a necessary transfer between them, seems an interesting approach to solving today's transportation problems with respect to the deteriorating accessibility of city centres, recurrent congestion, and environmental impact.

  6. The Hierarchical Database Decomposition Approach to Database Concurrency Control.

    Science.gov (United States)

    1984-12-01

    In this approach, we postulate a model of transaction behavior under two-phase locking, as shown in Figure 39(a), and a model of that under multiversion timestamping: under two-phase locking, a transaction whose request cannot be granted is put in the block queue until it is reactivated, whereas under multiversion timestamping the request is always granted.

  7. Hierarchical brain networks active in approach and avoidance goal pursuit

    Directory of Open Access Journals (Sweden)

    Jeffrey Martin Spielberg

    2013-06-01

    Full Text Available Effective approach/avoidance goal pursuit is critical for attaining long-term health and well-being. Research on the neural correlates of key goal pursuit processes (e.g., motivation) has long been of interest, with lateralization in prefrontal cortex being a particularly fruitful target of investigation. However, this literature has often been limited by a lack of spatial specificity and has not delineated the precise aspects of approach/avoidance motivation involved. Additionally, the relationships among brain regions (i.e., network connectivity) vital to goal pursuit remain largely unexplored. Specificity in location, process, and network relationship is vital for moving beyond gross characterizations of function and identifying the precise cortical mechanisms involved in motivation. The present paper integrates research using more spatially specific methodologies (e.g., functional magnetic resonance imaging) with the rich psychological literature on approach/avoidance to propose an integrative network model that takes advantage of the strengths of each of these literatures.

  8. A Hierarchical Approach to Optimizing Bus Stop Distribution in Large and Fast Developing Cities

    Directory of Open Access Journals (Sweden)

    Zhengdong Huang

    2014-04-01

    Full Text Available Public transit plays a key role in shaping the transportation structure of large and fast growing cities. To cope with high population and employment density, such cities usually resort to multi-modal transit services, such as rail, BRT and bus. These modes are strategically connected to form an effective transit network. Among the transit modes, bus stops need to be properly deployed to maintain an acceptable walking accessibility. This paper presents a hierarchical process for optimizing bus stop locations in the context of fast growing multi-modal transit services. Three types of bus stops are identified hierarchically, which include connection stops, key stops and ordinary stops. Connection stops are generated manually to connect with other transit facilities. Key stops and ordinary stops are optimized with coverage models that are respectively weighted by network centrality measure and potential demand. A case study in a Chinese city suggests the hierarchical approach may generate a more effective stop distribution.

  9. Bayesian approach for three-dimensional aquifer characterization at the Hanford 300 Area

    Science.gov (United States)

    Murakami, H.; Chen, X.; Hahn, M. S.; Liu, Y.; Rockhold, M. L.; Vermeul, V. R.; Zachara, J. M.; Rubin, Y.

    2010-10-01

    This study presents a stochastic, three-dimensional characterization of a heterogeneous hydraulic conductivity field within the Hanford 300 Area, Washington, USA, by assimilating large-scale, constant-rate injection test data with small-scale, three-dimensional electromagnetic borehole flowmeter (EBF) measurement data. We first inverted the injection test data to estimate the transmissivity field, using zeroth-order temporal moments of pressure buildup curves. We applied a newly developed Bayesian geostatistical inversion framework, the method of anchored distributions (MAD), to obtain a joint posterior distribution of geostatistical parameters and local log-transmissivities at multiple locations. The unique aspects of MAD that make it suitable for this purpose are its ability to integrate multi-scale, multi-type data within a Bayesian framework and to compute a nonparametric posterior distribution. After we combined the distribution of transmissivities with depth-discrete relative-conductivity profile from the EBF data, we inferred the three-dimensional geostatistical parameters of the log-conductivity field, using the Bayesian model-based geostatistics. Such consistent use of the Bayesian approach throughout the procedure enabled us to systematically incorporate data uncertainty into the final posterior distribution. The method was tested in a synthetic study and validated using the actual data that was not part of the estimation. Results showed broader and skewed posterior distributions of geostatistical parameters except for the mean, which suggests the importance of inferring the entire distribution to quantify the parameter uncertainty.

  10. Bayesian approach for three-dimensional aquifer characterization at the Hanford 300 Area

    Directory of Open Access Journals (Sweden)

    H. Murakami

    2010-10-01

    Full Text Available This study presents a stochastic, three-dimensional characterization of a heterogeneous hydraulic conductivity field within the Hanford 300 Area, Washington, USA, by assimilating large-scale, constant-rate injection test data with small-scale, three-dimensional electromagnetic borehole flowmeter (EBF) measurement data. We first inverted the injection test data to estimate the transmissivity field, using zeroth-order temporal moments of pressure buildup curves. We applied a newly developed Bayesian geostatistical inversion framework, the method of anchored distributions (MAD), to obtain a joint posterior distribution of geostatistical parameters and local log-transmissivities at multiple locations. The unique aspects of MAD that make it suitable for this purpose are its ability to integrate multi-scale, multi-type data within a Bayesian framework and to compute a nonparametric posterior distribution. After we combined the distribution of transmissivities with depth-discrete relative-conductivity profile from the EBF data, we inferred the three-dimensional geostatistical parameters of the log-conductivity field, using the Bayesian model-based geostatistics. Such consistent use of the Bayesian approach throughout the procedure enabled us to systematically incorporate data uncertainty into the final posterior distribution. The method was tested in a synthetic study and validated using the actual data that was not part of the estimation. Results showed broader and skewed posterior distributions of geostatistical parameters except for the mean, which suggests the importance of inferring the entire distribution to quantify the parameter uncertainty.

  11. An informative prior probability distribution of the gompertz parameters for bayesian approaches in paleodemography.

    Science.gov (United States)

    Sasaki, Tomohiko; Kondo, Osamu

    2016-03-01

    In paleodemography, the Bayesian approach has been suggested to provide an effective means by which mortality profiles of past populations can be adequately estimated, and thus avoid problems of "age-mimicry" inherent in conventional approaches. In this study, we propose an application of the Gompertz model using an "informative" prior probability distribution by revising a recent example of the Bayesian approach based on an "uninformative" distribution. Life-table data of 134 human populations including those of contemporary hunter-gatherers were used to determine the Gompertz parameters of each population. In each population, we used both raw life-table data and the Gompertz parameters to calculate some demographic values such as the mean life-span, to confirm representativeness of the model. Then, the correlation between the two Gompertz parameters (the Strehler-Mildvan correlation) was re-established. We incorporated the correlation into the Bayesian approach as an "informative" prior probability distribution, and tested its effectiveness using simulated data. Our analyses showed that the mean life-span (≥ age 15) and the proportion of living persons aged over 45 were well-reproduced by the Gompertz model. The simulation showed that using the correlation as an informative prior provides a narrower estimation range in the Bayesian approach than does the uninformative prior. The Gompertz model can be assumed to accurately estimate the mean life-span and/or the proportion of old people in a population. We suggest that the Strehler-Mildvan correlation can be used as a useful constraint in demographic reconstructions of past human populations. © 2015 Wiley Periodicals, Inc.
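
    As a rough illustration of the machinery (not the authors' code), the sketch below defines the adult Gompertz hazard h(x) = a·exp(b·(x − 15)) and evaluates, on a grid, a posterior for (ln a, b) in which the prior concentrates mass around a linear relation between ln(a) and b, standing in for the Strehler-Mildvan correlation. The simulated ages, the prior slope, intercept, and width are all assumptions.

```python
import numpy as np

def gompertz_loglik(ages, a, b, x0=15.0):
    """Log-likelihood of ages at death (all >= x0) under hazard h(x) = a*exp(b*(x - x0))."""
    t = ages - x0
    logf = np.log(a) + b * t - (a / b) * (np.exp(b * t) - 1.0)
    return logf.sum()

rng = np.random.default_rng(0)
ages = 15.0 + rng.gamma(shape=4.0, scale=8.0, size=120)   # simulated adult ages (assumption)

log_a = np.linspace(-6.5, -2.5, 80)    # grid over ln(a)
b_grid = np.linspace(0.02, 0.12, 80)   # grid over b

def log_prior(la, b, slope=-35.0, intercept=-2.0, sd=0.5):
    """Informative prior: ln(a) assumed to lie near a line in b
    (a stand-in for the Strehler-Mildvan correlation; numbers are invented)."""
    return -0.5 * ((la - (intercept + slope * b)) / sd) ** 2

post = np.empty((len(log_a), len(b_grid)))
for i, la in enumerate(log_a):
    for j, b in enumerate(b_grid):
        post[i, j] = gompertz_loglik(ages, np.exp(la), b) + log_prior(la, b)
post = np.exp(post - post.max())
post /= post.sum()

i, j = np.unravel_index(post.argmax(), post.shape)
print("posterior mode: a = %.4g, b = %.3f" % (np.exp(log_a[i]), b_grid[j]))
```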

  12. Optimal speech motor control and token-to-token variability: a Bayesian modeling approach.

    Science.gov (United States)

    Patri, Jean-François; Diard, Julien; Perrier, Pascal

    2015-12-01

    The remarkable capacity of the speech motor system to adapt to various speech conditions is due to an excess of degrees of freedom, which enables producing similar acoustical properties with different sets of control strategies. To explain how the central nervous system selects one of the possible strategies, a common approach, in line with optimal motor control theories, is to model speech motor planning as the solution of an optimality problem based on cost functions. Despite the success of this approach, one of its drawbacks is the intrinsic contradiction between the concept of optimality and the observed experimental intra-speaker token-to-token variability. The present paper proposes an alternative approach by formulating feedforward optimal control in a probabilistic Bayesian modeling framework. This is illustrated by controlling a biomechanical model of the vocal tract for speech production and by comparing it with an existing optimal control model (GEPPETO). The essential elements of this optimal control model are presented first. From them the Bayesian model is constructed in a progressive way. Performance of the Bayesian model is evaluated based on computer simulations and compared to the optimal control model. This approach is shown to be appropriate for solving the speech planning problem while accounting for variability in a principled way.

  13. A Bayesian Approach to Multistage Fitting of the Variation of the Skeletal Age Features

    Directory of Open Access Journals (Sweden)

    Dong Hua

    2009-01-01

    Full Text Available Accurate assessment of skeletal maturity is important clinically. Skeletal age assessment is usually based on features encoded in ossification centers. Therefore, it is critical to design a mechanism to capture as many characteristics of the features as possible. We have observed that given a feature, there exist stages of the skeletal age such that the variation pattern of the feature differs in these stages. Based on this observation, we propose a Bayesian cut fitting to describe features in response to the skeletal age. With our approach, appropriate positions for stage separation are determined automatically via Bayesian inference, and a model is used to fit the variation of a feature within each stage. Our experimental results show that the proposed method surpasses the traditional fitting using only one line or one curve not only in the efficiency and accuracy of fitting but also in global and local feature characterization.

  14. Peering through a dirty window: A Bayesian approach to making mine detection decisions from noisy data

    Energy Technology Data Exchange (ETDEWEB)

    Kercel, Stephen W.

    1998-10-11

    For several reasons, Bayesian parameter estimation is superior to other methods for extracting features of a weak signal from noise. Since it exploits prior knowledge, the analysis begins from a more advantageous starting point than other methods. Also, since "nuisance parameters" can be dropped out of the Bayesian analysis, the description of the model need not be as complete as is necessary for such methods as matched filtering. In the limit for perfectly random noise and a perfect description of the model, the signal-to-noise ratio improves as the square root of the number of samples in the data. Even with the imperfections of real-world data, the Bayesian approach comes closer to this ideal limit of performance than other methods. A major unsolved problem in landmine detection is the fusion of data from multiple sensor types. Bayesian data fusion is only beginning to be explored as a solution to the problem. In single-sensor processes, Bayesian analysis can sense multiple parameters from the data stream of the one sensor. It does so by computing a joint probability density function of a set of parameter values from the sensor output. However, there is no inherent requirement that the information must come from a single sensor. If multiple sensors are applied to a single process, where several different parameters are implicit in each sensor output data stream, the joint probability density function of all the parameters of interest can be computed in exactly the same manner as in the single-sensor case. Thus, it is just as practical to base decisions on multiple sensor outputs as it is for single sensors. This should provide a practical way to combine the outputs of dissimilar sensors, such as ground penetrating radar and electromagnetic induction devices, producing a better detection decision than could be provided by either sensor alone.
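
    A minimal sketch of the single idea stressed above, that a joint posterior over a parameter of interest can be computed from two dissimilar sensor streams exactly as from one, might look as follows. The Gaussian measurement models, noise levels, and the single shared parameter are assumptions chosen only to keep the example short.

```python
import numpy as np

# Unknown parameter of interest; posterior evaluated on a grid under a flat prior.
theta_grid = np.linspace(0.0, 2.0, 401)
log_post = np.zeros_like(theta_grid)

rng = np.random.default_rng(1)
true_theta = 1.2

# Sensor A: y = theta + noise.  Sensor B: y = 2*theta + noise.
# Both forward models and noise levels are invented for illustration.
y_a = true_theta + rng.normal(0.0, 0.3, size=20)
y_b = 2.0 * true_theta + rng.normal(0.0, 0.5, size=20)

def gauss_loglik(y, mean, sigma):
    return -0.5 * np.sum(((y - mean) / sigma) ** 2)

for k, th in enumerate(theta_grid):
    # The joint posterior simply adds the log-likelihoods of the two sensors.
    log_post[k] += gauss_loglik(y_a, th, 0.3) + gauss_loglik(y_b, 2.0 * th, 0.5)

post = np.exp(log_post - log_post.max())
post /= post.sum()
print("fused posterior mean:", float(np.sum(theta_grid * post)))
```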

  15. A fuzzy Bayesian approach to flood frequency estimation with imprecise historical information

    Science.gov (United States)

    Salinas, José Luis; Kiss, Andrea; Viglione, Alberto; Viertl, Reinhard; Blöschl, Günter

    2016-09-01

    This paper presents a novel framework that links imprecision (through a fuzzy approach) and stochastic uncertainty (through a Bayesian approach) in estimating flood probabilities from historical flood information and systematic flood discharge data. The method exploits the linguistic characteristics of historical source material to construct membership functions, which may be wider or narrower, depending on the vagueness of the statements. The membership functions are either included in the prior distribution or the likelihood function to obtain a fuzzy version of the flood frequency curve. The viability of the approach is demonstrated by three case studies that differ in terms of their hydromorphological conditions (from an Alpine river with bedrock profile to a flat lowland river with extensive flood plains) and historical source material (including narratives, town and county meeting protocols, flood marks and damage accounts). The case studies are presented in order of increasing fuzziness (the Rhine at Basel, Switzerland; the Werra at Meiningen, Germany; and the Tisza at Szeged, Hungary). Incorporating imprecise historical information is found to reduce the range between the 5% and 95% Bayesian credibility bounds of the 100 year floods by 45% and 61% for the Rhine and Werra case studies, respectively. The strengths and limitations of the framework are discussed relative to alternative (non-fuzzy) methods. The fuzzy Bayesian inference framework provides a flexible methodology that fits the imprecise nature of linguistic information on historical floods as available in historical written documentation.
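
    The following sketch is only a schematic of the general mechanism described above, not the authors' method: each imprecise historical flood is given a trapezoidal membership function over possible discharges, the likelihood of a candidate Gumbel flood-frequency curve is averaged against that membership (treated as an unnormalized weight), and systematic gauged data enter through the ordinary likelihood. The distributional choice, the grids, and all numbers are assumptions.

```python
import numpy as np

def gumbel_logpdf(x, mu, beta):
    z = (x - mu) / beta
    return -np.log(beta) - z - np.exp(-z)

rng = np.random.default_rng(2)
systematic = 300.0 + 80.0 * rng.gumbel(size=40)     # gauged annual maxima (made up)

# One imprecise historical flood: membership 1 on [800, 1000] m^3/s,
# falling linearly to 0 at 700 and 1200 (an invented trapezoid).
q = np.linspace(600.0, 1300.0, 200)
dq = q[1] - q[0]
member = np.clip(np.minimum((q - 700.0) / 100.0, (1200.0 - q) / 200.0), 0.0, 1.0)

mu_grid = np.linspace(200.0, 500.0, 60)
beta_grid = np.linspace(40.0, 200.0, 60)
logpost = np.full((60, 60), -np.inf)
for i, mu in enumerate(mu_grid):
    for j, beta in enumerate(beta_grid):
        ll = gumbel_logpdf(systematic, mu, beta).sum()
        # Historical event: integrate the Gumbel density against the membership function.
        hist = np.sum(member * np.exp(gumbel_logpdf(q, mu, beta))) * dq
        if hist > 0:
            logpost[i, j] = ll + np.log(hist)

post = np.exp(logpost - logpost.max())
post /= post.sum()
i, j = np.unravel_index(post.argmax(), post.shape)
print("posterior mode: mu = %.0f, beta = %.0f" % (mu_grid[i], beta_grid[j]))
```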

  16. A non-parametric Bayesian approach for clustering and tracking non-stationarities of neural spikes.

    Science.gov (United States)

    Shalchyan, Vahid; Farina, Dario

    2014-02-15

    Neural spikes from multiple neurons recorded in a multi-unit signal are usually separated by clustering. Drifts in the position of the recording electrode relative to the neurons over time cause gradual changes in the position and shapes of the clusters, challenging the clustering task. By dividing the data into short time intervals, Bayesian tracking of the clusters based on a Gaussian cluster model has been previously proposed. However, the Gaussian cluster model is often not verified for neural spikes. We present a Bayesian clustering approach that makes no assumptions on the distribution of the clusters and uses kernel-based density estimation of the clusters in every time interval as a prior for Bayesian classification of the data in the subsequent time interval. The proposed method was tested and compared to the Gaussian model-based approach for cluster tracking by using both simulated and experimental datasets. The results showed that the proposed non-parametric kernel-based density estimation of the clusters outperformed the sequential Gaussian model fitting in both simulated and experimental data tests. Using non-parametric kernel density-based clustering that makes no assumptions on the distribution of the clusters enhances the ability of tracking cluster non-stationarity over time with respect to the Gaussian cluster modeling approach. Copyright © 2013 Elsevier B.V. All rights reserved.
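
    One way to picture the core step, using a kernel density estimate of each cluster in interval t as the class-conditional model for classifying spikes in interval t+1, is sketched below with scikit-learn's KernelDensity. The two-dimensional features, the bandwidth, and the drift model are assumptions, not the paper's settings.

```python
import numpy as np
from sklearn.neighbors import KernelDensity

rng = np.random.default_rng(3)

def spikes(center, n):
    """Toy 2-D spike features scattered around a cluster centre."""
    return np.asarray(center) + rng.normal(0.0, 0.3, size=(n, 2))

# Interval t: two labelled clusters (e.g., from an initial clustering step).
t0 = {0: spikes([0.0, 0.0], 200), 1: spikes([2.0, 1.0], 150)}
# Interval t+1: both clusters have drifted slightly (electrode drift).
t1 = np.vstack([spikes([0.2, 0.1], 200), spikes([2.1, 1.3], 150)])
true_labels = np.array([0] * 200 + [1] * 150)

# Fit a non-parametric density to each cluster from interval t.
kdes = {c: KernelDensity(bandwidth=0.25).fit(x) for c, x in t0.items()}
prior = {c: len(x) / sum(len(v) for v in t0.values()) for c, x in t0.items()}

# Classify interval t+1 spikes by posterior proportional to prior * KDE density.
log_post = np.column_stack([np.log(prior[c]) + kdes[c].score_samples(t1) for c in (0, 1)])
labels = log_post.argmax(axis=1)
print("agreement with true labels: %.3f" % (labels == true_labels).mean())
```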

  17. A fuzzy Bayesian approach to flood frequency estimation with imprecise historical information

    Science.gov (United States)

    Kiss, Andrea; Viglione, Alberto; Viertl, Reinhard; Blöschl, Günter

    2016-01-01

    This paper presents a novel framework that links imprecision (through a fuzzy approach) and stochastic uncertainty (through a Bayesian approach) in estimating flood probabilities from historical flood information and systematic flood discharge data. The method exploits the linguistic characteristics of historical source material to construct membership functions, which may be wider or narrower, depending on the vagueness of the statements. The membership functions are either included in the prior distribution or the likelihood function to obtain a fuzzy version of the flood frequency curve. The viability of the approach is demonstrated by three case studies that differ in terms of their hydromorphological conditions (from an Alpine river with bedrock profile to a flat lowland river with extensive flood plains) and historical source material (including narratives, town and county meeting protocols, flood marks and damage accounts). The case studies are presented in order of increasing fuzziness (the Rhine at Basel, Switzerland; the Werra at Meiningen, Germany; and the Tisza at Szeged, Hungary). Incorporating imprecise historical information is found to reduce the range between the 5% and 95% Bayesian credibility bounds of the 100 year floods by 45% and 61% for the Rhine and Werra case studies, respectively. The strengths and limitations of the framework are discussed relative to alternative (non-fuzzy) methods. The fuzzy Bayesian inference framework provides a flexible methodology that fits the imprecise nature of linguistic information on historical floods as available in historical written documentation. PMID:27840456

  18. A Bayesian approach to estimating variance components within a multivariate generalizability theory framework.

    Science.gov (United States)

    Jiang, Zhehan; Skorupski, William

    2017-12-12

    In many behavioral research areas, multivariate generalizability theory (mG theory) has been typically used to investigate the reliability of certain multidimensional assessments. However, traditional mG-theory estimation, namely using frequentist approaches, has limits, leading researchers to fail to take full advantage of the information that mG theory can offer regarding the reliability of measurements. Alternatively, Bayesian methods provide more information than frequentist approaches can offer. This article presents instructional guidelines on how to implement mG-theory analyses in a Bayesian framework; in particular, BUGS code is presented to fit commonly seen designs from mG theory, including single-facet designs, two-facet crossed designs, and two-facet nested designs. In addition to concrete examples that are closely related to the selected designs and the corresponding BUGS code, a simulated dataset is provided to demonstrate the utility and advantages of the Bayesian approach. This article is intended to serve as a tutorial reference for applied researchers and methodologists conducting mG-theory studies.
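
    The article centres on BUGS code, which is not reproduced in the abstract. Purely as a stand-in, the sketch below hand-codes a small Gibbs sampler for the simplest single-facet (persons by items) design with a normal likelihood and inverse-gamma priors on the person, item, and residual variance components; the priors, simulated data, and the generalizability-like ratio reported at the end are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)
P, I = 40, 8                                    # persons x items (single-facet design)
true = dict(mu=50.0, sp=6.0, si=2.0, se=4.0)    # simulation truth (invented)
a_true = rng.normal(0, true["sp"], P)
b_true = rng.normal(0, true["si"], I)
y = true["mu"] + a_true[:, None] + b_true[None, :] + rng.normal(0, true["se"], (P, I))

def inv_gamma(shape, rate):
    return 1.0 / rng.gamma(shape, 1.0 / rate)

mu, ap, bi = y.mean(), np.zeros(P), np.zeros(I)
va, vb, ve = 1.0, 1.0, 1.0
a0, b0 = 0.001, 0.001                           # vague inverse-gamma hyperparameters
draws = []
for it in range(4000):
    # Person effects: conjugate normal update given everything else.
    prec = I / ve + 1.0 / va
    mean = ((y - mu - bi[None, :]).sum(axis=1) / ve) / prec
    ap = rng.normal(mean, np.sqrt(1.0 / prec))
    # Item effects.
    prec = P / ve + 1.0 / vb
    mean = ((y - mu - ap[:, None]).sum(axis=0) / ve) / prec
    bi = rng.normal(mean, np.sqrt(1.0 / prec))
    # Grand mean (flat prior).
    resid = y - ap[:, None] - bi[None, :]
    mu = rng.normal(resid.mean(), np.sqrt(ve / y.size))
    # Variance components: conjugate inverse-gamma updates.
    va = inv_gamma(a0 + P / 2.0, b0 + 0.5 * np.sum(ap ** 2))
    vb = inv_gamma(a0 + I / 2.0, b0 + 0.5 * np.sum(bi ** 2))
    ve = inv_gamma(a0 + y.size / 2.0, b0 + 0.5 * np.sum((resid - mu) ** 2))
    if it >= 1000:
        draws.append((va, vb, ve))

va, vb, ve = np.mean(draws, axis=0)
print("posterior means: person %.1f, item %.1f, residual %.1f" % (va, vb, ve))
# An illustrative generalizability-like coefficient for the mean over I items.
print("E(rho^2) ~= %.2f" % (va / (va + ve / I)))
```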

  19. A Bayesian joint probability modeling approach for seasonal forecasting of streamflows at multiple sites

    Science.gov (United States)

    Wang, Q. J.; Robertson, D. E.; Chiew, F. H. S.

    2009-05-01

    Seasonal forecasting of streamflows can be highly valuable for water resources management. In this paper, a Bayesian joint probability (BJP) modeling approach for seasonal forecasting of streamflows at multiple sites is presented. A Box-Cox transformed multivariate normal distribution is proposed to model the joint distribution of future streamflows and their predictors such as antecedent streamflows and El Niño-Southern Oscillation indices and other climate indicators. Bayesian inference of model parameters and uncertainties is implemented using Markov chain Monte Carlo sampling, leading to joint probabilistic forecasts of streamflows at multiple sites. The model provides a parametric structure for quantifying relationships between variables, including intersite correlations. The Box-Cox transformed multivariate normal distribution has considerable flexibility for modeling a wide range of predictors and predictands. The Bayesian inference formulated allows the use of data that contain nonconcurrent and missing records. The model flexibility and data-handling ability means that the BJP modeling approach is potentially of wide practical application. The paper also presents a number of statistical measures and graphical methods for verification of probabilistic forecasts of continuous variables. Results for streamflows at three river gauges in the Murrumbidgee River catchment in southeast Australia show that the BJP modeling approach has good forecast quality and that the fitted model is consistent with observed data.

  1. A hybrid deterministic-probabilistic approach to model the mechanical response of helically arranged hierarchical strands

    Science.gov (United States)

    Fraldi, M.; Perrella, G.; Ciervo, M.; Bosia, F.; Pugno, N. M.

    2017-09-01

    Very recently, a Weibull-based probabilistic strategy has been successfully applied to bundles of wires to determine their overall stress-strain behaviour, also capturing previously unpredicted nonlinear and post-elastic features of hierarchical strands. This approach is based on the so-called "Equal Load Sharing" (ELS) hypothesis by virtue of which, when a wire breaks, the load acting on the strand is homogeneously redistributed among the surviving wires. Despite the overall effectiveness of the method, some discrepancies between theoretical predictions and in silico Finite Element-based simulations or experimental findings might arise when more complex structures are analysed, e.g. helically arranged bundles. To overcome these limitations, an enhanced hybrid approach is proposed in which the probability of rupture is combined with a deterministic mechanical model of a strand constituted by helically-arranged and hierarchically-organized wires. The analytical model is validated comparing its predictions with both Finite Element simulations and experimental tests. The results show that generalized stress-strain responses - incorporating tension/torsion coupling - are naturally found and, once one or more elements break, the competition between geometry and mechanics of the strand microstructure, i.e. the different cross sections and helical angles of the wires in the different hierarchical levels of the strand, determines the no longer homogeneous stress redistribution among the surviving wires whose fate is hence governed by a "Hierarchical Load Sharing" criterion.
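
    To make the ELS baseline concrete, here is a small Monte Carlo sketch (not the paper's hybrid model): N parallel elastic wires with Weibull-distributed failure strains share the load equally, and the bundle stress at a given strain is carried only by the surviving wires. The Weibull parameters and wire count are assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)

def els_bundle_curve(n_wires=200, shape=5.0, scale=1.0, n_steps=400):
    """Stress-strain curve of an Equal-Load-Sharing bundle of brittle elastic wires.
    Each wire has unit stiffness and a Weibull(shape, scale) failure strain."""
    strengths = rng.weibull(shape, n_wires) * scale     # failure strains
    strains = np.linspace(0.0, 1.5 * scale, n_steps)
    stresses = []
    for eps in strains:
        surviving = np.count_nonzero(strengths > eps)
        # Bundle stress = strain carried per surviving wire * surviving fraction.
        stresses.append(eps * surviving / n_wires)
    return strains, np.array(stresses)

strains, stresses = els_bundle_curve()
k = stresses.argmax()
print("peak bundle stress %.3f at strain %.3f" % (stresses[k], strains[k]))
```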

  2. The Atmospheric Circulation of Hot Jupiters: a Hierarchical Modeling Approach

    Science.gov (United States)

    Komacek, Thaddeus D.; Showman, Adam P.

    2017-10-01

    The atmospheres of extrasolar gas giants that receive strong stellar irradiation, or “hot Jupiters,” are beginning to be characterized as a population. Photometric full-phase light curves of hot Jupiters allow for basic inferences of their atmospheric circulation, providing two key observables. First, they measure the amplitude of brightness variation, which has shown that the fractional brightness temperature difference between the dayside and nightside in the atmospheres of these tidally locked planets can approach unity. Additionally, each planet has a significant observed offset of the brightest point in their light curve, and offsets in the infrared ubiquitously occur before secondary eclipse. These infrared offsets are best explained by strong (~km/s) eastward winds in hot Jupiter atmospheres. Motivated by these observations, we have developed a first-principles analytic theory that predicts dayside-nightside temperature differences and horizontal and vertical wind speeds as a function of incident stellar flux, rotation rate, frictional drag strength, and atmospheric pressure level. To complement and compare with this theory, we have performed a hierarchy of three-dimensional numerical simulations of the atmospheric circulation to explore changes with incident stellar flux, rotation rate, and drag strength. Both the theory and numerical simulations predict that the dayside-nightside temperature differences of hot Jupiters and their wind speeds should increase with increasing incident stellar flux and decrease with increasing drag strength. So far, this has been hinted at in the observed sample of nine hot Jupiter phase curves, but we predict that these broad trends will be robust with a larger observed population. We extend our theory to estimate vertical mixing rates, which is critical for understanding the impact of clouds and disequilibrium chemistry on observations of hot Jupiters. To show the regimes that this theory applies in, we compare

  3. Hierarchical time series bottom-up approach for forecast the export value in Central Java

    Science.gov (United States)

    Mahkya, D. A.; Ulama, B. S.; Suhartono

    2017-10-01

    The purpose of this study is to obtain the best model for, and predictions of, the export value of Central Java using a hierarchical time series approach. Export value is an injection variable in a country's economy, meaning that if the export value of the country increases, the country's economy will grow even more. Therefore, appropriate modeling is necessary to predict the export value, especially in Central Java. Export value in Central Java is grouped into 21 commodities, each with a different pattern. One approach that can be applied to such time series is a hierarchical one; here the bottom-up hierarchical time series approach is used. The individual series at all levels are forecast using Autoregressive Integrated Moving Average (ARIMA), Radial Basis Function Neural Network (RBFNN), and hybrid ARIMA-RBFNN models. The best models are selected using the symmetric mean absolute percentage error (sMAPE). The results show that, for the export value of Central Java, the bottom-up approach with hybrid ARIMA-RBFNN modeling can be used for long-term predictions, while for short- and medium-term predictions the bottom-up approach with RBFNN modeling can be used. Overall, the bottom-up approach with RBFNN modeling gives the best results.
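
    The bottom-up step itself is simple enough to show in a few lines: forecast each commodity series separately (here with a naive drift placeholder rather than the ARIMA or RBFNN models the study actually uses) and obtain the provincial total by summing the bottom-level forecasts. The toy data and the placeholder forecaster are assumptions.

```python
import numpy as np

rng = np.random.default_rng(6)

# Toy bottom-level series: monthly export values for 3 commodities (the study uses 21).
T = 48
t = np.arange(T)
series = {f"commodity_{i}": 100.0 + (i + 1) * 0.5 * t + rng.normal(0, 3, T)
          for i in range(3)}

def naive_drift_forecast(y, h):
    """Placeholder forecaster: last value plus the average historical drift."""
    drift = (y[-1] - y[0]) / (len(y) - 1)
    return y[-1] + drift * np.arange(1, h + 1)

h = 6
bottom_forecasts = {name: naive_drift_forecast(y, h) for name, y in series.items()}
# Bottom-up: the forecast of the aggregate (total export value) is the sum of the
# bottom-level forecasts, which keeps the hierarchy exactly coherent.
total_forecast = np.sum(list(bottom_forecasts.values()), axis=0)
print(np.round(total_forecast, 1))
```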

  4. An Application of Bayesian Approach in Modeling Risk of Death in an Intensive Care Unit.

    Directory of Open Access Journals (Sweden)

    Rowena Syn Yin Wong

    Full Text Available There are not many studies that attempt to model intensive care unit (ICU) risk of death in developing countries, especially in South East Asia. The aim of this study was to propose and describe application of a Bayesian approach in modeling in-ICU deaths in a Malaysian ICU. This was a prospective study in a mixed medical-surgery ICU in a multidisciplinary tertiary referral hospital in Malaysia. Data collection included variables that were defined in Acute Physiology and Chronic Health Evaluation IV (APACHE IV) model. Bayesian Markov Chain Monte Carlo (MCMC) simulation approach was applied in the development of four multivariate logistic regression predictive models for the ICU, where the main outcome measure was in-ICU mortality risk. The performance of the models was assessed through overall model fit, discrimination and calibration measures. Results from the Bayesian models were also compared against results obtained using frequentist maximum likelihood method. The study involved 1,286 consecutive ICU admissions between January 1, 2009 and June 30, 2010, of which 1,111 met the inclusion criteria. Patients who were admitted to the ICU were generally younger, predominantly male, with low co-morbidity load and mostly under mechanical ventilation. The overall in-ICU mortality rate was 18.5% and the overall mean Acute Physiology Score (APS) was 68.5. All four models exhibited good discrimination, with area under receiver operating characteristic curve (AUC) values approximately 0.8. Calibration was acceptable (Hosmer-Lemeshow p-values > 0.05) for all models, except for model M3. Model M1 was identified as the model with the best overall performance in this study. Four prediction models were proposed, where the best model was chosen based on its overall performance in this study. This study has also demonstrated the promising potential of the Bayesian MCMC approach as an alternative in the analysis and modeling of in-ICU mortality outcomes.

  5. An Application of Bayesian Approach in Modeling Risk of Death in an Intensive Care Unit.

    Science.gov (United States)

    Wong, Rowena Syn Yin; Ismail, Noor Azina

    2016-01-01

    There are not many studies that attempt to model intensive care unit (ICU) risk of death in developing countries, especially in South East Asia. The aim of this study was to propose and describe application of a Bayesian approach in modeling in-ICU deaths in a Malaysian ICU. This was a prospective study in a mixed medical-surgery ICU in a multidisciplinary tertiary referral hospital in Malaysia. Data collection included variables that were defined in Acute Physiology and Chronic Health Evaluation IV (APACHE IV) model. Bayesian Markov Chain Monte Carlo (MCMC) simulation approach was applied in the development of four multivariate logistic regression predictive models for the ICU, where the main outcome measure was in-ICU mortality risk. The performance of the models was assessed through overall model fit, discrimination and calibration measures. Results from the Bayesian models were also compared against results obtained using frequentist maximum likelihood method. The study involved 1,286 consecutive ICU admissions between January 1, 2009 and June 30, 2010, of which 1,111 met the inclusion criteria. Patients who were admitted to the ICU were generally younger, predominantly male, with low co-morbidity load and mostly under mechanical ventilation. The overall in-ICU mortality rate was 18.5% and the overall mean Acute Physiology Score (APS) was 68.5. All four models exhibited good discrimination, with area under receiver operating characteristic curve (AUC) values approximately 0.8. Calibration was acceptable (Hosmer-Lemeshow p-values > 0.05) for all models, except for model M3. Model M1 was identified as the model with the best overall performance in this study. Four prediction models were proposed, where the best model was chosen based on its overall performance in this study. This study has also demonstrated the promising potential of the Bayesian MCMC approach as an alternative in the analysis and modeling of in-ICU mortality outcomes.
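
    The study fitted its multivariate logistic models with MCMC in standard software; as a rough stand-in, the sketch below runs a random-walk Metropolis sampler for a two-predictor Bayesian logistic regression of a binary in-ICU death indicator and reports posterior means. The simulated covariates, priors, and step size are assumptions, not the study's specification.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 1000
# Simulated predictors (e.g., a standardized severity score and age) -- assumptions.
X = np.column_stack([np.ones(n), rng.normal(size=n), rng.normal(size=n)])
beta_true = np.array([-2.0, 1.2, 0.5])
p = 1.0 / (1.0 + np.exp(-X @ beta_true))
y = rng.binomial(1, p)

def log_posterior(beta, prior_sd=10.0):
    eta = X @ beta
    loglik = np.sum(y * eta - np.log1p(np.exp(eta)))      # Bernoulli-logit likelihood
    logprior = -0.5 * np.sum((beta / prior_sd) ** 2)       # weakly informative normal prior
    return loglik + logprior

beta = np.zeros(3)
lp = log_posterior(beta)
samples = []
for it in range(20000):
    prop = beta + rng.normal(0.0, 0.08, size=3)            # random-walk proposal
    lp_prop = log_posterior(prop)
    if np.log(rng.uniform()) < lp_prop - lp:               # Metropolis accept/reject
        beta, lp = prop, lp_prop
    if it >= 5000:
        samples.append(beta)

print("posterior means:", np.round(np.mean(samples, axis=0), 2))
```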

  6. A comparative simulation study of bayesian fitting approaches to intravoxel incoherent motion modeling in diffusion-weighted MRI.

    Science.gov (United States)

    While, Peter T

    2017-12-01

    To assess the performance of various least squares and Bayesian modeling approaches to parameter estimation in intravoxel incoherent motion (IVIM) modeling of diffusion-weighted MRI data. Simulated tissue models of different type (breast/liver) and morphology (discrete/continuous) were used to generate noisy data according to the IVIM model at several signal-to-noise ratios. IVIM parameter maps were generated using six different approaches, including full nonlinear least squares (LSQ), segmented least squares (SEG), Bayesian modeling with a Gaussian shrinkage prior (BSP) and Bayesian modeling with a spatial homogeneity prior (FBM), plus two modified approaches. Estimators were compared by calculating the median absolute percentage error and deviation, and median percentage bias. The Bayesian modeling approaches consistently outperformed the least squares approaches, with lower relative error and deviation, and provided cleaner parameter maps with reduced erroneous heterogeneity. However, a weakness of the Bayesian approaches was exposed, whereby certain tissue features disappeared completely in regions of high parameter uncertainty. Lower error and deviation were generally afforded by FBM compared with BSP, at the cost of higher bias. Bayesian modeling is capable of producing more visually pleasing IVIM parameter maps than least squares approaches, but its potential to mask certain tissue features demands caution during implementation. Magn Reson Med 78:2373-2387, 2017. © 2017 International Society for Magnetic Resonance in Medicine.
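
    For readers unfamiliar with the model being fitted, the bi-exponential IVIM signal and the segmented least-squares (SEG) idea can be sketched as follows: the diffusion coefficient D is first estimated from high b-values, where the perfusion term is negligible, and the perfusion fraction f and pseudo-diffusion coefficient D* are then fitted with D fixed. The b-values, noise level, and threshold are assumptions; this is not the Bayesian estimator compared in the paper.

```python
import numpy as np
from scipy.optimize import curve_fit

def ivim(b, f, d_star, d, s0=1.0):
    """Bi-exponential IVIM signal model."""
    return s0 * (f * np.exp(-b * d_star) + (1.0 - f) * np.exp(-b * d))

rng = np.random.default_rng(8)
b = np.array([0, 10, 20, 40, 80, 150, 300, 500, 800], float)   # b-values in s/mm^2 (assumed)
truth = dict(f=0.1, d_star=0.02, d=0.001)
signal = ivim(b, **truth) + rng.normal(0, 0.01, b.size)

# Step 1 (SEG): fit D from high b-values, where the perfusion term is ~ 0.
hi = b >= 200
slope, _ = np.polyfit(b[hi], np.log(np.clip(signal[hi], 1e-6, None)), 1)
d_est = -slope

# Step 2: with D fixed, fit f and D* to the full curve.
(f_est, dstar_est), _ = curve_fit(lambda bb, f, ds: ivim(bb, f, ds, d_est),
                                  b, signal, p0=[0.1, 0.01],
                                  bounds=([0.0, 0.001], [0.5, 0.1]))
print("D = %.4g, f = %.3f, D* = %.4g" % (d_est, f_est, dstar_est))
```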

  7. Sequential Bayesian Detection: A Model-Based Approach

    Energy Technology Data Exchange (ETDEWEB)

    Candy, J V

    2008-12-08

    Sequential detection theory has a long history, evolving from Wald's work in the late 1940s and Middleton's classic exposition in the 1960s, coupled with the concurrent enabling technology of digital computer systems and the development of sequential processors. Its development, when coupled to modern sequential model-based processors, offers a reasonable way to attack physics-based problems. In this chapter, the fundamentals of sequential detection are reviewed from the Neyman-Pearson theoretical perspective and formulated for both linear and nonlinear (approximate) Gauss-Markov, state-space representations. We review the development of modern sequential detectors and incorporate the sequential model-based processors as an integral part of their solution. Motivated by a wealth of physics-based detection problems, we show how both linear and nonlinear processors can seamlessly be embedded into the sequential detection framework to provide a powerful approach to solving non-stationary detection problems.
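
    The classical Wald sequential probability ratio test underlying this line of work can be written in a few lines: the log-likelihood ratio of two simple hypotheses is accumulated sample by sample and compared with thresholds set by the target error rates. The Gaussian hypotheses and error rates below are assumptions, and a model-based (state-space) version would replace the i.i.d. likelihoods with innovations from a Kalman-type filter.

```python
import numpy as np

rng = np.random.default_rng(9)

def sprt(stream, mu0=0.0, mu1=1.0, sigma=1.0, alpha=0.05, beta=0.05):
    """Wald SPRT for H0: mean mu0 vs H1: mean mu1 with known Gaussian noise."""
    upper = np.log((1.0 - beta) / alpha)     # accept H1 when the LLR exceeds this
    lower = np.log(beta / (1.0 - alpha))     # accept H0 when the LLR falls below this
    llr = 0.0
    n = 0
    for n, x in enumerate(stream, start=1):
        llr += ((x - mu0) ** 2 - (x - mu1) ** 2) / (2.0 * sigma ** 2)
        if llr >= upper:
            return "H1", n
        if llr <= lower:
            return "H0", n
    return "undecided", n

data = rng.normal(1.0, 1.0, size=200)        # data actually drawn under H1
print(sprt(data))
```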

  8. Sequential Bayesian Detection: A Model-Based Approach

    Energy Technology Data Exchange (ETDEWEB)

    Sullivan, E J; Candy, J V

    2007-08-13

    Sequential detection theory has a long history, evolving from Wald's work in the late 1940s and Middleton's classic exposition in the 1960s, coupled with the concurrent enabling technology of digital computer systems and the development of sequential processors. Its development, when coupled to modern sequential model-based processors, offers a reasonable way to attack physics-based problems. In this chapter, the fundamentals of sequential detection are reviewed from the Neyman-Pearson theoretical perspective and formulated for both linear and nonlinear (approximate) Gauss-Markov, state-space representations. We review the development of modern sequential detectors and incorporate the sequential model-based processors as an integral part of their solution. Motivated by a wealth of physics-based detection problems, we show how both linear and nonlinear processors can seamlessly be embedded into the sequential detection framework to provide a powerful approach to solving non-stationary detection problems.

  9. A Defence of the AR4’s Bayesian Approach to Quantifying Uncertainty

    Science.gov (United States)

    Vezer, M. A.

    2009-12-01

    The field of climate change research is a kimberlite pipe filled with philosophic diamonds waiting to be mined and analyzed by philosophers. Within the scientific literature on climate change, there is much philosophical dialogue regarding the methods and implications of climate studies. To date, however, discourse regarding the philosophy of climate science has been confined predominately to scientific - rather than philosophical - investigations. In this paper, I hope to bring one such issue to the surface for explicit philosophical analysis: The purpose of this paper is to address a philosophical debate pertaining to the expressions of uncertainty in the Intergovernmental Panel on Climate Change (IPCC) Fourth Assessment Report (AR4), which, as will be noted, has received significant attention in scientific journals and books, as well as sporadic glances from the popular press. My thesis is that the AR4’s Bayesian method of uncertainty analysis and uncertainty expression is justifiable on pragmatic grounds: it overcomes problems associated with vagueness, thereby facilitating communication between scientists and policy makers such that the latter can formulate decision analyses in response to the views of the former. Further, I argue that the most pronounced criticisms against the AR4’s Bayesian approach, which are outlined below, are misguided. §1 Introduction Central to AR4 is a list of terms related to uncertainty that in colloquial conversations would be considered vague. The IPCC attempts to reduce the vagueness of its expressions of uncertainty by calibrating uncertainty terms with numerical probability values derived from a subjective Bayesian methodology. This style of analysis and expression has stimulated some controversy, as critics reject as inappropriate and even misleading the association of uncertainty terms with Bayesian probabilities. [...] The format of the paper is as follows. The investigation begins (§2) with an explanation of

  10. A Bayesian approach to solve proton stopping powers from noisy multi-energy CT data.

    Science.gov (United States)

    Lalonde, Arthur; Bär, Esther; Bouchard, Hugo

    2017-10-01

    To propose a new formalism allowing the characterization of human tissues from multienergy computed tomography (MECT) data affected by noise and to evaluate its performance in estimating proton stopping powers (SPR). A recently published formalism based on principal component analysis called eigentissue decomposition (ETD) is adapted to the context of noise using a Bayesian estimator. The method, named Bayesian ETD, uses the maximum a posteriori fractions of eigentissues in each voxel to determine physical parameters relevant for proton beam dose calculation. Simulated dual-energy computed tomography (DECT) data are used to evaluate the performance of the proposed method to estimate SPR and to compare it to the initially proposed maximum-likelihood ETD and to a state-of-the-art ρe-Z formalism. To test the robustness of each method towards clinical reality, three different levels of noise are implemented, as well as variations in elemental composition and density of reference tissues. The impact of using more than two energy bins to determine SPR is also investigated by simulating MECT data using two to five energy bins. Finally, the impact of using MECT over DECT for range prediction is evaluated using a probabilistic model. For simulated DECT data of reference tissues, the Bayesian ETD approach systematically gives lower root-mean-square (RMS) errors with negligible bias. For a medium level of noise, the RMS errors on SPR are found to be 2.78%, 2.76% and 1.53% for ρe-Z, maximum-likelihood ETD, and Bayesian ETD, respectively. When variations are introduced to the elemental composition and density, all implemented methods give similar performances at low noise. However, for a medium noise level, the proposed Bayesian method outperforms the two others with a RMS error of 1.94%, compared to 2.79% and 2.78% for ρe-Z and maximum-likelihood ETD, respectively. When more than two energy spectra are used, the Bayesian ETD is able to reduce RMS error on SPR.

  11. Examining gene-environment interactions in comorbid depressive and disruptive behavior disorders using a Bayesian approach.

    Science.gov (United States)

    Adrian, Molly; Kiff, Cara; Glazner, Chris; Kohen, Ruth; Tracy, Julia Helen; Zhou, Chuan; McCauley, Elizabeth; Vander Stoep, Ann

    2015-09-01

    The objective of this study was to apply a Bayesian statistical analytic approach that minimizes multiple testing problems to explore the combined effects of chronic low familial support and variants in 12 candidate genes on risk for a common and debilitating childhood mental health condition. Bayesian mixture modeling was used to examine gene by environment interactions among genetic variants and environmental factors (family support) associated in previous studies with the occurrence of comorbid depression and disruptive behavior disorders in youth, using a sample of 255 children. One main effect emerged: variants in the oxytocin receptor (OXTR, rs53576) were associated with increased risk for comorbid disorders. Two significant gene × environment interactions and one significant gene × gene interaction emerged. Variants in the nicotinic acetylcholine receptor α5 subunit (CHRNA5, rs16969968) and in the glucocorticoid receptor chaperone protein FK506 binding protein 5 (FKBP5, rs4713902) interacted with chronic low family support in association with child mental health status. One gene × gene interaction, the 5-HTTLPR variant of the serotonin transporter (SERT/SLC6A4) in combination with the μ opioid receptor (OPRM1, rs1799971), was associated with comorbid depression and conduct problems. Results indicate that Bayesian modeling is a feasible strategy for conducting behavioral genetics research. This approach, combined with an optimized genetic selection strategy (Vrieze et al., 2012), revealed genetic variants involved in stress regulation (FKBP5, SERT × OPRM1), social bonding (OXTR), and nicotine responsivity (CHRNA5) in predicting comorbid status. Copyright © 2015 Elsevier Ltd. All rights reserved.

  12. Modeling uncertainties in estimation of canopy LAI from hyperspectral remote sensing data - A Bayesian approach

    Science.gov (United States)

    Varvia, Petri; Rautiainen, Miina; Seppänen, Aku

    2017-04-01

    Hyperspectral remote sensing data carry information on the leaf area index (LAI) of forests, and thus in principle, LAI can be estimated based on the data by inverting a forest reflectance model. However, LAI is usually not the only unknown in a reflectance model; especially, the leaf spectral albedo and understory reflectance are also not known. If the uncertainties of these parameters are not accounted for, the inversion of a forest reflectance model can lead to biased estimates for LAI. In this paper, we study the effects of reflectance model uncertainties on LAI estimates, and further, investigate whether the LAI estimates could recover from these uncertainties with the aid of Bayesian inference. In the proposed approach, the unknown leaf albedo and understory reflectance are estimated simultaneously with LAI from hyperspectral remote sensing data. The feasibility of the approach is tested with numerical simulation studies. The results show that in the presence of unknown parameters, the Bayesian LAI estimates which account for the model uncertainties outperform the conventional estimates that are based on biased model parameters. Moreover, the results demonstrate that the Bayesian inference can also provide feasible measures for the uncertainty of the estimated LAI.

  13. Examining Gene-Environment Interactions in Comorbid Depressive and Disruptive Behavior Disorders using a Bayesian Approach

    Science.gov (United States)

    Adrian, Molly; Kiff, Cara; Glazner, Chris; Kohen, Ruth; Tracy, Julia Helen; Zhou, Chuan; McCauley, Elizabeth; Stoep, Ann Vander

    2015-01-01

    Objective: The objective of this study was to apply a Bayesian statistical analytic approach that minimizes multiple testing problems to explore the combined effects of chronic low familial support and variants in 12 candidate genes on risk for a common and debilitating childhood mental health condition. Method: Bayesian mixture modeling was used to examine gene by environment interactions among genetic variants and environmental factors (family support) associated in previous studies with the occurrence of comorbid depression and disruptive behavior disorders in youth, using a sample of 255 children. Results: One main effect emerged: variants in the oxytocin receptor (OXTR, rs53576) were associated with increased risk for comorbid disorders. Two significant gene x environment interactions and one significant gene x gene interaction emerged. Variants in the nicotinic acetylcholine receptor α5 subunit (CHRNA5, rs16969968) and in the glucocorticoid receptor chaperone protein FK506 binding protein 5 (FKBP5, rs4713902) interacted with chronic low family support in association with child mental health status. One gene x gene interaction, the 5-HTTLPR variant of the serotonin transporter (SERT/SLC6A4) in combination with the μ opioid receptor (OPRM1, rs1799971), was associated with comorbid depression and conduct problems. Conclusions: Results indicate that Bayesian modeling is a feasible strategy for conducting behavioral genetics research. This approach, combined with an optimized genetic selection strategy (Vrieze, Iacono, & McGue, 2012), revealed genetic variants involved in stress regulation (FKBP5, SERT x OPRM1), social bonding (OXTR), and nicotine responsivity (CHRNA5) in predicting comorbid status. PMID:26228411

  14. Textual and visual content-based anti-phishing: a Bayesian approach.

    Science.gov (United States)

    Zhang, Haijun; Liu, Gang; Chow, Tommy W S; Liu, Wenyin

    2011-10-01

    A novel framework using a Bayesian approach for content-based phishing web page detection is presented. Our model takes into account textual and visual contents to measure the similarity between the protected web page and suspicious web pages. A text classifier, an image classifier, and an algorithm fusing the results from classifiers are introduced. An outstanding feature of this paper is the exploration of a Bayesian model to estimate the matching threshold. This is required in the classifier for determining the class of the web page and identifying whether the web page is phishing or not. In the text classifier, the naive Bayes rule is used to calculate the probability that a web page is phishing. In the image classifier, the earth mover's distance is employed to measure the visual similarity, and our Bayesian model is designed to determine the threshold. In the data fusion algorithm, the Bayes theory is used to synthesize the classification results from textual and visual content. The effectiveness of our proposed approach was examined in a large-scale dataset collected from real phishing cases. Experimental results demonstrated that the text classifier and the image classifier we designed deliver promising results, the fusion algorithm outperforms either of the individual classifiers, and our model can be adapted to different phishing cases. © 2011 IEEE
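
    The two pieces named in the abstract, a naive Bayes text score and a Bayesian fusion of the classifier outputs, can be illustrated schematically. The tiny keyword vocabulary, the assumed class prior, and treating the text and image scores as conditionally independent given the class are all assumptions made only for this sketch; the paper's image score comes from the earth mover's distance, which is not reproduced here.

```python
import numpy as np

# Toy keyword likelihoods P(word | class); values are invented for illustration.
vocab = ["login", "verify", "account", "weather"]
p_word_given_phish = np.array([0.30, 0.25, 0.30, 0.02])
p_word_given_legit = np.array([0.05, 0.05, 0.10, 0.20])
prior_phish = 0.1

def naive_bayes_phish_prob(counts):
    """P(phishing | text) from per-word counts under a naive Bayes model."""
    log_p = np.log(prior_phish) + counts @ np.log(p_word_given_phish)
    log_l = np.log(1 - prior_phish) + counts @ np.log(p_word_given_legit)
    m = max(log_p, log_l)
    return np.exp(log_p - m) / (np.exp(log_p - m) + np.exp(log_l - m))

def fuse(p_text, p_image, prior=prior_phish):
    """Fuse two classifier probabilities assuming conditional independence given the class:
    posterior odds = prior odds * (text likelihood ratio) * (image likelihood ratio)."""
    prior_odds = prior / (1 - prior)
    odds = prior_odds \
        * ((p_text / (1 - p_text)) / prior_odds) \
        * ((p_image / (1 - p_image)) / prior_odds)
    return odds / (1 + odds)

counts = np.array([3, 2, 1, 0])                 # word counts in a suspicious page
p_text = naive_bayes_phish_prob(counts)
p_image = 0.7                                   # pretend visual-similarity score
print("text: %.3f, fused: %.3f" % (p_text, fuse(p_text, p_image)))
```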

  15. The association between smoking or passive smoking and cardiovascular diseases using a Bayesian hierarchical model: based on the 2008-2013 Korea Community Health Survey

    Directory of Open Access Journals (Sweden)

    Whanhee Lee

    2017-06-01

    Full Text Available OBJECTIVES: Smoking and passive smoking have been extensively reported as risk factors of cardiovascular morbidity and mortality. Despite the biological mechanisms underlying the impact of hazardous chemical substances contained in tobacco in cardiovascular diseases (CVD), studies investigating the association between smoking and passive smoking with morbidity are at an inchoate stage in Korea. Therefore, this study aimed to estimate the risks of smoking and passive smoking on cardiovascular morbidity at the national and regional levels. METHODS: This study calculated sex-standardized and age-standardized prevalence of CVD and smoking indices in 253 community health centers (si/gun/gu) in Korea using the 2008-2013 Korea Community Health Survey data. Furthermore, a Bayesian hierarchical model was used to estimate the association of smoking and passive smoking with the prevalence of CVD from the national and regional community health centers. RESULTS: At the national level, smoking was significantly associated with stroke (relative risk [RR], 1.060) and hypertension (RR, 1.016) prevalence, whilst passive smoking at home and work were also significantly associated with prevalence of stroke (RR, 1.037/1.013), angina (RR, 1.016/1.006), and hypertension (RR, 1.010/1.004). Furthermore, the effects of smoking and passive smoking were greater in urban-industrial areas than in rural areas. CONCLUSIONS: The findings of this study would provide grounds for national policies that limit smoking and passive smoking, as well as regionally serve as the basis for region-specific healthcare policies in populations with high CVD vulnerability.

  16. Investigation of hit-and-run crash occurrence and severity using real-time loop detector data and hierarchical Bayesian binary logit model with random effects.

    Science.gov (United States)

    Xie, Meiquan; Cheng, Wen; Gill, Gurdiljot Singh; Zhou, Jiao; Jia, Xudong; Choi, Simon

    2017-08-24

    Most of the extensive research dedicated to identifying the influential factors of hit-and-run (HR) crashes has utilized typical maximum likelihood estimation binary logit models, and none have employed real-time traffic data. To fill this gap, this study focused on investigating factors contributing to HR crashes, as well as the severity levels of HR. This study analyzed 4-year crash and real-time loop detector data by employing hierarchical Bayesian models with random effects within a sequential logit structure. In addition to evaluation of the impact of random effects on model fitness and complexity, the prediction capability of the models was examined. Stepwise incremental sensitivity and specificity were calculated and receiver operating characteristic (ROC) curves were utilized to graphically illustrate the predictive performance of the model. Among the real-time flow variables, the average occupancy and speed from the upstream detector were observed to be positively correlated with HR crash possibility. The average upstream speed and speed difference between upstream and downstream speeds were correlated with the occurrence of severe HR crashes. In addition to real-time factors, other variables found influential for HR and severe HR crashes were length of segment, adverse weather conditions, dark lighting conditions with malfunctioning street lights, driving under the influence of alcohol, width of inner shoulder, and nighttime. This study suggests the potential traffic conditions of HR and severe HR occurrence, which refer to relatively congested upstream traffic conditions with high upstream speed and significant speed deviations on long segments. The above findings suggest that traffic enforcement should be directed toward mitigating risky driving under the aforementioned traffic conditions. Moreover, enforcement agencies may employ alcohol checkpoints to counter driving under the influence (DUI) at night. With regard to engineering improvements, wider

  17. F-MAP: A Bayesian approach to infer the gene regulatory network using external hints.

    Science.gov (United States)

    Shahdoust, Maryam; Pezeshk, Hamid; Mahjub, Hossein; Sadeghi, Mehdi

    2017-01-01

    The common topological features of the gene regulatory networks of related species suggest that the network of one species can be reconstructed using additional information from the gene expression profiles of related species. We present an algorithm, named F-MAP, to reconstruct the gene regulatory network by applying knowledge about gene interactions from related species. Our algorithm sets up a Bayesian framework to estimate the precision matrix of one species' microarray gene expression dataset and thereby infer the Gaussian graphical model of the network. A conjugate Wishart prior is used, and the information from related species is applied to estimate the hyperparameters of the prior distribution via factor analysis. Applying the proposed algorithm to six related Drosophila species shows that the precision of the reconstructed networks is improved considerably compared with networks constructed by other Bayesian approaches.
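
    The core computation in this kind of approach, the conjugate Wishart update for the precision matrix of a Gaussian graphical model, can be illustrated as follows; the prior scale V0 is simply the identity here, whereas F-MAP would set it (and the degrees of freedom) from factor analysis of related-species expression data, so this is a generic sketch rather than the published algorithm.

    # Generic conjugate Wishart update for a Gaussian graphical model (toy data).
    import numpy as np
    from scipy.stats import wishart

    rng = np.random.default_rng(1)
    n, p = 200, 5
    X = rng.multivariate_normal(np.zeros(p), np.eye(p), size=n)   # centred expression data

    nu0 = p + 2                  # prior degrees of freedom (assumed)
    V0 = np.eye(p)               # prior scale; F-MAP would encode external hints here

    S = X.T @ X                                    # scatter matrix
    nu_n = nu0 + n                                 # posterior degrees of freedom
    V_n = np.linalg.inv(np.linalg.inv(V0) + S)     # posterior scale

    post_mean_precision = nu_n * V_n               # E[Omega | X] under the Wishart posterior
    draws = wishart(df=nu_n, scale=V_n).rvs(size=1000)
    print(np.round(post_mean_precision, 2))
    print(np.round(draws.mean(axis=0), 2))         # Monte Carlo check of the posterior mean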

  18. Evaluating a Bayesian approach to improve accuracy of individual photographic identification methods using ecological distribution data

    Directory of Open Access Journals (Sweden)

    Richard Stafford

    2011-04-01

    Full Text Available Photographic identification of individual organisms can be possible from natural body markings. Data from photo-ID can be used to estimate important ecological and conservation metrics such as population sizes, home ranges or territories. However, poor quality photographs or less well-studied individuals can result in a non-unique ID, potentially confounding several similar-looking individuals. Here we present a Bayesian approach that uses known data about previous sightings of individuals at specific sites as priors to help assess the problems of obtaining a non-unique ID. Using a simulation of individuals with different confidence of correct ID, we evaluate the accuracy of Bayesian-modified (posterior) probabilities. However, in most cases, the accuracy of identification decreases. Although this technique is unsuccessful, it does demonstrate the importance of computer simulations in testing such hypotheses in ecology.
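
    A toy version of the update described above is shown below: a non-unique photo-match likelihood over candidate individuals is combined with a prior built from previous sightings at the same site; all numbers are invented for illustration.

    # Toy Bayesian update for a non-unique photo-ID using site-history priors.
    import numpy as np

    candidates = ["A", "B", "C"]
    match_likelihood = np.array([0.60, 0.55, 0.50])   # P(photo | individual); nearly non-unique
    prior_sightings = np.array([12, 3, 1])            # previous records at this site

    prior = prior_sightings / prior_sightings.sum()
    posterior = match_likelihood * prior
    posterior /= posterior.sum()

    for name, p in zip(candidates, posterior):
        print(f"P({name} | photo, site history) = {p:.2f}")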

  19. Co-morbid obsessive-compulsive disorder and depression: a Bayesian network approach.

    Science.gov (United States)

    McNally, R J; Mair, P; Mugno, B L; Riemann, B C

    2017-05-01

    Obsessive-compulsive disorder (OCD) is often co-morbid with depression. Using the methods of network analysis, we computed two networks that disclose the potentially causal relationships among symptoms of these two disorders in 408 adult patients with primary OCD and co-morbid depression symptoms. We examined the relationship between the symptoms constituting these syndromes by computing a (regularized) partial correlation network via the graphical LASSO procedure, and a directed acyclic graph (DAG) via a Bayesian hill-climbing algorithm. The results suggest that the degree of interference and distress associated with obsessions, and the degree of interference associated with compulsions, are the chief drivers of co-morbidity. Moreover, activation of the depression cluster appears to occur solely through distress associated with obsessions activating sadness - a key symptom that 'bridges' the two syndromic clusters in the DAG. Bayesian analysis can expand the repertoire of network analytic approaches to psychopathology. We discuss clinical implications and limitations of our findings.
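
    The regularized partial-correlation step of such an analysis can be reproduced with scikit-learn's GraphicalLasso, as in the minimal sketch below; the 408-by-10 symptom-score matrix is randomly generated here and is not the authors' data, and the DAG/hill-climbing step is not shown.

    # Minimal graphical-LASSO sketch on a hypothetical patients-by-symptoms matrix.
    import numpy as np
    from sklearn.covariance import GraphicalLasso

    rng = np.random.default_rng(2)
    X = rng.normal(size=(408, 10))               # 408 patients, 10 symptom scores (toy)

    model = GraphicalLasso(alpha=0.05).fit(X)
    precision = model.precision_
    d = np.sqrt(np.diag(precision))
    partial_corr = -precision / np.outer(d, d)   # precision -> partial correlations
    np.fill_diagonal(partial_corr, 1.0)
    print(np.round(partial_corr, 2))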

  20. Hierarchical organization of functional connectivity in the mouse brain: a complex network approach.

    Science.gov (United States)

    Bardella, Giampiero; Bifone, Angelo; Gabrielli, Andrea; Gozzi, Alessandro; Squartini, Tiziano

    2016-08-18

    This paper represents a contribution to the study of brain functional connectivity from the perspective of complex network theory. More specifically, we apply graph theoretical analyses to provide evidence of the modular structure of the mouse brain and to shed light on its hierarchical organization. We propose a novel percolation analysis and apply our approach to a resting-state functional MRI data set from 41 mice. This approach reveals a robust hierarchical structure of modules that persists across different subjects. Importantly, we test this approach against a statistical benchmark (or null model) which constrains only the distributions of empirical correlations. Our results unambiguously show that the hierarchical character of the mouse brain modular structure is not trivially encoded into this lower-order constraint. Finally, we investigate the modular structure of the mouse brain by computing the Minimal Spanning Forest, a technique that identifies subnetworks characterized by the strongest internal correlations. This approach represents a faster alternative to other community detection methods and provides a means to rank modules on the basis of the strength of their internal edges.
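
    The backbone-extraction idea can be illustrated with networkx: build a weighted graph from a correlation matrix and keep only the strongest-correlation spanning structure. The maximum spanning tree below is a stand-in for the Minimal Spanning Forest analysis, and the correlation matrix is randomly generated rather than derived from the fMRI data.

    # Strongest-correlation backbone of a toy functional network via networkx.
    import numpy as np
    import networkx as nx

    rng = np.random.default_rng(3)
    A = rng.uniform(0, 1, size=(20, 20))
    corr = (A + A.T) / 2                       # symmetric toy "correlation" matrix
    np.fill_diagonal(corr, 0.0)

    G = nx.from_numpy_array(corr)              # weighted graph, weight = correlation
    backbone = nx.maximum_spanning_tree(G, weight="weight")
    print(backbone.number_of_nodes(), "nodes,", backbone.number_of_edges(), "edges")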

  1. Hierarchical approach to optimization of parallel matrix multiplication on large-scale platforms

    KAUST Repository

    Hasanov, Khalid

    2014-03-04

    © 2014, Springer Science+Business Media New York. Many state-of-the-art parallel algorithms, which are widely used in scientific applications executed on high-end computing systems, were designed in the twentieth century with relatively small-scale parallelism in mind. Indeed, while in the 1990s a system with a few hundred cores was considered a powerful supercomputer, modern top supercomputers have millions of cores. In this paper, we present a hierarchical approach to optimization of message-passing parallel algorithms for execution on large-scale distributed-memory systems. The idea is to reduce the communication cost by introducing hierarchy and hence more parallelism in the communication scheme. We apply this approach to SUMMA, the state-of-the-art parallel algorithm for matrix–matrix multiplication, and demonstrate both theoretically and experimentally that the modified Hierarchical SUMMA significantly reduces the communication cost and improves the overall performance on large-scale platforms.

  2. A Hierarchical FEM approach for Simulation of Geometrical and Material induced Instability of Composite Structures

    DEFF Research Database (Denmark)

    Hansen, Anders L.; Lund, Erik; Pinho, Silvestre T.

    2009-01-01

    In this paper a hierarchical FE approach is utilized to simulate delamination in a composite plate loaded in uni-axial compression. Progressive delamination is modelled by use of cohesive interface elements that are automatically embedded. The non-linear problem is solved quasi-statically in which...... the interaction between material degradation and structural instability is solved iteratively. The effect of fibre bridging is studied numerically and in-plane failure is predicted using physically based failure criteria....

  3. Establishing a Bayesian approach to determining cosmogenic nuclide reference production rates using He-3

    Science.gov (United States)

    Goehring, Brent M.; Muzikar, Paul; Lifton, Nathaniel A.

    2018-01-01

    Production rates are a cornerstone of in situ cosmogenic nuclide applications, including surface exposure dating, erosion rate/denudation rate estimates, and burial dating. The most common approach for estimating production rates is to measure cosmogenic nuclide samples from sites with independently well-constrained exposure histories. In addition, while researchers attempt to minimize the effects of erosion through careful site and sample selection, it can be present at some unknown level in certain sites. We present a general Bayesian methodology for combining information from the nuclide concentrations, the exposure history, and the possibility of erosion, to determine the production rate at a given site. Then, we use another Bayesian approach to combine the results from the various sites. Cosmogenic 3He is an ideal test-bed for our Bayesian approach. It has the most calibration sites of the commonly measured cosmogenic nuclides, and there is evidence for the effect of erosion on some of the sites. Our approach largely reconciles previous discrepancies between sites of widely varying age, even at latitudes where geomagnetic effects are significant. With the canonical Lal/Stone scaling scheme, we derive a global sea level high latitude 3He production rate of 118 ± 2 atoms g^-1 yr^-1 when considering olivine and pyroxene together. Using the Lifton-Sato-Dunai scaling scheme yields a similar rate of 121 ± 2 atoms g^-1 yr^-1. Uncertainties associated with these values are improved over previous studies, due to both reduced scatter among the sites and an approach to combining sites which deemphasizes outliers.
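
    In its simplest form, the second-stage combination across calibration sites resembles a precision-weighted average, as in the sketch below; the per-site rates and uncertainties are invented and the full hierarchical treatment of exposure history and erosion is not reproduced.

    # Illustrative precision-weighted combination of per-site production rates (toy values).
    import numpy as np

    site_rates = np.array([116.0, 120.0, 119.0, 123.0])   # atoms g^-1 yr^-1 (assumed)
    site_sigma = np.array([4.0, 3.0, 5.0, 6.0])

    w = 1.0 / site_sigma**2
    combined = np.sum(w * site_rates) / np.sum(w)
    combined_sigma = np.sqrt(1.0 / np.sum(w))
    print(f"combined rate = {combined:.1f} +/- {combined_sigma:.1f} atoms g^-1 yr^-1")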

  4. Energy Efficient Hierarchical Clustering Approaches in Wireless Sensor Networks: A Survey

    Directory of Open Access Journals (Sweden)

    Bilal Jan

    2017-01-01

    Full Text Available Wireless sensor networks (WSNs) are a significant technology due to their diverse applications such as health care monitoring, smart phones, military, disaster management, and other surveillance systems. Sensor nodes are usually deployed in large numbers and work independently in unattended, harsh environments. Due to constrained resources, typically scarce battery power, these wireless nodes are grouped into clusters for energy-efficient communication. In clustering, hierarchical schemes have attracted great interest for minimizing energy consumption. Hierarchical schemes are generally categorized as cluster-based and grid-based approaches. In cluster-based approaches, nodes are grouped into clusters and a resourceful sensor node is nominated as a cluster head (CH), while in grid-based approaches the network is divided into confined virtual grids, a partitioning usually performed by the base station. This paper highlights and discusses the design challenges for cluster-based schemes, the important cluster formation parameters, and the classification of hierarchical clustering protocols. Moreover, existing cluster-based and grid-based techniques are evaluated against selected parameters to help users choose an appropriate technique. Furthermore, a detailed summary of these protocols is presented with their advantages, disadvantages, and applicability in particular cases.

  5. CSI: a nonparametric Bayesian approach to network inference from multiple perturbed time series gene expression data.

    Science.gov (United States)

    Penfold, Christopher A; Shifaz, Ahmed; Brown, Paul E; Nicholson, Ann; Wild, David L

    2015-06-01

    Here we introduce the causal structure identification (CSI) package, a Gaussian process based approach to inferring gene regulatory networks (GRNs) from multiple time series data. The standard CSI approach infers a single GRN via joint learning from multiple time series datasets; the hierarchical approach (HCSI) infers a separate GRN for each dataset, albeit with the networks constrained to favor similar structures, allowing for the identification of context specific networks. The software is implemented in MATLAB and includes a graphical user interface (GUI) for user friendly inference. Finally the GUI can be connected to high performance computer clusters to facilitate analysis of large genomic datasets.

  6. A two-level multimodality imaging Bayesian network approach for classification of partial epilepsy: preliminary data.

    Science.gov (United States)

    Mueller, Susanne G; Young, Karl; Hartig, Miriam; Barakos, Jerome; Garcia, Paul; Laxer, Kenneth D

    2013-05-01

    Quantitative neuroimaging analyses have demonstrated gray and white matter abnormalities in group comparisons of different types of non-lesional partial epilepsy. It is unknown to what degree these type-specific patterns exist in individual patients and if they could be exploited for diagnostic purposes. In this study, a two-level multi-modality imaging Bayesian network approach is proposed that uses information about individual gray matter volume loss and white matter integrity to classify non-lesional temporal lobe epilepsy with (TLE-MTS) and without (TLE-no) mesial-temporal sclerosis and frontal lobe epilepsy (FLE). Twenty-five controls, 19 TLE-MTS, 22 TLE-no and 14 FLE patients were studied on a 4T MRI scanner, and T1-weighted structural and DTI images were acquired. Spatially normalized gray matter (GM) and fractional anisotropy (FA) abnormality maps (binary maps with voxels 1 SD below the control mean) were calculated for each subject. At the first level, each group's abnormality maps were compared with those from all the other groups using Graphical-Model-based Morphometric Analysis (GAMMA). GAMMA uses a Bayesian network and a Markov random field based contextual clustering method to produce maps of voxels that provide the maximal distinction between two groups and calculates a probability distribution and a group assignment based on this information. The information was then combined in a second-level Bayesian network, and the probability of each subject belonging to one of the three epilepsy types was calculated. The specificities of the two-level Bayesian network for distinguishing between the three patient groups were 0.87 for TLE-MTS and TLE-no and 0.86 for FLE; the corresponding sensitivities were 0.84 for TLE-MTS, 0.72 for TLE-no and 0.64 for FLE. The two-level multi-modality Bayesian network approach was able to distinguish between the three epilepsy types with reasonably high accuracy even though the majority of the images were completely normal on visual inspection. Copyright © 2013

  7. Capturing changes in flood risk with Bayesian approaches for flood damage assessment

    Science.gov (United States)

    Vogel, Kristin; Schröter, Kai; Kreibich, Heidi; Thieken, Annegret; Müller, Meike; Sieg, Tobias; Laudan, Jonas; Kienzler, Sarah; Weise, Laura; Merz, Bruno; Scherbaum, Frank

    2016-04-01

    Flood risk is a function of hazard as well as of exposure and vulnerability. All three components are under change over space and time and have to be considered for reliable damage estimations and risk analyses, since this is the basis for an efficient, adaptable risk management. Hitherto, models for estimating flood damage are comparatively simple and cannot sufficiently account for changing conditions. The Bayesian network approach allows for multivariate modeling of complex systems without relying on expert knowledge about physical constraints. In a Bayesian network, each model component is considered to be a random variable. The interactions between those variables can be learned from observations or defined by expert knowledge; even a combination of both is possible. Moreover, the probabilistic framework captures uncertainties related to the prediction and provides a probability distribution for the damage instead of a point estimate. The graphical representation of Bayesian networks helps to study the change of probabilities under changing circumstances and may thus simplify the communication between scientists and public authorities. In the framework of the DFG-Research Training Group "NatRiskChange" we aim to develop Bayesian networks for flood damage and vulnerability assessments of residential buildings and companies under changing conditions. A Bayesian network learned from data collected over the last 15 years in flooded regions in the Elbe and Danube catchments (Germany) reveals the impact of many variables, such as building characteristics, precaution and warning situation, on flood damage to residential buildings. While the handling of incomplete and hybrid (discrete mixed with continuous) data is the most challenging issue in the study on residential buildings, a similar study that focuses on the vulnerability of small to medium-sized companies bears new challenges. Relying on a much smaller data set for the determination of the model

  8. Bayesian theory and applications

    CERN Document Server

    Dellaportas, Petros; Polson, Nicholas G; Stephens, David A

    2013-01-01

    The development of hierarchical models and Markov chain Monte Carlo (MCMC) techniques forms one of the most profound advances in Bayesian analysis since the 1970s and provides the basis for advances in virtually all areas of applied and theoretical Bayesian statistics. This volume guides the reader along a statistical journey that begins with the basic structure of Bayesian theory, and then provides details on most of the past and present advances in this field. The book has a unique format. There is an explanatory chapter devoted to each conceptual advance followed by journal-style chapters that provide applications or further advances on the concept. Thus, the volume is both a textbook and a compendium of papers covering a vast range of topics. It is appropriate for a well-informed novice interested in understanding the basic approach, methods and recent applications. Because of its advanced chapters and recent work, it is also appropriate for a more mature reader interested in recent applications and devel...

  9. A Hierarchical and Distributed Approach for Mapping Large Applications to Heterogeneous Grids using Genetic Algorithms

    Science.gov (United States)

    Sanyal, Soumya; Jain, Amit; Das, Sajal K.; Biswas, Rupak

    2003-01-01

    In this paper, we propose a distributed approach for mapping a single large application to a heterogeneous grid environment. To minimize the execution time of the parallel application, we distribute the mapping overhead to the available nodes of the grid. This approach not only provides a fast mapping of tasks to resources but is also scalable. We adopt a hierarchical grid model and accomplish the job of mapping tasks to this topology using a scheduler tree. Results show that our three-phase algorithm provides high quality mappings, and is fast and scalable.

  10. True versus apparent malaria infection prevalence: the contribution of a Bayesian approach.

    Directory of Open Access Journals (Sweden)

    Niko Speybroeck

    Full Text Available AIMS: To present a new approach for estimating the "true prevalence" of malaria and apply it to datasets from Peru, Vietnam, and Cambodia. METHODS: Bayesian models were developed for estimating both the malaria prevalence using different diagnostic tests (microscopy, PCR & ELISA), without the need of a gold standard, and the tests' characteristics. Several sources of information, i.e. data, expert opinions and other sources of knowledge, can be integrated into the model. This approach, resulting in an optimal and harmonized estimate of malaria infection prevalence with no conflict between the different sources of information, was tested on data from Peru, Vietnam and Cambodia. RESULTS: Malaria sero-prevalence was relatively low in all sites, with ELISA showing the highest estimates. The sensitivities of microscopy and ELISA were statistically lower in Vietnam than in the other sites. Similarly, the specificities of microscopy, ELISA and PCR were significantly lower in Vietnam than in the other sites. In Vietnam and Peru, microscopy was closer to the "true" estimate than the other 2 tests while, as expected, ELISA, with its lower specificity, usually overestimated the prevalence. CONCLUSIONS: Bayesian methods are useful for analyzing prevalence results when no gold standard diagnostic test is available. Though some results are expected, e.g. PCR more sensitive than microscopy, a standardized and context-independent quantification of the diagnostic tests' characteristics (sensitivity and specificity) and the underlying malaria prevalence may be useful for comparing different sites. Indeed, the use of a single diagnostic technique could strongly bias the prevalence estimation. This limitation can be circumvented by using a Bayesian framework taking into account the imperfect characteristics of the currently available diagnostic tests. As discussed in the paper, this approach may further support global malaria burden estimation initiatives.
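
    For a single imperfect test, the relation between apparent and true prevalence that the multi-test Bayesian model generalizes can be written down directly; the sketch below shows the Rogan-Gladen point estimate and a grid-based posterior under a uniform prior, with sensitivity, specificity, and counts chosen purely for illustration.

    # Single-test sketch: apparent vs. true prevalence with an imperfect diagnostic.
    import numpy as np

    se, sp = 0.90, 0.95            # assumed sensitivity and specificity
    k, n = 87, 1000                # positives among n subjects (toy data)

    # Rogan-Gladen: AP = TP*Se + (1-TP)*(1-Sp)  =>  TP = (AP + Sp - 1) / (Se + Sp - 1)
    ap = k / n
    tp_point = (ap + sp - 1) / (se + sp - 1)

    # Grid posterior for the true prevalence under a uniform prior
    grid = np.linspace(0, 1, 2001)
    p_pos = grid * se + (1 - grid) * (1 - sp)          # P(test positive | true prevalence)
    log_lik = k * np.log(p_pos) + (n - k) * np.log(1 - p_pos)
    post = np.exp(log_lik - log_lik.max())
    post /= post.sum()
    print(f"Rogan-Gladen: {tp_point:.3f}, posterior mean: {np.sum(grid * post):.3f}")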

  11. Predicting downturns in the US housing market: a Bayesian approach [Conference presentation]

    CSIR Research Space (South Africa)

    Gupta, R

    2008-10-01

    Full Text Available Conference presentation outline: Background and Motivation; Models - VARs, BVARs and SBVARs; Forecasting House Prices in the Twenty Largest US States; Results; Predicting the Downturns. Predicting Downturns in the US Housing Market: A Bayesian Approach, by Rangan Gupta and Sonali Das.

  12. A Bayesian approach to probabilistic ecological risk assessment: risk comparison of nine toxic substances in Tokyo surface waters.

    Science.gov (United States)

    Hayashi, Takehiko I; Kashiwagi, Nobuhisa

    2011-03-01

    Quantitative risk comparison of toxic substances is necessary to decide which substances should be prioritized to achieve effective risk management. This study compared the ecological risk among nine major toxic substances (ammonia, bisphenol-A, chloroform, copper, hexavalent chromium, lead, manganese, nickel, and zinc) in Tokyo surface waters by adopting an integrated risk analysis procedure using Bayesian statistics. Species sensitivity distributions of these substances were derived by using four Bayesian models. Environmental concentration distributions were derived by a hierarchical Bayesian model that explicitly considered the differences between within-site and between-site variations in environmental concentrations. Medians and confidence intervals of the expected potentially affected fraction (EPAF) of species were then computed by the Monte Carlo method. The estimated EPAF values suggested that risk from nickel was highest and risk from zinc and ammonia were also high relative to other substances. The risk from copper was highest if bioavailability was not considered, although toxicity correction by a biotic ligand model greatly reduced the estimated risk. The risk from manganese was highest if a conservative risk index estimate (90% upper EPAF confidence limit) was selected. It is suggested that zinc is not a predominant risk factor in Tokyo surface waters and strategic efforts are required to reduce the total ecological risk from multiple substances. The presented risk analysis procedure using EPAF and Bayesian statistics is expected to advance methodologies and practices in quantitative ecological risk comparison.
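
    The expected potentially affected fraction (EPAF) computation can be sketched as a Monte Carlo integration of a log-normal species sensitivity distribution against a log-normal exposure distribution; all parameters below are invented and the Bayesian estimation of those distributions is not reproduced.

    # Monte Carlo EPAF from assumed log-normal SSD and exposure distributions.
    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(4)
    n_draws = 100_000

    mu_ssd, sd_ssd = 1.5, 0.7      # SSD of log10 effect concentrations (assumed)
    mu_exp, sd_exp = 0.3, 0.5      # distribution of log10 exposure concentrations (assumed)

    log_conc = rng.normal(mu_exp, sd_exp, n_draws)
    paf = norm.cdf(log_conc, loc=mu_ssd, scale=sd_ssd)   # affected fraction per draw
    lo, hi = np.percentile(paf, [5, 95])
    print(f"EPAF = {paf.mean():.4f}  (5th-95th percentile of PAF: {lo:.4f}-{hi:.4f})")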

  13. A new approach for supply chain risk management: Mapping SCOR into Bayesian network

    Directory of Open Access Journals (Sweden)

    Mahdi Abolghasemi

    2015-01-01

    Full Text Available Purpose: Rising costs and complexity in organizations, together with increasing uncertainty and risk, have led managers to use risk management to reduce risk taking and deviation from goals. Supply chain risk management (SCRM) has a close relationship with supply chain performance (SCP). Over the years, researchers have used different methods to manage supply chain risk, but most of them are either purely qualitative or purely quantitative. The supply chain operations reference (SCOR) model is a standard model for SCP evaluation whose metrics carry uncertainty. In this paper, by combining the qualitative and quantitative metrics of SCOR, supply chain performance is measured with Bayesian networks. Design/methodology/approach: First, a qualitative assessment is done by recognizing the uncertain metrics of the SCOR model; then, by quantifying them, supply chain performance is measured with Bayesian networks (BNs) and the SCOR model, in which decisions on uncertain variables are made using predictive and diagnostic capabilities. Findings: After applying the proposed method in one of the biggest automotive companies in Iran, we identified key factors of supply chain performance based on the SCOR model through the predictive and diagnostic capabilities of Bayesian networks. After sensitivity analysis, we found that ‘Total cost’ and its criteria, which include the costs of labor, warranty, transportation, and inventory, have the widest range and the greatest effect on supply chain performance. Managers should therefore take their importance into account for decision making. Decisions can be made simply by running the model in different situations. Research limitations/implications: A more precise model would consist of numerous factors, but it is difficult and sometimes impossible to solve large models if all of them are inserted into a Bayesian model. We have adapted real-world characteristics to the capabilities of our software and method. On the other hand, fewer data exist for some

  14. Uncertainty estimation of a complex water quality model: The influence of Box-Cox transformation on Bayesian approaches and comparison with a non-Bayesian method

    Science.gov (United States)

    Freni, Gabriele; Mannina, Giorgio

    In urban drainage modelling, uncertainty analysis is of undoubted necessity. However, uncertainty analysis in urban water-quality modelling is still in its infancy and only a few studies have been carried out. Therefore, several methodological aspects still need to be explored and clarified, especially regarding water quality modelling. The use of the Bayesian approach for uncertainty analysis has been stimulated by its rigorous theoretical framework and by the possibility of evaluating the impact of new knowledge on the modelling predictions. Nevertheless, the Bayesian approach relies on some restrictive hypotheses that are not present in less formal methods like the Generalised Likelihood Uncertainty Estimation (GLUE). One crucial point in the application of the Bayesian method is the formulation of a likelihood function that is conditioned by the hypotheses made regarding model residuals. Statistical transformations, such as the use of the Box-Cox equation, are generally used to ensure the homoscedasticity of residuals. However, this practice may affect the reliability of the analysis, leading to an incorrect uncertainty estimation. The present paper aims to explore the influence of the Box-Cox equation on environmental water quality models. To this end, five cases were considered, one of which was the “real” residual distribution (i.e. drawn from available data). The analysis was applied to the Nocella experimental catchment (Italy), which is an agricultural and semi-urbanised basin where two sewer systems, two wastewater treatment plants and a river reach were monitored during both dry and wet weather periods. The results show that the uncertainty estimation is greatly affected by the residual transformation, and a wrong assumption may also affect the evaluation of model uncertainty. The use of less formal methods always provides an overestimation of modelling uncertainty with respect to the Bayesian method, but this effect is reduced if a wrong assumption is made regarding the
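
    The Box-Cox step discussed above amounts to transforming strictly positive observed and modelled quantities before forming residuals, so that the residuals better satisfy the homoscedastic Gaussian assumption behind the likelihood; the sketch below uses synthetic data and scipy's Box-Cox routines, and is not the authors' implementation.

    # Box-Cox transformation of observed and modelled values before computing residuals.
    import numpy as np
    from scipy.stats import boxcox
    from scipy.special import boxcox as boxcox_fixed_lambda

    rng = np.random.default_rng(5)
    observed = rng.lognormal(mean=1.0, sigma=0.6, size=200)              # e.g. pollutant loads
    modelled = observed * rng.lognormal(mean=0.0, sigma=0.3, size=200)   # imperfect model output

    obs_t, lam = boxcox(observed)                  # lambda estimated by maximum likelihood
    mod_t = boxcox_fixed_lambda(modelled, lam)     # apply the same lambda to the model output
    residuals = obs_t - mod_t
    print(f"lambda = {lam:.3f}, residual variance = {residuals.var():.3f}")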

  15. An urban flood risk assessment method using the Bayesian Network approach

    DEFF Research Database (Denmark)

    Åström, Helena Lisa Alexandra

    the Bayesian Network (BN) approach is developed, and the method is exemplified in an urban catchment. BNs have become an increasingly popular method for describing complex systems and aiding decision-making under uncertainty. In environmental management, BNs have mainly been utilized in ecological assessments...... circulation influences local and regional climate and is considered an important factor when aiming at improving our understanding of local weather conditions and the occurrence of extreme events. Hence, this thesis presents a study that explores the relationship between flood generating hazards and large-scale atmospheric circulation. This thesis concludes that IDs can serve as a good approach for describing the complex system in which flood risk occurs. The final product is a spatiotemporal FRA approach that can include the impacts from multiple hazards....

  16. Hierarchical Agent-Based Integrated Modelling Approach for Microgrids with Adoption of EVs and HRES

    Directory of Open Access Journals (Sweden)

    Peng Han

    2014-01-01

    Full Text Available The large-scale adoption of electric vehicles (EVs) and hybrid renewable energy systems (HRESs), together with increasing loads, will bring significant challenges to the microgrid. A methodology to model microgrids with high EV and HRES penetration is the key to EV adoption assessment and optimized HRES deployment. However, considering the complex interactions of a microgrid containing massive numbers of EVs and HRESs, no previous single modelling approach is sufficient. Therefore, in this paper, a methodology named the Hierarchical Agent-based Integrated Modelling Approach (HAIMA) is proposed. Through the effective integration of agent-based modelling with other advanced modelling approaches, the proposed approach theoretically contributes a new microgrid model hierarchically constituted by a microgrid management layer, a component layer, and an event layer. HAIMA then links the key parameters and interconnects them to realize the interactions of the whole model. Furthermore, HAIMA practically contributes a comprehensive microgrid operation system, through which the assessment of the proposed model and of the impact of EV adoption is achieved. Simulations show that the proposed HAIMA methodology will be beneficial for microgrid studies and EV operation assessment and can be further utilized for energy management, electricity consumption prediction, EV scheduling control, and HRES deployment optimization.

  17. Construction of a Hierarchical Architecture of Covalent Organic Frameworks via a Postsynthetic Approach.

    Science.gov (United States)

    Zhang, Gen; Tsujimoto, Masahiko; Packwood, Daniel; Duong, Nghia Tuan; Nishiyama, Yusuke; Kadota, Kentaro; Kitagawa, Susumu; Horike, Satoshi

    2018-02-21

    Covalent organic frameworks (COFs) represent an emerging class of crystalline porous materials that are constructed by the assembly of organic building blocks linked via covalent bonds. Several strategies have been developed for the construction of new COF structures; however, a facile approach to fabricate hierarchical COF architectures with controlled domain structures remains a significant challenge, and has not yet been achieved. In this study, a dynamic covalent chemistry (DCC)-based postsynthetic approach was employed at the solid-liquid interface to construct such structures. Two-dimensional imine-bonded COFs having different aromatic groups were prepared, and a homogeneously mixed-linker structure and a heterogeneously core-shell hollow structure were fabricated by controlling the reactivity of the postsynthetic reactions. Solid-state nuclear magnetic resonance (NMR) spectroscopy and transmission electron microscopy (TEM) confirmed the structures. COFs prepared by a postsynthetic approach exhibit several functional advantages compared with their parent phases. Their Brunauer-Emmett-Teller (BET) surface areas are 2-fold greater than those of their parent phases because of the higher crystallinity. In addition, the hydrophilicity of the material and the stepwise adsorption isotherms of H2O vapor in the hierarchical frameworks were precisely controlled, which was feasible because of the distribution of various domains of the two COFs by controlling the postsynthetic reaction. The approach opens new routes for constructing COF architectures with functionalities that are not possible in a single phase.

  18. Prognostic factors for urachal cancer: a bayesian model-averaging approach.

    Science.gov (United States)

    Kim, In Kyong; Lee, Joo Yong; Kwon, Jong Kyou; Park, Jae Joon; Cho, Kang Su; Ham, Won Sik; Hong, Sung Joon; Yang, Seung Choul; Choi, Young Deuk

    2014-09-01

    This study was conducted to evaluate prognostic factors and cancer-specific survival (CSS) in a cohort of 41 patients with urachal carcinoma by use of a Bayesian model-averaging approach. Our cohort included 41 patients with urachal carcinoma who underwent extended partial cystectomy, total cystectomy, transurethral resection, chemotherapy, or radiotherapy at a single institute. All patients were classified by both the Sheldon and the Mayo staging systems according to histopathologic reports and preoperative radiologic findings. Kaplan-Meier survival curves and Cox proportional-hazards regression models were carried out to investigate prognostic factors, and a Bayesian model-averaging approach was performed to confirm the significance of each variable by using posterior probabilities. The mean age of the patients was 49.88 ± 13.80 years and the male-to-female ratio was 24:17. The median follow-up was 5.42 years (interquartile range, 2.8-8.4 years). Five- and 10-year CSS rates were 55.9% and 43.4%, respectively. Lower Sheldon (p=0.004) and Mayo (pcancer-specific mortality in urachal carcinoma. The Mayo staging system might be more effective than the Sheldon staging system. In addition, the multivariate analyses suggested that tumor size may be a prognostic factor for urachal carcinoma.
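
    A common shortcut for the model-averaging idea is to turn information-criterion values into approximate posterior model probabilities, as sketched below; the candidate models and BIC values are invented and this is not necessarily the exact weighting scheme used in the paper.

    # Approximate posterior model probabilities from BIC values (hypothetical numbers).
    import numpy as np

    models = ["Sheldon stage", "Mayo stage", "Tumor size", "Age + sex"]
    bic = np.array([152.3, 150.1, 155.8, 158.4])      # hypothetical Cox-model BICs

    delta = bic - bic.min()
    weights = np.exp(-0.5 * delta)
    weights /= weights.sum()                          # approximate P(model | data)

    for m, w in zip(models, weights):
        print(f"{m:15s}  posterior probability ~ {w:.2f}")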

  19. A Bayesian approach to sound source reconstruction: optimal basis, regularization, and focusing.

    Science.gov (United States)

    Antoni, Jérôme

    2012-04-01

    The reconstruction of acoustical sources from discrete field measurements is a difficult inverse problem that has been approached in different ways. Classical methods (beamforming, near-field acoustical holography, inverse boundary elements, wave superposition, equivalent sources, etc.) all consist--implicitly or explicitly--in interpolating the measurements onto some spatial functions whose propagation is known and in reconstructing the source field by retropropagation. This raises the fundamental question as to whether, for a given source topology and array geometry, there exists an optimal interpolation basis which minimizes the reconstruction error. This paper provides a general answer to this question, by proceeding from a Bayesian formulation that is ideally suited to combining information of physical and probabilistic natures. The main findings are the following: (1) The optimal basis functions are the M eigen-functions of a specific continuous-discrete propagation operator, with M being the number of microphones in the array. (2) The a priori inclusion of spatial information on the source field causes super-resolution according to a phenomenon coined "Bayesian focusing." (3) The approach is naturally endowed with an internal regularization mechanism and results in a robust regularization criterion with no more than one minimum. (4) It admits classical methods as particular cases.

  20. A Dynamic Bi-Orthogonal Field Equation Approach to Efficient Bayesian Inversion

    Directory of Open Access Journals (Sweden)

    Tagade Piyush M.

    2017-06-01

    Full Text Available This paper proposes a novel computationally efficient stochastic spectral projection based approach to Bayesian inversion of a computer simulator with high dimensional parametric and model structure uncertainty. The proposed method is based on the decomposition of the solution into its mean and a random field using a generic Karhunen-Loève expansion. The random field is represented as a convolution of separable Hilbert spaces in stochastic and spatial dimensions that are spectrally represented using respective orthogonal bases. In particular, the present paper investigates generalized polynomial chaos bases for the stochastic dimension and eigenfunction bases for the spatial dimension. Dynamic orthogonality is used to derive closed-form equations for the time evolution of mean, spatial and the stochastic fields. The resultant system of equations consists of a partial differential equation (PDE) that defines the dynamic evolution of the mean, a set of PDEs to define the time evolution of eigenfunction bases, while a set of ordinary differential equations (ODEs) define dynamics of the stochastic field. This system of dynamic evolution equations efficiently propagates the prior parametric uncertainty to the system response. The resulting bi-orthogonal expansion of the system response is used to reformulate the Bayesian inference for efficient exploration of the posterior distribution. The efficacy of the proposed method is investigated for calibration of a 2D transient diffusion simulator with an uncertain source location and diffusivity. The computational efficiency of the method is demonstrated against a Monte Carlo method and a generalized polynomial chaos approach.

  1. Dimensionality of the 9-item Utrecht Work Engagement Scale revisited: A Bayesian structural equation modeling approach.

    Science.gov (United States)

    Fong, Ted C T; Ho, Rainbow T H

    2015-01-01

    The aim of this study was to reexamine the dimensionality of the widely used 9-item Utrecht Work Engagement Scale using the maximum likelihood (ML) approach and Bayesian structural equation modeling (BSEM) approach. Three measurement models (1-factor, 3-factor, and bi-factor models) were evaluated in two split samples of 1,112 health-care workers using confirmatory factor analysis and BSEM, which specified small-variance informative priors for cross-loadings and residual covariances. Model fit and comparisons were evaluated by posterior predictive p-value (PPP), deviance information criterion, and Bayesian information criterion (BIC). None of the three ML-based models showed an adequate fit to the data. The use of informative priors for cross-loadings did not improve the PPP for the models. The 1-factor BSEM model with approximately zero residual covariances displayed a good fit (PPP>0.10) to both samples and a substantially lower BIC than its 3-factor and bi-factor counterparts. The BSEM results demonstrate empirical support for the 1-factor model as a parsimonious and reasonable representation of work engagement.

  2. A Bayesian state-space approach for damage detection and classification

    Science.gov (United States)

    Dzunic, Zoran; Chen, Justin G.; Mobahi, Hossein; Büyüköztürk, Oral; Fisher, John W.

    2017-11-01

    The problem of automatic damage detection in civil structures is complex and requires a system that can interpret collected sensor data into meaningful information. We apply our recently developed switching Bayesian model for dependency analysis to the problems of damage detection and classification. The model relies on a state-space approach that accounts for noisy measurement processes and missing data, which also infers the statistical temporal dependency between measurement locations signifying the potential flow of information within the structure. A Gibbs sampling algorithm is used to simultaneously infer the latent states, parameters of the state dynamics, the dependence graph, and any changes in behavior. By employing a fully Bayesian approach, we are able to characterize uncertainty in these variables via their posterior distribution and provide probabilistic estimates of the occurrence of damage or a specific damage scenario. We also implement a single class classification method which is more realistic for most real world situations where training data for a damaged structure is not available. We demonstrate the methodology with experimental test data from a laboratory model structure and accelerometer data from a real world structure during different environmental and excitation conditions.

  3. How to interpret the results of medical time series data analysis: Classical statistical approaches versus dynamic Bayesian network modeling.

    Science.gov (United States)

    Onisko, Agnieszka; Druzdzel, Marek J; Austin, R Marshall

    2016-01-01

    Classical statistics is a well-established approach in the analysis of medical data. While the medical community seems to be familiar with the concept of a statistical analysis and its interpretation, the Bayesian approach, argued by many of its proponents to be superior to the classical frequentist approach, is still not well-recognized in the analysis of medical data. The goal of this study is to encourage data analysts to use the Bayesian approach, such as modeling with graphical probabilistic networks, as an insightful alternative to classical statistical analysis of medical data. This paper offers a comparison of two approaches to analysis of medical time series data: (1) classical statistical approach, such as the Kaplan-Meier estimator and the Cox proportional hazards regression model, and (2) dynamic Bayesian network modeling. Our comparison is based on time series cervical cancer screening data collected at Magee-Womens Hospital, University of Pittsburgh Medical Center over 10 years. The main outcomes of our comparison are cervical cancer risk assessments produced by the three approaches. However, our analysis discusses also several aspects of the comparison, such as modeling assumptions, model building, dealing with incomplete data, individualized risk assessment, results interpretation, and model validation. Our study shows that the Bayesian approach is (1) much more flexible in terms of modeling effort, and (2) it offers an individualized risk assessment, which is more cumbersome for classical statistical approaches.

  4. Naturalizing sense of agency with a hierarchical event-control approach.

    Directory of Open Access Journals (Sweden)

    Devpriya Kumar

    Full Text Available Unraveling the mechanisms underlying self and agency has been a difficult scientific problem. We argue for an event-control approach for naturalizing the sense of agency by focusing on the role of perception-action regularities present at different hierarchical levels and contributing to the sense of self as an agent. The amount of control at different levels of the control hierarchy determines the sense of agency. The current study investigates this approach in a set of two experiments using a scenario containing multiple agents sharing a common goal, where one of the agents is partially controlled by the participant. The participant competed with other agents for achieving the goal and subsequently answered questions on identification (which agent was controlled by the participant), the degree to which they were confident about their identification (sense of identification), and the degree to which the participant believed he/she had control over his/her actions (sense of authorship). Results indicate a hierarchical relationship between goal-level control (higher level) and perceptual-motor control (lower level) for sense of agency. Sense of identification ratings increased with perceptual-motor control when the goal was not completed but did not vary with perceptual-motor control when the goal was completed. Sense of authorship showed a similar interaction effect only in experiment 2, which had only one competing agent, unlike the larger number of competing agents in experiment 1. The effect of hierarchical control can also be seen in the misidentification pattern, and misidentification was greater with the agent affording greater control. Results from the two studies support the event-control approach in understanding sense of agency as grounded in control. The study also offers a novel paradigm for empirically studying sense of agency and self.

  5. Précis of bayesian rationality: The probabilistic approach to human reasoning.

    Science.gov (United States)

    Oaksford, Mike; Chater, Nick

    2009-02-01

    According to Aristotle, humans are the rational animal. The borderline between rationality and irrationality is fundamental to many aspects of human life including the law, mental health, and language interpretation. But what is it to be rational? One answer, deeply embedded in the Western intellectual tradition since ancient Greece, is that rationality concerns reasoning according to the rules of logic--the formal theory that specifies the inferential connections that hold with certainty between propositions. Piaget viewed logical reasoning as defining the end-point of cognitive development; and contemporary psychology of reasoning has focussed on comparing human reasoning against logical standards. Bayesian Rationality argues that rationality is defined instead by the ability to reason about uncertainty. Although people are typically poor at numerical reasoning about probability, human thought is sensitive to subtle patterns of qualitative Bayesian, probabilistic reasoning. In Chapters 1-4 of Bayesian Rationality (Oaksford & Chater 2007), the case is made that cognition in general, and human everyday reasoning in particular, is best viewed as solving probabilistic, rather than logical, inference problems. In Chapters 5-7 the psychology of "deductive" reasoning is tackled head-on: It is argued that purportedly "logical" reasoning problems, revealing apparently irrational behaviour, are better understood from a probabilistic point of view. Data from conditional reasoning, Wason's selection task, and syllogistic inference are captured by recasting these problems probabilistically. The probabilistic approach makes a variety of novel predictions which have been experimentally confirmed. The book considers the implications of this work, and the wider "probabilistic turn" in cognitive science and artificial intelligence, for understanding human rationality.

  6. A BAYESIAN APPROACH TO DERIVING AGES OF INDIVIDUAL FIELD WHITE DWARFS

    Energy Technology Data Exchange (ETDEWEB)

    O' Malley, Erin M. [Department of Physics and Astronomy, Siena College, Loudonville, NY 12211 (United States); Von Hippel, Ted [Department of Physical Sciences, Embry-Riddle Aeronautical University, Daytona Beach, FL 32114 (United States); Van Dyk, David A., E-mail: ted.vonhippel@erau.edu, E-mail: dvandyke@imperial.ac.uk [Statistics Section, Department of Mathematics, Imperial College London, SW7 2AZ (United Kingdom)

    2013-09-20

    We apply a self-consistent and robust Bayesian statistical approach to determine the ages, distances, and zero-age main sequence (ZAMS) masses of 28 field DA white dwarfs (WDs) with ages of approximately 4-8 Gyr. Our technique requires only quality optical and near-infrared photometry to derive ages with <15% uncertainties, generally with little sensitivity to our choice of modern initial-final mass relation. We find that age, distance, and ZAMS mass are correlated in a manner that is too complex to be captured by traditional error propagation techniques. We further find that the posterior distributions of age are often asymmetric, indicating that the standard approach to deriving WD ages can yield misleading results.

  7. A novel critical infrastructure resilience assessment approach using dynamic Bayesian networks

    Science.gov (United States)

    Cai, Baoping; Xie, Min; Liu, Yonghong; Liu, Yiliu; Ji, Renjie; Feng, Qiang

    2017-10-01

    The word resilience originates from the Latin word "resiliere", which means to "bounce back". The concept has been used in various fields, such as ecology, economics, psychology, and society, with different definitions. In the field of critical infrastructure, although some resilience metrics have been proposed, they are quite different from each other, as they are determined by the performance of the objects being evaluated. Here we bridge the gap by developing a universal critical infrastructure resilience metric from the perspective of reliability engineering. A dynamic Bayesian networks-based assessment approach is proposed to calculate the resilience value. A series, parallel and voting system is used to demonstrate the application of the developed resilience metric and assessment approach.
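
    The series, parallel, and voting structures used to demonstrate the metric can be evaluated from component reliabilities in a few lines; the static sketch below uses invented reliabilities and omits the time dimension that the dynamic Bayesian network adds.

    # Static reliability of series, parallel, and k-out-of-n ("voting") structures.
    import numpy as np
    from scipy.stats import binom

    r = np.array([0.95, 0.90, 0.92])                 # component reliabilities (assumed)

    series = np.prod(r)                              # all components must work
    parallel = 1.0 - np.prod(1.0 - r)                # at least one component works
    k, n, p = 2, 3, 0.9                              # 2-out-of-3 voting, identical components
    voting = binom.sf(k - 1, n, p)                   # P(at least k of n components work)
    print(f"series={series:.4f}  parallel={parallel:.4f}  2oo3 voting={voting:.4f}")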

  8. A Bayesian approach to characterising multi-phase flows using magnetic resonance: application to bubble flows.

    Science.gov (United States)

    Holland, D J; Blake, A; Tayler, A B; Sederman, A J; Gladden, L F

    2011-03-01

    Magnetic Resonance (MR) imaging is difficult to apply to multi-phase flows due to both the inherently short T₂* characterising such systems and the relatively long time taken to acquire the data. We develop a Bayesian MR approach for analysing data in k-space that eliminates the need for image acquisition, thereby significantly extending the range of systems that can be studied. We demonstrate the technique by measuring bubble size distributions in gas-liquid flows. The MR approach is compared with an optical technique at a low gas fraction (∼2%), before being applied to a system where the gas fraction is too high for optical measurements (∼15%). Copyright © 2010 Elsevier Inc. All rights reserved.

  9. Bayesian adaptive approach to estimating sample sizes for seizures of illicit drugs.

    Science.gov (United States)

    Moroni, Rossana; Aalberg, Laura; Reinikainen, Tapani; Corander, Jukka

    2012-01-01

    A considerable amount of discussion can be found in the forensics literature about the issue of using statistical sampling to obtain, for chemical analysis, an appropriate subset of units from a police seizure suspected to contain illicit material. Use of the Bayesian paradigm has been suggested as the most suitable statistical approach to solving the question of how large a sample needs to be to serve legally and practically acceptable purposes. Here, we introduce a hypergeometric sampling model combined with a specific prior distribution for the homogeneity of the seizure, where a parameter for the analyst's expectation of homogeneity (α) is included. Our results show how an adaptive approach to sampling can minimize the practical efforts needed in the laboratory analyses, as the model allows the scientist to decide sequentially how to proceed, while maintaining a sufficiently high confidence in the conclusions. © 2011 American Academy of Forensic Sciences.
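
    A simplified beta-binomial version of the sampling question conveys the idea: after m sampled units have all tested positive, how confident are we that at least a proportion theta0 of the whole seizure is illicit? The paper's hypergeometric model with an explicit homogeneity parameter is richer; the uniform prior and threshold below are assumptions.

    # Simplified sequential sampling check with a Beta posterior (uniform prior assumed).
    from scipy.stats import beta

    alpha0, beta0 = 1.0, 1.0      # Beta prior on the illicit proportion
    theta0 = 0.9                  # proportion we want to be confident about

    for m in range(1, 31):
        posterior = beta(alpha0 + m, beta0)          # m positive units, 0 negatives so far
        prob = posterior.sf(theta0)                  # P(theta > theta0 | data)
        if prob >= 0.95:
            print(f"{m} consecutive positives give P(theta > {theta0}) = {prob:.3f}")
            break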

  10. Modelling household finances: A Bayesian approach to a multivariate two-part model.

    Science.gov (United States)

    Brown, Sarah; Ghosh, Pulak; Su, Li; Taylor, Karl

    2015-09-01

    We contribute to the empirical literature on household finances by introducing a Bayesian multivariate two-part model, which has been developed to further our understanding of household finances. Our flexible approach allows for the potential interdependence between the holding of assets and liabilities at the household level and also encompasses a two-part process to allow for differences in the influences on asset or liability holding and on the respective amounts held. Furthermore, the framework is dynamic in order to allow for persistence in household finances over time. Our findings endorse the joint modelling approach and provide evidence supporting the importance of dynamics. In addition, we find that certain independent variables exert different influences on the binary and continuous parts of the model thereby highlighting the flexibility of our framework and revealing a detailed picture of the nature of household finances.

  11. The Hierarchical Distributed Agent Based Approach to a Modern Data Center Management

    Directory of Open Access Journals (Sweden)

    Gavrilov Andrey

    2017-01-01

    Full Text Available This paper overviews and analyzes progressive trends in the modern data center and existing solutions for building a distributed cloud data center. The authors present a hierarchical distributed agent-based control plane architecture to build a web-scale control layer based on software-defined domains. The goal of this approach is to design a simple, extensible agent that could be used for any management purpose just by adding some specific code. Using this approach simplifies scaling and increases the efficiency of managing a multi-site environment. There are five main use cases for this approach: distributed cloud, hybrid cloud, hyperscale data center, IoT, and continuous integration.

  12. Introduction to Bayesian statistics

    CERN Document Server

    Bolstad, William M

    2017-01-01

    There is a strong upsurge in the use of Bayesian methods in applied statistical analysis, yet most introductory statistics texts only present frequentist methods. Bayesian statistics has many important advantages that students should learn about if they are going into fields where statistics will be used. In this Third Edition, four newly-added chapters address topics that reflect the rapid advances in the field of Bayesian statistics. The author continues to provide a Bayesian treatment of introductory statistical topics, such as scientific data gathering, discrete random variables, robust Bayesian methods, and Bayesian approaches to inference for discrete random variables, binomial proportion, Poisson, normal mean, and simple linear regression. In addition, newly-developing topics in the field are presented in four new chapters: Bayesian inference with unknown mean and variance; Bayesian inference for Multivariate Normal mean vector; Bayesian inference for Multiple Linear Regression Model; and Computati...

  13. Integrated survival analysis using an event-time approach in a Bayesian framework

    Science.gov (United States)

    Walsh, Daniel P.; Dreitz, VJ; Heisey, Dennis M.

    2015-01-01

    Event-time or continuous-time statistical approaches have been applied throughout the biostatistical literature and have led to numerous scientific advances. However, these techniques have traditionally relied on knowing failure times. This has limited the application of these analyses, particularly within the ecological field, where the fates of marked animals may be unknown. To address these limitations, we developed an integrated approach within a Bayesian framework to estimate hazard rates in the face of unknown fates. We combine failure/survival times from individuals whose fates are known, and whose times are interval-censored, with information from those whose fates are unknown, and we model the process of detecting animals with unknown fates. This provides the foundation for our integrated model and permits necessary parameter estimation. We provide the Bayesian model, its derivation, and use simulation techniques to investigate the properties and performance of our approach under several scenarios. Lastly, we apply our estimation technique using a piece-wise constant hazard function to investigate the effects of year, age, chick size and sex, sex of the tending adult, and nesting habitat on mortality hazard rates of the endangered mountain plover (Charadrius montanus) chicks. Traditional models were inappropriate for this analysis because fates of some individual chicks were unknown due to failed radio transmitters. Simulations revealed biases of posterior mean estimates were minimal (≤ 4.95%), and posterior distributions behaved as expected with RMSE of the estimates decreasing as sample sizes, detection probability, and survival increased. We determined mortality hazard rates for plover chicks were highest at weights and/or whose nest was within agricultural habitats. Based on its performance, our approach greatly expands the range of problems for which event-time analyses can be used by eliminating the need for having completely known fate data.

  14. A unifying probabilistic Bayesian approach to derive electron density from MRI for radiation therapy treatment planning.

    Science.gov (United States)

    Gudur, Madhu Sudhan Reddy; Hara, Wendy; Le, Quynh-Thu; Wang, Lei; Xing, Lei; Li, Ruijiang

    2014-11-07

    MRI significantly improves the accuracy and reliability of target delineation in radiation therapy for certain tumors due to its superior soft tissue contrast compared to CT. A treatment planning process with MRI as the sole imaging modality will eliminate systematic CT/MRI co-registration errors, reduce cost and radiation exposure, and simplify clinical workflow. However, MRI lacks the key electron density information necessary for accurate dose calculation and generating reference images for patient setup. The purpose of this work is to develop a unifying method to derive electron density from standard T1-weighted MRI. We propose to combine both intensity and geometry information into a unifying probabilistic Bayesian framework for electron density mapping. For each voxel, we compute two conditional probability density functions (PDFs) of electron density given its: (1) T1-weighted MRI intensity, and (2) geometry in a reference anatomy, obtained by deformable image registration between the MRI of the atlas and test patient. The two conditional PDFs containing intensity and geometry information are combined into a unifying posterior PDF, whose mean value corresponds to the optimal electron density value under the mean-square error criterion. We evaluated the algorithm's accuracy of electron density mapping and its ability to detect bone in the head for eight patients, using an additional patient as the atlas or template. Mean absolute HU error between the estimated and true CT, as well as receiver operating characteristics for bone detection (HU > 200) were calculated. The performance was compared with a global intensity approach based on T1 and no density correction (set whole head to water). The proposed technique significantly reduced the errors in electron density estimation, with a mean absolute HU error of 126, compared with 139 for deformable registration (p = 2 × 10^-4), 283 for the intensity approach (p = 2 × 10^-6) and 282 without density
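
    At the level of a single voxel, the fusion step reduces to multiplying two Gaussian conditional PDFs, which yields a precision-weighted posterior mean; the one-voxel sketch below uses invented means and standard deviations and ignores the deformable-registration machinery.

    # One-voxel fusion of intensity-conditional and geometry-conditional Gaussian PDFs.
    import numpy as np

    mu_int, sd_int = 250.0, 120.0     # density (HU) given MRI intensity (assumed)
    mu_geo, sd_geo = 180.0, 60.0      # density (HU) given atlas geometry (assumed)

    w_int, w_geo = 1.0 / sd_int**2, 1.0 / sd_geo**2
    mu_post = (w_int * mu_int + w_geo * mu_geo) / (w_int + w_geo)   # posterior mean
    sd_post = np.sqrt(1.0 / (w_int + w_geo))
    print(f"fused estimate = {mu_post:.1f} HU (sd {sd_post:.1f})")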

  15. A Bayesian Approach to Social Structure Uncovers Cryptic Regulation of Group Dynamics in Drosophila melanogaster.

    Science.gov (United States)

    Foley, Brad R; Saltz, Julia B; Nuzhdin, Sergey V; Marjoram, Paul

    2015-06-01

    Understanding the mechanisms that give rise to social structure is central to predicting the evolutionary and ecological outcomes of social interactions. Modeling this process is challenging, because all individuals simultaneously behave in ways that shape their social environments--a process called social niche construction (SNC). In earlier work, we demonstrated that aggression acts as an SNC trait in fruit flies (Drosophila melanogaster), but the mechanisms of that process remained cryptic. Here, we analyze how individual social group preferences generate overall social structure. We use a combination of agent-based simulation and approximate Bayesian computation to fit models to empirical data. We confirm that genetic variation in aggressive behavior influences social group structure. Furthermore, we find that female decamping due to male behavior may play an underappreciated role in structuring social groups. Male-male aggression may sometimes destabilize groups, but it may also be an SNC behavior for shaping desirable groups for females. Density intensifies female social preferences; thus, the role of female behavior in shaping group structure may become more important at high densities. Our ability to model the ontogeny of group structure demonstrates the utility of the Bayesian model-based approach in social behavioral studies.
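
    The model-fitting strategy (approximate Bayesian computation wrapped around an agent-based simulator) can be sketched with a toy rejection-ABC loop; the simulator, summary statistic, prior, observed value, and tolerance below are all invented for illustration and are far simpler than the fly-group simulations in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_mean_group_size(aggression):
    """Toy stand-in for an agent-based simulator: higher aggression -> smaller groups."""
    return 10.0 / (1.0 + aggression) + rng.normal(0, 0.5)

observed = 4.0          # hypothetical observed mean group size
tolerance = 0.3
accepted = []
for _ in range(20000):
    theta = rng.uniform(0.0, 5.0)                        # prior draw for "aggression"
    if abs(simulate_mean_group_size(theta) - observed) < tolerance:
        accepted.append(theta)                           # keep draws that match the data

print(len(accepted), float(np.mean(accepted)))           # approximate posterior sample
```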

  16. Estimation of under-reported visceral leishmaniasis (VL) cases in Bihar: a Bayesian approach

    Directory of Open Access Journals (Sweden)

    A Ranjan

    2013-12-01

    Full Text Available Background: Visceral leishmaniasis (VL) is a major health problem in the state of Bihar and adjoining areas in India. In the absence of any active surveillance mechanism for the disease, there seems to be gross under-reporting of VL cases. Objective: The objective of this study was to estimate the extent of under-reporting of VL cases in Bihar using a pooled analysis of published papers. Method: We calculated the pooled common ratio (RRMH) based on three studies and combined it with a prior distribution of the ratio using the inverse-variance weighting method. A Bayesian method was used to estimate the posterior distribution of the “under-reporting factor” (ratio of unreported to reported cases). Results: The posterior distribution of the ratio of unreported to reported cases yielded a mean of 3.558, with 95% posterior limits of 2.81 and 4.50. Conclusion: The Bayesian approach gives evidence to the fact that the total number of VL cases in the state may be more than three times the currently reported figure.
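
    A minimal sketch of the kind of conjugate, inverse-variance combination described above is given below; the prior, the pooled estimate, and their variances are placeholders, not the values used in the study.

```python
import numpy as np

# Illustrative normal-normal combination on the log(ratio) scale; all numbers are assumed.
prior_mean, prior_var = np.log(3.0), 0.10     # prior on log(under-reporting ratio)
data_mean,  data_var  = np.log(3.6), 0.05     # pooled estimate on the log scale

post_var  = 1.0 / (1.0 / prior_var + 1.0 / data_var)          # precision-weighted variance
post_mean = post_var * (prior_mean / prior_var + data_mean / data_var)

ratio = np.exp(post_mean)                                      # posterior ratio (back-transformed)
ci = np.exp(post_mean + np.array([-1.96, 1.96]) * np.sqrt(post_var))
print(round(float(ratio), 2), np.round(ci, 2))
```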

  17. Jointly modeling time-to-event and longitudinal data: A Bayesian approach.

    Science.gov (United States)

    Huang, Yangxin; Hu, X Joan; Dagne, Getachew A

    2014-03-01

    This article explores Bayesian joint models of event times and longitudinal measures with an attempt to overcome departures from normality of the longitudinal response, measurement errors, and a lack of confidence in specifying a parametric time-to-event model. We allow the longitudinal response to have a skew distribution in the presence of measurement errors, and assume the time-to-event variable to have a nonparametric prior distribution. Posterior distributions of the parameters are attained simultaneously for inference based on the Bayesian approach. An example from a recent AIDS clinical trial illustrates the methodology by jointly modeling the viral dynamics and the time to decrease in the CD4/CD8 ratio in the presence of CD4 counts with measurement errors, and by comparing potential models under various scenarios and different distribution specifications. The analysis outcome indicates that the time-varying CD4 covariate is closely related to the first-phase viral decay rate, but that the time to CD4/CD8 decrease is not highly associated with either the two viral decay rates or the CD4 changing rate over time. These findings may provide some quantitative guidance to better understand the relationship of the virological and immunological responses to antiretroviral treatments.

  18. Assessment of successful smoking cessation by psychological factors using the Bayesian network approach.

    Science.gov (United States)

    Yang, Xiaorong; Li, Suyun; Pan, Lulu; Wang, Qiang; Li, Huijie; Han, Mingkui; Zhang, Nan; Jiang, Fan; Jia, Chongqi

    2016-07-01

    The association between psychological factors and smoking cessation is complicated and inconsistent in the published research, and the joint effect of psychological factors on smoking cessation is unclear. This study explored how psychological factors jointly affect the success of smoking cessation using a Bayesian network approach. A community-based case-control study was designed with 642 adult males who had successfully quit smoking as cases and 700 adult males who had failed to quit as controls. General self-efficacy (GSE), trait coping style (positive-trait coping style (PTCS) and negative-trait coping style (NTCS)) and self-rating anxiety (SA) were evaluated by the GSE Scale, the Trait Coping Style Questionnaire and the SA Scale, respectively. A Bayesian network was applied to evaluate the relationship between psychological factors and successful smoking cessation. The local conditional probability table of smoking cessation indicated that different joint conditions of psychological factors led to different outcomes for smoking cessation. Among smokers with high PTCS, high NTCS and low SA, only 36.40% successfully quit smoking. However, among smokers with low pack-years of smoking, high GSE, high PTCS and high SA, 63.64% successfully quit smoking. Our study indicates that psychological factors jointly influence the smoking cessation outcome. According to different joint situations, different solutions should be developed for tobacco control in practical interventions.
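
    As a toy illustration of querying a local conditional probability table such as the one reported for the smoking-cessation node, the sketch below uses an invented table; the variable levels and the probabilities are assumptions, not the fitted network.

```python
# Hypothetical conditional probability table: (GSE, PTCS, NTCS, SA) -> P(successful quit).
# The structure and all probabilities are invented for illustration only.
cpt_quit = {
    ("high", "high", "high", "low"):  0.36,
    ("high", "high", "low",  "high"): 0.64,
    ("low",  "low",  "high", "high"): 0.20,
}

def p_quit(gse, ptcs, ntcs, sa, default=0.5):
    """Look up P(quit | psychological profile); fall back to a default if the combination is unseen."""
    return cpt_quit.get((gse, ptcs, ntcs, sa), default)

print(p_quit("high", "high", "high", "low"))
```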

  19. Disease mapping and regression with count data in the presence of overdispersion and spatial autocorrelation: a Bayesian model averaging approach.

    Science.gov (United States)

    Mohebbi, Mohammadreza; Wolfe, Rory; Forbes, Andrew

    2014-01-09

    This paper applies the generalised linear model for modelling geographical variation to esophageal cancer incidence data in the Caspian region of Iran. The data have a complex and hierarchical structure that makes them suitable for hierarchical analysis using Bayesian techniques, but with care required to deal with problems arising from counts of events observed in small geographical areas when overdispersion and residual spatial autocorrelation are present. These considerations lead to nine regression models derived from using three probability distributions for count data: Poisson, generalised Poisson and negative binomial, and three different autocorrelation structures. We employ the framework of Bayesian variable selection and a Gibbs sampling based technique to identify significant cancer risk factors. The framework deals with situations where the number of possible models based on different combinations of candidate explanatory variables is large enough such that calculation of posterior probabilities for all models is difficult or infeasible. The evidence from applying the modelling methodology suggests that modelling strategies based on the use of generalised Poisson and negative binomial with spatial autocorrelation work well and provide a robust basis for inference.

  20. Disease Mapping and Regression with Count Data in the Presence of Overdispersion and Spatial Autocorrelation: A Bayesian Model Averaging Approach

    Directory of Open Access Journals (Sweden)

    Mohammadreza Mohebbi

    2014-01-01

    Full Text Available This paper applies the generalised linear model for modelling geographical variation to esophageal cancer incidence data in the Caspian region of Iran. The data have a complex and hierarchical structure that makes them suitable for hierarchical analysis using Bayesian techniques, but with care required to deal with problems arising from counts of events observed in small geographical areas when overdispersion and residual spatial autocorrelation are present. These considerations lead to nine regression models derived from using three probability distributions for count data: Poisson, generalised Poisson and negative binomial, and three different autocorrelation structures. We employ the framework of Bayesian variable selection and a Gibbs sampling based technique to identify significant cancer risk factors. The framework deals with situations where the number of possible models based on different combinations of candidate explanatory variables is large enough such that calculation of posterior probabilities for all models is difficult or infeasible. The evidence from applying the modelling methodology suggests that modelling strategies based on the use of generalised Poisson and negative binomial with spatial autocorrelation work well and provide a robust basis for inference.

  1. Disease Mapping and Regression with Count Data in the Presence of Overdispersion and Spatial Autocorrelation: A Bayesian Model Averaging Approach

    Science.gov (United States)

    Mohebbi, Mohammadreza; Wolfe, Rory; Forbes, Andrew

    2014-01-01

    This paper applies the generalised linear model for modelling geographical variation to esophageal cancer incidence data in the Caspian region of Iran. The data have a complex and hierarchical structure that makes them suitable for hierarchical analysis using Bayesian techniques, but with care required to deal with problems arising from counts of events observed in small geographical areas when overdispersion and residual spatial autocorrelation are present. These considerations lead to nine regression models derived from using three probability distributions for count data: Poisson, generalised Poisson and negative binomial, and three different autocorrelation structures. We employ the framework of Bayesian variable selection and a Gibbs sampling based technique to identify significant cancer risk factors. The framework deals with situations where the number of possible models based on different combinations of candidate explanatory variables is large enough such that calculation of posterior probabilities for all models is difficult or infeasible. The evidence from applying the modelling methodology suggests that modelling strategies based on the use of generalised Poisson and negative binomial with spatial autocorrelation work well and provide a robust basis for inference. PMID:24413702

  2. A simple approach to fabricate the rose petal-like hierarchical surfaces for droplet transportation

    Science.gov (United States)

    Yuan, Chao; Huang, Mengyu; Yu, Xingjian; Ma, Yupu; Luo, Xiaobing

    2016-11-01

    Precise transportation of liquid microdroplets is a great challenge in the microfluidic field. A sticky superhydrophobic surface with a high static contact angle (CA) and a large contact angle hysteresis (CAH) is recognized as a favorable tool for this challenging job. Some approaches have been proposed to fabricate such surfaces, such as mimicking the dual-scale hierarchical structure of a natural material like the rose petal. However, the available approaches normally require multiple processing steps or are carried out at great expense. In this study, we report a straightforward and inexpensive method for fabricating sticky superhydrophobic surfaces. The fabrication relies on electroless galvanic deposition to coat copper substrates with a textured layer of silver. The whole fabrication process is carried out under ambient conditions using conventional laboratory materials and equipment, and generally takes less than 15 min. Despite the simplicity of this fabrication method, the rose petal-like hierarchical structures and the corresponding sticky superhydrophobic wetting properties were well achieved on the artificial surfaces. For instance, the surface with a deposition time of 10 s exhibits superhydrophobicity with a CA of 151.5° and effective stickiness with a CAH of 56.5°. The prepared sticky superhydrophobic surfaces are finally demonstrated in a droplet transportation application, in which the surface acts as a mechanical hand to grasp and transport a water droplet.

  3. A Bayesian approach for temporally scaling climate for modeling ecological systems

    Science.gov (United States)

    Post van der Burg, Max; Anteau, Michael J.; McCauley, Lisa A.; Wiltermuth, Mark T.

    2016-01-01

    With climate change becoming more of a concern, many ecologists are including climate variables in their system and statistical models. The Standardized Precipitation Evapotranspiration Index (SPEI) is a drought index that has potential advantages in modeling ecological response variables, including a flexible computation of the index over different timescales. However, little development has been made in terms of the choice of timescale for SPEI. We developed a Bayesian modeling approach for estimating the timescale for SPEI and demonstrated its use in modeling wetland hydrologic dynamics in two different eras (i.e., historical [pre-1970] and contemporary [post-2003]). Our goal was to determine whether differences in climate between the two eras could explain changes in the amount of water in wetlands. Our results showed that wetland water surface areas tended to be larger in wetter conditions, but also changed less in response to climate fluctuations in the contemporary era. We also found that the average timescale parameter was greater in the historical period compared with the contemporary period. We were not able to determine whether this shift in timescale was due to a change in the timing of wet–dry periods or to changes in the way wetlands responded to climate. Our results suggest that perhaps some interaction between climate and hydrologic response may be at work, and further analysis is needed to determine which has a stronger influence. Despite this, we suggest that our modeling approach enabled us to estimate the relevant timescale for SPEI and make inferences from those estimates. Likewise, our approach provides a mechanism for using prior information with future data to assess whether these patterns may continue over time. We suggest that ecologists consider using temporally scalable climate indices in conjunction with Bayesian analysis for assessing the role of climate in ecological systems.

  4. Refining mass formulas for astrophysical applications: A Bayesian neural network approach

    Science.gov (United States)

    Utama, R.; Piekarewicz, J.

    2017-10-01

    Background: Exotic nuclei, particularly those near the drip lines, are at the core of one of the fundamental questions driving nuclear structure and astrophysics today: What are the limits of nuclear binding? Exotic nuclei play a critical role in both informing theoretical models as well as in our understanding of the origin of the heavy elements. Purpose: Our aim is to refine existing mass models through the training of an artificial neural network that will mitigate the large model discrepancies far away from stability. Methods: The basic paradigm of our two-pronged approach is an existing mass model that captures as much as possible of the underlying physics followed by the implementation of a Bayesian neural network (BNN) refinement to account for the missing physics. Bayesian inference is employed to determine the parameters of the neural network so that model predictions may be accompanied by theoretical uncertainties. Results: Despite the undeniable quality of the mass models adopted in this work, we observe a significant improvement (of about 40%) after the BNN refinement is implemented. Indeed, in the specific case of the Duflo-Zuker mass formula, we find that the rms deviation relative to experiment is reduced from σrms=0.503 MeV to σrms=0.286 MeV. These newly refined mass tables are used to map the neutron drip lines (or rather "drip bands") and to study a few critical r-process nuclei. Conclusions: The BNN approach is highly successful in refining the predictions of existing mass models. In particular, the large discrepancy displayed by the original "bare" models in regions where experimental data are unavailable is considerably quenched after the BNN refinement. This lends credence to our approach and has motivated us to publish refined mass tables that we trust will be helpful for future astrophysical applications.
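
    The two-pronged refinement idea (base model plus a learned correction for the missing physics) can be sketched on synthetic data; the example below substitutes an ordinary scikit-learn neural network for the Bayesian neural network, so it reproduces only the point-prediction part of the approach and none of its uncertainty quantification. All data and the toy "mass model" are invented.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(5)

# Synthetic stand-in data: (Z, N) pairs, a crude "mass model", and fake experimental values.
ZN = rng.integers(20, 120, size=(500, 2)).astype(float)
base_model = 8.0 * ZN.sum(axis=1)                                   # toy base prediction
experiment = base_model + 0.5 * np.sin(ZN[:, 0] / 10.0) + rng.normal(0, 0.1, 500)

residual = experiment - base_model                                   # "missing physics"
net = MLPRegressor(hidden_layer_sizes=(20, 20), max_iter=5000, random_state=0)
net.fit(ZN, residual)                                                # learn the correction

refined = base_model + net.predict(ZN)                               # refined predictions
print(float(np.std(experiment - base_model)), float(np.std(experiment - refined)))
```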

  5. Test anxiety and the hierarchical model of approach and avoidance achievement motivation.

    Science.gov (United States)

    Elliot, A J; McGregor, H A

    1999-04-01

    This research was designed to incorporate the test anxiety (TA) construct into the hierarchical model of approach and avoidance achievement motivation. Hypotheses regarding state and trait TA were tested in 2 studies, and the results provided strong support for the predictions. State TA (specifically, worry) was documented as a mediator of the negative relationship between performance-avoidance goals and exam performance. The positive relationship between performance-approach goals and exam performance was shown to be independent of TA processes. A series of analyses documented the conceptual and functional convergence of trait TA and fear of failure (FOF), and further validation of the proposed integration was obtained by testing trait TA/FOF and state TA together in the same model. Mastery goals were positively and performance-avoidance goals negatively related to long-term retention.

  6. A hierarchical approach to ecological assessment of contaminated soils at Aberdeen Proving Ground, USA

    Energy Technology Data Exchange (ETDEWEB)

    Kuperman, R.G.

    1995-12-31

    Despite the expansion of environmental toxicology studies over the past decade, soil ecosystems have largely been ignored in ecotoxicological studies in the United States. The objective of this project was to develop and test the efficacy of a comprehensive methodology for assessing ecological impacts of soil contamination. A hierarchical approach that integrates biotic parameters and ecosystem processes was used to give insight into the mechanisms that lead to alterations in the structure and function of soil ecosystems in contaminated areas. This approach involved (1) a thorough survey of the soil biota to determine community structure, (2) laboratory and field tests on critical ecosystem processes, (3) toxicity trials, and (4) the use of spatial analyses to provide input to the decision-making process. This methodology appears to offer an efficient and potentially cost-saving tool for remedial investigations of contaminated sites.

  7. Modeling clinical outcome using multiple correlated functional biomarkers: A Bayesian approach.

    Science.gov (United States)

    Long, Qi; Zhang, Xiaoxi; Zhao, Yize; Johnson, Brent A; Bostick, Roberd M

    2016-04-01

    In some biomedical studies, biomarkers are measured repeatedly along some spatial structure or over time and are subject to measurement error. In these studies, it is often of interest to evaluate associations between a clinical endpoint and these biomarkers (also known as functional biomarkers). There are potentially two levels of correlation in such data, namely, between repeated measurements of a biomarker from the same subject and between multiple biomarkers from the same subject; none of the existing methods accounts for correlation between multiple functional biomarkers. We propose a Bayesian approach to model a clinical outcome of interest (e.g. risk for colorectal cancer) in the presence of multiple functional biomarkers while accounting for potential correlation. Our simulations show that the proposed approach achieves good performance in finite samples under various settings. In the presence of substantial or moderate correlation, the proposed approach outperforms an existing approach that does not account for correlation. The proposed approach is applied to a study of biomarkers of risk for colorectal neoplasms and our results show that the risk for colorectal cancer is associated with two functional biomarkers, APC and TGF-α, in particular, with their values in the region between the proliferating and differentiating zones of colorectal crypts.

  8. A Bayesian belief network approach for assessing uncertainty in conceptual site models at contaminated sites

    DEFF Research Database (Denmark)

    Thomsen, Nanna Isbak; Binning, Philip John; McKnight, Ursula S.

    2016-01-01

    A key component in risk assessment of contaminated sites is the formulation of a conceptual site model (CSM). A CSM is a simplified representation of reality and forms the basis for the mathematical modeling of contaminant fate and transport at the site. Uncertainty in the CSM is considered to be a major source of model error and should therefore be accounted for when evaluating uncertainties in risk assessments. We present a Bayesian belief network (BBN) approach for constructing CSMs and assessing their uncertainty at contaminated sites. BBNs are graphical probabilistic models. The approach is demonstrated for a site contaminated with chlorinated ethenes, where four different CSMs are developed by combining two contaminant source zone interpretations (presence or absence of a separate phase contamination) and two geological interpretations (fractured or unfractured clay till). The beliefs in each of the CSMs are assessed sequentially based on the available site data.

  9. An efficient multiple particle filter based on the variational Bayesian approach

    KAUST Repository

    Ait-El-Fquih, Boujemaa

    2015-12-07

    This paper addresses the filtering problem in large-dimensional systems, in which conventional particle filters (PFs) remain computationally prohibitive owing to the large number of particles needed to obtain reasonable performance. To overcome this drawback, a class of multiple particle filters (MPFs) has been recently introduced in which the state-space is split into low-dimensional subspaces, and then a separate PF is applied to each subspace. In this paper, we adopt the variational Bayesian (VB) approach to propose a new MPF, the VBMPF. The proposed filter is computationally more efficient since the propagation of each particle requires generating only one (new) particle, while in the standard MPFs a set of (children) particles needs to be generated. In a numerical test, the proposed VBMPF behaves better than the PF and MPF.

  10. A Bayesian Approach to the Orientations of Central Alentejo Megalithic Enclosures

    Science.gov (United States)

    Pimenta, Fernando; Tirapicos, Luís; Smith, Andrew

    2009-12-01

    In this work we have conducted a study of the orientations in the landscape of twelve megalithic enclosures in the Alentejo region of southern Portugal. Some of these sites date back to the sixth or fifth millennium B.C. and are among the oldest stone enclosures in Europe. The results of the survey show a pattern toward eastern rising orientations. We used dedicated GIS software developed by one of the authors to produce horizon profiles and applied a statistical Bayesian approach in an attempt to check how well the data fit different models. In particular, we tested our results for a possible ritual interest in the Autumn or Harvest Full Moon and discuss previous studies by Michael Hoskin and colleagues on the orientations of seven stone dolmens of this area, which have shown the existence of a possible custom of orientation toward the sunrise.

  11. Applying a Bayesian Approach to Identification of Orthotropic Elastic Constants from Full Field Displacement Measurements

    Directory of Open Access Journals (Sweden)

    Le Riche R.

    2010-06-01

    Full Text Available A major challenge in the identification of material properties is handling different sources of uncertainty in the experiment and in the modelling of the experiment, in order to estimate the resulting uncertainty in the identified properties. Numerous improvements in identification methods have provided increasingly accurate estimates of various material properties. However, characterizing the uncertainty in the identified properties is still relatively crude. Different material properties obtained from a single test are not obtained with the same confidence. Typically, the highest uncertainty is associated with the properties to which the experiment is least sensitive. In addition, the uncertainty in different properties can be strongly correlated, so that obtaining only variance estimates may be misleading. A possible approach for handling the different sources of uncertainty and estimating the uncertainty in the identified properties is the Bayesian method. This method was introduced in the late 1970s in the context of identification [1] and has since been applied to different problems, notably identification of elastic constants from plate vibration experiments [2]-[4]. The applications of the method to these classical pointwise tests involved only a small number of measurements (typically ten natural frequencies in the previously cited vibration test), which facilitated the application of the Bayesian approach. For identifying elastic constants, full field strain or displacement measurements provide a high number of measured quantities (one measurement per image pixel) and hence the promise of smaller uncertainties in the properties. However, the high number of measurements also represents a major computational challenge in applying the Bayesian approach to full field measurements. To address this challenge we propose an approach based on the proper orthogonal decomposition (POD) of the full fields in order to drastically reduce their

  12. Faculty Development for Fostering Clinical Reasoning Skills in Early Medical Students Using a Modified Bayesian Approach.

    Science.gov (United States)

    Addy, Tracie Marcella; Hafler, Janet; Galerneau, France

    2016-01-01

    Clinical reasoning is a necessary skill for medical students to acquire in the course of their education, and there is evidence that they can start this process at the undergraduate level. However, physician educators who are experts in their given fields may have difficulty conveying their complex thought processes to students. Providing faculty development that equips educators with tools to teach clinical reasoning may support skill development in early medical students. We provided faculty development on a modified Bayesian method of teaching clinical reasoning to clinician educators who facilitated small-group, case-based workshops with 2nd-year medical students. We interviewed them before and after the module regarding their perceptions on teaching clinical reasoning. We solicited feedback from the students about the effectiveness of the method in developing their clinical reasoning skills. We carried out this project during an institutional curriculum rebuild where clinical reasoning was a defined goal. At the time of the intervention, there was also increased involvement of the Teaching and Learning Center in elevating the status of teaching and learning. There was high overall satisfaction with the faculty development program. Both the faculty and the students described the modified Bayesian approach as effective in fostering the development of clinical reasoning skills. Through this work, we learned how to form a beneficial partnership between a clinician educator and Teaching and Learning Center to promote faculty development on a clinical reasoning teaching method for early medical students. We uncovered challenges faced by both faculty and early learners in this study. We observed that our faculty chose to utilize the method of teaching clinical reasoning in a variety of manners in the classroom. Despite obstacles and differing approaches utilized, we believe that this model can be emulated at other institutions to foster the development of clinical

  13. Genome-wide identification of conserved intronic non-coding sequences using a Bayesian segmentation approach.

    Science.gov (United States)

    Algama, Manjula; Tasker, Edward; Williams, Caitlin; Parslow, Adam C; Bryson-Richardson, Robert J; Keith, Jonathan M

    2017-03-27

    Computational identification of non-coding RNAs (ncRNAs) is a challenging problem. We describe a genome-wide analysis using Bayesian segmentation to identify intronic elements highly conserved between three evolutionarily distant vertebrate species: human, mouse and zebrafish. We investigate the extent to which these elements include ncRNAs (or conserved domains of ncRNAs) and regulatory sequences. We identified 655 deeply conserved intronic sequences in a genome-wide analysis. We also performed a pathway-focussed analysis on genes involved in muscle development, detecting 27 intronic elements, of which 22 were not detected in the genome-wide analysis. At least 87% of the genome-wide and 70% of the pathway-focussed elements have existing annotations indicative of conserved RNA secondary structure. The expression of 26 of the pathway-focused elements was examined using RT-PCR, providing confirmation that they include expressed ncRNAs. Consistent with previous studies, these elements are significantly over-represented in the introns of transcription factors. This study demonstrates a novel, highly effective, Bayesian approach to identifying conserved non-coding sequences. Our results complement previous findings that these sequences are enriched in transcription factors. However, in contrast to previous studies which suggest the majority of conserved sequences are regulatory factor binding sites, the majority of conserved sequences identified using our approach contain evidence of conserved RNA secondary structures, and our laboratory results suggest most are expressed. Functional roles at DNA and RNA levels are not mutually exclusive, and many of our elements possess evidence of both. Moreover, ncRNAs play roles in transcriptional and post-transcriptional regulation, and this may contribute to the over-representation of these elements in introns of transcription factors. We attribute the higher sensitivity of the pathway-focussed analysis compared to the genome

  14. Inference of reactive transport model parameters using a Bayesian multivariate approach

    Science.gov (United States)

    Carniato, Luca; Schoups, Gerrit; van de Giesen, Nick

    2014-08-01

    Parameter estimation of subsurface transport models from multispecies data requires the definition of an objective function that includes different types of measurements. Common approaches are weighted least squares (WLS), where weights are specified a priori for each measurement, and weighted least squares with weight estimation (WLS(we)) where weights are estimated from the data together with the parameters. In this study, we formulate the parameter estimation task as a multivariate Bayesian inference problem. The WLS and WLS(we) methods are special cases in this framework, corresponding to specific prior assumptions about the residual covariance matrix. The Bayesian perspective allows for generalizations to cases where residual correlation is important and for efficient inference by analytically integrating out the variances (weights) and selected covariances from the joint posterior. Specifically, the WLS and WLS(we) methods are compared to a multivariate (MV) approach that accounts for specific residual correlations without the need for explicit estimation of the error parameters. When applied to inference of reactive transport model parameters from column-scale data on dissolved species concentrations, the following results were obtained: (1) accounting for residual correlation between species provides more accurate parameter estimation for high residual correlation levels whereas its influence for predictive uncertainty is negligible, (2) integrating out the (co)variances leads to an efficient estimation of the full joint posterior with a reduced computational effort compared to the WLS(we) method, and (3) in the presence of model structural errors, none of the methods is able to identify the correct parameter values.

  15. Effects of Green Tea Gargling on the Prevention of Influenza Infection: An Analysis Using Bayesian Approaches.

    Science.gov (United States)

    Ide, Kazuki; Kawasaki, Yohei; Akutagawa, Maiko; Yamada, Hiroshi

    2017-02-01

    The aim of this study is to analyze the data obtained from a randomized trial on the prevention of influenza by gargling with green tea, which gave nonsignificant results based on frequentist approaches, using Bayesian approaches. The posterior proportion, with 95% credible interval (CrI), of influenza in each group was calculated. The Bayesian index θ is the probability that a hypothesis is true. In this case, θ is the probability that the hypothesis that green tea gargling reduced influenza compared with water gargling is true. Univariate and multivariate logistic regression analyses were also performed by using the Markov chain Monte Carlo method. The full analysis set included 747 participants. During the study period, influenza occurred in 44 participants (5.9%). The difference between the two independent binomial proportions was -0.019 (95% CrI, -0.054 to 0.015; θ = 0.87). The partial regression coefficients in the univariate analysis were -0.35 (95% CrI, -1.00 to 0.24) with use of a uniform prior and -0.34 (95% CrI, -0.96 to 0.27) with use of a Jeffreys prior. In the multivariate analysis, the values were -0.37 (95% CrI, -0.96 to 0.30) and -0.36 (95% CrI, -1.03 to 0.21), respectively. The difference between the two independent binomial proportions was less than 0, and θ was greater than 0.85. Therefore, green tea gargling may slightly reduce influenza compared with water gargling. This analysis suggests that green tea gargling can be an additional preventive measure for use with other pharmaceutical and nonpharmaceutical measures and indicates the need for additional studies to confirm the effect of green tea gargling.
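
    The Bayesian comparison of two proportions and the index θ can be reproduced in a few lines with Beta posteriors and Monte Carlo sampling; the per-arm counts below are assumptions chosen only to be roughly consistent with the totals quoted in the abstract, not the trial's actual arm-level data.

```python
import numpy as np
from scipy.stats import beta

# Assumed (hypothetical) arm-level counts: influenza cases / participants per group.
flu_tea, n_tea = 19, 384
flu_water, n_water = 25, 363

# Beta(1, 1) priors give Beta posteriors for each proportion.
p_tea = beta.rvs(1 + flu_tea, 1 + n_tea - flu_tea, size=100_000, random_state=1)
p_water = beta.rvs(1 + flu_water, 1 + n_water - flu_water, size=100_000, random_state=2)

diff = p_tea - p_water
theta = np.mean(p_tea < p_water)   # P(green tea arm has lower influenza risk)
print(np.round([diff.mean(), *np.percentile(diff, [2.5, 97.5]), theta], 3))
```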

  16. Exploring predictions of abundance from body mass using hierarchical comparative approaches.

    Science.gov (United States)

    McGill, Brian J

    2008-07-01

    Understanding and predicting how and why abundance varies is one of the central questions in ecology. One of the few consistent predictors of variation in abundance between species has been body mass, but the nature of this relationship has been contentious. Here I explore the relationship between body mass and abundance in birds of North America, using hierarchical partitioning of variance and regressions at taxonomic levels above the species. These analyses show that much variation in abundance is found across space, while a moderate amount of variation is found at the species/genus and also at the family/order level. However, body size and trophic level primarily vary at the family/order level, suggesting that mechanisms based on body size and energy should primarily explain only this moderate-sized, taxonomically conserved component of variation in abundance. Body size does explain more than 50% of the variation at this level (and almost 75% when trophic level is also included). This tighter relationship makes clear that energetic equivalence (slope = -3/4) sets an upper limit but does not describe the relationship between body mass and average abundance for birds of North America. Finally, I suggest that this hierarchical, multivariate approach should be used more often in macroecology.

  17. Multi-scale hierarchical approach for parametric mapping: assessment on multi-compartmental models.

    Science.gov (United States)

    Rizzo, G; Turkheimer, F E; Bertoldo, A

    2013-02-15

    This paper investigates a new hierarchical method to apply basis functions to mono- and multi-compartmental models (Hierarchical-Basis Function Method, H-BFM) at the voxel level. This method identifies the parameters of the compartmental model in its nonlinearized version, integrating information derived at the region of interest (ROI) level by segmenting the cerebral volume based on anatomical definition or functional clustering. We present the results obtained by using a two-tissue, four-rate-constant model with two different tracers ([11C]FLB457 and [carbonyl-11C]WAY100635), one of the most complex models used in receptor studies, especially at the voxel level. H-BFM is robust, and its application to both [11C]FLB457 and [carbonyl-11C]WAY100635 allows accurate and precise parameter estimates, good-quality parametric maps and a low percentage of voxels outside physiological bounds when modeling at the voxel level. In particular, different from other proposed approaches, this method can also be used when the linearization of the model is not appropriate. We expect that applying it to clinical data will generate reliable parametric maps.

  18. What would judgment and decision making research be like if we took a Bayesian approach to hypothesis testing?

    Directory of Open Access Journals (Sweden)

    William J. Matthews

    2011-12-01

    Full Text Available Judgment and decision making research overwhelmingly uses null hypothesis significance testing as the basis for statistical inference. This article examines an alternative, Bayesian approach which emphasizes the choice between two competing hypotheses and quantifies the balance of evidence provided by the data; one consequence of this is that experimental results may be taken to strongly favour the null hypothesis. We apply a recently-developed "Bayesian t-test" to existing studies of the anchoring effect in judgment, and examine how the change in approach affects both the tone of hypothesis testing and the substantive conclusions that one draws. We compare the Bayesian approach with Fisherian and Neyman-Pearson testing, examining its relationship to conventional p-values, the influence of effect size, and the importance of prior beliefs about the likely state of nature. The results give a sense of how Bayesian hypothesis testing might be applied to judgment and decision making research, and of both the advantages and challenges that a shift to this approach would entail.
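
    The flavour of re-analysing an anchoring experiment with a Bayes factor can be sketched as follows; note this uses a simple BIC-based approximation on simulated data rather than the JZS-style Bayesian t-test applied in the article.

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulated anchoring data (hypothetical): numeric estimates after a low vs. high anchor.
low_anchor  = rng.normal(50, 10, 40)
high_anchor = rng.normal(57, 10, 40)
y = np.concatenate([low_anchor, high_anchor])
n = y.size

def bic(rss, k):
    """BIC for a Gaussian model with k mean parameters and residual sum of squares rss."""
    return n * np.log(rss / n) + k * np.log(n)

rss0 = np.sum((y - y.mean()) ** 2)                                   # H0: one common mean
rss1 = (np.sum((low_anchor - low_anchor.mean()) ** 2)
        + np.sum((high_anchor - high_anchor.mean()) ** 2))           # H1: two group means

bf10 = np.exp((bic(rss0, 1) - bic(rss1, 2)) / 2)   # approximate evidence for an anchoring effect
print(round(float(bf10), 2))
```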

  19. Psychological Needs, Engagement, and Work Intentions: A Bayesian Multi-Measurement Mediation Approach and Implications for HRD

    Science.gov (United States)

    Shuck, Brad; Zigarmi, Drea; Owen, Jesse

    2015-01-01

    Purpose: The purpose of this study was to empirically examine the utility of self-determination theory (SDT) within the engagement-performance linkage. Design/methodology/approach: Bayesian multi-measurement mediation modeling was used to estimate the relation between SDT, engagement and a proxy measure of performance (e.g. work intentions) (N =…

  20. A Bayesian Approach to Excess Volatility, Short-term Underreaction and Long-term Overreaction during Financial Crises

    NARCIS (Netherlands)

    X. Guo (Xu); M.J. McAleer (Michael); W.-K. Wong (Wing-Keung); L. Zhu (Lixing)

    2016-01-01

    In this paper, we introduce a new Bayesian approach to explain some market anomalies during financial crises and subsequent recovery. We assume that the earnings shock of an asset follows a random walk model with and without drift to incorporate the impact of financial crises. We further

  1. Nuclear mass predictions based on Bayesian neural network approach with pairing and shell effects

    Science.gov (United States)

    Niu, Z. M.; Liang, H. Z.

    2018-03-01

    The Bayesian neural network (BNN) approach is employed to improve the nuclear mass predictions of various models. It is found that the noise error in the likelihood function plays an important role in the predictive performance of the BNN approach. By including a distribution for the noise error, an appropriate value can be found automatically in the sampling process, which optimizes the nuclear mass predictions. Furthermore, two quantities related to nuclear pairing and shell effects are added to the input layer in addition to the proton and mass numbers. As a result, the theoretical accuracies are significantly improved not only for nuclear masses but also for single-nucleon separation energies. Due to the inclusion of the shell effect, in the unknown region the BNN approach predicts a shell-correction structure similar to that in the known region, e.g., the underestimation of nuclear mass around the magic numbers seen in the relativistic mean-field model. This suggests that better predictive performance can be achieved if more physical features are included in the BNN approach.

  2. A Bayesian Developmental Approach to Robotic Goal-Based Imitation Learning.

    Directory of Open Access Journals (Sweden)

    Michael Jae-Yoon Chung

    Full Text Available A fundamental challenge in robotics today is building robots that can learn new skills by observing humans and imitating human actions. We propose a new Bayesian approach to robotic learning by imitation inspired by the developmental hypothesis that children use self-experience to bootstrap the process of intention recognition and goal-based imitation. Our approach allows an autonomous agent to: (i) learn probabilistic models of actions through self-discovery and experience, (ii) utilize these learned models for inferring the goals of human actions, and (iii) perform goal-based imitation for robotic learning and human-robot collaboration. Such an approach allows a robot to leverage its increasing repertoire of learned behaviors to interpret increasingly complex human actions and use the inferred goals for imitation, even when the robot has very different actuators from humans. We demonstrate our approach using two different scenarios: (i) a simulated robot that learns human-like gaze following behavior, and (ii) a robot that learns to imitate human actions in a tabletop organization task. In both cases, the agent learns a probabilistic model of its own actions, and uses this model for goal inference and goal-based imitation. We also show that the robotic agent can use its probabilistic model to seek human assistance when it recognizes that its inferred actions are too uncertain, risky, or impossible to perform, thereby opening the door to human-robot collaboration.

  3. A Bayesian approach for estimating calibration curves and unknown concentrations in immunoassays.

    Science.gov (United States)

    Feng, Feng; Sales, Ana Paula; Kepler, Thomas B

    2011-03-01

    Immunoassays are primary diagnostic and research tools throughout the medical and life sciences. The common approach to the processing of immunoassay data involves estimation of the calibration curve followed by inversion of the calibration function to read off the concentration estimates. This approach, however, does not lend itself easily to acceptable estimation of confidence limits on the estimated concentrations. Such estimates must account for uncertainty in the calibration curve as well as uncertainty in the target measurement. Even point estimates can be problematic: because of the non-linearity of calibration curves and error heteroscedasticity, the neglect of components of measurement error can produce significant bias. We have developed a Bayesian approach for the estimation of concentrations from immunoassay data that treats the propagation of measurement error appropriately. The method uses Markov Chain Monte Carlo (MCMC) to approximate the posterior distribution of the target concentrations and numerically compute the relevant summary statistics. Software implementing the method is freely available for public use. The new method was tested on both simulated and experimental datasets with different measurement error models. The method outperformed the common inverse method on samples with large measurement errors. Even in cases with extreme measurements where the common inverse method failed, our approach always generated reasonable estimates for the target concentrations. Project name: Baecs; Project home page: www.computationalimmunology.org/utilities/; Operating systems: Linux, MacOS X and Windows; Programming language: C++; License: Free for Academic Use.
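
    The contrast between simple curve inversion and a posterior over the unknown concentration can be illustrated with a toy grid-based computation; the 4PL parameters, noise model, and observed signal below are assumptions and this is not the Baecs implementation.

```python
import numpy as np

def four_pl(conc, a=0.05, d=2.0, c=100.0, b=1.2):
    """4-parameter logistic calibration curve: signal as a function of concentration."""
    return d + (a - d) / (1.0 + (conc / c) ** b)

log_conc = np.linspace(-2, 5, 2000)            # grid over log10 concentration
conc = 10.0 ** log_conc
sigma = 0.05                                    # assumed measurement standard deviation
y_obs = 1.2                                     # hypothetical observed signal for one sample

# Posterior over concentration (flat prior on the log scale, Gaussian measurement error).
log_post = -0.5 * ((y_obs - four_pl(conc)) / sigma) ** 2
post = np.exp(log_post - log_post.max())
post /= np.trapz(post, log_conc)

mean_log_c = np.trapz(log_conc * post, log_conc)                    # posterior mean
cdf = np.cumsum(post) * (log_conc[1] - log_conc[0])
lo, hi = np.interp([0.025, 0.975], cdf, log_conc)                   # 95% credible interval
print(10 ** mean_log_c, 10 ** lo, 10 ** hi)
```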

  4. A Bayesian Developmental Approach to Robotic Goal-Based Imitation Learning.

    Science.gov (United States)

    Chung, Michael Jae-Yoon; Friesen, Abram L; Fox, Dieter; Meltzoff, Andrew N; Rao, Rajesh P N

    2015-01-01

    A fundamental challenge in robotics today is building robots that can learn new skills by observing humans and imitating human actions. We propose a new Bayesian approach to robotic learning by imitation inspired by the developmental hypothesis that children use self-experience to bootstrap the process of intention recognition and goal-based imitation. Our approach allows an autonomous agent to: (i) learn probabilistic models of actions through self-discovery and experience, (ii) utilize these learned models for inferring the goals of human actions, and (iii) perform goal-based imitation for robotic learning and human-robot collaboration. Such an approach allows a robot to leverage its increasing repertoire of learned behaviors to interpret increasingly complex human actions and use the inferred goals for imitation, even when the robot has very different actuators from humans. We demonstrate our approach using two different scenarios: (i) a simulated robot that learns human-like gaze following behavior, and (ii) a robot that learns to imitate human actions in a tabletop organization task. In both cases, the agent learns a probabilistic model of its own actions, and uses this model for goal inference and goal-based imitation. We also show that the robotic agent can use its probabilistic model to seek human assistance when it recognizes that its inferred actions are too uncertain, risky, or impossible to perform, thereby opening the door to human-robot collaboration.

  5. A Bayesian Network approach to the evaluation of building design and its consequences for employee performance and operational costs

    DEFF Research Database (Denmark)

    Jensen, Kasper Lynge; Toftum, Jørn; Friis-Hansen, Peter

    2009-01-01

    A Bayesian Network approach has been developed that can compare different building designs by estimating the effects of the thermal indoor environment on the mental performance of office workers. A part of this network is based on the compilation of subjective thermal sensation data. The analysis indicates that investments in improved indoor thermal conditions can be justified economically in most cases. The Bayesian Network provides a reliable platform that uses probabilities for modelling the complexity involved in estimating the effect of indoor climate factors on human beings, due to the different ways in which humans

  6. Hierarchical XP

    OpenAIRE

    Jacobi, Carsten; Rumpe, Bernhard

    2014-01-01

    XP is a light-weight methodology suited particularly for small-sized teams that develop software with only vague or rapidly changing requirements. The discipline of systems engineering knows this as an approach of incremental system change, also called "muddling through". In this paper, we introduce three well-known methods of reorganizing companies, namely the holistic approach, the incremental approach, and the hierarchical approach. We show similarities between software engineering methods ...

  7. Bayesian Recovery of Clipped OFDM Signals: A Receiver-based Approach

    KAUST Repository

    Al-Rabah, Abdullatif R.

    2013-05-01

    Recently, orthogonal frequency-division multiplexing (OFDM) has been adopted for high-speed wireless communications due to its robustness against multipath fading. However, one of the main fundamental drawbacks of OFDM systems is the high peak-to-average-power ratio (PAPR). Several techniques have been proposed for PAPR reduction. Most of these techniques require transmitter-based (pre-compensated) processing. On the other hand, receiver-based alternatives would save power and reduce the transmitter complexity. With this in mind, a possible approach is to limit the amplitude of the OFDM signal to a predetermined threshold, which is equivalent to adding a sparse clipping signal; this clipping signal is then estimated at the receiver to recover the original signal. In this work, we propose a Bayesian receiver-based low-complexity clipping signal recovery method for PAPR reduction. The method is able to (i) effectively reduce the PAPR via a simple clipping scheme at the transmitter side, (ii) use a Bayesian recovery algorithm to reconstruct the clipping signal at the receiver side by measuring part of the subcarriers, (iii) perform well in the absence of statistical information about the signal (e.g., clipping level) and the noise (e.g., noise variance), and at the same time (iv) be energy efficient due to its low complexity. Specifically, the proposed recovery technique is implemented in a data-aided manner. The data-aided method collects clipping information by measuring reliable data subcarriers, thus making full use of the spectrum for data transmission without the need for tone reservation. The study is extended further to discuss how to improve the recovery of the clipping signal by utilizing some features of practical OFDM systems, i.e., oversampling and the presence of multiple receivers. Simulation results demonstrate the superiority of the proposed technique over other recovery algorithms. The overall objective is to show that the receiver-based Bayesian technique is highly

  8. On merging rainfall data from diverse sources using a Bayesian approach

    Science.gov (United States)

    Bhattacharya, Biswa; Tarekegn, Tegegne

    2014-05-01

    Numerous studies have presented comparisons of satellite rainfall products, such as those from the Tropical Rainfall Measuring Mission (TRMM), with rain gauge data and have concluded, in general, that the two sources of data are comparable at suitable space and time scales. The comparison is not a straightforward one, as they employ different measurement techniques and depend on very different space-time scales of measurement. The number of available gauges in a catchment also influences the comparability and thus adds to the complexity. TRMM rainfall data have also been used directly in hydrological modelling. As the space-time scale reduces, so does the accuracy of these models. It seems that combining the two sources of rainfall data, or more sources of rainfall data, can enormously benefit hydrological studies. Various rainfall data, due to the differences in their space-time structure, contain information about the spatio-temporal distribution of rainfall which is not available to a single source of data. In order to harness this benefit, we have developed a method of merging these two (or more) rainfall products under the framework of the Bayesian Data Fusion (BDF) principle. By applying this principle, the rainfall data from the various sources can be combined into a single time series of rainfall data. The usefulness of the approach has been explored in a case study on the Lake Tana Basin of the Upper Blue Nile Basin in Ethiopia. A 'leave one rain gauge out' cross-validation technique was employed for evaluating the accuracy of the rainfall time series with rainfall interpolated from rain gauge data using Inverse Distance Weighting (referred to as IDW), TRMM and the fused data (BDF). The results showed that the BDF prediction was better than TRMM and IDW. Further evaluation of the three rainfall estimates was done by assessing their capability in predicting observed streamflow using the lumped conceptual rainfall-runoff model NAM. Visual inspection of the

  9. Estimating the long-term phosphorus accretion rate in the Everglades: A Bayesian approach with risk assessment

    Science.gov (United States)

    Qian, Song S.; Richardson, Curtis J.

    Using wetlands as a sink of nutrients, phosphorus in particular, is becoming an increasingly attractive alternative to conventional wastewater treatment technology. In this paper, we briefly review the mechanism of phosphorus retention in wetlands, as well as previous modeling efforts. A Bayesian method is then proposed for estimating the long-term phosphorus accretion rate in wetlands through a piecewise linear model of outflow phosphorus concentration and phosphorus mass loading rate. The Bayesian approach was used for its simplicity in computation and its ability to accurately represent uncertainty. Applied to an Everglades wetland, the Bayesian method not only produced the probability distribution of the long-term phosphorus accretion rate but also generated a relationship between the acceptable level of "risk" and the optimal phosphorus mass loading rate for the proposed constructed wetlands in south Florida. The latter is a useful representation of uncertainty which is of interest to decision makers.
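
    A toy version of a piecewise linear (changepoint) relation between loading and outflow concentration, with a grid posterior over the changepoint, is sketched below on simulated data; the functional form, units, and numbers are assumptions, not the Everglades data or the paper's exact model.

```python
import numpy as np

rng = np.random.default_rng(4)

# Simulated loading/concentration data with a hinge at an assumed changepoint.
load = np.linspace(0.1, 5.0, 60)                      # hypothetical P mass loading rate
true_cp = 2.0
conc = 10 + 25 * np.clip(load - true_cp, 0, None) + rng.normal(0, 3, load.size)

def profile_loglik(cp):
    """Profile log-likelihood of a hinge (piecewise linear) model with breakpoint cp."""
    X = np.column_stack([np.ones_like(load), np.clip(load - cp, 0, None)])
    _, rss = np.linalg.lstsq(X, conc, rcond=None)[:2]
    return -0.5 * load.size * np.log(rss[0] / load.size)

cps = np.linspace(0.5, 4.5, 200)                      # flat prior over candidate changepoints
logp = np.array([profile_loglik(c) for c in cps])
post = np.exp(logp - logp.max()); post /= post.sum()
print(round(float(cps[np.argmax(post)]), 2))          # posterior mode of the changepoint
```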

  10. An ontology-based hierarchical semantic modeling approach to clinical pathway workflows.

    Science.gov (United States)

    Ye, Yan; Jiang, Zhibin; Diao, Xiaodi; Yang, Dong; Du, Gang

    2009-08-01

    This paper proposes an ontology-based approach for modeling clinical pathway workflows at the semantic level to facilitate computerized clinical pathway implementation and efficient delivery of high-quality healthcare services. A clinical pathway ontology (CPO) is formally defined in the Web Ontology Language (OWL) to provide a common semantic foundation for meaningful representation and exchange of pathway-related knowledge. A CPO-based semantic modeling method is then presented to describe clinical pathways as interconnected hierarchical models comprising a top-level outcome flow and an intervention workflow level along a care timeline. Furthermore, relevant temporal knowledge can be fully represented by combining temporal entities in CPO with temporal rules based on the Semantic Web Rule Language (SWRL). An illustrative example of a clinical pathway for cesarean section shows the applicability of the proposed methodology in enabling structured semantic descriptions of any real clinical pathway.

  11. A Hierarchical Approach Using Machine Learning Methods in Solar Photovoltaic Energy Production Forecasting

    Directory of Open Access Journals (Sweden)

    Zhaoxuan Li

    2016-01-01

    Full Text Available We evaluate and compare two common methods, artificial neural networks (ANN) and support vector regression (SVR), for predicting energy production from a solar photovoltaic (PV) system in Florida 15 min, 1 h and 24 h ahead of time. A hierarchical approach is proposed based on the machine learning algorithms tested. The production data used in this work correspond to 15 min averaged power measurements collected from 2014. The accuracy of the model is determined using error statistics such as mean bias error (MBE), mean absolute error (MAE), root mean square error (RMSE), relative MBE (rMBE), mean percentage error (MPE) and relative RMSE (rRMSE). This work provides findings on how forecasts from individual inverters will improve the total solar power generation forecast of the PV system.
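
    The error statistics listed above can be computed directly; the sketch below evaluates them for an invented forecast/measurement pair and is only meant to make the definitions concrete.

```python
import numpy as np

# Hypothetical measured vs. forecast power values (kW), for illustration only.
measured = np.array([0.0, 1.2, 3.5, 4.8, 4.1, 2.0])
forecast = np.array([0.1, 1.0, 3.9, 4.5, 4.4, 1.7])

err = forecast - measured
mbe  = err.mean()                                   # mean bias error
mae  = np.abs(err).mean()                           # mean absolute error
rmse = np.sqrt((err ** 2).mean())                   # root mean square error
rmbe  = mbe / measured.mean() * 100                 # relative MBE (%)
rrmse = rmse / measured.mean() * 100                # relative RMSE (%)
mask = measured > 0                                 # avoid division by zero-power samples
mpe = np.mean(err[mask] / measured[mask]) * 100     # mean percentage error (%)

print({k: round(float(v), 3) for k, v in
       {"MBE": mbe, "MAE": mae, "RMSE": rmse,
        "rMBE": rmbe, "MPE": mpe, "rRMSE": rrmse}.items()})
```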

  12. Modelling developmental instability as the joint action of noise and stability: a Bayesian approach

    Directory of Open Access Journals (Sweden)

    Lens Luc

    2002-06-01

    Full Text Available Background: Fluctuating asymmetry is assumed to measure individual and population level developmental stability. The latter may in turn show an association with stress, which can be observed through asymmetry-stress correlations. However, the recent literature does not support a ubiquitous relationship. Very little is known about why some studies show relatively strong associations while others completely fail to find such a correlation. We propose a new Bayesian statistical framework to examine these associations. Results: We consider developmental stability, i.e. the individual buffering capacity, as the biologically relevant trait and show (i) that little variation in developmental stability can explain observed variation in fluctuating asymmetry when the distribution of developmental stability is highly skewed, and (ii) that a previously developed tool (i.e. the hypothetical repeatability of fluctuating asymmetry) contains only limited information about variation in developmental stability, which stands in sharp contrast to the earlier established close association between the repeatability and developmental instability. Conclusion: We provide tools to generate valuable information about the distribution of between-individual variation in developmental stability. A simple linear transformation of a previous model leads to completely different conclusions. Thus, theoretical modelling of asymmetry and stability appears to be very sensitive to the scale of inference. More research is urgently needed to get better insights into the developmental mechanisms of noise and stability. In spite of the fact that the model is likely to represent an oversimplification of reality, the accumulation of new insights could be incorporated into the Bayesian statistical approach to obtain more reliable estimation.

  13. Obtaining parsimonious hydraulic conductivity fields using head and transport observations: A bayesian geostatistical parameter estimation approach

    Science.gov (United States)

    Fienen, M.; Hunt, R.; Krabbenhoft, D.; Clemo, T.

    2009-01-01

    Flow path delineation is a valuable tool for interpreting the subsurface hydrogeochemical environment. Different types of data, such as groundwater flow and transport, inform different aspects of hydrogeologic parameter values (hydraulic conductivity in this case) which, in turn, determine flow paths. This work combines flow and transport information to estimate a unified set of hydrogeologic parameters using the Bayesian geostatistical inverse approach. Parameter flexibility is allowed by using a highly parameterized approach with the level of complexity informed by the data. Despite the effort to adhere to the ideal of minimal a priori structure imposed on the problem, extreme contrasts in parameters can result in the need to censor correlation across hydrostratigraphic bounding surfaces. These partitions segregate parameters into facies associations. With an iterative approach in which partitions are based on inspection of initial estimates, flow path interpretation is progressively refined through the inclusion of more types of data. Head observations, stable oxygen isotope ratios (18O/16O), and tritium are all used to progressively refine flow path delineation on an isthmus between two lakes in the Trout Lake watershed, northern Wisconsin, United States. Despite allowing significant parameter freedom by estimating many distributed parameter values, a smooth field is obtained. Copyright 2009 by the American Geophysical Union.

  14. A Bayesian compressed-sensing approach for reconstructing neural connectivity from subsampled anatomical data.

    Science.gov (United States)

    Mishchenko, Yuriy; Paninski, Liam

    2012-10-01

    In recent years, the problem of reconstructing the connectivity in large neural circuits ("connectomics") has re-emerged as one of the main objectives of neuroscience. Classically, reconstructions of neural connectivity have been approached anatomically, using electron or light microscopy and histological tracing methods. This paper describes a statistical approach for connectivity reconstruction that relies on relatively easy-to-obtain measurements using fluorescent probes such as synaptic markers, cytoplasmic dyes, transsynaptic tracers, or activity-dependent dyes. We describe the possible design of these experiments and develop a Bayesian framework for extracting synaptic neural connectivity from such data. We show that the statistical reconstruction problem can be formulated naturally as a tractable L₁-regularized quadratic optimization. As a concrete example, we consider a realistic hypothetical connectivity reconstruction experiment in C. elegans, a popular neuroscience model where a complete wiring diagram has been previously obtained based on long-term electron microscopy work. We show that the new statistical approach could lead to an orders of magnitude reduction in experimental effort in reconstructing the connectivity in this circuit. We further demonstrate that the spatial heterogeneity and biological variability in the connectivity matrix--not just the "average" connectivity--can also be estimated using the same method.
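
    The reconstruction problem above is formulated as L1-regularized quadratic optimization; the sketch below shows that class of problem on simulated data using scikit-learn's Lasso. The measurement design and sparsity level are invented for illustration and bear no relation to the paper's experiments.

    ```python
    # Recovering a sparse connectivity vector from noisy linear measurements
    # via L1-regularized least squares (Lasso). All data are simulated.
    import numpy as np
    from sklearn.linear_model import Lasso

    rng = np.random.default_rng(1)
    n_neurons, n_observations = 200, 80
    true_w = np.zeros(n_neurons)
    true_w[rng.choice(n_neurons, size=10, replace=False)] = rng.normal(1.0, 0.3, size=10)

    A = rng.normal(size=(n_observations, n_neurons))   # hypothetical measurement design
    y = A @ true_w + 0.05 * rng.normal(size=n_observations)

    fit = Lasso(alpha=0.05).fit(A, y)
    recovered = np.flatnonzero(np.abs(fit.coef_) > 1e-3)
    print("true nonzeros:", np.flatnonzero(true_w))
    print("recovered:    ", recovered)
    ```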

  15. Hierarchical matrices implemented into the boundary integral approaches for gravity field modelling

    Science.gov (United States)

    Čunderlík, Róbert; Vipiana, Francesca

    2017-04-01

    Boundary integral approaches applied to gravity field modelling have recently been developed to solve the geodetic boundary value problems numerically, or to process satellite observations, e.g. from the GOCE satellite mission. In order to obtain numerical solutions of "cm-level" accuracy, such approaches require a very refined level of discretization or resolution. This leads to enormous memory requirements that need to be reduced. An implementation of Hierarchical Matrices (H-matrices) can significantly reduce the numerical complexity of these approaches. The main idea of H-matrices is to approximate the entire system matrix by splitting it into a family of submatrices. Large submatrices are stored in factorized representation, while small submatrices are stored in standard representation. This significantly reduces memory requirements while improving efficiency. The poster presents our preliminary results of implementing H-matrices into the existing boundary integral approaches based on the boundary element method or the method of fundamental solutions.
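
    A toy sketch of the core idea, assuming a generic 1/r kernel and well-separated source and target clusters: the corresponding off-diagonal block of the system matrix is numerically low-rank, so a truncated factorization stores it at a fraction of the memory. This is only an illustration, not the authors' implementation.

    ```python
    # Low-rank compression of a well-separated kernel block (the H-matrix idea).
    import numpy as np

    rng = np.random.default_rng(2)
    sources = rng.uniform(0.0, 1.0, size=(500, 3))
    targets = rng.uniform(0.0, 1.0, size=(500, 3)) + np.array([5.0, 0.0, 0.0])  # well separated

    dists = np.linalg.norm(targets[:, None, :] - sources[None, :, :], axis=-1)
    block = 1.0 / dists                                    # single-layer-type kernel block

    U, s, Vt = np.linalg.svd(block, full_matrices=False)
    rank = int(np.searchsorted(np.cumsum(s**2) / np.sum(s**2), 1 - 1e-12)) + 1

    approx = U[:, :rank] * s[:rank] @ Vt[:rank]
    rel_err = np.linalg.norm(block - approx) / np.linalg.norm(block)
    memory_ratio = rank * (block.shape[0] + block.shape[1]) / block.size
    print(f"rank={rank}, relative error={rel_err:.1e}, memory ratio={memory_ratio:.3f}")
    ```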

  16. A Bayesian approach to landscape ecological risk assessment applied to the upper Grande Ronde watershed, Oregon

    Science.gov (United States)

    Kimberley K. Ayre; Wayne G. Landis

    2012-01-01

    We present a Bayesian network model based on the ecological risk assessment framework to evaluate potential impacts to habitats and resources resulting from wildfire, grazing, forest management activities, and insect outbreaks in a forested landscape in northeastern Oregon. The Bayesian network structure consisted of three tiers of nodes: landscape disturbances,...

  17. Modeling of Academic Achievement of Primary School Students in Ethiopia Using Bayesian Multilevel Approach

    Science.gov (United States)

    Sebro, Negusse Yohannes; Goshu, Ayele Taye

    2017-01-01

    This study aims to explore Bayesian multilevel modeling to investigate variations of average academic achievement of grade eight school students. A sample of 636 students is randomly selected from 26 private and government schools by a two-stage stratified sampling design. Bayesian method is used to estimate the fixed and random effects. Input and…

  18. Bridging the gap between aggregate data and individual patient management: a Bayesian approach

    NARCIS (Netherlands)

    Wilt, G.J. van der; Groenewoud, H.; Riel, P.L.C.M. van

    2011-01-01

    OBJECTIVES: The aim of this study was to explore whether Bayesian reasoning can be applied to therapeutic questions in a way that is similar to its application in diagnostics. METHODS: A clinically relevant, therapeutic question was formulated in accordance with Bayesian reasoning for the clinical

  19. A Bayesian Network Approach to Modeling Learning Progressions and Task Performance. CRESST Report 776

    Science.gov (United States)

    West, Patti; Rutstein, Daisy Wise; Mislevy, Robert J.; Liu, Junhui; Choi, Younyoung; Levy, Roy; Crawford, Aaron; DiCerbo, Kristen E.; Chappel, Kristina; Behrens, John T.

    2010-01-01

    A major issue in the study of learning progressions (LPs) is linking student performance on assessment tasks to the progressions. This report describes the challenges faced in making this linkage using Bayesian networks to model LPs in the field of computer networking. The ideas are illustrated with exemplar Bayesian networks built on Cisco…

  20. A Bayesian Game-Theoretic Approach for Distributed Resource Allocation in Fading Multiple Access Channels

    Directory of Open Access Journals (Sweden)

    Gaoning He

    2010-01-01

    Full Text Available A Bayesian game-theoretic model is developed to design and analyze the resource allocation problem in K-user fading multiple access channels (MACs), where the users are assumed to selfishly maximize their average achievable rates with incomplete information about the fading channel gains. In such a game-theoretic study, the central question is whether a Bayesian equilibrium exists, and if so, whether the network operates efficiently at the equilibrium point. We prove that there exists exactly one Bayesian equilibrium in our game. Furthermore, we study the network sum-rate maximization problem by assuming that the users coordinate according to a symmetric strategy profile. This result also serves as an upper bound for the Bayesian equilibrium. Finally, simulation results are provided to show the network efficiency at the unique Bayesian equilibrium and to compare it with other strategies.

  1. Network meta-analysis: development of a three-level hierarchical modeling approach incorporating dose-related constraints.

    Science.gov (United States)

    Owen, Rhiannon K; Tincello, Douglas G; Abrams, Keith R

    2015-01-01

    Network meta-analysis (NMA) is commonly used in evidence synthesis; however, in situations in which there are a large number of treatment options, which may be subdivided into classes, and relatively few trials, NMAs produce considerable uncertainty in the estimated treatment effects, and consequently, identification of the most beneficial intervention remains inconclusive. To develop and demonstrate the use of evidence synthesis methods to evaluate extensive treatment networks with a limited number of trials, making use of classes. Using Bayesian Markov chain Monte Carlo methods, we build on the existing work of a random effects NMA to develop a three-level hierarchical NMA model that accounts for the exchangeability between treatments within the same class as well as for the residual between-study heterogeneity. We demonstrate the application of these methods to a continuous and binary outcome, using a motivating example of overactive bladder. We illustrate methods for incorporating ordering constraints in increasing doses, model selection, and assessing inconsistency between the direct and indirect evidence. The methods were applied to a data set obtained from a systematic literature review of trials for overactive bladder, evaluating the mean reduction in incontinence episodes from baseline and the number of patients reporting one or more adverse events. The data set involved 72 trials comparing 34 interventions that were categorized into nine classes of interventions, including placebo. Bayesian three-level hierarchical NMAs have the potential to increase the precision in the effect estimates while maintaining the interpretability of the individual interventions for decision making. Copyright © 2015 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  2. A multinomial logit model-Bayesian network hybrid approach for driver injury severity analyses in rear-end crashes.

    Science.gov (United States)

    Chen, Cong; Zhang, Guohui; Tarefder, Rafiqul; Ma, Jianming; Wei, Heng; Guan, Hongzhi

    2015-07-01

    Rear-end crashes are one of the most common types of traffic crashes in the U.S. A good understanding of their characteristics and contributing factors is of practical importance. Previously, both multinomial logit models and Bayesian network methods have been used in crash modeling and analysis, although each of them has its own application restrictions and limitations. In this study, a hybrid approach is developed to combine multinomial logit models and Bayesian network methods for comprehensively analyzing driver injury severities in rear-end crashes based on state-wide crash data collected in New Mexico from 2010 to 2011. A multinomial logit model is developed to investigate and identify significant contributing factors for rear-end crash driver injury severities classified into three categories: no injury, injury, and fatality. Then, the identified significant factors are utilized to establish a Bayesian network to explicitly formulate statistical associations between injury severity outcomes and explanatory attributes, including driver behavior, demographic features, vehicle factors, geometric and environmental characteristics, etc. The test results demonstrate that the proposed hybrid approach performs reasonably well. The Bayesian network inference analyses indicate that factors including truck involvement, inferior lighting conditions, windy weather conditions, and the number of vehicles involved could significantly increase driver injury severities in rear-end crashes. The developed methodology and estimation results provide insights for developing effective countermeasures to reduce rear-end crash injury severities and improve traffic system safety performance. Copyright © 2015 Elsevier Ltd. All rights reserved.

  3. Using Bayesian network and AHP method as a marketing approach tools in defining tourists’ preferences

    Directory of Open Access Journals (Sweden)

    Nataša Papić-Blagojević

    2012-04-01

    Full Text Available The marketing approach is associated with market conditions and with achieving long-term profitability by satisfying consumers' needs. In tourism, this approach is not limited to promoting a single destination; it also concerns the relationship between a travel agency and its clients, with agencies adjusting their offers to their clients' needs. In that sense, it is important to analyze the behavior of tourists in earlier periods, taking their preferences into account. Using a Bayesian network, the connections between tourists with similar tastes and the relationships among them can be displayed graphically. On the other hand, the analytic hierarchy process (AHP) is used to rank tourist attractions, also relying on past experience. In this paper we examine possible applications of these two models to tourism in Serbia. The example is hypothetical, but it will serve as a base for future research. Three types of tourism are chosen as representative of Vojvodina: cultural, rural and business tourism, because they are the bright spots of tourism development in this area. Applied to these forms, the analytic hierarchy process has shown its strength in predicting tourists' preferences.
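
    A small illustration of the AHP step mentioned above: deriving priority weights for the three tourism types from a pairwise comparison matrix via its principal eigenvector, together with Saaty's consistency ratio. The comparison values are invented for the sketch, not taken from the paper.

    ```python
    # AHP priority vector and consistency check for three hypothetical alternatives.
    import numpy as np

    labels = ["Cultural", "Rural", "Business"]
    # A[i, j] = how strongly option i is preferred over option j (Saaty's 1-9 scale)
    A = np.array([[1.0, 3.0, 5.0],
                  [1/3, 1.0, 2.0],
                  [1/5, 1/2, 1.0]])

    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    weights = np.abs(eigvecs[:, k].real)
    weights /= weights.sum()

    n = A.shape[0]
    ci = (eigvals.real[k] - n) / (n - 1)       # consistency index
    cr = ci / 0.58                             # Saaty's random index for n = 3 is 0.58
    for name, w in zip(labels, weights):
        print(f"{name:>8s}: {w:.3f}")
    print(f"consistency ratio = {cr:.3f} (should be < 0.10)")
    ```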

  4. A Bayesian approach to efficient differential allocation for resampling-based significance testing

    Directory of Open Access Journals (Sweden)

    Soi Sameer

    2009-06-01

    Full Text Available Abstract Background Large-scale statistical analyses have become hallmarks of post-genomic era biological research due to advances in high-throughput assays and the integration of large biological databases. One accompanying issue is the simultaneous estimation of p-values for a large number of hypothesis tests. In many applications, a parametric assumption in the null distribution such as normality may be unreasonable, and resampling-based p-values are the preferred procedure for establishing statistical significance. Using resampling-based procedures for multiple testing is computationally intensive and typically requires large numbers of resamples. Results We present a new approach to more efficiently assign resamples (such as bootstrap samples or permutations within a nonparametric multiple testing framework. We formulated a Bayesian-inspired approach to this problem, and devised an algorithm that adapts the assignment of resamples iteratively with negligible space and running time overhead. In two experimental studies, a breast cancer microarray dataset and a genome wide association study dataset for Parkinson's disease, we demonstrated that our differential allocation procedure is substantially more accurate compared to the traditional uniform resample allocation. Conclusion Our experiments demonstrate that using a more sophisticated allocation strategy can improve our inference for hypothesis testing without a drastic increase in the amount of computation on randomized data. Moreover, we gain more improvement in efficiency when the number of tests is large. R code for our algorithm and the shortcut method are available at http://people.pcbi.upenn.edu/~lswang/pub/bmc2009/.

  5. A Bayesian approach to efficient differential allocation for resampling-based significance testing.

    Science.gov (United States)

    Jensen, Shane T; Soi, Sameer; Wang, Li-San

    2009-06-28

    Large-scale statistical analyses have become hallmarks of post-genomic era biological research due to advances in high-throughput assays and the integration of large biological databases. One accompanying issue is the simultaneous estimation of p-values for a large number of hypothesis tests. In many applications, a parametric assumption in the null distribution such as normality may be unreasonable, and resampling-based p-values are the preferred procedure for establishing statistical significance. Using resampling-based procedures for multiple testing is computationally intensive and typically requires large numbers of resamples. We present a new approach to more efficiently assign resamples (such as bootstrap samples or permutations) within a nonparametric multiple testing framework. We formulated a Bayesian-inspired approach to this problem, and devised an algorithm that adapts the assignment of resamples iteratively with negligible space and running time overhead. In two experimental studies, a breast cancer microarray dataset and a genome wide association study dataset for Parkinson's disease, we demonstrated that our differential allocation procedure is substantially more accurate compared to the traditional uniform resample allocation. Our experiments demonstrate that using a more sophisticated allocation strategy can improve our inference for hypothesis testing without a drastic increase in the amount of computation on randomized data. Moreover, we gain more improvement in efficiency when the number of tests is large. R code for our algorithm and the shortcut method are available at http://people.pcbi.upenn.edu/~lswang/pub/bmc2009/.

  6. Applications of Bayesian approach in modelling risk of malaria-related hospital mortality

    Directory of Open Access Journals (Sweden)

    Simbeye Jupiter S

    2008-02-01

    Full Text Available Abstract Background Malaria is a major public health problem in Malawi; however, quantifying its burden in a population is a challenge. Routine hospital data provide a proxy for measuring the incidence of severe malaria and for crudely estimating morbidity rates. Using such data, this paper proposes a method to describe trends, patterns and factors associated with in-hospital mortality attributed to the disease. Methods We develop semiparametric regression models which allow joint analysis of nonlinear effects of calendar time and continuous covariates, spatially structured variation, unstructured heterogeneity, and other fixed covariates. Modelling and inference use the fully Bayesian approach via Markov Chain Monte Carlo (MCMC) simulation techniques. The methodology is applied to analyse data arising from paediatric wards in Zomba district, Malawi, between 2002 and 2003. Results and Conclusion We observe that the risk of dying in hospital is lower in the dry season and for children who travel a distance of less than 5 km to the hospital, but increases for those who are referred to the hospital. The results also indicate significant differences in both structured and unstructured spatial effects, and the health facility effects reveal considerable differences by type of facility or practice. More importantly, our approach shows non-linearities in the effect of metrical covariates on the probability of dying in hospital. The study emphasizes that the methodological framework used provides a useful tool for analysing the data at hand and data of similar structure.

  7. Robust modeling of differential gene expression data using normal/independent distributions: a Bayesian approach.

    Science.gov (United States)

    Ganjali, Mojtaba; Baghfalaki, Taban; Berridge, Damon

    2015-01-01

    In this paper, the problem of identifying differentially expressed genes under different conditions using gene expression microarray data, in the presence of outliers, is discussed. For this purpose, the robust modeling of gene expression data using some powerful distributions known as normal/independent distributions is considered. These distributions include the Student's t and normal distributions which have been used previously, but also include extensions such as the slash, the contaminated normal and the Laplace distributions. The purpose of this paper is to identify differentially expressed genes by considering these distributional assumptions instead of the normal distribution. A Bayesian approach using the Markov Chain Monte Carlo method is adopted for parameter estimation. Two publicly available gene expression data sets are analyzed using the proposed approach. The use of the robust models for detecting differentially expressed genes is investigated. This investigation shows that the choice of model for differentiating gene expression data is very important. This is due to the small number of replicates for each gene and the existence of outlying data. Comparison of the performance of these models is made using different statistical criteria and the ROC curve. The method is illustrated using some simulation studies. We demonstrate the flexibility of these robust models in identifying differentially expressed genes.

  8. Finding the optimal statistical model to describe target motion during radiotherapy delivery—a Bayesian approach

    Science.gov (United States)

    Herschtal, A.; Foroudi, F.; Greer, P. B.; Eade, T. N.; Hindson, B. R.; Kron, T.

    2012-05-01

    Early approaches to characterizing errors in target displacement during a fractionated course of radiotherapy assumed that the underlying fraction-to-fraction variability in target displacement, known as the ‘treatment error’ or ‘random error’, could be regarded as constant across patients. More recent approaches have modelled target displacement allowing for differences in random error between patients. However, until recently it has not been feasible to compare the goodness of fit of alternate models of random error rigorously. This is because the large volumes of real patient data necessary to distinguish between alternative models have only very recently become available. This work uses real-world displacement data collected from 365 patients undergoing radical radiotherapy for prostate cancer to compare five candidate models for target displacement. The simplest model assumes constant random errors across patients, while other models allow for random errors that vary according to one of several candidate distributions. Bayesian statistics and Markov Chain Monte Carlo simulation of the model parameters are used to compare model goodness of fit. We conclude that modelling the random error as inverse gamma distributed provides a clearly superior fit over all alternatives considered. This finding can facilitate more accurate margin recipes and correction strategies.
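
    A hedged sketch of the model class the study favours: each patient's random-error variance is drawn from an inverse-gamma distribution, and fraction-to-fraction displacements are then normal with that patient-specific variance. The shape and scale values below are arbitrary illustrations, not the paper's fitted hyperparameters.

    ```python
    # Simulating patient-specific random errors with an inverse-gamma variance prior.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)
    n_patients, n_fractions = 365, 30

    shape, scale = 3.0, 4.0                                   # hypothetical hyperparameters (mm^2)
    per_patient_var = stats.invgamma.rvs(shape, scale=scale, size=n_patients, random_state=rng)
    displacements = rng.normal(0.0, np.sqrt(per_patient_var)[:, None],
                               size=(n_patients, n_fractions))

    # Observed per-patient standard deviations are heavier-tailed than a
    # constant-variance (single random error) model would predict.
    sds = displacements.std(axis=1, ddof=1)
    print(f"median SD = {np.median(sds):.2f} mm, 95th percentile = {np.percentile(sds, 95):.2f} mm")
    ```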

  9. Robust modeling of differential gene expression data using normal/independent distributions: a Bayesian approach.

    Directory of Open Access Journals (Sweden)

    Mojtaba Ganjali

    Full Text Available In this paper, the problem of identifying differentially expressed genes under different conditions using gene expression microarray data, in the presence of outliers, is discussed. For this purpose, the robust modeling of gene expression data using some powerful distributions known as normal/independent distributions is considered. These distributions include the Student's t and normal distributions which have been used previously, but also include extensions such as the slash, the contaminated normal and the Laplace distributions. The purpose of this paper is to identify differentially expressed genes by considering these distributional assumptions instead of the normal distribution. A Bayesian approach using the Markov Chain Monte Carlo method is adopted for parameter estimation. Two publicly available gene expression data sets are analyzed using the proposed approach. The use of the robust models for detecting differentially expressed genes is investigated. This investigation shows that the choice of model for differentiating gene expression data is very important. This is due to the small number of replicates for each gene and the existence of outlying data. Comparison of the performance of these models is made using different statistical criteria and the ROC curve. The method is illustrated using some simulation studies. We demonstrate the flexibility of these robust models in identifying differentially expressed genes.

  10. A Bayesian approach for solar resource potential assessment using satellite images

    Science.gov (United States)

    Linguet, L.; Atif, J.

    2014-03-01

    The need for more sustainable and more protective development opens new possibilities for renewable energy. Among the different renewable energy sources, the direct conversion of sunlight into electricity by solar photovoltaic (PV) technology seems to be the most promising and represents a technically viable solution to energy demands. However, the deployment of PV energy requires solar resource data for utility planning, accommodating grid capacity, and formulating future adaptive policies. Currently, the best approach to determining the solar resource at a given site is based on the use of satellite images. However, computing the solar resource (a non-linear process) from satellite images is not straightforward. From a signal processing point of view, it falls within non-stationary, non-linear/non-Gaussian dynamical inverse problems. In this paper, we propose a Bayesian approach combining satellite images and in situ data. We propose original observation and transition functions that take advantage of the characteristics of both types of data involved. A simulation study of solar irradiance is carried out with this method, and a solar resource potential map of French Guiana for the year 2010 is given.
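
    For non-linear/non-Gaussian dynamical inverse problems of this kind, a standard Bayesian tool is the bootstrap particle filter. The sketch below shows the generic filter on a toy state-space model; the transition and observation functions are invented stand-ins, not the paper's satellite-image model.

    ```python
    # Generic bootstrap particle filter on a toy non-linear state-space model.
    import numpy as np

    rng = np.random.default_rng(4)
    n_steps, n_particles = 200, 1000

    def transition(x):          # hypothetical clear-sky-index dynamics
        return 0.95 * x + 0.05 + 0.05 * rng.normal(size=x.shape)

    def observe(x):             # hypothetical satellite-derived measurement
        return np.clip(x, 0.0, 1.2) + 0.1 * rng.normal(size=x.shape)

    # simulate a "true" trajectory and its observations
    truth = np.empty(n_steps)
    truth[0] = 0.8
    for t in range(1, n_steps):
        truth[t] = transition(np.array([truth[t - 1]]))[0]
    obs = observe(truth)

    particles = rng.uniform(0.0, 1.2, size=n_particles)
    estimates = np.empty(n_steps)
    for t in range(n_steps):
        particles = transition(particles)                       # propagate
        weights = np.exp(-0.5 * ((obs[t] - np.clip(particles, 0.0, 1.2)) / 0.1) ** 2)
        weights /= weights.sum()                                # weight by likelihood
        estimates[t] = np.sum(weights * particles)              # posterior mean
        particles = rng.choice(particles, size=n_particles, p=weights)   # resample

    print(f"RMSE of filtered estimate: {np.sqrt(np.mean((estimates - truth) ** 2)):.3f}")
    ```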

  11. Modelling multi-hazard hurricane damages on an urbanized coast with a Bayesian Network approach

    Science.gov (United States)

    van Verseveld, H.C.W.; Van Dongeren, A. R.; Plant, Nathaniel G.; Jäger, W.S.; den Heijer, C.

    2015-01-01

    Hurricane flood impacts to residential buildings in coastal zones are caused by a number of hazards, such as inundation, overflow currents, erosion, and wave attack. However, traditional hurricane damage models typically make use of stage-damage functions, where the stage is related to flooding depth only. Moreover, these models are deterministic and do not consider the large amount of uncertainty associated with both the processes themselves and with the predictions. This uncertainty becomes increasingly important when multiple hazards (flooding, wave attack, erosion, etc.) are considered simultaneously. This paper focuses on establishing relationships between observed damage and multiple hazard indicators in order to make better probabilistic predictions. The concept consists of (1) determining Local Hazard Indicators (LHIs) from a hindcasted storm with use of a nearshore morphodynamic model, XBeach, and (2) coupling these LHIs and building characteristics to the observed damages. We chose a Bayesian Network approach in order to make this coupling and used the LHIs ‘Inundation depth’, ‘Flow velocity’, ‘Wave attack’, and ‘Scour depth’ to represent flooding, current, wave impact, and erosion related hazards. The coupled hazard model was tested against four thousand damage observations from a case site at the Rockaway Peninsula, NY, that was impacted by Hurricane Sandy in late October, 2012. The model was able to accurately distinguish ‘Minor damage’ from all other outcomes 95% of the time and could distinguish areas that were affected by the storm, but not severely damaged, 68% of the time. For the most heavily damaged buildings (‘Major Damage’ and ‘Destroyed’), projections of the expected damage underestimated the observed damage. The model demonstrated that including multiple hazards doubled the prediction skill, with Log-Likelihood Ratio test (a measure of improved accuracy and reduction in uncertainty) scores between 0.02 and 0

  12. Designing Groundwater Monitoring Networks for Regional-Scale Water Quality Assessment: A Bayesian Approach

    Science.gov (United States)

    Pinto, M. J.; Wagner, B. J.

    2002-12-01

    The design of groundwater monitoring networks is an important concern of regional-scale water-quality assessment programs because of the high cost of data collection. The work presented here addresses regional-scale design issues using ground-water simulation and optimization set within a Bayesian framework. The regional-scale design approach focuses on reducing the uncertainty associated with a fundamental quantity: the proportion of a subsurface water resource which exceeds a specified threshold concentration, such as a mandated maximum contaminant level. This proportion is hereafter referred to as the threshold proportion. The goal is to identify optimal or near-optimal sampling designs that reduce the threshold proportion uncertainty to an acceptable level. In the Bayesian approach, there is a probability density function (pdf) associated with the unknown threshold proportion before sampling. This function is known as the prior pdf. The form of the prior pdf, which is dependent on the information available regarding the distribution of water quality within the aquifer system, controls the amount of sampling needed. In the absence of information, the form of the prior pdf is uniform; however, if a ground-water flow and transport model is available, a Monte Carlo analysis of ground-water flow and transport simulations can be used to generate a prior pdf which is non-uniform and which contains the information available regarding solute sources, pathways and transport. After sampling, the prior pdf is conditioned on the sampling data. The conditional distribution is known as the posterior pdf. In most cases there is a reduction in uncertainty associated with conditioning. The reduction in uncertainty achieved after collecting samples can be explored for different combinations of prior pdf distribution and sampling method. Three scenarios are considered: (i) uniform prior pdf with random sampling; (ii) non-uniform prior pdf with random sampling; and (iii) non
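
    A minimal illustration of the Bayesian updating at the heart of this design problem: the unknown threshold proportion gets a prior pdf, samples are collected, and the posterior quantifies the remaining uncertainty. The Beta prior and the "observed" exceedance counts below are hypothetical, not from the study.

    ```python
    # Prior-to-posterior update for a threshold proportion (Beta-Binomial model).
    import numpy as np
    from scipy import stats

    prior = stats.beta(2, 8)            # e.g., informed by Monte Carlo transport runs
    n_wells, n_exceed = 25, 6           # hypothetical sampling outcome

    posterior = stats.beta(2 + n_exceed, 8 + n_wells - n_exceed)

    lo, hi = posterior.interval(0.95)
    prior_width = np.diff(prior.interval(0.95))[0]
    print(f"prior mean {prior.mean():.2f}, 95% interval width {prior_width:.2f}")
    print(f"posterior mean {posterior.mean():.2f}, 95% interval ({lo:.2f}, {hi:.2f})")
    ```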

  13. A Bayesian approach for estimating calibration curves and unknown concentrations in immunoassays

    Science.gov (United States)

    Feng, Feng; Sales, Ana Paula; Kepler, Thomas B.

    2011-01-01

    Motivation: Immunoassays are primary diagnostic and research tools throughout the medical and life sciences. The common approach to the processing of immunoassay data involves estimation of the calibration curve followed by inversion of the calibration function to read off the concentration estimates. This approach, however, does not lend itself easily to acceptable estimation of confidence limits on the estimated concentrations. Such estimates must account for uncertainty in the calibration curve as well as uncertainty in the target measurement. Even point estimates can be problematic: because of the non-linearity of calibration curves and error heteroscedasticity, the neglect of components of measurement error can produce significant bias. Methods: We have developed a Bayesian approach for the estimation of concentrations from immunoassay data that treats the propagation of measurement error appropriately. The method uses Markov Chain Monte Carlo (MCMC) to approximate the posterior distribution of the target concentrations and numerically compute the relevant summary statistics. Software implementing the method is freely available for public use. Results: The new method was tested on both simulated and experimental datasets with different measurement error models. The method outperformed the common inverse method on samples with large measurement errors. Even in cases with extreme measurements where the common inverse method failed, our approach always generated reasonable estimates for the target concentrations. Availability: Project name: Baecs; Project home page: www.computationalimmunology.org/utilities/; Operating systems: Linux, MacOS X and Windows; Programming language: C++; License: Free for Academic Use. Contact: feng.feng@duke.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:21149344

  14. Hierarchical Mergence Approach to Cell Detection in Phase Contrast Microscopy Images

    Directory of Open Access Journals (Sweden)

    Lei Chen

    2014-01-01

    Full Text Available The phase contrast microscope is one of the most widely used instruments for observing long-term cell movements in different solutions. Most classic segmentation methods treat a homogeneous patch as an object, whereas recorded cell images have rich detail and many small inhomogeneous patches, as well as some artifacts, which can impede such applications. To tackle these challenges, this paper presents a hierarchical mergence approach (HMA) to extract homogeneous patches and heuristically combine them. Initially, the maximum region of interest (ROI), in which only cell events exist, is drawn using gradient information as a mask. Then, different levels of blurring based on kernel or grayscale morphological operations are applied to the whole image to produce reference images. Next, Otsu's method is applied independently to each unconnected region in the mask, according to the different reference images. Consequently, the segmentation result is generated by combining the usable patches from all informative layers. The proposed approach is not simply a fusion of basic segmentation methods but a well-organized strategy that integrates them. Experiments demonstrate that the proposed method outperforms previous methods on our datasets.
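
    A hedged sketch of the per-region thresholding step described above: Otsu's method applied independently to each connected region of a mask, using scikit-image. The synthetic image and the crude gradient-based mask stand in for a real phase-contrast frame and the paper's ROI construction.

    ```python
    # Region-wise Otsu thresholding on a synthetic image.
    import numpy as np
    from skimage.filters import threshold_otsu, gaussian
    from skimage.measure import label, regionprops

    rng = np.random.default_rng(5)
    image = rng.normal(0.3, 0.05, size=(256, 256))
    image[40:90, 40:90] += 0.4          # two bright "cells" with different contrast
    image[150:200, 150:210] += 0.2

    # crude ROI mask from smoothed gradient magnitude (stand-in for the paper's mask)
    gy, gx = np.gradient(gaussian(image, sigma=2))
    mask = np.hypot(gx, gy) > 0.01

    segmentation = np.zeros_like(image, dtype=bool)
    for region in regionprops(label(mask)):
        minr, minc, maxr, maxc = region.bbox
        patch = image[minr:maxr, minc:maxc]
        local = patch > threshold_otsu(patch)        # region-wise Otsu threshold
        segmentation[minr:maxr, minc:maxc] |= local

    print(f"segmented pixels: {segmentation.sum()}")
    ```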

  15. Hierarchical classifier approach to physical activity recognition via wearable smartphone tri-axial accelerometer.

    Science.gov (United States)

    Yusuf, Feridun; Maeder, Anthony; Basilakis, Jim

    2013-01-01

    Physical activity recognition has emerged as an active area of research which has drawn increasing interest from researchers in a variety of fields. It can support many different applications such as safety surveillance, fraud detection, and clinical management. Accelerometers have emerged as the most useful and extensive tool to capture and assess human physical activities in a continuous, unobtrusive and reliable manner. The need for objective physical activity data arises strongly in health-related research. With the shift to a sedentary lifestyle, where work and leisure tend to be less physically demanding, research on the health effects of low physical activity has become a necessity. The increased availability of small, inexpensive components has led to the development of mobile devices such as smartphones, providing platforms for new opportunities in healthcare applications. In this study, 3 subjects performed directed activity routines wearing a smartphone with a built-in tri-axial accelerometer attached on a belt around the waist. The data were collected to classify 11 basic physical activities such as sitting, lying, standing, walking, and the transitions in between them. A hierarchical classifier approach was utilised, with Artificial Neural Networks integrated into a rule-based system, to classify the activities. Based on our evaluation, recognition accuracy of over 89.6% between subjects and over 91.5% within subjects was achieved. These results show that activities such as these can be recognised with a high accuracy rate; hence the approach is promising for use in future work.

  16. A new approach for modeling generalization gradients: a case for hierarchical models

    Science.gov (United States)

    Vanbrabant, Koen; Boddez, Yannick; Verduyn, Philippe; Mestdagh, Merijn; Hermans, Dirk; Raes, Filip

    2015-01-01

    A case is made for the use of hierarchical models in the analysis of generalization gradients. Hierarchical models overcome several restrictions that are imposed by repeated measures analysis-of-variance (rANOVA), the default statistical method in current generalization research. More specifically, hierarchical models allow the inclusion of continuous independent variables and overcome problematic assumptions such as sphericity. We focus on how generalization research can benefit from this added flexibility. In a simulation study we demonstrate the dominance of hierarchical models over rANOVA. In addition, we show the lack of efficiency of Mauchly's sphericity test at sample sizes typical of generalization research, and confirm how violations of sphericity increase the probability of type I errors. A worked example of a hierarchical model is provided, with a specific emphasis on the interpretation of parameters relevant for generalization research. PMID:26074834
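
    A brief sketch of the kind of hierarchical (mixed-effects) model advocated above, fit with statsmodels: a random intercept per participant and a continuous predictor for stimulus similarity. The data are simulated, and the variable names are illustrative rather than taken from the paper's worked example.

    ```python
    # Random-intercept mixed model for a generalization gradient with a
    # continuous similarity predictor.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(6)
    n_subjects, n_stimuli = 30, 9
    similarity = np.tile(np.linspace(0, 1, n_stimuli), n_subjects)
    subject = np.repeat(np.arange(n_subjects), n_stimuli)
    subj_intercept = rng.normal(0, 0.5, size=n_subjects)

    response = (2.0 + 3.0 * similarity + subj_intercept[subject]
                + rng.normal(0, 0.7, size=similarity.size))
    data = pd.DataFrame({"response": response, "similarity": similarity, "subject": subject})

    model = smf.mixedlm("response ~ similarity", data, groups=data["subject"]).fit()
    print(model.summary())
    ```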

  17. A Bayesian approach to modelling heterogeneous calcium responses in cell populations.

    Directory of Open Access Journals (Sweden)

    Agne Tilūnaitė

    2017-10-01

    Full Text Available Calcium responses have been observed as spikes of the whole-cell calcium concentration in numerous cell types and are essential for translating extracellular stimuli into cellular responses. While there are several suggestions for how this encoding is achieved, we still lack a comprehensive theory. To achieve this goal it is necessary to reliably predict the temporal evolution of calcium spike sequences for a given stimulus. Here, we propose a modelling framework that allows us to quantitatively describe the timing of calcium spikes. Using a Bayesian approach, we show that Gaussian processes model calcium spike rates with high fidelity and perform better than standard tools such as peri-stimulus time histograms and kernel smoothing. We employ our modelling concept to analyse calcium spike sequences from dynamically-stimulated HEK293T cells. Under these conditions, different cells often experience diverse stimulus time courses, which is a situation likely to occur in vivo. This single cell variability and the concomitant small number of calcium spikes per cell pose a significant modelling challenge, but we demonstrate that Gaussian processes can successfully describe calcium spike rates in these circumstances. Our results therefore pave the way towards a statistical description of heterogeneous calcium oscillations in a dynamic environment.
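
    A hedged sketch of using a Gaussian process to describe a smoothly varying spike rate from event counts, here with scikit-learn's GP regression on binned data. It illustrates the modelling concept only; the authors' inference and the HEK293T data are not reproduced, and the kernel choices below are arbitrary.

    ```python
    # Gaussian-process smoothing of a time-varying event rate from binned counts.
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel

    rng = np.random.default_rng(7)
    duration, bin_width = 600.0, 10.0                                   # seconds
    true_rate = lambda t: 0.05 + 0.04 * np.sin(2 * np.pi * t / 200.0)   # spikes per second

    edges = np.arange(0.0, duration + bin_width, bin_width)
    centers = 0.5 * (edges[:-1] + edges[1:])
    counts = rng.poisson(true_rate(centers) * bin_width)

    kernel = 1.0 * RBF(length_scale=50.0) + WhiteKernel(noise_level=0.1)
    gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
    gp.fit(centers.reshape(-1, 1), counts / bin_width)                  # rate per bin

    rate_mean, rate_std = gp.predict(centers.reshape(-1, 1), return_std=True)
    print(f"mean absolute rate error: {np.mean(np.abs(rate_mean - true_rate(centers))):.4f} spikes/s")
    ```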

  18. Carbon isotope discrimination during branch photosynthesis of Fagus sylvatica: a Bayesian modelling approach.

    Science.gov (United States)

    Gentsch, Lydia; Hammerle, Albin; Sturm, Patrick; Ogée, Jérôme; Wingate, Lisa; Siegwolf, Rolf; Plüss, Peter; Baur, Thomas; Buchmann, Nina; Knohl, Alexander

    2014-07-01

    Field measurements of photosynthetic carbon isotope discrimination (¹³Δ) of Fagus sylvatica, conducted with branch bags and laser spectrometry, revealed a high variability of ¹³Δ on both diurnal and day-to-day timescales. We tested the prediction capability of three versions of a commonly used model for ¹³Δ (called here the comprehensive (¹³Δcomp), simplified (¹³Δsimple) and revised (¹³Δrevised) versions). A Bayesian approach was used to calibrate major model parameters. Constrained estimates were found for the fractionation during CO₂ fixation in ¹³Δcomp, but not in ¹³Δsimple, and partially for the mesophyll conductance for CO₂ (gi). No constrained estimates were found for fractionations during mitochondrial respiration and photorespiration, or for a diurnally variable apparent fractionation between current assimilates and mitochondrial respiration, specific to ¹³Δrevised. A quantification of parameter estimation uncertainties and interdependencies further helped explore model structure and behaviour. We found that ¹³Δcomp usually outperformed ¹³Δsimple because the explicit consideration of gi and the photorespiratory fractionation in ¹³Δcomp enabled a better description of the large observed diurnal variation (≈9‰) of ¹³Δ. Flux-weighted daily means of ¹³Δ were also better predicted with ¹³Δcomp than with ¹³Δsimple. © 2013 John Wiley & Sons Ltd.

  19. Risk assessment of pre-hospital trauma airway management by anaesthesiologists using the predictive Bayesian approach

    Directory of Open Access Journals (Sweden)

    Nakstad Anders R

    2010-04-01

    Full Text Available Abstract Introduction Endotracheal intubation (ETI) has been considered an essential part of pre-hospital advanced life support. Pre-hospital ETI, however, is a complex intervention even for airway specialists such as anaesthesiologists working as pre-hospital emergency physicians. We therefore wanted to investigate the quality of pre-hospital airway management by anaesthesiologists in severely traumatised patients and identify possible areas for improvement. Method We performed a risk assessment according to the predictive Bayesian approach in a typical anaesthesiologist-manned Norwegian helicopter emergency medical service (HEMS). The main focus of the risk assessment was the event in which a patient arrives in the emergency department without ETI despite a pre-hospital indication for it. Results In the risk assessment, we assigned a high probability (29%) to the event assessed, that a patient arrives without ETI despite a pre-hospital indication. However, several uncertainty factors in the risk assessment were identified, related to data quality, indications for use of ETI, patient outcome and the need for special training of ETI providers. Conclusion Our risk assessment indicated a high probability that trauma patients with an indication for pre-hospital ETI do not receive it in the studied HEMS. The uncertainty factors identified in the assessment should be further investigated to better understand the problem assessed and the consequences for patients. Better quality of pre-hospital airway management data could contribute to a reduction of these uncertainties.

  20. Receiver-based recovery of clipped ofdm signals for papr reduction: A bayesian approach

    KAUST Repository

    Ali, Anum

    2014-01-01

    Clipping is one of the simplest peak-to-average power ratio reduction schemes for orthogonal frequency division multiplexing (OFDM). Deliberately clipping the transmission signal degrades system performance, and clipping mitigation is required at the receiver for information restoration. In this paper, we acknowledge the sparse nature of the clipping signal and propose a low-complexity Bayesian clipping estimation scheme. The proposed scheme utilizes a priori information about the sparsity rate and noise variance for enhanced recovery. At the same time, the proposed scheme is robust against inaccurate estimates of the clipping signal statistics. The undistorted phase property of the clipped signal, as well as the clipping likelihood, is utilized for enhanced reconstruction. Furthermore, motivated by the nature of modern OFDM-based communication systems, we extend our clipping reconstruction approach to multiple antenna receivers and multi-user OFDM. We also address the problem of channel estimation from pilots contaminated by the clipping distortion. Numerical findings are presented that depict favorable results for the proposed scheme compared to the established sparse reconstruction schemes.

  1. A Parallel and Incremental Approach for Data-Intensive Learning of Bayesian Networks.

    Science.gov (United States)

    Yue, Kun; Fang, Qiyu; Wang, Xiaoling; Li, Jin; Liu, Weiyi

    2015-12-01

    The Bayesian network (BN) has been adopted as the underlying model for representing and inferring uncertain knowledge. As the basis of realistic applications centered on probabilistic inferences, learning a BN from data is a critical subject of machine learning, artificial intelligence, and big data paradigms. Currently, it is necessary to extend the classical methods for learning BNs with respect to data-intensive computing or cloud environments. In this paper, we propose a parallel and incremental approach for data-intensive learning of BNs from massive, distributed, and dynamically changing data by extending the classical scoring-and-search algorithm and using MapReduce. First, we adopt the minimum description length as the scoring metric and give two-pass MapReduce-based algorithms for computing the required marginal probabilities and scoring the candidate graphical model from sample data. Then, we give the corresponding strategy for extending the classical hill-climbing algorithm to obtain the optimal structure, as well as that for storing a BN as key/value pairs. Further, in view of the dynamic characteristics of the changing data, we introduce the concept of influence degree to measure the coincidence of the current BN with new data, and then propose the corresponding two-pass MapReduce-based algorithms for incremental learning of BNs. Experimental results show the efficiency, scalability, and effectiveness of our methods.
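
    A small single-process illustration of the counting idea behind MDL scoring of a candidate structure: a map step emits (child, parent-configuration, value) keys from data partitions, and a reduce step aggregates the counts needed for the conditional probabilities. The dataset and the candidate structure are invented; this is a toy, not the paper's MapReduce implementation.

    ```python
    # Map/reduce-style counting of (child, parent-config, value) occurrences.
    from collections import Counter
    from functools import reduce

    # hypothetical binary dataset partitions (records: dicts of variable -> value)
    partitions = [
        [{"A": 0, "B": 1, "C": 1}, {"A": 1, "B": 1, "C": 0}],
        [{"A": 0, "B": 0, "C": 0}, {"A": 1, "B": 1, "C": 1}],
    ]
    candidate_parents = {"C": ("A", "B")}     # structure being scored: A, B -> C

    def map_counts(records):
        out = Counter()
        for r in records:
            for child, parents in candidate_parents.items():
                key = (child, tuple(r[p] for p in parents), r[child])
                out[key] += 1
        return out

    def reduce_counts(c1, c2):
        c1.update(c2)
        return c1

    counts = reduce(reduce_counts, map(map_counts, partitions), Counter())
    for key, n in sorted(counts.items()):
        print(key, n)
    ```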

  2. A Robust Bayesian Approach to an Optimal Replacement Policy for Gas Pipelines

    Directory of Open Access Journals (Sweden)

    José Pablo Arias-Nicolás

    2015-06-01

    Full Text Available In this paper, we address Bayesian sensitivity issues when integrating experts' judgments with available historical data in a case study about strategies for the preventive maintenance of low-pressure cast iron pipelines in an urban gas distribution network. We are interested in replacement priorities, as determined by the failure rates of pipelines deployed under different conditions. We relax the assumptions made in previous papers about the prior distributions on the failure rates and study changes in replacement priorities under different choices of generalized moment-constrained classes of priors. We focus on the set of non-dominated actions, and among them, we propose the least sensitive action as the optimal choice to rank the different classes of pipelines, providing a sound approach to the sensitivity problem. Moreover, we are also interested in determining which classes have a failure rate exceeding a given acceptable value, considered as the threshold determining no need for replacement. Graphical tools are introduced to help decision-makers determine whether pipelines are to be replaced and the corresponding priorities.

  3. An Integrated Approach to Battery Health Monitoring using Bayesian Regression, Classification and State Estimation

    Data.gov (United States)

    National Aeronautics and Space Administration — The application of the Bayesian theory of managing uncertainty and complexity to regression and classification in the form of Relevance Vector Machine (RVM), and to...

  4. Robust Bayesian super-resolution approach via sparsity enforcing a priori for near-field aeroacoustic source imaging

    Science.gov (United States)

    Chu, Ning; Mohammad-Djafari, Ali; Picheral, José

    2013-09-01

    Near-field aeroacoustic imaging has attracted great attention from researchers and engineers working on aeroacoustic source localization and power estimation for decades. Recently, deconvolution and regularization methods have greatly improved the spatial resolution of beamforming methods. However, they are neither robust to background noise in low signal-to-noise ratio (SNR) situations nor able to provide a wide dynamic range of power estimation. In this paper, we first propose an improved forward model of aeroacoustic power propagation in which background noise and forward model uncertainty are considered for robustness. To solve the inverse problem, we then propose a robust Bayesian super-resolution approach via a sparsity-enforcing prior. The sparse prior on source powers can be modeled by a double exponential distribution, which improves the spatial resolution and promotes a wide dynamic range of source powers. Both the hyperparameters and the source powers can be alternately estimated by the Bayesian inference approach based on joint maximum a posteriori (MAP) optimization. Finally, our Bayesian approach is compared with some of the state-of-the-art methods on simulated, real and hybrid data. The main advantages of our approach are robustness to noise, a wide dynamic range, super spatial resolution, and no need for prior knowledge of the source number or SNR. It is feasible to apply it to aeroacoustic imaging with a 2D non-uniform microphone array in wind tunnel tests, especially for near-field monopole and extended source imaging.

  5. International Stock Market Efficiency: A Non-Bayesian Time-Varying Model Approach

    OpenAIRE

    Mikio Ito; Akihiko Noda; Tatsuma Wada

    2012-01-01

    This paper develops a non-Bayesian methodology to analyze the time-varying structure of international linkages and market efficiency in G7 countries. We consider a non-Bayesian time-varying vector autoregressive (TV-VAR) model, and apply it to estimate the joint degree of market efficiency in the sense of Fama (1970, 1991). Our empirical results provide a new perspective that the international linkages and market efficiency change over time and that their behaviors correspond well to historic...

  6. Bayesian data analysis for newcomers.

    Science.gov (United States)

    Kruschke, John K; Liddell, Torrin M

    2017-04-12

    This article explains the foundational concepts of Bayesian data analysis using virtually no mathematical notation. Bayesian ideas already match your intuitions from everyday reasoning and from traditional data analysis. Simple examples of Bayesian data analysis are presented that illustrate how the information delivered by a Bayesian analysis can be directly interpreted. Bayesian approaches to null-value assessment are discussed. The article clarifies misconceptions about Bayesian methods that newcomers might have acquired elsewhere. We discuss prior distributions and explain how they are not a liability but an important asset. We discuss the relation of Bayesian data analysis to Bayesian models of mind, and we briefly discuss what methodological problems Bayesian data analysis is not meant to solve. After you have read this article, you should have a clear sense of how Bayesian data analysis works and the sort of information it delivers, and why that information is so intuitive and useful for drawing conclusions from data.

  7. Improved Membership Probability for Moving Groups: Bayesian and Machine Learning Approaches

    Science.gov (United States)

    Lee, Jinhee; Song, Inseok

    2018-01-01

    Gravitationally unbound loose stellar associations (i.e., young nearby moving groups: moving groups hereafter) have been intensively explored because they are important in planet and disk formation studies, exoplanet imaging, and age calibration. Among the many efforts devoted to the search for moving group members, a Bayesian approach (e.g., using the code BANYAN) has become popular recently because of the many advantages it offers. However, the resultant membership probability needs to be adopted carefully because of its sensitive dependence on input models. In this study, we have developed an improved membership calculation tool focusing on the beta-Pic moving group. We made three improvements in building the models used in BANYAN II: (1) updating the list of accepted members by re-assessing memberships in terms of position, motion, and age, (2) investigating member distribution functions in XYZ, and (3) exploring field star distribution functions in XYZUVW. Our improved tool can change membership probability by up to 70%. Membership probability is critical and must be better defined. For example, our code identifies only one third of the candidate members in SIMBAD that are believed to be kinematically associated with the beta-Pic moving group. Additionally, we performed cluster analysis of young nearby stars using an unsupervised machine learning approach. As more moving groups and their members are identified, the complexity and ambiguity in moving group configuration have increased. To clarify this issue, we analyzed ~4,000 X-ray bright young stellar candidates. Here, we present the preliminary results. By re-identifying moving groups with minimal human intervention, we expect to better understand the composition of the solar neighborhood. Moreover, better defined moving group membership will help us understand star formation and evolution in relatively low density environments, especially for the low-mass stars that will be identified in the coming Gaia release.

  8. Hierarchical Decomposition Thermodynamic Approach for the Study of Solar Absorption Refrigerator Performance

    Directory of Open Access Journals (Sweden)

    Emma Berrich Betouche

    2016-03-01

    Full Text Available A thermodynamic approach based on hierarchical decomposition, a technique commonly used in mechanical structural engineering, is proposed. The methodology is applied to an absorption refrigeration cycle, and a thermodynamic analysis of the performance of solar absorption refrigerators is presented. Under the hypothesis of an endoreversible model, the effects of the generator, solar concentrator and solar converter temperatures on the coefficient of performance (COP) are presented and discussed. In particular, the variations of the coefficient of performance with the ratio of the heat-transfer area of the high-temperature part (the thermal engine 2), Ah, to the heat-transfer area of the low-temperature part (the thermal receptor), Ar, are studied in this paper. For low values of the heat-transfer area of the high-temperature part and relatively large values of the heat-transfer area of the low-temperature part, for example Ah equal to 30% of Ar, the coefficient of performance is relatively high (approximately 65%). For an equal-area distribution corresponding to an area ratio Ah/Ar of 50%, the COP is approximately 35%. The originality of this approach is that it allows a conceptual study of the solar absorption cycle.

  9. Impact of food, housing, and transportation insecurity on ART adherence: a hierarchical resources approach.

    Science.gov (United States)

    Cornelius, Talea; Jones, Maranda; Merly, Cynthia; Welles, Brandi; Kalichman, Moira O; Kalichman, Seth C

    2017-04-01

    Antiretroviral therapy (ART) has transformed HIV into a manageable illness. However, high levels of adherence must be maintained. Lack of access to basic resources (food, transportation, and housing) has been consistently associated with suboptimal ART adherence. Moving beyond such direct effects, this study takes a hierarchical resources approach in which the effects of access to basic resources on ART adherence are mediated through interpersonal resources (social support and care services) and personal resources (self-efficacy). Participants were 915 HIV-positive men and women living in Atlanta, GA, recruited from community centers and infectious disease clinics. Participants answered baseline questionnaires, and provided prospective data on ART adherence. Across a series of nested models, a consistent pattern emerged whereby lack of access to basic resources had indirect, negative effects on adherence, mediated through both lack of access to social support and services, and through lower treatment self-efficacy. There was also a significant direct effect of lack of access to transportation on adherence. Lack of access to basic resources negatively impacts ART adherence. Effects for housing instability and food insecurity were fully mediated through social support, access to services, and self-efficacy, highlighting these as important targets for intervention. Targeting service supports could be especially beneficial due to the potential to both promote adherence and to link clients with other services to supplement food, housing, and transportation. Inability to access transportation had a direct negative effect on adherence, suggesting that free or reduced cost transportation could positively impact ART adherence among disadvantaged populations.

  10. Association between parental guilt and oral health problems in preschool children: a hierarchical approach.

    Science.gov (United States)

    Gomes, Monalisa Cesarino; Clementino, Marayza Alves; Pinto-Sarmento, Tassia Cristina de Almeida; Martins, Carolina Castro; Granville-Garcia, Ana Flávia; Paiva, Saul Martins

    2014-08-16

    Dental caries and traumatic dental injury (TDI) can play an important role in the emergence of parental guilt, since parents feel responsible for their child's health. The aim of the present study was to evaluate the influence of oral health problems among preschool children on parental guilt. A preschool-based, cross-sectional study was carried out with 832 preschool children between three and five years of age in the city of Campina Grande, Brazil. Parents/caregivers answered the Brazilian version of the Early Childhood Oral Health Impact Scale (B-ECOHIS). The item "parental guilt" was the dependent variable. Questionnaires addressing socio-demographic variables (child's sex, child's age, parent's/caregiver's age, mother's schooling, type of preschool and household income), history of toothache and health perceptions (general and oral) were also administered. Clinical exams for dental caries and TDI were performed by three dentists who had undergone a training and calibration exercise (Kappa: 0.85-0.90). Poisson hierarchical regression was used to determine the significance of associations between parental guilt and oral health problems (α = 5%). The multivariate model was carried out on three levels using a hierarchical approach from distal to proximal determinants: 1) socio-demographic aspects; 2) health perceptions; and 3) oral health problems. The frequency of parental guilt was 22.8%. The following variables were significantly associated with parental guilt: parental perception of child's oral health as poor (PR = 2.010; 95% CI: 1.502-2.688), history of toothache (PR = 2.344; 95% CI: 1.755-3.130), cavitated lesions (PR = 2.002; 95% CI: 1.388-2.887), avulsion/luxation (PR = 2.029; 95% CI: 1.141-3.610) and tooth discoloration (PR = 1.540; 95% CI: 1.169-2.028). Based on the present findings, parental guilt increases with the occurrence of oral health problems that require treatment, such as dental caries and TDI of greater severity. Parental perceptions of

  11. Hierarchical eco-restoration: A systematical approach to removal of COD and dissolved nutrients from an intensive agricultural area

    Energy Technology Data Exchange (ETDEWEB)

    Wu Yonghong, E-mail: yhwu@issas.ac.c [State Key Laboratory of Soil and Sustainable Agriculture, Institute of Soil Science, Chinese Academy of Sciences, 71 Beijing East Road, Nanjing 210008 (China); Graduate Schools, Chinese Academy of Sciences, Beijing 100049 (China); Hu Zhengyi [Graduate Schools, Chinese Academy of Sciences, Beijing 100049 (China); Yang Linzhang, E-mail: lzyang@issas.ac.c [State Key Laboratory of Soil and Sustainable Agriculture, Institute of Soil Science, Chinese Academy of Sciences, 71 Beijing East Road, Nanjing 210008 (China)

    2010-10-15

    A systematic approach based on a hierarchical eco-restoration system for the simultaneous removal of COD and dissolved nutrients was proposed and applied in a complex residential-cropland area in Kunming, China from August 2006 to August 2008, where the self-purifying capacity of the agricultural ecosystem had been lost. The system includes four main parts: (1) fertilizer management and agricultural structure optimization, (2) nutrient reuse, (3) wastewater treatment, and (4) catchment restoration. The results showed that the average removal efficiencies were 90% for COD, 93% for ammonia, 94% for nitrate and 71% for total dissolved phosphorus (TDP) when the hierarchical eco-restoration agricultural system was in a relatively steady-state condition. The emergence of 14 species of macrophytes and 4 species of zoobenthos indicated that the growth conditions for the plankton were improved. The results demonstrated that this promising and environmentally benign hierarchical eco-restoration system could decrease the output of nutrients and reduce downstream eutrophication risk. - A systematic approach based on a hierarchical eco-restoration system has proven highly effective for simultaneously removing COD and dissolved nutrients, decreasing the output of nutrients, and reducing the eutrophication risk of downstream surface waters.

  12. A Poisson hierarchical modelling approach to detecting copy number variation in sequence coverage data

    KAUST Repository

    Sepúlveda, Nuno

    2013-02-26

    Background: The advent of next generation sequencing technology has accelerated efforts to map and catalogue copy number variation (CNV) in genomes of important micro-organisms for public health. A typical analysis of the sequence data involves mapping reads onto a reference genome, calculating the respective coverage, and detecting regions with too-low or too-high coverage (deletions and amplifications, respectively). Current CNV detection methods rely on statistical assumptions (e.g., a Poisson model) that may not hold in general, or require fine-tuning the underlying algorithms to detect known hits. We propose a new CNV detection methodology based on two Poisson hierarchical models, the Poisson-Gamma and Poisson-Lognormal, with the advantage of being sufficiently flexible to describe different data patterns, whilst robust against deviations from the often assumed Poisson model. Results: Using sequence coverage data of 7 Plasmodium falciparum malaria genomes (3D7 reference strain, HB3, DD2, 7G8, GB4, OX005, and OX006), we showed that empirical coverage distributions are intrinsically asymmetric and overdispersed in relation to the Poisson model. We also demonstrated a low baseline false positive rate for the proposed methodology using 3D7 resequencing data and simulation. When applied to the non-reference isolate data, our approach detected known CNV hits, including an amplification of the PfMDR1 locus in DD2 and a large deletion in the CLAG3.2 gene in GB4, and putative novel CNV regions. When compared to the recently available FREEC and cn.MOPS approaches, our findings were more concordant with putative hits from the highest quality array data for the 7G8 and GB4 isolates. Conclusions: In summary, the proposed methodology brings an increase in flexibility, robustness, accuracy and statistical rigour to CNV detection using sequence coverage data. © 2013 Sepúlveda et al.; licensee BioMed Central Ltd.

  13. A Poisson hierarchical modelling approach to detecting copy number variation in sequence coverage data.

    Science.gov (United States)

    Sepúlveda, Nuno; Campino, Susana G; Assefa, Samuel A; Sutherland, Colin J; Pain, Arnab; Clark, Taane G

    2013-02-26

    The advent of next generation sequencing technology has accelerated efforts to map and catalogue copy number variation (CNV) in genomes of important micro-organisms for public health. A typical analysis of the sequence data involves mapping reads onto a reference genome, calculating the respective coverage, and detecting regions with too-low or too-high coverage (deletions and amplifications, respectively). Current CNV detection methods rely on statistical assumptions (e.g., a Poisson model) that may not hold in general, or require fine-tuning the underlying algorithms to detect known hits. We propose a new CNV detection methodology based on two Poisson hierarchical models, the Poisson-Gamma and Poisson-Lognormal, with the advantage of being sufficiently flexible to describe different data patterns, whilst robust against deviations from the often assumed Poisson model. Using sequence coverage data of 7 Plasmodium falciparum malaria genomes (3D7 reference strain, HB3, DD2, 7G8, GB4, OX005, and OX006), we showed that empirical coverage distributions are intrinsically asymmetric and overdispersed in relation to the Poisson model. We also demonstrated a low baseline false positive rate for the proposed methodology using 3D7 resequencing data and simulation. When applied to the non-reference isolate data, our approach detected known CNV hits, including an amplification of the PfMDR1 locus in DD2 and a large deletion in the CLAG3.2 gene in GB4, and putative novel CNV regions. When compared to the recently available FREEC and cn.MOPS approaches, our findings were more concordant with putative hits from the highest quality array data for the 7G8 and GB4 isolates. In summary, the proposed methodology brings an increase in flexibility, robustness, accuracy and statistical rigour to CNV detection using sequence coverage data.
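
    To make the modelling choice described in these two records concrete, the sketch below fits both a Poisson and a Poisson-Gamma (negative binomial) distribution to simulated per-window read coverage and flags windows in the extreme tails; the data, thresholds, and variable names are illustrative and do not reproduce the published pipeline.

      # Sketch: compare a Poisson fit with a Poisson-Gamma (negative binomial) fit
      # to per-window read coverage, then flag putative deletions/amplifications.
      # All data and thresholds are illustrative only.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(1)
      # Simulated overdispersed coverage for 2000 windows (mean ~100 reads).
      coverage = rng.negative_binomial(n=10, p=10 / (10 + 100), size=2000)

      # Poisson fit: a single rate parameter.
      lam = coverage.mean()
      loglik_pois = stats.poisson.logpmf(coverage, lam).sum()

      # Negative binomial (Poisson-Gamma mixture) fit via method of moments.
      mean, var = coverage.mean(), coverage.var()
      size = mean**2 / max(var - mean, 1e-9)      # dispersion parameter
      prob = size / (size + mean)
      loglik_nb = stats.nbinom.logpmf(coverage, size, prob).sum()
      print(f"log-likelihood: Poisson {loglik_pois:.1f}  vs  Poisson-Gamma {loglik_nb:.1f}")

      # Flag windows in the extreme tails of the better-fitting model.
      lo, hi = stats.nbinom.ppf([0.001, 0.999], size, prob)
      deletions = np.where(coverage < lo)[0]
      amplifications = np.where(coverage > hi)[0]
      print(len(deletions), "putative deletions,", len(amplifications), "putative amplifications")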

  14. Stakeholder perceptions of soil managements in the Canyoles watershed. A Bayesian Belief Network approach

    Science.gov (United States)

    Burguet Marimón, Maria; Quinn, Claire; Stringer, Lindsay; Cerdà, Artemi

    2017-04-01

    not fight against these problems as, on the one hand, they do not realize that non-sustainable soil erosion rates reduce soil fertility, and, on the other hand, there are several cultural issues that guide them towards bare soil as they find this as a tidy way to keep their properties. However, more research needs to be done on the BBN approach in order to be able to have a holistic approach regarding the vision of the farmers concerning the use of the different soil conservation strategies. Acknowledgements. The research leading to these results has received funding from the European Union Seventh Framework Programme (FP7/2007-2013) under grant agreement n° 603498 (RECARE project). References Cain, J. 2001. Planning improvements in natural resources management: Guidelines for using Bayesian networks to support the planning and management of development programmes in the water sector and beyond. Centre for Ecology & Hydrology, Wallingford, UK. Marques, M. J., R. Bienes, J. Cuadrado, M. Ruiz-Colmenero, C. Barbero-Sierra, and A. Velasco. 2015. Analysing Perceptions Attitudes and Responses of Winegrowers about Sustainable Land Management in Central Spain. Land Degradation and Development 26 (5): 458-467. doi:10.1002/ldr.2355. Tengberg, A., F. Radstake, K. Zhang, and B. Dunn. 2016. Scaling Up of Sustainable Land Management in the Western People's Republic of China: Evaluation of a 10-Year Partnership. Land Degradation and Development 27 (2): 134-144. doi:10.1002/ldr.2270. Teshome, A., J. de Graaff, C. Ritsema, and M. Kassie. 2016. Farmers' Perceptions about the Influence of Land Quality, Land Fragmentation and Tenure Systems on Sustainable Land Management in the North Western Ethiopian Highlands. Land Degradation and Development 27 (4): 884-898. doi:10.1002/ldr.2298.

  15. Improving the quantification of contrast enhanced ultrasound using a Bayesian approach

    Science.gov (United States)

    Rizzo, Gaia; Tonietto, Matteo; Castellaro, Marco; Raffeiner, Bernd; Coran, Alessandro; Fiocco, Ugo; Stramare, Roberto; Grisan, Enrico

    2017-03-01

    Contrast Enhanced Ultrasound (CEUS) is a sensitive imaging technique to assess tissue vascularity that can be useful in the quantification of different perfusion patterns. This can be particularly important in the early detection and staging of arthritis. In a recent study we have shown that a Gamma-variate can accurately quantify synovial perfusion and is flexible enough to describe many heterogeneous patterns. Moreover, we have shown that through a pixel-by-pixel analysis the quantitative information gathered characterizes the perfusion more effectively. However, the low SNR of the data and the nonlinearity of the model make parameter estimation difficult. Using a classical non-linear least-squares (NLLS) approach, the number of unreliable estimates (those with an asymptotic coefficient of variation greater than a user-defined threshold) is significant, thus affecting the overall description of the perfusion kinetics and of its heterogeneity. In this work we propose to solve the parameter estimation at the pixel level within a Bayesian framework using Variational Bayes (VB), and an automatic and data-driven prior initialization. When evaluating the pixels for which both VB and NLLS provided reliable estimates, we demonstrated that the parameter values provided by the two methods are well correlated (Pearson's correlation between 0.85 and 0.99). Moreover, the mean number of unreliable pixels drastically reduces from 54% (NLLS) to 26% (VB), without increasing the computational time (0.05 s/pixel for NLLS and 0.07 s/pixel for VB). When considering the efficiency of the algorithms as computational time per reliable estimate, VB outperforms NLLS (0.11 versus 0.25 seconds per reliable estimate, respectively).
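
    For orientation, the Gamma-variate time-intensity model and the NLLS baseline mentioned above can be sketched as follows; the VB estimator itself is not reproduced here, and all parameter values, noise levels, and thresholds are illustrative.

      # Sketch: fit a Gamma-variate model to one pixel's CEUS time-intensity curve
      # with non-linear least squares (the NLLS baseline); illustrative values only.
      import numpy as np
      from scipy.optimize import curve_fit

      def gamma_variate(t, A, t0, alpha, beta):
          """Gamma-variate: zero before arrival time t0, then a skewed bolus curve."""
          tt = np.clip(t - t0, 0, None)
          return A * tt**alpha * np.exp(-tt / beta)

      t = np.linspace(0, 60, 240)                      # seconds
      rng = np.random.default_rng(2)
      signal = gamma_variate(t, 5.0, 4.0, 1.5, 6.0) + rng.normal(0, 5.0, t.size)

      p0 = (1.0, 2.0, 1.0, 5.0)                        # rough initial guess
      bounds = ([0, 0, 0.1, 0.1], [np.inf, 30, 10, 60])
      popt, pcov = curve_fit(gamma_variate, t, signal, p0=p0, bounds=bounds)
      cv = np.sqrt(np.diag(pcov)) / np.abs(popt)       # asymptotic coefficient of variation
      print("estimates:", popt.round(2), " CV:", cv.round(2))
      # Pixels whose CV exceeds a chosen threshold would be marked unreliable,
      # which is the problem the Bayesian (VB) estimator addresses.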

  16. Assessment of Earthquake Hazard Parameters with Bayesian Approach Method Around Karliova Triple Junction, Eastern Turkey

    Science.gov (United States)

    Türker, Tugba; Bayrak, Yusuf

    2017-12-01

    In this study, a Bayesian approach is used to evaluate the earthquake hazard parameters maximum regional magnitude (Mmax), β value, and seismic activity rate or intensity (λ), together with their uncertainties, for the next 5, 10, 25, 50, and 100 years around the Karlıova Triple Junction (KTJ). A compiled earthquake catalog that is homogeneous for Ms ≥ 3.0 was prepared for the period from 1900 to 2017. The study area was divided into four seismic source regions based on epicenter distribution, tectonics, seismicity, and faults around the KTJ. Two historical earthquakes are included around the KTJ: 1866, Ms = 7.2, for Region 3 (between Bingöl-Karlıova-Muş-Bitlis: Bahçeköy Fault Zone-Uzunpınar Fault Zone-Karakoçan Fault-Muş Fault Zone-Kavakbaşı Fault) and 1874, Ms = 7.1, for Region 4 (between Malatya-Elazığ-Tunceli: Palu Basin-Pütürge Basin-Erkenek Fault-Malatya Fault). The computed Mmax values are between 7.71 and 8.17. The quantiles of the distributions of true and apparent magnitude on a given time interval [0, T] are evaluated. The quantiles of the distributions of apparent and true magnitudes for the next time intervals of 5, 10, 25, 50, and 100 years are calculated for confidence limits at probability levels of 50, 70, and 90% around the KTJ. According to the computed earthquake hazard parameters, the Erzincan Basin-Ovacık Fault-Pülümür Fault-Yedisu Basin region is the most seismically active region of the KTJ. For this region the highest earthquake magnitude, 7.16, is estimated with a 90% probability level in the next 100 years, making it the most dangerous region compared with the others. The results of this study can be used in earthquake hazard studies of the East Anatolian region.

  17. A Bayesian kriging approach for blending satellite and ground precipitation observations

    Science.gov (United States)

    Verdin, Andrew P.; Rajagopalan, Balaji; Kleiber, William; Funk, Christopher C.

    2015-01-01

    Drought and flood management practices require accurate estimates of precipitation. Gauge observations, however, are often sparse in regions with complicated terrain, clustered in valleys, and of poor quality. Consequently, the spatial extent of wet events is poorly represented. Satellite-derived precipitation data are an attractive alternative, though they tend to underestimate the magnitude of wet events due to their dependency on retrieval algorithms and the indirect relationship between satellite infrared observations and precipitation intensities. Here we offer a Bayesian kriging approach for blending precipitation gauge data and the Climate Hazards Group Infrared Precipitation satellite-derived precipitation estimates for Central America, Colombia, and Venezuela. First, the gauge observations are modeled as a linear function of satellite-derived estimates and any number of other variables—for this research we include elevation. Prior distributions are defined for all model parameters and the posterior distributions are obtained simultaneously via Markov chain Monte Carlo sampling. The posterior distributions of these parameters are required for spatial estimation, and thus are obtained prior to implementing the spatial kriging model. This functional framework is applied to model parameters obtained by sampling from the posterior distributions, and the residuals of the linear model are subject to a spatial kriging model. Consequently, the posterior distributions and uncertainties of the blended precipitation estimates are obtained. We demonstrate this method by applying it to pentadal and monthly total precipitation fields during 2009. The model's performance and its inherent ability to capture wet events are investigated. We show that this blending method significantly improves upon the satellite-derived estimates and is also competitive in its ability to represent wet events. This procedure also provides a means to estimate a full conditional distribution
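
    The two-stage structure described above (a linear model of gauge totals on satellite estimates and elevation, followed by kriging of the residuals) can be sketched as below; this is a simplified stand-in for the Bayesian sampler, and the covariance model, ranges, and data are illustrative.

      # Sketch of the blending idea: regress gauge precipitation on satellite
      # estimates and elevation, then apply simple kriging to the residuals with
      # an exponential covariance. Data and parameters are illustrative.
      import numpy as np

      rng = np.random.default_rng(3)
      n = 60
      xy = rng.uniform(0, 100, (n, 2))                 # gauge locations (km)
      sat = rng.gamma(2.0, 10.0, n)                    # satellite totals (mm)
      elev = rng.uniform(0, 3000, n)                   # elevation (m)
      gauge = 5 + 0.8 * sat + 0.002 * elev + rng.normal(0, 4, n)

      # Stage 1: linear model (ordinary least squares stands in here for the
      # posterior means of the regression coefficients).
      X = np.column_stack([np.ones(n), sat, elev])
      beta, *_ = np.linalg.lstsq(X, gauge, rcond=None)
      resid = gauge - X @ beta

      # Stage 2: simple kriging of the residuals with an exponential covariance.
      def exp_cov(d, sill=16.0, range_km=30.0):
          return sill * np.exp(-d / range_km)

      d_obs = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
      K = exp_cov(d_obs) + 1e-6 * np.eye(n)            # small nugget for stability

      x_new = np.array([50.0, 50.0])                   # prediction location
      sat_new, elev_new = 25.0, 1200.0
      d_new = np.linalg.norm(xy - x_new, axis=1)
      weights = np.linalg.solve(K, exp_cov(d_new))
      blended = np.array([1.0, sat_new, elev_new]) @ beta + weights @ resid
      print(f"blended estimate at (50, 50): {blended:.1f} mm")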

  18. A New Approach for Obtaining Cosmological Constraints from Type Ia Supernovae using Approximate Bayesian Computation

    Energy Technology Data Exchange (ETDEWEB)

    Jennings, Elise; Wolf, Rachel; Sako, Masao

    2016-11-09

    Cosmological parameter estimation techniques that robustly account for systematic measurement uncertainties will be crucial for the next generation of cosmological surveys. We present a new analysis method, superABC, for obtaining cosmological constraints from Type Ia supernova (SN Ia) light curves using Approximate Bayesian Computation (ABC) without any likelihood assumptions. The ABC method works by using a forward model simulation of the data where systematic uncertainties can be simulated and marginalized over. A key feature of the method presented here is the use of two distinct metrics, the `Tripp' and `Light Curve' metrics, which allow us to compare the simulated data to the observed data set. The Tripp metric takes as input the parameters of models fit to each light curve with the SALT-II method, whereas the Light Curve metric uses the measured fluxes directly without model fitting. We apply the superABC sampler to a simulated data set of $\sim$1000 SNe corresponding to the first season of the Dark Energy Survey Supernova Program. Varying $\Omega_m, w_0, \alpha$ and $\beta$ and a magnitude offset parameter, with no systematics we obtain $\Delta(w_0) = w_0^{\rm true} - w_0^{\rm best\,fit} = -0.036\pm0.109$ (a $\sim11$% 1$\sigma$ uncertainty) using the Tripp metric and $\Delta(w_0) = -0.055\pm0.068$ (a $\sim7$% 1$\sigma$ uncertainty) using the Light Curve metric. Including 1% calibration uncertainties in four passbands, adding 4 more parameters, we obtain $\Delta(w_0) = -0.062\pm0.132$ (a $\sim14$% 1$\sigma$ uncertainty) using the Tripp metric. Overall we find a $17$% increase in the uncertainty on $w_0$ with systematics compared to without. We contrast this with an MCMC approach where systematic effects are approximately included. We find that the MCMC method slightly underestimates the impact of calibration uncertainties for this simulated data set.
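
    The ABC mechanism itself, independent of the supernova simulator, can be illustrated with a tiny rejection sampler; this generic sketch is not the superABC code, and the toy model, prior, and tolerance are illustrative.

      # Sketch: rejection ABC for a toy problem (estimating a Gaussian mean).
      # superABC replaces this toy simulator with a forward simulation of SN Ia
      # light curves and the Tripp or Light Curve distance metric.
      import numpy as np

      rng = np.random.default_rng(4)
      observed = rng.normal(0.3, 1.0, 500)             # "data" with true mean 0.3
      obs_summary = observed.mean()

      def simulate(theta, n=500):
          """Forward model: generate a data set of size n given the parameter."""
          return rng.normal(theta, 1.0, n)

      def distance(sim):
          """Metric comparing simulated and observed summary statistics."""
          return abs(sim.mean() - obs_summary)

      n_draws, eps = 20000, 0.02
      prior_draws = rng.uniform(-2, 2, n_draws)        # flat prior on the mean
      accepted = np.array([th for th in prior_draws if distance(simulate(th)) < eps])
      print(f"accepted {accepted.size} draws; posterior mean "
            f"{accepted.mean():.3f} +/- {accepted.std():.3f}")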

  19. BClass: A Bayesian Approach Based on Mixture Models for Clustering and Classification of Heterogeneous Biological Data

    Directory of Open Access Journals (Sweden)

    Arturo Medrano-Soto

    2004-12-01

    Based on mixture models, we present a Bayesian method (called BClass) to classify biological entities (e.g., genes) when variables of quite heterogeneous nature are analyzed. Various statistical distributions are used to model the continuous/categorical data commonly produced by genetic experiments and large-scale genomic projects. We calculate the posterior probability of each entry to belong to each element (group) in the mixture. In this way, an original set of heterogeneous variables is transformed into a set of purely homogeneous characteristics represented by the probabilities of each entry to belong to the groups. The number of groups in the analysis is controlled dynamically by rendering the groups as 'alive' and 'dormant' depending upon the number of entities classified within them. Using standard Metropolis-Hastings and Gibbs sampling algorithms, we constructed a sampler to approximate posterior moments and grouping probabilities. Since this method does not require the definition of similarity measures, it is especially suitable for data mining and knowledge discovery in biological databases. We applied BClass to classify genes in RegulonDB, a database specialized in information about the transcriptional regulation of gene expression in the bacterium Escherichia coli. The classification obtained is consistent with current knowledge and allowed prediction of missing values for a number of genes. BClass is object-oriented and fully programmed in Lisp-Stat. The output grouping probabilities are analyzed and interpreted using graphical (dynamically linked plots) and query-based approaches. We discuss the advantages of using Lisp-Stat as a programming language as well as the problems we faced when the data volume increased exponentially due to the ever-growing number of genomic projects.
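
    The central output of such a mixture-model classification, posterior probabilities of group membership for each entry, can be illustrated with a standard Gaussian mixture on continuous features; BClass itself also handles categorical variables and a dynamically controlled number of groups, which this sketch does not.

      # Sketch: posterior membership probabilities from a Gaussian mixture, i.e.
      # turning heterogeneous measurements into per-group probabilities.
      # Continuous features only; data and group structure are illustrative.
      import numpy as np
      from sklearn.mixture import GaussianMixture

      rng = np.random.default_rng(5)
      # Two synthetic "gene" groups described by two continuous features.
      group_a = rng.normal([0.0, 0.0], 1.0, (100, 2))
      group_b = rng.normal([4.0, 4.0], 1.0, (100, 2))
      X = np.vstack([group_a, group_b])

      gmm = GaussianMixture(n_components=2, random_state=0).fit(X)
      membership = gmm.predict_proba(X)        # rows sum to 1: probability per group
      print("first five membership rows:")
      print(membership[:5].round(3))
      # Downstream analyses can work with these probabilities instead of hard labels.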

  20. Dissection of a Complex Disease Susceptibility Region Using a Bayesian Stochastic Search Approach to Fine Mapping.

    Directory of Open Access Journals (Sweden)

    Chris Wallace

    2015-06-01

    Identification of candidate causal variants in regions associated with risk of common diseases is complicated by linkage disequilibrium (LD) and multiple association signals. Nonetheless, accurate maps of these variants are needed, both to fully exploit detailed cell specific chromatin annotation data to highlight disease causal mechanisms and cells, and for design of the functional studies that will ultimately be required to confirm causal mechanisms. We adapted a Bayesian evolutionary stochastic search algorithm to the fine mapping problem, and demonstrated its improved performance over conventional stepwise and regularised regression through simulation studies. We then applied it to fine map the established multiple sclerosis (MS) and type 1 diabetes (T1D) associations in the IL-2RA (CD25) gene region. For T1D, both stepwise and stochastic search approaches identified four T1D association signals, with the major effect tagged by the single nucleotide polymorphism, rs12722496. In contrast, for MS, the stochastic search found two distinct competing models: a single candidate causal variant, tagged by rs2104286 and reported previously using stepwise analysis; and a more complex model with two association signals, one of which was tagged by the major T1D associated rs12722496 and the other by rs56382813. There is low to moderate LD between rs2104286 and both rs12722496 and rs56382813 (r2 ≃ 0.3), and our two-SNP model could not be recovered through a forward stepwise search after conditioning on rs2104286. Both signals in the two-variant model for MS affect CD25 expression on distinct subpopulations of CD4+ T cells, which are key cells in the autoimmune process. The results support a shared causal variant for T1D and MS. Our study illustrates the benefit of using a purposely designed model search strategy for fine mapping and the advantage of combining disease and protein expression data.

  1. Understanding uncertainty in temperature effects on vector-borne disease: a Bayesian approach

    Science.gov (United States)

    Johnson, Leah R.; Ben-Horin, Tal; Lafferty, Kevin D.; McNally, Amy; Mordecai, Erin A.; Paaijmans, Krijn P.; Pawar, Samraat; Ryan, Sadie J.

    2015-01-01

    Extrinsic environmental factors influence the distribution and population dynamics of many organisms, including insects that are of concern for human health and agriculture. This is particularly true for vector-borne infectious diseases like malaria, which is a major source of morbidity and mortality in humans. Understanding the mechanistic links between environment and population processes for these diseases is key to predicting the consequences of climate change on transmission and for developing effective interventions. An important measure of the intensity of disease transmission is the reproductive number R0. However, understanding the mechanisms linking R0 and temperature, an environmental factor driving disease risk, can be challenging because the data available for parameterization are often poor. To address this, we show how a Bayesian approach can help identify critical uncertainties in components of R0 and how this uncertainty is propagated into the estimate of R0. Most notably, we find that different parameters dominate the uncertainty at different temperature regimes: bite rate from 15°C to 25°C; fecundity across all temperatures, but especially ~25–32°C; mortality from 20°C to 30°C; parasite development rate at ~15–16°C and again at ~33–35°C. Focusing empirical studies on these parameters and corresponding temperature ranges would be the most efficient way to improve estimates of R0. While we focus on malaria, our methods apply to improving process-based models more generally, including epidemiological, physiological niche, and species distribution models.
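
    The propagation step, pushing uncertain temperature-dependent traits through an R0 expression by Monte Carlo, can be sketched generically as below; the trait curves and the R0 formula are simplified placeholders, not the fitted thermal responses or posteriors from the study.

      # Sketch: propagate parameter uncertainty into R0(T) by Monte Carlo.
      # Trait curves, parameter distributions and the R0 expression are
      # simplified placeholders, not the study's fitted thermal responses.
      import numpy as np

      rng = np.random.default_rng(6)
      T = np.linspace(15, 35, 41)                  # temperature grid (deg C)
      n_draws = 2000

      def bite_rate(T, peak):  return np.maximum(0.0, 0.3 - 0.002 * (T - peak) ** 2)
      def mortality(T, base):  return base + 0.004 * np.abs(T - 25)
      def dev_rate(T, slope):  return np.maximum(1e-4, slope * (T - 14))

      r0_draws = np.empty((n_draws, T.size))
      for i in range(n_draws):
          a = bite_rate(T, peak=rng.normal(27, 1.0))        # uncertain trait parameters
          mu = mortality(T, base=rng.normal(0.08, 0.01))
          pdr = dev_rate(T, slope=rng.normal(0.01, 0.002))
          # Toy R0-like quantity: transmission rises with a^2, falls with mortality.
          r0_draws[i] = a**2 * np.exp(-mu / pdr) / mu

      median = np.median(r0_draws, axis=0)
      lo, hi = np.percentile(r0_draws, [2.5, 97.5], axis=0)
      print("temperature of peak median R0:", T[median.argmax()], "deg C")
      # The interval width (hi - lo) at each temperature shows which regimes
      # contribute most uncertainty, mirroring the analysis described above.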

  2. An unbiased Bayesian approach to functional connectomics implicates social-communication networks in autism.

    Science.gov (United States)

    Venkataraman, Archana; Duncan, James S; Yang, Daniel Y-J; Pelphrey, Kevin A

    2015-01-01

    Resting-state functional magnetic resonance imaging (rsfMRI) studies reveal a complex pattern of hyper- and hypo-connectivity in children with autism spectrum disorder (ASD). Whereas rsfMRI findings tend to implicate the default mode network and subcortical areas in ASD, task fMRI and behavioral experiments point to social dysfunction as a unifying impairment of the disorder. Here, we leverage a novel Bayesian framework for whole-brain functional connectomics that aggregates population differences in connectivity to localize a subset of foci that are most affected by ASD. Our approach is entirely data-driven and does not impose spatial constraints on the region foci or dictate the trajectory of altered functional pathways. We apply our method to data from the openly shared Autism Brain Imaging Data Exchange (ABIDE) and pinpoint two intrinsic functional networks that distinguish ASD patients from typically developing controls. One network involves foci in the right temporal pole, left posterior cingulate cortex, left supramarginal gyrus, and left middle temporal gyrus. Automated decoding of this network by the Neurosynth meta-analytic database suggests high-level concepts of "language" and "comprehension" as the likely functional correlates. The second network consists of the left banks of the superior temporal sulcus, right posterior superior temporal sulcus extending into temporo-parietal junction, and right middle temporal gyrus. Associated functionality of these regions includes "social" and "person". The abnormal pathways emanating from the above foci indicate that ASD patients simultaneously exhibit reduced long-range or inter-hemispheric connectivity and increased short-range or intra-hemispheric connectivity. Our findings reveal new insights into ASD and highlight possible neural mechanisms of the disorder.

  3. The relevance sample-feature machine: a sparse Bayesian learning approach to joint feature-sample selection.

    Science.gov (United States)

    Mohsenzadeh, Yalda; Sheikhzadeh, Hamid; Reza, Ali M; Bathaee, Najmehsadat; Kalayeh, Mahdi M

    2013-12-01

    This paper introduces a novel sparse Bayesian machine-learning algorithm for embedded feature selection in classification tasks. Our proposed algorithm, called the relevance sample feature machine (RSFM), is able to simultaneously choose the relevance samples and also the relevance features for regression or classification problems. We propose a separable model in feature and sample domains. Adopting a Bayesian approach and using Gaussian priors, the learned model by RSFM is sparse in both sample and feature domains. The proposed algorithm is an extension of the standard RVM algorithm, which only opts for sparsity in the sample domain. Experimental comparisons on synthetic as well as benchmark data sets show that RSFM is successful in both feature selection (eliminating the irrelevant features) and accurate classification. The main advantages of our proposed algorithm are: less system complexity, better generalization and avoiding overfitting, and less computational cost during the testing stage.

  4. A Bayesian network driven approach to model the transcriptional response to nitric oxide in Saccharomyces cerevisiae.

    Directory of Open Access Journals (Sweden)

    Jingchun Zhu

    The transcriptional response to exogenously supplied nitric oxide in Saccharomyces cerevisiae was modeled using an integrated framework of Bayesian network learning and experimental feedback. A Bayesian network learning algorithm was used to generate network models of transcriptional output, followed by model verification and revision through experimentation. Using this framework, we generated a network model of the yeast transcriptional response to nitric oxide and a panel of other environmental signals. We discovered two environmental triggers, the diauxic shift and glucose repression, that affected the observed transcriptional profile. The computational method predicted the transcriptional control of yeast flavohemoglobin YHB1 by glucose repression, which was subsequently experimentally verified. A freely available software application, ExpressionNet, was developed to derive Bayesian network models from a combination of gene expression profile clusters, genetic information and experimental conditions.

  5. BAYESIAN APPROACH TO THE PROCESS OF IDENTIFICATION OF THE DETERMINANTS OF INNOVATIVENESS

    Directory of Open Access Journals (Sweden)

    Marta Czyżewska

    2014-08-01

    Bayesian belief networks are applied in determining the most important factors of the innovativeness level of national economies. The paper is divided into two parts. The first presents the basic theory of Bayesian networks whereas in the second, the belief networks have been generated by an in-house developed computer system called BeliefSEEKER which was implemented to generate the determinants influencing the innovativeness level of national economies. Qualitative analysis of the generated belief networks provided a way to define a set of the most important dimensions influencing the innovativeness level of economies and then the indicators that form these dimensions. It has been proven that Bayesian networks are very effective methods for multidimensional analysis and forming conclusions and recommendations regarding the strength of each innovative determinant influencing the overall performance of a country’s economy.

  6. Discriminative Bayesian Dictionary Learning for Classification.

    Science.gov (United States)

    Akhtar, Naveed; Shafait, Faisal; Mian, Ajmal

    2016-12-01

    We propose a Bayesian approach to learn discriminative dictionaries for sparse representation of data. The proposed approach infers probability distributions over the atoms of a discriminative dictionary using a finite approximation of Beta Process. It also computes sets of Bernoulli distributions that associate class labels to the learned dictionary atoms. This association signifies the selection probabilities of the dictionary atoms in the expansion of class-specific data. Furthermore, the non-parametric character of the proposed approach allows it to infer the correct size of the dictionary. We exploit the aforementioned Bernoulli distributions in separately learning a linear classifier. The classifier uses the same hierarchical Bayesian model as the dictionary, which we present along the analytical inference solution for Gibbs sampling. For classification, a test instance is first sparsely encoded over the learned dictionary and the codes are fed to the classifier. We performed experiments for face and action recognition; and object and scene-category classification using five public datasets and compared the results with state-of-the-art discriminative sparse representation approaches. Experiments show that the proposed Bayesian approach consistently outperforms the existing approaches.

  7. A Bayesian approach to identifying structural nonlinearity using free-decay response: Application to damage detection in composites

    Science.gov (United States)

    Nichols, J.M.; Link, W.A.; Murphy, K.D.; Olson, C.C.

    2010-01-01

    This work discusses a Bayesian approach to approximating the distribution of parameters governing nonlinear structural systems. Specifically, we use a Markov Chain Monte Carlo method for sampling the posterior parameter distributions, thus producing both point and interval estimates for parameters. The method is first used to identify both linear and nonlinear parameters in a multiple degree-of-freedom structural system using free-decay vibrations. The approach is then applied to the problem of identifying the location, size, and depth of delamination in a model composite beam. The influence of additive Gaussian noise on the response data is explored with respect to the quality of the resulting parameter estimates.
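
    A minimal random-walk Metropolis-Hastings sampler for this kind of problem (here, natural frequency and damping ratio of a single degree-of-freedom free-decay response) might look like the following; the model, priors, proposal scales, and data are illustrative, not the paper's implementation.

      # Sketch: Metropolis-Hastings sampling of the posterior over natural
      # frequency (wn) and damping ratio (zeta) from a noisy free-decay record.
      # Model, priors, proposal scales and data are illustrative only.
      import numpy as np

      rng = np.random.default_rng(7)
      t = np.linspace(0, 2, 400)

      def free_decay(t, wn, zeta, x0=1.0):
          wd = wn * np.sqrt(1 - zeta**2)
          return x0 * np.exp(-zeta * wn * t) * np.cos(wd * t)

      sigma = 0.1                                            # noise level
      y = free_decay(t, 20.0, 0.03) + rng.normal(0, sigma, t.size)

      def log_post(theta):
          wn, zeta = theta
          if not (1.0 < wn < 100.0 and 0.001 < zeta < 0.5):  # flat priors on a box
              return -np.inf
          resid = y - free_decay(t, wn, zeta)
          return -0.5 * np.sum(resid**2) / sigma**2

      # Random-walk MH, initialized near a coarse spectral estimate of the frequency.
      theta, samples = np.array([19.8, 0.05]), []
      lp = log_post(theta)
      for _ in range(20000):
          proposal = theta + rng.normal(0, [0.02, 0.001])
          lp_prop = log_post(proposal)
          if np.log(rng.uniform()) < lp_prop - lp:           # accept/reject
              theta, lp = proposal, lp_prop
          samples.append(theta)

      post = np.array(samples[5000:])                        # discard burn-in
      print("posterior mean (wn, zeta):", post.mean(axis=0).round(4))
      print("95% intervals:", np.percentile(post, [2.5, 97.5], axis=0).round(4))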

  8. Understanding Computational Bayesian Statistics

    CERN Document Server

    Bolstad, William M

    2011-01-01

    A hands-on introduction to computational statistics from a Bayesian point of view. Providing a solid grounding in statistics while uniquely covering the topics from a Bayesian perspective, Understanding Computational Bayesian Statistics successfully guides readers through this new, cutting-edge approach. With its hands-on treatment of the topic, the book shows how samples can be drawn from the posterior distribution when the formula giving its shape is all that is known, and how Bayesian inferences can be based on these samples from the posterior. These ideas are illustrated on common statistical models.

  9. Bayesian statistics an introduction

    CERN Document Server

    Lee, Peter M

    2012-01-01

    Bayesian Statistics is the school of thought that combines prior beliefs with the likelihood of a hypothesis to arrive at posterior beliefs. The first edition of Peter Lee’s book appeared in 1989, but the subject has moved ever onwards, with increasing emphasis on Monte Carlo based techniques. This new fourth edition looks at recent techniques such as variational methods, Bayesian importance sampling, approximate Bayesian computation and Reversible Jump Markov Chain Monte Carlo (RJMCMC), providing a concise account of the way in which the Bayesian approach to statistics develops.

  10. Understanding the Uncertainty of an Effectiveness-Cost Ratio in Educational Resource Allocation: A Bayesian Approach

    Science.gov (United States)

    Pan, Yilin

    2016-01-01

    Given the necessity to bridge the gap between what happened and what is likely to happen, this paper aims to explore how to apply Bayesian inference to cost-effectiveness analysis so as to capture the uncertainty of a ratio-type efficiency measure. The first part of the paper summarizes the characteristics of the evaluation data that are commonly…

  11. A Bayesian network approach for causal inferences in pesticide risk assessment and management

    Science.gov (United States)

    Pesticide risk assessment and management must balance societal benefits and ecosystem protection, based on quantified risks and the strength of the causal linkages between uses of the pesticide and socioeconomic and ecological endpoints of concern. A Bayesian network (BN) is a gr...

  12. Bayesian Inference Methods for Sparse Channel Estimation

    DEFF Research Database (Denmark)

    Pedersen, Niels Lovmand

    2013-01-01

    This thesis deals with sparse Bayesian learning (SBL) with application to radio channel estimation. As opposed to the classical approach for sparse signal representation, we focus on the problem of inferring complex signals. Our investigations within SBL constitute the basis for the development...... of Bayesian inference algorithms for sparse channel estimation. Sparse inference methods aim at finding the sparse representation of a signal given in some overcomplete dictionary of basis vectors. Within this context, one of our main contributions to the field of SBL is a hierarchical representation...... complex prior representation achieve improved sparsity representations in low signalto- noise ratio as opposed to state-of-the-art sparse estimators. This result is of particular importance for the applicability of the algorithms in the field of channel estimation. We then derive various iterative...

  13. Extracting a Whisper from the DIN: A Bayesian-Inductive Approach to Learning an Anticipatory Model of Cavitation

    Energy Technology Data Exchange (ETDEWEB)

    Kercel, S.W.

    1999-11-07

    For several reasons, Bayesian parameter estimation is superior to other methods for inductively learning a model for an anticipatory system. Since it exploits prior knowledge, the analysis begins from a more advantageous starting point than other methods. Also, since "nuisance parameters" can be removed from the Bayesian analysis, the description of the model need not be as complete as is necessary for such methods as matched filtering. In the limit of perfectly random noise and a perfect description of the model, the signal-to-noise ratio improves as the square root of the number of samples in the data. Even with the imperfections of real-world data, Bayesian methods approach this ideal limit of performance more closely than other methods. These capabilities provide a strategy for addressing a major unsolved problem in pump operation: the identification of precursors of cavitation. Cavitation causes immediate degradation of pump performance and ultimate destruction of the pump. However, the most efficient point to operate a pump is just below the threshold of cavitation. It might be hoped that a straightforward method to minimize pump cavitation damage would be to simply adjust the operating point until the inception of cavitation is detected and then to slightly readjust the operating point to let the cavitation vanish. However, due to the continuously evolving state of the fluid moving through the pump, the threshold of cavitation tends to wander. What is needed is to anticipate cavitation, and this requires the detection and identification of precursor features that occur just before cavitation starts.
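
    The square-root-of-N scaling mentioned above is easy to check numerically; the toy average below (not cavitation data) simply averages N noisy samples of a constant signal and compares the empirical SNR with sqrt(N)/2.

      # Toy check of the sqrt(N) improvement in signal-to-noise ratio when
      # averaging N noisy samples of a constant signal (illustrative, not pump data).
      import numpy as np

      rng = np.random.default_rng(8)
      signal, noise_sd = 1.0, 2.0
      for n in (64, 256, 1024, 4096):
          means = rng.normal(signal, noise_sd, (1000, n)).mean(axis=1)
          snr = signal / means.std()
          print(f"N = {n:5d}   empirical SNR = {snr:6.1f}   sqrt(N)/2 = {np.sqrt(n)/2:6.1f}")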

  14. Using auxiliary information to improve wildlife disease surveillance when infected animals are not detected: a Bayesian approach

    Science.gov (United States)

    Heisey, Dennis M.; Jennelle, Christopher S.; Russell, Robin E.; Walsh, Daniel P.

    2014-01-01

    There are numerous situations in which it is important to determine whether a particular disease of interest is present in a free-ranging wildlife population. However adequate disease surveillance can be labor-intensive and expensive and thus there is substantial motivation to conduct it as efficiently as possible. Surveillance is often based on the assumption of a simple random sample, but this can almost always be improved upon if there is auxiliary information available about disease risk factors. We present a Bayesian approach to disease surveillance when auxiliary risk information is available which will usually allow for substantial improvements over simple random sampling. Others have employed risk weights in surveillance, but this can result in overly optimistic statements regarding freedom from disease due to not accounting for the uncertainty in the auxiliary information; our approach remedies this. We compare our Bayesian approach to a published example of risk weights applied to chronic wasting disease in deer in Colorado, and we also present calculations to examine when uncertainty in the auxiliary information has a serious impact on the risk weights approach. Our approach allows “apples-to-apples” comparisons of surveillance efficiencies between units where heterogeneous samples were collected
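
    A minimal numerical sketch of the core idea, letting the auxiliary risk information be uncertain rather than a fixed weight when computing surveillance sensitivity, is given below; the prevalences, sample sizes, and prior on the relative risk are illustrative, not the published model.

      # Sketch: surveillance sensitivity (probability of detecting at least one
      # positive if disease is present at a design prevalence), integrating over
      # uncertainty in the risk information. All numbers are illustrative.
      import numpy as np

      rng = np.random.default_rng(9)
      design_prev = 0.01                  # design prevalence: 1% overall
      n_low, n_high = 60, 30              # samples tested in low-/high-risk strata
      frac_high = 0.2                     # high-risk fraction of the population

      # Relative risk of the high-risk stratum treated as uncertain, not fixed.
      rel_risk = rng.lognormal(mean=np.log(3.0), sigma=0.8, size=50000)
      prev_low = design_prev / (1 - frac_high + frac_high * rel_risk)
      prev_high = rel_risk * prev_low
      detect = 1 - (1 - prev_low) ** n_low * (1 - prev_high) ** n_high

      # Plug-in calculation with a fixed risk weight of 3 for comparison.
      p_low = design_prev / (1 - frac_high + frac_high * 3.0)
      detect_fixed = 1 - (1 - p_low) ** n_low * (1 - 3.0 * p_low) ** n_high

      lo5, hi95 = np.percentile(detect, [5, 95])
      print(f"plug-in detection probability (fixed weight): {detect_fixed:.3f}")
      print(f"uncertain weight: mean {detect.mean():.3f}, 5th-95th pct {lo5:.3f}-{hi95:.3f}")
      # The lower tail shows how treating the risk information as exact can
      # overstate the sensitivity achieved, i.e. be overly optimistic about freedom.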

  15. A Bayesian Approach to Identifying New Risk Factors for Dementia: A Nationwide Population-Based Study.

    Science.gov (United States)

    Wen, Yen-Hsia; Wu, Shihn-Sheng; Lin, Chun-Hung Richard; Tsai, Jui-Hsiu; Yang, Pinchen; Chang, Yang-Pei; Tseng, Kuan-Hua

    2016-05-01

    Dementia is one of the most disabling and burdensome health conditions worldwide. In this study, we identified new potential risk factors for dementia from nationwide longitudinal population-based data by using Bayesian statistics. We first tested the consistency of the results obtained using Bayesian statistics with those obtained using classical frequentist probability for 4 recognized risk factors for dementia, namely severe head injury, depression, diabetes mellitus, and vascular diseases. Then, we used Bayesian statistics to verify 2 new potential risk factors for dementia, namely hearing loss and senile cataract, determined from Taiwan's National Health Insurance Research Database. We included a total of 6546 (6.0%) patients diagnosed with dementia. We observed older age, female sex, and lower income as independent risk factors for dementia. Moreover, we verified the 4 recognized risk factors for dementia in the older Taiwanese population; their odds ratios (ORs) ranged from 3.469 to 1.207. Furthermore, we observed that hearing loss (OR = 1.577) and senile cataract (OR = 1.549) were associated with an increased risk of dementia. We found that the results obtained using Bayesian statistics for assessing risk factors for dementia, such as head injury, depression, DM, and vascular diseases, were consistent with those obtained using classical frequentist probability. Moreover, hearing loss and senile cataract were found to be potential risk factors for dementia in the older Taiwanese population. Bayesian statistics could help clinicians explore other potential risk factors for dementia and develop appropriate treatment strategies for these patients.

  16. A hierarchical approach for online temporal lobe seizure detection in long-term intracranial EEG recordings

    Science.gov (United States)

    Liang, Sheng-Fu; Chen, Yi-Chun; Wang, Yu-Lin; Chen, Pin-Tzu; Yang, Chia-Hsiang; Chiueh, Herming

    2013-08-01

    Objective. Around 1% of the world's population is affected by epilepsy, and nearly 25% of patients cannot be treated effectively by available therapies. The presence of closed-loop seizure-triggered stimulation provides a promising solution for these patients. Realization of fast, accurate, and energy-efficient seizure detection is the key to such implants. In this study, we propose a two-stage on-line seizure detection algorithm with low-energy consumption for temporal lobe epilepsy (TLE). Approach. Multi-channel signals are processed through independent component analysis and the most representative independent component (IC) is automatically selected to eliminate artifacts. Seizure-like intracranial electroencephalogram (iEEG) segments are fast detected in the first stage of the proposed method and these seizures are confirmed in the second stage. The conditional activation of the second-stage signal processing reduces the computational effort, and hence energy, since most of the non-seizure events are filtered out in the first stage. Main results. Long-term iEEG recordings of 11 patients who suffered from TLE were analyzed via leave-one-out cross validation. The proposed method has a detection accuracy of 95.24%, a false alarm rate of 0.09/h, and an average detection delay time of 9.2 s. For the six patients with mesial TLE, a detection accuracy of 100.0%, a false alarm rate of 0.06/h, and an average detection delay time of 4.8 s can be achieved. The hierarchical approach provides a 90% energy reduction, yielding effective and energy-efficient implementation for real-time epileptic seizure detection. Significance. An on-line seizure detection method that can be applied to monitor continuous iEEG signals of patients who suffered from TLE was developed. An IC selection strategy to automatically determine the most seizure-related IC for seizure detection was also proposed. The system has advantages of (1) high detection accuracy, (2) low false alarm, (3) short

  17. Coastal vulnerability assessment using Fuzzy Logic and Bayesian Belief Network approaches

    Science.gov (United States)

    Valentini, Emiliana; Nguyen Xuan, Alessandra; Filipponi, Federico; Taramelli, Andrea

    2017-04-01

    Natural hazards such as sea surge are threatening low-lying coastal plains. In order to deal with disturbances a deeper understanding of benefits deriving from ecosystem services assessment, management and planning can contribute to enhance the resilience of coastal systems. In this frame assessing current and future vulnerability is a key concern of many Systems Of Systems SOS (social, ecological, institutional) that deals with several challenges like the definition of Essential Variables (EVs) able to synthesize the required information, the assignment of different weight to be attributed to each considered variable, the selection of method for combining the relevant variables. It is widely recognized that ecosystems contribute to human wellbeing and then their conservation increases the resilience capacities and could play a key role in reducing climate related risk and thus physical and economic losses. A way to fully exploit ecosystems potential, i.e. their so called ecopotential (see H2020 EU funded project "ECOPOTENTIAL"), is the Ecosystem based Adaptation (EbA): the use of ecosystem services as part of an adaptation strategy. In order to provide insight in understanding regulating ecosystem services to surge and which variables influence them and to make the best use of available data and information (EO products, in situ data and modelling), we propose a multi-component surge vulnerability assessment, focusing on coastal sandy dunes as natural barriers. The aim is to combine together eco-geomorphological and socio-economic variables with the hazard component on the base of different approaches: 1) Fuzzy Logic; 2) Bayesian Belief Networks (BBN). The Fuzzy Logic approach is very useful to get a spatialized information and it can easily combine variables coming from different sources. It provides information on vulnerability moving along-shore and across-shore (beach-dune transect), highlighting the variability of vulnerability conditions in the spatial
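
    The Fuzzy Logic side of the comparison can be sketched as below: each variable is mapped to a [0, 1] membership value for the fuzzy set "vulnerable" and the memberships are combined per transect; the variables, breakpoints, and weights are illustrative placeholders, not the study's parameterization.

      # Sketch: fuzzy-logic combination of coastal variables into a vulnerability
      # score per beach-dune transect; variables, breakpoints and weights are
      # illustrative placeholders only.
      import numpy as np

      def ramp(x, lo, hi):
          """Linear membership rising from 0 at lo to 1 at hi (reversed when lo > hi)."""
          return np.clip((np.asarray(x, dtype=float) - lo) / (hi - lo), 0.0, 1.0)

      # Example transect attributes: dune height (m), beach width (m), surge level (m).
      dune_height = np.array([1.5, 4.0, 7.5, 2.5])
      beach_width = np.array([20.0, 60.0, 150.0, 35.0])
      surge_level = np.array([1.2, 0.8, 0.5, 1.5])

      # Memberships in "vulnerable": low dunes, narrow beaches and high surge
      # all raise vulnerability.
      m_dune = ramp(dune_height, 8.0, 1.0)
      m_beach = ramp(beach_width, 120.0, 10.0)
      m_surge = ramp(surge_level, 0.3, 1.5)

      # Combine with standard fuzzy operators and a weighted mean.
      v_and = np.minimum.reduce([m_dune, m_beach, m_surge])    # conservative (AND)
      v_or = np.maximum.reduce([m_dune, m_beach, m_surge])     # worst-case (OR)
      v_mean = np.average([m_dune, m_beach, m_surge], axis=0, weights=[0.4, 0.3, 0.3])
      print("AND:", v_and.round(2), " OR:", v_or.round(2), " weighted mean:", v_mean.round(2))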

  18. Using more than the oldest fossils: dating osmundaceae with three Bayesian clock approaches.

    Science.gov (United States)

    Grimm, Guido W; Kapli, Paschalia; Bomfleur, Benjamin; McLoughlin, Stephen; Renner, Susanne S

    2015-05-01

    A major concern in molecular clock dating is how to use information from the fossil record to calibrate genetic distances from DNA sequences. Here we apply three Bayesian dating methods that differ in how calibration is achieved-"node dating" (ND) in BEAST, "total evidence" (TE) dating in MrBayes, and the "fossilized birth-death" (FBD) in FDPPDiv-to infer divergence times in the royal ferns. Osmundaceae have 16-17 species in four genera, two mainly in the Northern Hemisphere and two in South Africa and Australasia; they are the sister clade to the remaining leptosporangiate ferns. Their fossil record consists of at least 150 species in ∼17 genera. For ND, we used the five oldest fossils, whereas for TE and FBD dating, which do not require forcing fossils to nodes and thus can use more fossils, we included up to 36 rhizomes and frond compression/impression fossils, which for TE dating were scored for 33 morphological characters. We also subsampled 10%, 25%, and 50% of the 36 fossils to assess model sensitivity. FBD-derived divergence ages were generally greater than those inferred from ND; two of seven TE-derived ages agreed with FBD-obtained ages, the others were much younger or much older than ND or FBD ages. We prefer the FBD-derived ages because they best fit the Osmundales fossil record (including Triassic fossils not used in our study). Under the preferred model, the clade encompassing extant Osmundaceae (and many fossils) dates to the latest Paleozoic to Early Triassic; divergences of the extant species occurred during the Neogene. Under the assumption of constant speciation and extinction rates, the FBD approach yielded speciation and extinction rates that overlapped those obtained from just neontological data. However, FBD estimates of speciation and extinction are sensitive to violations in the assumption of continuous fossil sampling; therefore, these estimates should be treated with caution. © The Author(s) 2014. Published by Oxford University Press

  19. A full-spectral Bayesian reconstruction approach based on the material decomposition model applied in dual-energy computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Cai, C. [CEA, LIST, 91191 Gif-sur-Yvette, France and CNRS, SUPELEC, UNIV PARIS SUD, L2S, 3 rue Joliot-Curie, 91192 Gif-sur-Yvette (France); Rodet, T.; Mohammad-Djafari, A. [CNRS, SUPELEC, UNIV PARIS SUD, L2S, 3 rue Joliot-Curie, 91192 Gif-sur-Yvette (France); Legoupil, S. [CEA, LIST, 91191 Gif-sur-Yvette (France)

    2013-11-15

    Purpose: Dual-energy computed tomography (DECT) makes it possible to get two fractions of basis materials without segmentation. One is the soft-tissue equivalent water fraction and the other is the hard-matter equivalent bone fraction. Practical DECT measurements are usually obtained with polychromatic x-ray beams. Existing reconstruction approaches based on linear forward models without counting the beam polychromaticity fail to estimate the correct decomposition fractions and result in beam-hardening artifacts (BHA). The existing BHA correction approaches either need to refer to calibration measurements or suffer from the noise amplification caused by the negative-log preprocessing and the ill-conditioned water and bone separation problem. To overcome these problems, statistical DECT reconstruction approaches based on nonlinear forward models counting the beam polychromaticity show great potential for giving accurate fraction images.Methods: This work proposes a full-spectral Bayesian reconstruction approach which allows the reconstruction of high quality fraction images from ordinary polychromatic measurements. This approach is based on a Gaussian noise model with unknown variance assigned directly to the projections without taking negative-log. Referring to Bayesian inferences, the decomposition fractions and observation variance are estimated by using the joint maximum a posteriori (MAP) estimation method. Subject to an adaptive prior model assigned to the variance, the joint estimation problem is then simplified into a single estimation problem. It transforms the joint MAP estimation problem into a minimization problem with a nonquadratic cost function. To solve it, the use of a monotone conjugate gradient algorithm with suboptimal descent steps is proposed.Results: The performance of the proposed approach is analyzed with both simulated and experimental data. The results show that the proposed Bayesian approach is robust to noise and materials. It is also

  20. A full-spectral Bayesian reconstruction approach based on the material decomposition model applied in dual-energy computed tomography.

    Science.gov (United States)

    Cai, C; Rodet, T; Legoupil, S; Mohammad-Djafari, A

    2013-11-01

    Dual-energy computed tomography (DECT) makes it possible to get two fractions of basis materials without segmentation. One is the soft-tissue equivalent water fraction and the other is the hard-matter equivalent bone fraction. Practical DECT measurements are usually obtained with polychromatic x-ray beams. Existing reconstruction approaches based on linear forward models without counting the beam polychromaticity fail to estimate the correct decomposition fractions and result in beam-hardening artifacts (BHA). The existing BHA correction approaches either need to refer to calibration measurements or suffer from the noise amplification caused by the negative-log preprocessing and the ill-conditioned water and bone separation problem. To overcome these problems, statistical DECT reconstruction approaches based on nonlinear forward models counting the beam polychromaticity show great potential for giving accurate fraction images. This work proposes a full-spectral Bayesian reconstruction approach which allows the reconstruction of high quality fraction images from ordinary polychromatic measurements. This approach is based on a Gaussian noise model with unknown variance assigned directly to the projections without taking negative-log. Referring to Bayesian inferences, the decomposition fractions and observation variance are estimated by using the joint maximum a posteriori (MAP) estimation method. Subject to an adaptive prior model assigned to the variance, the joint estimation problem is then simplified into a single estimation problem. It transforms the joint MAP estimation problem into a minimization problem with a nonquadratic cost function. To solve it, the use of a monotone conjugate gradient algorithm with suboptimal descent steps is proposed. The performance of the proposed approach is analyzed with both simulated and experimental data. The results show that the proposed Bayesian approach is robust to noise and materials. It is also necessary to have the

  1. Hierarchical Control of Droop-Controlled DC and AC Microgrids - A General Approach Towards Standardization

    DEFF Research Database (Denmark)

    Guerrero, Josep M.; Vásquez, Juan V.; Teodorescu, Remus

    2009-01-01

    DC and AC Microgrids are key elements to integrate renewable and distributed energy resources as well as distributed energy storage systems. In recent years, efforts toward the standardization of these Microgrids have been made. In this sense, this paper presents the hierarchical control derived...

  2. A Hierarchical Approach to Real-time Activity Recognition in Body Sensor Networks

    DEFF Research Database (Denmark)

    Wang, Liang; Gu, Tao; Tao, Xianping

    2012-01-01

    Real-time activity recognition in body sensor networks is an important and challenging task. In this paper, we propose a real-time, hierarchical model to recognize both simple gestures and complex activities using a wireless body sensor network. In this model, we first use a fast and lightweight al...

  3. Detecting Hierarchical Structure in Networks

    DEFF Research Database (Denmark)

    Herlau, Tue; Mørup, Morten; Schmidt, Mikkel Nørgaard

    2012-01-01

    Many real-world networks exhibit hierarchical organization. Previous models of hierarchies within relational data have focused on binary trees; however, for many networks it is unknown whether there is hierarchical structure, and if there is, a binary tree might not account well for it. We propose a generative Bayesian model that is able to infer whether hierarchies are present or not from a hypothesis space encompassing all types of hierarchical tree structures. For efficient inference we propose a collapsed Gibbs sampling procedure that jointly infers a partition and its hierarchical structure. On synthetic and real data we demonstrate that our model can detect hierarchical structure leading to better link-prediction than competing models. Our model can be used to detect if a network exhibits hierarchical structure, thereby leading to a better comprehension and statistical account of the network.

  4. Hierarchical Satellite-based Approach to Global Monitoring of Crop Condition and Food Production

    Science.gov (United States)

    Zheng, Y.; Wu, B.; Gommes, R.; Zhang, M.; Zhang, N.; Zeng, H.; Zou, W.; Yan, N.

    2014-12-01

    The assessment of global food security goes beyond the mere estimate of crop production: It needs to take into account the spatial and temporal patterns of food availability, as well as physical and economic access. Accurate and timely information is essential to both food producers and consumers. Taking advantage of multiple new remote sensing data sources, especially from Chinese satellites, such as FY-2/3A, HJ-1 CCD, CropWatch has expanded the scope of its international analyses through the development of new indicators and an upgraded operational methodology. The new monitoring approach adopts a hierarchical system covering four spatial levels of detail: global (sixty-five Monitoring and Reporting Units, MRU), seven major production zones (MPZ), thirty-one key countries (including China) and "sub-countries." The thirty-one countries encompass more than 80% of both global exports and production of four major crops (maize, rice, soybean and wheat). The methodology resorts to climatic and remote sensing indicators at different scales, using the integrated information to assess global, regional, and national (as well as sub-national) crop environmental condition, crop condition, drought, production, and agricultural trends. The climatic indicators for rainfall, temperature, photosynthetically active radiation (PAR) as well as potential biomass are first analysed at global scale to describe overall crop growing conditions. At MPZ scale, the key indicators pay more attention to crops and include Vegetation health index (VHI), Vegetation condition index (VCI), Cropped arable land fraction (CALF) as well as Cropping intensity (CI). Together, they characterise agricultural patterns, farming intensity and stress. CropWatch carries out detailed crop condition analyses for thirty-one individual countries at the national scale with a comprehensive array of variables and indicators. The Normalized difference vegetation index (NDVI), cropped areas and crop condition are

  5. Bayesian Group Bridge for Bi-level Variable Selection.

    Science.gov (United States)

    Mallick, Himel; Yi, Nengjun

    2017-06-01

    A Bayesian bi-level variable selection method (BAGB: Bayesian Analysis of Group Bridge) is developed for regularized regression and classification. This new development is motivated by grouped data, where generic variables can be divided into multiple groups, with variables in the same group being mechanistically related or statistically correlated. As an alternative to frequentist group variable selection methods, BAGB incorporates structural information among predictors through a group-wise shrinkage prior. Posterior computation proceeds via an efficient MCMC algorithm. In addition to the usual ease-of-interpretation of hierarchical linear models, the Bayesian formulation produces valid standard errors, a feature that is notably absent in the frequentist framework. Empirical evidence of the attractiveness of the method is illustrated by extensive Monte Carlo simulations and real data analysis. Finally, several extensions of this new approach are presented, providing a unified framework for bi-level variable selection in general models with flexible penalties.

  6. An analysis on operational risk in international banking: A Bayesian approach (2007–2011)

    Directory of Open Access Journals (Sweden)

    José Francisco Martínez-Sánchez

    2016-07-01

    Full Text Available This study aims to develop a Bayesian methodology to identify, quantify and measure operational risk in several business lines of commercial banking. To do this, a Bayesian network (BN) model is designed with prior and posterior distributions to estimate the frequency and severity of losses. Using the posterior distributions, the maximum expected loss over a 20-day period is inferred by Monte Carlo simulation. The business lines analyzed are marketing and sales, retail banking and private banking, which together accounted for 88.5% of the losses in 2011. Data were obtained for the period 2007–2011 from the Riskdata Operational Exchange Association (ORX), and external data were provided by qualified experts to complete the missing records or to improve their quality.
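
    As a rough illustration of the frequency/severity logic described above, the sketch below simulates an aggregate loss distribution over a 20-day horizon for three business lines, using a Poisson frequency and lognormal severity per line; all parameter values are hypothetical placeholders, not the ORX figures.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical per-day loss-event frequency (Poisson rate) and severity
# (lognormal parameters) for three business lines -- illustrative values only.
business_lines = {
    "marketing_and_sales": {"lam": 0.8, "mu": 10.5, "sigma": 1.2},
    "retail_banking":      {"lam": 1.5, "mu": 10.0, "sigma": 1.0},
    "private_banking":     {"lam": 0.3, "mu": 11.0, "sigma": 1.4},
}

def simulate_horizon_loss(params, days=20, n_sims=50_000):
    """Total loss over `days` for one business line, simulated n_sims times."""
    losses = np.zeros(n_sims)
    # Number of events over the horizon is Poisson with rate lam * days.
    n_events = rng.poisson(params["lam"] * days, size=n_sims)
    for i, k in enumerate(n_events):
        if k > 0:
            losses[i] = rng.lognormal(params["mu"], params["sigma"], size=k).sum()
    return losses

for name, params in business_lines.items():
    losses = simulate_horizon_loss(params)
    # 99.9% quantile as a crude "maximum expected loss" proxy over 20 days.
    print(f"{name}: mean={losses.mean():,.0f}  q99.9={np.quantile(losses, 0.999):,.0f}")
```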

  7. Fast and accurate Bayesian model criticism and conflict diagnostics using R-INLA

    KAUST Repository

    Ferkingstad, Egil

    2017-10-16

    Bayesian hierarchical models are increasingly popular for realistic modelling and analysis of complex data. This trend is accompanied by the need for flexible, general and computationally efficient methods for model criticism and conflict detection. Usually, a Bayesian hierarchical model incorporates a grouping of the individual data points, as, for example, with individuals in repeated measurement data. In such cases, the following question arises: Are any of the groups “outliers,” or in conflict with the remaining groups? Existing general approaches aiming to answer such questions tend to be extremely computationally demanding when model fitting is based on Markov chain Monte Carlo. We show how group-level model criticism and conflict detection can be carried out quickly and accurately through integrated nested Laplace approximations (INLA). The new method is implemented as a part of the open-source R-INLA package for Bayesian computing (http://r-inla.org).

  8. Empirical vs Bayesian approach for estimating haplotypes from genotypes of unrelated individuals

    Directory of Open Access Journals (Sweden)

    Cheng Jacob

    2007-01-01

    Full Text Available Abstract Background The completion of the HapMap project has stimulated further development of haplotype-based methodologies for disease associations. A key aspect of such development is the statistical inference of individual diplotypes from unphased genotypes. Several methodologies for inferring haplotypes have been developed, but they have not been evaluated extensively to determine which method not only performs well, but also can be easily incorporated in downstream haplotype-based association analyses. In this paper, we attempt to do so. Our evaluation was carried out by comparing the two leading Bayesian methods, implemented in PHASE and HAPLOTYPER, and the two leading empirical methods, implemented in PL-EM and HPlus. We used these methods to analyze real data, namely the dense genotypes on the X chromosome of 30 European and 30 African trios provided by the International HapMap Project, and simulated genotype data. Our conclusions are based on these analyses. Results All programs performed very well on X-chromosome data, with an average similarity index of 0.99 and an average prediction rate of 0.99 for both European and African trios. On simulated data with approximation of coalescence, PHASE implementing the Bayesian method based on the coalescence approximation outperformed other programs on small sample sizes. When the sample size increased, other programs performed as well as PHASE. PL-EM and HPlus implementing empirical methods required much less running time than the programs implementing the Bayesian methods. They required only one-hundredth or one-thousandth of the running time required by PHASE, particularly when analyzing large sample sizes and large numbers of SNPs. Conclusion For large sample sizes (hundreds or more), which most association studies require, the two empirical methods might be used since they infer the haplotypes as accurately as any Bayesian methods and can be incorporated easily into downstream haplotype

  9. Empirical vs Bayesian approach for estimating haplotypes from genotypes of unrelated individuals

    Science.gov (United States)

    Li, Shuying Sue; Cheng, Jacob Jen-Hao; Zhao, Lue Ping

    2007-01-01

    Background The completion of the HapMap project has stimulated further development of haplotype-based methodologies for disease associations. A key aspect of such development is the statistical inference of individual diplotypes from unphased genotypes. Several methodologies for inferring haplotypes have been developed, but they have not been evaluated extensively to determine which method not only performs well, but also can be easily incorporated in downstream haplotype-based association analyses. In this paper, we attempt to do so. Our evaluation was carried out by comparing the two leading Bayesian methods, implemented in PHASE and HAPLOTYPER, and the two leading empirical methods, implemented in PL-EM and HPlus. We used these methods to analyze real data, namely the dense genotypes on the X chromosome of 30 European and 30 African trios provided by the International HapMap Project, and simulated genotype data. Our conclusions are based on these analyses. Results All programs performed very well on X-chromosome data, with an average similarity index of 0.99 and an average prediction rate of 0.99 for both European and African trios. On simulated data with approximation of coalescence, PHASE implementing the Bayesian method based on the coalescence approximation outperformed other programs on small sample sizes. When the sample size increased, other programs performed as well as PHASE. PL-EM and HPlus implementing empirical methods required much less running time than the programs implementing the Bayesian methods. They required only one-hundredth or one-thousandth of the running time required by PHASE, particularly when analyzing large sample sizes and large numbers of SNPs. Conclusion For large sample sizes (hundreds or more), which most association studies require, the two empirical methods might be used since they infer the haplotypes as accurately as any Bayesian methods and can be incorporated easily into downstream haplotype-based analyses such as haplotype

  10. Substantial advantage of a combined Bayesian and genotyping approach in testosterone doping tests.

    Science.gov (United States)

    Schulze, Jenny Jakobsson; Lundmark, Jonas; Garle, Mats; Ekström, Lena; Sottas, Pierre-Edouard; Rane, Anders

    2009-03-01

    Testosterone abuse is conventionally assessed by the urinary testosterone/epitestosterone (T/E) ratio, levels above 4.0 being considered suspicious. A deletion polymorphism in the gene coding for UGT2B17 is strongly associated with reduced testosterone glucuronide (TG) levels in urine. Many of the individuals devoid of the gene would not reach a T/E ratio of 4.0 after testosterone intake. Future test programs will most likely shift from population-based to individual-based T/E cut-off ratios using Bayesian inference. A longitudinal analysis is dependent on an individual's true negative baseline T/E ratio. The aim was to investigate whether it is possible to increase the sensitivity and specificity of the T/E test by addition of UGT2B17 genotype information in a Bayesian framework. A single intramuscular dose of 500 mg testosterone enanthate was given to 55 healthy male volunteers with either two, one or no allele (ins/ins, ins/del or del/del) of the UGT2B17 gene. Urinary excretion of TG and the T/E ratio were measured over 15 days. The Bayesian analysis was conducted to calculate the individual T/E cut-off ratio. When adding the genotype information, the program returned lower individual cut-off ratios in all del/del subjects, increasing the sensitivity of the test considerably. It will be difficult, if not impossible, to discriminate between a true negative baseline T/E value and a false negative one without knowledge of the UGT2B17 genotype. UGT2B17 genotype information is crucial, both to decide which initial cut-off ratio to use for an individual, and for increasing the sensitivity of the Bayesian analysis.
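
    A minimal sketch of the idea, assuming a conjugate normal model for an individual's baseline log(T/E) with genotype-specific prior means (all numerical values are hypothetical, not those reported in the study): the individual cut-off is taken as an upper percentile of the posterior predictive distribution, so the same measured baseline yields a much lower cut-off for del/del carriers.

```python
import numpy as np
from scipy import stats

# Hypothetical genotype-specific priors on an individual's baseline log(T/E);
# del/del carriers excrete far less testosterone glucuronide, so their prior
# mean is much lower. Values are illustrative, not the published ones.
PRIORS = {
    "ins/ins": {"mu0": np.log(1.5), "tau0": 0.6},
    "ins/del": {"mu0": np.log(1.0), "tau0": 0.6},
    "del/del": {"mu0": np.log(0.1), "tau0": 0.6},
}
SIGMA = 0.35  # assumed within-individual sd of log(T/E) between samples

def individual_cutoff(baseline_te, genotype, percentile=99.0):
    """Posterior-predictive upper percentile of T/E used as the individual cutoff."""
    prior = PRIORS[genotype]
    y = np.log(np.asarray(baseline_te, dtype=float))
    n = y.size
    # Conjugate normal-normal update of the individual's mean log(T/E).
    post_prec = 1.0 / prior["tau0"] ** 2 + n / SIGMA ** 2
    post_var = 1.0 / post_prec
    post_mean = post_var * (prior["mu0"] / prior["tau0"] ** 2 + y.sum() / SIGMA ** 2)
    # Predictive distribution of a future sample's log(T/E).
    pred_sd = np.sqrt(post_var + SIGMA ** 2)
    return float(np.exp(stats.norm.ppf(percentile / 100, post_mean, pred_sd)))

# Same measured baseline samples, different genotypes -> very different cutoffs.
for g in PRIORS:
    print(g, round(individual_cutoff([0.4, 0.5, 0.45], g), 2))
```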

  11. Diagnosis of combined faults in Rotary Machinery by Non-Naive Bayesian approach

    Science.gov (United States)

    Asr, Mahsa Yazdanian; Ettefagh, Mir Mohammad; Hassannejad, Reza; Razavi, Seyed Naser

    2017-02-01

    When combined faults occur in different parts of rotating machines, their features are profoundly dependent. Experts are familiar with the characteristics of individual faults, and enough data are available for single faults, but the problem arises when faults are combined and the separation of their characteristics becomes complex. Therefore, experts cannot provide exact information about the symptoms of a combined fault or their quality. To overcome this drawback, a novel method is proposed in this paper. The core idea of the method is to diagnose a combined fault without using combined-fault features as the training data set; only individual-fault features are applied in the training step. For this purpose, after data acquisition and resampling of the obtained vibration signals, Empirical Mode Decomposition (EMD) is utilized to decompose the multi-component signals into Intrinsic Mode Functions (IMFs). Proper IMFs for feature extraction are selected using the correlation coefficient. In the feature-extraction step, the Shannon energy entropy of the IMFs is extracted along with statistical features. Most of the extracted features are strongly dependent. To account for this, a Non-Naive Bayesian Classifier (NNBC) is adopted, which relaxes the fundamental assumption of naive Bayes, i.e., independence among features. To demonstrate the superiority of NNBC, counterpart methods, including the normal naive Bayesian classifier, the kernel naive Bayesian classifier and back-propagation neural networks, were applied and the classification results were compared. Experimental vibration signals, collected from an automobile gearbox, were used to verify the effectiveness of the proposed method. During the classification process, only the features related individually to the healthy state, bearing failure and gear failures were assigned for training the classifier, while combined-fault features (combined gear and bearing failures) were examined as test data. The achieved
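
    A minimal sketch of the feature-extraction step, assuming the IMFs have already been obtained from EMD (toy sinusoids stand in for real IMFs here): IMFs are screened by their correlation coefficient with the raw signal, and the Shannon energy entropy of each retained IMF is computed from its normalized segment-energy distribution.

```python
import numpy as np

def select_imfs(signal, imfs, threshold=0.3):
    """Keep IMFs whose correlation coefficient with the raw signal exceeds a threshold."""
    keep = []
    for imf in imfs:
        r = np.corrcoef(signal, imf)[0, 1]
        if abs(r) >= threshold:
            keep.append(imf)
    return keep

def shannon_energy_entropy(imf, n_bins=32):
    """Shannon entropy of the normalized energy distribution of an IMF."""
    # Split the IMF into segments and compute each segment's share of the energy.
    segments = np.array_split(imf, n_bins)
    energy = np.array([np.sum(s ** 2) for s in segments])
    p = energy / energy.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))

# Toy stand-ins for IMFs obtained from EMD of a gearbox vibration signal.
t = np.linspace(0, 1, 2048)
signal = np.sin(2 * np.pi * 30 * t) + 0.5 * np.sin(2 * np.pi * 5 * t) + 0.1 * np.random.randn(t.size)
fake_imfs = [np.sin(2 * np.pi * 30 * t), 0.5 * np.sin(2 * np.pi * 5 * t), 0.1 * np.random.randn(t.size)]

features = [shannon_energy_entropy(imf) for imf in select_imfs(signal, fake_imfs)]
print(features)
```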

  12. Inferring Gene Regulatory Networks in the Arabidopsis Root Using a Dynamic Bayesian Network Approach.

    Science.gov (United States)

    de Luis Balaguer, Maria Angels; Sozzani, Rosangela

    2017-01-01

    Gene regulatory network (GRN) models have been shown to predict and represent interactions among sets of genes. Here, we first show the basic steps to implement a simple but computationally efficient algorithm to infer GRNs based on dynamic Bayesian networks (DBNs), and we then explain how to approximate DBN-based GRN models with continuous models. In addition, we show a MATLAB implementation of the key steps of this method, which we use to infer an Arabidopsis root GRN.
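
    The paper provides a MATLAB implementation; the Python sketch below is only a schematic stand-in for a first-order DBN structure search with linear-Gaussian conditionals, scoring candidate regulator sets of each gene by the BIC of a one-step-lagged regression (the function names and toy data are hypothetical).

```python
import numpy as np
from itertools import combinations

def bic_score(y, X):
    """BIC of a linear-Gaussian regression of y on X (lower is better)."""
    X1 = np.column_stack([np.ones(len(y)), X]) if X.size else np.ones((len(y), 1))
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    n, k = len(y), X1.shape[1]
    sigma2 = max(resid @ resid / n, 1e-12)
    return n * np.log(sigma2) + k * np.log(n)

def infer_dbn_parents(expr, max_parents=2):
    """First-order DBN structure search: expr is (timepoints x genes)."""
    n_genes = expr.shape[1]
    past, present = expr[:-1], expr[1:]
    parents = {}
    for target in range(n_genes):
        # Start from the empty parent set and keep the lowest-BIC regulator set.
        best = (bic_score(present[:, target], np.empty((len(present), 0))), ())
        for k in range(1, max_parents + 1):
            for regs in combinations(range(n_genes), k):
                score = bic_score(present[:, target], past[:, list(regs)])
                if score < best[0]:
                    best = (score, regs)
        parents[target] = best[1]
    return parents

# Toy time series: gene 1 is driven by gene 0 with a one-step lag.
rng = np.random.default_rng(0)
T, g0 = 60, rng.normal(size=61)
expr = np.zeros((T + 1, 3))
expr[:, 0] = g0
expr[1:, 1] = 0.9 * g0[:-1] + 0.1 * rng.normal(size=T)
expr[:, 2] = rng.normal(size=T + 1)
print(infer_dbn_parents(expr))  # expect gene 1's parent set to contain gene 0
```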

  13. Divining the Level of Corruption. A Bayesian State-Space Approach.

    OpenAIRE

    S. Standaert

    2013-01-01

    This paper outlines a new methodological framework for combining indicators of corruption. The methodology of the World Governance Indicators is extended to make full use of the time structure present in corruption data. The resulting state-space framework is estimated using a Bayesian Gibbs sampler algorithm. The state-space framework holds many advantages from practical, estimation, and theoretical points of view. Most importantly, the indicator significantly increases data availabili...

  14. Predicting drug safety and communicating risk: benefits of a Bayesian approach.

    Science.gov (United States)

    Lazic, Stanley E; Edmunds, Nicholas; Pollard, Christopher E

    2017-11-06

    Drug toxicity is a major source of attrition in drug discovery and development. Pharmaceutical companies routinely use preclinical data to predict clinical outcomes and continue to invest in new assays to improve predictions. However, there are many open questions about how to make the best use of available data, combine diverse data, quantify risk, and communicate risk and uncertainty to enable good decisions. The costs of suboptimal decisions are clear: resources are wasted and patients may be put at risk. We argue that Bayesian methods provide answers to all of these problems and use hERG-mediated QT prolongation as a case study. Benefits of Bayesian machine learning models include intuitive probabilistic statements of risk that incorporate all sources of uncertainty, the option to include diverse data and external information, and visualisations that have a clear link between the output from a statistical model and what this means for risk. Furthermore, Bayesian methods are easy to use with modern software, making their adoption for safety screening straightforward. We include R and Python code to encourage the adoption of these methods.
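
    The kind of probabilistic risk statement advocated here can be illustrated with a small sketch (not the authors' code): a Bayesian logistic regression, fitted by a random-walk Metropolis sampler to hypothetical hERG safety-margin data, yields a posterior probability of QT liability with a credible interval for a new compound.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical training data: log10 hERG safety margin (IC50 / Cmax) for past
# compounds and whether clinically relevant QT prolongation was observed.
margin = np.array([0.1, 0.3, 0.5, 0.8, 1.0, 1.2, 1.5, 1.8, 2.0, 2.5])
qt_event = np.array([1, 1, 1, 1, 0, 1, 0, 0, 0, 0])

def log_post(theta):
    """Log posterior of a logistic regression with weak normal priors."""
    a, b = theta
    eta = a + b * margin
    loglik = np.sum(qt_event * eta - np.log1p(np.exp(eta)))
    logprior = -0.5 * (a ** 2 + b ** 2) / 10 ** 2
    return loglik + logprior

# Random-walk Metropolis sampler.
theta, lp = np.array([0.0, 0.0]), log_post([0.0, 0.0])
samples = []
for _ in range(20_000):
    prop = theta + rng.normal(scale=0.3, size=2)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    samples.append(theta)
samples = np.array(samples[5_000:])  # drop burn-in

# Probabilistic risk statement for a new compound with a 30-fold margin.
new_margin = np.log10(30)
p_event = 1 / (1 + np.exp(-(samples[:, 0] + samples[:, 1] * new_margin)))
print(f"P(QT liability) = {p_event.mean():.2f} "
      f"(95% credible interval {np.quantile(p_event, 0.025):.2f}-{np.quantile(p_event, 0.975):.2f})")
```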

  15. Bayesian modeling using WinBUGS

    CERN Document Server

    Ntzoufras, Ioannis

    2009-01-01

    A hands-on introduction to the principles of Bayesian modeling using WinBUGS. Bayesian Modeling Using WinBUGS provides an easily accessible introduction to the use of WinBUGS programming techniques in a variety of Bayesian modeling settings. The author provides an accessible treatment of the topic, offering readers a smooth introduction to the principles of Bayesian modeling with detailed guidance on the practical implementation of key principles. The book begins with a basic introduction to Bayesian inference and the WinBUGS software and goes on to cover key topics, including: Markov chain Monte Carlo algorithms in Bayesian inference; generalized linear models; Bayesian hierarchical models; predictive distribution and model checking; and Bayesian model and variable evaluation. Computational notes and screen captures illustrate the use of both WinBUGS and R software to apply the discussed techniques. Exercises at the end of each chapter allow readers to test their understanding of the presented concepts and all ...

  16. A computationally efficient Bayesian sequential simulation approach for the assimilation of vast and diverse hydrogeophysical datasets

    Science.gov (United States)

    Nussbaumer, Raphaël; Gloaguen, Erwan; Mariéthoz, Grégoire; Holliger, Klaus

    2016-04-01

    Bayesian sequential simulation (BSS) is a powerful geostatistical technique, which notably has shown significant potential for the assimilation of datasets that are diverse with regard to the spatial resolution and their relationship. However, these types of applications of BSS require a large number of realizations to adequately explore the solution space and to assess the corresponding uncertainties. Moreover, such simulations generally need to be performed on very fine grids in order to adequately exploit the technique's potential for characterizing heterogeneous environments. Correspondingly, the computational cost of BSS algorithms in their classical form is very high, which so far has limited an effective application of this method to large models and/or vast datasets. In this context, it is also important to note that the inherent assumption regarding the independence of the considered datasets is generally regarded as being too strong in the context of sequential simulation. To alleviate these problems, we have revisited the classical implementation of BSS and incorporated two key features to increase the computational efficiency. The first feature is a combined quadrant spiral - superblock search, which targets run-time savings on large grids and adds flexibility with regard to the selection of neighboring points using equal directional sampling and treating hard data and previously simulated points separately. The second feature is a constant path of simulation, which enhances the efficiency for multiple realizations. We have also modified the aggregation operator to be more flexible with regard to the assumption of independence of the considered datasets. This is achieved through log-linear pooling, which essentially allows for attributing weights to the various data components. Finally, a multi-grid simulating path was created to enforce large-scale variance and to allow for adapting parameters, such as, for example, the log-linear weights or the type
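
    The log-linear pooling operator mentioned above has a compact form, p(x) proportional to the product over i of p_i(x) raised to the weight w_i; the sketch below applies it to two hypothetical conditional distributions on a discretized support, with the weights controlling how strongly each data component is trusted.

```python
import numpy as np

def log_linear_pool(pdfs, weights):
    """Log-linear (geometric) pooling: p(x) proportional to prod_i p_i(x)**w_i."""
    pdfs = np.asarray(pdfs, dtype=float)
    weights = np.asarray(weights, dtype=float)
    log_pool = np.tensordot(weights, np.log(pdfs + 1e-300), axes=1)
    pooled = np.exp(log_pool - log_pool.max())
    return pooled / pooled.sum()

# Discretized support for a hydraulic property (e.g., log-conductivity classes).
x = np.linspace(-8, -2, 121)

def gaussian(mu, sd):
    p = np.exp(-0.5 * ((x - mu) / sd) ** 2)
    return p / p.sum()

# Conditional distributions from two data components: a geophysical proxy and
# the geostatistical neighbourhood estimate. Values are illustrative only.
p_geophysics = gaussian(-4.5, 0.8)
p_neighbours = gaussian(-5.5, 0.5)

# Down-weighting one component relaxes the strict independence assumption.
pooled = log_linear_pool([p_geophysics, p_neighbours], weights=[0.4, 0.6])
print("pooled mean:", float(np.sum(x * pooled)))
```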

  17. Calibration of environmental radionuclide transfer models using a Bayesian approach with Markov chain Monte Carlo simulations and model comparisons - Calibration of radionuclides transfer models in the environment using a Bayesian approach with Markov chain Monte Carlo simulation and comparison of models

    Energy Technology Data Exchange (ETDEWEB)

    Nicoulaud-Gouin, V.; Giacalone, M.; Gonze, M.A. [Institut de Radioprotection et de Surete Nucleaire-PRP-ENV/SERIS/LM2E (France); Martin-Garin, A.; Garcia-Sanchez, L. [IRSN-PRP-ENV/SERIS/L2BT (France)

    2014-07-01

    Calibration of transfer models against observation data is a challenge, especially if parameter uncertainty is required and if a choice must be made between competing models. Generally, two main calibration methods are used. In the frequentist approach, the unknown parameter of interest is assumed fixed and its estimation is based on the data only; in this category, the least-squares method has many restrictions for nonlinear models, and competing models need to be nested in order to be compared. In Bayesian inference, the unknown parameter of interest is treated as random and its estimation is based on the data and on prior information. Compared to the frequentist method, it provides probability density functions and therefore pointwise estimation with credible intervals. However, in practical cases, Bayesian inference is a complex problem of numerical integration, which explains its low use in operational modeling, including radioecology. This study aims to illustrate the interest and feasibility of the Bayesian approach in radioecology, particularly in the case of ordinary differential equation models with non-constant coefficients, which cover most radiological risk assessment models, notably those implemented in the Symbiose platform (Gonze et al, 2010). The Markov chain Monte Carlo (MCMC) method (Metropolis et al., 1953) was used because the posterior expectations are intractable integrals. The invariant distribution of the parameters was obtained with the Metropolis-Hastings algorithm (Hastings, 1970). The GNU MCSim software (Bois and Maszle, 2011), a Bayesian hierarchical framework, was used to deal with nonlinear differential models. Two case studies involving this type of model were investigated: an equilibrium-kinetic sorption model (EK) (e.g. van Genuchten et al, 1974), with experimental data concerning 137Cs and 85Sr sorption and desorption in different soils studied in stirred flow-through reactors. This model, generalizing the Kd approach
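
    A minimal, self-contained sketch of the Metropolis-Hastings calibration workflow described above, using a toy first-order sorption model and synthetic data in place of the 137Cs/85Sr experiments (all values are illustrative; the actual study used GNU MCSim and a hierarchical formulation).

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy kinetic sorption model: S(t) = S_eq * (1 - exp(-k * t)).
def model(t, s_eq, k):
    return s_eq * (1.0 - np.exp(-k * t))

# Synthetic "observed" sorbed fractions (hypothetical data, sigma = 0.02).
t_obs = np.array([1, 2, 4, 8, 16, 32, 64], dtype=float)
true_s_eq, true_k, sigma = 0.8, 0.15, 0.02
y_obs = model(t_obs, true_s_eq, true_k) + rng.normal(scale=sigma, size=t_obs.size)

def log_posterior(theta):
    s_eq, k = theta
    if not (0 < s_eq < 1 and 0 < k < 5):        # uniform priors on plausible ranges
        return -np.inf
    resid = y_obs - model(t_obs, s_eq, k)
    return -0.5 * np.sum((resid / sigma) ** 2)  # Gaussian likelihood

# Metropolis-Hastings with a Gaussian random-walk proposal.
theta = np.array([0.5, 0.5])
lp = log_posterior(theta)
chain = []
for _ in range(30_000):
    prop = theta + rng.normal(scale=[0.02, 0.02])
    lp_prop = log_posterior(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    chain.append(theta)
chain = np.array(chain[10_000:])  # discard burn-in

for name, col in zip(("S_eq", "k"), chain.T):
    lo, hi = np.quantile(col, [0.025, 0.975])
    print(f"{name}: posterior median {np.median(col):.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")
```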

  18. Hierarchical demographic approaches for assessing invasion dynamics of non-indigenous species: An example using northern snakehead (Channa argus)

    Science.gov (United States)

    Jiao, Y.; Lapointe, N.W.R.; Angermeier, P.L.; Murphy, B.R.

    2009-01-01

    Models of species' demographic features are commonly used to understand population dynamics and inform management tactics. Hierarchical demographic models are ideal for the assessment of non-indigenous species because our knowledge of non-indigenous populations is usually limited, data on demographic traits often come from a species' native range, these traits vary among populations, and traits are likely to vary considerably over time as species adapt to new environments. Hierarchical models readily incorporate this spatiotemporal variation in species' demographic traits by representing demographic parameters as multi-level hierarchies. As is done for traditional non-hierarchical matrix models, sensitivity and elasticity analyses are used to evaluate the contributions of different life stages and parameters to estimates of population growth rate. We applied a hierarchical model to northern snakehead (Channa argus), a fish currently invading the eastern United States. We used a Monte Carlo approach to simulate uncertainties in the sensitivity and elasticity analyses and to project future population persistence under selected management tactics. We gathered key biological information on northern snakehead natural mortality, maturity and recruitment in its native Asian environment. We compared the model performance with and without hierarchy of parameters. Our results suggest that ignoring the hierarchy of parameters in demographic models may result in poor estimates of population size and growth and may lead to erroneous management advice. In our case, the hierarchy used multi-level distributions to simulate the heterogeneity of demographic parameters across different locations or situations. The probability that the northern snakehead population will increase and harm the native fauna is considerable. Our elasticity and prognostic analyses showed that intensive control efforts immediately prior to spawning and/or juvenile-dispersal periods would be more effective
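
    The sensitivity/elasticity machinery can be sketched as follows, with a hypothetical three-stage matrix and Monte Carlo draws of the vital rates standing in for the multi-level hierarchical distributions used in the paper: the dominant eigenvalue gives the population growth rate, and elasticities indicate which transitions control it.

```python
import numpy as np

rng = np.random.default_rng(3)

def elasticities(A):
    """Dominant eigenvalue and elasticity matrix of a stage-structured matrix A."""
    lam_w, W = np.linalg.eig(A)
    lam_v, V = np.linalg.eig(A.T)
    iw, iv = np.argmax(lam_w.real), np.argmax(lam_v.real)
    lam = lam_w[iw].real
    w = np.abs(W[:, iw].real)   # stable stage distribution
    v = np.abs(V[:, iv].real)   # reproductive values
    sens = np.outer(v, w) / (v @ w)   # sensitivities d(lambda)/d(a_ij)
    return lam, sens * A / lam        # elasticities

# Hypothetical 3-stage (juvenile, sub-adult, adult) matrix with uncertain rates.
lams, e_fecundity = [], []
for _ in range(2_000):
    s_j = rng.beta(20, 60)                # juvenile survival
    s_s = rng.beta(40, 30)                # sub-adult survival
    s_a = rng.beta(50, 20)                # adult survival
    f = rng.lognormal(np.log(8), 0.3)     # adult fecundity (recruits per female)
    A = np.array([[0.0, 0.0, f],
                  [s_j, 0.0, 0.0],
                  [0.0, s_s, s_a]])
    lam, E = elasticities(A)
    lams.append(lam)
    e_fecundity.append(E[0, 2])

print("P(lambda > 1):", np.mean(np.array(lams) > 1))
print("mean elasticity of adult fecundity:", np.mean(e_fecundity))
```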

  19. Bayesian approach to the assessment of the population-specific risk of inhibitors in hemophilia A patients: a case study

    Directory of Open Access Journals (Sweden)

    Cheng J

    2016-10-01

    significant inhibitor (10/100, 5/100 [high rates], and 1/86 [the Food and Drug Administration mandated cutoff rate in PTPs]) were calculated. The effect of discounting prior information or scaling up the study data was evaluated. Results: Results based on noninformative priors were similar to the classical approach. Using priors from PTPs lowered the point estimate and narrowed the 95% credible intervals (Case 1: from 1.3 [0.5, 2.7] to 0.8 [0.5, 1.1]; Case 2: from 1.9 [0.6, 6.0] to 0.8 [0.5, 1.1]; Case 3: from 2.3 [0.5, 6.8] to 0.7 [0.5, 1.1]). All probabilities of satisfying a threshold of 1/86 were above 0.65. Increasing the number of patients by two and ten times substantially narrowed the credible intervals for the single-cohort study (1.4 [0.7, 2.3] and 1.4 [1.1, 1.8], respectively). Increasing the number of studies by two and ten times for the multiple-study scenarios (Case 2: 1.9 [0.6, 4.0] and 1.9 [1.5, 2.6]; Case 3: 2.4 [0.9, 5.0] and 2.6 [1.9, 3.5], respectively) had a similar effect. Conclusion: The Bayesian approach, as a robust, transparent, and reproducible analytic method, can be efficiently used to estimate the inhibitor rate of hemophilia A in complex clinical settings. Keywords: inhibitor rate, meta-analysis, multicentric study, Bayesian, hemophilia A
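
    A stripped-down sketch of the underlying beta-binomial calculation (the counts are invented for illustration; only the 1/86 threshold comes from the abstract): an informative prior built from previously treated patients is combined with a new cohort, and the posterior probability of meeting the threshold is read off directly.

```python
import numpy as np
from scipy import stats

# Hypothetical counts: an informative prior built from previously treated
# patients (PTPs) and a small new single-cohort study of hemophilia A patients.
prior_events, prior_n = 4, 500       # PTP-derived prior (illustrative)
study_events, study_n = 2, 180       # new study (illustrative)

# Beta-binomial conjugate update of the inhibitor rate.
a0, b0 = prior_events + 1, prior_n - prior_events + 1                  # informative prior
post_informative = stats.beta(a0 + study_events, b0 + study_n - study_events)
post_flat = stats.beta(1 + study_events, 1 + study_n - study_events)   # noninformative prior

threshold = 1 / 86   # regulatory cutoff rate cited in the abstract
for label, post in [("flat prior", post_flat), ("PTP prior", post_informative)]:
    lo, hi = post.ppf([0.025, 0.975])
    print(f"{label}: median {post.median():.4f}, 95% CrI [{lo:.4f}, {hi:.4f}], "
          f"P(rate < 1/86) = {post.cdf(threshold):.2f}")
```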

  20. Understanding Prairie Fen Hydrology - a Hierarchical Multi-Scale Groundwater Modeling Approach

    Science.gov (United States)

    Sampath, P.; Liao, H.; Abbas, H.; Ma, L.; Li, S.

    2012-12-01

    Prairie fens provide critical habitat to more than 50 rare species and significantly contribute to the biodiversity of the upper Great Lakes region. The sustainability of these globally unique ecosystems, however, requires that they be fed by a steady supply of pristine, calcareous groundwater. Understanding the hydrology that supports the existence of such fens is essential in preserving these valuable habitats. This research uses process-based multi-scale groundwater modeling for this purpose. Two fen-sites, MacCready Fen and Ives Road Fen, in Southern Michigan were systematically studied. A hierarchy of nested steady-state models was built for each fen-site to capture the system's dynamics at spatial scales ranging from the regional groundwater-shed to the local fens. The models utilize high-resolution Digital Elevation Models (DEM), National Hydrologic Datasets (NHD), a recently-assembled water-well database, and results from a state-wide groundwater mapping project to represent the complex hydro-geological and stress framework. The modeling system simulates both shallow glacial and deep bedrock aquifers as well as the interaction between surface water and groundwater. Aquifer heterogeneities were explicitly simulated with multi-scale transition probability geo-statistics. A two-way hydraulic head feedback mechanism was set up between the nested models, such that the parent models provided boundary conditions to the child models, and in turn the child models provided local information to the parent models. A hierarchical mass budget analysis was performed to estimate the seepage fluxes at the surface water/groundwater interfaces and to assess the relative importance of the processes at multiple scales that contribute water to the fens. The models were calibrated using observed base-flows at stream gauging stations and/or static water levels at wells. Three-dimensional particle tracking was used to predict the sources of water to the fens. We observed from the

  1. A Bayesian approach to parameter and reliability estimation in the Poisson distribution.

    Science.gov (United States)

    Canavos, G. C.

    1972-01-01

    For life testing procedures, a Bayesian analysis is developed with respect to a random intensity parameter in the Poisson distribution. Bayes estimators are derived for the Poisson parameter and the reliability function based on uniform and gamma prior distributions of that parameter. A Monte Carlo procedure is implemented to make possible an empirical mean-squared error comparison between Bayes and existing minimum variance unbiased, as well as maximum likelihood, estimators. As expected, the Bayes estimators have mean-squared errors that are appreciably smaller than those of the other two.
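
    A short sketch of the gamma-Poisson setting (the prior parameters and true intensity are arbitrary choices for illustration): the posterior mean under a Gamma(a, b) prior is compared with the maximum-likelihood estimator by Monte Carlo, and the Bayes estimate of the reliability R = P(X = 0) = exp(-lambda) follows in closed form from the gamma posterior.

```python
import numpy as np

rng = np.random.default_rng(5)

# Gamma(a, b) prior on the Poisson intensity lambda (shape-rate parameterization).
a, b = 2.0, 1.0
true_lambda, n, n_trials = 1.5, 10, 20_000

mse_bayes, mse_mle = 0.0, 0.0
for _ in range(n_trials):
    x = rng.poisson(true_lambda, size=n)
    lam_mle = x.mean()                       # maximum-likelihood estimator
    lam_bayes = (a + x.sum()) / (b + n)      # posterior mean under the gamma prior
    mse_mle += (lam_mle - true_lambda) ** 2
    mse_bayes += (lam_bayes - true_lambda) ** 2

# With a prior centred near the true intensity, the Bayes estimator typically
# attains the smaller mean-squared error.
print("MSE of MLE:  ", mse_mle / n_trials)
print("MSE of Bayes:", mse_bayes / n_trials)

# Bayes estimate of the reliability R = P(X = 0) = exp(-lambda): the posterior
# expectation of exp(-lambda) is available in closed form for a gamma posterior.
x = rng.poisson(true_lambda, size=n)
a_post, b_post = a + x.sum(), b + n
r_bayes = (b_post / (b_post + 1)) ** a_post   # E[exp(-lambda)] for Gamma(a_post, b_post)
print("Bayes reliability estimate:", r_bayes, "  true:", np.exp(-true_lambda))
```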

  2. The influence of baseline marijuana use on treatment of cocaine dependence: application of an informative-priors Bayesian approach.

    Directory of Open Access Journals (Sweden)

    Charles E. Green

    2012-10-01

    Full Text Available Background: Marijuana use is prevalent among patients with cocaine dependence and often non-exclusionary in clinical trials of potential cocaine medications. The dual focus of this study was to (1) examine the moderating effect of baseline marijuana use on response to treatment with levodopa/carbidopa for cocaine dependence; and (2) apply an informative-priors Bayesian approach for estimating the probability of a subgroup-by-treatment interaction effect. Method: A secondary data analysis of two previously published, double-blind, randomized controlled trials provided samples for the historical dataset (Study 1: N = 64 complete observations) and the current dataset (Study 2: N = 113 complete observations). Negative binomial regression evaluated Treatment Effectiveness Scores (TES) as a function of medication condition (levodopa/carbidopa, placebo), baseline marijuana use (days in past 30), and their interaction. Results: Bayesian analysis indicated that there was a 96% chance that baseline marijuana use predicts differential response to treatment with levodopa/carbidopa. Simple effects indicated that among participants receiving levodopa/carbidopa the probability that baseline marijuana confers harm in terms of reducing TES was 0.981, whereas the probability that marijuana confers harm within the placebo condition was 0.163. For every additional day of marijuana use reported at baseline, participants in the levodopa/carbidopa condition demonstrated a 5.4% decrease in TES, while participants in the placebo condition demonstrated a 4.9% increase in TES. Conclusion: The potential moderating effect of marijuana on cocaine treatment response should be considered in future trial designs. Applying Bayesian subgroup analysis proved informative in characterizing this patient-treatment interaction effect.
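
    A schematic stand-in for the informative-priors analysis (synthetic data, fixed dispersion, and made-up prior values, not those from the historical trial): a negative binomial regression with a treatment-by-marijuana interaction is sampled by random-walk Metropolis, and the posterior probability of a harmful interaction is reported.

```python
import numpy as np
from scipy.special import gammaln

rng = np.random.default_rng(11)

# Synthetic stand-in data: treatment indicator, baseline marijuana days (0-30),
# and a Treatment Effectiveness Score-like count outcome. Illustrative only.
n_obs = 120
treat = rng.integers(0, 2, n_obs)
mj = rng.integers(0, 31, n_obs)
eta_true = 2.0 + 0.3 * treat + 0.02 * mj - 0.05 * treat * mj
y = rng.negative_binomial(n=2, p=2 / (2 + np.exp(eta_true)))

X = np.column_stack([np.ones(n_obs), treat, mj, treat * mj])
r = 2.0  # negative binomial dispersion held fixed for simplicity

# Informative normal priors on the coefficients, standing in for the posterior
# obtained from the historical trial (values are hypothetical).
prior_mean = np.array([2.0, 0.2, 0.0, -0.03])
prior_sd = np.array([1.0, 0.5, 0.05, 0.05])

def log_post(beta):
    mu = np.exp(X @ beta)
    loglik = np.sum(gammaln(y + r) - gammaln(r) - gammaln(y + 1)
                    + r * np.log(r / (r + mu)) + y * np.log(mu / (r + mu)))
    logprior = -0.5 * np.sum(((beta - prior_mean) / prior_sd) ** 2)
    return loglik + logprior

beta, lp = prior_mean.copy(), log_post(prior_mean)
draws = []
for _ in range(30_000):
    prop = beta + rng.normal(scale=0.02, size=4)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        beta, lp = prop, lp_prop
    draws.append(beta)
draws = np.array(draws[10_000:])  # discard burn-in

# Probability of a subgroup-by-treatment interaction (negative coefficient).
print("P(interaction < 0) =", np.mean(draws[:, 3] < 0))
```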

  3. Combining non-precise historical information with instrumental measurements for flood frequency estimation: a fuzzy Bayesian approach

    Science.gov (United States)

    Salinas, Jose Luis; Kiss, Andrea; Viglione, Alberto; Blöschl, Günter

    2016-04-01

    Efforts of the historical environmental extremes community during the last decades have resulted in long time series of historical floods, which in some cases extend more than 500 years into the past. In hydrological engineering, historical floods are useful because they give additional information which improves the estimates of discharges with low annual exceedance probabilities, i.e. with high return periods, and additionally might reduce the uncertainty in those estimates. In order to use the historical floods in formal flood frequency analysis, the precise value of the peak discharges would ideally be known, but in most cases the information related to historical floods is given, quantitatively, in a non-precise manner. This work presents an approach for dealing with non-precise historical floods by linking the descriptions in historical records to fuzzy numbers representing discharges. These fuzzy historical discharges are then introduced into a formal Bayesian inference framework, taking into account the arithmetic of non-precise numbers modelled by fuzzy logic theory, to obtain a fuzzy version of the flood frequency curve combining the fuzzy historical flood events and the instrumental data for a given location. Two case studies are selected from the historical literature, representing different facets of the fuzziness present in the historical sources. The results from the case studies are given in the form of fuzzy estimates of the flood frequency curves together with the fuzzy 5% and 95% Bayesian credibility bounds for these curves. The presented fuzzy Bayesian inference framework provides a flexible methodology to propagate in an explicit way the imprecision from the historical records into the flood frequency estimate, which allows assessing the effect that incorporating non-precise historical information can have on the flood frequency regime.

  4. Uncertainty analysis for effluent trading planning using a Bayesian estimation-based simulation-optimization modeling approach.

    Science.gov (United States)

    Zhang, J L; Li, Y P; Huang, G H; Baetz, B W; Liu, J

    2017-06-01

    In this study, a Bayesian estimation-based simulation-optimization modeling approach (BESMA) is developed for identifying effluent trading strategies. BESMA incorporates nutrient fate modeling with soil and water assessment tool (SWAT), Bayesian estimation, and probabilistic-possibilistic interval programming with fuzzy random coefficients (PPI-FRC) within a general framework. Based on the water quality protocols provided by SWAT, posterior distributions of parameters can be analyzed through Bayesian estimation; stochastic char