WorldWideScience

Sample records for bayesian history reconstruction

  1. Bayesian History Reconstruction of Complex Human Gene Clusters on a Phylogeny

    CERN Document Server

    Vinař, Tomáš; Song, Giltae; Siepel, Adam

    2009-01-01

    Clusters of genes that have evolved by repeated segmental duplication present difficult challenges throughout genomic analysis, from sequence assembly to functional analysis. Improved understanding of these clusters is of utmost importance, since they have been shown to be a source of evolutionary innovation, and have been linked to multiple diseases, including HIV infection and a variety of cancers. Previously, Zhang et al. (2008) developed an algorithm for reconstructing parsimonious evolutionary histories of such gene clusters, using only human genomic sequence data. In this paper, we propose a probabilistic model for the evolution of gene clusters on a phylogeny, and an MCMC algorithm for reconstruction of duplication histories from genomic sequences in multiple species. Several projects are underway to obtain high quality BAC-based assemblies of duplicated clusters in multiple species, and we anticipate that our method will be useful in analyzing these valuable new data sets.

  2. Bayesian tomographic reconstruction of microsystems

    Science.gov (United States)

    Salem, Sofia Fekih; Vabre, Alexandre; Mohammad-Djafari, Ali

    2007-11-01

    X-ray transmission microtomography plays an increasingly important role in the study and understanding of microsystems. Within this framework, an experimental setup for high-resolution X-ray microtomography was developed at CEA-List to quantify the physical parameters related to fluid flow in microsystems. Several difficulties arise from the nature of the experimental data collected on this setup: increased measurement errors due to various physical phenomena occurring during image formation (diffusion, beam hardening) and specificities of the setup (limited angle, partial view of the object, weak contrast). To reconstruct the object we must solve an inverse problem, which is known to be ill-posed and therefore needs to be regularized by introducing prior information. The main prior information we account for is that the object is composed of a finite known number of different materials distributed in compact regions. This a priori information is introduced via a Gauss-Markov field for the contrast distributions, with a hidden Potts-Markov field for the material classes, in the Bayesian estimation framework. The computations are done using an appropriate Markov Chain Monte Carlo (MCMC) technique. In this paper, we first present the basic steps of the proposed algorithms. Then we focus on one of the main steps in any iterative reconstruction method: the computation of the forward and adjoint operators (projection and backprojection). A fast implementation of these two operators is crucial for the real application of the method. We give some details on the fast computation of these steps and show some preliminary results of simulations.
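
    The forward and adjoint operator pair highlighted above can be illustrated with a toy linear model. The Python sketch below (a hypothetical two-view parallel projector, far simpler than the CEA-List limited-angle geometry) checks the adjointness relation <Ax, y> = <x, A'y> that any iterative or MCMC reconstruction scheme of this kind relies on.

        import numpy as np

        rng = np.random.default_rng(0)
        n = 64
        image = rng.random((n, n))

        def project(img):
            # Forward operator: toy two-view parallel projection
            # (vertical rays = column sums, horizontal rays = row sums).
            return np.concatenate([img.sum(axis=0), img.sum(axis=1)])

        def backproject(sino):
            # Adjoint of project(): smear each ray value back along its path.
            col_views, row_views = sino[:n], sino[n:]
            return np.tile(col_views, (n, 1)) + np.tile(row_views[:, None], (1, n))

        # Adjointness check <A x, y> == <x, A^T y>; a mismatched pair breaks
        # gradient-based and MCMC reconstruction in subtle ways.
        y = rng.random(2 * n)
        assert np.isclose(project(image) @ y, (image * backproject(y)).sum())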

  3. Bayesian image reconstruction: Application to emission tomography

    Energy Technology Data Exchange (ETDEWEB)

    Nunez, J.; Llacer, J.

    1989-02-01

    In this paper we propose a Maximum a Posteriori (MAP) method of image reconstruction in the Bayesian framework for the Poisson noise case. We use entropy to define the prior probability and the likelihood to define the conditional probability. The method uses sharpness parameters which can be theoretically computed or adjusted, allowing us to obtain MAP reconstructions without the problem of the "grey" reconstructions associated with pre-Bayesian methods. We have developed several ways to solve the reconstruction problem and propose a new iterative algorithm which is stable, maintains positivity and converges to feasible images faster than the Maximum Likelihood Estimate method. We have successfully applied the new method to the case of emission tomography, with both simulated and real data.
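
    As a rough illustration of this family of methods (not the authors' exact algorithm), the following sketch runs a one-step-late MAP-EM iteration for Poisson data with an entropy-type penalty U(x) = sum x*log(x/m); the system matrix, counts and prior reference m are invented toy quantities. The multiplicative form keeps the iterates positive, one of the properties emphasized above.

        import numpy as np

        rng = np.random.default_rng(1)
        m_rays, n = 200, 100
        A = rng.random((m_rays, n)) * 0.1          # toy system matrix
        x_true = rng.gamma(2.0, 1.0, n)
        y = rng.poisson(A @ x_true)                # Poisson projection counts

        alpha = 0.05                               # prior strength
        prior = np.full(n, x_true.mean())          # entropy reference image m
        x = np.ones(n)
        sens = A.T @ np.ones(m_rays)               # sensitivity, sum_i a_ij

        for _ in range(200):
            ratio = y / np.clip(A @ x, 1e-12, None)
            # One-step-late MAP-EM with entropy penalty U(x) = sum x*log(x/m);
            # the clip guards the denominator against going non-positive.
            denom = np.clip(sens + alpha * np.log(x / prior), 1e-12, None)
            x = x * (A.T @ ratio) / denom          # multiplicative, stays positive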

  4. Bayesian Image Reconstruction Based on Voronoi Diagrams

    CERN Document Server

    Cabrera, G F; Hitschfeld, N

    2007-01-01

    We present a Bayesian Voronoi image reconstruction technique (VIR) for interferometric data. Bayesian analysis applied to the inverse problem allows us to derive the a posteriori probability of a novel parameterization of interferometric images. We use a variable Voronoi diagram as our model in place of the usual fixed pixel grid. A quantization of the intensity field allows us to calculate the likelihood function and a priori probabilities. The Voronoi image is optimized, with the number of polygons among the free parameters. We apply our algorithm to deconvolve simulated interferometric data. Residuals, restored images and chi^2 values are used to compare our reconstructions with fixed-grid models. VIR has the advantage of modeling the image with few parameters, obtaining a better image from a Bayesian point of view.

  5. Photogrammetric Reconstruction with Bayesian Information

    Science.gov (United States)

    Masiero, A.; Fissore, F.; Guarnieri, A.; Pirotti, F.; Vettore, A.

    2016-06-01

    Nowadays photogrammetry and laser scanning are the most widespread surveying techniques. Laser scanning methods usually achieve higher accuracy than photogrammetry, but their use has some issues, e.g. the high cost of the instrumentation and the typical need for highly qualified personnel to acquire experimental data in the field. In contrast, photogrammetric reconstruction can be achieved by means of low-cost devices and by persons without specific training. Furthermore, the recent diffusion of smart devices (e.g. smartphones) embedded with imaging and positioning sensors (i.e. a standard camera, GNSS receiver, and inertial measurement unit) is opening the possibility of integrating more information into the photogrammetric reconstruction procedure in order to increase its computational efficiency, robustness and accuracy. In accordance with the above observations, this paper examines and validates new possibilities for the integration of information provided by the inertial measurement unit (IMU) into the photogrammetric reconstruction procedure and, more specifically, into the procedure for solving the feature matching and bundle adjustment problems.

  7. Bayesian Cosmic Web Reconstruction: BARCODE for Clusters

    Science.gov (United States)

    Patrick Bos, E. G.; van de Weygaert, Rien; Kitaura, Francisco; Cautun, Marius

    2016-10-01

    We describe the Bayesian BARCODE formalism that has been designed for the reconstruction of the Cosmic Web in a given volume on the basis of the sampled galaxy cluster distribution. It is based on the realization that the massive compact clusters are responsible for the major share of the large-scale tidal force field shaping the anisotropic and, in particular, filamentary features in the Cosmic Web. Given the nonlinearity of the constraints imposed by the cluster configurations, we resort to a state-of-the-art constrained reconstruction technique to find a properly statistically sampled realization of the original initial density and velocity field in the same cosmic region. Ultimately, the subsequent gravitational evolution of these initial conditions towards the implied Cosmic Web configuration can be followed on the basis of a proper analytical model or an N-body computer simulation. The BARCODE formalism includes an implicit treatment for redshift space distortions. This enables a direct reconstruction on the basis of observational data, without the need for a correction of redshift space artifacts. In this contribution we provide a general overview of the Cosmic Web connection with clusters and a description of the Bayesian BARCODE formalism. We conclude with a presentation of its successful workings with respect to test runs based on a simulated large-scale matter distribution, in physical space as well as in redshift space.

  8. Structure-based bayesian sparse reconstruction

    KAUST Repository

    Quadeer, Ahmed Abdul

    2012-12-01

    Sparse signal reconstruction algorithms have attracted research attention due to their wide applications in various fields. In this paper, we present a simple Bayesian approach that utilizes the sparsity constraint and a priori statistical information (Gaussian or otherwise) to obtain near optimal estimates. In addition, we make use of the rich structure of the sensing matrix encountered in many signal processing applications to develop a fast sparse recovery algorithm. The computational complexity of the proposed algorithm is very low compared with the widely used convex relaxation methods as well as greedy matching pursuit techniques, especially at high sparsity.

  9. Photoacoustic image reconstruction based on Bayesian compressive sensing algorithm

    Institute of Scientific and Technical Information of China (English)

    Mingjian Sun; Naizhang Feng; Yi Shen; Jiangang Li; Liyong Ma; Zhenghua Wu

    2011-01-01

    The photoacoustic tomography (PAT) method, based on compressive sensing (CS) theory, requires that, for the CS reconstruction, the desired image should have a sparse representation in a known transform domain. However, the sparsity of photoacoustic signals is destroyed because noise is always present. Therefore, the original sparse signal cannot be effectively recovered using general reconstruction algorithms. In this study, Bayesian compressive sensing (BCS) is employed to obtain highly sparse representations of photoacoustic images based on a set of noisy CS measurements. Simulation results demonstrate that the BCS-reconstructed image achieves superior performance compared with other state-of-the-art CS reconstruction algorithms.

  10. Hierarchical Bayesian sparse image reconstruction with application to MRFM

    CERN Document Server

    Dobigeon, Nicolas; Tourneret, Jean-Yves

    2008-01-01

    This paper presents a hierarchical Bayesian model to reconstruct sparse images when the observations are obtained from linear transformations and corrupted by an additive white Gaussian noise. Our hierarchical Bayes model is well suited to such naturally sparse image applications as it seamlessly accounts for properties such as sparsity and positivity of the image via appropriate Bayes priors. We propose a prior that is based on a weighted mixture of a positive exponential distribution and a mass at zero. The prior has hyperparameters that are tuned automatically by marginalization over the hierarchical Bayesian model. To overcome the complexity of the posterior distribution, a Gibbs sampling strategy is proposed. The Gibbs samples can be used to estimate the image to be recovered, e.g. by maximizing the estimated posterior distribution. In our fully Bayesian approach the posteriors of all the parameters are available. Thus our algorithm provides more information than other previously proposed sparse reconstr...

  11. Reconstruction of elongated bubbles fusing the information from multiple optical probes through a Bayesian inference technique

    Science.gov (United States)

    Chakraborty, Shubhankar; Roy Chaudhuri, Partha; Das, Prasanta Kr.

    2016-07-01

    In this communication, a novel optical technique has been proposed for the reconstruction of the shape of a Taylor bubble using measurements from multiple arrays of optical sensors. The deviation of an optical beam passing through the bubble depends on the contour of the bubble surface. A theoretical model of the deviation of a beam during the traverse of a Taylor bubble has been developed. Using this model and the time history of the deviation captured by the sensor array, the bubble shape has been reconstructed. The reconstruction has been performed using an inverse algorithm based on a Bayesian inference technique and a Markov chain Monte Carlo sampling algorithm. The reconstructed nose shape has been compared with the true shape, extracted through image processing of high-speed images. Finally, an error analysis has been performed to pinpoint the sources of the errors.
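
    The inverse step can be pictured as a generic random-walk Metropolis-Hastings loop: given a forward model that predicts beam deviations from shape parameters, sample the posterior over those parameters. Everything below (the exponential forward model, the noise level, the two-parameter shape) is a hypothetical stand-in for the paper's optical model.

        import numpy as np

        rng = np.random.default_rng(3)

        def forward(theta, z):
            # Hypothetical forward model: predicted beam deviation along the
            # bubble axis z for shape parameters theta (stand-in for the optics).
            return theta[0] * np.exp(-theta[1] * z**2)

        z = np.linspace(0.0, 1.0, 40)
        observed = forward(np.array([1.0, 4.0]), z) + rng.normal(scale=0.02, size=z.size)

        def log_post(theta):
            if np.any(theta <= 0):
                return -np.inf                     # flat prior on positive shapes
            resid = observed - forward(theta, z)
            return -0.5 * np.sum(resid**2) / 0.02**2

        theta, samples = np.array([0.5, 1.0]), []
        for _ in range(5000):                      # random-walk Metropolis-Hastings
            prop = theta + rng.normal(scale=0.05, size=2)
            if np.log(rng.random()) < log_post(prop) - log_post(theta):
                theta = prop
            samples.append(theta.copy())
        posterior = np.array(samples)[1000:]       # discard burn-in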

  13. A Nonparametric Bayesian Approach For Emission Tomography Reconstruction

    Science.gov (United States)

    Barat, Éric; Dautremer, Thomas

    2007-11-01

    We introduce a PET reconstruction algorithm following a nonparametric Bayesian (NPB) approach. In contrast with Expectation Maximization (EM), the proposed technique does not rely on any space discretization. Namely, the activity distribution—the normalized emission intensity of the spatial Poisson process—is considered as a spatial probability density, and observations are the projections of random emissions whose distribution has to be estimated. This approach is nonparametric in the sense that the quantity of interest belongs to the set of probability measures on R^k (for reconstruction in k dimensions) and it is Bayesian in the sense that we define a prior directly on this spatial measure. In this context, we propose to model the nonparametric probability density as an infinite mixture of multivariate normal distributions. As a prior for this mixture we consider a Dirichlet Process Mixture (DPM) with a Normal-Inverse Wishart (NIW) model as base distribution of the Dirichlet Process. As in EM-family reconstruction, we use a data augmentation scheme where the hidden variables are the emission locations for each observed line of response in the continuous object space. Thanks to the data augmentation, we propose a Markov Chain Monte Carlo (MCMC) algorithm (Gibbs sampler) which is able to generate draws from the posterior distribution of the spatial intensity. A difference with EM is that one step of the Gibbs sampler corresponds to the generation of emission locations, while only the expected number of emissions per pixel/voxel is used in EM. Another key difference is that the estimated spatial intensity is a continuous function, so there is no need to compute a projection matrix. Finally, draws from the intensity posterior distribution allow the estimation of posterior functionals such as the variance or confidence intervals. Results are presented for simulated data based on a 2D brain phantom and compared to Bayesian MAP-EM.
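
    The Dirichlet-process mixture at the core of this method can be explored with scikit-learn's variational approximation (a different inference engine from the paper's Gibbs sampler, and with no projection model): fit a truncated "infinite" Gaussian mixture to simulated 2-D emission locations and let the DP prior switch off unused components.

        import numpy as np
        from sklearn.mixture import BayesianGaussianMixture

        rng = np.random.default_rng(4)
        # Simulated emission locations from two hot spots (a stand-in for the
        # per-line-of-response emission points the Gibbs sampler would generate).
        emissions = np.vstack([
            rng.normal([0.0, 0.0], 0.3, size=(500, 2)),
            rng.normal([2.0, 1.0], 0.2, size=(300, 2)),
        ])

        dpm = BayesianGaussianMixture(
            n_components=20,                       # truncation level
            weight_concentration_prior_type="dirichlet_process",
            covariance_type="full",
        ).fit(emissions)

        active = dpm.weights_ > 0.01               # DP prior prunes extra components
        print(dpm.means_[active])                  # modes of the continuous intensity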

  14. Bayesian 3D velocity field reconstruction with VIRBIUS

    Science.gov (United States)

    Lavaux, Guilhem

    2016-03-01

    I describe a new Bayesian-based algorithm to infer the full three-dimensional velocity field from observed distances and spectroscopic galaxy catalogues. In addition to the velocity field itself, the algorithm reconstructs true distances, some cosmological parameters and specific non-linearities in the velocity field. The algorithm takes care of selection effects and miscalibration issues, and can easily be extended to handle direct fitting of, e.g., the inverse Tully-Fisher relation. I first describe the algorithm in detail alongside its performance. This algorithm is implemented in the VIRBIUS (VelocIty Reconstruction using Bayesian Inference Software) software package. I then test it on different mock distance catalogues with varying complexity of observational issues. The model proved to give robust measurements of velocities for mock catalogues of 3000 galaxies. I expect the core of the algorithm to scale to tens of thousands of galaxies. It holds the promise of giving a better handle on future large and deep distance surveys for which individual errors on distance would impede velocity field inference.

  16. Food reconstruction using isotopic transferred signals (FRUITS): a Bayesian model for diet reconstruction.

    Directory of Open Access Journals (Sweden)

    Ricardo Fernandes

    Human and animal diet reconstruction studies that rely on tissue chemical signatures aim at providing estimates of the relative intake of potential food groups. However, several sources of uncertainty need to be considered when handling data. Bayesian mixing models provide a natural platform for handling diverse sources of uncertainty while allowing the user to contribute prior expert information. The Bayesian mixing model FRUITS (Food Reconstruction Using Isotopic Transferred Signals) was developed for use in diet reconstruction studies. FRUITS incorporates the capability to account for dietary routing, that is, the contribution of different food fractions (e.g., macronutrients) towards a dietary proxy signal measured in the consumer. FRUITS also provides relatively straightforward means for the introduction of prior information on the relative dietary contributions of food groups or food fractions. This type of prior may originate, for instance, from physiological or metabolic studies. FRUITS performance was tested using simulated data and data from a published controlled animal feeding experiment. The feeding experiment data were selected to exemplify the application of the novel capabilities incorporated into FRUITS, but also to illustrate some of the aspects that need to be considered when handling data within diet reconstruction studies. FRUITS accurately predicted dietary intakes, and more precise estimates were obtained for dietary scenarios in which expert prior information was included. FRUITS represents a useful tool for achieving accurate and precise food intake estimates in diet reconstruction studies within different scientific fields (e.g., ecology, forensics, archaeology, and dietary physiology).
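
    The core of any such mixing model is Bayes' rule applied to intake fractions on the simplex. A minimal sketch (one invented isotopic proxy, three food groups, uniform Dirichlet prior, and none of FRUITS' routing or food-fraction machinery) enumerates the posterior on a grid:

        import numpy as np

        # Hypothetical single-proxy example: three food groups with known source
        # signatures, a measured consumer value, and Gaussian measurement error.
        sources = np.array([-20.0, -12.0, -26.0])
        consumer, sigma = -18.0, 0.5

        # Enumerate the 2-simplex on a grid; the prior is Dirichlet(1, 1, 1).
        step = 0.01
        f1, f2 = np.meshgrid(np.arange(0, 1 + step, step),
                             np.arange(0, 1 + step, step))
        f3 = 1.0 - f1 - f2
        valid = f3 >= 0
        mix = f1 * sources[0] + f2 * sources[1] + f3 * sources[2]
        log_like = -0.5 * ((consumer - mix) / sigma) ** 2
        post = np.where(valid, np.exp(log_like - log_like[valid].max()), 0.0)
        post /= post.sum()

        # Posterior mean intake fraction for each food group:
        print((post * f1).sum(), (post * f2).sum(), (post * f3).sum())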

  17. Reconstructing Fire History: An Exercise in Dendrochronology

    Science.gov (United States)

    Lafon, Charles W.

    2005-01-01

    Dendrochronology is used widely to reconstruct the history of forest disturbances. I created an exercise that introduces the use of dendrochronology to investigate fire history and forest dynamics. The exercise also demonstrates how the dendrochronological technique of crossdating is employed to age dead trees and identify missing rings. I…

  18. A new Bayesian approach to the reconstruction of spectral functions

    CERN Document Server

    Burnier, Yannis

    2013-01-01

    We present a novel approach for the reconstruction of spectra from Euclidean correlator data that makes close contact with modern Bayesian concepts. It is based upon an axiomatically justified dimensionless prior distribution, which in the case of a constant prior function $m(\omega)$ only imprints smoothness on the reconstructed spectrum. In addition we are able to analytically integrate out the only relevant overall hyper-parameter $\alpha$ in the prior, removing the necessity for Gaussian approximations found e.g. in the Maximum Entropy Method. Using a quasi-Newton minimizer and high-precision arithmetic, we are then able to find the unique global extremum of $P[\rho|D]$ in the full $N_\omega \gg N_\tau$ dimensional search space. The method actually yields gradually improving reconstruction results as the quality of the supplied input data increases, without introducing the artificial peak structures often encountered in the MEM. To support these statements we present mock data analyses for the case of zero width ...

  19. Efficient reconstruction of contaminant release history

    Energy Technology Data Exchange (ETDEWEB)

    Alexander, Francis [Los Alamos National Laboratory]; Anghel, Marian [Los Alamos National Laboratory]; Gulbahce, Natali [NON LANL]; Tartakovsky, Daniel [NON LANL]

    2009-01-01

    We present a generalized hybrid Monte Carlo (GHMC) method for fast, statistically optimal reconstruction of release histories of reactive contaminants. The approach is applicable to large-scale, strongly nonlinear systems with parametric uncertainties and data corrupted by measurement errors. The use of discrete adjoint equations facilitates numerical implementation of GHMC without putting any restrictions on the degree of nonlinearity of the advection-dispersion-reaction equations that are used to describe contaminant transport in the subsurface. To demonstrate the salient features of the proposed algorithm, we identify the spatial extent of a distributed source of contamination from concentration measurements of a reactive solute.

  20. Bayesian reconstruction strategy of fluorescence-mediated tomography using an integrated SPECT-CT-OT system

    Science.gov (United States)

    Cao, Liji; Peter, Jörg

    2010-05-01

    Following the assembly of a triple-modality SPECT-CT-OT small animal imaging system that provides intrinsically co-registered projection data from all three submodalities, and assuming dual-labeled probes consisting of both fluorophores and radionuclides, a novel multi-modal reconstruction strategy aimed at improving fluorescence-mediated tomography (FMT) is presented in this paper. The following reconstruction procedure is proposed: firstly, standard x-ray CT image reconstruction is performed employing the FDK algorithm. Secondly, standard SPECT image reconstruction is performed using OSEM. Thirdly, the surface boundary of the imaged object is extracted from the reconstructed CT volume data for finite element definition. Finally, the reconstructed SPECT data are used as a priori information within a Bayesian reconstruction framework for optical (FMT) reconstruction. We provide results of this multi-modal approach using phantom experimental data and illustrate that this strategy suppresses artifacts and facilitates quantitative analysis for optical imaging studies.

  1. Bayesian inference of the demographic history of chimpanzees.

    Science.gov (United States)

    Wegmann, Daniel; Excoffier, Laurent

    2010-06-01

    Due to an almost complete absence of a fossil record, the evolutionary history of chimpanzees has only recently been studied on the basis of genetic data. Although the general topology of the chimpanzee phylogeny is well established, uncertainties remain concerning the size of current and past populations, the occurrence of bottlenecks or population expansions, and the divergence times and migration rates between subspecies. Here, we present a novel attempt at globally inferring the detailed evolution of the Pan genus based on approximate Bayesian computation, an approach preferentially applied to complex models where the likelihood cannot be computed analytically. Based on two microsatellite and DNA sequence data sets, and adjusting simulated data for local levels of inbreeding and patterns of missing data, we find support for several new features of chimpanzee evolution as compared with previous studies based on smaller data sets and simpler evolutionary models. We find that the central chimpanzees are certainly the oldest population of all P. troglodytes subspecies and that the other two P. t. subspecies diverged from the central chimpanzees by founder events. We also find an older divergence time (1.6 million years [My]) between common chimpanzees and bonobos than previous studies (0.9-1.3 My), but this divergence appears to have been very progressive, with the maintenance of relatively high levels of gene flow between the ancestral chimpanzee population and the bonobos. Finally, we confirm the existence of strong unidirectional gene flow from the western into the central chimpanzees. These results show that interesting and innovative features of chimpanzee history emerge when considering their whole evolutionary history in a single analysis, rather than relying on simpler models involving several comparisons of pairs of populations. PMID:20118191
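
    Approximate Bayesian computation replaces the intractable likelihood with simulation: draw parameters from the prior, simulate data, and keep draws whose summary statistics land close to the observed ones. A stripped-down rejection-ABC sketch (a toy one-parameter model; real applications like this one use coalescent simulators and many summary statistics):

        import numpy as np

        rng = np.random.default_rng(5)

        def simulate(theta, n=200):
            # Hypothetical generative model returning a summary statistic
            # (a stand-in for coalescent simulation of genetic data).
            return rng.normal(theta, 1.0, size=n).mean()

        observed_stat = 2.3
        accepted = []
        for _ in range(20000):
            theta = rng.uniform(0.0, 10.0)                   # draw from the prior
            if abs(simulate(theta) - observed_stat) < 0.05:  # rejection step
                accepted.append(theta)
        posterior_sample = np.array(accepted)                # approximate posterior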

  2. A novel reconstruction algorithm for bioluminescent tomography based on Bayesian compressive sensing

    Science.gov (United States)

    Wang, Yaqi; Feng, Jinchao; Jia, Kebin; Sun, Zhonghua; Wei, Huijun

    2016-03-01

    Bioluminescence tomography (BLT) is becoming a promising tool because it can resolve the biodistribution of bioluminescent reporters associated with cellular and subcellular function through several millimeters to centimeters of tissue in vivo. However, BLT reconstruction is an ill-posed problem. By incorporating sparse a priori information about the bioluminescent source, enhanced image quality is obtained with sparsity-based reconstruction algorithms. Therefore, sparsity-based BLT reconstruction algorithms have great potential. Here, we propose a novel reconstruction method based on Bayesian compressive sensing and investigate its feasibility and effectiveness with a heterogeneous phantom. The results demonstrate the potential and merits of the proposed algorithm.

  3. Reconstructing the history of dark energy using maximum entropy

    OpenAIRE

    Zunckel, C.; Trotta, R.

    2007-01-01

    We present a Bayesian technique based on a maximum entropy method to reconstruct the dark energy equation of state $w(z)$ in a non-parametric way. This MaxEnt technique allows us to incorporate relevant prior information while adjusting the degree of smoothing of the reconstruction in response to the structure present in the data. After demonstrating the method on synthetic data, we apply it to current cosmological data, separately analysing type Ia supernova measurements from the HST/GOODS pro...

  4. Sparse reconstruction using distribution agnostic bayesian matching pursuit

    KAUST Repository

    Masood, Mudassir

    2013-11-01

    A fast matching pursuit method using a Bayesian approach is introduced for sparse signal recovery. This method performs Bayesian estimates of sparse signals even when the signal prior is non-Gaussian or unknown. It is agnostic on signal statistics and utilizes a priori statistics of additive noise and the sparsity rate of the signal, which are shown to be easily estimated from data if not available. The method utilizes a greedy approach and order-recursive updates of its metrics to find the most dominant sparse supports to determine the approximate minimum mean-square error (MMSE) estimate of the sparse signal. Simulation results demonstrate the power and robustness of our proposed estimator.
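
    The flavor of a greedy Bayesian pursuit can be conveyed in a few lines: repeatedly pick the column most correlated with the residual, then form the Gaussian-prior posterior mean (a ridge-type solution) on the growing support. This sketch omits the order-recursive metric updates and the averaging over candidate supports that make the actual algorithm fast and closer to the true MMSE estimate; all dimensions and priors are invented.

        import numpy as np

        rng = np.random.default_rng(6)
        m, n, k = 60, 128, 5
        A = rng.normal(size=(m, n)) / np.sqrt(m)    # sensing matrix
        x = np.zeros(n)
        x[rng.choice(n, k, replace=False)] = rng.normal(size=k)
        sigma2, tau2 = 0.01, 1.0                    # noise / prior variances
        y = A @ x + rng.normal(scale=np.sqrt(sigma2), size=m)

        support, resid = [], y.copy()
        for _ in range(k):
            corr = np.abs(A.T @ resid)
            if support:
                corr[support] = -np.inf             # do not re-select a column
            j = int(np.argmax(corr))
            support.append(j)
            As = A[:, support]
            # Gaussian-prior posterior mean on the current support:
            xs = np.linalg.solve(As.T @ As + (sigma2 / tau2) * np.eye(len(support)),
                                 As.T @ y)
            resid = y - As @ xs

        x_hat = np.zeros(n)
        x_hat[support] = xs                         # sparse MMSE-style estimate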

  5. Comparing Nonparametric Bayesian Tree Priors for Clonal Reconstruction of Tumors

    OpenAIRE

    Deshwar, Amit G; Vembu, Shankar; Morris, Quaid

    2014-01-01

    Statistical machine learning methods, especially nonparametric Bayesian methods, have become increasingly popular to infer clonal population structure of tumors. Here we describe the treeCRP, an extension of the Chinese restaurant process (CRP), a popular construction used in nonparametric mixture models, to infer the phylogeny and genotype of major subclonal lineages represented in the population of cancer cells. We also propose new split-merge updates tailored to the subclonal reconstructio...
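
    The Chinese restaurant process underlying the treeCRP is easy to simulate: customer n joins an existing table with probability proportional to its occupancy, or opens a new table with probability proportional to the concentration parameter alpha. A plain (non-tree) CRP sketch:

        import numpy as np

        rng = np.random.default_rng(7)

        def sample_crp(n_customers, alpha):
            # Return cluster assignments for one CRP draw.
            tables, assignments = [], []
            for _ in range(n_customers):
                probs = np.array(tables + [alpha], dtype=float)
                probs /= probs.sum()
                choice = rng.choice(len(probs), p=probs)
                if choice == len(tables):
                    tables.append(1)        # open a new table (new subclone)
                else:
                    tables[choice] += 1
                assignments.append(choice)
            return assignments

        print(sample_crp(20, alpha=1.0))    # e.g. [0, 0, 1, 0, 2, ...]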

  6. Texture-preserving Bayesian image reconstruction for low-dose CT

    Science.gov (United States)

    Zhang, Hao; Han, Hao; Hu, Yifan; Liu, Yan; Ma, Jianhua; Li, Lihong; Moore, William; Liang, Zhengrong

    2016-03-01

    Markov random field (MRF) models have been widely used in Bayesian image reconstruction to reconstruct piecewise smooth images in the presence of noise, such as in low-dose X-ray computed tomography (LdCT). While they can preserve edge sharpness via edge-preserving potential functions, their regional smoothing may sacrifice tissue image textures, which have been recognized as useful imaging biomarkers, and thus compromise clinical tasks such as differentiating malignant vs. benign lesions, e.g., lung nodules or colon polyps. This study aims to shift the edge-preserving regional noise smoothing paradigm to a texture-preserving framework for LdCT image reconstruction while retaining the advantage of the MRF's neighborhood system for edge preservation. Specifically, we adapted the MRF model to incorporate the image textures of lung, bone, fat, muscle, etc. from a previous full-dose CT scan as a priori knowledge for texture-preserving Bayesian reconstruction of current LdCT images. To show the feasibility of the proposed reconstruction framework, experiments using clinical patient scans (with lung nodule or colon polyp) were conducted. The experimental outcomes showed a noticeable gain from the a priori knowledge for LdCT image reconstruction in terms of the well-known Haralick texture measures. Thus, it is conjectured that texture-preserving LdCT reconstruction has advantages over the edge-preserving regional smoothing paradigm for texture-specific clinical applications.

  7. A novel Bayesian approach to spectral function reconstruction

    CERN Document Server

    Burnier, Yannis

    2013-01-01

    We present a novel approach to the inference of spectral functions from Euclidean time correlator data that makes close contact with modern Bayesian concepts. Our method differs significantly from the maximum entropy method (MEM). A new set of axioms is postulated for the prior probability, leading to an improved expression, which is devoid of the asymptotically flat directions present in the Shannon-Jaynes entropy. Hyperparameters are integrated out explicitly, liberating us from the Gaussian approximations underlying the evidence approach of the MEM. We present a realistic test of our method in the context of the non-perturbative extraction of the heavy quark potential. Based on hard-thermal-loop correlator mock data, we establish firm requirements in the number of data points and their accuracy for a successful extraction of the potential from lattice QCD. An improved potential estimation from previously investigated quenched lattice QCD correlators is provided.

  8. Automated comparison of Bayesian reconstructions of experimental profiles with physical models

    International Nuclear Information System (INIS)

    In this work we developed an expert system that carries out, in an integrated and fully automated way: (i) a reconstruction of plasma profiles from the measurements, using Bayesian analysis; (ii) a prediction of the reconstructed quantities, according to some models; and (iii) an intelligent comparison of the first two steps. This system includes systematic checking of the internal consistency of the reconstructed quantities, enables automated model validation and, if a well-validated model is used, can be applied to help detect interesting new physics in an experiment. The work shows three applications of this quite general system. The expert system can successfully detect failures in the automated plasma reconstruction and provide (in successful reconstruction cases) statistics of the agreement of the models with the experimental data, i.e. information on the model validity. (author)

  9. Bayesian PET image reconstruction incorporating anato-functional joint entropy

    Science.gov (United States)

    Tang, Jing; Rahmim, Arman

    2009-12-01

    We developed a maximum a posteriori (MAP) reconstruction method for positron emission tomography (PET) image reconstruction incorporating magnetic resonance (MR) image information, with the joint entropy between the PET and MR image features serving as the regularization constraint. A non-parametric method was used to estimate the joint probability density of the PET and MR images. Using realistically simulated PET and MR human brain phantoms, the quantitative performance of the proposed algorithm was investigated. Incorporation of the anatomic information via this technique, after parameter optimization, was seen to dramatically improve the noise versus bias tradeoff in every region of interest, compared to the result from using conventional MAP reconstruction. In particular, hot lesions in the FDG PET image, which had no anatomical correspondence in the MR image, also had an improved contrast versus noise tradeoff. Corrections were made to figures 3, 4 and 6, and to the second paragraph of section 3.1 on 13 November 2009. The corrected electronic version is identical to the print version.
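
    The regularizer here is the joint entropy of PET and MR intensity features, computed from an estimate of their joint density. A minimal version of that quantity (a 2-D histogram density estimate rather than the paper's non-parametric estimator, with random images as stand-ins) looks as follows; a MAP reconstruction would penalize this value, since aligned structures sharpen the joint histogram and hence lower the joint entropy.

        import numpy as np

        rng = np.random.default_rng(8)
        pet = rng.random((128, 128))
        mr = 0.7 * pet + 0.3 * rng.random((128, 128))   # loosely coupled stand-in

        # Joint density of (PET, MR) intensity pairs via a 2-D histogram.
        hist, _, _ = np.histogram2d(pet.ravel(), mr.ravel(), bins=64)
        p = hist / hist.sum()
        nz = p > 0
        joint_entropy = -np.sum(p[nz] * np.log(p[nz]))
        print(joint_entropy)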

  10. Optimization of Bayesian Emission tomographic reconstruction for region of interest quantitation

    Energy Technology Data Exchange (ETDEWEB)

    Qi, Jinyi

    2003-01-10

    Region of interest (ROI) quantitation is an important task in emission tomography (e.g., positron emission tomography and single photon emission computed tomography). It is essential for exploring clinical factors such as tumor activity, growth rate, and the efficacy of therapeutic interventions. Bayesian methods based on the maximum a posteriori principle (also called penalized maximum likelihood methods) have been developed for emission image reconstruction to deal with the low signal-to-noise ratio of the emission data. Similar to the filter cut-off frequency in the filtered backprojection method, the smoothing parameter of the image prior in Bayesian reconstruction controls the resolution and noise trade-off and hence affects ROI quantitation. In this paper we present an approach for choosing the optimum smoothing parameter in Bayesian reconstruction for ROI quantitation. Bayesian reconstructions are difficult to analyze because the resolution and noise properties are nonlinear and object-dependent. Building on the recent progress on deriving approximate expressions for the local impulse response function and the covariance matrix, we derived simplified theoretical expressions for the bias, the variance, and the ensemble mean squared error (EMSE) of the ROI quantitation. One problem in evaluating ROI quantitation is that the truth is often required for calculating the bias. This is overcome by using the ensemble distribution of the activity inside the ROI and computing the average EMSE. The resulting expressions allow fast evaluation of the image quality for different smoothing parameters. The optimum smoothing parameter of the image prior can then be selected to minimize the EMSE.

  11. Technical Note: Probabilistically constraining proxy age–depth models within a Bayesian hierarchical reconstruction model

    Directory of Open Access Journals (Sweden)

    J. P. Werner

    2015-03-01

    Reconstructions of the late-Holocene climate rely heavily upon proxies that are assumed to be accurately dated by layer counting, such as measurements of tree rings, ice cores, and varved lake sediments. Considerable advances could be achieved if time-uncertain proxies could be included within these multiproxy reconstructions, and if time uncertainties were recognized and correctly modeled for proxies commonly treated as free of age model errors. Current approaches for accounting for time uncertainty are generally limited to repeating the reconstruction using each one of an ensemble of age models, thereby inflating the final estimated uncertainty – in effect, each possible age model is given equal weighting. Uncertainties can be reduced by exploiting the inferred space–time covariance structure of the climate to re-weight the possible age models. Here, we demonstrate how Bayesian hierarchical climate reconstruction models can be augmented to account for time-uncertain proxies. Critically, although a priori all age models are given equal probability of being correct, the probabilities associated with the age models are formally updated within the Bayesian framework, thereby reducing uncertainties. Numerical experiments show that updating the age model probabilities decreases uncertainty in the resulting reconstructions, as compared with the current de facto standard of sampling over all age models, provided there is sufficient information from other data sources in the spatial region of the time-uncertain proxy. This approach can readily be generalized to non-layer-counted proxies, such as those derived from marine sediments.
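
    The key move, formally updating the probability of each member of an age-model ensemble rather than weighting all members equally, is ordinary Bayesian model averaging. A toy sketch (Gaussian errors, and an invented "independent estimate" standing in for the information contributed by other proxies in the spatial region):

        import numpy as np

        rng = np.random.default_rng(9)
        T, K, sigma = 100, 50, 0.3
        proxy = rng.normal(size=T)                  # layer-counted proxy series
        independent = proxy + rng.normal(scale=sigma, size=T)  # other-data estimate

        shifts = rng.integers(-3, 4, size=K)        # K candidate age models,
        log_like = np.empty(K)                      # here just shifted timescales
        for k in range(K):
            resid = np.roll(proxy, shifts[k]) - independent
            log_like[k] = -0.5 * np.sum(resid**2) / sigma**2

        # A priori equal weights, formally updated through Bayes' rule:
        w = np.exp(log_like - log_like.max())
        w /= w.sum()
        print(shifts[np.argmax(w)])                 # most probable age model (0 here)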

  12. Bayesian Reconstruction of the Velocity Distribution of Weakly Interacting Massive Particles from Direct Dark Matter Detection Data

    CERN Document Server

    Shan, Chung-Lin

    2014-01-01

    In this paper, we extend our earlier work on the reconstruction of the (time-averaged) one-dimensional velocity distribution of Galactic Weakly Interacting Massive Particles (WIMPs) and introduce a Bayesian fitting procedure for the theoretically predicted velocity distribution functions. In this reconstruction process, the (rough) velocity distribution reconstructed directly from raw data of direct Dark Matter detection experiments, i.e. measured recoil energies, with one or more different target materials, is used as "reconstructed-input" information. By assuming a fitting velocity distribution function and scanning the parameter space based on Bayesian analysis, the astronomical characteristic parameters, e.g. the Solar and Earth's orbital velocities, are pinned down as the output results. Our Monte Carlo simulations show that this Bayesian scanning procedure can reconstruct the true (input) WIMP velocity distribution function fairly precisely with negligible systematic deviations ...

  13. A Bayesian Hierarchical Model for Reconstructing Sea Levels: From Raw Data to Rates of Change

    CERN Document Server

    Cahill, Niamh; Horton, Benjamin P; Parnell, Andrew C

    2015-01-01

    We present a holistic Bayesian hierarchical model for reconstructing the continuous and dynamic evolution of relative sea-level (RSL) change with fully quantified uncertainty. The reconstruction is produced from biological (foraminifera) and geochemical (δ13C) sea-level indicators preserved in dated cores of salt-marsh sediment. Our model comprises three modules: (1) a Bayesian transfer function for the calibration of foraminifera into tidal elevation, which is flexible enough to formally accommodate additional proxies (in this case bulk-sediment δ13C values); (2) a chronology developed from an existing Bchron age-depth model; and (3) an existing errors-in-variables integrated Gaussian process (EIV-IGP) model for estimating rates of sea-level change. We illustrate our approach using a case study of Common Era sea-level variability from New Jersey, U.S.A. We develop a new Bayesian transfer function (B-TF), with and without the δ13C proxy, and compare our results to those from a widely...

  14. Bayesian inference of population size history from multiple loci

    Directory of Open Access Journals (Sweden)

    Drummond Alexei J

    2008-10-01

    Background: Effective population size (Ne) is related to genetic variability and is a basic parameter in many models of population genetics. A number of methods for inferring current and past population sizes from genetic data have been developed since J. F. C. Kingman introduced the n-coalescent in 1982. Here we present the Extended Bayesian Skyline Plot, a non-parametric Bayesian Markov chain Monte Carlo algorithm that extends a previous coalescent-based method in several ways, including the ability to analyze multiple loci. Results: Through extensive simulations we show the accuracy and limitations of inferring population size as a function of the amount of data, including recovering information about evolutionary bottlenecks. We also analyzed two real data sets to demonstrate the behavior of the new method: a single-gene Hepatitis C virus data set sampled from Egypt and a 10-locus Drosophila ananassae data set representing 16 different populations. Conclusion: The results demonstrate the essential role of multiple loci in recovering population size dynamics. Multi-locus data from a small number of individuals can precisely recover past bottlenecks in population size which cannot be characterized by analysis of a single locus. We also demonstrate that sequence data quality is important because even moderate levels of sequencing errors result in a considerable decrease in estimation accuracy for realistic levels of population genetic variability.

  15. A hierarchical Bayesian approach for reconstructing the initial mass function of single stellar populations

    Science.gov (United States)

    Dries, M.; Trager, S. C.; Koopmans, L. V. E.

    2016-11-01

    Recent studies based on the integrated light of distant galaxies suggest that the initial mass function (IMF) might not be universal. Variations of the IMF with galaxy type and/or formation time may have important consequences for our understanding of galaxy evolution. We have developed a new stellar population synthesis (SPS) code specifically designed to reconstruct the IMF. We implement a novel approach combining regularization with hierarchical Bayesian inference. Within this approach, we use a parametrized IMF prior to regulate a direct inference of the IMF. This direct inference gives more freedom to the IMF and allows the model to deviate from parametrized models when demanded by the data. We use Markov chain Monte Carlo sampling techniques to reconstruct the best parameters for the IMF prior, the age and the metallicity of a single stellar population. We present our code and apply our model to a number of mock single stellar populations with different ages, metallicities and IMFs. When systematic uncertainties are not significant, we are able to reconstruct the input parameters that were used to create the mock populations. Our results show that if systematic uncertainties do play a role, this may introduce a bias in the results. Therefore, it is important to objectively compare different ingredients of SPS models. Through its Bayesian framework, our model is well suited for this.

  16. Bayesian Multi-Energy Computed Tomography reconstruction approaches based on decomposition models

    International Nuclear Information System (INIS)

    Multi-Energy Computed Tomography (MECT) makes it possible to obtain multiple fractions of basis materials without segmentation. In medical applications, one is the soft-tissue-equivalent water fraction and the other is the hard-matter-equivalent bone fraction. Practical MECT measurements are usually obtained with polychromatic X-ray beams. Existing reconstruction approaches based on linear forward models that do not account for the beam polychromaticity fail to estimate the correct decomposition fractions and result in beam-hardening artifacts (BHA). The existing BHA correction approaches either need to refer to calibration measurements or suffer from the noise amplification caused by the negative-log pre-processing and the water/bone separation problem. To overcome these problems, statistical MECT reconstruction approaches based on non-linear forward models that account for the beam polychromaticity show great potential for giving accurate fraction images. This work proposes a full-spectral Bayesian reconstruction approach which allows the reconstruction of high-quality fraction images from ordinary polychromatic measurements. This approach is based on a Gaussian noise model with unknown variance assigned directly to the projections without taking the negative log. Referring to Bayesian inference, the decomposition fractions and observation variance are estimated by using the joint Maximum A Posteriori (MAP) estimation method. Subject to an adaptive prior model assigned to the variance, the joint estimation problem is then simplified into a single estimation problem. This transforms the joint MAP estimation problem into a minimization problem with a non-quadratic cost function. To solve it, the use of a monotone Conjugate Gradient (CG) algorithm with suboptimal descent steps is proposed. The performances of the proposed approach are analyzed with both simulated and experimental data. The results show that the proposed Bayesian approach is robust to noise and materials. It is also ...

  17. Bayesian three-dimensional reconstruction of toothed whale trajectories: Passive acoustics assisted with visual and tagging measurements

    OpenAIRE

    Laplanche, Christophe

    2012-01-01

    The author describes and evaluates a Bayesian method to reconstruct three-dimensional toothed whale trajectories from a series of echolocation signals. Localization using passive acoustic data (time of arrival of source signals at receptors) is assisted by visual data (coordinates of the whale when diving and resurfacing) and tag information (movement statistics). The efficiency of the Bayesian method is compared to the standard minimum mean squared error st...

  18. MR Image Reconstruction from Undersampled k-Space with Bayesian Dictionary Learning

    CERN Document Server

    Huang, Yue; Chen, Xianbo; Ding, Xinghao; Huang, Feng; Zhang, Xiao-ping

    2013-01-01

    We develop an algorithm for reconstructing magnetic resonance images (MRI) from highly undersampled k-space data. While existing methods focus on either image-level or patch-level sparse regularization strategies, we present a regularization framework that uses both image and patch-level sparsity constraints. The proposed regularization enforces image-level sparsity in terms of spatial finite differences (total variation) and patch-wise sparsity through in situ dictionary learning. We use the beta-Bernoulli process as a Bayesian prior for dictionary learning, which adaptively infers the dictionary size, the sparsity of each patch and the noise parameters. In addition, we employ an efficient numerical algorithm based on the alternating direction method of multipliers (ADMM). We present empirical results on several MR images, which show that the proposed regularization framework can improve reconstruction accuracy over other methods.

  20. Improved reservoir sizing utilizing observed and reconstructed streamflows within a Bayesian combination framework

    Science.gov (United States)

    Patskoski, Jason; Sankarasubramanian, A.

    2015-07-01

    Reservoir sizing is a critical task, as the storage in a reservoir must be sufficient to supply water during extended droughts. Typically, the sequent peak algorithm (SQP) is used with observed streamflow to obtain reservoir storage estimates. To overcome the limited sample length of observed streamflow, synthetic streamflow traces estimated from observed streamflow characteristics are supplied to the SQP to estimate the distribution of storage. However, the parameters of the stochastic streamflow generation model are derived from the observed record and are still unrepresentative of long-term drought records. Paleo-streamflow time series, usually reconstructed using tree-ring chronologies, span a longer period than the observed streamflow and provide additional insight into the preinstrumental drought record. This study investigates the capability of reconstructed streamflow records to reduce the uncertainty in reservoir storage estimation. For this purpose, we propose a Bayesian framework that combines observed and reconstructed streamflow for estimating the parameters of the stochastic streamflow generation model. Utilizing reconstructed streamflow records from two potential stations over the Southeastern U.S., the distribution of storage estimated using the combined streamflows is compared with the distribution of storage estimated using observed streamflow alone, based on split-sample validation. Results show that combining observed and reconstructed streamflow yields stochastic streamflow generation parameters more representative of the longer streamflow record, resulting in improved reservoir storage estimates. We also generalize the findings through a synthetic experiment by generating reconstructed streamflow records of different sample lengths and skill. The analysis shows that uncertainty in storage estimates reduces by incorporating reconstruction records with higher skill and longer sample lengths. Potential applications of the proposed methodology ...
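
    The sequent peak algorithm itself is a short recursion: accumulate the running deficit K_t = max(0, K_{t-1} + demand - inflow_t) and take the largest deficit as the required storage. A sketch under invented assumptions (synthetic lognormal inflows and a constant demand; in the paper's framework this recursion would be run over many traces generated with the combined-parameter stochastic model to build a storage distribution):

        import numpy as np

        def sequent_peak(inflows, demand):
            # Required reservoir storage via the sequent peak recursion.
            k = k_max = 0.0
            for q in inflows:
                k = max(0.0, k + demand - q)   # running deficit
                k_max = max(k_max, k)
            return k_max

        rng = np.random.default_rng(10)
        storages = [sequent_peak(rng.lognormal(0.0, 0.5, 600), demand=1.0)
                    for _ in range(1000)]
        print(np.percentile(storages, 95))     # e.g. a 95th-percentile design storage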

  1. Multi-view TWRI scene reconstruction using a joint Bayesian sparse approximation model

    Science.gov (United States)

    Tang, V. H.; Bouzerdoum, A.; Phung, S. L.; Tivive, F. H. C.

    2015-05-01

    This paper addresses the problem of scene reconstruction in conjunction with wall-clutter mitigation for compressed multi-view through-the-wall radar imaging (TWRI). We consider the problem where the scene behind the wall is illuminated from different vantage points using a different set of frequencies at each antenna. First, a joint Bayesian sparse recovery model is employed to estimate the antenna signal coefficients simultaneously, by exploiting the sparsity and inter-signal correlations among antenna signals. Then, a subspace-projection technique is applied to suppress the signal coefficients related to the wall returns. Furthermore, a multi-task linear model is developed to relate the target coefficients to the image of the scene. The composite image is reconstructed using a joint Bayesian sparse framework, taking into account the inter-view dependencies. Experimental results are presented which demonstrate the effectiveness of the proposed approach for multi-view imaging of indoor scenes using a reduced set of measurements at each view.

  2. Improving the quality of small animal brain pinhole SPECT imaging by Bayesian reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Sohlberg, Antti [Department of Clinical Physiology and Nuclear Medicine, Kuopio University Hospital, P.O. Box 1777, 70211, Kuopio (Finland); Lensu, Sanna [Department of Pharmacology and Toxicology, University of Kuopio, Kuopio (Finland); Department of Environmental Health, National Public Health Institute, Kuopio (Finland); Jolkkonen, Jukka [Department of Neuroscience and Neurology, University of Kuopio, Kuopio (Finland); Tuomisto, Leena [Department of Pharmacology and Toxicology, University of Kuopio, Kuopio (Finland); Ruotsalainen, Ulla [Institute of Signal Processing, DMI, Tampere University of Technology, Tampere (Finland); Kuikka, Jyrki T. [Department of Clinical Physiology and Nuclear Medicine, Kuopio University Hospital, P.O. Box 1777, 70211, Kuopio (Finland); Niuvanniemi Hospital, Kuopio (Finland)

    2004-07-01

    The possibility of using existing hardware makes pinhole single-photon emission computed tomography (SPECT) attractive when pursuing the ultra-high resolution required for small animal brain imaging. Unfortunately, the poor sensitivity and the heavy weight of the collimator hamper the use of pinhole SPECT in animal studies by generating noisy and misaligned projections. To improve the image quality we have developed a new Bayesian reconstruction method, pinhole median root prior (PH-MRP), which prevents the excessive noise accumulation from the projections to the reconstructed image. The PH-MRP algorithm was used to reconstruct data acquired with our small animal rotating device, which was designed to reduce the rotation orbit misalignments. Phantom experiments were performed to test the device and compare the PH-MRP with the conventional Feldkamp-Davis-Kress (FDK) and pinhole ordered subsets maximum likelihood expectation maximisation (PH-OSEM) reconstruction algorithms. The feasibility of the system for small animal brain imaging was studied with Han-Wistar rats injected with 123I-epidepride and 99mTc-hydroxy methylene diphosphonate. Considering all the experiments, no shape distortions due to orbit misalignments were encountered and remarkable improvements in noise characteristics and also in overall image quality were observed when the PH-MRP was applied instead of the FDK or PH-OSEM. In addition, the proposed methods utilise existing hardware and require only a certain amount of construction and programming work, making them easy to implement. (orig.)
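
    The median root prior enters the EM update as a one-step-late penalty pulling each voxel toward its local median, so locally monotonic structures survive while isolated noise spikes are damped. A 2-D sketch of that update (a generic MLEM loop with an MRP term; the pinhole geometry, orbit handling and exact PH-MRP formulation are not reproduced):

        import numpy as np
        from scipy.ndimage import median_filter

        rng = np.random.default_rng(11)
        m, side = 400, 16
        n = side * side
        A = rng.random((m, n)) * 0.05          # toy projector, not a pinhole model
        x_true = rng.gamma(2.0, 1.0, n)
        y = rng.poisson(A @ x_true)            # noisy projections
        beta = 0.3                             # prior weight

        x = np.ones(n)
        sens = A.T @ np.ones(m)
        for _ in range(100):
            em = x * (A.T @ (y / np.clip(A @ x, 1e-12, None))) / sens
            med = median_filter(x.reshape(side, side), size=3).ravel()
            # One-step-late median root prior: penalize deviation of each
            # voxel from the median of its 3x3 neighbourhood.
            x = em / (1.0 + beta * (x - med) / np.clip(med, 1e-12, None))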

  3. Improving the quality of small animal brain pinhole SPECT imaging by Bayesian reconstruction

    International Nuclear Information System (INIS)

    The possibility of using existing hardware makes pinhole single-photon emission computed tomography (SPECT) attractive when pursuing the ultra-high resolution required for small animal brain imaging. Unfortunately, the poor sensitivity and the heavy weight of the collimator hamper the use of pinhole SPECT in animal studies by generating noisy and misaligned projections. To improve the image quality we have developed a new Bayesian reconstruction method, pinhole median root prior (PH-MRP), which prevents the excessive noise accumulation from the projections to the reconstructed image. The PH-MRP algorithm was used to reconstruct data acquired with our small animal rotating device, which was designed to reduce the rotation orbit misalignments. Phantom experiments were performed to test the device and compare the PH-MRP with the conventional Feldkamp-Davis-Kress (FDK) and pinhole ordered subsets maximum likelihood expectation maximisation (PH-OSEM) reconstruction algorithms. The feasibility of the system for small animal brain imaging was studied with Han-Wistar rats injected with 123I-epidepride and 99mTc-hydroxy methylene diphosphonate. Considering all the experiments, no shape distortions due to orbit misalignments were encountered and remarkable improvements in noise characteristics and also in overall image quality were observed when the PH-MRP was applied instead of the FDK or PH-OSEM. In addition, the proposed methods utilise existing hardware and require only a certain amount of construction and programming work, making them easy to implement. (orig.)

  4. Improving the quality of small animal brain pinhole SPECT imaging by Bayesian reconstruction.

    Science.gov (United States)

    Sohlberg, Antti; Lensu, Sanna; Jolkkonen, Jukka; Tuomisto, Leena; Ruotsalainen, Ulla; Kuikka, Jyrki T

    2004-07-01

    The possibility of using existing hardware makes pinhole single-photon emission computed tomography (SPECT) attractive when pursuing the ultra-high resolution required for small animal brain imaging. Unfortunately, the poor sensitivity and the heavy weight of the collimator hamper the use of pinhole SPECT in animal studies by generating noisy and misaligned projections. To improve the image quality we have developed a new Bayesian reconstruction method, pinhole median root prior (PH-MRP), which prevents the excessive noise accumulation from the projections to the reconstructed image. The PH-MRP algorithm was used to reconstruct data acquired with our small animal rotating device, which was designed to reduce the rotation orbit misalignments. Phantom experiments were performed to test the device and compare the PH-MRP with the conventional Feldkamp-Davis-Kress (FDK) and pinhole ordered subsets maximum likelihood expectation maximisation (PH-OSEM) reconstruction algorithms. The feasibility of the system for small animal brain imaging was studied with Han-Wistar rats injected with (123)I-epidepride and (99m)Tc-hydroxy methylene diphosphonate. Considering all the experiments, no shape distortions due to orbit misalignments were encountered and remarkable improvements in noise characteristics and also in overall image quality were observed when the PH-MRP was applied instead of the FDK or PH-OSEM. In addition, the proposed methods utilise existing hardware and require only a certain amount of construction and programming work, making them easy to implement. PMID:14991246

  5. Contaminant source reconstruction by empirical Bayes and Akaike's Bayesian Information Criterion.

    Science.gov (United States)

    Zanini, Andrea; Woodbury, Allan D

    2016-01-01

    The objective of this paper is to present an empirical Bayesian method, combined with Akaike's Bayesian Information Criterion (ABIC), for estimating the contaminant release history of a groundwater source from a few concentration measurements in space and/or time. From the Bayesian point of view, the ABIC considers prior information on the unknown function, such as the prior distribution (assumed Gaussian) and the covariance function. The unknown statistical quantities, such as the noise variance and the covariance function parameters, are computed within the procedure; moreover, the method also quantifies the estimation error through confidence intervals. The methodology was successfully tested on three test cases: the classic Skaggs and Kabala release function; a series of three sharp releases (both of these concern transport in a one-dimensional homogeneous medium); and data collected from a laboratory setup consisting of a two-dimensional homogeneous unconfined aquifer. The performance of the method was tested with two different covariance functions (Gaussian and exponential) and also with large measurement errors. The results obtained were discussed and compared to the geostatistical approach of Kitanidis (1995).
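
    A minimal numpy sketch of the empirical Bayes machinery this record describes: the hyperparameters (noise and prior variances) are chosen by minimising -2 log marginal likelihood on a grid (the ABIC adds a constant term in the number of hyperparameters, so the minimiser is the same), and the release history is then the posterior mean. All shapes, grids, and the white prior are illustrative assumptions:

        import numpy as np

        def neg2_log_marginal(A, y, sigma2, tau2, R):
            # -2 log marginal likelihood (up to a constant) for
            # y = A s + e, e ~ N(0, sigma2 I), s ~ N(0, tau2 R).
            C = sigma2 * np.eye(len(y)) + tau2 * A @ R @ A.T
            sign, logdet = np.linalg.slogdet(C)
            return logdet + y @ np.linalg.solve(C, y)

        def posterior_mean(A, y, sigma2, tau2, R):
            C = sigma2 * np.eye(len(y)) + tau2 * A @ R @ A.T
            return tau2 * R @ A.T @ np.linalg.solve(C, y)

        rng = np.random.default_rng(3)
        A = rng.random((15, 60)) * 0.1            # sparse sampling of the history
        s_true = np.exp(-0.5 * ((np.arange(60) - 30) / 5.0) ** 2)
        y = A @ s_true + 0.01 * rng.normal(size=15)
        R = np.eye(60)                            # white prior (a smoothness matrix also works)
        grid = [(s2, t2) for s2 in (1e-4, 1e-3) for t2 in (0.1, 1.0, 10.0)]
        s2, t2 = min(grid, key=lambda g: neg2_log_marginal(A, y, g[0], g[1], R))
        s_hat = posterior_mean(A, y, s2, t2, R)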

  6. Inferring population history with DIYABC: a user-friendly approach to Approximate Bayesian Computation

    OpenAIRE

    Cornuet, Jean-Marie; Santos, Filipe; Beaumont, Mark A; Robert, Christian P.; Marin, Jean-Michel; Balding, David J.; Guillemaud, Thomas; Estoup, Arnaud

    2008-01-01

    Summary: Genetic data obtained on population samples convey information about their evolutionary history. Inference methods can extract part of this information but they require sophisticated statistical techniques that have been made available to the biologist community (through computer programs) only for simple and standard situations typically involving a small number of samples. We propose here a computer program (DIY ABC) for inference based on approximate Bayesian computation (ABC), in...

  7. Task performance on constrained reconstructions: human observer performance compared with suboptimal Bayesian performance

    Science.gov (United States)

    Wagner, Robert F.; Myers, Kyle J.; Hanson, Kenneth M.

    1992-06-01

    We have previously described how imaging systems and image reconstruction algorithms can be evaluated on the basis of how well binary-discrimination tasks can be performed by a machine algorithm that "views" the reconstructions. Algorithms used in these investigations have been based on approximations to the ideal observer of Bayesian statistical decision theory. The present work examines the performance of an extended family of such algorithmic observers viewing tomographic images reconstructed from a small number of views using the Cambridge Maximum Entropy software, MEMSYS 3. We investigate the effects on the performance of these observers due to varying the parameter α; this parameter controls the stopping point of the iterative reconstruction technique and effectively determines the smoothness of the reconstruction. For the detection task considered here, performance is maximum at the lowest values of α studied; these values are encountered as one moves toward the limit of maximum-likelihood estimation while maintaining the positivity constraint intrinsic to entropic priors. A breakdown in the validity of a Gaussian approximation used by one of the machine algorithms (the posterior probability) was observed in this region. Measurements on human observers performing the same task show that they perform comparably to the best machine observers in the region of highest machine scores, i.e., smallest values of α. For increasing values of α, both human and machine observer performance degrade. The falloff in human performance is more rapid than that of the machine observer at the largest values of α (lowest performance) studied. This behavior is common to all such studies of the so-called psychometric function.
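
    As a sketch of how such machine-observer scores are typically summarised, the detectability index d' of a decision variable can be computed as below and tracked as the smoothing parameter α is varied; the template observer and data here are toy assumptions, not the MEMSYS study itself:

        import numpy as np

        def dprime(t_present, t_absent):
            # Detectability index from the observer's decision variable.
            return (t_present.mean() - t_absent.mean()) / np.sqrt(
                0.5 * (t_present.var(ddof=1) + t_absent.var(ddof=1)))

        rng = np.random.default_rng(4)
        template = np.ones(64)                             # matched-filter observer
        recon_sig = rng.normal(1.0, 1.0, size=(200, 64))   # signal-present recons
        recon_bkg = rng.normal(0.0, 1.0, size=(200, 64))   # signal-absent recons
        print(dprime(recon_sig @ template, recon_bkg @ template))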

  8. Reconstruction of a windborne insect invasion using a particle dispersal model, historical wind data, and Bayesian analysis of genetic data.

    Science.gov (United States)

    Lander, Tonya A; Klein, Etienne K; Oddou-Muratorio, Sylvie; Candau, Jean-Noël; Gidoin, Cindy; Chalon, Alain; Roig, Anne; Fallour, Delphine; Auger-Rozenberg, Marie-Anne; Boivin, Thomas

    2014-12-01

    Understanding how invasive species establish and spread is vital for developing effective management strategies for invaded areas and identifying new areas where the risk of invasion is highest. We investigated the explanatory power of dispersal histories reconstructed based on local-scale wind data and a regional-scale wind-dispersed particle trajectory model for the invasive seed chalcid wasp Megastigmus schimitscheki (Hymenoptera: Torymidae) in France. The explanatory power was tested by: (1) survival analysis of empirical data on M. schimitscheki presence, absence and year of arrival at 52 stands of the wasp's obligate hosts, Cedrus (true cedar trees); and (2) Approximate Bayesian analysis of M. schimitscheki genetic data using a coalescence model. The Bayesian demographic modeling and traditional population genetic analysis suggested that initial invasion across the range was the result of long-distance dispersal from the longest established sites. The survival analyses of the windborne expansion patterns derived from a particle dispersal model indicated that there was an informative correlation between the M. schimitscheki presence/absence data from the annual surveys and the scenarios based on regional-scale wind data. These three very different analyses produced highly congruent results supporting our proposal that wind is the most probable vector for passive long-distance dispersal of this invasive seed wasp. This result confirms that long-distance dispersal from introduction areas is a likely driver of secondary expansion of alien invasive species. Based on our results, management programs for this and other windborne invasive species may consider (1) focusing effort at the longest established sites and (2) continuing to monitor outlying populations, which remain critically important due to their influence on rates of spread. We also suggest that there is a distinct need for new analysis methods that have the capacity to combine empirical spatiotemporal field data
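
    The Approximate Bayesian analysis mentioned here follows the generic rejection-ABC recipe, sketched below with hypothetical stand-ins for the simulator and prior (the actual study used a coalescence model of M. schimitscheki genetic data):

        import numpy as np

        def abc_rejection(obs_stats, simulate, prior_draw, n_draws=20000, eps=0.2):
            # Keep parameter draws whose simulated summary statistics land
            # within eps of the observed ones (basic rejection ABC).
            kept = []
            for _ in range(n_draws):
                theta = prior_draw()
                if np.linalg.norm(simulate(theta) - obs_stats) < eps:
                    kept.append(theta)
            return np.array(kept)     # draws from the approximate posterior

        rng = np.random.default_rng(2)
        obs = np.array([0.3])         # observed summary statistic (toy)
        post = abc_rejection(obs,
                             simulate=lambda th: np.array([rng.normal(th, 1.0, 50).mean()]),
                             prior_draw=lambda: rng.uniform(-5, 5))
        print(post.mean(), post.std())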

  9. Bayesian three-dimensional reconstruction of toothed whale trajectories: passive acoustics assisted with visual and tagging measurements.

    Science.gov (United States)

    Laplanche, Christophe

    2012-11-01

    The author describes and evaluates a Bayesian method to reconstruct three-dimensional toothed whale trajectories from a series of echolocation signals. Localization using passive acoustic data (time of arrival of source signals at receptors) is assisted by visual data (coordinates of the whale when diving and resurfacing) and tag information (movement statistics). The efficiency of the Bayesian method is compared to the standard minimum mean squared error statistical approach by comparing the reconstruction results of 48 simulated sperm whale (Physeter macrocephalus) trajectories. The use of the advanced Bayesian method reduces bias (standard deviation) with respect to the standard method by up to a factor of 8.9 (13.6). The author provides open-source software that works with acoustic data as would be collected in the field from any three-dimensional receptor array design. This approach renews passive acoustics as a valuable tool to study the underwater behavior of toothed whales. PMID:23145606

  10. Sparse Bayesian framework applied to 3D super-resolution reconstruction in fetal brain MRI

    Science.gov (United States)

    Becerra, Laura C.; Velasco Toledo, Nelson; Romero Castro, Eduardo

    2015-01-01

    Fetal Magnetic Resonance (FMR) is an imaging technique of growing importance because it allows assessment of brain development and thus early diagnosis of congenital abnormalities. Spatial resolution is limited by the short acquisition time and unpredictable fetal movements; in consequence, the resulting images are characterized by non-parallel projection planes composed of anisotropic voxels. The sparse Bayesian representation is a flexible strategy able to model complex relationships. Super-resolution is approached as a regression problem, whose main advantage is the capability to learn data relations from observations. Quantitative performance evaluation was carried out using synthetic images; the proposed method demonstrates better reconstruction quality than a standard interpolation approach. The presented method is thus a promising approach to improving the quality of information about the 3-D structure of the fetal brain.
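
    A minimal sketch of sparse Bayesian regression for such a super-resolution step, using scikit-learn's ARD (automatic relevance determination) implementation on made-up patch data; the mapping from low-resolution neighbourhoods to high-resolution voxels is an assumption for illustration:

        import numpy as np
        from sklearn.linear_model import ARDRegression

        rng = np.random.default_rng(3)
        X = rng.normal(size=(500, 27))             # flattened 3x3x3 low-res neighbourhoods
        w_true = np.zeros(27); w_true[13] = 1.0    # centre voxel carries the signal
        y = X @ w_true + 0.05 * rng.normal(size=500)

        model = ARDRegression()                    # sparse Bayesian linear regression
        model.fit(X, y)
        print(model.coef_.round(2))                # irrelevant weights pruned towards zero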

  11. Reconstructing the Star Formation History of the Galaxy

    CERN Document Server

    Hernández, X; Gilmore, G

    2000-01-01

    The evolution of the star formation rate in the Galaxy is one of the key ingredients quantifying the formation and determining the chemical and luminosity evolution of galaxies. Many complementary methods exist to infer the star formation history of the components of the Galaxy, from indirect methods for analysis of low-precision data, to new exact analytic methods for analysis of sufficiently high quality data. We summarise available general constraints on star formation histories, showing that derived star formation rates are in general comparable to those seen today. We then show how colour-magnitude diagrams of volume- and absolute magnitude-limited samples of the solar neighbourhood observed by Hipparcos may be analysed, using variational calculus techniques, to reconstruct the local star formation history. The remarkable accuracy of the data coupled to our maximum-likelihood variational method allows objective quantification of the local star formation history with a time resolution of ~ 50 Myr. Over the past 3 Gyr, ...

  12. A Brief History of Anterior Cruciate Ligament Reconstruction

    Directory of Open Access Journals (Sweden)

    Nikolaos Davarinos

    2014-01-01

    Full Text Available Reconstructions of the anterior cruciate ligament (ACL) are among the most frequently performed procedures in knee surgery nowadays. The history of ACL surgery can be traced as far back as Egyptian times. The early years reflect the efforts to establish a viable, consistently successful reconstruction technique, while during the early 20th century we witness an increasing awareness of, and interest in, the ligament and its lesions. Finally, we highlight the most important steps in the evolution of ACL reconstruction surgery by discussing the various techniques spanning the years, using not only autologous grafts (fascia lata, meniscal, hamstring, patella tendon, bone-patella tendon-bone, and double bundle grafts) but also synthetic ones and allografts.

  13. Reconstructing the history of dark energy using maximum entropy

    Science.gov (United States)

    Zunckel, Caroline; Trotta, Roberto

    2007-09-01

    We present a Bayesian technique based on a maximum-entropy method to reconstruct the dark energy equation of state (EOS) w(z) in a non-parametric way. This Maximum Entropy (MaxEnt) technique allows us to incorporate relevant prior information while adjusting the degree of smoothing of the reconstruction in response to the structure present in the data. After demonstrating the method on synthetic data, we apply it to current cosmological data, separately analysing Type Ia supernova measurements from the HST/GOODS programme and the first-year Supernovae Legacy Survey (SNLS), complemented by cosmic microwave background and baryonic acoustic oscillation data. We find that the SNLS data are compatible with w(z) = -1 at all redshifts 0 ≤ z ≲ 1100, with error bars of order 20 per cent for the most constraining choice of priors and model. The HST/GOODS data exhibit a slight (about 1σ significance) preference for w > -1 at z ~ 0.5 and a drift towards w > -1 at larger redshifts which, however, is not robust with respect to changes in our prior specifications. We employ both a constant EOS prior model and a slowly varying w(z) and find that our conclusions are only mildly dependent on this choice at high redshifts. Our method highlights the danger of employing parametric fits for the unknown EOS, which can potentially miss or underestimate real structure in the data.
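
    The MaxEnt regularisation used here amounts to maximising αS - χ²/2, with S the Shannon-Jaynes entropy relative to a default model m. A minimal sketch for a generic positive function follows (toy operator and data; the paper's actual implementation, priors, and parametrisation of w(z) differ):

        import numpy as np
        from scipy.optimize import minimize

        def maxent_reconstruct(A, d, sigma, m, alpha):
            # Minimise chi^2/2 - alpha*S(f; m), S being the Shannon-Jaynes
            # entropy relative to m; u = log f keeps f positive.
            def objective(u):
                f = np.exp(u)
                chi2 = np.sum(((A @ f - d) / sigma) ** 2)
                S = np.sum(f - m - f * np.log(f / m))
                return 0.5 * chi2 - alpha * S
            res = minimize(objective, np.log(m), method="L-BFGS-B")
            return np.exp(res.x)

        rng = np.random.default_rng(4)
        m = np.ones(40)                            # default model
        A = np.eye(40)[::2]                        # observe every other bin (toy)
        f_true = 1.0 + np.exp(-0.5 * ((np.arange(40) - 20) / 4.0) ** 2)
        d = A @ f_true + 0.01 * rng.normal(size=20)
        print(maxent_reconstruct(A, d, 0.01, m, alpha=1.0).round(2))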

  14. Reconstructing the history of dark energy using maximum entropy

    CERN Document Server

    Zunckel, C

    2007-01-01

    We present a Bayesian technique based on a maximum entropy method to reconstruct the dark energy equation of state $w(z)$ in a non-parametric way. This MaxEnt technique allows us to incorporate relevant prior information while adjusting the degree of smoothing of the reconstruction in response to the structure present in the data. After demonstrating the method on synthetic data, we apply it to current cosmological data, separately analysing type Ia supernovae measurements from the HST/GOODS program and the first year Supernovae Legacy Survey (SNLS), complemented by cosmic microwave background and baryonic acoustic oscillations data. We find that the SNLS data are compatible with $w(z) = -1$ at all redshifts $0 \leq z \lesssim 1100$, with error bars of order 20% for the most constraining choice of priors and model. The HST/GOODS data exhibit a slight (about $1\sigma$ significance) preference for $w>-1$ at $z\sim 0.5$ and a drift towards $w>-1$ at larger redshifts, which however is not robust with respect to changes ...

  15. Bayesian non-negative factor analysis for reconstructing transcription factor mediated regulatory networks

    Directory of Open Access Journals (Sweden)

    Chen Yidong

    2011-10-01

    Full Text Available Abstract Background Transcriptional regulation by transcription factors (TFs) controls the time and abundance of mRNA transcription. Due to the limitation of current proteomics technologies, large-scale measurement of the protein-level activities of TFs is usually infeasible, making computational reconstruction of transcriptional regulatory networks a difficult task. Results We propose here a novel Bayesian non-negative factor model for TF-mediated regulatory networks. In particular, the non-negative TF activities and sample clustering effect are modeled as the factors from a Dirichlet process mixture of rectified Gaussian distributions, and the sparse regulatory coefficients are modeled as the loadings from a sparse distribution whose sparsity is constrained using knowledge from databases; meanwhile, a Gibbs sampling solution was developed to infer the underlying network structure and the unknown TF activities simultaneously. The developed approach has been applied to a simulated system and to breast cancer gene expression data. Results show that the proposed method was able to systematically uncover the TF-mediated transcriptional regulatory network structure, the regulatory coefficients, the TF protein-level activities, and the sample clustering effect. The regulation target predictions are highly consistent with prior knowledge, and the sample clustering results show superior performance over previous molecular-based clustering methods. Conclusions The results demonstrated the validity and effectiveness of the proposed approach in reconstructing transcriptional networks mediated by TFs through simulated systems and real data.

  16. Accounting for variation of substitution rates through time in Bayesian phylogeny reconstruction of Sapotoideae (Sapotaceae).

    Science.gov (United States)

    Smedmark, Jenny E E; Swenson, Ulf; Anderberg, Arne A

    2006-06-01

    We used Bayesian phylogenetic analysis of 5 kb of chloroplast DNA data from 68 Sapotaceae species to clarify phylogenetic relationships within Sapotoideae, one of the two major clades within Sapotaceae. Variation in substitution rates through time was shown to be a very important aspect of molecular evolution for this data set. Relative rates tests indicated that changes in overall rate have taken place in several lineages during the history of the group, and Bayes factors strongly supported a covarion model, which allows the rate of a site to vary over time, over commonly used models that only allow rates to vary across sites. Rate variation over time was actually found to be a more important model component than rate variation across sites. The covarion model was originally developed for coding gene sequences and has so far only been tested for this type of data. The fact that it performed so well with the present data set, consisting mainly of data from noncoding spacer regions, suggests that it deserves wider consideration in model-based phylogenetic inference. Repeatability of phylogenetic results was very difficult to obtain with the more parameter-rich models, and analyses with identical settings often supported different topologies. Overparameterization may be the reason why the MCMC did not sample from the posterior distribution in these cases. The problem could, however, be overcome by using less parameter-rich evolutionary models and adjusting the MCMC settings. The phylogenetic results showed that two taxa, previously thought to belong in Sapotoideae, are not part of this group. Eberhardtia aurata is the sister of the two major Sapotaceae clades, Chrysophylloideae and Sapotoideae, and Neohemsleya usambarensis belongs in Chrysophylloideae. Within Sapotoideae two clades, Sideroxyleae and Sapoteae, were strongly supported. Bayesian analysis of the character history of some floral morphological traits showed that the ancestral type of flower in

  17. Application of Bayesian Neural Networks to Energy Reconstruction in EAS Experiments for ground-based TeV Astrophysics

    CERN Document Server

    Bai, Ying; Lan, JieQin; Gao, WeiWei

    2016-01-01

    A toy detector array has been designed to simulate the detection of cosmic rays in Extensive Air Shower (EAS) experiments for ground-based TeV astrophysics. The primary energies of protons from the Monte Carlo simulation have been reconstructed with the algorithm of Bayesian neural networks (BNNs) and with a standard method like that of the LHAASO experiment, respectively. The result of the energy reconstruction using BNNs has been compared with the one using the standard method. Compared to the standard method, the energy resolutions are significantly improved using BNNs, and the improvement is more pronounced for high-energy protons than for low-energy ones.
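
    A cheap stand-in for the Bayesian-neural-network idea is Monte-Carlo dropout, sketched below on made-up shower observables; the actual study trains genuine BNNs, so this is only an illustrative approximation:

        import torch
        import torch.nn as nn

        # Small regressor with dropout layers kept active at test time.
        net = nn.Sequential(nn.Linear(4, 64), nn.ReLU(), nn.Dropout(0.1),
                            nn.Linear(64, 64), nn.ReLU(), nn.Dropout(0.1),
                            nn.Linear(64, 1))

        x = torch.randn(256, 4)                          # toy shower observables
        y = 2.0 * x[:, :1] + 0.1 * torch.randn(256, 1)   # toy log-energy target
        opt = torch.optim.Adam(net.parameters(), lr=1e-3)
        for _ in range(500):                             # fit the regression
            opt.zero_grad()
            nn.functional.mse_loss(net(x), y).backward()
            opt.step()

        net.train()                          # keep dropout active at inference
        with torch.no_grad():
            preds = torch.stack([net(x[:5]) for _ in range(100)])
        print(preds.mean(0).squeeze(), preds.std(0).squeeze())  # estimate + spread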

  18. A Bayesian approach to PET reconstruction using image-modeling Gibbs priors: Implementation and comparison

    Energy Technology Data Exchange (ETDEWEB)

    Chan, M.T. [Univ. of Southern California, Los Angeles, CA (United States); Herman, G.T. [Univ. of Pennsylvania, Philadelphia, PA (United States); Levitan, E. [Technion, Haifa (Israel)

    1996-12-31

    We demonstrate that (i) classical methods of image reconstruction from projections can be improved upon by considering the output of such a method as a distorted version of the original image and applying a Bayesian approach to estimate from it the original image (based on a model of distortion and on a Gibbs distribution as the prior) and (ii) by selecting an "image-modeling" prior distribution (i.e., one which is such that it is likely that a random sample from it shares important characteristics of the images of the application area) one can improve over another Gibbs prior formulated using only pairwise interactions. We illustrate our approach using simulated Positron Emission Tomography (PET) data from realistic brain phantoms. Since algorithm performance ultimately depends on the diagnostic task being performed, we examine a number of different medically relevant figures of merit to give a fair comparison. Based on a training-and-testing evaluation strategy, we demonstrate that statistically significant improvements can be obtained using the proposed approach.

  19. Gene regulatory network reconstruction by Bayesian integration of prior knowledge and/or different experimental conditions.

    Science.gov (United States)

    Werhli, Adriano V; Husmeier, Dirk

    2008-06-01

    There have been various attempts to improve the reconstruction of gene regulatory networks from microarray data by the systematic integration of biological prior knowledge. Our approach is based on pioneering work by Imoto et al. where the prior knowledge is expressed in terms of energy functions, from which a prior distribution over network structures is obtained in the form of a Gibbs distribution. The hyperparameters of this distribution represent the weights associated with the prior knowledge relative to the data. We have derived and tested a Markov chain Monte Carlo (MCMC) scheme for sampling networks and hyperparameters simultaneously from the posterior distribution, thereby automatically learning how to trade off information from the prior knowledge and the data. We have extended this approach to a Bayesian coupling scheme for learning gene regulatory networks from a combination of related data sets, which were obtained under different experimental conditions and are therefore potentially associated with different active subpathways. The proposed coupling scheme is a compromise between (1) learning networks from the different subsets separately, whereby no information between the different experiments is shared; and (2) learning networks from a monolithic fusion of the individual data sets, which does not provide any mechanism for uncovering differences between the network structures associated with the different experimental conditions. We have assessed the viability of all proposed methods on data related to the Raf signaling pathway, generated both synthetically and in cytometry experiments. PMID:18574862

  20. Benchmarking the Bayesian reconstruction of the non-perturbative heavy $Q\\bar{Q}$ potential

    CERN Document Server

    Burnier, Yannis

    2013-01-01

    The extraction of the finite-temperature heavy quark potential from lattice QCD relies on a spectral analysis of the real-time Wilson loop. Through its position and shape, the lowest-lying spectral peak encodes the real and imaginary parts of this complex potential. We benchmark this extraction strategy using leading-order hard-thermal-loop (HTL) calculations; i.e., we analytically calculate the Wilson loop and determine the corresponding spectrum. By fitting its lowest-lying peak we obtain the real and imaginary parts and confirm that knowledge of the lowest peak alone is sufficient for obtaining the potential. We deploy a novel Bayesian approach to the reconstruction of spectral functions on HTL correlators in Euclidean time and observe how well the known spectral function and values for the real and imaginary parts are reproduced. Finally we apply the method to quenched lattice QCD data and perform an improved estimate of both real and imaginary parts of the non-perturbative heavy $Q\bar{Q}$ potential.

  1. A Bayesian fusion model for space-time reconstruction of finely resolved velocities in turbulent flows from low resolution measurements

    CERN Document Server

    Van Nguyen, Linh; Chainais, Pierre

    2015-01-01

    The study of turbulent flows calls for measurements with high resolution both in space and in time. We propose a new approach to reconstruct High-Temporal-High-Spatial resolution velocity fields by combining two sources of information that are well-resolved either in space or in time: the Low-Temporal-High-Spatial (LTHS) and the High-Temporal-Low-Spatial (HTLS) resolution measurements. In the framework of co-design between sensing and data post-processing, this work extensively investigates a Bayesian reconstruction approach using a simulated database. A Bayesian fusion model is developed to solve the inverse problem of data reconstruction. The model uses a maximum a posteriori (MAP) estimate, which yields the most probable field given the measurements. The direct numerical simulation (DNS) of a wall-bounded turbulent flow at moderate Reynolds number is used to validate and assess the performance of the present approach. Low resolution measurements are subsampled in time and space from the fully resolved data. Reconstructed velocities ar...
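
    Under a linear-Gaussian model, such a MAP fusion estimate has a closed form: stack the two observation operators and add the prior as a regulariser. A minimal sketch with hypothetical downsampling operators:

        import numpy as np

        def map_fusion(Hs, ys, Ht, yt, lam):
            # MAP estimate for x given two linear observations and a
            # smoothness prior: minimise
            # |Hs x - ys|^2 + |Ht x - yt|^2 + lam |D x|^2.
            n = Hs.shape[1]
            D = np.diff(np.eye(n), axis=0)     # first-difference (gradient) prior
            AtA = Hs.T @ Hs + Ht.T @ Ht + lam * D.T @ D
            return np.linalg.solve(AtA, Hs.T @ ys + Ht.T @ yt)

        rng = np.random.default_rng(5)
        n = 50
        x = np.sin(np.linspace(0, 3 * np.pi, n))   # "fully resolved" toy field
        Hs, Ht = np.eye(n)[::5], np.eye(n)[::2]    # two hypothetical subsamplers
        x_hat = map_fusion(Hs, Hs @ x + 0.01 * rng.normal(size=10),
                           Ht, Ht @ x + 0.01 * rng.normal(size=25), lam=1.0)
        print(np.abs(x_hat - x).max())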

  2. Reconstructing the Population Genetic History of the Caribbean

    Science.gov (United States)

    Moreno-Estrada, Andrés; Gravel, Simon; Zakharia, Fouad; McCauley, Jacob L.; Byrnes, Jake K.; Gignoux, Christopher R.; Ortiz-Tello, Patricia A.; Martínez, Ricardo J.; Hedges, Dale J.; Morris, Richard W.; Eng, Celeste; Sandoval, Karla; Acevedo-Acevedo, Suehelay; Norman, Paul J.; Layrisse, Zulay; Parham, Peter; Martínez-Cruzado, Juan Carlos; Burchard, Esteban González; Cuccaro, Michael L.; Martin, Eden R.; Bustamante, Carlos D.

    2013-01-01

    The Caribbean basin is home to some of the most complex interactions in recent history among previously diverged human populations. Here, we investigate the population genetic history of this region by characterizing patterns of genome-wide variation among 330 individuals from three of the Greater Antilles (Cuba, Puerto Rico, Hispaniola), two mainland (Honduras, Colombia), and three Native South American (Yukpa, Bari, and Warao) populations. We combine these data with a unique database of genomic variation in over 3,000 individuals from diverse European, African, and Native American populations. We use local ancestry inference and tract length distributions to test different demographic scenarios for the pre- and post-colonial history of the region. We develop a novel ancestry-specific PCA (ASPCA) method to reconstruct the sub-continental origin of Native American, European, and African haplotypes from admixed genomes. We find that the most likely source of the indigenous ancestry in Caribbean islanders is a Native South American component shared among inland Amazonian tribes, Central America, and the Yucatan peninsula, suggesting extensive gene flow across the Caribbean in pre-Columbian times. We find evidence of two pulses of African migration. The first pulse—which today is reflected by shorter, older ancestry tracts—consists of a genetic component more similar to coastal West African regions involved in early stages of the trans-Atlantic slave trade. The second pulse—reflected by longer, younger tracts—is more similar to present-day West-Central African populations, supporting historical records of later transatlantic deportation. Surprisingly, we also identify a Latino-specific European component that has significantly diverged from its parental Iberian source populations, presumably as a result of small European founder population size. We demonstrate that the ancestral components in admixed genomes can be traced back to distinct sub

  3. A full-spectral Bayesian reconstruction approach based on the material decomposition model applied in dual-energy computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Cai, C. [CEA, LIST, 91191 Gif-sur-Yvette, France and CNRS, SUPELEC, UNIV PARIS SUD, L2S, 3 rue Joliot-Curie, 91192 Gif-sur-Yvette (France); Rodet, T.; Mohammad-Djafari, A. [CNRS, SUPELEC, UNIV PARIS SUD, L2S, 3 rue Joliot-Curie, 91192 Gif-sur-Yvette (France); Legoupil, S. [CEA, LIST, 91191 Gif-sur-Yvette (France)

    2013-11-15

    Purpose: Dual-energy computed tomography (DECT) makes it possible to obtain two fractions of basis materials without segmentation: one is the soft-tissue-equivalent water fraction and the other is the hard-matter-equivalent bone fraction. Practical DECT measurements are usually obtained with polychromatic x-ray beams. Existing reconstruction approaches based on linear forward models that do not account for the beam polychromaticity fail to estimate the correct decomposition fractions and result in beam-hardening artifacts (BHA). The existing BHA correction approaches either need to refer to calibration measurements or suffer from the noise amplification caused by the negative-log preprocessing and the ill-conditioned water and bone separation problem. To overcome these problems, statistical DECT reconstruction approaches based on nonlinear forward models that account for the beam polychromaticity show great potential for giving accurate fraction images. Methods: This work proposes a full-spectral Bayesian reconstruction approach which allows the reconstruction of high quality fraction images from ordinary polychromatic measurements. This approach is based on a Gaussian noise model with unknown variance assigned directly to the projections without taking the negative log. Referring to Bayesian inference, the decomposition fractions and observation variance are estimated by using the joint maximum a posteriori (MAP) estimation method. Subject to an adaptive prior model assigned to the variance, the joint estimation problem is then simplified into a single estimation problem. It transforms the joint MAP estimation problem into a minimization problem with a nonquadratic cost function. To solve it, the use of a monotone conjugate gradient algorithm with suboptimal descent steps is proposed. Results: The performance of the proposed approach is analyzed with both simulated and experimental data. The results show that the proposed Bayesian approach is robust to noise and materials. It is also

  4. Application of Bayesian neural networks to energy reconstruction in EAS experiments for ground-based TeV astrophysics

    Science.gov (United States)

    Bai, Y.; Xu, Y.; Pan, J.; Lan, J. Q.; Gao, W. W.

    2016-07-01

    A toy detector array is designed to detect a shower generated by the interaction between a TeV cosmic ray and the atmosphere. In the present paper, the primary energies of showers detected by the detector array are reconstructed with the algorithm of Bayesian neural networks (BNNs) and with a standard method like that of the LHAASO experiment [1], respectively. Compared to the standard method, the energy resolutions are significantly improved using the BNNs, and the improvement is more pronounced for high-energy showers than for low-energy ones.

  5. Typical reconstruction performance for distributed compressed sensing based on ℓ2,1-norm regularized least square and Bayesian optimal reconstruction: influences of noise

    Science.gov (United States)

    Shiraki, Yoshifumi; Kabashima, Yoshiyuki

    2016-06-01

    A signal model called joint sparse model 2 (JSM-2), or the multiple measurement vector problem, in which all sparse signals share their support, is important for dealing with practical signal processing problems. In this paper, we investigate the typical reconstruction performance of noisy measurement JSM-2 problems for ℓ2,1-norm regularized least square reconstruction and the Bayesian optimal reconstruction scheme in terms of mean square error. Employing the replica method, we show that these schemes, which exploit the knowledge of the sharing of the signal support, can recover the signals more precisely as the number of channels increases. In addition, we compare the reconstruction performance of two different ensembles of observation matrices: one is composed of independent and identically distributed random Gaussian entries and the other is designed so that row vectors are orthogonal to one another. As reported for the single-channel case in earlier studies, our analysis indicates that the latter ensemble offers better performance than the former for the noisy JSM-2 problem. The results of numerical experiments with a computationally feasible approximation algorithm we developed for this study agree with the theoretical estimation.
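
    The ℓ2,1-regularised least-square scheme analysed here can be solved with a simple proximal-gradient (ISTA-style) iteration, sketched below; the row-wise soft threshold is what couples the channels and enforces the shared support (toy sizes and parameters assumed):

        import numpy as np

        def l21_mmv(A, Y, lam, n_iter=300):
            # Proximal gradient for
            # min_X 0.5*|AX - Y|_F^2 + lam * sum_i |row_i(X)|_2.
            L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
            X = np.zeros((A.shape[1], Y.shape[1]))
            for _ in range(n_iter):
                G = X - (A.T @ (A @ X - Y)) / L    # gradient step
                norms = np.clip(np.linalg.norm(G, axis=1, keepdims=True), 1e-12, None)
                X = np.maximum(1 - lam / (L * norms), 0) * G   # row soft threshold
            return X

        rng = np.random.default_rng(6)
        A = rng.normal(size=(40, 100)) / np.sqrt(40)
        X0 = np.zeros((100, 4))
        X0[rng.choice(100, 8, replace=False)] = rng.normal(size=(8, 4))
        Y = A @ X0 + 0.01 * rng.normal(size=(40, 4))
        print(np.abs(l21_mmv(A, Y, lam=0.05) - X0).max())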

  6. Simultaneous EEG Source and Forward Model Reconstruction (SOFOMORE) using a Hierarchical Bayesian Approach

    DEFF Research Database (Denmark)

    Stahlhut, Carsten; Mørup, Morten; Winther, Ole;

    2011-01-01

    We present an approach to handle forward model uncertainty for EEG source reconstruction. A stochastic forward model representation is motivated by the many random contributions to the path from sources to measurements, including the tissue conductivity distribution and the geometry of the cortical surface. … Analyses of simulated and real EEG data provide evidence that reconstruction of the forward model leads to improved source estimates.

  7. Avoiding spurious feedback loops in the reconstruction of gene regulatory networks with dynamic bayesian networks

    OpenAIRE

    Grzegorczyk, M.; Husmeier, D.

    2009-01-01

    Feedback loops and recurrent structures are essential to the regulation and stable control of complex biological systems. The application of dynamic as opposed to static Bayesian networks is promising in that, in principle, these feedback loops can be learned. However, we show that the widely applied BGe score is susceptible to learning spurious feedback loops, which are a consequence of non-linear regulation and autocorrelation in the data. We propose a non-linear generalisation of the BGe m...

  8. Awakening the BALROG (BAyesian Location Reconstruction Of GRBs): A new paradigm in spectral and location analysis of gamma ray bursts

    CERN Document Server

    Burgess, J Michael; Greiner, Jochen; Mortlock, Daniel J

    2016-01-01

    The accurate spatial location of gamma-ray bursts (GRBs) is crucial for both producing a detector response matrix (DRM) and follow-up observations by other instruments. The Fermi Gamma-ray Burst Monitor (GBM) has the largest field of view (FOV) for detecting GRBs as it views the entire unocculted sky, but as a non-imaging instrument it relies on the relative count rates observed in each of its 14 detectors to localize transients. Improving its ability to accurately locate GRBs and other transients is vital to the paradigm of multi-messenger astronomy, including the electromagnetic follow-up of gravitational wave signals. Here we present the BAyesian Location Reconstruction Of GRBs (BALROG) method for localizing and characterising GBM transients. Our approach eliminates the systematics of previous approaches by simultaneously fitting for the location and spectrum of a source. It also correctly incorporates the uncertainties in the location of a transient into the spectral parameters and produces reliable...

  9. Strontium Isotopes and the Reconstruction of the Chaco Regional System: Evaluating Uncertainty with Bayesian Mixing Models

    Science.gov (United States)

    Drake, Brandon Lee; Wills, Wirt H.; Hamilton, Marian I.; Dorshow, Wetherbee

    2014-01-01

    Strontium isotope sourcing has become a common and useful method for assigning sources to archaeological artifacts. In Chaco Canyon, an Ancestral Pueblo regional center in New Mexico, previous studies using these methods have suggested that a significant portion of maize and wood originated in the Chuska Mountains region, 75 km to the west. In the present manuscript, these results were tested using both frequentist methods (to determine if geochemical sources can truly be differentiated) and Bayesian methods (to address uncertainty in geochemical source attribution). It was found that Chaco Canyon and the Chuska Mountain region are not easily distinguishable based on radiogenic strontium isotope values. The strontium profiles of many geochemical sources in the region overlap, making it difficult to definitively identify any one particular geochemical source for the canyon's pre-historic maize. Bayesian mixing models support the argument that some spruce and fir wood originated in the San Mateo Mountains, but that this cannot explain all 87Sr/86Sr values in Chaco timber. Overall, radiogenic strontium isotope data do not clearly identify a single major geochemical source for maize, ponderosa, and most spruce/fir timber. As such, the degree to which Chaco Canyon relied upon outside support for both food and construction material is still ambiguous. PMID:24854352
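
    A toy version of the Bayesian mixing logic, with made-up 87Sr/86Sr source distributions, shows how overlapping sources translate into wide posterior intervals on the mixing fraction (a random-walk Metropolis sketch, not the study's actual model):

        import numpy as np

        rng = np.random.default_rng(7)
        # Hypothetical 87Sr/86Sr source distributions (mean, sd) and one sample.
        chaco, chuska = (0.7090, 0.0004), (0.7086, 0.0004)
        sample, sd_meas = 0.7088, 0.0002

        def loglik(f):                        # f: fraction from the Chuska source
            mu = f * chuska[0] + (1 - f) * chaco[0]
            var = (f * chuska[1]) ** 2 + ((1 - f) * chaco[1]) ** 2 + sd_meas ** 2
            return -0.5 * ((sample - mu) ** 2 / var + np.log(var))

        f, chain = 0.5, []
        for _ in range(20000):                # random-walk Metropolis on [0, 1]
            prop = abs(f + 0.05 * rng.normal())
            if prop > 1.0:
                prop = 2.0 - prop             # reflect at the boundaries
            if np.log(rng.uniform()) < loglik(prop) - loglik(f):
                f = prop
            chain.append(f)
        print(np.mean(chain), np.percentile(chain, [2.5, 97.5]))  # wide interval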

  10. Strontium isotopes and the reconstruction of the Chaco regional system: evaluating uncertainty with Bayesian mixing models.

    Directory of Open Access Journals (Sweden)

    Brandon Lee Drake

    Full Text Available Strontium isotope sourcing has become a common and useful method for assigning sources to archaeological artifacts. In Chaco Canyon, an Ancestral Pueblo regional center in New Mexico, previous studies using these methods have suggested that a significant portion of maize and wood originated in the Chuska Mountains region, 75 km to the west [corrected]. In the present manuscript, these results were tested using both frequentist methods (to determine if geochemical sources can truly be differentiated) and Bayesian methods (to address uncertainty in geochemical source attribution). It was found that Chaco Canyon and the Chuska Mountain region are not easily distinguishable based on radiogenic strontium isotope values. The strontium profiles of many geochemical sources in the region overlap, making it difficult to definitively identify any one particular geochemical source for the canyon's pre-historic maize. Bayesian mixing models support the argument that some spruce and fir wood originated in the San Mateo Mountains, but that this cannot explain all 87Sr/86Sr values in Chaco timber. Overall, radiogenic strontium isotope data do not clearly identify a single major geochemical source for maize, ponderosa, and most spruce/fir timber. As such, the degree to which Chaco Canyon relied upon outside support for both food and construction material is still ambiguous.

  11. Reconstruction of large-scale gene regulatory networks using Bayesian model averaging.

    Science.gov (United States)

    Kim, Haseong; Gelenbe, Erol

    2012-09-01

    Gene regulatory networks provide a systematic view of molecular interactions in a complex living system. However, constructing large-scale gene regulatory networks is one of the most challenging problems in systems biology, and the large bursts of biological data being generated require proper integration techniques for reliable network construction. Here we present a new reverse engineering approach based on Bayesian model averaging which attempts to combine all the appropriate models describing interactions among genes. This Bayesian approach with a prior based on the Gibbs distribution provides an efficient means to integrate multiple sources of biological data. In a simulation study with a maximum of 2000 genes, our method shows better sensitivity than previous elastic-net and Gaussian graphical models, at a fixed specificity of 0.99. The study also shows that the proposed method outperforms the other standard methods on a DREAM dataset generated by nonlinear stochastic models. In brain tumor data analysis, three large-scale networks consisting of 4422 genes were built using the gene expression of non-tumor, low-grade, and high-grade tumor mRNA expression samples, along with DNA-protein binding affinity information. We found that the genes whose degree distribution varies most among the three tumor networks are the ones most involved in regulatory and developmental processes, which possibly gives a novel insight beyond conventional differentially expressed gene analysis. PMID:22987132
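
    A small-scale sketch of Bayesian model averaging for a single target gene: score every candidate regulator subset, convert scores to posterior model weights, and average the edge indicators. BIC is used below as a stand-in for the paper's Gibbs-prior marginal likelihood, and all data are synthetic:

        import itertools
        import numpy as np

        def bma_edge_probs(X, y, max_parents=2):
            # Enumerate regulator subsets, weight each model by
            # exp(-BIC/2), and average the edge indicators.
            n, p = X.shape
            models, bics = [], []
            for k in range(max_parents + 1):
                for S in itertools.combinations(range(p), k):
                    Xs = np.column_stack([np.ones(n)] + [X[:, j] for j in S])
                    beta, *_ = np.linalg.lstsq(Xs, y, rcond=None)
                    rss = np.sum((y - Xs @ beta) ** 2)
                    bics.append(n * np.log(rss / n) + Xs.shape[1] * np.log(n))
                    models.append(S)
            w = np.exp(-0.5 * (np.array(bics) - min(bics)))
            w /= w.sum()
            return np.array([sum(w[i] for i, S in enumerate(models) if j in S)
                             for j in range(p)])   # P(edge j -> target | data)

        rng = np.random.default_rng(8)
        X = rng.normal(size=(100, 6))              # candidate regulator expression
        y = 0.8 * X[:, 2] - 0.5 * X[:, 4] + 0.1 * rng.normal(size=100)
        print(bma_edge_probs(X, y).round(2))       # high mass on regulators 2 and 4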

  12. Reconstructing the History of Music Education from a Feminist Perspective.

    Science.gov (United States)

    Howe, Sondra Wieland

    1998-01-01

    Argues that there should be a broader definition of the history of music education by challenging the traditional focus of music education history. Believes that there are four alternative perspectives to the canon in music education: (1) African-Americans; (2) the female experience; (3) African-American women; and (4) music in oral traditions.…

  13. Hierarchical Bayesian Model for Simultaneous EEG Source and Forward Model Reconstruction (SOFOMORE)

    DEFF Research Database (Denmark)

    Stahlhut, Carsten; Mørup, Morten; Winther, Ole;

    2009-01-01

    In this paper we propose an approach to handle forward model uncertainty for EEG source reconstruction. A stochastic forward model is motivated by the many uncertain contributions that form the forward propagation model, including the tissue conductivity distribution, the cortical surface, and electrode positions. … Analyses of simulated and real EEG data demonstrate that invoking a stochastic forward model leads to improved source estimates.

  14. WHOOMP! (There It Is) Rapid Bayesian position reconstruction for gravitational-wave transients

    CERN Document Server

    Singer, Leo P

    2015-01-01

    Within the next few years, Advanced LIGO and Virgo should detect gravitational waves (GWs) from binary neutron star and neutron star-black hole mergers. These sources are also predicted to power a broad array of electromagnetic transients. Because the X-ray and optical signatures can be faint and fade rapidly, observing them hinges on rapidly inferring the sky location from the gravitational wave observations. Markov chain Monte Carlo (MCMC) methods for gravitational-wave parameter estimation can take hours or more. We introduce BAYESTAR, a rapid, Bayesian, non-MCMC sky localization algorithm that takes just seconds to produce probability sky maps that are comparable in accuracy to the full analysis. Prompt localizations from BAYESTAR will make it possible to search for the electromagnetic counterparts of compact binary mergers.

  15. The confounding effect of population structure on bayesian skyline plot inferences of demographic history

    DEFF Research Database (Denmark)

    Heller, Rasmus; Chikhi, Lounes; Siegismund, Hans

    2013-01-01

    … when it is violated. Among the most widely applied demographic inference methods are Bayesian skyline plots (BSPs), which are used across a range of biological fields. Violations of the panmixia assumption are to be expected in many biological systems, but the consequences for skyline plot inferences …

  16. MAGIC: Exact Bayesian Covariance Estimation and Signal Reconstruction for Gaussian Random Fields

    OpenAIRE

    Wandelt, Benjamin D.

    2004-01-01

    In this talk I describe MAGIC, an efficient approach to covariance estimation and signal reconstruction for Gaussian random fields (MAGIC Allows Global Inference of Covariance). It solves a long-standing problem in the field of cosmic microwave background (CMB) data analysis but is in fact a general technique that can be applied to noisy, contaminated and incomplete or censored measurements of either spatial or temporal Gaussian random fields. In this talk I will phrase the method in a way th...

  17. A Bayesian approach for suppression of limited angular sampling artifacts in single particle 3D reconstruction.

    Science.gov (United States)

    Moriya, Toshio; Acar, Erman; Cheng, R Holland; Ruotsalainen, Ulla

    2015-09-01

    In single particle reconstruction, the initial 3D structure often suffers from the limited-angular-sampling artifact. Selecting 2D class averages of particle images generally improves the accuracy and efficiency of the reference-free 3D angle estimation, but causes insufficient angular sampling to fill in the information of the target object in the 3D frequency space. Similarly, the initial 3D structure from random-conical tilt reconstruction has the well-known "missing cone" artifact. Here, we attempted to solve the limited angular sampling problem by sequentially applying a maximum a posteriori estimate with an expectation maximization algorithm (sMAP-EM). Using both simulated and experimental cryo-electron microscope images, the sMAP-EM was compared to the direct Fourier method on the basis of reconstruction error and resolution. To establish selection criteria for the final regularization weight of the sMAP-EM, the effects of noise level and sampling sparseness on the reconstructions were examined with evenly distributed sampling simulations. The frequency information filled in the missing cone of the conical tilt sampling simulations was assessed by developing new quantitative measurements. All the results of visual and numerical evaluations showed that the sMAP-EM performed better than the direct Fourier method, regardless of the sampling method, noise level, and sampling sparseness. Furthermore, the frequency domain analysis demonstrated that the sMAP-EM can fill in meaningful information in the unmeasured angular space without detailed a priori knowledge of the objects. The current research demonstrated that the sMAP-EM has a high potential to facilitate the determination of 3D protein structures at near-atomic resolution. PMID:26193484

  18. Reconstructing constructivism: causal models, Bayesian learning mechanisms, and the theory theory.

    Science.gov (United States)

    Gopnik, Alison; Wellman, Henry M

    2012-11-01

    We propose a new version of the "theory theory" grounded in the computational framework of probabilistic causal models and Bayesian learning. Probabilistic models allow a constructivist but rigorous and detailed approach to cognitive development. They also explain the learning of both more specific causal hypotheses and more abstract framework theories. We outline the new theoretical ideas, explain the computational framework in an intuitive and nontechnical way, and review an extensive but relatively recent body of empirical results that supports these ideas. These include new studies of the mechanisms of learning. Children infer causal structure from statistical information, through their own actions on the world and through observations of the actions of others. Studies demonstrate these learning mechanisms in children from 16 months to 4 years old and include research on causal statistical learning, informal experimentation through play, and imitation and informal pedagogy. They also include studies of the variability and progressive character of intuitive theory change, particularly theory of mind. These studies investigate both the physical and the psychological and social domains. We conclude with suggestions for further collaborative projects between developmental and computational cognitive scientists.

  19. Bayesian approach to spectral function reconstruction for Euclidean quantum field theories.

    Science.gov (United States)

    Burnier, Yannis; Rothkopf, Alexander

    2013-11-01

    We present a novel approach to the inference of spectral functions from Euclidean time correlator data that makes close contact with modern Bayesian concepts. Our method differs significantly from the maximum entropy method (MEM). A new set of axioms is postulated for the prior probability, leading to an improved expression, which is devoid of the asymptotically flat directions present in the Shannon-Jaynes entropy. Hyperparameters are integrated out explicitly, liberating us from the Gaussian approximations underlying the evidence approach of the maximum entropy method. We present a realistic test of our method in the context of the nonperturbative extraction of the heavy quark potential. Based on hard-thermal-loop correlator mock data, we establish firm requirements on the number of data points and their accuracy for a successful extraction of the potential from lattice QCD. Finally we reinvestigate quenched lattice QCD correlators from a previous study and provide an improved potential estimation at T = 2.33 T_C. PMID:24237510

  20. A Bayesian semiparametric approach for incorporating longitudinal information on exposure history for inference in case-control studies.

    Science.gov (United States)

    Bhadra, Dhiman; Daniels, Michael J; Kim, Sungduk; Ghosh, Malay; Mukherjee, Bhramar

    2012-06-01

    In a typical case-control study, exposure information is collected at a single time point for the cases and controls. However, case-control studies are often embedded in existing cohort studies containing a wealth of longitudinal exposure history about the participants. Recent medical studies have indicated that incorporating past exposure history, or a constructed summary measure of cumulative exposure derived from the past exposure history, when available, may lead to more precise and clinically meaningful estimates of the disease risk. In this article, we propose a flexible Bayesian semiparametric approach to model the longitudinal exposure profiles of the cases and controls and then use measures of cumulative exposure based on a weighted integral of this trajectory in the final disease risk model. The estimation is done via a joint likelihood. In the construction of the cumulative exposure summary, we introduce an influence function, a smooth function of time to characterize the association pattern of the exposure profile on the disease status with different time windows potentially having differential influence/weights. This enables us to analyze how the present disease status of a subject is influenced by his/her past exposure history conditional on the current ones. The joint likelihood formulation allows us to properly account for uncertainties associated with both stages of the estimation process in an integrated manner. Analysis is carried out in a hierarchical Bayesian framework using reversible jump Markov chain Monte Carlo algorithms. The proposed methodology is motivated by, and applied to a case-control study of prostate cancer where longitudinal biomarker information is available for the cases and controls. PMID:22313248

  1. Education in Somalia: History, Destruction, and Calls for Reconstruction.

    Science.gov (United States)

    Abdi, Ali A.

    1998-01-01

    Traces the history of education in Somalia: in precolonial traditional Somalia; during colonial rule by Italy; under civilian rule, 1960-69; and under military rule, 1969-90. Describes the total destruction of the education system since the 1991 collapse of the state, widespread illiteracy and adolescent involvement in thuggery, and the urgent…

  2. Reconstructing a School's Past Using Oral Histories and GIS Mapping.

    Science.gov (United States)

    Alibrandi, Marsha; Beal, Candy; Thompson, Ann; Wilson, Anna

    2000-01-01

    Describes an interdisciplinary project that incorporated language arts, social studies, instructional technology, and science where middle school students were involved in oral history, Geographic Information System (GIS) mapping, architectural research, the science of dendrochronology, and the creation of an archival school Web site. (CMK)

  3. Holocene fire history reconstruction using Tibetan lacustrine sediments

    Science.gov (United States)

    Callegaro, Alice; Kirchgeorg, Torben; Battistel, Dario; Bird, Broxton; Barbante, Carlo

    2016-04-01

    The important role that biomass burning plays in influencing Holocene climate is still under discussion. The present work gives information about past biomass burning events on the Tibetan Plateau and helps to increase understanding of the interaction between climate, humans, and fire activity during the Holocene. The Asian region is one of the centers of the advent of agriculture and pastoralism, and it is a strategic area for understanding the interaction between humans and fire during the Holocene. We reconstructed past biomass burning events and vegetation from sediments collected from lake Paru Co, a small moraine-dammed lake located on the Tibetan Plateau at 4845 m above sea level. We extracted lake sediment samples by accelerated solvent extraction and analysed different organic molecular proxies by GC-MS and IC-MS. We used monosaccharide anhydrides, levoglucosan and its isomers, as proxies for biomass burning. These are specific molecular markers originating from the pyrolysis of cellulose; they flag significant fire events and indicate changes in burned fuel. Furthermore, we analysed polycyclic aromatic hydrocarbons (PAHs) as additional combustion proxies. For a better understanding of changes in vegetation and of human habitation at the lake shore, we analysed n-alkanes and sterols. Comparing the data of this multi-proxy approach with climatic and meteorological literature data, reconstruction and contextualization of past fire events are possible: we see agreement between dry climate periods and the presence of more intense fire events, especially in the Early Holocene.

  4. Free Radicals in Organic Matter for Thermal History Reconstruction of Carbonate Succession

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Geothermometry is one of the most useful methods for reconstructing the thermal history of sedimentary basins. This paper introduces the application of the free-radical concentration of organic matter as a thermal indicator in the thermal history reconstruction of carbonate successions, based on anhydrous thermal simulation results for type Ⅰ and Ⅱ1 kerogen. A series of free-radical data were obtained under thermal simulation at different heating temperatures and times, and quantitative models between the free-radical concentration (Ng) of organic matter and the time-temperature index (TTI) were also obtained for type Ⅰ and type Ⅱ1 kerogen. This Ng-TTI relation was used to model the Ordovician thermal gradients of Well TZ12 in the Tarim Basin. The modeling result is consistent with results obtained from apatite fission track data and with published data. This new method of thermal history reconstruction will benefit hydrocarbon generation and accumulation studies and resource assessment of carbonate successions.
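
    For context, the TTI referred to here is Lopatin's time-temperature index, in which the maturation rate doubles for every 10 °C of burial temperature; a continuous variant is sketched below with a hypothetical burial history, the output of which would be fed into an Ng(TTI) calibration such as the one this record derives:

        def tti(segments):
            # segments: list of (duration_Myr, temperature_degC) burial steps.
            # Continuous variant of Lopatin's rule: the rate doubles every
            # 10 degC, referenced to the 100-110 degC interval.
            return sum(dt * 2.0 ** ((T - 105.0) / 10.0) for dt, T in segments)

        history = [(50, 60), (80, 95), (40, 120)]   # hypothetical burial history
        print(tti(history))                          # feed into an Ng(TTI) calibration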

  5. Diet reconstruction and resource partitioning of a Caribbean marine mesopredator using stable isotope bayesian modelling.

    Directory of Open Access Journals (Sweden)

    Alexander Tilley

    Full Text Available The trophic ecology of epibenthic mesopredators is not well understood in terms of prey partitioning with sympatric elasmobranchs or their effects on prey communities, yet the importance of omnivores in community trophic dynamics is being increasingly realised. This study used stable isotope analysis of (15)N and (13)C to model diet composition of wild southern stingrays Dasyatis americana and compare trophic niche space to nurse sharks Ginglymostoma cirratum and Caribbean reef sharks Carcharhinus perezi on Glovers Reef Atoll, Belize. Bayesian stable isotope mixing models were used to investigate prey choice as well as viable Diet-Tissue Discrimination Factors for use with stingrays. Stingray δ(15)N values showed the greatest variation and a positive relationship with size, with an isotopic niche width approximately twice that of sympatric species. Shark species exhibited comparatively restricted δ(15)N values and greater δ(13)C variation, with very little overlap of stingray niche space. Mixing models suggest bivalves and annelids are proportionally more important prey in the stingray diet than crustaceans and teleosts at Glovers Reef, in contrast to all but one published diet study using stomach contents from other locations. Incorporating gut contents information from the literature, we suggest diet-tissue discrimination factor values of Δ(15)N ≈ 2.7‰ and Δ(13)C ≈ 0.9‰ for stingrays in the absence of validation experiments. The wide trophic niche and lower trophic level exhibited by stingrays compared to sympatric sharks supports their putative role as important base stabilisers in benthic systems, with the potential to absorb trophic perturbations through numerous opportunistic prey interactions.

  6. Diet reconstruction and resource partitioning of a Caribbean marine mesopredator using stable isotope bayesian modelling.

    Science.gov (United States)

    Tilley, Alexander; López-Angarita, Juliana; Turner, John R

    2013-01-01

    The trophic ecology of epibenthic mesopredators is not well understood in terms of prey partitioning with sympatric elasmobranchs or their effects on prey communities, yet the importance of omnivores in community trophic dynamics is being increasingly realised. This study used stable isotope analysis of (15)N and (13)C to model diet composition of wild southern stingrays Dasyatis americana and compare trophic niche space to nurse sharks Ginglymostoma cirratum and Caribbean reef sharks Carcharhinus perezi on Glovers Reef Atoll, Belize. Bayesian stable isotope mixing models were used to investigate prey choice as well as viable Diet-Tissue Discrimination Factors for use with stingrays. Stingray δ(15)N values showed the greatest variation and a positive relationship with size, with an isotopic niche width approximately twice that of sympatric species. Shark species exhibited comparatively restricted δ(15)N values and greater δ(13)C variation, with very little overlap of stingray niche space. Mixing models suggest bivalves and annelids are proportionally more important prey in the stingray diet than crustaceans and teleosts at Glovers Reef, in contrast to all but one published diet study using stomach contents from other locations. Incorporating gut contents information from the literature, we suggest diet-tissue discrimination factor values of Δ(15)N ≈ 2.7‰ and Δ(13)C ≈ 0.9‰ for stingrays in the absence of validation experiments. The wide trophic niche and lower trophic level exhibited by stingrays compared to sympatric sharks supports their putative role as important base stabilisers in benthic systems, with the potential to absorb trophic perturbations through numerous opportunistic prey interactions. PMID:24236144

  7. A Comparison of Spatio-Temporal Bayesian Models for Reconstruction of Rainfall Fields in a Cloud Seeding Experiment

    Directory of Open Access Journals (Sweden)

    Sujit K. Sahu

    2005-01-01

    Full Text Available In response to the drought experienced in Southern Italy, a rain seeding project was set up and developed during the years 1989-1994. The initiative was taken with the purpose of applying existing methods of rain enhancement technology to regions of southern Italy, including Puglia. The aim of this study is to provide statistical support for the evaluation of the experimental part of the project. In particular, our aim is to reconstruct rainfall fields by combining two data sources: rainfall intensity as measured by ground raingauges and radar reflectivity. A difficulty in modeling the rainfall data here comes from the rounding of many of the recorded raingauge measurements. The rounding makes the data essentially discrete, and models based on continuous distributions are not suitable for modeling them. In this study we extend two recently developed spatio-temporal models for continuous data to accommodate rounded rainfall measurements taking discrete values with positive probabilities. We use MCMC methods to implement the models and obtain forecasts in space and time together with their standard errors. We compare the two models using predictive Bayesian methods. The benefits of our modeling extensions are seen in accurate predictions of dry periods with no positive prediction standard errors.
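
    The statistical point here is that a rounded raingauge reading is interval evidence about a latent continuous rainfall level, so each observation contributes a probability mass rather than a density. The Python fragment below illustrates this with a short likelihood sketch under a latent Gaussian assumption; the function name, the rounding tick and all values are hypothetical, and the paper's actual models are spatio-temporal and fitted by MCMC.

      import numpy as np
      from scipy.stats import norm

      def rounded_loglik(y_rounded, mu, sigma, tick=0.2):
          """Log-likelihood of readings rounded to the nearest `tick` mm.

          A reading y is treated as evidence that the latent rainfall lies in
          [y - tick/2, y + tick/2), so each observation contributes a
          probability mass, not a density -- the data become discrete."""
          lo = norm.cdf(y_rounded - tick / 2.0, loc=mu, scale=sigma)
          hi = norm.cdf(y_rounded + tick / 2.0, loc=mu, scale=sigma)
          return np.sum(np.log(np.clip(hi - lo, 1e-300, None)))

      # Hypothetical example: latent mean 1.0 mm, sd 0.5 mm, 0.2 mm rounding
      y = np.array([0.8, 1.0, 1.2, 1.0])
      print(rounded_loglik(y, mu=1.0, sigma=0.5))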

  8. Reconstructing the history of structure formation using redshift distortions

    OpenAIRE

    Song, Yong-Seon; Percival, Will

    2008-01-01

    Measuring the statistics of galaxy peculiar velocities using redshift-space distortions is an excellent way of probing the history of structure formation. Because galaxies are expected to act as test particles within the flow of matter, this method avoids uncertainties due to an unknown galaxy density bias. We show that the parameter combination measured by redshift-space distortions, $f\sigma_8^{\rm mass}$, provides a good test of dark energy models, even without knowledge of bias or $\si...

  9. Collage as a Way to Reconstruct the History of Education

    OpenAIRE

    Elena Penskaja

    2012-01-01

    Elena Penskaja, D.Sc. in Philology, Dean of the Faculty of Philology, Head of the Literature Department, National Research University - Higher School of Economics, Moscow, Russian Federation. The paper reviews the methods used to describe historical processes in education and analyzes examples of historical material falsification. The lack of qualitative studies in the history of Russian education is a great hindrance to educational reforms. The author identifies and descr...

  10. Bayesian field theoretic reconstruction of bond potential and bond mobility in single molecule force spectroscopy

    CERN Document Server

    Chang, Joshua C; Chou, Tom

    2015-01-01

    Quantifying the forces between and within macromolecules is a necessary first step in understanding the mechanics of molecular structure, protein folding, and enzyme function and performance. In such macromolecular settings, dynamic single-molecule force spectroscopy (DFS) has been used to distort bonds. The resulting responses, in the form of rupture forces, work applied, and trajectories of displacements, have been used to reconstruct bond potentials. Such approaches often rely on simple parameterizations of one-dimensional bond potentials, assumptions on equilibrium starting states, and/or large amounts of trajectory data. Parametric approaches typically fail at inferring complex-shaped bond potentials with multiple minima, while piecewise estimation may not guarantee smooth results with the appropriate behavior at large distances. Existing techniques, particularly those based on work theorems, also do not address spatial variations in the diffusivity that may arise from spatially inhomogeneous coupling to...

  11. Bayesian reconstruction of disease outbreaks by combining epidemiologic and genomic data.

    Directory of Open Access Journals (Sweden)

    Thibaut Jombart

    2014-01-01

    Full Text Available Recent years have seen progress in the development of statistically rigorous frameworks to infer outbreak transmission trees ("who infected whom") from epidemiological and genetic data. Making use of pathogen genome sequences in such analyses remains a challenge, however, with a variety of heuristic approaches having been explored to date. We introduce a statistical method exploiting both pathogen sequences and collection dates to unravel the dynamics of densely sampled outbreaks. Our approach identifies likely transmission events and infers dates of infections, unobserved cases and separate introductions of the disease. It also proves useful for inferring numbers of secondary infections and identifying heterogeneous infectivity and super-spreaders. After testing our approach using simulations, we illustrate the method with the analysis of the beginning of the 2003 Singaporean outbreak of Severe Acute Respiratory Syndrome (SARS), providing new insights into the early stage of this epidemic. Our approach is the first tool for disease outbreak reconstruction from genetic data widely available as free software, the R package outbreaker. It is applicable to various densely sampled epidemics, and improves previous approaches by detecting unobserved and imported cases, as well as allowing multiple introductions of the pathogen. Because of its generality, we believe this method will become a tool of choice for the analysis of densely sampled disease outbreaks, and will form a rigorous framework for subsequent methodological developments.
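
    To make the "who infected whom" idea concrete, the sketch below scores candidate infectors for each case by combining a Poisson model for the genetic distance accumulated over the elapsed time with an exponential serial-interval density, then greedily picks the best ancestor. This is a toy pseudo-likelihood in Python, not the outbreaker algorithm itself, which explores transmission trees by MCMC and additionally handles unobserved cases and multiple introductions; all parameter values and names here are hypothetical.

      import numpy as np
      from scipy.stats import poisson, expon

      def ancestor_scores(dates, dist, mu_per_day=0.05, si_mean=8.0):
          """For each case i, score every potential infector j by a crude
          log-likelihood: the genetic distance dist[i, j] is Poisson with
          mean proportional to the elapsed time, and the sampling-date
          delay follows an exponential serial interval. Returns the
          best-scoring ancestor per case (-1 for the index case)."""
          n = len(dates)
          best = np.full(n, -1)
          for i in range(n):
              scores = np.full(n, -np.inf)
              for j in range(n):
                  dt = dates[i] - dates[j]
                  if j == i or dt <= 0:
                      continue        # an infector must be sampled earlier
                  scores[j] = (poisson.logpmf(dist[i, j], mu_per_day * dt)
                               + expon.logpdf(dt, scale=si_mean))
              if np.isfinite(scores).any():
                  best[i] = int(np.argmax(scores))
          return best

      dates = np.array([0, 5, 9, 14])            # collection day per case
      dist = np.array([[0, 2, 3, 5],             # hypothetical pairwise
                       [2, 0, 1, 3],             # SNP distances
                       [3, 1, 0, 2],
                       [5, 3, 2, 0]])
      print(ancestor_scores(dates, dist))        # -> [-1  0  1  2]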

  12. Reconstruction of the History of the Photoelectric Effect and Its Implications for General Physics Textbooks

    Science.gov (United States)

    Niaz, Mansoor; Klassen, Stephen; McMillan, Barbara; Metz, Don

    2010-01-01

    The photoelectric effect is an important part of general physics textbooks. To study the presentation of this phenomenon, we have reconstructed six essential, history and philosophy of science (HPS)-related aspects of the events that culminated in Einstein proposing his hypothesis of light quanta and the ensuing controversy within the scientific…

  13. Reconstruction to Progressivism: Booklet 3. Critical Thinking in American History. [Student Manual].

    Science.gov (United States)

    O'Reilly, Kevin

    One of a series of curriculum materials in U.S. history designed to teach critical thinking skills systematically, this teacher's guide presents high school students with supplementary lessons on the Reconstruction period, industrialism, labor and immigration, and progressivism and populism. The student booklet begins with a guide to critical…

  14. Reconstruction to Progressivism: Booklet 3. Critical Thinking in American History. Teacher's Guide and Source Materials Envelope.

    Science.gov (United States)

    O'Reilly, Kevin

    One of a series of curriculum materials in U.S. history designed to teach critical thinking skills systematically, this teacher's guide presents supplementary lesson plans for teaching high school students about the Reconstruction period, industrialism, labor and immigration, and progressivism and populism. The booklet begins with a guide to…

  15. Metagenomic reconstructions of bacterial CRISPR loci constrain population histories.

    Science.gov (United States)

    Sun, Christine L; Thomas, Brian C; Barrangou, Rodolphe; Banfield, Jillian F

    2016-04-01

    Bacterial CRISPR-Cas systems provide insight into recent population history because they rapidly incorporate, in a unidirectional manner, short fragments (spacers) from coexisting infective virus populations into host chromosomes. Immunity is achieved by sequence identity between transcripts of spacers and their targets. Here, we used metagenomics to study the stability and dynamics of the type I-E CRISPR-Cas locus of Leptospirillum group II bacteria in biofilms sampled over 5 years from an acid mine drainage (AMD) system. Despite recovery of 452,686 spacers from CRISPR amplicons and metagenomic data, rarefaction curves of spacers show no saturation. The vast repertoire of spacers is attributed to phage/plasmid population diversity and retention of old spacers, despite rapid evolution of the targeted phage/plasmid genome regions (proto-spacers). The oldest spacers (spacers found at the trailer end) are conserved for at least 5 years, and 12% of these retain perfect or near-perfect matches to proto-spacer targets. The majority of proto-spacer regions contain an AAG proto-spacer adjacent motif (PAM). Spacers throughout the locus target the same phage population (AMDV1), but there are blocks of consecutive spacers without AMDV1 target sequences. Results suggest long-term coexistence of Leptospirillum with AMDV1 and periods when AMDV1 was less dominant. Metagenomics can be applied to millions of cells in a single sample to provide an extremely large spacer inventory, allow identification of phage/plasmids and enable analysis of previous phage/plasmid exposure. Thus, this approach can provide insights into prior bacterial environment and genetic interplay between hosts and their viruses. PMID:26394009

  16. Refining calibration and predictions of a Bayesian statistical-dynamical model for long term avalanche forecasting using dendrochronological reconstructions

    Science.gov (United States)

    Eckert, Nicolas; Schläppy, Romain; Jomelli, Vincent; Naaim, Mohamed

    2013-04-01

    A crucial step for proposing relevant long-term mitigation measures in long-term avalanche forecasting is the accurate definition of high return period avalanches. Recently, "statistical-dynamical" approaches combining a numerical model with stochastic operators describing the variability of its inputs and outputs have emerged. Their main interest is to take into account the topographic dependency of snow avalanche runout distances, and to constrain the correlation structure between the model's variables by physical rules, so as to simulate the different marginal distributions of interest (pressure, flow depth, etc.) with reasonable realism. Bayesian methods have been shown to be well adapted to achieve model inference, getting rid of identifiability problems thanks to prior information. An important problem which has virtually never been considered before is the validation of the predictions resulting from a statistical-dynamical approach (or from any other engineering method for computing extreme avalanches). In hydrology, independent "fossil" data such as flood deposits in caves are sometimes compared with design discharges corresponding to high return periods. Hence, the aim of this work is to implement a similar comparison between high return period avalanches obtained with a statistical-dynamical approach and independent validation data resulting from careful dendrogeomorphological reconstructions. To do so, an up-to-date statistical model based on the depth-averaged equations and the classical Voellmy friction law is used on a well-documented case study. First, parameter values resulting from another avalanche path are applied, and the dendrological validation sample shows that this approach fails to provide realistic predictions for the case study. This may be due to the strongly bounded behaviour of runouts in this case (the extreme of their distribution is identified as belonging to the Weibull attraction domain). Second, local calibration on the available avalanche

  17. Memory, History and Narrative: Shifts of Meaning when (Reconstructing the Past

    Directory of Open Access Journals (Sweden)

    Ignacio Brescó de Luna

    2012-05-01

    Full Text Available This paper is devoted to the examination of some socio-cultural dimensions of memory, focusing on narratives as a mediational tool (Vygotsky, 1978) for the construction of past events and attribution of meaning. The five elements of Kenneth Burke's Grammar of Motives (1969) are taken as a framework for the examination of reconstructions of the past, and particularly of histories, namely: 1) the interpretative and reconstructive action of 2) a positioned agent operating 3) through narrative means 4) addressed to particular purposes 5) within a concrete social and temporal scenery. The reflexive character of such an approach opens the ground for considering remembering as one kind of act performed within the context of a set of on-going actions, so that remembrances play a directive role for action and so have an unavoidable moral dimension. This is particularly relevant for some kinds of social memory, such as history teaching, and their effects upon identity.

  18. Equivalent thermal history reconstruction from a partially crystallized glass-ceramic sensor array

    Science.gov (United States)

    Heeg, Bauke

    2015-11-01

    The basic concept of a thermal history sensor is that it records the accumulated exposure to some unknown, typically varying temperature profile for a certain amount of time. Such a sensor is considered to be capable of measuring the duration of several (N) temperature intervals. For this purpose, the sensor deploys multiple (M) sensing elements, each with different temperature sensitivity. At the end of some thermal exposure for a known period of time, the sensor array is read-out and an estimate is made of the set of N durations of the different temperature ranges. A potential implementation of such a sensor was pioneered by Fair et al. [Sens. Actuators, A 141, 245 (2008)], based on glass-ceramic materials with different temperature-dependent crystallization dynamics. In their work, it was demonstrated that an array of sensor elements can be made sensitive to slight differences in temperature history. Further, a forward crystallization model was used to simulate the variations in sensor array response to differences in the temperature history. The current paper focusses on the inverse aspect of temperature history reconstruction from a hypothetical sensor array output. The goal of such a reconstruction is to find an equivalent thermal history that is the closest representation of the true thermal history, i.e., the durations of a set of temperature intervals that result in a set of fractional crystallization values which is closest to the one resulting from the true thermal history. One particular useful simplification in both the sensor model as well as in its practical implementation is the omission of nucleation effects. In that case, least squares models can be used to approximate the sensor response and make reconstruction estimates. Even with this simplification, sensor noise can have a destabilizing effect on possible reconstruction solutions, which is evaluated using simulations. Both regularization and non-negativity constrained least squares
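
    Once nucleation is neglected, the reconstruction described above becomes a linear inverse problem: the M fractional crystallization readings are approximately a sensitivity matrix times the N unknown interval durations, and durations cannot be negative. A minimal sketch of that estimation step, using SciPy's non-negativity constrained least squares on a made-up sensitivity matrix, might look as follows (all numbers are hypothetical):

      import numpy as np
      from scipy.optimize import nnls

      # Hypothetical sensitivity matrix: A[m, n] is the crystallization
      # response of sensor element m per hour spent in temperature interval n.
      A = np.array([[0.02, 0.10, 0.40],
                    [0.01, 0.05, 0.25],
                    [0.05, 0.20, 0.60],
                    [0.00, 0.02, 0.15]])

      true_durations = np.array([10.0, 4.0, 1.0])    # hours per interval
      reading = A @ true_durations
      reading += 0.01 * np.random.default_rng(0).normal(size=A.shape[0])

      # Non-negativity constrained least squares: durations cannot be < 0.
      durations_hat, residual = nnls(A, reading)
      print(durations_hat)     # equivalent thermal history estimate

    As the abstract notes, sensor noise can destabilize such solutions, so in practice a regularization term would be added to the least squares objective.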

  19. Reconstruction of Exposure to m-Xylene from Human Biomonitoring Data Using PBPK Modelling, Bayesian Inference, and Markov Chain Monte Carlo Simulation

    Directory of Open Access Journals (Sweden)

    Kevin McNally

    2012-01-01

    Full Text Available There are numerous biomonitoring programs, both recent and ongoing, to evaluate environmental exposure of humans to chemicals. Due to the lack of exposure and kinetic data, the correlation of biomarker levels with exposure concentrations leads to difficulty in utilizing biomonitoring data for biological guidance values. Exposure reconstruction or reverse dosimetry is the retrospective interpretation of external exposure consistent with biomonitoring data. We investigated the integration of physiologically based pharmacokinetic modelling, global sensitivity analysis, Bayesian inference, and Markov chain Monte Carlo simulation to obtain a population estimate of inhalation exposure to m-xylene. We used exhaled breath and venous blood m-xylene and urinary 3-methylhippuric acid measurements from a controlled human volunteer study in order to evaluate the ability of our computational framework to predict known inhalation exposures. We also investigated the importance of model structure and dimensionality with respect to its ability to reconstruct exposure.
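
    The computational framework described, a forward kinetic model embedded in a Markov chain Monte Carlo loop so that the unknown exposure is sampled conditionally on the biomarker data, can be sketched in a few lines. The fragment below substitutes a deliberately crude linear biomarker model for the full PBPK model and uses a random-walk Metropolis-Hastings sampler; every name and number is hypothetical.

      import numpy as np

      def biomarker_model(exposure_ppm, k=0.12):
          """Toy stand-in for the PBPK model: predicted blood m-xylene
          biomarker as a linear function of inhaled concentration."""
          return k * exposure_ppm

      def mh_reverse_dosimetry(biomarkers, sigma=0.05, n_iter=20000, seed=1):
          """Metropolis-Hastings over the unknown exposure concentration,
          with a flat prior on [0, 100] ppm and a Gaussian error model."""
          rng = np.random.default_rng(seed)
          def loglik(x):
              return -0.5 * np.sum((biomarkers - biomarker_model(x)) ** 2) / sigma**2
          x, ll, samples = 10.0, None, []
          ll = loglik(x)
          for _ in range(n_iter):
              prop = x + rng.normal(scale=1.0)       # random-walk proposal
              if 0.0 <= prop <= 100.0:
                  ll_prop = loglik(prop)
                  if np.log(rng.uniform()) < ll_prop - ll:
                      x, ll = prop, ll_prop
              samples.append(x)
          return np.array(samples[n_iter // 2:])     # discard burn-in

      obs = np.array([2.9, 3.1, 3.0])    # hypothetical biomarker readings
      post = mh_reverse_dosimetry(obs)
      print(post.mean(), np.percentile(post, [2.5, 97.5]))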

  20. Gene Transfer and the Reconstruction of Life's Early History from Genomic Data

    Science.gov (United States)

    Gogarten, J. Peter; Fournier, Gregory; Zhaxybayeva, Olga

    2008-03-01

    The metaphor of the unique and strictly bifurcating tree of life, suggested by Charles Darwin, needs to be replaced (or at least amended) to reflect and include processes that lead to the merging of and communication between independent lines of descent. Gene histories include and reflect processes such as gene transfer, symbioses and lineage fusion. No single molecule can serve as a proxy for the tree of life. Individual gene histories can be reconstructed from the growing molecular databases containing sequence and structural information. With some simplifications these gene histories can be represented by furcating trees; however, merging these gene histories into web-like organismal histories, including the transfer of metabolic pathways and cell biological innovations from now-extinct lineages, has yet to be accomplished. Because of these difficulties in interpreting the record retained in molecular sequences, correlations with biochemical fossils and with the geological record need to be interpreted with caution. Advances to detect and pinpoint transfer events promise to untangle at least a few of the intertwined histories of individual genes within organisms and trace them to the organismal ancestors. Furthermore, analysis of the shape of molecular phylogenetic trees may point towards organismal radiations that might reflect early mass extinction events that occurred on a planetary scale.

  1. Inferring Population Size History from Large Samples of Genome-Wide Molecular Data - An Approximate Bayesian Computation Approach.

    Directory of Open Access Journals (Sweden)

    Simon Boitard

    2016-03-01

    Full Text Available Inferring the ancestral dynamics of effective population size is a long-standing question in population genetics, which can now be tackled much more accurately thanks to the massive genomic data available in many species. Several promising methods that take advantage of whole-genome sequences have been recently developed in this context. However, they can only be applied to rather small samples, which limits their ability to estimate recent population size history. Besides, they can be very sensitive to sequencing or phasing errors. Here we introduce a new approximate Bayesian computation approach named PopSizeABC that allows estimating the evolution of the effective population size through time, using a large sample of complete genomes. This sample is summarized using the folded allele frequency spectrum and the average zygotic linkage disequilibrium at different bins of physical distance, two classes of statistics that are widely used in population genetics and can be easily computed from unphased and unpolarized SNP data. Our approach provides accurate estimations of past population sizes, from the very first generations before present back to the expected time to the most recent common ancestor of the sample, as shown by simulations under a wide range of demographic scenarios. When applied to samples of 15 or 25 complete genomes in four cattle breeds (Angus, Fleckvieh, Holstein and Jersey), PopSizeABC revealed a series of population declines, related to historical events such as domestication or modern breed creation. We further highlight that our approach is robust to sequencing errors, provided summary statistics are computed from SNPs with common alleles.
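
    The core of any ABC rejection scheme like the one underlying PopSizeABC is easy to state in code: draw parameters from the prior, simulate summary statistics, and keep the draws whose statistics land closest to the observed ones. The sketch below shows that skeleton on a deliberately trivial one-statistic toy problem; PopSizeABC itself simulates whole genomes, summarizes them with the folded allele frequency spectrum and binned linkage disequilibrium, and parameterizes a piecewise-constant size history.

      import numpy as np

      def abc_rejection(obs_stats, simulate, prior_sampler, n_sims=10000,
                        accept_frac=0.01, seed=0):
          """Generic ABC rejection sampler: sample from the prior, simulate
          summary statistics, retain the draws whose statistics fall
          closest (Euclidean distance) to the observed ones."""
          rng = np.random.default_rng(seed)
          thetas, dists = [], []
          for _ in range(n_sims):
              theta = prior_sampler(rng)
              stats = simulate(theta, rng)
              thetas.append(theta)
              dists.append(np.linalg.norm(stats - obs_stats))
          keep = np.argsort(dists)[: int(n_sims * accept_frac)]
          return np.asarray(thetas)[keep]

      # Toy example: infer a constant population size N from one
      # 'diversity' statistic proportional to N (hypothetical units).
      observed = np.array([2.0e-3])
      sim = lambda N, rng: np.array([4e-7 * N]) * rng.lognormal(0.0, 0.05)
      prior = lambda rng: rng.uniform(1e3, 2e4)
      posterior_N = abc_rejection(observed, sim, prior)
      print(posterior_N.mean())    # should concentrate near 5000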

  2. Bayesian prediction and adaptive sampling algorithms for mobile sensor networks online environmental field reconstruction in space and time

    CERN Document Server

    Xu, Yunfei; Dass, Sarat; Maiti, Tapabrata

    2016-01-01

    This brief introduces a class of problems and models for the prediction of the scalar field of interest from noisy observations collected by mobile sensor networks. It also introduces the problem of optimal coordination of robotic sensors to maximize the prediction quality subject to communication and mobility constraints either in a centralized or distributed manner. To solve such problems, fully Bayesian approaches are adopted, allowing various sources of uncertainties to be integrated into an inferential framework effectively capturing all aspects of variability involved. The fully Bayesian approach also allows the most appropriate values for additional model parameters to be selected automatically by data, and the optimal inference and prediction for the underlying scalar field to be achieved. In particular, spatio-temporal Gaussian process regression is formulated for robotic sensors to fuse multifactorial effects of observations, measurement noise, and prior distributions for obtaining the predictive di...
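
    The inferential core mentioned here, spatio-temporal Gaussian process regression, reduces in its simplest spatial-only form to a few linear-algebra steps. The sketch below is a plain GP predictor with a squared-exponential kernel and fixed hyperparameters; the fully Bayesian treatment described in the brief goes further, placing priors on such hyperparameters, adding time as an input, and coordinating mobile sensors.

      import numpy as np

      def gp_predict(X_train, y_train, X_test, length=1.0, sig_f=1.0, sig_n=0.1):
          """Gaussian process regression with a squared-exponential kernel:
          returns posterior mean and variance of the scalar field at the
          test locations, given noisy sensor readings."""
          def k(A, B):
              d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
              return sig_f**2 * np.exp(-0.5 * d2 / length**2)
          K = k(X_train, X_train) + sig_n**2 * np.eye(len(X_train))
          Ks = k(X_test, X_train)
          mean = Ks @ np.linalg.solve(K, y_train)
          var = sig_f**2 - np.einsum('ij,ji->i', Ks, np.linalg.solve(K, Ks.T))
          return mean, np.maximum(var, 0.0)

      # Noisy readings from 20 sensor positions in the unit square
      rng = np.random.default_rng(2)
      X = rng.uniform(size=(20, 2))
      y = np.sin(3 * X[:, 0]) + 0.1 * rng.normal(size=20)
      grid = np.array([[0.5, 0.5], [0.1, 0.9]])
      print(gp_predict(X, y, grid))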

  3. Joint palaeoclimate reconstruction from pollen data via forward models and climate histories

    Science.gov (United States)

    Parnell, Andrew C.; Haslett, John; Sweeney, James; Doan, Thinh K.; Allen, Judy R. M.; Huntley, Brian

    2016-11-01

    We present a method and software for reconstructing palaeoclimate from pollen data with a focus on accounting for and reducing uncertainty. The tools we use include: forward models, which enable us to account for the data generating process and hence the complex relationship between pollen and climate; joint inference, which reduces uncertainty by borrowing strength between aspects of climate and slices of the core; and dynamic climate histories, which allow for a far richer gamut of inferential possibilities. Through a Monte Carlo approach we generate numerous equally probable joint climate histories, each of which is represented by a sequence of values of three climate dimensions in discrete time, i.e. a multivariate time series. All histories are consistent with the uncertainties in the forward model and the natural temporal variability in climate. Once generated, these histories can provide most probable climate estimates with uncertainty intervals. This is particularly important as attention moves to the dynamics of past climate changes. For example, such methods allow us to identify, with realistic uncertainty, the past century that exhibited the greatest warming. We illustrate our method with two data sets: Laguna de la Roya, with a radiocarbon dated chronology and hence timing uncertainty; and Lago Grande di Monticchio, which contains laminated sediment and extends back to the penultimate glacial stage. The procedure is made available via an open source R package, Bclim, for which we provide code and instructions.
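
    A useful way to see what an ensemble of "equally probable joint climate histories" buys you is to push a derived question through every member of the ensemble. The sketch below, run on entirely synthetic histories, computes pointwise medians with 95% intervals and the posterior probability that each century was the one with the greatest warming, the kind of question the authors highlight; it is a generic illustration, not the Bclim code path.

      import numpy as np

      def summarize_histories(histories):
          """histories: n_histories x n_centuries array for one climate
          dimension, one value per century. Returns the pointwise median,
          a 95% interval, and the posterior probability that each century
          showed the greatest warming across the ensemble."""
          med = np.median(histories, axis=0)
          lo, hi = np.percentile(histories, [2.5, 97.5], axis=0)
          warming = np.diff(histories, axis=1)   # per-century change
          winner = np.argmax(warming, axis=1)    # warmest century, per history
          prob = np.bincount(winner, minlength=warming.shape[1]) / len(histories)
          return med, (lo, hi), prob

      # 500 synthetic random-walk histories, 30 centuries each
      rng = np.random.default_rng(3)
      ens = np.cumsum(rng.normal(size=(500, 30)), axis=1)
      med, (lo, hi), p_warmest = summarize_histories(ens)
      print(p_warmest.argmax(), p_warmest.max())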

  4. An Object-Based Approach for Fire History Reconstruction by Using Three Generations of Landsat Sensors

    Directory of Open Access Journals (Sweden)

    Thomas Katagis

    2014-06-01

    Full Text Available In this study, the capability of geographic object-based image analysis (GEOBIA) in the reconstruction of the recent fire history of a typical Mediterranean area was investigated. More specifically, a semi-automated GEOBIA procedure was developed and tested on archived and newly acquired Landsat Multispectral Scanner (MSS), Thematic Mapper (TM), and Operational Land Imager (OLI) images in order to accurately map burned areas in the Mediterranean island of Thasos. The developed GEOBIA ruleset was built with the use of the TM image and then applied to the other two images. This process of transferring the ruleset did not require substantial adjustments or any replacement of the initially selected features used for the classification, thus, displaying reduced complexity in processing the images. As a result, burned area maps of very high accuracy (over 94% overall) were produced. In addition to the standard error matrix, the employment of additional measures of agreement between the produced maps and the reference data revealed that “spatial misplacement” was the main source of classification error. It can be concluded that the proposed approach can be potentially used for reconstructing the recent (40-year) fire history in the Mediterranean, based on extended time series of Landsat or similar data.
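
    The "standard error matrix" the abstract refers to supports simple agreement summaries such as overall accuracy and, commonly, Cohen's kappa. As a reminder of how those are computed, here is a small sketch with a hypothetical burned/unburned confusion matrix (the ~94% figure in the abstract comes from the study's own matrices, not from these numbers):

      import numpy as np

      def accuracy_measures(error_matrix):
          """Overall accuracy and Cohen's kappa from an error (confusion)
          matrix, rows = classified, columns = reference."""
          em = np.asarray(error_matrix, dtype=float)
          n = em.sum()
          overall = np.trace(em) / n
          expected = (em.sum(axis=0) * em.sum(axis=1)).sum() / n**2
          kappa = (overall - expected) / (1.0 - expected)
          return overall, kappa

      print(accuracy_measures([[940, 25],      # burned: correct / commission
                               [35, 1000]]))   # unburned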

  5. Dendroecological potential of Fabiana imbricata shrub for reconstructing fire history at landscape scale in grasslands

    Science.gov (United States)

    Oddi, Facundo; Ghermandi, Luciana; Lasaponara, Rosa

    2014-05-01

    Fire recurrently affects many terrestrial ecosystems, with major implications for the structure and dynamics of vegetation. In fire-prone regions it is particularly important to know the fire regime, for which precise fire records are needed. Dendroecology offers the possibility of obtaining fire occurrence data from woody species and has been widely used in forest ecosystems for fire research. Grasslands are regions with no trees, but shrubs could be used to acquire dendroecological information for reconstructing fire history at the landscape scale. We studied the dendroecological potential of the shrub F. imbricata to reconstruct fire history at the landscape scale in a fire-prone grassland of northwestern Patagonia. To do this, we combined spatio-temporal information on recorded fires within the study area with the age structure of F. imbricata shrublands derived by dendroecology. Sampling sites were located over 2500 ha in San Ramón ranch, 30 km east of Bariloche, Río Negro province, Argentina (latitude -41° 04'; longitude -70° 51'). Shrubland age structure correctly described how fires occurred in the past. Pulses of individual recruitment were associated with fire in time and space. A bi-variate analysis showed that F. imbricata recruits individuals during the two years after fire, and the spatial distribution of pulses coincided with the fire map. In sites without fire data, the age structure allowed the identification of two additional fires. Our results show that the shrub F. imbricata can be employed together with other data sources, such as remote sensing and operational databases, to improve knowledge of the fire regime in northwestern Patagonia grasslands. In conclusion, we raise the possibility of utilizing shrubs as a dendroecological data source to study fire history in grasslands where tree cover is absent.

  6. Reconstructing the infilling history within Robert Sharp Crater, Mars: Insights from morphology and stratigraphy

    Science.gov (United States)

    Brossier, J.; Le Deit, L.; Hauber, E.; Mangold, N.; Carter, J.; Jaumann, R.

    2015-10-01

    Robert Sharp (133.59°E, -4.12°N) is a 150 km diameter impact crater located in the equatorial region of Mars, near Gale Crater, where the MSL rover Curiosity landed in August 2012. Using orbital data, an iron hydroxide chloride named akaganéite, which typically forms in highly saline and chlorinated aqueous environments on Earth, has been detected in Robert Sharp crater [1]. Interestingly, akaganéite has also been detected in Gale Crater from the ground [2,3]. In order to reconstruct the paleo-environments in the region, we produce a geological map of Robert Sharp (Fig. 1). Crater counts provide time constraints on its infilling history.

  7. Backlash against American psychology: an indigenous reconstruction of the history of German critical psychology.

    Science.gov (United States)

    Teo, Thomas

    2013-02-01

    After suggesting that all psychologies contain indigenous qualities and discussing differences and commonalities between German and North American historiographies of psychology, an indigenous reconstruction of German critical psychology is applied. It is argued that German critical psychology can be understood as a backlash against American psychology, as a response to the Americanization of German psychology after WWII, on the background of the history of German psychology, the academic impact of the Cold War, and the trajectory of personal biographies and institutions. Using an intellectual-historical perspective, it is shown how and which indigenous dimensions played a role in the development of German critical psychology as well as the limitations to such an historical approach. Expanding from German critical psychology, the role of the critique of American psychology in various contexts around the globe is discussed in order to emphasize the relevance of indigenous historical research.

  8. A phylogenetic reconstruction of the epidemiological history of canine rabies virus variants in Colombia.

    Science.gov (United States)

    Hughes, Gareth J; Páez, Andrés; Bóshell, Jorge; Rupprecht, Charles E

    2004-03-01

    Historically, canine rabies in Colombia has been caused by two geographically distinct canine variants of rabies virus (RV) which between 1992 and 2002 accounted for approximately 95% of Colombian rabies cases. Genetic variant 1 (GV1) has been isolated up until 1997 in the Central Region and the Department of Arauca, and is now considered extinct through a successful vaccination program. Genetic variant 2 (GV2) has been isolated from the northern Caribbean Region and continues to circulate at present. Here we have analyzed two sets of sequence data based upon either a 147 nucleotide region of the glycoprotein (G) gene or a 258 nucleotide region that combines a fragment of the non-coding intergenic region and a fragment of the polymerase gene. Using both maximum likelihood (ML) and Markov chain Monte Carlo (MCMC) methods we have estimated the time of the most recent common ancestor (MRCA) of the two variants to be between 1983 and 1988. Reconstructions of the population history suggest that GV2 has been circulating in Colombia since the 1960s and that GV1 evolved as a separate lineage from GV2. Estimations of the effective population size at present show the GV2 outbreak to be approximately 20 times greater than that of GV1. Demographic reconstructions were unable to detect a decrease in population size concurrent with the elimination of GV1. We find a raised rate of nucleotide substitution for GV1 gene sequences when compared to that of GV2, although all estimates have wide confidence limits. We demonstrate that phylogenetic reconstructions and sequence analysis can be used to support incidence data from the field in the assessment of RV epidemiology. PMID:15019589

  9. A general reconstruction of the recent expansion history of the universe

    CERN Document Server

    Vitenti, S D P

    2015-01-01

    Distance measurements are currently the most powerful tool to study the expansion history of the universe without specifying its matter content nor any theory of gravitation. Assuming only an isotropic, homogeneous and flat universe, in this work we introduce a model-independent method to reconstruct directly the deceleration function via a piecewise function. Including a penalty factor, we are able to vary continuously the complexity of the deceleration function from a linear case to an arbitrary $(n+1)$-knots spline interpolation. We carry out a Monte Carlo analysis to determine the best penalty factor, evaluating the bias-variance trade-off, given the uncertainties of the SDSS-II and SNLS supernova combined sample (JLA), compilations of baryon acoustic oscillation (BAO) and $H(z)$ data. We show that, evaluating a single fiducial model, the conclusions about the bias-variance ratio are misleading. We determine the reconstruction method in which the bias represents at most $10\\%$ of the total uncertainty. In...
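
    The role of the penalty factor described here, continuously steering a piecewise fit between a rigid linear case and a flexible $(n+1)$-knot spline, is the classic roughness-penalty idea. The sketch below shows a generic version: least squares on knot values with a second-difference penalty, where small penalties give a wiggly low-bias fit and large penalties force a straight line. It illustrates the mechanism only, not the paper's estimator or data.

      import numpy as np

      def penalized_piecewise_fit(x_knots, x_data, y_data, penalty):
          """Fit function values at fixed knots by least squares with a
          roughness penalty on second differences; `penalty` plays the
          role of the abstract's penalty factor, moving the fit from a
          nearly free spline (penalty ~ 0) towards a straight line
          (penalty large). Linear interpolation maps knots to data."""
          B = np.array([np.interp(x_data, x_knots, row)
                        for row in np.eye(len(x_knots))]).T   # basis matrix
          D = np.diff(np.eye(len(x_knots)), n=2, axis=0)      # 2nd differences
          lhs = B.T @ B + penalty * D.T @ D
          return np.linalg.solve(lhs, B.T @ y_data)

      # Toy data: noisy quadratic sampled at 40 points, 8 knots
      rng = np.random.default_rng(4)
      xd = np.sort(rng.uniform(0, 1, 40))
      yd = (xd - 0.4) ** 2 + 0.01 * rng.normal(size=40)
      knots = np.linspace(0, 1, 8)
      print(penalized_piecewise_fit(knots, xd, yd, penalty=1e-2))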

  10. An alternative approach to reconstructing organic matter accumulation with contrasting watershed disturbance histories from lake sediments

    International Nuclear Information System (INIS)

    A number of proxies, including the carbon to nitrogen ratio (C:N) and stable isotopes (δ13C and δ15N), have been used to reconstruct organic matter (OM) profiles from lake sediments, but these proxies, individually or in combination, cannot clearly discriminate different sources. Here we present an alternative approach to elucidate this problem from lake sediments as a function of watershed-scale land use changes. Stable isotope signatures of defined OM sources from the study watersheds, Shawnigan Lake (SHL) and Elk Lake (ELL), were compared with sedimentary proxy records. Results from this study reveal that terrestrial inputs and catchment soil, coinciding with the watershed disturbance histories, probably contributed to the recent trophic enrichment in SHL. In contrast, cultural eutrophication in ELL was partially a result of input from catchment soil (agricultural activities), with significant input from lake primary production as well. Results were consistent in both IsoSource (IsoSource version 1.2 is a Visual Basic program used for source separation; http://www.epa.gov/wed/pages/models/isosource/isosource.htm) and discriminant analysis (a statistical classification technique). - The study shows an alternative approach to reconstructing organic matter accumulation using stable isotopes from lake sediments

  11. Phylogenetic relationships and demographic histories of the Atherinidae in the Eastern Atlantic and Mediterranean Sea re-examined by Bayesian inference.

    Science.gov (United States)

    Pujolar, J M; Zane, L; Congiu, L

    2012-06-01

    The aim of our study is to examine the phylogenetic relationships, divergence times and demographic history of the five closely related Mediterranean and North-eastern Atlantic species/forms of Atherina, using the full Bayesian framework for species tree estimation recently implemented in ∗BEAST. The inference is made possible by multilocus data using three mitochondrial genes (12S rRNA, 16S rRNA, control region) and one nuclear gene (rhodopsin) from multiple individuals per species available in GenBank. Bayesian phylogenetic analysis of the complete gene dataset produced a tree with strong support for the monophyly of each species, as well as high support for higher level nodes. An old origin of the Atherina group was suggested (19.2 MY), with deep split events within the Atherinidae predating the Messinian Salinity Crisis. Regional genetic substructuring was observed among populations of A. boyeri, with AMOVA and MultiDimensional Scaling suggesting the existence of five groupings (Atlantic/West Mediterranean, Adriatic, Greece, Black Sea and Tunis). The level of subdivision found might be a consequence of hydrographic isolation within the Mediterranean Sea. Bayesian inference of past demographic histories showed a clear signature of demographic expansion for the European coast populations of A. presbyter, possibly linked to post-glacial colonizations, but not for the Azores/Canary Islands, which is expected in isolated populations because of the impossibility of finding new habitats. Within the Mediterranean, signatures of recent demographic expansion were only found for the Adriatic population of A. boyeri, which could be associated with the relatively recent emergence of the Adriatic Sea. PMID:22425706

  12. The Bayesian reconstruction of the in-medium heavy quark potential from lattice QCD and its stability

    International Nuclear Information System (INIS)

    We report recent results of a non-perturbative determination of the static heavy-quark potential in quenched and dynamical lattice QCD at finite temperature. The real and imaginary part of this complex quantity are extracted from the spectral function of Wilson line correlators in Coulomb gauge. To obtain spectral information from Euclidean time numerical data, our study relies on a novel Bayesian prescription that differs from the Maximum Entropy Method. We perform simulations on quenched 32³ × Nτ (β = 7.0, ξ = 3.5) lattices with Nτ = 24, …, 96, which cover 839MeV ≥ T ≥ 210MeV. To investigate the potential in a quark-gluon plasma with light u,d and s quarks we utilize Nf = 2 + 1 ASQTAD lattices with ml = ms/20 by the HotQCD collaboration, giving access to temperatures between 286MeV ≥ T ≥ 148MeV. The real part of the potential exhibits a clean transition from a linear, confining behavior in the hadronic phase to a Debye screened form above deconfinement. Interestingly its values lie close to the color singlet free energies in Coulomb gauge at all temperatures. We estimate the imaginary part on quenched lattices and find that it is of the same order of magnitude as in hard-thermal loop perturbation theory. From among all the systematic checks carried out in our study, we discuss explicitly the dependence of the result on the default model and the number of datapoints

  13. The Bayesian reconstruction of the in-medium heavy quark potential from lattice QCD and its stability

    Science.gov (United States)

    Burnier, Yannis; Kaczmarek, Olaf; Rothkopf, Alexander

    2016-01-01

    We report recent results of a non-perturbative determination of the static heavy-quark potential in quenched and dynamical lattice QCD at finite temperature. The real and imaginary part of this complex quantity are extracted from the spectral function of Wilson line correlators in Coulomb gauge. To obtain spectral information from Euclidean time numerical data, our study relies on a novel Bayesian prescription that differs from the Maximum Entropy Method. We perform simulations on quenched 32³ × Nτ (β = 7.0, ξ = 3.5) lattices with Nτ = 24, …, 96, which cover 839MeV ≥ T ≥ 210MeV. To investigate the potential in a quark-gluon plasma with light u,d and s quarks we utilize Nf = 2 + 1 ASQTAD lattices with ml = ms/20 by the HotQCD collaboration, giving access to temperatures between 286MeV ≥ T ≥ 148MeV. The real part of the potential exhibits a clean transition from a linear, confining behavior in the hadronic phase to a Debye screened form above deconfinement. Interestingly its values lie close to the color singlet free energies in Coulomb gauge at all temperatures. We estimate the imaginary part on quenched lattices and find that it is of the same order of magnitude as in hard-thermal loop perturbation theory. From among all the systematic checks carried out in our study, we discuss explicitly the dependence of the result on the default model and the number of datapoints.

  14. Gene regulatory network reconstruction using Bayesian networks, the Dantzig Selector, the Lasso and their meta-analysis.

    Directory of Open Access Journals (Sweden)

    Matthieu Vignes

    Full Text Available Modern technologies and especially next generation sequencing facilities are giving a cheaper access to genotype and genomic data measured on the same sample at once. This creates an ideal situation for multifactorial experiments designed to infer gene regulatory networks. The fifth "Dialogue for Reverse Engineering Assessments and Methods" (DREAM5 challenges are aimed at assessing methods and associated algorithms devoted to the inference of biological networks. Challenge 3 on "Systems Genetics" proposed to infer causal gene regulatory networks from different genetical genomics data sets. We investigated a wide panel of methods ranging from Bayesian networks to penalised linear regressions to analyse such data, and proposed a simple yet very powerful meta-analysis, which combines these inference methods. We present results of the Challenge as well as more in-depth analysis of predicted networks in terms of structure and reliability. The developed meta-analysis was ranked first among the 16 teams participating in Challenge 3A. It paves the way for future extensions of our inference method and more accurate gene network estimates in the context of genetical genomics.

  15. The Bayesian reconstruction of the in-medium heavy quark potential from lattice QCD and its stability

    CERN Document Server

    Burnier, Yannis; Rothkopf, Alexander

    2014-01-01

    We report recent results of a non-perturbative determination of the static heavy-quark potential in quenched and dynamical lattice QCD at finite temperature. The real and imaginary part of this complex quantity are extracted from the spectral function of Wilson line correlators in Coulomb gauge. To obtain spectral information from Euclidean time numerical data, our study relies on a novel Bayesian prescription that differs from the Maximum Entropy Method. We perform simulations on quenched $32^3\\times N_\\tau$ $(\\beta=7.0,\\xi=3.5)$ lattices with $N_\\tau=24,...,96$, which cover $839{\\rm MeV} \\geq T\\geq 210 {\\rm MeV}$. To investigate the potential in a quark-gluon plasma with light u,d and s quarks we utilize $N_f=2+1$ ASQTAD lattices with $m_l=m_s/20$ by the HotQCD collaboration, giving access to temperatures between $286 {\\rm MeV} \\geq T\\geq 148{\\rm MeV}$. The real part of the potential exhibits a clean transition from a linear, confining behavior in the hadronic phase to a Debye screened form above deconfinem...

  16. Reconstructing a multi-centennial drought history for the British Isles

    Science.gov (United States)

    Macdonald, Neil; Chiverrell, Richard; Todd, Beverley; Bowen, James; Lennard, Amy

    2016-04-01

    The last two decades have witnessed some of the most severe droughts experienced within living memory in the UK, but have these droughts really been exceptional? Relatively few instrumental river flow, groundwater or reservoir series extend beyond 50 years in length, and few precipitation series currently available extend over 100 years. These relatively short series present considerable challenges in determining current and future drought risk, with the results affecting society and the economy. This study uses long instrumental precipitation series coupled with the SPI and scPDSI drought indices to reconstruct drought histories for different parts of the British Isles. Existing long precipitation series have been reassessed and several new precipitation series reconstructed (e.g. Carlisle, 1757-), with eight series now over 200 years in length and a further thirteen over 150 years, and with further sites currently being developed (e.g. Norwich, 1749-; Exeter, 1724-). This study focuses on the eight longest series, with the shorter series used to help explore spatial and temporal variability across the British Isles. We show how historical series have improved understanding of severe droughts by examining past spatial and temporal variability and exploring the climatic drivers responsible. We show that recent well-documented droughts (e.g. 1976, 1996 and 2010), which have shaped both public and water resource managers' perceptions of risk, have historically been exceeded in both severity (e.g. 1781) and duration (e.g. 1798-1810), with the largest droughts often transcending single catchments and affecting regions. Recent droughts are not exceptional when considered on a multi-centennial timescale, and improved understanding of historical events raises concerns for contemporary water resource management.

  17. Denudation History and Paleogeographic Reconstruction of the Phanerozoic of southern Mantiqueira Province, Brazil

    Science.gov (United States)

    Jelinek, A. R.; Chemale, F., Jr.

    2012-12-01

    In this work we deal with the Phanerozoic history of the Southern Mantiqueira Province and adjacent areas after the orogen collapse of the Brasiliano orogenic mountains in southern Brazil and Uruguay, based on thermochronological data (fission track and U-Th/He on apatite) and thermal history modelling. During the Paleozoic, intraplate sedimentary basins formed mainly bordering the orogenic systems, and thus these regions have not been overprinted by younger orogenic processes. In the Meso-Cenozoic this region was affected by later fragmentation and dispersal due to the separation of South America and Africa. The denudation history of both margins, quantified on the basis of thermal history modeling of apatite fission track thermochronology, indicates that the margin of southeastern Brazil and Uruguay underwent a minimum of 3.5 to 4.5 km of denudation, which included the main exposure area of the Brasiliano orogenic belts and adjacent areas. The Phanerozoic evolution of West Gondwana is thus recorded first by the orogenic collapses of the Brasiliano and Pan-African belts, which in the Cambrian-Ordovician period formed a single mountain system. Subsequently, the intraplate basins formed, such as the Paraná in southeastern Brazil and the Congo, with some records of the Table Mountains Group and the upper section of the Karoo units in Southwestern Africa. In the Permo-Triassic period, the collision of the Cape Fold Belt and Sierra de la Ventana Belt at the margins of the West Gondwana supercontinent resulted in elastic deformation of the cratonic areas, where intraplate deposition occurred, as well as subsidence and uplift of the already established Pan-African-Brasiliano Belts. Younger denudation events, due to continental margin uplift and basin subsidence, occurred during the rifting and dispersal of the South American and African plates, which can be very well defined by the integration of the passive-margin sedimentation of the Pelotas and Santos basins and apatite fission

  18. The Bayesian reconstruction of the in-medium heavy quark potential from lattice QCD and its stability

    Energy Technology Data Exchange (ETDEWEB)

    Burnier, Yannis [Institut de Théorie des Phénomènes Physiques, Ecole Polytechnique Fédérale de Lausanne, CH-1015, Lausanne (Switzerland); Kaczmarek, Olaf [Fakultät für Physik, Universität Bielefeld, D-33615 Bielefeld (Germany); Rothkopf, Alexander [Institute for Theoretical Physics, Heidelberg University, Philosophenweg 16, D-69120 Heidelberg (Germany)

    2016-01-22

    We report recent results of a non-perturbative determination of the static heavy-quark potential in quenched and dynamical lattice QCD at finite temperature. The real and imaginary part of this complex quantity are extracted from the spectral function of Wilson line correlators in Coulomb gauge. To obtain spectral information from Euclidean time numerical data, our study relies on a novel Bayesian prescription that differs from the Maximum Entropy Method. We perform simulations on quenched 32³ × Nτ (β = 7.0, ξ = 3.5) lattices with Nτ = 24, …, 96, which cover 839MeV ≥ T ≥ 210MeV. To investigate the potential in a quark-gluon plasma with light u,d and s quarks we utilize Nf = 2 + 1 ASQTAD lattices with ml = ms/20 by the HotQCD collaboration, giving access to temperatures between 286MeV ≥ T ≥ 148MeV. The real part of the potential exhibits a clean transition from a linear, confining behavior in the hadronic phase to a Debye screened form above deconfinement. Interestingly its values lie close to the color singlet free energies in Coulomb gauge at all temperatures. We estimate the imaginary part on quenched lattices and find that it is of the same order of magnitude as in hard-thermal loop perturbation theory. From among all the systematic checks carried out in our study, we discuss explicitly the dependence of the result on the default model and the number of datapoints.

  19. Bayesian phylogeography finds its roots.

    Directory of Open Access Journals (Sweden)

    Philippe Lemey

    2009-09-01

    Full Text Available As a key factor in endemic and epidemic dynamics, the geographical distribution of viruses has been frequently interpreted in the light of their genetic histories. Unfortunately, inference of historical dispersal or migration patterns of viruses has mainly been restricted to model-free heuristic approaches that provide little insight into the temporal setting of the spatial dynamics. The introduction of probabilistic models of evolution, however, offers unique opportunities to engage in this statistical endeavor. Here we introduce a Bayesian framework for inference, visualization and hypothesis testing of phylogeographic history. By implementing character mapping in Bayesian software that samples time-scaled phylogenies, we enable the reconstruction of timed viral dispersal patterns while accommodating phylogenetic uncertainty. Standard Markov model inference is extended with a stochastic search variable selection procedure that identifies the parsimonious descriptions of the diffusion process. In addition, we propose priors that can incorporate geographical sampling distributions or characterize alternative hypotheses about the spatial dynamics. To visualize the spatial and temporal information, we summarize inferences using virtual globe software. We describe how Bayesian phylogeography compares with previous parsimony analysis in the investigation of the influenza A H5N1 origin and H5N1 epidemiological linkage among sampling localities. Analysis of rabies in West African dog populations reveals how virus diffusion may enable endemic maintenance through continuous epidemic cycles. From these analyses, we conclude that our phylogeographic framework will be an important asset in molecular epidemiology that can be easily generalized to infer biogeography from genetic data for many organisms.

  20. Quantitative study on pollen-based reconstructions of vegetation history from central Canada

    Institute of Scientific and Technical Information of China (English)

    YU Ge; HART Catherina; VETTER Mary; SAUCHYN David

    2008-01-01

    Based on high-resolution pollen records from lake cores in central Canada, the present study assigned pollen taxa to ecosystem groups, applied the modern analogue technique, reported major results of quantitative reconstructions of vegetation history during the last 1000 years, and discussed the validation of the simulated vegetation. The results showed that in central North America (115°-95°W, 40°-60°N) the best analogue match to the modern vegetation is 81% for boreal forest, 72% for parkland, and 94% for grassland-parkland, which is consistent with the vegetation distributions of the North American Ecosystem II. Simulations of the past vegetation from the sedimentary pollen showed climate changes during the past 1000 years: it was warm and dry in the Medieval Warm Period, cold and wet in the earlier period and cold and dry in the later period of the Little Ice Age. Warming and drought then increased markedly in the 20th century. The present study provides a scientific basis for understanding vegetation and climate changes over the last 1000 years in a characteristic region and on 10-100 year time scales.

  2. Bayesian probabilistic sensitivity analysis of Markov models for natural history of a disease: an application for cervical cancer

    Directory of Open Access Journals (Sweden)

    Giulia Carreras

    2012-09-01

    Full Text Available

    Background: parameter uncertainty in the Markov model’s description of a disease course was addressed. Probabilistic sensitivity analysis (PSA is now considered the only tool that properly permits parameter uncertainty’s examination. This consists in sampling values from the parameter’s probability distributions.

    Methods: Markov models fitted with microsimulation were considered and methods for carrying out a PSA on transition probabilities were studied. Two Bayesian solutions were developed: for each row of the modeled transition matrix the prior distribution was assumed as a product of Beta or a Dirichlet. The two solutions differ in the source of information: several different sources for each transition in the Beta approach and a single source for each transition from a given health state in the Dirichlet. The two methods were applied to a simple cervical cancer’s model.

    Results: differences between posterior estimates from the two methods were negligible. Results showed that the prior variability highly influences the posterior distribution.

    Conclusions: the novelty of this work is the Bayesian approach that integrates the two prior distributions with a likelihood given by a product of Binomial distributions. Such methods could also be applied to cohort data, and their application to more complex models could be useful and unique in the cervical cancer context, as well as in other disease modeling.
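
    As a rough illustration of the Dirichlet variant described above (not the authors' code), the following minimal Python sketch draws PSA samples of a transition matrix row by row from Dirichlet posteriors and propagates each draw through a toy three-state cohort model; the states, counts, priors, and number of iterations are all invented assumptions.

        import numpy as np

        rng = np.random.default_rng(42)

        # Hypothetical 3-state model (healthy, ill, dead); all counts invented.
        # Rows: observed one-step transition counts from each health state,
        # i.e. a single source per state, as in the Dirichlet approach.
        counts = np.array([
            [880, 100, 20],    # from "healthy"
            [0,   700, 300],   # from "ill"
            [0,   0,   1000],  # "dead" is absorbing
        ])
        prior = np.ones_like(counts)  # flat Dirichlet prior on each row

        def sample_transition_matrix(counts, prior, rng):
            """One PSA draw: each row ~ Dirichlet(prior + counts)."""
            return np.vstack([rng.dirichlet(prior[i] + counts[i])
                              for i in range(counts.shape[0])])

        # 1000 PSA iterations: propagate each sampled matrix 10 model cycles
        # and summarize the uncertainty in P(ill at cycle 10 | healthy at 0).
        draws = [sample_transition_matrix(counts, prior, rng) for _ in range(1000)]
        p_ill_10 = [np.linalg.matrix_power(P, 10)[0, 1] for P in draws]
        print(np.percentile(p_ill_10, [2.5, 50, 97.5]))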

  3. A Bayesian Semiparametric Approach for Incorporating Longitudinal Information on Exposure History for Inference in Case-Control Studies

    OpenAIRE

    Bhadra, Dhiman; Daniels, Michael J.; Kim, Sungduk; Ghosh, Malay; Mukherjee, Bhramar

    2012-01-01

    In a typical case-control study, exposure information is collected at a single time-point for the cases and controls. However, case-control studies are often embedded in existing cohort studies containing a wealth of longitudinal exposure history on the participants. Recent medical studies have indicated that incorporating past exposure history, or a constructed summary measure of cumulative exposure derived from the past exposure history, when available, may lead to more precise and clinical...

  4. RECONSTRUCTING THE SOLAR WIND FROM ITS EARLY HISTORY TO CURRENT EPOCH

    Energy Technology Data Exchange (ETDEWEB)

    Airapetian, Vladimir S.; Usmanov, Arcadi V., E-mail: vladimir.airapetian@nasa.gov, E-mail: avusmanov@gmail.com [NASA Goddard Space Flight Center, Greenbelt, MD (United States)

    2016-02-01

    Stellar winds from active solar-type stars can play a crucial role in removal of stellar angular momentum and erosion of planetary atmospheres. However, major wind properties except for mass-loss rates cannot be directly derived from observations. We employed a three-dimensional magnetohydrodynamic Alfvén wave driven solar wind model, ALF3D, to reconstruct the solar wind parameters including the mass-loss rate, terminal velocity, and wind temperature at 0.7, 2, and 4.65 Gyr. Our model treats the wind thermal electrons, protons, and pickup protons as separate fluids and incorporates turbulence transport, eddy viscosity, turbulent resistivity, and turbulent heating to properly describe proton and electron temperatures of the solar wind. To study the evolution of the solar wind, we specified three input model parameters, the plasma density, Alfvén wave amplitude, and the strength of the dipole magnetic field at the wind base for each of three solar wind evolution models that are consistent with observational constraints. Our model results show that at 1 AU, in the Sun's early history at 0.7 Gyr, the paleo solar wind was twice as fast, ∼50 times denser, and 2 times hotter than at the current epoch. The theoretical calculations of mass-loss rate appear to be in agreement with the empirically derived values for stars of various ages. These results can provide realistic constraints for wind dynamic pressures on magnetospheres of (exo)planets around the young Sun and other active stars, which is crucial in realistic assessment of the Joule heating of their ionospheres and corresponding effects of atmospheric erosion.

  5. Reconstructing the History of Mesoamerican Populations through the Study of the Mitochondrial DNA Control Region

    Science.gov (United States)

    Gorostiza, Amaya; Acunha-Alonzo, Víctor; Regalado-Liu, Lucía; Tirado, Sergio; Granados, Julio; Sámano, David; Rangel-Villalobos, Héctor; González-Martín, Antonio

    2012-01-01

    The study of genetic information can reveal a reconstruction of human populations' history. We sequenced the entire mtDNA control region (positions 16024 to 576 following the Cambridge Reference Sequence, CRS) of 605 individuals from seven previously defined Mesoamerican indigenous groups and one Aridoamerican group from the Greater Southwest, all within present-day Mexico. Samples were collected directly from the indigenous populations; an individual survey made it possible to exclude samples from related individuals or individuals of other origins. Diversity indices and demographic estimates were calculated, and AMOVAs were computed according to different criteria. An MDS plot based on FST distances was also built, and individual networks were constructed for the four Amerindian haplogroups detected. Finally, the Barrier software was applied to detect genetic boundaries among populations. The results suggest a common origin of the indigenous groups, a small degree of European admixture, and inter-ethnic gene flow. The human settlement of Mesoamerica took place quickly, influenced by the region's orography, which facilitated the development of genetic and cultural differences. We find that the genetic structure is related to the region's geography rather than to cultural parameters such as language. The human population gradually became fragmented; groups remained relatively isolated and differentiated due to small population sizes and different survival strategies. Genetic differences were detected between Aridoamerica and Mesoamerica, which can be subdivided into "East", "Center", "West" and "Southeast". The fragmentation process occurred mainly during the Mesoamerican Pre-Classic period, with the Otomí being one of the oldest groups. With an increased number of populations studied, adding previously published data, there is no change in the conclusions, although significant genetic heterogeneity can be detected in Pima

  6. Reconstructing the Solar Wind from Its Early History to Current Epoch

    Science.gov (United States)

    Airapetian, Vladimir S.; Usmanov, Arcadi V.

    2016-02-01

    Stellar winds from active solar-type stars can play a crucial role in removal of stellar angular momentum and erosion of planetary atmospheres. However, major wind properties except for mass-loss rates cannot be directly derived from observations. We employed a three-dimensional magnetohydrodynamic Alfvén wave driven solar wind model, ALF3D, to reconstruct the solar wind parameters including the mass-loss rate, terminal velocity, and wind temperature at 0.7, 2, and 4.65 Gyr. Our model treats the wind thermal electrons, protons, and pickup protons as separate fluids and incorporates turbulence transport, eddy viscosity, turbulent resistivity, and turbulent heating to properly describe proton and electron temperatures of the solar wind. To study the evolution of the solar wind, we specified three input model parameters, the plasma density, Alfvén wave amplitude, and the strength of the dipole magnetic field at the wind base for each of three solar wind evolution models that are consistent with observational constraints. Our model results show that at 1 AU, in the Sun's early history at 0.7 Gyr, the paleo solar wind was twice as fast, ∼50 times denser, and 2 times hotter than at the current epoch. The theoretical calculations of mass-loss rate appear to be in agreement with the empirically derived values for stars of various ages. These results can provide realistic constraints for wind dynamic pressures on magnetospheres of (exo)planets around the young Sun and other active stars, which is crucial in realistic assessment of the Joule heating of their ionospheres and corresponding effects of atmospheric erosion.

  7. Significance of "stretched" mineral inclusions for reconstructing P-T exhumation history

    Science.gov (United States)

    Ashley, Kyle T.; Darling, Robert S.; Bodnar, Robert J.; Law, Richard D.

    2015-06-01

    Analysis of mineral inclusions in chemically and physically resistant hosts has proven to be valuable for reconstructing the P-T exhumation history of high-grade metamorphic rocks. The occurrence of cristobalite-bearing inclusions in garnets from Gore Mountain, New York, is unexpected because the peak metamorphic conditions reached are well removed (>600 °C too cold) from the stability field of this low-density silica polymorph that typically forms in high temperature volcanic environments. A previous study of samples from this area interpreted polymineralic inclusions consisting of cristobalite, albite and ilmenite as representing crystallized droplets of melt generated during a garnet-in reaction, followed by water loss from the inclusion to explain the reduction in inclusion pressure that drove the transformation of quartz to cristobalite. However, the recent discovery of monomineralic inclusions of cristobalite from the nearby Hooper Mine cannot be explained by this process. For these inclusions, we propose that the volume response to pressure and temperature changes during exhumation to Earth's surface resulted in large tensile stresses within the silica phase that would be sufficient to cause transformation to the low-density (low-pressure) form. Elastic modeling of other common inclusion-host systems suggests that this quartz-to-cristobalite example may not be a unique case. The aluminosilicate polymorph kyanite also has the capacity to retain tensile stresses if exhumed to Earth's surface after being trapped as an inclusion in plagioclase at P-T conditions within the kyanite stability field, with the stresses developed during exhumation sufficient to produce a transformation to andalusite. These results highlight the elastic environment that may arise during exhumation and provide a potential explanation of observed inclusions whose stability fields are well removed from P-T paths followed during exhumation.

  8. Past and current trends of change in a dune prairie/oak savanna reconstructed through a multiple-scale history

    Science.gov (United States)

    Cole, K.L.; Taylor, R.S.

    1995-01-01

    The history of a rapidly changing mosaic of prairie and oak savanna in northern Indiana was reconstructed using several methods emphasizing different time scales ranging from annual to millennial. Vegetation change was monitored for 8 yr using plots and for 30 yr using aerial photographs. A 20th century fire history was reconstructed from the stand structure of multiple-stemmed trees and fire scars. General Land Office Survey data were used to reconstruct the forest of A.D. 1834. Fossil pollen and charcoal records were used to reconstruct the last 4000 yr of vegetation and fire history. Since its deposition along the shore of Lake Michigan about 4000 yr ago, the area has followed a classical primary dune successional sequence, gradually changing from pine forest to prairie/oak savanna between A.D. 264 and 1007. This successional trend, predicted in the models of Henry Cowles, occurred even though the climate cooled and prairies elsewhere in the region retreated. Severe fires in the 19th century reduced most tree species but led to a temporary increase in Populus tremuloides. During the last few decades, the prairie has been invaded by oaks and other woody species, primarily because of fire suppression since A.D. 1972. The rapid and complex changes now occurring are a response to the compounded effects of plant succession, intense burning and logging in the 19th century, recent fire suppression, and possibly increased airborne deposition of nitrates. The compilation of several historical research techniques emphasizing different time scales allows this study of the interactions between multiple disturbance variables.

  9. Reconstructing the tectonic history of Fennoscandia from its margins: The past 100 million years

    Energy Technology Data Exchange (ETDEWEB)

    Muir Wood, R. [EQE International Ltd (United Kingdom)

    1995-12-01

    In the absence of onland late Mesozoic and Cenozoic geological formations, the tectonic history of the Baltic Shield over the past 100 million years can be reconstructed from the thick sedimentary basins that surround Fennoscandia on three sides. Tectonic activity around Fennoscandia through this period has been diverse but can be divided into four main periods: a. pre North Atlantic spreading ridge (100-60 Ma), when transpressional deformation on the southern margins of Fennoscandia and transtensional activity to the west was associated with a NNE-SSW maximum compressive stress direction; b. the creation of the spreading ridge (60-45 Ma), when there was rifting along the western margin; c. the re-arrangement of spreading axes (45-25 Ma), when there was radial compression around Fennoscandia; and d. the re-emergence of the Iceland hot-spot (25-0 Ma), when the stress field has come to accord with ridge or plume 'push'. Since 60 Ma the Alpine plate boundary has had little influence on Fennoscandia. The highest levels of deformation on the margins of Fennoscandia were reached around 85 Ma and 60-55 Ma, with strain rates around 10⁻⁹/year. Within the Baltic Shield, long-term strain rates have been around 10⁻¹¹/year, with little evidence for significant deformations passing into the shield from the margins. Fennoscandian Border Zone activity, which was prominent from 90-60 Ma, was largely abandoned following the creation of the Norwegian Sea spreading ridge, and with the exception of the Lofoten margin there is subsequently little evidence for deformation passing into Fennoscandia. Renewal of modest compressional deformation in the Voering Basin suggests that the 'Current Tectonic Regime' is of Quaternary age, although the orientation of the major stress axis has remained consistent since around 10 Ma. The past pattern of changes suggests that in the geological near-future, variations are to be anticipated in the magnitude rather than the orientation of stresses.

  10. Reconstruction of paleostorm history using geochemical proxies in sediment cores from Eastern Lake, Florida

    Science.gov (United States)

    Das, O.; Wang, Y.; Donoghue, J. F.; Coor, J. L.; Kish, S.; Elsner, J.; Hu, X. B.; Niedoroda, A. W.; Ye, M.; Xu, Y.

    2009-12-01

    Analysis of geochemical proxies of coastal lake sediments provides a useful tool for reconstructing paleostorm history. Such paleostorm records can help constrain models that are used to predict future storm events. In this study, we collected two sediment cores (60 and 103 cm long, respectively) from the center of Eastern Lake located on the Gulf coast of NW Florida. These cores, which are mainly composed of organic-rich mud and organic-poor sand, were sub-sampled at 2-3mm intervals for analyses of their organic carbon and nitrogen concentrations as well as δ13C and δ15N isotopic signatures. Selected samples were submitted for radiocarbon dating in order to establish a chronological framework for the interpretation of the geochemical data. There are significant variations in δ13C, δ15N, C%, N% and C/N with depth. The δ13C and δ15N values vary from -21.8‰ to -26.7‰ and 2.6‰ to 5‰, respectively. The stable isotopic signatures of carbon and nitrogen indicate that the sources of organic matter in sediments include terrestrial C3 type vegetation, marine input from Gulf of Mexico and biological productivity within the lake, such as phytoplankton and zooplankton growing in the lacustrine environment. The δ13C and δ15N values exhibit significant negative excursions by 2‰ in a 30 cm thick sand layer, bounded by a rapid return to the base value. A positive shift in the δ15N record observed in the upper part of the cores likely reflects increased anthropogenic input of N such as sewage or septic tank effluents associated with recent development of areas around the lake for human habitation. Similarly, organic C% and N% range from 5.8 to 0.4 and 0.4 to 0.1, respectively. A prominent negative shift by 2σ relative to the baseline in C% and N% has been observed at approx. 55 to 58 cm depth, consisting of an organic-poor sand layer. This shift in C% and N% can be correlated with the negative shift in the δ13C and δ15N values, indicating a major storm event

  11. Reconstructing the tectonic history of Fennoscandia from its margins: The past 100 million years

    International Nuclear Information System (INIS)

    In the absence of onland late Mesozoic and Cenozoic geological formations, the tectonic history of the Baltic Shield over the past 100 million years can most readily be reconstructed from the thick sedimentary basins that surround Fennoscandia on three sides. Tectonic activity around Fennoscandia through this period has been diverse but can be divided into four main periods: a. pre North Atlantic spreading ridge (100-60 Ma), when transpressional deformation on the southern margins of Fennoscandia and transtensional activity to the west was associated with a NNE-SSW maximum compressive stress direction; b. the creation of the spreading ridge (60-45 Ma), when there was rifting along the western margin; c. the re-arrangement of spreading axes (45-25 Ma), when there was radial compression around Fennoscandia; and d. the re-emergence of the Iceland hot-spot (25-0 Ma), when the stress field has come to accord with ridge or plume 'push'. Since 60 Ma the Alpine plate boundary has had little influence on Fennoscandia. The highest levels of deformation on the margins of Fennoscandia were reached around 85 Ma and 60-55 Ma, with strain rates around 10⁻⁹/year. Within the Baltic Shield, long-term strain rates have been around 10⁻¹¹/year, with little evidence for significant deformations passing into the shield from the margins. Fennoscandian Border Zone activity, which was prominent from 90-60 Ma, was largely abandoned following the creation of the Norwegian Sea spreading ridge, and with the exception of the Lofoten margin there is subsequently very little evidence for deformation passing into Fennoscandia. Renewal of modest compressional deformation in the Voering Basin suggests that the 'Current Tectonic Regime' is of Quaternary age, although the orientation of the major stress axis has remained approximately consistent since around 10 Ma. The past pattern of changes suggests that in the geological near-future, variations are to be anticipated in the magnitude rather than

  12. A unique opportunity to reconstruct the volcanic history of the island of Nevis, Lesser Antilles

    Science.gov (United States)

    Saginor, I.; Gazel, E.

    2012-12-01

    We report twelve new ICP-MS analyses and two ⁴⁰Ar/³⁹Ar ages for the Caribbean island of Nevis, located in the Lesser Antilles. These data show a very strong fractionation trend, suggesting that along-strike variations may be primarily controlled by the interaction of rising magma with the upper plate. If this fractionation trend is shown to correlate with age, it may suggest that underplating of the crust is responsible for variations in the makeup of erupted lava over time, particularly with respect to silica content. We have recently been given permission to sample a series of cores being drilled by a geothermal company, with the goal of reconstructing the volcanic history of the island. Drilling is often cost-prohibitive, making this a truly unique opportunity. Nevis has received little recent attention from researchers because it has not been active for at least 100,000 years and because of its proximity to the highly active Montserrat, which boasts its very own volcano observatory. However, a number of good reasons make this region, and Nevis in particular, an ideal location for further analysis. First, and most importantly, is the access to thousands of meters of drill cores provided by a local geothermal company. Second, a robust earthquake catalog exists (Bengoubou-Valerius et al., 2008), so the dip and depth to the subducting slab are well known. These are fundamental parameters that influence the mechanics of a subduction zone; it would therefore be difficult to proceed if they were poorly constrained. Third, prior sampling of Nevis has been limited since Hutton and Nockolds (1978) published the only extensive petrologic study ever performed on the island. This paper contained only 43 geochemical analyses and 6 K-Ar ages, which are less reliable than more modern Ar-Ar ages. Subsequent studies tended to focus on water geochemistry (GeothermEx, 2005), geothermal potential (Geotermica Italiana, 1992; Huttrer, 1998

  13. Holocene local forest history at two sites in Småland, southern Sweden - insights from quantitative reconstructions using the Landscape Reconstruction Algorithm

    Science.gov (United States)

    Cui, Qiaoyu; Gaillard, Marie-José; Lemdahl, Geoffrey; Olsson, Fredrik; Sugita, Shinya

    2010-05-01

    Quantitative reconstruction of past vegetation using fossil pollen has long been problematic. It is well known that pollen percentages and pollen accumulation rates do not represent vegetation abundance properly, because pollen values are influenced by many factors, of which inter-taxonomic differences in pollen productivity and vegetation structure are the most important. It is also recognized that pollen assemblages from large sites (lakes or bogs) record the characteristics of the regional vegetation, while pollen assemblages from small sites record local features. Based on the theoretical understanding of the factors and mechanisms that affect pollen representation of vegetation, Sugita (2007a and b) proposed the Landscape Reconstruction Algorithm (LRA) to estimate vegetation abundance in percentage cover at well-defined spatial scales. The LRA includes two models, REVEALS and LOVE. REVEALS estimates regional vegetation abundance at a spatial scale of 100 km x 100 km. LOVE estimates local vegetation abundance at the spatial scale of the relevant source area of pollen (RSAP sensu Sugita 1993) of the pollen site. REVEALS estimates are needed to apply LOVE in order to calculate the RSAP and the vegetation cover within the RSAP. The two models have been validated theoretically and empirically. Two small bogs in southern Sweden were studied for pollen, plant macrofossils, charcoal, and Coleoptera in order to reconstruct the local Holocene forest and fire history (e.g. Greisman and Gaillard 2009; Olsson et al. 2009). We applied the LOVE model in order to 1) compare the LOVE estimates with pollen percentages for a better understanding of the local forest history; and 2) obtain more precise information on the local vegetation to explain between-site differences in fire history. We used pollen records from two large lakes in Småland to obtain REVEALS estimates for twelve continuous 500-yr time windows. Following the strategy of the Swedish VR LANDCLIM project (see Gaillard
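
    As a rough illustration of the REVEALS step described above, the sketch below applies the model's core correction, dividing pollen counts by taxon-specific pollen productivity and dispersal-deposition factors and renormalizing. It is not the full REVEALS model (which integrates a dispersal model over the basin and propagates errors), and the taxa, counts, PPEs, and factors are invented for illustration.

        import numpy as np

        # Hypothetical pollen counts from a large lake, with assumed relative
        # pollen productivity estimates (PPEs, relative to Poaceae) and
        # dispersal-deposition factors; every number here is invented.
        taxa   = ["Pinus", "Betula", "Poaceae", "Quercus"]
        counts = np.array([420.0, 310.0, 150.0, 60.0])  # grains counted per taxon
        ppe    = np.array([6.4, 8.9, 1.0, 5.8])         # relative pollen productivity
        k_fac  = np.array([1.1, 0.9, 0.7, 1.0])         # dispersal-deposition factors

        def reveals_core(counts, ppe, k_fac):
            """Correct counts for productivity and dispersal, then renormalize
            to regional vegetation cover fractions (the core REVEALS idea)."""
            corrected = counts / (ppe * k_fac)
            return corrected / corrected.sum()

        for taxon, cover in zip(taxa, reveals_core(counts, ppe, k_fac)):
            print(f"{taxon:8s} {cover:5.1%}")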

  14. Reconstructing Iconic Experiments in Electrochemistry: Experiences from a History of Science Course

    Science.gov (United States)

    Eggen, Per-Odd; Kvittingen, Lise; Lykknes, Annette; Wittje, Roland

    2012-01-01

    The decomposition of water by electricity, and the voltaic pile as a means of generating electricity, have both held an iconic status in the history of science as well as in the history of science teaching. These experiments featured in chemistry and physics textbooks, as well as in classroom teaching, throughout the nineteenth and twentieth…

  15. Contact-induced change in Dolgan : an investigation into the role of linguistic data for the reconstruction of a people's (pre)history

    NARCIS (Netherlands)

    Stapert, Eugénie

    2013-01-01

    This study explores the role of linguistic data in the reconstruction of Dolgan (pre)history. While most ethno-linguistic groups have a longstanding history and a clear ethnic and linguistic affiliation, the formation of the Dolgans has been a relatively recent development, and their ethnic origins

  16. Bayesian biostatistics

    CERN Document Server

    Lesaffre, Emmanuel

    2012-01-01

    The growth of biostatistics has been phenomenal in recent years and has been marked by considerable technical innovation in both methodology and computational practicality. One area that has experienced significant growth is Bayesian methods. The growing use of Bayesian methodology has taken place partly due to an increasing number of practitioners valuing the Bayesian paradigm as matching that of scientific discovery. In addition, computational advances have allowed for more complex models to be fitted routinely to realistic data sets. Through examples, exercises and a combination of introd

  17. Reconstruction of bomb 14C time history recorded in the stalagmite from Postojna Cave

    International Nuclear Information System (INIS)

    Karstic caves provide valuable resources for reconstructing past environmental conditions on the continent. This is possible due to the great stability of climatic conditions within a cave. Secondary minerals deposited in caves, known as speleothems, preserve records of long-term climatic and environmental changes at the site of their deposition and in its vicinity. The purity of speleothems and their chemical and physical stability make them exceptionally well suited for detailed geochemical and isotopic analysis.

  18. Bayesian statistics

    OpenAIRE

    Draper, D.

    2001-01-01

    Article Outline: Glossary; Definition of the Subject and Introduction; The Bayesian Statistical Paradigm; Three Examples; Comparison with the Frequentist Statistical Paradigm; Future Directions; Bibliography.

  19. Between history and cultural psychology: Some reflections on mediation and normativity when reconstructing the past

    DEFF Research Database (Denmark)

    Brescó, Ignacio

    2016-01-01

    ... a reflection on these two issues. It begins by highlighting the contribution of psychology and history, as emerging disciplines in the 19th century, to the creation of a normative framework for the subject of modernity according to the needs of modern nation states. It also alludes to both disciplines' common pursuit of a reference point in natural science in order to achieve a status on a par with the latter's. It is argued that this resulted in an objectivist stance that equates the study of memory and history with an accurate reproduction of the past, thus concealing the mediated nature of past...

  20. Genetic Structuration, Demography and Evolutionary History of Mycobacterium tuberculosis LAM9 Sublineage in the Americas as Two Distinct Subpopulations Revealed by Bayesian Analyses.

    Directory of Open Access Journals (Sweden)

    Yann Reynaud

    Full Text Available Tuberculosis (TB) remains broadly present in the Americas despite intense global efforts for its control and elimination. Starting from a large dataset comprising spoligotyping (n = 21183 isolates) and 12-loci MIRU-VNTR data (n = 4022 isolates) from a total of 31 countries of the Americas (data extracted from the SITVIT2 database), this study aimed to get an overview of lineages circulating in the Americas. A total of 17119 (80.8%) strains belonged to the Euro-American lineage 4, among which the most predominant genotypic family was the Latin American and Mediterranean (LAM) lineage (n = 6386, 30.1% of strains). By combining classical phylogenetic analyses and Bayesian approaches, this study revealed for the first time a clear genetic structuration of the LAM9 sublineage into two subpopulations, named LAM9C1 and LAM9C2, with distinct genetic characteristics. LAM9C1 was predominant in Chile, Colombia and USA, while LAM9C2 was predominant in Brazil, Dominican Republic, Guadeloupe and French Guiana. Globally, LAM9C2 was characterized by higher allelic richness as compared to LAM9C1 isolates. Moreover, the LAM9C2 sublineage appeared to have expanded close to twenty times more than LAM9C1 and showed older traces of expansion. Interestingly, a significant proportion of LAM9C2 isolates presented the typical signature of the ancestral LAM-RDRio MIRU-VNTR type (224226153321). Further studies based on Whole Genome Sequencing of LAM strains will provide the resolution needed to decipher the biogeographical structure and evolutionary history of this successful family.

  1. Reconstructions of subducted ocean floor along the Andes: a framework for assessing Magmatic and Ore Deposit History

    Science.gov (United States)

    Sdrolias, M.; Müller, R.

    2006-05-01

    The South American-Antarctic margin has been characterised by numerous episodes of volcanic arc activity and ore deposit formation throughout much of the Mesozoic and Cenozoic. Although its Cenozoic subduction history is relatively well known, placing the Mesozoic arc-related volcanics and the emplacement of ore bodies in their plate tectonic context remains poorly constrained. We use a merged moving hotspot (Late Cretaceous-present) and palaeomagnetic/fixed hotspot (Early Cretaceous) reference frame, coupled with reconstructed spreading histories of the Pacific, Phoenix and Farallon plates, to understand the convergence history of the South American and Antarctic margins. We compute the age-area distribution of oceanic lithosphere through time, including subducting oceanic lithosphere, and estimate convergence rates along the margin. Additionally, we map the location and migration of spreading ridges along the margin and relate this to processes on the overriding plate. The South American-Antarctic margin in the late Jurassic-early Cretaceous was dominated by rapid convergence and the subduction of relatively young oceanic lithosphere ("Rocas Verdes") in southern South America. The speed of subduction increased again along the South American-Antarctic margin at ~105 Ma after another change in tectonic regime. Newly created crust from the Farallon-Phoenix ridge continued to be subducted along southern South America until the cessation of the Farallon-Phoenix ridge in the latest Cretaceous/beginning of the Cenozoic. The age of the subducting oceanic lithosphere along the South American-Antarctic margin has increased steadily through time.

  2. Literacy Models and the Reconstruction of History Education: A Comparative Discourse Analysis of Two Lesson Plans

    Science.gov (United States)

    Collin, Ross; Reich, Gabriel A.

    2015-01-01

    This article presents discourse analyses of two lesson plans designed for secondary school history classes. Although the plans focus on the same topic, they rely on different models of content area literacy: disciplinary literacy, or reading and writing like experts in a given domain, and critical literacy, or reading and writing to address…

  3. Reconstructing Iconic Experiments in Electrochemistry: Experiences from a History of Science Course

    Science.gov (United States)

    Eggen, Per-Odd; Kvittingen, Lise; Lykknes, Annette; Wittje, Roland

    2011-04-01

    The decomposition of water by electricity, and the voltaic pile as a means of generating electricity, have both held an iconic status in the history of science as well as in the history of science teaching. These experiments featured in chemistry and physics textbooks, as well as in classroom teaching, throughout the nineteenth and twentieth centuries. This paper deals with our experiences in restaging the decomposition of water as part of a history of science course at the Norwegian University of Science and Technology, Trondheim, Norway. For the experiment we used an apparatus from our historical teaching collection and built a replica of a voltaic pile. We also traced the uses and meanings of decomposition of water within science and science teaching in schools and higher education in local institutions. Building the pile, and carrying out the experiments, held a few surprises that we did not anticipate through our study of written sources. The exercise gave us valuable insight into the nature of the devices and the experiment, and our students appreciated an experience of a different kind in a history of science course.

  4. Reconstruction of burial history, temperature, source rock maturity and hydrocarbon generation in the northwestern Dutch offshore

    NARCIS (Netherlands)

    Abdul Fattah, R.; Verweij, J.M.; Witmans, N.; Veen, J.H. ten

    2012-01-01

    3D basin modelling is used to investigate the history of maturation and hydrocarbon generation on the main platforms in the northwestern part of the offshore area of the Netherlands. The study area covers the Cleaverbank and Elbow Spit Platforms. Recently compiled maps and data are used to build the

  5. Reconstruction of exposure histories of meteorites from Antarctica and the Sahara

    Energy Technology Data Exchange (ETDEWEB)

    Neupert, U.; Neumann, S.; Leya, I.; Michel, R. [Hannover Univ. (Germany). Zentraleinrichtung fuer Strahlenschutz (ZfS); Kubik, P.W. [Paul Scherrer Inst. (PSI), Villigen (Switzerland); Bonani, G.; Hajdas, I.; Suter, M. [Eidgenoessische Technische Hochschule, Zurich (Switzerland)

    1997-09-01

    ¹⁰Be, ¹⁴C, and ²⁶Al were analyzed in H-, L-, and LL-chondrites from the Acfer region in the Algerian Sahara and from the Allan Hills/Antarctica. Exposure histories and terrestrial ages could be determined. (author) 3 figs., 2 refs.

  6. ORIGIN OF SURGERY: A HISTORY OF EXPLORATION OF PLASTIC AND RECONSTRUCTIVE SURGERY

    Directory of Open Access Journals (Sweden)

    Panigrahi

    2013-10-01

    Full Text Available The Sushruta Samhita, Charaka Samhita and Astanga Sangraha are the three compendia of Ayurveda, encompassing all of its aspects. Plastic surgery (sandhana karma) is a very old branch of surgery, described since the Vedic era. Almost all the Samhitas describe methods of sandhana karma (plastic and reconstructive surgery). Today the world recognizes the pioneering nature of Sushruta's plastic and reconstructive surgery, and Sushruta is regarded as the father of plastic surgery. The plastic operations of the ear (otoplasty) and rhinoplasty (plastic surgery of the nose) are described in the 16th chapter of the first book (Sutrasthana) of the compendium. Methods are first described for piercing the earlobes of an infant, which is still a widespread practice in India. Sushruta described 15 methods for joining cut earlobes. For this plastic operation, called karna bedha, a piece of skin was taken from the cheek, turned back, and suitably stitched to the lobules. Sushruta also described rhinoplasty (nasa sandhana): the portion of the nose to be covered should first be measured with a leaf; then a piece of skin of the required size should be dissected from the living skin of the cheek and turned back to cover the nose, keeping a small pedicle attached to the cheek. The part of the nose to which the skin is to be attached should be made raw by cutting the nasal stump with a knife. The surgeon should then place the skin on the nose and stitch the two parts swiftly, keeping the skin properly elevated by inserting two tubes of eranda (the castor oil plant) in the nostrils so that the new nose gets its proper shape. It should then be sprinkled with a powder composed of liquorice, red sandalwood and barberry plant, covered with cotton, and clean sesame oil should be applied to it constantly. Sushruta also mentioned reconstruction of the broken lip and of the hare-lip (ostha sandhana). These descriptions are brief in nature in comparison to

  7. Reconstructing the star formation history of the Milky Way disc(s) from chemical abundances

    CERN Document Server

    Snaith, O; Di Matteo, P; Lehnert, M D; Combes, F; Katz, D; Gómez, A

    2014-01-01

    We develop a chemical evolution model in order to study the star formation history of the Milky Way. Our model assumes that the Milky Way is formed from a closed box-like system in the inner regions, while the outer parts of the disc experience some accretion. Unlike the usual procedure, we do not fix the star formation prescription (e.g. Kennicutt law) in order to reproduce the chemical abundance trends. Instead, we fit the abundance trends with age in order to recover the star formation history of the Galaxy. Our method enables one to recover with unprecedented accuracy the star formation history of the Milky Way in the first Gyrs, in both the inner and outer (R > 9-10 kpc) discs as sampled in the solar vicinity. We show that, in the inner disc, half of the stellar mass formed during the thick disc phase, in the first 4-5 Gyr. This phase was followed by a significant dip in the star formation activity (at 8-9 Gyr) and a period of roughly constant lower-level star formation for the remaining 8 Gyr. The thick disc phase ha...

  8. Probing the Expansion history of the Universe by Model-Independent Reconstruction from Supernovae and Gamma-Ray Bursts Measurements

    CERN Document Server

    Feng, Chao-Jun

    2016-01-01

    To probe the late evolution history of the Universe, we adopt two kinds of optimal basis systems. One is constructed by performing principal component analysis (PCA) and the other is built by taking the multidimensional scaling (MDS) approach. Cosmological observables such as the luminosity distance can be decomposed into these basis systems. The bases are optimized for different kinds of cosmological models that are based on different physical assumptions, and even for a mixture model of them. Therefore, the so-called feature space projected from the basis systems is cosmological-model independent, and it provides a parameterization for studying and reconstructing the Hubble expansion rate from the supernova luminosity distance and even gamma-ray burst (GRB) data with self-calibration. The circular problem when using GRBs as cosmological candles is naturally eliminated in this procedure. By using the Levenberg-Marquardt (LM) technique and the Markov Chain Monte Carlo (MCMC) method, we perform an ...
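
    A minimal sketch of the PCA half of this idea, under assumed toy choices (a flat wCDM-like training family, distances in units of c/H0, invented noise): build a basis from model curves, then expand mock data in that basis so the coefficients parameterize the expansion history without committing to one model. This is an illustration, not the authors' pipeline.

        import numpy as np
        from scipy.integrate import cumulative_trapezoid

        z = np.linspace(0.01, 2.0, 100)

        def luminosity_distance(z, omega_m, w):
            """Flat wCDM luminosity distance in units of c/H0 (toy model)."""
            E = np.sqrt(omega_m * (1 + z)**3
                        + (1 - omega_m) * (1 + z)**(3 * (1 + w)))
            return (1 + z) * cumulative_trapezoid(1.0 / E, z, initial=0.0)

        # Training family of model curves spanning the parameter space.
        models = np.array([luminosity_distance(z, om, w)
                           for om in np.linspace(0.2, 0.4, 15)
                           for w in np.linspace(-1.3, -0.7, 15)])

        # PCA basis: leading right singular vectors of the centered family.
        mean = models.mean(axis=0)
        _, _, Vt = np.linalg.svd(models - mean, full_matrices=False)
        basis = Vt[:3]

        # Expand noisy mock data in the basis; the coefficients are the
        # model-independent "feature space" coordinates.
        rng = np.random.default_rng(0)
        mock = luminosity_distance(z, 0.3, -1.0) + rng.normal(0.0, 0.01, z.size)
        coeffs = basis @ (mock - mean)
        reconstruction = mean + coeffs @ basis
        print("max |reconstruction - mock|:", np.abs(reconstruction - mock).max())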

  9. Reconstructing the Solar Wind From Its Early History To Current Epoch

    CERN Document Server

    Airapetian, Vladimir S

    2016-01-01

    Stellar winds from active solar type stars can play a crucial role in removal of stellar angular momentum and erosion of planetary atmospheres. However, major wind properties except for mass loss rates cannot be directly derived from observations. We employed a three dimensional magnetohydrodynamic Alfven wave driven solar wind model, ALF3D, to reconstruct the solar wind parameters including the mass loss rate, terminal velocity and wind temperature at 0.7, 2 and 4.65 Gyr. Our model treats the wind thermal electrons, protons and pickup protons as separate fluids and incorporates turbulence transport, eddy viscosity, turbulent resistivity, and turbulent heating to properly describe proton and electron temperatures of the solar wind. To study the evolution of the solar wind, we specified three input model parameters, the plasma density, Alfven wave amplitude and the strength of the dipole magnetic field at the wind base for each of three solar wind evolution models that are consistent with observational constra...

  10. Signs of malnutrition and starvation--reconstruction of nutritional life histories by serial isotopic analyses of hair.

    Science.gov (United States)

    Neuberger, Ferdinand M; Jopp, Eilin; Graw, Matthias; Püschel, Klaus; Grupe, Gisela

    2013-03-10

    The diagnosis of starvation in children or adults is an important topic in paediatric and geriatric medicine and in legal assessment. To date, few reliable techniques are available to reconstruct the onset and duration of undernourishment, especially in cases of wilful neglect or abuse. The intention of this research project is to introduce a method based on isotopic analysis to reconstruct nutritional life histories and to detect starvation. For this purpose, the specific signature of stable carbon and nitrogen isotopes in human hair samples is investigated and measured in the course of serious nutritional deprivation. A previous study by our research group on anorectic patients showed that incremental hair analyses can monitor the individual nutritional status of each patient. Increasing δ15N values indicate the catabolism of bodily protein and are associated with a very low BMI. In contrast, the changes in the δ13C values and BMI were in phase, which can be linked to the lack of energy in the consumed diet and the breakdown of body fat deposits. These findings were then applied to various forensic cases in which severe starvation occurred recently prior to death. We aim to establish an unbiased biomarker to identify the individual timeframe of nutritional deprivation, in order to detect and prevent starvation.

  11. Demographic History of the Genus Pan Inferred from Whole Mitochondrial Genome Reconstructions

    Science.gov (United States)

    Tucci, Serena; de Manuel, Marc; Ghirotto, Silvia; Benazzo, Andrea; Prado-Martinez, Javier; Lorente-Galdos, Belen; Nam, Kiwoong; Dabad, Marc; Hernandez-Rodriguez, Jessica; Comas, David; Navarro, Arcadi; Schierup, Mikkel H.; Andres, Aida M.; Barbujani, Guido; Hvilsom, Christina; Marques-Bonet, Tomas

    2016-01-01

    The genus Pan is the closest genus to our own and it includes two species, Pan paniscus (bonobos) and Pan troglodytes (chimpanzees). The latter comprises four subspecies, all highly endangered. The study of the genus Pan has been persistently complicated by the intricate relationships among subspecies and the statistical limitations imposed by the small number of samples or genomic markers analyzed. Here, we present a new method to reconstruct complete mitochondrial genomes (mitogenomes) from whole genome shotgun (WGS) datasets, mtArchitect, showing that its reconstructions are highly accurate and consistent with long-range PCR mitogenomes. We used this approach to build the mitochondrial genomes of 20 newly sequenced samples which, together with available genomes, allowed us to analyze the hitherto most complete Pan mitochondrial genome dataset, including 156 chimpanzee and 44 bonobo individuals with a proportional contribution from all chimpanzee subspecies. We estimated the separation time between chimpanzees and bonobos at around 1.15 million years ago (Mya) [0.81–1.49]. Further, we found that under the most probable genealogical model the two clades of chimpanzees, Western + Nigeria-Cameroon and Central + Eastern, separated at 0.59 Mya [0.41–0.78], with further internal separations at 0.32 Mya [0.22–0.43] and 0.16 Mya [0.17–0.34], respectively. Finally, for a subset of our samples, we compared nuclear versus mitochondrial genomes and found that chimpanzee subspecies have different patterns of nuclear and mitochondrial diversity, which could be a result either of processes affecting the mitochondrial genome, such as hitchhiking or background selection, or of population dynamics. PMID:27345955

  12. Reconstructing the Accretion History of the Galactic Halo Using Stellar Chemical Abundance Ratio Distributions

    Science.gov (United States)

    Lee, Duane M.; Johnston, Kathryn V.; Sen, Bodhisattva; Jessop, Will

    2016-08-01

    In this study we tested the prospects of using 2D chemical abundance ratio distributions (CARDs) found in stars of the stellar halo to determine its formation history. First, we used simulated data from eleven "MW-like" halos to generate satellite template sets of 2D CARDs of accreted dwarf satellites, comprising accreted dwarfs from various mass regimes and epochs of accretion. Next, we randomly drew samples of ~10³-10⁴ mock observations of stellar chemical abundance ratios ([α/Fe], [Fe/H]) from those eleven halos to generate samples of the underlying densities for our CARDs to be compared to our templates in the analysis. Finally, we used the expectation-maximization algorithm to derive accretion histories in relation to the satellite template set (STS) used and the sample size. For certain STSs we can typically identify the relative mass contributions of all accreted satellites to within a factor of 2. We also find that this method is particularly sensitive to older accretion events involving low-luminosity dwarfs, e.g. ultra-faint dwarfs - precisely those events that are too ancient to be seen by phase-space studies of stars and too faint to be seen by high-z studies of the early Universe. Since our results only exploit two chemical dimensions and near-future surveys promise to provide ~6-9 dimensions, we conclude that these new high-resolution spectroscopic surveys of the stellar halo will allow us (given the development of new CARD-generating dwarf models) to recover the luminosity function of infalling dwarf galaxies - and the detailed accretion history of the halo - across cosmic time.
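
    The expectation-maximization step here is a standard mixture-weight estimate: with the template densities held fixed, EM alternates between computing each star's responsibilities under the templates and updating the mixing fractions. Below is a minimal sketch under invented assumptions (three templates, a flattened 50-bin abundance plane, synthetic stars), not the authors' pipeline.

        import numpy as np

        rng = np.random.default_rng(1)

        # Hypothetical setup: K templates, each a normalized histogram (CARD)
        # over B flattened abundance-plane bins; all values are invented.
        K, B = 3, 50
        templates = rng.dirichlet(np.ones(B), size=K)  # rows: P(bin | template k)
        true_w = np.array([0.6, 0.3, 0.1])             # true mass contributions
        bins = rng.choice(B, size=5000, p=true_w @ templates)  # mock stars

        def em_mixture_weights(bins, templates, n_iter=200):
            """EM for the mixing fractions of fixed component densities."""
            K = templates.shape[0]
            w = np.full(K, 1.0 / K)
            like = templates[:, bins]          # (K, N): P(star_n | template k)
            for _ in range(n_iter):
                resp = w[:, None] * like       # E-step: responsibilities
                resp /= resp.sum(axis=0)
                w = resp.mean(axis=1)          # M-step: update fractions
            return w

        print("recovered weights:", em_mixture_weights(bins, templates))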

  13. A Blacker and Browner Shade of Pale: Reconstructing Punk Rock History

    OpenAIRE

    Pietschmann, Franziska

    2010-01-01

    Embedded in the transatlantic history of rock ‘n’ roll, punk rock has not only been regarded as a watershed moment in terms of music, aesthetics and music-related cultural practices, it has also been perceived as a subversive white cultural phenomenon. A Blacker and Browner Shade of Pale challenges this widespread and shortsighted assumption. People of color, particularly black Americans and Britons, and Latina/os have pro-actively contributed to punk’s evolution and shaped punk music culture...

  14. Distance-Based Phylogeny Reconstruction: Safety and Edge Radius

    OpenAIRE

    Gascuel, Olivier; Pardi, Fabio; Truszkowski, Jakub

    2015-01-01

    A phylogeny is an evolutionary tree tracing the shared history, including common ancestors, of a set of extant species or "taxa". Phylogenies are increasingly reconstructed on the basis of molecular data (DNA and protein sequences) using statistical techniques such as likelihood and Bayesian methods. Algorithmically, these techniques suffer from the discrete nature of tree topology space. Since the number of tree topologies increases exponentially as a function of the ...

  15. Reconstructing and analyzing China's fifty-nine year (1951–2009) drought history using hydrological model simulation

    Directory of Open Access Journals (Sweden)

    Z. Y. Wu

    2011-09-01

    Full Text Available The 1951–2009 drought history of China is reconstructed using daily soil moisture values generated by the Variable Infiltration Capacity (VIC) land surface macroscale hydrology model. VIC is applied over a grid of 10 458 points with a spatial resolution of 30 km × 30 km, and is driven by observed daily maximum and minimum air temperature and precipitation from 624 long-term meteorological stations. The VIC soil moisture is used to calculate the Soil Moisture Anomaly Percentage Index (SMAPI), which can be used as a measure of the severity of agricultural drought on a global basis. We have developed a SMAPI-based drought identification procedure for practical use in the identification of both grid-point and regional drought events. As a result, a total of 325 regional drought events varying in time and strength are identified from China's nine drought study regions. These drought events can thus be assessed quantitatively at different spatial and temporal scales. The result shows that the severe drought events of 1978, 2000 and 2006 are well reconstructed, which indicates that the SMAPI is capable of identifying the onset of a drought event, its progression, as well as its termination. Spatial and temporal variations of droughts in China's nine drought study regions are studied. Our result shows that on average, up to 30% of the total area of China is prone to drought. Regionally, an upward trend in drought-affected areas has been detected in three regions (Inner Mongolia, Northeast and North) from 1951–2009. However, the decadal variability of droughts has been weak in the rest of the five regions (South, Southwest, East, Northwest, and Tibet). Xinjiang has even been getting steadily wetter since the 1950s. Two regional dry centres are discovered in China as the result of a combined analysis of the occurrence of drought events from both grid points and drought study regions. The first centre is located in the area partially covered by the North
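
    The SMAPI itself is straightforward: the percentage departure of simulated soil moisture from its climatology for the same period. A minimal sketch with invented values follows; the climatology, the noise model, and the -15% severity threshold are illustrative assumptions, not the paper's calibration.

        import numpy as np

        def smapi(sm, clim):
            """Soil Moisture Anomaly Percentage Index: percent departure of
            soil moisture from its climatological mean for the same period."""
            return 100.0 * (sm - clim) / clim

        # Hypothetical daily series for one grid cell; numbers are invented.
        rng = np.random.default_rng(7)
        clim = 0.30                                  # climatological soil moisture
        sm = clim + rng.normal(0.0, 0.03, size=365)  # simulated daily soil moisture
        sm[150:210] -= 0.08                          # impose a dry spell

        index = smapi(sm, clim)
        in_drought = index <= -15.0                  # assumed severity threshold
        print("drought days:", int(in_drought.sum()))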

  16. Reconstructing and analyzing China's fifty-nine year (1951–2009) drought history using hydrological model simulation

    Directory of Open Access Journals (Sweden)

    Z. Y. Wu

    2011-02-01

    Full Text Available The recent fifty-nine year (1951–2009) drought history of China is reconstructed using daily soil moisture values generated by the Variable Infiltration Capacity (VIC) land surface macroscale hydrology model. VIC is applied over a grid of 10 458 points with a spatial resolution of 30 km × 30 km, and is driven by observed daily maximum and minimum air temperature and precipitation from 624 long-term meteorological stations. The VIC soil moisture is used to calculate the Soil Moisture Anomaly Percentage Index (SMAPI), which can be used as a measure of the severity of agricultural drought on a global basis. We develop a SMAPI-based drought identification procedure for practical use in the identification of both grid-point and regional drought events. As a result, a total of 325 regional drought events varying in time and strength are identified from China's nine drought study regions. These drought events can thus be assessed quantitatively at different spatial and temporal scales. The result shows that the severe drought events of 1978, 2000 and 2006 are well reconstructed, indicating that the SMAPI is capable of identifying the onset of a drought event, its progression, as well as its termination. Spatial and temporal variations of droughts in China's nine drought study regions are studied. Our result shows that on average, up to 30% of the total area of China is prone to drought. Regionally, an upward trend in drought-affected areas has been detected in three regions (Inner Mongolia, Northeast and North) during the recent fifty-nine years. However, the decadal variability of droughts has been weak in the remaining five regions (South, Southwest, East, Northwest, and Tibet). Xinjiang has even been getting steadily wetter since the 1950s. Two regional dry centers are discovered in China as the result of a combined analysis of the occurrence of drought events from both grid points and drought study regions. The first center is located in the area partially covered by two

  17. Testing sex and gender in sports; reinventing, reimagining and reconstructing histories.

    Science.gov (United States)

    Heggie, Vanessa

    2010-12-01

    Most international sports organisations work on the premise that human beings come in one of two genders: male or female. Consequently, all athletes, including intersex and transgender individuals, must be assigned to compete in one or other category. Since the 1930s (not, as is popularly suggested, the 1960s) these organisations have relied on scientific and medical professionals to provide an 'objective' judgement of an athlete's eligibility to compete in women's national and international sporting events. The changing nature of these judgements reflects a great deal about our cultural, social and national prejudices, while the matter of testing itself has become a site of conflict for feminists and human rights activists. Because of the sensitive nature of this subject, histories of sex testing are difficult to write and research; this has led to the repetition of inaccurate information and false assertions about gender fraud, particularly in relation to the 'classic' cases of Stella Walsh and Heinrich/Hermann/Dora Ratjen. As historians, we need to be extremely careful to differentiate between mythologies and histories. PMID:20980057

  18. Reconstructing Jewish Identity on the Foundations of Hellenistic History: Azariah de' Rossi's Me'or 'Enayim in Late 16th Century Northern Italy

    OpenAIRE

    Rosenberg-Wohl, David Michael

    2014-01-01

    Abstract: Reconstructing Jewish Identity on the Foundations of Hellenistic History: Azariah de' Rossi's Me'or 'Enayim in Late 16th Century Northern Italy, by David Michael Rosenberg-Wohl, Doctor of Philosophy in Jewish Studies and the Graduate Theological Union, Professor Erich S. Gruen, Chair. Me'or 'Enayim is conventionally considered to be early modern Jewish history. Recent scholarship tends to consider the work Renaissance historiography, Counter-Reformation apology or some combination of the two. The...

  19. Reconstructing the evolutionary history of China: a caveat about inferences drawn from ancient DNA.

    Science.gov (United States)

    Yao, Yong-Gang; Kong, Qing-Peng; Man, Xiao-Yong; Bandelt, Hans-Jürgen; Zhang, Ya-Ping

    2003-02-01

    The decipherment of the meager information provided by short fragments of ancient mitochondrial DNA (mtDNA) is notoriously difficult but is regarded as a most promising way toward reconstructing the past from the genetic perspective. By haplogroup-specific hypervariable segment (HVS) motif search and matching or near-matching with available modern data sets, most of the ancient mtDNAs can be tentatively assigned to haplogroups, which are often subcontinent specific. Further typing for mtDNA haplogroup-diagnostic coding region polymorphisms, however, is indispensable for establishing the geographic/genetic affinities of ancient samples with less ambiguity. In the present study, we sequenced a fragment (approximately 982 bp) of the mtDNA control region in 76 Han individuals from Taian, Shandong, China, and we combined these data with previously reported samples from Zibo and Qingdao, Shandong. The reanalysis of two previously published ancient mtDNA population data sets from Linzi (same province) then indicates that the ancient populations had features in common with the modern populations from south China rather than any specific affinity to the European mtDNA pool. Our results highlight that ancient mtDNA data obtained under different sampling schemes and subject to potential contamination can easily create the impression of drastic spatiotemporal changes in the genetic structure of a regional population during the past few thousand years if inappropriate methods of data analysis are employed.

  20. Regional reconstruction of flash flood history in the Guadarrama range (Central System, Spain).

    Science.gov (United States)

    Rodriguez-Morata, C; Ballesteros-Cánovas, J A; Trappmann, D; Beniston, M; Stoffel, M

    2016-04-15

    Flash floods are a common natural hazard in Mediterranean mountain environments and are responsible for serious economic and human disasters. The study of flash flood dynamics and their triggers is a key issue; however, the retrieval of historical data is often limited in mountain regions as a result of short time series and a systematic lack of records. In this study, we attempt to overcome this data deficiency by supplementing existing records with dendrogeomorphic techniques, which were employed in seven mountain streams along the northern slopes of the Guadarrama mountain range. Here we present results derived from the tree-ring analysis of 117 samples from 63 Pinus sylvestris L. trees injured by flash floods, to complement existing flash flood records covering the last ~200 years, and comment on their hydro-meteorological triggers. To understand the varying number of reconstructed flash flood events in each of the catchments, we also performed a comparative analysis of geomorphic catchment characteristics, land use evolution and forest management. Furthermore, we discuss the limitations of dendrogeomorphic techniques applied in managed forests. PMID:26845178

  1. Reconstruction of Disturbance History in Naples Bay, Florida: A Combined Radiometric/Geochemical Approach

    Science.gov (United States)

    van Eaton, A. R.; Zimmerman, A.; Brenner, M.; Kenney, W.; Jaeger, J. M.

    2006-12-01

    Historical reconstructions of aquatic systems have commonly depended on short-lived radioisotopes (e.g. Pb-210 and Cs-137) to provide a temporal framework for disturbances over the past 100 years. However, applications of these radiotracers to highly variable systems such as estuaries are often problematic. Hydrologic systems prone to rapid shifts in sediment composition and grain size distribution may yield low and erratic isotopic activities with depth in sediment. Additionally, the marine influence on coastal systems and preferential adsorption of radionuclides by organic matter may violate assumptions of the CIC and CRS dating models. Whereas these sediment cores are often deemed "undateable", we propose a modeling technique that accounts for textural and compositional variation, providing insight into the depositional patterns and disturbance records of these dynamic environments. Here, the technique is applied to sediment cores collected from five regions of Naples Bay estuary in southwest Florida. The significant positive correlation between excess Pb-210 activities and organic matter content in each core provides evidence for strong lithologic control on radioisotope scavenging, supporting the use of organic matter-normalized excess Pb-210 activity profiles when modeling sediment accumulation rates in predominantly sandy estuaries. Using this approach, episodes of increased sedimentation rate were established that correspond to periods of heightened anthropogenic disturbance (canal dredging and development) in the Naples Bay watershed during the mid-1900s.
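
    To make the dating side concrete, here is a minimal sketch of a constant-rate-of-supply (CRS) style age model using the organic-matter normalization argued for above; all core values are invented, and a real analysis would integrate inventories and treat uncertainties far more carefully.

        import numpy as np

        LAMBDA_PB210 = np.log(2) / 22.3  # Pb-210 decay constant (1/yr), T1/2 = 22.3 yr

        # Hypothetical core profile: cumulative dry mass depth (g/cm^2), excess
        # Pb-210 activity (Bq/kg), and organic matter fraction; values invented.
        mass_depth = np.array([0.5, 1.0, 1.5, 2.0, 2.5, 3.0])
        activity   = np.array([120.0, 80.0, 55.0, 35.0, 22.0, 14.0])
        om_frac    = np.array([0.30, 0.28, 0.25, 0.20, 0.18, 0.15])

        # Normalize activity by organic matter content, following the abstract's
        # argument that organic matter controls Pb-210 scavenging in sandy cores.
        norm_activity = activity / om_frac

        # CRS idea: age t at a depth satisfies I(depth) = I(0) * exp(-lambda*t),
        # where I is the residual excess Pb-210 inventory below that depth.
        thickness = np.diff(mass_depth, prepend=0.0)
        layer_inventory = norm_activity * thickness
        residual = layer_inventory[::-1].cumsum()[::-1]  # inventory below depth
        ages = np.log(residual[0] / residual) / LAMBDA_PB210

        print(np.round(ages, 1))  # years before coring, increasing with depth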

  2. Fiction as Reconstruction of History: Narratives of the Civil War in American Literature

    Directory of Open Access Journals (Sweden)

    Reinhard Isensee

    2009-09-01

    Full Text Available Even after more than 140 years, the American Civil War continues to serve as a major source of inspiration for a plethora of literature in various genres. While only amounting to a brief period in American history in terms of years, this war has proved to be one of the central moments for defining the American nation since the second half of the nineteenth century. The facets of the Civil War, its protagonists, places, events, and political, social and cultural underpinnings seem to hold an ongoing fascination for both academic studies and fictional representations. Thus, it has been considered by many to be the most written-about war in the United States.

  3. Merged or monolithic? Using machine-learning to reconstruct the dynamical history of simulated star clusters

    CERN Document Server

    Pasquato, Mario

    2016-01-01

    Context. Machine-Learning (ML) solves problems by learning patterns from data, with limited or no human guidance. In Astronomy, it is mainly applied to large observational datasets, e.g. for morphological galaxy classification. Aims. We apply ML to gravitational N-body simulations of star clusters that are either formed by merging two progenitors or evolved in isolation, planning to later identify Globular Clusters (GCs) that may have a history of merging from observational data. Methods. We create mock-observations from simulated GCs, from which we measure a set of parameters (also called features in the machine-learning field). After dimensionality reduction on the feature space, the resulting datapoints are fed to various classification algorithms. Using repeated random subsampling validation we check whether the groups identified by the algorithms correspond to the underlying physical distinction between mergers and monolithically evolved simulations. Results. The three algorithms we considered (C5.0 trees, k-nearest neighbour, and support-vector machines) all achieve a test misclassification rate of about 10% without parameter tuning, with support-vector machines slightly outperforming the others.

  4. A Reconstruction of Development of the Periodic Table Based on History and Philosophy of Science and Its Implications for General Chemistry Textbooks

    Science.gov (United States)

    Brito, Angmary; Rodriguez, Maria A.; Niaz, Mansoor

    2005-01-01

    The objectives of this study are: (a) elaboration of a history and philosophy of science (HPS) framework based on a reconstruction of the development of the periodic table; (b) formulation of seven criteria based on the framework; and (c) evaluation of 57 freshman college-level general chemistry textbooks with respect to the presentation of the…

  5. Merged or monolithic? Using machine-learning to reconstruct the dynamical history of simulated star clusters

    Science.gov (United States)

    Pasquato, Mario; Chung, Chul

    2016-05-01

    Context. Machine-learning (ML) solves problems by learning patterns from data with limited or no human guidance. In astronomy, ML is mainly applied to large observational datasets, e.g. for morphological galaxy classification. Aims: We apply ML to gravitational N-body simulations of star clusters that are either formed by merging two progenitors or evolved in isolation, planning to later identify globular clusters (GCs) that may have a history of merging from observational data. Methods: We create mock-observations from simulated GCs, from which we measure a set of parameters (also called features in the machine-learning field). After carrying out dimensionality reduction on the feature space, the resulting datapoints are fed into various classification algorithms. Using repeated random subsampling validation, we check whether the groups identified by the algorithms correspond to the underlying physical distinction between mergers and monolithically evolved simulations. Results: The three algorithms we considered (C5.0 trees, k-nearest neighbour, and support-vector machines) all achieve a test misclassification rate of about 10% without parameter tuning, with support-vector machines slightly outperforming the others. The first principal component of feature space correlates with cluster concentration. If we exclude it from the regression, the performance of the algorithms is only slightly reduced.
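
    A minimal, hedged sketch of the pipeline both versions of this record describe (dimensionality reduction, then classification, validated by repeated random subsampling) is given below using scikit-learn on synthetic stand-in features. scikit-learn's decision tree stands in for C5.0, which has no standard Python implementation; the data, feature count, and split settings are invented.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import ShuffleSplit, cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))                               # 200 mock clusters, 10 features
y = (X[:, 0] + 0.5 * rng.normal(size=200) > 0).astype(int)   # merger vs. monolithic label

cv = ShuffleSplit(n_splits=20, test_size=0.3, random_state=0)  # repeated random subsampling
for name, clf in [("tree (C5.0 stand-in)", DecisionTreeClassifier()),
                  ("k-nearest neighbour", KNeighborsClassifier()),
                  ("support-vector machine", SVC())]:
    pipe = make_pipeline(StandardScaler(), PCA(n_components=3), clf)
    acc = cross_val_score(pipe, X, y, cv=cv).mean()
    print(f"{name}: misclassification rate = {1 - acc:.2f}")
```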

  6. Research Progress on Reconstruction of Paleofire History

    Institute of Scientific and Technical Information of China (English)

    占长林; 曹军骥; 韩永明; 安芷生

    2011-01-01

    Fire is an important component of the Earth system, tightly coupled with climate, vegetation, biogeochemical cycles and human activities. The influence of fire on global climate and ecosystems has become a hot topic in global change research. Fires leave many combustion products in the surrounding environment, such as black carbon, charcoal, polycyclic aromatic hydrocarbons and levoglucosan, which are widely preserved in oceans, lakes, rivers, soils and terrestrial eolian sediments; they also leave other traces, such as fire scars on trees and changes in soil magnetic parameters. These records reflect not only the history of paleofire activity but also the climatic conditions and vegetation patterns of past periods, and thus provide valuable information for reconstructing fire history. This paper summarizes the proxy indicators commonly used in paleofire reconstruction in China and abroad, together with their outstanding problems, and points out future directions for the field. Although different proxies have been applied successfully in previous studies, each record is affected by human or biological disturbance and differs in the temporal and spatial scales it reflects, so every indicator carries limitations for reconstructing fire history; these limitations hinder a correct understanding of the interactions between fire and human activity, climate change and vegetation.

  7. From subduction to collision: constraining the early history of the Taiwan Mountain Belt by plate tectonic reconstructions

    Science.gov (United States)

    von Hagke, Christoph; Philippon, Mélody; Avouac, Jean-Philippe

    2014-05-01

    Understanding the formation of the Taiwan orogen is important because it is an active example for testing geodynamic theories of mountain-building processes, such as the critical wedge model, or of subduction zone reversal. Nevertheless, large uncertainties exist regarding the pre-collisional architecture of the orogen, the timing of collision, and the peak metamorphic conditions of the Cenozoic orogeny. The goal of this contribution is to re-evaluate existing models in the light of recent geophysical datasets, and to constrain the evolution towards the present-day plate tectonic configuration with a comprehensive reconstruction of plate movements since the Late Cretaceous. To this end, we present a revised analysis of the plate tectonic framework of Southeast Asia since the Late Cretaceous, a time when subduction polarity was still opposite to what is observed at present (westward subduction of the Pacific Plate, as opposed to eastward subduction of Eurasia at present). This is independent of the subduction zone reversal thought to occur at present in the northern part of the Taiwan orogen. We place our reconstructions within a global plate tectonic frame, and discuss (1) the consequences of subduction zone reversal for the evolving passive margin, and (2) the influence of the opening of the (proto-)South China Sea on the pre-collisional architecture. This yields a new model for the collisional history of Taiwan, which reconciles the pre-collisional architecture with the metamorphic conditions of the Cenozoic orogeny, and makes predictions about the timing of peak pressures, as well as the timing of collision and the present subduction zone reversal.

  8. The reconstructive study in arcaheology: case histories in the communication issues

    Directory of Open Access Journals (Sweden)

    Francesco Gabellone

    2011-09-01

    Full Text Available The most significant results obtained by the Information Technologies Lab (IBAM CNR - ITLab) in the construction of VR-based knowledge platforms have been achieved in projects such as ByHeriNet, Archeotour, Interadria, Interreg Greece-Italy, Iraq Virtual Museum, etc. These projects were guided by the belief that, in order to be effective, the process of communicating Cultural Heritage to the wider public should be as free as possible from the sterile old VR interfaces of the 1990s. In operational terms, this translates into solutions that are as lifelike as possible and guarantee the maximum emotional involvement of the viewer, adopting the same techniques as are used in modern cinema. Communication thus becomes entertainment and a vehicle for high-quality content, aimed at the widest possible public and produced with the help of interdisciplinary tools and methods. In this context, high-end technologies are no longer the goal of research; rather they are the invisible engine of an unstoppable process that is making it harder and harder to distinguish between computer images and real objects. An emblematic case in this regard is the reconstructive study of ancient contexts, where three-dimensional graphics compensate for the limited expressive potential of two-dimensional drawings and allow for interpretative and representative solutions that were unimaginable a few years ago. The virtual space thus becomes an important opportunity for reflection and study, as well as constituting a revolutionary way to learn for the wider public.

  9. Reconstruction of multi-century flood histories from oxbow lake sediments, Peace-Athabasca Delta, Canada

    Science.gov (United States)

    Wolfe, Brent B.; Hall, Roland I.; Last, William M.; Edwards, Thomas W. D.; English, Michael C.; Karst-Riddoch, Tammy L.; Paterson, Andrew; Palmini, Roger

    2006-12-01

    Floods caused by ice-jams on the Peace River are considered to be important for maintaining hydro-ecological conditions of perched basins in the Peace-Athabasca Delta (PAD), Canada, a highly productive and internationally recognized northern boreal ecosystem. Concerns over the potential linkages between regulation of the Peace River in 1968 for hydroelectric production and low Peace River discharge between 1968 and 1971 during the filling of the hydroelectric reservoir, absence of a major ice-jam flood event between 1975 and 1995, and low water levels in perched basins during the 1980s and early 1990s have sparked numerous environmental studies largely aimed at restoring water levels in the PAD. Lack of sufficient long-term hydrological records, however, has limited the ability to objectively assess the importance of anthropogenic factors versus natural climatic forcing in regulating hydro-ecological conditions of the PAD. Here, we report results of a paleolimnological study on laminated sediments from two oxbow lakes in the PAD, which are located adjacent to major flood distributaries of the Peace River. Sediment core magnetic susceptibility measurements, supported by results from several other physical and geochemical analyses as well as stratigraphic correspondence with recorded high-water events on the Peace River, provide proxy records of flood history spanning the past 180 and 300 years in these two basins. Results indicate that inferred flood frequency has been highly variable over the past 300 years but in decline for many decades beginning as early as the late nineteenth century, well before Peace River regulation. Additionally, several multi-decadal intervals without a major flood have occurred during the past 300 years. While climate-related mechanisms responsible for this variability in flood frequency remain to be determined, as does quantifying the relative roles of river regulation and climate variability on hydro-ecological conditions in the PAD

  10. Late Quaternary vegetation, fire and climate history reconstructed from two cores at Cerro Toledo, Podocarpus National Park, southeastern Ecuadorian Andes

    Science.gov (United States)

    Brunschön, Corinna; Behling, Hermann

    2009-11-01

    The last ca. 20,000 yr of palaeoenvironmental conditions in Podocarpus National Park in the southeastern Ecuadorian Andes have been reconstructed from two pollen records from Cerro Toledo (04°22'28.6"S, 79°06'41.5"W) at 3150 m and 3110 m elevation. Páramo vegetation with high proportions of Plantago rigida characterised the last glacial maximum (LGM), reflecting cold and wet conditions. The upper forest line was at markedly lower elevations than present. After ca. 16,200 cal yr BP, páramo vegetation decreased slightly while mountain rainforest developed, suggesting rising temperatures. The trend of increasing temperatures and mountain rainforest expansion continued until ca. 8500 cal yr BP, while highest temperatures probably occurred from 9300 to 8500 cal yr BP. From ca. 8500 cal yr BP, páramo vegetation re-expanded with dominance of Poaceae, suggesting a change to cooler conditions. During the late Holocene after ca. 1800 cal yr BP, a decrease in páramo indicates a change to warmer conditions. Anthropogenic impact near the study site is indicated for times after 2300 cal yr BP. The regional environmental history indicates that through time the eastern Andean Cordillera in South Ecuador was influenced by eastern Amazonian climates rather than western Pacific climates.

  11. Probing the Expansion History of the Universe by Model-independent Reconstruction from Supernovae and Gamma-Ray Burst Measurements

    Science.gov (United States)

    Feng, Chao-Jun; Li, Xin-Zhou

    2016-04-01

    To probe the late evolution history of the universe, we adopt two kinds of optimal basis systems. One of them is constructed by performing principal component analysis, and the other is built by taking the multidimensional scaling approach. Cosmological observables such as the luminosity distance can be decomposed into these basis systems. These basis systems are optimized for different kinds of cosmological models that are based on different physical assumptions, and even for a mixture model of them. Therefore, the so-called feature space that is projected from the basis systems is cosmological-model independent, and it provides a parameterization for studying and reconstructing the Hubble expansion rate from the supernova luminosity distance and even gamma-ray burst (GRB) data with self-calibration. The circular problem that arises when using GRBs as cosmological candles is naturally eliminated in this procedure. Using the Levenberg–Marquardt technique and the Markov Chain Monte Carlo method, we place observational constraints on this parameterization. The data we used include the "joint light-curve analysis" data set of 740 Type Ia supernovae, together with 109 long GRBs calibrated via the well-known Amati relation.
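
    As a hedged illustration of the basis-building step described above, the sketch below performs principal component analysis over a grid of flat wCDM luminosity-distance curves and projects a mock data vector onto the leading modes. The cosmology grid, noise level, and number of retained modes are invented for the example; a real analysis would work with the full supernova/GRB likelihood rather than a noisy mock curve.

```python
import numpy as np

z = np.linspace(0.01, 2.0, 100)

def hubble(z, om, w):        # H(z)/H0 for a flat wCDM model
    return np.sqrt(om * (1 + z) ** 3 + (1 - om) * (1 + z) ** (3 * (1 + w)))

def dl(z, om, w):            # dimensionless luminosity distance (trapezoidal integral)
    integrand = 1.0 / hubble(z, om, w)
    dc = np.concatenate(([0.0],
                         np.cumsum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(z))))
    return (1 + z) * dc

# Training set: distance curves over a grid of (Omega_m, w) models.
grid = np.array([dl(z, om, w)
                 for om in np.linspace(0.2, 0.4, 8)
                 for w in np.linspace(-1.3, -0.7, 8)])
mean = grid.mean(axis=0)
_, _, vt = np.linalg.svd(grid - mean)   # rows of vt are the principal components
basis = vt[:3]                          # keep the first few optimal modes

mock = dl(z, 0.3, -1.0) + 0.01 * np.random.default_rng(1).normal(size=z.size)
coeff = basis @ (mock - mean)           # model-independent parameterization of the data
print(coeff)
```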

  12. Characterizing and Communicating Risk with Exposure Reconstruction and Bayesian Analysis: Historical Locomotive Maintenance/Repair Associated with Asbestos Woven Tape Pipe Lagging.

    Science.gov (United States)

    Boelter, Fred W; Persky, Jacob D; Podraza, Daniel M; Bullock, William H

    2016-02-01

    Our reconstructed historical work scenarios incorporating a vintage 1950s locomotive can assist in better understanding the historical asbestos exposures associated with past maintenance and repairs, and fill a literature data gap. Air sampling data collected during the exposure scenarios and analyzed by NIOSH 7400 (PCM) and 7402 (PCME) methodologies show that personal breathing zone asbestiform fiber exposures were below the current OSHA eight-hour TWA permissible exposure limit (PEL) of 0.1 f/cc (range …). The scenarios involved woven tape lagging that may have been chrysotile asbestos and was handled, removed, and reinstalled during repair and maintenance activities. We reconstructed historical work scenarios involving asbestos woven tape pipe lagging that have not been characterized in the published literature. The historical work scenarios were conducted by a retired railroad pipefitter with 37 years of experience working with these materials and locomotives. PMID:26255644

  13. Extracting information from previous full-dose CT scan for knowledge-based Bayesian reconstruction of current low-dose CT images

    OpenAIRE

    Zhang, Hao; Han, Hao; Liang, Zhengrong; Hu, Yifan; Liu, Yan; Moore, William; Ma, Jianhua; Lu, Hongbing

    2015-01-01

    The Markov random field (MRF) model has been widely employed as an edge-preserving regional noise smoothing penalty to reconstruct piecewise-smooth images in the presence of noise, such as in low-dose computed tomography (LdCT). While it preserves edge sharpness, its regional smoothing may sacrifice tissue image textures, which have been recognized as useful imaging biomarkers, and thus it may compromise clinical tasks such as differentiating malignant vs. benign lesions, e.g., lung nodules or colo...

  14. A new method to reconstruct hydrocarbon-generating histories of source rocks in a petroleum-bearing basin the method of geological and geochemical sections

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    By investigating typical Palaeozoic and Mesozoic petroleum-bearing basins in China and using thermal maturation theories of organic matter to improve the conventional Karweil method, a new method to reconstruct the hydrocarbon-generating histories of source rocks is suggested. This method, combining geological background with geochemical information, makes the calculated vitrinite reflectance (VRo) closer to the measured one. Moreover, it enables us to clarify the hydrocarbon-generation trend of source rocks through geological history. The method has the merits of simple calculation and objective presentation, and is especially suitable for basins whose sedimentation and tectonic movements are complicated.
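
    The record's improved Karweil calculation is not spelled out in the abstract; as a rough illustration of the same kind of time-temperature bookkeeping, the sketch below computes the classical Lopatin/Waples time-temperature index (TTI), a simpler relative of Karweil-style maturity calculations. The burial-history intervals and the continuous-temperature approximation are invented for the example.

```python
# Lopatin/Waples TTI: maturity doubles for every 10 degC temperature step,
# referenced to the 100-110 degC interval (approximated continuously here).
intervals = [        # (duration in Myr, mean temperature in degC) per burial step
    (50, 60),
    (40, 90),
    (30, 120),
]

tti = sum(dt * 2 ** ((temp - 105) / 10) for dt, temp in intervals)

# Waples (1980) associates TTI ~ 15 with the onset and ~ 160 with the end
# of the main oil window, which can then be mapped to vitrinite reflectance.
print(f"TTI = {tti:.1f}")
```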

  15. Reconstructing Land Use History from Landsat Time-Series. Case study of Swidden Agriculture Intensification in Brazil

    Science.gov (United States)

    Dutrieux, L.; Jakovac, C. C.; Siti, L. H.; Kooistra, L.

    2015-12-01

    We developed a method to reconstruct land use history from Landsat image time-series. The method uses a breakpoint detection framework derived from the econometrics field and applicable to time-series regression models. The BFAST framework is used for defining the time-series regression models, which may contain trend and phenology components, hence appropriately modelling vegetation intra- and inter-annual dynamics. All available Landsat data are used, and the time-series are partitioned into segments delimited by breakpoints. Segments can be associated with land use regimes, while the breakpoints then correspond to shifts in regimes. To further characterize these shifts, we classified the unlabelled breakpoints returned by the algorithm into their corresponding processes. We used a Random Forest classifier, trained on a set of visually interpreted time-series profiles, to infer the processes and assign labels to the breakpoints. The whole approach was applied to quantifying the number of cultivation cycles in a swidden agriculture system in Brazil. The number and frequency of cultivation cycles is of particular ecological relevance in these systems, since they largely affect the capacity of the forest to regenerate after abandonment. We applied the method to a Landsat time-series of the Normalized Difference Moisture Index (NDMI) spanning the 1984-2015 period and derived from it the number of cultivation cycles during that period at the individual field scale. Agricultural field boundaries used to apply the method were derived using a multi-temporal segmentation. We validated the number of cultivation cycles predicted against in-situ information collected from farmer interviews, resulting in a Normalized RMSE of 0.25. Overall the method performed well, producing maps with coherent patterns. We identified various sources of error in the approach, including low data availability in the 1990s and sub-object mixture of land uses. We conclude that the method holds great promise for

  16. Reconstructing land use history from Landsat time-series. Case study of a swidden agriculture system in Brazil

    Science.gov (United States)

    Dutrieux, Loïc P.; Jakovac, Catarina C.; Latifah, Siti H.; Kooistra, Lammert

    2016-05-01

    We developed a method to reconstruct land use history from Landsat image time-series. The method uses a breakpoint detection framework derived from the econometrics field and applicable to time-series regression models. The Breaks For Additive Season and Trend (BFAST) framework is used for defining the time-series regression models, which may contain trend and phenology components, hence appropriately modelling vegetation intra- and inter-annual dynamics. All available Landsat data are used for a selected study area, and the time-series are partitioned into segments delimited by breakpoints. Segments can be associated with land use regimes, while the breakpoints then correspond to shifts in land use regimes. In order to further characterize these shifts, we classified the unlabelled breakpoints returned by the algorithm into their corresponding processes. We used a Random Forest classifier, trained on a set of visually interpreted time-series profiles, to infer the processes and assign labels to the breakpoints. The whole approach was applied to quantifying the number of cultivation cycles in a swidden agriculture system in Brazil (state of Amazonas). The number and frequency of cultivation cycles is of particular ecological relevance in these systems, since they largely affect the capacity of the forest to regenerate after land abandonment. We applied the method to a Landsat time-series of the Normalized Difference Moisture Index (NDMI) spanning the 1984-2015 period and derived from it the number of cultivation cycles during that period at the individual field scale. Agricultural field boundaries used to apply the method were derived using a multi-temporal segmentation approach. We validated the number of cultivation cycles predicted by the method against in-situ information collected from farmer interviews, resulting in a Normalized Root Mean Squared Error (NRMSE) of 0.25. Overall the method performed well, producing maps with coherent spatial patterns. We identified
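
    As a hedged, minimal stand-in for the BFAST-style procedure described in both records above (BFAST itself is an R package; this is not the authors' implementation), the sketch below models a synthetic NDMI series as trend plus an annual harmonic and locates a single break by exhaustive two-segment least squares. Real BFAST handles multiple breaks with formal structural-change tests; the signal, noise, and break position here are invented.

```python
import numpy as np

rng = np.random.default_rng(2)
t = np.arange(0, 10, 1 / 23)                        # ~23 observations/yr for 10 years
ndmi = 0.5 + 0.1 * np.sin(2 * np.pi * t) + 0.02 * rng.normal(size=t.size)
ndmi[t > 6] -= 0.3                                  # abrupt drop: e.g. a cultivation event

def design(t):                                      # trend + annual harmonic regressors
    return np.column_stack([np.ones_like(t), t,
                            np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)])

def rss(t, y):                                      # residual sum of squares of one segment
    X = design(t)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    r = y - X @ beta
    return r @ r

# Two-segment search: the breakpoint is where the total RSS is minimized.
candidates = range(20, t.size - 20)
best = min(candidates, key=lambda i: rss(t[:i], ndmi[:i]) + rss(t[i:], ndmi[i:]))
print(f"breakpoint detected near t = {t[best]:.2f} yr")   # expected near 6
```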

  17. Bayesian programming

    CERN Document Server

    Bessiere, Pierre; Ahuactzin, Juan Manuel; Mekhnacha, Kamel

    2013-01-01

    Probability as an Alternative to Boolean Logic. While logic is the mathematical foundation of rational reasoning and the fundamental principle of computing, it is restricted to problems where information is both complete and certain. However, many real-world problems, from financial investments to email filtering, are incomplete or uncertain in nature. Probability theory and Bayesian computing together provide an alternative framework to deal with incomplete and uncertain data. Decision-Making Tools and Methods for Incomplete and Uncertain Data. Emphasizing probability as an alternative to Boolean

  18. Paleosol charcoal: Reconstructing vegetation history in relation to agro-pastoral activities since the Neolithic. A case study in the Eastern French Pyrenees.

    OpenAIRE

    Bal, Marie-Claude; Rendu, Christine; Ruas, Marie-Pierre; Campmajo, Pierre

    2010-01-01

    This article uses a method that combines pedoanthracological and pedo-archaeological approaches to terraces, complemented with archaeological pastoral data, in order to reconstruct the history of ancient agricultural terraces on a slope of the Enveitg Mountain in the French Pyrenees. Four excavations revealed two stages of terrace construction that have been linked with vegetation dynamics, which had been established by analyses of charcoal from the paleosols and soils...

  19. Neuromagnetic source reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Lewis, P.S.; Mosher, J.C. [Los Alamos National Lab., NM (United States); Leahy, R.M. [University of Southern California, Los Angeles, CA (United States)

    1994-12-31

    In neuromagnetic source reconstruction, a functional map of neural activity is constructed from noninvasive magnetoencephalographic (MEG) measurements. The overall reconstruction problem is under-determined, so some form of source modeling must be applied. We review the two main classes of reconstruction techniques: parametric current dipole models and nonparametric distributed source reconstructions. Current dipole reconstructions use a physically plausible source model, but are limited to cases in which the neural currents are expected to be highly sparse and localized. Distributed source reconstructions can be applied to a wider variety of cases, but must incorporate an implicit source model in order to arrive at a single reconstruction. We examine distributed source reconstruction in a Bayesian framework to highlight the implicit nonphysical Gaussian assumptions of minimum-norm based reconstruction algorithms. We conclude with a brief discussion of alternative non-Gaussian approaches.
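
    To make the Gaussian-prior point concrete, here is a minimal sketch of a minimum-norm (Tikhonov) distributed source estimate on a random, invented lead-field matrix: with a Gaussian source prior and Gaussian noise, the MAP estimate has the closed form below. It illustrates the smearing the record attributes to the nonphysical Gaussian assumption; it is not a full MEG pipeline.

```python
import numpy as np

rng = np.random.default_rng(3)
n_sensors, n_sources = 30, 500
G = rng.normal(size=(n_sensors, n_sources))        # lead-field (gain) matrix, invented

x_true = np.zeros(n_sources)
x_true[rng.choice(n_sources, 3)] = 1.0             # sparse "neural" currents
y = G @ x_true + 0.05 * rng.normal(size=n_sensors) # noisy MEG-like measurements

lam = 0.1                                          # noise-to-prior variance ratio
# Minimum-norm estimate = MAP under a zero-mean Gaussian source prior.
x_mne = G.T @ np.linalg.solve(G @ G.T + lam * np.eye(n_sensors), y)

# The Gaussian prior spreads energy over many sources, so the sparse truth is
# recovered only in a smeared sense: the nonphysical assumption the record notes.
print(np.round(np.sort(np.abs(x_mne))[-5:], 3))
```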

  20. Bayesian Compressed Sensing with Unknown Measurement Noise Level

    DEFF Research Database (Denmark)

    Hansen, Thomas Lundgaard; Jørgensen, Peter Bjørn; Pedersen, Niels Lovmand;

    2013-01-01

    In sparse Bayesian learning (SBL), approximate Bayesian inference is applied to find sparse estimates from observations corrupted by additive noise. The current literature only vaguely considers the case where the noise level is unknown a priori. We show that for most state-of-the-art reconstruction a...

  1. Introduction to Bayesian statistics

    CERN Document Server

    Bolstad, William M

    2016-01-01

    There is a strong upsurge in the use of Bayesian methods in applied statistical analysis, yet most introductory statistics texts only present frequentist methods. Bayesian statistics has many important advantages that students should learn about if they are going into fields where statistics will be used. In this Third Edition, four newly-added chapters address topics that reflect the rapid advances in the field of Bayesian statistics. The author continues to provide a Bayesian treatment of introductory statistical topics, such as scientific data gathering, discrete random variables, robust Bayesian methods, and Bayesian approaches to inference for discrete random variables, binomial proportion, Poisson, normal mean, and simple linear regression. In addition, newly-developing topics in the field are presented in four new chapters: Bayesian inference with unknown mean and variance; Bayesian inference for a Multivariate Normal mean vector; Bayesian inference for the Multiple Linear Regression Model; and Computati...
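
    As a one-screen example of the kind of introductory material such a text covers, the sketch below performs conjugate beta-binomial updating for a proportion; the prior and data are invented.

```python
from scipy import stats

a, b = 1, 1                  # uniform Beta(1, 1) prior on the unknown proportion
successes, trials = 7, 20    # observed data (illustrative)

# Conjugacy: Beta prior + binomial likelihood -> Beta posterior.
posterior = stats.beta(a + successes, b + trials - successes)
print(posterior.mean())             # posterior mean = 8/22, about 0.364
print(posterior.interval(0.95))     # central 95% credible interval
```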

  2. Bayesian large-scale structure inference and cosmic web analysis

    CERN Document Server

    Leclercq, Florent

    2015-01-01

    Surveys of the cosmic large-scale structure carry opportunities for building and testing cosmological theories about the origin and evolution of the Universe. This endeavor requires appropriate data assimilation tools, for establishing the contact between survey catalogs and models of structure formation. In this thesis, we present an innovative statistical approach for the ab initio simultaneous analysis of the formation history and morphology of the cosmic web: the BORG algorithm infers the primordial density fluctuations and produces physical reconstructions of the dark matter distribution that underlies observed galaxies, by assimilating the survey data into a cosmological structure formation model. The method, based on Bayesian probability theory, provides accurate means of uncertainty quantification. We demonstrate the application of BORG to the Sloan Digital Sky Survey data and describe the primordial and late-time large-scale structure in the observed volume. We show how the approach has led to the fi...

  3. Bayesian artificial intelligence

    CERN Document Server

    Korb, Kevin B

    2003-01-01

    As the power of Bayesian techniques has become more fully realized, the field of artificial intelligence has embraced Bayesian methodology and integrated it to the point where an introduction to Bayesian techniques is now a core course in many computer science programs. Unlike other books on the subject, Bayesian Artificial Intelligence keeps mathematical detail to a minimum and covers a broad range of topics. The authors integrate Bayesian network technology and Bayesian network learning, and apply both to knowledge engineering. They emphasize understanding and intuition but also provide the algorithms and technical background needed for applications. Software, exercises, and solutions are available on the authors' website.

  4. Bayesian artificial intelligence

    CERN Document Server

    Korb, Kevin B

    2010-01-01

    Updated and expanded, Bayesian Artificial Intelligence, Second Edition provides a practical and accessible introduction to the main concepts, foundation, and applications of Bayesian networks. It focuses on both the causal discovery of networks and Bayesian inference procedures. Adopting a causal interpretation of Bayesian networks, the authors discuss the use of Bayesian networks for causal modeling. They also draw on their own applied research to illustrate various applications of the technology. New to the Second Edition: a new chapter on Bayesian network classifiers and a new section on object-oriented...

  5. Dendro-chronological analysis of fossilised wood in order to reconstruct the post-ice-age history of glaciers; Dendrochronologische Auswertung fossiler Hoelzer zur Rekonstruktion der nacheiszeitlichen Gletschergeschichte

    Energy Technology Data Exchange (ETDEWEB)

    Holzhauser, H.

    2008-07-01

    Around the middle of the 19th century, alpine glaciers advanced to their last maximum extension within the Holocene (the last 11'600 years). Some of the glaciers, especially the Great Aletsch and the Gorner, penetrated deeply into wooded land and destroyed numerous trees. Not only were trees destroyed, but also valuable arable farmland, alpine farm buildings and dwelling houses. Since the last maximum extension in the 19th century, the retreat of the glaciers has accelerated, revealing within the glacier forefields the remains of trees once buried. Some of this fossil wood is found in the place where it grew (in situ). Often the wood dates back to a time before the last glacier advance; most of it is several thousand years old, because glacial advance and retreat periods occurred repeatedly within the Holocene. This paper shows the characteristics of fossil wood and how it can be analysed to reconstruct glacial history. It demonstrates how glacier length variation can be exactly reconstructed with the help of dendrochronology. Thanks to the very exact reconstruction of glacier length change during the advance periods of the 14th and 16th centuries, the velocities of both the Gorner and Great Aletsch glaciers can be estimated. They range between 7-8 and 20 m per year in the case of the Gorner glacier, and between 7-8 and 36 m per year in the case of the Great Aletsch glacier. (author)

  6. Bayesian data analysis

    CERN Document Server

    Gelman, Andrew; Stern, Hal S; Dunson, David B; Vehtari, Aki; Rubin, Donald B

    2013-01-01

    Fundamentals of Bayesian Inference: Probability and Inference; Single-Parameter Models; Introduction to Multiparameter Models; Asymptotics and Connections to Non-Bayesian Approaches; Hierarchical Models. Fundamentals of Bayesian Data Analysis: Model Checking; Evaluating, Comparing, and Expanding Models; Modeling Accounting for Data Collection; Decision Analysis. Advanced Computation: Introduction to Bayesian Computation; Basics of Markov Chain Simulation; Computationally Efficient Markov Chain Simulation; Modal and Distributional Approximations. Regression Models: Introduction to Regression Models; Hierarchical Linear

  7. Bayesian Mediation Analysis

    OpenAIRE

    Yuan, Ying; MacKinnon, David P.

    2009-01-01

    This article proposes Bayesian analysis of mediation effects. Compared to conventional frequentist mediation analysis, the Bayesian approach has several advantages. First, it allows researchers to incorporate prior information into the mediation analysis, thus potentially improving the efficiency of estimates. Second, under Bayesian mediation analysis, inference is straightforward and exact, which makes it appealing for studies with small samples. Third, the Bayesian approach is conceptually...
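
    A minimal sketch of the idea, under strong simplifying assumptions (flat priors, normal approximations to the regression posteriors, synthetic data), is to draw the two path coefficients from their posteriors and summarize the product a*b directly; the article itself would typically use full MCMC rather than this shortcut.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 100
x = rng.normal(size=n)
m = 0.5 * x + rng.normal(size=n)             # mediator model, true a = 0.5
y = 0.4 * m + 0.2 * x + rng.normal(size=n)   # outcome model, true b = 0.4

def posterior_draws(X, y, k, size=10000):
    """Draws for coefficient k under a flat prior (normal approximation)."""
    beta, res, *_ = np.linalg.lstsq(X, y, rcond=None)
    sigma2 = res[0] / (len(y) - X.shape[1])  # residual variance estimate
    cov = sigma2 * np.linalg.inv(X.T @ X)
    return rng.normal(beta[k], np.sqrt(cov[k, k]), size)

Xm = np.column_stack([np.ones(n), x])        # intercept + treatment
Xy = np.column_stack([np.ones(n), m, x])     # intercept + mediator + treatment
a = posterior_draws(Xm, m, k=1)              # treatment -> mediator path
b = posterior_draws(Xy, y, k=1)              # mediator -> outcome path

indirect = a * b                             # posterior of the indirect effect
print(indirect.mean(), np.percentile(indirect, [2.5, 97.5]))
```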

  8. Bayesian Games with Intentions

    OpenAIRE

    Bjorndahl, Adam; Halpern, Joseph Y.; Pass, Rafael

    2016-01-01

    We show that standard Bayesian games cannot represent the full spectrum of belief-dependent preferences. However, by introducing a fundamental distinction between intended and actual strategies, we remove this limitation. We define Bayesian games with intentions, generalizing both Bayesian games and psychological games, and prove that Nash equilibria in psychological games correspond to a special class of equilibria as defined in our setting.

  9. 3D Surface Reconstruction and Automatic Camera Calibration

    Science.gov (United States)

    Jalobeanu, Andre

    2004-01-01

    This view-graph presentation illustrates a Bayesian approach to 3D surface reconstruction and camera calibration. Existing methods, surface analysis and modeling, preliminary surface reconstruction results, and potential applications are addressed.

  10. Reconstructing the Phylogenetic History of Long-Term Effective Population Size and Life-History Traits Using Patterns of Amino Acid Replacement in Mitochondrial Genomes of Mammals and Birds

    Science.gov (United States)

    Nabholz, Benoit; Lartillot, Nicolas

    2013-01-01

    The nearly neutral theory, which proposes that most mutations are deleterious or close to neutral, predicts that the ratio of nonsynonymous over synonymous substitution rates (dN/dS), and potentially also the ratio of radical over conservative amino acid replacement rates (Kr/Kc), are negatively correlated with effective population size. Previous empirical tests, using life-history traits (LHT) such as body-size or generation-time as proxies for population size, have been consistent with these predictions. This suggests that large-scale phylogenetic reconstructions of dN/dS or Kr/Kc might reveal interesting macroevolutionary patterns in the variation in effective population size among lineages. In this work, we further develop an integrative probabilistic framework for phylogenetic covariance analysis introduced previously, so as to estimate the correlation patterns between dN/dS, Kr/Kc, and three LHT, in mitochondrial genomes of birds and mammals. Kr/Kc displays stronger and more stable correlations with LHT than does dN/dS, which we interpret as a greater robustness of Kr/Kc, compared with dN/dS, the latter being confounded by the high saturation of the synonymous substitution rate in mitochondrial genomes. The correlation of Kr/Kc with LHT was robust when controlling for the potentially confounding effects of nucleotide compositional variation between taxa. The positive correlation of the mitochondrial Kr/Kc with LHT is compatible with previous reports, and with a nearly neutral interpretation, although alternative explanations are also possible. The Kr/Kc model was finally used for reconstructing life-history evolution in birds and mammals. This analysis suggests a fairly large-bodied ancestor in both groups. In birds, life-history evolution seems to have occurred mainly through size reduction in Neoavian birds, whereas in placental mammals, body mass evolution shows disparate trends across subclades. Altogether, our work represents a further step toward a more

  11. Bayesian Graphical Models

    DEFF Research Database (Denmark)

    Jensen, Finn Verner; Nielsen, Thomas Dyhre

    2016-01-01

    Mathematically, a Bayesian graphical model is a compact representation of the joint probability distribution for a set of variables. The most frequently used type of Bayesian graphical model is the Bayesian network. The structural part of a Bayesian graphical model is a graph consisting of nodes and edges. The nodes represent variables, which may be either discrete or continuous. An edge between two nodes A and B indicates a direct influence between the state of A and the state of B, which in some domains can also be interpreted as a causal relation. The wide-spread use of Bayesian networks is largely due to the availability of efficient inference algorithms for answering probabilistic queries about the states of the variables in the network. Furthermore, to support the construction of Bayesian network models, learning algorithms are also available. We give an overview of the Bayesian network...
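
    For concreteness, here is a minimal discrete Bayesian network with invented probability tables and a query answered by brute-force enumeration; practical systems use the efficient inference algorithms the record mentions rather than enumeration.

```python
# Simple v-structure: Rain -> WetGrass <- Sprinkler, with invented tables.
P_rain = {True: 0.2, False: 0.8}
P_sprinkler = {True: 0.3, False: 0.7}
P_wet = {  # P(WetGrass=True | Rain, Sprinkler)
    (True, True): 0.99, (True, False): 0.9,
    (False, True): 0.8, (False, False): 0.05,
}

def joint(r, s, w):
    """Joint probability from the factorization the network encodes."""
    pw = P_wet[(r, s)]
    return P_rain[r] * P_sprinkler[s] * (pw if w else 1 - pw)

# Query P(Rain=True | WetGrass=True) by summing out Sprinkler.
num = sum(joint(True, s, True) for s in (True, False))
den = sum(joint(r, s, True) for r in (True, False) for s in (True, False))
print(num / den)   # about 0.46: wet grass raises the probability of rain
```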

  12. Niche partitioning in sympatric Gorilla and Pan from Cameroon: implications for life history strategies and for reconstructing the evolution of hominin life history.

    Directory of Open Access Journals (Sweden)

    Gabriele A Macho

    Full Text Available Factors influencing hominoid life histories are poorly understood, and little is known about how ecological conditions modulate the pace of their development. Yet our limited understanding of these interactions underpins life history interpretations in extinct hominins. Here we determined the synchronisation of dental mineralization/eruption with brain size in a 20th century museum collection of sympatric Gorilla gorilla and Pan troglodytes from Central Cameroon. Using δ13C and δ15N of individuals' hair, we assessed whether and how differences in diet and habitat use may have impacted ape development. The results show that, overall, gorilla hair δ13C and δ15N values are more variable than those of chimpanzees, and that gorillas are consistently lower in δ13C and δ15N compared to chimpanzees. Within a restricted, isotopically-constrained area, gorilla brain development appears delayed relative to dental mineralization/eruption [or dental development is accelerated relative to brains]: only about 87.8% of adult brain size is attained by the time the first permanent molars come into occlusion, whereas it is 92.3% in chimpanzees. Even when M1s are already in full functional occlusion, gorilla brains lag behind those of chimpanzees (91% versus 96.4%) relative to tooth development. Both bootstrap analyses and stable isotope results confirm that these results are unlikely to be due to sampling error. Rather, δ15N values imply that gorillas are not fully weaned (physiologically) until well after M1s are in full functional occlusion. In chimpanzees the transition from infant to adult feeding appears (a) more gradual and (b) earlier relative to somatic development. Taken together, the findings are consistent with life history theory, which predicts delayed development when non-density-dependent mortality is low, i.e. in closed habitats, and with the "risk aversion" hypothesis for frugivorous species as a means to avert starvation. Furthermore, the

  13. Investigating Efficiency of Time Domain Curve fitters Versus Filtering for Rectification of Displacement Histories Reconstructed from Acceleration Measurements

    DEFF Research Database (Denmark)

    Sichani, Mahdi Teimouri; Brincker, Rune

    2008-01-01

    Computing the displacements of a structure from its measured accelerations has been a major concern in some fields of engineering, such as earthquake engineering. In vibration engineering, too, displacements are occasionally preferred to acceleration histories, e.g. in the determination of forces applied ...
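
    A common minimal recipe for this task is to high-pass filter the acceleration to suppress the low-frequency drift that plain double integration amplifies, and then integrate twice; the record's point is precisely that such filtering competes with time-domain curve fitting as a rectification step. The sketch below uses an invented signal and filter settings.

```python
import numpy as np
from scipy.integrate import cumulative_trapezoid
from scipy.signal import butter, filtfilt

fs = 200.0                                   # sampling rate, Hz (assumed)
t = np.arange(0, 20, 1 / fs)
# Acceleration of a 1 cm, 1.5 Hz harmonic motion, plus sensor noise.
acc = -(2 * np.pi * 1.5) ** 2 * 0.01 * np.sin(2 * np.pi * 1.5 * t)
acc += 0.02 * np.random.default_rng(5).normal(size=t.size)

b, a = butter(4, 0.2 / (fs / 2), btype="highpass")  # 0.2 Hz cutoff (judgment call)

vel = cumulative_trapezoid(filtfilt(b, a, acc), t, initial=0)
disp = cumulative_trapezoid(filtfilt(b, a, vel), t, initial=0)
disp = filtfilt(b, a, disp)                  # final pass removes residual drift

# For a sinusoid, amplitude = std * sqrt(2); expect about 0.01 m.
print(f"reconstructed amplitude = {disp[len(disp)//2:].std() * np.sqrt(2):.4f} m")
```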

  14. Sclerochronology - a highly versatile tool for mariculture and reconstruction of life history traits of the queen conch, Strombus gigas (Gastropoda)

    Science.gov (United States)

    Radermacher, Pascal; Schöne, Bernd R.; Gischler, Eberhard; Oschmann, Wolfgang; Thébault, Julien; Fiebig, Jens

    2010-05-01

    The shell of the queen conch Strombus gigas provides a rapidly growing palaeoenvironmental proxy archive, allowing the detailed reconstruction of important life-history traits such as ontogeny, growth rate and growth seasonality. In this study, modern sclerochronological methods are used to cross-date the palaeotemperatures derived from the shell with local sea surface temperature (SST) records. The growth history of the shell suggests a bimodal seasonality in growth, with the growing season confined to the interval between April and November. In Glovers Reef, offshore Belize, the queen conch accreted shell carbonate at rates of up to 6 mm per day during the spring (April-June) and autumn (September-November). However, a reduced period of growth occurred during the mid-summer months (July-August). The shell growth patterns indicate a positive response to annual seasonality with regard to precipitation. It seems likely that when precipitation levels are high, food availability is increased as a result of nutrient input to the ecosystem associated with an increase in coastal runoff. Slow growth rates occur when precipitation, and as a consequence riverine runoff, is low. The SST, however, appears to influence growth only at a secondary level. Despite the bimodal growing season and the winter cessation in growth, the growth rates reconstructed here from two S. gigas shells are among the fastest yet reported for this species. The S. gigas specimens from Belize reached their final shell height (of 22.7 and 23.5 cm in distance between the apex and the siphonal notch) at the transition to adulthood in just 2 years. The extremely rapid growth observed in this species permits detailed, high-resolution reconstructions of life-history traits, where sub-daily resolutions can be achieved with ease. The potential for future studies has yet to be further explored. Queen conch sclerochronology provides an opportunity to recover extremely high-resolution palaeotemperature

  15. RECONSTRUCTING THE EARLY HISTORY OF THE UR Ⅲ STATE: SOME METHODOLOGICAL CONSIDERATIONS OF THE USE OF YEAR FORMULAE

    Institute of Scientific and Technical Information of China (English)

    Widell, Magnus (IHAC)

    2002-01-01

    Ⅰ. INTRODUCTION AND BACKGROUND. In the introduction of her recent book on Ur-Namma in the Sumerian literary tradition, E. Flückiger-Hawker offers a comprehensive account and summary of the previous scholarship on the complicated political history between the fall of the Akkadian empire and the beginning of the Ur Ⅲ period. From this account, it becomes clear that this scholarship primarily relies on various interpretations of

  16. Reconstruction of the Transmission History of RNA Virus Outbreaks Using Full Genome Sequences: Foot-and-Mouth Disease Virus in Bulgaria in 2011

    DEFF Research Database (Denmark)

    Valdazo-González, Begoña; Polihronova, Lilyana; Alexandrov, Tsviatko;

    2012-01-01

    Improvements to sequencing protocols and the development of computational phylogenetics have opened up opportunities to study the rapid evolution of RNA viruses in real time. In practical terms, these results can be combined with field data in order to reconstruct spatiotemporal scenarios that describe the origin and transmission history of the FMD outbreaks which occurred during 2011 in Burgas Province, Bulgaria, a country that had previously been FMD-free-without-vaccination since 1996. Nineteen full genome sequences (FGS) of FMD virus (FMDV) were generated and analysed, including eight representative ... as well as a number of antibody-positive wild boar on both sides of the border with Turkish Thrace. This study highlights how FGS analysis can be used as an effective on-the-spot tool to support and help direct epidemiological investigations of field outbreaks.

  17. Impact of Quaternary climatic changes and interspecific competition on the demographic history of a highly mobile generalist carnivore, the coyote

    OpenAIRE

    Koblmüller, S; Robert K Wayne; Leonard, Jennifer A.

    2012-01-01

    Recurrent cycles of climatic change during the Quaternary period have dramatically affected the population genetic structure of many species. We reconstruct the recent demographic history of the coyote (Canis latrans) through the use of Bayesian techniques to examine the effects of Late Quaternary climatic perturbations on the genetic structure of a highly mobile generalist species. Our analysis reveals a lack of phylogeographic structure throughout the range but past population size changes ...

  18. The importance of natural history and research collections to environmental reconstruction and remediation, and the establishment of shifting baselines

    Science.gov (United States)

    Roopnarine, P. D.; Anderson, L.; Roopnarine, D.; Gillikin, D. P.; Leal, J.

    2012-12-01

    The Earth's environments are changing more rapidly today than at almost any time in the Phanerozoic. These changes are driven by human activities, and include climate change, landscape alteration, fragmentation and destruction, environmental pollution, species overexploitation, and invasive species. The rapidity of the changes challenges our best efforts to document what is changing, how it has changed, and what has been lost. Central to these efforts, therefore, is the proper documentation, archiving and curation of past environments. Natural history and other research collections form the core of this documentation, and have proven vital to recent studies of environmental change. Those collections are, however, generally under-utilized and under-appreciated by the general research community. Also, their utility is hampered by insufficient availability of the data, and the very nature of what has been collected in the past. Past collections emphasized a typological approach, placing emphasis on individual specimens and diversity, whether geological or biological, while what is needed today is greater emphasis on archiving entire environments. The concept of shifting baselines establishes that even on historical time scales, the notion of what constitutes an unaltered environment is biased by a lack of documentation and understanding of environments in the recent past. Baselines are necessary, however, for the proper implementation of mitigating procedures, for environmental restoration or remediation, and for predicting the near-term future. Here we present results from a study of impacts of the Deepwater Horizon oil spill (DWH) on the American oyster Crassostrea virginica. Natural history collections of specimens from the Gulf and elsewhere have been crucial to this effort, and serve as an example of how important such collections are to current events. We are examining the effects of spill exposure on shell growth and tissue development, as well as the potential

  19. Reconstructing the sedimentation history of the Bengal Delta Plain by means of geochemical and stable isotopic data

    International Nuclear Information System (INIS)

    The purpose of this study is to examine the sedimentation history of the central floodplain area of the Bengal Delta Plain in West Bengal, India. Sediments from two boreholes were analyzed regarding lithology, geochemistry and the stable isotopic composition of embedded organic matter. Different lithofacies were distinguished that reflect frequent changes in the prevailing sedimentary depositional environment of the study area. The lowest facies comprises poorly sorted fluvial sediments composed of fine gravel to clay pointing at high transport energy and intense relocation processes. This facies is considered to belong to an early Holocene lowstand systems tract that followed the last glacial maximum. Fine to medium sands above it mark a gradual change towards a transgressive systems tract. Upwards increasing proportions of silt and the stable isotopic composition of embedded organic matter both indicate a gradual change from fluvial channel infill sediments towards more estuarine and marine influenced deposits. Youngest sediments are composed of clayey and silty overbank deposits of the Hooghly River that have formed a vast low-relief delta-floodplain. Close to the surface, small concretions of secondary Mn-oxides and Fe-(oxyhydr)oxides occur and mark the fluctuation range of the unsaturated zone. These concretions are accompanied by relatively high contents of trace elements such as Zn, Ni, Cu, and As. To sum up, the outcomes of this study provide new insights into the complex sedimentation history of the barely investigated central floodplain area of West Bengal

  20. Understanding Computational Bayesian Statistics

    CERN Document Server

    Bolstad, William M

    2011-01-01

    A hands-on introduction to computational statistics from a Bayesian point of view Providing a solid grounding in statistics while uniquely covering the topics from a Bayesian perspective, Understanding Computational Bayesian Statistics successfully guides readers through this new, cutting-edge approach. With its hands-on treatment of the topic, the book shows how samples can be drawn from the posterior distribution when the formula giving its shape is all that is known, and how Bayesian inferences can be based on these samples from the posterior. These ideas are illustrated on common statistic

  1. Bayesian statistics an introduction

    CERN Document Server

    Lee, Peter M

    2012-01-01

    Bayesian Statistics is the school of thought that combines prior beliefs with the likelihood of a hypothesis to arrive at posterior beliefs. The first edition of Peter Lee's book appeared in 1989, but the subject has moved ever onwards, with increasing emphasis on Monte Carlo based techniques. This new fourth edition looks at recent techniques such as variational methods, Bayesian importance sampling, approximate Bayesian computation and Reversible Jump Markov Chain Monte Carlo (RJMCMC), providing a concise account of the way in which the Bayesian approach to statistics develops as well...

  2. Micrometer-scale chemical and isotopic criteria (O and Si) on the origin and history of Precambrian cherts: Implications for paleo-temperature reconstructions

    Science.gov (United States)

    Marin-Carbonne, Johanna; Chaussidon, Marc; Robert, François

    2012-09-01

    Oxygen and silicon isotopes in cherts have been extensively used for the reconstruction of seawater temperature during the Precambrian. These reconstructions have been challenged because cherts can have various origins (hydrothermal, sedimentary, volcanic silicification) and their isotopic compositions might have been reset by metamorphic fluid circulation. Existing criteria used to assess the pristine sedimentary origin of a chert are based on petrography (criterion #1: chert is composed mostly of microquartz); on the bulk oxygen isotopic composition (criterion #2: bulk δ18O has to be close enough to the maximum δ18O value previously measured in other cherts of the same age); and on the presence of a large δ18O range at the micrometer scale (criterion #3: δ18O range of ˜10‰ at ˜2 μm). However, these criteria remain incomplete in determining precisely the origin and degree of preservation of ancient cherts. We report in situ Si and O isotope compositions and trace element concentrations in seven chert samples ranging from 1.88 to 3.5 Ga in age. Correlations between δ30Si and Al2O3 (and K2O, TiO2) reveal that microquartz is of three different origins, i.e. diagenetic, hydrothermal or silicification. Moreover, chert samples composed mostly of diagenetic microquartz show a large range of δ30Si at the micrometer scale (1.7-4.5‰), consistent with the large range of δ18O previously found in the Gunflint diagenetic cherts. We propose two further quantitative criteria to assess the origin, state of preservation and diagenetic history of cherts. Criterion #4 uses trace element concentrations coupled with δ30Si to ascribe the origin of cherts among three possible end-members (diagenetic, hydrothermal, and silicified). Criterion #5 is the presence of a large range of δ30Si in pure diagenetic microquartz. In the seven samples analyzed in this study, only one (from the Gunflint Iron formation at 1.88 Ga) passes all the criteria assessed here and can be used for

  3. An introduction to Gaussian Bayesian networks.

    Science.gov (United States)

    Grzegorczyk, Marco

    2010-01-01

    The extraction of regulatory networks and pathways from postgenomic data is important for drug discovery and development, as the extracted pathways reveal how genes or proteins regulate each other. Following up on the seminal paper of Friedman et al. (J Comput Biol 7:601-620, 2000), Bayesian networks have been widely applied as a popular tool to this end in systems biology research. Their popularity stems from the tractability of the marginal likelihood of the network structure, which is a consistent scoring scheme in the Bayesian context. This score is based on an integration over the entire parameter space, for which highly expensive computational procedures have to be applied when using more complex models based on differential equations; for example, see (Bioinformatics 24:833-839, 2008). This chapter gives an introduction to reverse engineering regulatory networks and pathways with Gaussian Bayesian networks, that is, Bayesian networks with the probabilistic BGe scoring metric [see (Geiger and Heckerman 235-243, 1995)]. In the BGe model, the data are assumed to stem from a Gaussian distribution and a normal-Wishart prior is assigned to the unknown parameters. Gaussian Bayesian network methodology for analysing static observational, static interventional, as well as dynamic (observational) time series data will be described in detail in this chapter. Finally, we apply these Bayesian network inference methods (1) to observational and interventional flow cytometry (protein) data from the well-known RAF pathway, to evaluate the global network reconstruction accuracy of Bayesian network inference, and (2) to dynamic gene expression time series data of nine circadian genes in Arabidopsis thaliana, to reverse engineer the unknown regulatory network topology for this domain. PMID:20824469
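
    As a small, hedged illustration of the Gaussian setting (not the BGe score itself, which requires the normal-Wishart prior machinery), the sketch below samples data from a linear-Gaussian chain X -> Y -> Z and verifies that the implied conditional independence shows up as a vanishing partial correlation; coefficients and sample size are invented.

```python
import numpy as np

rng = np.random.default_rng(6)
n = 5000
x = rng.normal(size=n)
y_ = 0.8 * x + rng.normal(size=n)    # X -> Y
z = 0.7 * y_ + rng.normal(size=n)    # Y -> Z
data = np.column_stack([x, y_, z])

corr = np.corrcoef(data, rowvar=False)
prec = np.linalg.inv(corr)           # precision matrix encodes partial correlations

# Partial correlation of X and Z given Y should be near zero for the chain.
pcor_xz = -prec[0, 2] / np.sqrt(prec[0, 0] * prec[2, 2])
print(f"corr(X,Z) = {corr[0, 2]:.2f}, pcor(X,Z|Y) = {pcor_xz:.2f}")
```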

  4. Space Shuttle RTOS Bayesian Network

    Science.gov (United States)

    Morris, A. Terry; Beling, Peter A.

    2001-01-01

    With shrinking budgets and the requirements to increase reliability and operational life of the existing orbiter fleet, NASA has proposed various upgrades for the Space Shuttle that are consistent with national space policy. The cockpit avionics upgrade (CAU), a high priority item, has been selected as the next major upgrade. The primary functions of cockpit avionics include flight control, guidance and navigation, communication, and orbiter landing support. Secondary functions include the provision of operational services for non-avionics systems, such as data handling for the payloads and caution and warning alerts to the crew. Recently, a process to select the optimal commercial-off-the-shelf (COTS) real-time operating system (RTOS) for the CAU was conducted by United Space Alliance (USA) Corporation, a joint venture between Boeing and Lockheed Martin, the prime contractor for space shuttle operations. In order to independently assess the RTOS selection, NASA has used the Bayesian network-based scoring methodology described in this paper. Our two-stage methodology addresses the issue of RTOS acceptability by incorporating functional, performance and non-functional software measures related to reliability, interoperability, certifiability, efficiency, correctness, business, legal, product history, cost and life cycle. The first stage of the methodology involves obtaining scores for the various measures using a Bayesian network. The Bayesian network incorporates the causal relationships between the various and often competing measures of interest while also assisting the inherently complex decision analysis process with its ability to reason under uncertainty. The structure and selection of prior probabilities for the network is extracted from experts in the field of real-time operating systems. Scores for the various measures are computed using Bayesian probability. In the second stage, multi-criteria trade-off analyses are performed between the scores

  5. Combining livestock production information in a process-based vegetation model to reconstruct the history of grassland management

    Science.gov (United States)

    Chang, Jinfeng; Ciais, Philippe; Herrero, Mario; Havlik, Petr; Campioli, Matteo; Zhang, Xianzhou; Bai, Yongfei; Viovy, Nicolas; Joiner, Joanna; Wang, Xuhui; Peng, Shushi; Yue, Chao; Piao, Shilong; Wang, Tao; Hauglustaine, Didier A.; Soussana, Jean-Francois; Peregon, Anna; Kosykh, Natalya; Mironycheva-Tokareva, Nina

    2016-06-01

    Grassland management type (grazed or mown) and intensity (intensive or extensive) play a crucial role in the greenhouse gas balance and surface energy budget of this biome, both at field scale and at large spatial scale. However, global gridded historical information on grassland management intensity is not available. Combining modelled grass-biomass productivity with statistics of the grass-biomass demand by livestock, we reconstruct gridded maps of grassland management intensity from 1901 to 2012. These maps include the minimum area of managed vs. maximum area of unmanaged grasslands and the fraction of mown vs. grazed area at a resolution of 0.5° by 0.5°. The grass-biomass demand is derived from a livestock dataset for 2000, extended to cover the period 1901-2012. The grass-biomass supply (i.e. forage grass from mown grassland and biomass grazed) is simulated by the process-based model ORCHIDEE-GM driven by historical climate change, rising CO2 concentration, and changes in nitrogen fertilization. The global area of managed grassland obtained in this study increases from 6.1 × 10^6 km^2 in 1901 to 12.3 × 10^6 km^2 in 2000, although the expansion pathway varies between different regions. ORCHIDEE-GM also simulated an increase in global mean productivity and herbage-use efficiency over managed grassland during the 20th century, indicating a general intensification of grassland management at global scale but with regional differences. The gridded grassland management intensity maps are model dependent because they depend on modelled productivity. Thus, specific attention was given to the evaluation of modelled productivity against a series of observations, from site-level net primary productivity (NPP) measurements to two global satellite products of gross primary productivity (GPP) (MODIS-GPP and SIF data). Generally, ORCHIDEE-GM captures the spatial pattern, seasonal cycle, and interannual variability of grassland productivity at global scale well and thus is

  6. Contribution of analytical nuclear techniques in the reconstruction of the Brazilian pre-history analysing archaeological ceramics of Tupiguarani tradition

    Energy Technology Data Exchange (ETDEWEB)

    Faria, Gleikam Lopes de Oliveira; Menezes, Maria Angela de B.C.; Silva, Maria Aparecida, E-mail: menezes@cdtn.br, E-mail: cida@cdtn.br [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil). Servico de Reator e Tecnicas Analiticas. Laboratorio de Ativacao Neutronica; Sabino, Claudia de V.S. [PUC-Minas, Belo Horizonte, MG (Brazil)

    2011-07-01

    Due to the high importance of material vestiges for the culture of a nation, the Brazilian Council for Environment determined that licenses to establish new enterprises are subject to a technical report concerning environmental impact, including archaeological sites affected by the enterprise. Therefore, answering the report related to the Program for Prospection and Rescue of the Archaeological Patrimony of the Areas impacted by the installation of the Second Line of Samarco Mining Pipeline, archaeological interventions were carried out along the coast of Espirito Santo. Tupi-Guarani Tradition vestiges were found there, the main evidence being the ceramics. Archaeology can fill the gap between ancient populations and modern society by elucidating the evidence found in archaeological sites. In this context, several ceramic fragments found in the archaeological sites - Hiuton and Bota-Fora - were analyzed by the neutron activation technique, k0-standardization method, at CDTN using the TRIGA MARK I IPR-R1 reactor, in order to characterize their elemental composition. The elements As, Ba, Br, Ce, Co, Cr, Cs, Eu, Fe, Ga, Hf, K, La, Na, Nd, Rb, Sb, Sc, Sm, Ta, Tb, Th, U, Yb, Zn and Zr were determined. A robust multivariate statistical analysis performed with the R software pointed out that the pottery from the sites was made with clay from different sources. X-ray powder diffraction analyses were carried out to determine the mineral composition, and Mössbauer spectroscopy was applied to provide information on the degree of burning and the firing atmosphere, in order to reconstruct the firing temperatures and strategies the Indians used in pottery production. (author)

  7. Estimating genealogies from linked marker data: a Bayesian approach

    Directory of Open Access Journals (Sweden)

    Sillanpää Mikko J

    2007-10-01

    Background: Answers to several fundamental questions in statistical genetics would ideally require knowledge of the ancestral pedigree and of the gene flow therein. A few examples of such questions are haplotype estimation, relatedness and relationship estimation, gene mapping by combining pedigree and linkage disequilibrium information, and estimation of population structure. Results: We present a probabilistic method for genealogy reconstruction. Starting with a group of genotyped individuals from some population isolate, we explore the state space of their possible ancestral histories under our Bayesian model by using Markov chain Monte Carlo (MCMC) sampling techniques. The main contribution of our work is the development of sampling algorithms in the resulting vast state space with highly dependent variables. The main drawback is the computational complexity that limits the time horizon within which explicit reconstructions can be carried out in practice. Conclusion: The estimates for IBD (identity-by-descent) and haplotype distributions are tested in several settings using simulated data. The results appear to be promising for further development of the method.

  8. Reconstructing southern Greenland Ice Sheet history during the intensification of Northern Hemisphere glaciation: Insights from IODP Site U1307

    Science.gov (United States)

    Blake-Mizen, Keziah; Bailey, Ian; Carlson, Anders; Stoner, Joe; Hatfield, Rob; Xuan, Chuang; Lawrence, Kira

    2016-04-01

    Should it ever melt entirely, the Greenland Ice Sheet (GIS) would contribute ~7 metres of global sea-level rise. Understanding how the GIS might respond to anthropogenically induced global warming over the coming century is therefore important. Central to this goal is constraining how this ice sheet has responded to radiative forcing during both warmer- and colder-than-present climate states in the geological past. Little is known in detail, however, about the GIS prior to the Late Pleistocene, and large uncertainty exists in our understanding of its history across the last great climate transition of the Cenozoic, the intensification of Northern Hemisphere glaciation (iNHG; ~3.6-2.4 Ma). This time encompasses two intervals of interest: (1) the mid-Piacenzian warm period (mPWP, ~3.3-3 Ma), widely considered an analogue for a future equilibrium climate state, when atmospheric CO2 levels were comparable to modern (~400 ppmv) and sea-level and global temperatures were elevated relative to today (by ~25 metres and ~2-3°C), and (2) a subsequent gradual deterioration in global climate and decline in atmospheric CO2 that led to the development of Quaternary-magnitude glaciations from ~2.5 Ma. Important unresolved questions include: to what extent did the southern GIS retreat during the mPWP, and when did a modern-day sized GIS first develop during iNHG? To tackle these issues our project focuses on the southern GIS history that can be extracted from Eirik Drift IODP Site U1307 between ~3.3 and 2.2 Ma. To achieve this we are developing an independent orbital-resolution age model, one of the first for high-latitude marine sediments deposited during iNHG, by producing a relative paleointensity (RPI) record for Site U1307; and generating multi-proxy geochemical and sedimentological datasets that track the provenance of the sand and bulk terrigenous sediment fraction glacially eroded by the southern GIS and delivered to the study site by both ice-rafting and the Western

  9. On Fuzzy Bayesian Inference

    OpenAIRE

    Frühwirth-Schnatter, Sylvia

    1990-01-01

    In the paper at hand we apply fuzzy set theory to Bayesian statistics to obtain "Fuzzy Bayesian Inference". In the subsequent sections we will discuss a fuzzy valued likelihood function, Bayes' theorem for both fuzzy data and fuzzy priors, a fuzzy Bayes' estimator, fuzzy predictive densities and distributions, and fuzzy H.P.D. regions. (author's abstract)

  10. Bayesian Mediation Analysis

    Science.gov (United States)

    Yuan, Ying; MacKinnon, David P.

    2009-01-01

    In this article, we propose Bayesian analysis of mediation effects. Compared with conventional frequentist mediation analysis, the Bayesian approach has several advantages. First, it allows researchers to incorporate prior information into the mediation analysis, thus potentially improving the efficiency of estimates. Second, under the Bayesian…

  11. Reconstructing the evolutionary history of gypsy retrotransposons in the Périgord black truffle (Tuber melanosporum Vittad.).

    Science.gov (United States)

    Payen, Thibaut; Murat, Claude; Martin, Francis

    2016-08-01

    Truffles are ascomycete fungi belonging to the genus Tuber, and they form ectomycorrhizal associations with trees and shrubs. Transposable elements constitute more than 50 % of the black Périgord truffle (Tuber melanosporum) genome; these are mainly class 1 gypsy retrotransposons, but their impact on the genome is unknown. The aims of this study are to investigate the diversity of gypsy retrotransposons in this species and their evolutionary history by analysing the reference genome and six resequenced genomes of different geographic accessions. Using the reverse transcriptase sequences, six different gypsy retrotransposon clades were identified. Tmt1 and Tmt6 are the most abundant transposable elements, representing 14 and 13 % of the T. melanosporum genome, respectively. Tmt6 showed a major burst of proliferation between 1 and 4 million years ago, but evidence of more recent transposition was observed. Except for Tmt2, the other clades tend to aggregate, and their mode of transposition excludes the master copy model. This suggests that each new copy has the same probability of transposing as other copies. This study provides a better view of the diversity and dynamic nature of gypsy retrotransposons in T. melanosporum. Even if the major gypsy retrotransposon bursts are old, some elements seem to have transposed recently, suggesting that they may continue to shape the truffle genome. PMID:27025914

  12. Galaxy formation in the Planck cosmology III: star-formation histories and post-processing magnitude reconstruction

    CERN Document Server

    Shamshiri, Sorour; Henriques, Bruno M; Tojeiro, Rita; Lemson, Gerard; Oliver, Seb J; Wilkins, Stephen

    2015-01-01

    We adapt the L-Galaxies semi-analytic model to follow the star-formation histories (SFH) of galaxies -- by which we mean a record of the formation time and metallicities of the stars that are present in each galaxy at a given time. We use these to construct stellar spectra in post-processing, which offers large efficiency savings and allows user-defined spectral bands and dust models to be applied to data stored in the Millennium data repository. We contrast model SFHs from the Millennium Simulation with observed ones from the VESPA algorithm as applied to the SDSS-7 catalogue. The overall agreement is good, with both simulated and SDSS galaxies showing a steeper SFH with increased stellar mass. The SFHs of blue and red galaxies, however, show poor agreement between data and simulations, which may indicate that the termination of star formation is too abrupt in the models. The mean star-formation rate (SFR) of model galaxies is well-defined and is accurately modelled by a double power law at all redshifts: SF...

  13. Bayesian Probability Theory

    Science.gov (United States)

    von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo

    2014-06-01

    Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramer-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.

  14. Long-term hydrological dynamics and fire history over the last 2000 years in CE Europe reconstructed from a high-resolution peat archive

    Science.gov (United States)

    Marcisz, Katarzyna; Tinner, Willy; Colombaroli, Daniele; Kołaczek, Piotr; Słowiński, Michał; Fiałkiewicz-Kozieł, Barbara; Łokas, Edyta; Lamentowicz, Mariusz

    2015-03-01

    Sphagnum peatlands in the oceanic-continental transition zone of Poland are currently influenced by climatic and anthropogenic factors that lead to peat desiccation and susceptibility to fire. Little is known about the response of Sphagnum peatland testate amoebae (TA) to the combined effects of drought and fire. To understand the relationships between hydrology and fire dynamics, we used high-resolution multi-proxy palaeoecological data to reconstruct 2000 years of mire history in northern Poland. We employed a new approach for Polish peatlands - joint TA-based water table depth and charcoal-inferred fire activity reconstructions. In addition, the response of the most abundant TA hydrological indicators to charcoal-inferred fire activity was assessed. The results show four hydrological stages of peatland development: moderately wet (from ˜35 BC to 800 AD), wet (from ˜800 to 1390 AD), dry (from ˜1390 to 1700 AD) and with an unstable water table (from ˜1700 to 2012 AD). Fire activity increased in the last millennium after constant human presence in the mire surroundings. Higher fire activity caused a rise in the water table, but an abrupt drought later appeared at the onset of the Little Ice Age. This dry phase is characterized by high ash contents and high charcoal-inferred fire activity. Fires preceded hydrological change, and the response of TA to fire was indirect. Peatland drying and hydrological instability were connected with TA community changes from wet (dominance of Archerella flavum, Hyalosphenia papilio, Amphitrema wrightianum) to dry (dominance of Cryptodifflugia oviformis, Euglypha rotunda); however, no clear fire indicator species was found. Anthropogenic activities can increase peat fires and cause substantial hydrology changes. Our data suggest that increased human fire activity was one of the main factors that influenced peatland hydrology, though the mire response through hydrological changes towards drier conditions was delayed in relation to

  15. New directions in hydro-climatic histories: observational data recovery, proxy records and the atmospheric circulation reconstructions over the earth (ACRE) initiative in Southeast Asia

    Science.gov (United States)

    Williamson, Fiona; Allan, Rob; Switzer, Adam D.; Chan, Johnny C. L.; Wasson, Robert James; D'Arrigo, Rosanne; Gartner, Richard

    2015-12-01

    The value of historic observational weather data for reconstructing long-term climate patterns and the detailed analysis of extreme weather events has long been recognized (Le Roy Ladurie, 1972; Lamb, 1977). In some regions, however, observational data have not been kept regularly over time, or their preservation and archiving has not been considered a priority by governmental agencies. This has been a particular problem in Southeast Asia, where there has been no systematic country-by-country method of keeping or preserving such data, where the keeping of data only reaches back a few decades, or where instability has threatened the survival of historic records. As a result, past observational data are fragmentary, scattered, or even absent altogether. The further we go back in time, the more obvious the gaps. Observational data can be complemented, however, by historical documentary or proxy records of extreme events such as floods, droughts and other climatic anomalies. This review article highlights recent initiatives in sourcing, recovering, and preserving historical weather data and the potential for integrating the same with proxy (and other) records. In so doing, it focuses on regional initiatives for data research and recovery - particularly the work of the international Atmospheric Circulation Reconstructions over the Earth's (ACRE) Southeast Asian regional arm (ACRE SEA) - and the latter's role in bringing together disparate, but interrelated, projects working within this region. The overarching goal of the ACRE SEA initiative is to connect regional efforts and to build capacity within Southeast Asian institutions, agencies and National Meteorological and Hydrological Services (NMHS) to improve and extend historical instrumental, documentary and proxy databases of Southeast Asian hydroclimate, in order to contribute to the generation of high-quality, high-resolution historical hydroclimatic reconstructions (reanalyses) and to build linkages with humanities researchers

  16. Reconstruction of pollution history of organic contaminants in the upper Gulf of Thailand by using sediment cores: First report from Tropical Asia Core (TACO) project

    International Nuclear Information System (INIS)

    This paper reports the first reconstruction of a pollution history in tropical Asia from sediment cores. Four sediment core samples were collected from an offshore transect in the upper Gulf of Thailand and were analyzed for organic micropollutants. The cores were dated by measurement of 137Cs and geochronometric molecular markers (linear alkylbenzenes, LABs; and tetrapropylene-type alkylbenzenes, TABs). Polychlorinated biphenyl (PCB) concentrations showed a subsurface maximum in layers corresponding to the 1970s, indicating the effectiveness of regulation of PCBs in Thailand. LAB concentrations increased over time, indicating the increase in input of sewage into the Gulf during the last 30 years. Hopanes, biomarkers of petroleum pollution, also increased over time, indicating that the inputs of automobile-derived hydrocarbons to the coastal zone has been increasing owing to the increased number of cars in Thailand since the 1950s. Polycyclic aromatic hydrocarbons (PAHs) increased in the layers corresponding to the 1950s and 1960s, probably because of the increased inputs of automobile-derived PAHs. PAH concentrations in the upper layers corresponding to the 1970s and later remained constant or increased. The absence of a subsurface maximum of PAHs contrasts with results observed in industrialized countries. This can be explained by the facts that the Thai economy did not depend on coal as an energy source in the 1960s and that economic growth has continued since the 1970s to the present. The deposition flux of PAHs and hopanes showed a dramatic offshore decrease, whereas that of LABs was uniform

  17. A Sparse Bayesian Estimation Framework for Conditioning Prior Geologic Models to Nonlinear Flow Measurements

    CERN Document Server

    Li, Lianlin

    2009-01-01

    We present a Bayesian framework for reconstruction of subsurface hydraulic properties from nonlinear dynamic flow data by imposing sparsity on the distribution of the solution coefficients in a compression transform domain.
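
    The framework above imposes sparsity on solution coefficients in a transform domain. As a hedged, linearized stand-in for the nonlinear flow inversion, the sketch below computes the MAP estimate under a Laplace (sparsity-promoting) prior, which reduces to l1-regularized least squares solved by iterative soft thresholding; the operator A and all sizes are made up for illustration.

      # MAP estimation with a Laplace prior on coefficients x and Gaussian
      # noise: l1-regularized least squares, solved by ISTA. The matrix A
      # stands in for a linearized flow forward operator.
      import numpy as np

      rng = np.random.default_rng(1)
      m, n, k = 40, 100, 5
      A = rng.normal(size=(m, n)) / np.sqrt(m)
      x_true = np.zeros(n)
      x_true[rng.choice(n, k, replace=False)] = rng.normal(size=k)
      y = A @ x_true + 0.01 * rng.normal(size=m)

      lam = 0.05
      L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
      x = np.zeros(n)
      for _ in range(500):
          g = A.T @ (A @ x - y)              # gradient of the data misfit
          z = x - g / L
          x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold

      print("recovered support:", np.nonzero(np.abs(x) > 1e-3)[0])
      print("true support:     ", np.sort(np.nonzero(x_true)[0]))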

  18. Reconstructing the lake-level history of former glacial lakes through the study of relict wave-cut terraces: the case of Lake Ojibway (eastern Canada)

    Science.gov (United States)

    Roy, Martin; Veillette, Jean; Daubois, Virginie

    2014-05-01

    The reconstruction of the history of former glacial lakes is commonly based on the study of strandlines that generally consist of boulder ridges, sandy beaches and other near-shore deposits. This approach, however, is limited in some regions where the surficial geology consists of thick accumulations of fine-grained glaciolacustrine sediments that mask most deglacial landforms. This situation is particularly relevant to the study of Lake Ojibway, a large proglacial lake that developed in northern Ontario and Quebec following the retreat of the southern Laurentide ice sheet margin during the last deglaciation. The history of Ojibway lake levels remains poorly known, mainly due to the fact that this lake occupied a deep and featureless basin that favored the sedimentation of thick sequences of rhythmites and prevented the formation of well-developed strandlines. Nonetheless, detailed mapping revealed a complex sequence of discontinuous small-scale cliffs that are scattered over the flat-lying Ojibway clay plain. These terrace-like features range from 4 to 7 m in height and can be followed for tens to hundreds of meters. These small-scale geomorphic features are interpreted to represent raised shorelines that were cut into glaciolacustrine sediments by lakeshore erosional processes (i.e., wave action). These so-called wave-cut scarps (WCS) occur at elevations ranging from 3 to 30 m above the present level of Lake Abitibi (267 m), one of the lowest landmarks in the area. Here we evaluate the feasibility of using this type of relict shoreline to constrain the evolution of Ojibway lake levels. For this purpose, a series of WCS were measured along four transects of about 40 km in length in the Lake Abitibi region. The absolute elevation of 154 WCS was determined with a Digital Video Plotter software package using 1:15K air-photos, coupled with precise measurements of control points, which were measured with a high-precision Global Navigation Satellite System tied up to

  19. Practical Bayesian Tomography

    CERN Document Server

    Granade, Christopher; Cory, D G

    2015-01-01

    In recent years, Bayesian methods have been proposed as a solution to a wide range of issues in quantum state and process tomography. State-of-the-art Bayesian tomography solutions suffer from three problems: numerical intractability, a lack of informative prior distributions, and an inability to track time-dependent processes. Here, we solve all three problems. First, we use modern statistical methods, as pioneered by Huszár and Houlsby and by Ferrie, to make Bayesian tomography numerically tractable. Our approach allows for practical computation of Bayesian point and region estimators for quantum states and channels. Second, we propose the first informative priors on quantum states and channels. Finally, we develop a method that allows online tracking of time-dependent states and estimates the drift and diffusion processes affecting a state. We provide source code and animated visual examples for our methods.

  20. Noncausal Bayesian Vector Autoregression

    DEFF Research Database (Denmark)

    Lanne, Markku; Luoto, Jani

    We propose a Bayesian inferential procedure for the noncausal vector autoregressive (VAR) model that is capable of capturing nonlinearities and incorporating effects of missing variables. In particular, we devise a fast and reliable posterior simulator that yields the predictive distribution...

  1. Bayesian Lensing Shear Measurement

    CERN Document Server

    Bernstein, Gary M

    2013-01-01

    We derive an estimator of weak gravitational lensing shear from background galaxy images that avoids noise-induced biases through a rigorous Bayesian treatment of the measurement. The Bayesian formalism requires a prior describing the (noiseless) distribution of the target galaxy population over some parameter space; this prior can be constructed from low-noise images of a subsample of the target population, attainable from long integrations of a fraction of the survey field. We find two ways to combine this exact treatment of noise with rigorous treatment of the effects of the instrumental point-spread function and sampling. The Bayesian model fitting (BMF) method assigns a likelihood of the pixel data to galaxy models (e.g. Sersic ellipses), and requires the unlensed distribution of galaxies over the model parameters as a prior. The Bayesian Fourier domain (BFD) method compresses galaxies to a small set of weighted moments calculated after PSF correction in Fourier space. It requires the unlensed distributi...

  2. Malicious Bayesian Congestion Games

    CERN Document Server

    Gairing, Martin

    2008-01-01

    In this paper, we introduce malicious Bayesian congestion games as an extension to congestion games where players might act in a malicious way. In such a game each player has two types. Either the player is a rational player seeking to minimize her own delay, or - with a certain probability - the player is malicious in which case her only goal is to disturb the other players as much as possible. We show that such games do in general not possess a Bayesian Nash equilibrium in pure strategies (i.e. a pure Bayesian Nash equilibrium). Moreover, given a game, we show that it is NP-complete to decide whether it admits a pure Bayesian Nash equilibrium. This result even holds when resource latency functions are linear, each player is malicious with the same probability, and all strategy sets consist of singleton sets. For a slightly more restricted class of malicious Bayesian congestion games, we provide easy checkable properties that are necessary and sufficient for the existence of a pure Bayesian Nash equilibrium....

  3. Bayesian molecular phylogenetics: estimation of divergence dates and hypothesis testing

    OpenAIRE

    Aris-Brosou, S.

    2002-01-01

    With the advent of automated sequencing, sequence data are now available to help us understand the functioning of our genome, as well as its history. To date, powerful methods such as maximum likelihood have been used to estimate its mode and tempo of evolution and its branching pattern. However, these methods appear to have some limitations. The purpose of this thesis is to examine these issues in light of Bayesian modelling, taking advantage of some recent advances in Bayesian compu...

  4. SOFOMORE: Combined EEG source and forward model reconstruction

    DEFF Research Database (Denmark)

    Stahlhut, Carsten; Mørup, Morten; Winther, Ole;

    2009-01-01

    We propose a new EEG source localization method that simultaneously performs source and forward model reconstruction (SOFOMORE) in a hierarchical Bayesian framework. Reconstruction of the forward model is motivated by the many uncertainties involved in the forward model, including the representation of the cortical surface, conductivity distribution, and electrode positions. We demonstrate in both simulated and real EEG data that reconstruction of the forward model improves localization of the underlying sources.

  5. Reconstructing Native American Population History

    Science.gov (United States)

    Reich, David; Patterson, Nick; Campbell, Desmond; Tandon, Arti; Mazieres, Stéphane; Ray, Nicolas; Parra, Maria V.; Rojas, Winston; Duque, Constanza; Mesa, Natalia; García, Luis F.; Triana, Omar; Blair, Silvia; Maestre, Amanda; Dib, Juan C.; Bravi, Claudio M.; Bailliet, Graciela; Corach, Daniel; Hünemeier, Tábita; Bortolini, Maria-Cátira; Salzano, Francisco M.; Petzl-Erler, María Luiza; Acuña-Alonzo, Victor; Aguilar-Salinas, Carlos; Canizales-Quinteros, Samuel; Tusié-Luna, Teresa; Riba, Laura; Rodríguez-Cruz, Maricela; Lopez-Alarcón, Mardia; Coral-Vazquez, Ramón; Canto-Cetina, Thelma; Silva-Zolezzi, Irma; Fernandez-Lopez, Juan Carlos; Contreras, Alejandra V.; Jimenez-Sanchez, Gerardo; Gómez-Vázquez, María José; Molina, Julio; Carracedo, Ángel; Salas, Antonio; Gallo, Carla; Poletti, Giovanni; Witonsky, David B.; Alkorta-Aranburu, Gorka; Sukernik, Rem I.; Osipova, Ludmila; Fedorova, Sardana; Vasquez, René; Villena, Mercedes; Moreau, Claudia; Barrantes, Ramiro; Pauls, David; Excoffier, Laurent; Bedoya, Gabriel; Rothhammer, Francisco; Dugoujon, Jean Michel; Larrouy, Georges; Klitz, William; Labuda, Damian; Kidd, Judith; Kidd, Kenneth; Rienzo, Anna Di; Freimer, Nelson B.; Price, Alkes L.; Ruiz-Linares, Andrés

    2013-01-01

    The peopling of the Americas has been the subject of extensive genetic, archaeological and linguistic research; however, central questions remain unresolved. One contentious issue is whether the settlement occurred via a single or multiple streams of migration from Siberia. The pattern of dispersals within the Americas is also poorly understood. To address these questions at higher resolution than was previously possible, we assembled data from 52 Native American and 17 Siberian groups genotyped at 364,470 single nucleotide polymorphisms. We show that Native Americans descend from at least three streams of Asian gene flow. Most descend entirely from a single ancestral population that we call “First American”. However, speakers of Eskimo-Aleut languages from the Arctic inherit almost half their ancestry from a second stream of Asian gene flow, and the Na-Dene-speaking Chipewyan from Canada inherit roughly one-tenth of their ancestry from a third stream. We show that the initial peopling followed a southward expansion facilitated by the coast, with sequential population splits and little gene flow after divergence, especially in South America. A major exception is in Chibchan-speakers on both sides of the Panama Isthmus, who have ancestry from both North and South America. PMID:22801491

  6. Reconstruction of pollution history of organic contaminants in the upper Gulf of Thailand by using sediment cores: First report from Tropical Asia Core (TACO) project

    Energy Technology Data Exchange (ETDEWEB)

    Boonyatumanond, Ruchaya [Environmental Research and Training Center, Pathumthani 12120 (Thailand); Wattayakorn, Gullaya [Department of Marine Science, Faculty of Science, Chulalongkorn University, Bangkok 10330 (Thailand); Amano, Atsuko [Graduate School of Science and Engineering, Ehime University, 2-5 Bunkyou-cho, Matsuyama (Japan); Inouchi, Yoshio [Center for Marine Environmental Studies, Ehime University, 2-5 Bunkyou-cho, Matsuyama (Japan); Takada, Hideshige [Laboratory of Organic Geochemistry, Tokyo University of Agriculture and Technology, Fuchu, Tokyo 183-8509 (Japan)]. E-mail: shige@cc.tuat.ac.jp

    2007-05-15

    This paper reports the first reconstruction of a pollution history in tropical Asia from sediment cores. Four sediment core samples were collected from an offshore transect in the upper Gulf of Thailand and were analyzed for organic micropollutants. The cores were dated by measurement of 137Cs and geochronometric molecular markers (linear alkylbenzenes, LABs; and tetrapropylene-type alkylbenzenes, TABs). Polychlorinated biphenyl (PCB) concentrations showed a subsurface maximum in layers corresponding to the 1970s, indicating the effectiveness of regulation of PCBs in Thailand. LAB concentrations increased over time, indicating the increase in input of sewage into the Gulf during the last 30 years. Hopanes, biomarkers of petroleum pollution, also increased over time, indicating that the inputs of automobile-derived hydrocarbons to the coastal zone has been increasing owing to the increased number of cars in Thailand since the 1950s. Polycyclic aromatic hydrocarbons (PAHs) increased in the layers corresponding to the 1950s and 1960s, probably because of the increased inputs of automobile-derived PAHs. PAH concentrations in the upper layers corresponding to the 1970s and later remained constant or increased. The absence of a subsurface maximum of PAHs contrasts with results observed in industrialized countries. This can be explained by the facts that the Thai economy did not depend on coal as an energy source in the 1960s and that economic growth has continued since the 1970s to the present. The deposition flux of PAHs and hopanes showed a dramatic offshore decrease, whereas that of LABs was uniform.

  7. Dynamic Bayesian Network Model Based Golf Swing 3D Reconstruction Using Simple Depth Imaging Device

    Institute of Scientific and Technical Information of China (English)

    2015-01-01

    Motion capture systems based on simple depth imaging devices attract increasing attention because they are cheaper and easier to use than traditional equipment. However, such devices provide low image resolution and suffer from occlusions and mixing up of body parts, so they lack the basic data required for 3D motion reconstruction. In this paper, a Dynamic Bayesian Network (DBN) model is proposed to describe the spatial and temporal characteristics of human body joints; the model fuses the parent-child relationships between joints with the multi-order Markov property of joints in motion. Based on the DBN model and the similarity of golf swings, a golf swing capture and 3D reconstruction system, DBN-Motion (DBN-based Motion reconstruction system), is presented, with a simple depth imaging device, Kinect, as the capture device. The proposed system effectively solves the problem of occlusions and mixing up of body parts, and successfully captures and reconstructs golf swings in 3D space. Experimental results prove that the proposed system can achieve reconstruction accuracy comparable to commercial optical motion capture systems.

  8. Bayesian least squares deconvolution

    CERN Document Server

    Ramos, A Asensio

    2015-01-01

    Aims. To develop a fully Bayesian least squares deconvolution (LSD) that can be applied to the reliable detection of magnetic signals in noise-limited stellar spectropolarimetric observations using multiline techniques. Methods. We consider LSD under the Bayesian framework and we introduce a flexible Gaussian Process (GP) prior for the LSD profile. This prior allows the result to automatically adapt to the presence of signal. We exploit several linear algebra identities to accelerate the calculations. The final algorithm can deal with thousands of spectral lines in a few seconds. Results. We demonstrate the reliability of the method with synthetic experiments and we apply it to real spectropolarimetric observations of magnetic stars. We are able to recover the magnetic signals using a small number of spectral lines, together with the uncertainty at each velocity bin. This allows the user to consider if the detected signal is reliable. The code to compute the Bayesian LSD profile is freely available.

  9. Bayesian least squares deconvolution

    Science.gov (United States)

    Asensio Ramos, A.; Petit, P.

    2015-11-01

    Aims: We develop a fully Bayesian least squares deconvolution (LSD) that can be applied to the reliable detection of magnetic signals in noise-limited stellar spectropolarimetric observations using multiline techniques. Methods: We consider LSD under the Bayesian framework and we introduce a flexible Gaussian process (GP) prior for the LSD profile. This prior allows the result to automatically adapt to the presence of signal. We exploit several linear algebra identities to accelerate the calculations. The final algorithm can deal with thousands of spectral lines in a few seconds. Results: We demonstrate the reliability of the method with synthetic experiments and we apply it to real spectropolarimetric observations of magnetic stars. We are able to recover the magnetic signals using a small number of spectral lines, together with the uncertainty at each velocity bin. This allows the user to consider if the detected signal is reliable. The code to compute the Bayesian LSD profile is freely available.
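
    Since the LSD forward model is linear (an observed spectrum is a superposition of shifted, weighted copies of a common profile), a Gaussian process prior yields a closed-form Gaussian posterior. The sketch below illustrates this conjugate structure on a synthetic line list; it is a minimal stand-in rather than the authors' code, and the line positions, weights, and kernel length scale are invented.

      # Bayesian LSD sketch: y = M z + noise, where M encodes line positions
      # and weights and z is the common profile; a GP prior z ~ N(0, K)
      # gives a Gaussian posterior in closed form.
      import numpy as np

      rng = np.random.default_rng(2)
      n_vel, n_pix = 21, 400
      lines = [(50, 0.9), (140, 0.6), (230, 0.8), (310, 0.4)]  # (pixel, weight)

      M = np.zeros((n_pix, n_vel))
      for pos, w in lines:
          M[pos:pos + n_vel, :] += w * np.eye(n_vel)

      v = np.arange(n_vel)
      K = np.exp(-0.5 * (v[:, None] - v[None, :]) ** 2 / 3.0 ** 2)  # squared-exp prior
      z_true = np.linalg.cholesky(K + 1e-8 * np.eye(n_vel)) @ rng.normal(size=n_vel)
      sigma = 0.05
      y = M @ z_true + sigma * rng.normal(size=n_pix)

      # Posterior: S = (M'M/s^2 + K^-1)^-1, mean = S M'y / s^2
      S = np.linalg.inv(M.T @ M / sigma**2 + np.linalg.inv(K + 1e-8 * np.eye(n_vel)))
      z_mean = S @ M.T @ y / sigma**2
      print("rms profile error:", np.sqrt(np.mean((z_mean - z_true) ** 2)))
      print("per-bin posterior std:", np.round(np.sqrt(np.diag(S)), 3))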

  10. Hybrid Batch Bayesian Optimization

    CERN Document Server

    Azimi, Javad; Fern, Xiaoli

    2012-01-01

    Bayesian Optimization aims at optimizing an unknown non-convex/concave function that is costly to evaluate. We are interested in application scenarios where concurrent function evaluations are possible. Under such a setting, BO could choose to either sequentially evaluate the function, one input at a time and wait for the output of the function before making the next selection, or evaluate the function at a batch of multiple inputs at once. These two different settings are commonly referred to as the sequential and batch settings of Bayesian Optimization. In general, the sequential setting leads to better optimization performance as each function evaluation is selected with more information, whereas the batch setting has an advantage in terms of the total experimental time (the number of iterations). In this work, our goal is to combine the strength of both settings. Specifically, we systematically analyze Bayesian optimization using Gaussian process as the posterior estimator and provide a hybrid algorithm t...

  11. Bayesian Adaptive Exploration

    CERN Document Server

    Loredo, T J

    2004-01-01

    I describe a framework for adaptive scientific exploration based on iterating an Observation--Inference--Design cycle that allows adjustment of hypotheses and observing protocols in response to the results of observation on-the-fly, as data are gathered. The framework uses a unified Bayesian methodology for the inference and design stages: Bayesian inference to quantify what we have learned from the available data and predict future data, and Bayesian decision theory to identify which new observations would teach us the most. When the goal of the experiment is simply to make inferences, the framework identifies a computationally efficient iterative "maximum entropy sampling" strategy as the optimal strategy in settings where the noise statistics are independent of signal properties. Results of applying the method to two "toy" problems with simulated data--measuring the orbit of an extrasolar planet, and locating a hidden one-dimensional object--show the approach can significantly improve observational eff...

  12. Bayesian Exploratory Factor Analysis

    DEFF Research Database (Denmark)

    Conti, Gabriella; Frühwirth-Schnatter, Sylvia; Heckman, James J.;

    2014-01-01

    This paper develops and applies a Bayesian approach to Exploratory Factor Analysis that improves on ad hoc classical approaches. Our framework relies on dedicated factor models and simultaneously determines the number of factors, the allocation of each measurement to a unique factor, and the corresponding factor loadings. Classical identification criteria are applied and integrated into our Bayesian procedure to generate models that are stable and clearly interpretable. A Monte Carlo study confirms the validity of the approach. The method is used to produce interpretable low dimensional aggregates...

  13. Preliminary investigation of a Bayesian network for mammographic diagnosis of breast cancer.

    OpenAIRE

    Kahn, C. E.; Roberts, L. M.; K. Wang; Jenks, D.; Haddawy, P.

    1995-01-01

    Bayesian networks use the techniques of probability theory to reason under conditions of uncertainty. We investigated the use of Bayesian networks for radiological decision support. A Bayesian network for the interpretation of mammograms (MammoNet) was developed based on five patient-history features, two physical findings, and 15 mammographic features extracted by experienced radiologists. Conditional-probability data, such as sensitivity and specificity, were derived from peer-reviewed jour...

  14. Bayesian multiple target tracking

    CERN Document Server

    Streit, Roy L

    2013-01-01

    This second edition has undergone substantial revision from the 1999 first edition, recognizing that a lot has changed in the multiple target tracking field. One of the most dramatic changes is in the widespread use of particle filters to implement nonlinear, non-Gaussian Bayesian trackers. This book views multiple target tracking as a Bayesian inference problem. Within this framework it develops the theory of single target tracking, multiple target tracking, and likelihood ratio detection and tracking. In addition to providing a detailed description of a basic particle filter that implements

  15. Bayesian and frequentist inequality tests

    OpenAIRE

    David M. Kaplan; Zhuo, Longhao

    2016-01-01

    Bayesian and frequentist criteria are fundamentally different, but often posterior and sampling distributions are asymptotically equivalent (and normal). We compare Bayesian and frequentist hypothesis tests of inequality restrictions in such cases. For finite-dimensional parameters, if the null hypothesis is that the parameter vector lies in a certain half-space, then the Bayesian test has (frequentist) size α; if the null hypothesis is any other convex subspace, then the Bayesian test...
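
    A minimal numerical illustration of the half-space case (under the usual asymptotic normal approximation, with hypothetical numbers): the posterior probability of the null coincides with the one-sided p-value.

      # With an asymptotically normal posterior theta ~ N(theta_hat, se^2),
      # the posterior probability of H0: theta <= 0 equals the frequentist
      # one-sided p-value. Numbers below are made up.
      from scipy.stats import norm

      theta_hat, se = 1.3, 0.8                               # estimate, std. error
      post_p_null = norm.cdf(0.0, loc=theta_hat, scale=se)   # P(theta <= 0 | data)
      freq_p = norm.cdf(-theta_hat / se)                     # one-sided p-value
      print(post_p_null, freq_p)                             # identical here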

  16. Gaussian process-based Bayesian nonparametric inference of population size trajectories from gene genealogies.

    Science.gov (United States)

    Palacios, Julia A; Minin, Vladimir N

    2013-03-01

    Changes in population size influence genetic diversity of the population and, as a result, leave a signature of these changes in individual genomes in the population. We are interested in the inverse problem of reconstructing past population dynamics from genomic data. We start with a standard framework based on the coalescent, a stochastic process that generates genealogies connecting randomly sampled individuals from the population of interest. These genealogies serve as a glue between the population demographic history and genomic sequences. It turns out that only the times of genealogical lineage coalescences contain information about population size dynamics. Viewing these coalescent times as a point process, estimating population size trajectories is equivalent to estimating a conditional intensity of this point process. Therefore, our inverse problem is similar to estimating an inhomogeneous Poisson process intensity function. We demonstrate how recent advances in Gaussian process-based nonparametric inference for Poisson processes can be extended to Bayesian nonparametric estimation of population size dynamics under the coalescent. We compare our Gaussian process (GP) approach to one of the state-of-the-art Gaussian Markov random field (GMRF) methods for estimating population trajectories. Using simulated data, we demonstrate that our method has better accuracy and precision. Next, we analyze two genealogies reconstructed from real sequences of hepatitis C and human Influenza A viruses. In both cases, we recover more of the accepted features of the viral demographic histories than the GMRF approach does. We also find that our GP method produces more reasonable uncertainty estimates than the GMRF method.

  17. Development and sustainability of NSF-funded climate change education efforts: lessons learned and strategies used to develop the Reconstructing Earth's Climate History (REaCH) curriculum (Invited)

    Science.gov (United States)

    St John, K. K.; Jones, M. H.; Leckie, R. M.; Pound, K. S.; Krissek, L. A.

    2013-12-01

    develop detailed instructor guides to accompany each module. After careful consideration of dissemination options, we chose to publish the full suite of exercise modules as a commercially-available book, Reconstructing Earth's Climate History, while also providing open online access to a subset of modules. Its current use in undergraduate paleoclimatology courses, and the availability of select modules for use in other courses, demonstrates that creative, hybrid options can be found for lasting dissemination, and thus sustainability. In achieving our goal of making science accessible, we believe we have followed a curriculum development process and sustainability path that can be used by others to meet needs in earth, ocean, and atmospheric science education. Next steps for REaCH include exploration of its use in blended learning classrooms and at minority-serving institutions.

  18. Bayesian Dark Knowledge

    NARCIS (Netherlands)

    A. Korattikara; V. Rathod; K. Murphy; M. Welling

    2015-01-01

    We consider the problem of Bayesian parameter estimation for deep neural networks, which is important in problem settings where we may have little data, and/ or where we need accurate posterior predictive densities p(y|x, D), e.g., for applications involving bandits or active learning. One simple ap

  19. Bayesian logistic regression analysis

    NARCIS (Netherlands)

    Van Erp, H.R.N.; Van Gelder, P.H.A.J.M.

    2012-01-01

    In this paper we present a Bayesian logistic regression analysis. It is found that if one wishes to derive the posterior distribution of the probability of some event, then, together with the traditional Bayes Theorem and the integrating out of nuisance parameters, the Jacobian transformation is an
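
    For readers who want a runnable reference point, the sketch below fits a Bayesian logistic regression by the standard Laplace approximation: Newton's method finds the MAP weights under a Gaussian prior, and the posterior is approximated by a Gaussian at the mode. This is a generic construction, not the specific derivation pursued in the paper above; all data are simulated.

      # Laplace-approximation Bayesian logistic regression (generic sketch).
      import numpy as np

      rng = np.random.default_rng(3)
      n, d = 300, 3
      X = np.column_stack([np.ones(n), rng.normal(size=(n, d - 1))])
      w_true = np.array([-0.5, 1.2, -0.8])
      y = (rng.uniform(size=n) < 1 / (1 + np.exp(-X @ w_true))).astype(float)

      alpha = 1.0                                # prior precision, w ~ N(0, I/alpha)
      w = np.zeros(d)
      for _ in range(25):                        # Newton iterations to the MAP
          p = 1 / (1 + np.exp(-X @ w))
          grad = X.T @ (y - p) - alpha * w       # gradient of the log posterior
          H = -(X.T * (p * (1 - p))) @ X - alpha * np.eye(d)   # Hessian
          w = w - np.linalg.solve(H, grad)

      cov = np.linalg.inv(-H)                    # Gaussian posterior covariance
      print("posterior mean:", np.round(w, 2))
      print("posterior std: ", np.round(np.sqrt(np.diag(cov)), 2))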

  20. Bayesian Adaptive Exploration

    Science.gov (United States)

    Loredo, Thomas J.

    2004-04-01

    I describe a framework for adaptive scientific exploration based on iterating an Observation-Inference-Design cycle that allows adjustment of hypotheses and observing protocols in response to the results of observation on-the-fly, as data are gathered. The framework uses a unified Bayesian methodology for the inference and design stages: Bayesian inference to quantify what we have learned from the available data and predict future data, and Bayesian decision theory to identify which new observations would teach us the most. When the goal of the experiment is simply to make inferences, the framework identifies a computationally efficient iterative "maximum entropy sampling" strategy as the optimal strategy in settings where the noise statistics are independent of signal properties. Results of applying the method to two "toy" problems with simulated data (measuring the orbit of an extrasolar planet, and locating a hidden one-dimensional object) show the approach can significantly improve observational efficiency in settings that have well-defined nonlinear models. I conclude with a list of open issues that must be addressed to make Bayesian adaptive exploration a practical and reliable tool for optimizing scientific exploration.

  1. Subjective Bayesian Beliefs

    DEFF Research Database (Denmark)

    Antoniou, Constantinos; Harrison, Glenn W.; Lau, Morten I.;

    2015-01-01

    A large literature suggests that many individuals do not apply Bayes’ Rule when making decisions that depend on them correctly pooling prior information and sample data. We replicate and extend a classic experimental study of Bayesian updating from psychology, employing the methods of experimental...

  2. Bayesian Independent Component Analysis

    DEFF Research Database (Denmark)

    Winther, Ole; Petersen, Kaare Brandt

    2007-01-01

    In this paper we present an empirical Bayesian framework for independent component analysis. The framework provides estimates of the sources, the mixing matrix and the noise parameters, and is flexible with respect to choice of source prior and the number of sources and sensors. Inside the engine...

  3. Bayesian grid matching

    DEFF Research Database (Denmark)

    Hartelius, Karsten; Carstensen, Jens Michael

    2003-01-01

    A method for locating distorted grid structures in images is presented. The method is based on the theories of template matching and Bayesian image restoration. The grid is modeled as a deformable template. Prior knowledge of the grid is described through a Markov random field (MRF) model which...

  4. Sparse Bayesian learning in ISAR tomography imaging

    Institute of Scientific and Technical Information of China (English)

    SU Wu-ge; WANG Hong-qiang; DENG Bin; WANG Rui-jun; QIN Yu-liang

    2015-01-01

    Inverse synthetic aperture radar (ISAR) imaging can be regarded as a narrow-band version of computer aided tomography (CT). The traditional CT imaging algorithms for ISAR, including the polar format algorithm (PFA) and the convolution back projection algorithm (CBP), usually suffer from problems of high sidelobes and low resolution. This work concerns ISAR tomography image reconstruction within a sparse Bayesian framework. Firstly, the sparse ISAR tomography imaging model is established in light of the CT imaging theory. Then, by using the compressed sensing (CS) principle, a high resolution ISAR image can be achieved with a limited number of pulses. Since the performance of existing CS-based ISAR imaging algorithms is sensitive to the user parameter, the existing algorithms are inconvenient to use in practice. It is well known that the Bayesian recovery algorithm named sparse Bayesian learning (SBL) acts as an effective tool in regression and classification; it uses an efficient expectation maximization procedure to estimate the necessary parameters and retains a preferable property of the l0-norm diversity measure. Motivated by this, a fully automated ISAR tomography imaging algorithm based on SBL is proposed. Experimental results based on simulated and electromagnetic (EM) data illustrate the effectiveness and the superiority of the proposed algorithm over the existing algorithms.
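
    The core SBL iteration is compact enough to sketch. Below is a minimal relevance-vector-machine-style loop for a generic linear model y = A x + noise, standing in for the ISAR tomography operator (which is not reproduced here); the per-coefficient precisions alpha are re-estimated by evidence maximization, so no user-tuned sparsity parameter is needed. The noise precision is assumed known for simplicity.

      # Minimal sparse Bayesian learning loop for a generic linear model.
      import numpy as np

      rng = np.random.default_rng(4)
      m, n = 60, 120
      A = rng.normal(size=(m, n)) / np.sqrt(m)   # stand-in for the imaging operator
      x_true = np.zeros(n)
      x_true[[10, 47, 90]] = [1.0, -0.7, 0.5]
      y = A @ x_true + 0.01 * rng.normal(size=m)

      alpha = np.ones(n)                 # per-coefficient precisions (hyperparameters)
      beta = 1.0 / 0.01**2               # noise precision, assumed known here
      for _ in range(50):
          Sigma = np.linalg.inv(np.diag(alpha) + beta * A.T @ A)
          mu = beta * Sigma @ A.T @ y
          gamma = 1.0 - alpha * np.diag(Sigma)                 # effective d.o.f.
          alpha = np.minimum(gamma / (mu**2 + 1e-12), 1e12)    # evidence update

      print("support found:", np.nonzero(np.abs(mu) > 1e-2)[0])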

  5. Bayesian analysis of cosmic structures

    CERN Document Server

    Kitaura, Francisco-Shu

    2011-01-01

    We revise the Bayesian inference steps required to analyse the cosmological large-scale structure. Here we place special emphasis on the complications which arise due to the non-Gaussian character of the galaxy and matter distribution. In particular we investigate the advantages and limitations of the Poisson-lognormal model and discuss how to extend this work. With the lognormal prior using the Hamiltonian sampling technique and on scales of about 4 h^{-1} Mpc we find that the over-dense regions are excellently reconstructed; however, under-dense regions (void statistics) are quantitatively poorly recovered. Contrary to the maximum a posteriori (MAP) solution, which was shown to over-estimate the density in the under-dense regions, we obtain lower densities than in N-body simulations. This is due to the fact that the MAP solution is conservative, whereas the full posterior yields samples which are consistent with the prior statistics. The lognormal prior is not able to capture the full non-linear regime at scales ...

  6. Probability and Bayesian statistics

    CERN Document Server

    1987-01-01

    This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics" which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985 the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are published especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability across psychological aspects of formulating subjective probability statements, abstract measure theoretical considerations, contributions to theoretical statistics an...

  7. Bayesian community detection

    DEFF Research Database (Denmark)

    Mørup, Morten; Schmidt, Mikkel N

    2012-01-01

    Many networks of scientific interest naturally decompose into clusters or communities with comparatively fewer external than internal links; however, current Bayesian models of network communities do not impose this intuitive notion of communities. We formulate a nonparametric Bayesian model for community detection consistent with an intuitive definition of communities and present a Markov chain Monte Carlo procedure for inferring the community structure. A Matlab toolbox with the proposed inference procedure is available for download. On synthetic and real networks, our model detects communities consistent with ground truth, and on real networks, it outperforms existing approaches in predicting missing links. This suggests that community structure is an important structural property of networks that should be explicitly modeled.

  8. Bayesian Word Sense Induction

    OpenAIRE

    Brody, Samuel; Lapata, Mirella

    2009-01-01

    Sense induction seeks to automatically identify word senses directly from a corpus. A key assumption underlying previous work is that the context surrounding an ambiguous word is indicative of its meaning. Sense induction is thus typically viewed as an unsupervised clustering problem where the aim is to partition a word’s contexts into different classes, each representing a word sense. Our work places sense induction in a Bayesian context by modeling the contexts of the ambiguous word as samp...

  9. Bayesian Generalized Rating Curves

    OpenAIRE

    Helgi Sigurðarson

    2014-01-01

    A rating curve is a curve or a model that describes the relationship between water elevation, or stage, and discharge at an observation site in a river. The rating curve is fitted from paired observations of stage and discharge. The rating curve then predicts discharge given observations of stage; this methodology is used because stage is substantially easier to observe directly than discharge. In this thesis a statistical rating curve model is proposed working within the framework of Bayesian...
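
    A common parameterization of such a model is the power law Q = a(h - c)^b with lognormal errors. The sketch below fits it by random-walk Metropolis; the priors, step sizes, and data are illustrative guesses, not those of the thesis.

      # Bayesian power-law rating curve Q = a*(h - c)^b fitted by
      # random-walk Metropolis on synthetic stage-discharge pairs.
      import numpy as np

      rng = np.random.default_rng(5)
      h = np.linspace(1.0, 4.0, 30)                       # stage observations
      Q = 2.0 * (h - 0.5) ** 1.8 * np.exp(0.05 * rng.normal(size=h.size))

      def log_post(theta):
          a, b, c = theta
          if a <= 0 or b <= 0 or c >= h.min():
              return -np.inf                              # flat priors on a valid region
          resid = np.log(Q) - np.log(a) - b * np.log(h - c)
          return -0.5 * np.sum(resid**2) / 0.05**2        # lognormal error model

      theta = np.array([1.0, 1.0, 0.0])
      lp = log_post(theta)
      samples = []
      for i in range(20000):
          prop = theta + rng.normal(scale=[0.05, 0.02, 0.02])
          lp_prop = log_post(prop)
          if np.log(rng.uniform()) < lp_prop - lp:        # Metropolis accept/reject
              theta, lp = prop, lp_prop
          if i > 5000:                                    # discard burn-in
              samples.append(theta)
      print("posterior mean (a, b, c):", np.round(np.mean(samples, axis=0), 2))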

  10. Efficient Bayesian Phase Estimation

    Science.gov (United States)

    Wiebe, Nathan; Granade, Chris

    2016-07-01

    We introduce a new method called rejection filtering that we use to perform adaptive Bayesian phase estimation. Our approach has several advantages: it is classically efficient, easy to implement, achieves Heisenberg limited scaling, resists depolarizing noise, tracks time-dependent eigenstates, recovers from failures, and can be run on a field programmable gate array. It also outperforms existing iterative phase estimation algorithms such as Kitaev's method.
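
    The sketch below shows the underlying Bayesian update on a grid rather than with the paper's rejection-filtering particle scheme: each experiment (M, theta) contributes the standard phase-estimation likelihood P(0 | phi) = (1 + cos(M(phi - theta)))/2, and the posterior over the phase phi concentrates as outcomes accumulate. The experiment schedule and numbers are illustrative.

      # Grid-based Bayesian phase estimation with the standard likelihood;
      # a simple stand-in for the rejection-filtering scheme described above.
      import numpy as np

      rng = np.random.default_rng(6)
      true_phi = 2.17
      grid = np.linspace(0, 2 * np.pi, 2000)
      post = np.ones_like(grid) / grid.size

      for k in range(30):
          M, theta = 2 ** (k % 8), rng.uniform(0, 2 * np.pi)   # experiment settings
          p0 = (1 + np.cos(M * (true_phi - theta))) / 2        # simulate an outcome
          outcome = rng.uniform() < p0
          like = (1 + np.cos(M * (grid - theta))) / 2
          post *= like if outcome else (1 - like)              # Bayes update
          post /= post.sum()

      print("estimate:", grid[np.argmax(post)], "truth:", true_phi)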

  11. Uncertainty estimation in reconstructed deformable models

    Energy Technology Data Exchange (ETDEWEB)

    Hanson, K.M.; Cunningham, G.S.; McKee, R.

    1996-12-31

    One of the hallmarks of the Bayesian approach to modeling is the posterior probability, which summarizes all uncertainties regarding the analysis. Using a Markov Chain Monte Carlo (MCMC) technique, it is possible to generate a sequence of objects that represent random samples drawn from the posterior distribution. We demonstrate this technique for reconstructions of two-dimensional objects from noisy projections taken from two directions. The reconstructed object is modeled in terms of a deformable geometrically-defined boundary with a constant interior density yielding a nonlinear reconstruction problem. We show how an MCMC sequence can be used to estimate uncertainties in the location of the edge of the reconstructed object.
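
    The step from an MCMC sample to uncertainty statements is simple to illustrate: given posterior draws of the coefficients describing a deformable boundary, pointwise credible bands for the edge location follow from sample quantiles. In the sketch below the draws are simulated in place of a real MCMC run, and the boundary parameterization is hypothetical.

      # Turning posterior samples of deformable-boundary coefficients into
      # a pointwise credible band for the edge location r(angle).
      import numpy as np

      rng = np.random.default_rng(7)
      angles = np.linspace(0, 2 * np.pi, 180)
      draws = []
      for _ in range(500):                       # stand-ins for MCMC samples
          c = np.array([1.0, 0.2, -0.1]) + 0.02 * rng.normal(size=3)
          draws.append(c[0] + c[1] * np.cos(angles) + c[2] * np.sin(2 * angles))
      draws = np.array(draws)

      lo, hi = np.percentile(draws, [2.5, 97.5], axis=0)   # 95% band per angle
      print("widest edge uncertainty:", np.round((hi - lo).max(), 3))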

  12. Bayesian theory and applications

    CERN Document Server

    Dellaportas, Petros; Polson, Nicholas G; Stephens, David A

    2013-01-01

    The development of hierarchical models and Markov chain Monte Carlo (MCMC) techniques forms one of the most profound advances in Bayesian analysis since the 1970s and provides the basis for advances in virtually all areas of applied and theoretical Bayesian statistics. This volume guides the reader along a statistical journey that begins with the basic structure of Bayesian theory, and then provides details on most of the past and present advances in this field. The book has a unique format. There is an explanatory chapter devoted to each conceptual advance followed by journal-style chapters that provide applications or further advances on the concept. Thus, the volume is both a textbook and a compendium of papers covering a vast range of topics. It is appropriate for a well-informed novice interested in understanding the basic approach, methods and recent applications. Because of its advanced chapters and recent work, it is also appropriate for a more mature reader interested in recent applications and devel...

  13. Bayesian Attractor Learning

    Science.gov (United States)

    Wiegerinck, Wim; Schoenaker, Christiaan; Duane, Gregory

    2016-04-01

    Recently, methods for model fusion by dynamically combining model components in an interactive ensemble have been proposed. In these proposals, fusion parameters have to be learned from data. One can view these systems as parametrized dynamical systems. We address the question of learnability of dynamical systems with respect to both short-term (vector field) and long-term (attractor) behavior. In particular we are interested in learning in the imperfect model class setting, in which the ground truth has a higher complexity than the models, e.g. due to unresolved scales. We take a Bayesian point of view and define a joint log-likelihood that consists of two terms: one is the vector field error, and the other is the attractor error, for which we take the L1 distance between the stationary distributions of the model and the assumed ground truth. In the context of linear models (like so-called weighted supermodels), and assuming a Gaussian error model in the vector fields, vector field learning leads to a tractable Gaussian solution. This solution can then be used as a prior for the next step, Bayesian attractor learning, in which the attractor error is used as a log-likelihood term. Bayesian attractor learning is implemented by elliptical slice sampling, a sampling method for systems with a Gaussian prior and a non-Gaussian likelihood. Simulations with a partially observed driven Lorenz 63 system illustrate the approach.
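
    Elliptical slice sampling, the sampler named above, admits a compact generic implementation: propose points on the ellipse through the current state and a fresh prior draw, shrinking the angle bracket until the log-likelihood exceeds a random slice level. The sketch below is the generic update of Murray, Adams and MacKay, with a toy Gaussian log-likelihood standing in for the paper's attractor-error term.

```python
import numpy as np

rng = np.random.default_rng(1)

def elliptical_slice_step(f, chol_sigma, log_lik):
    """One elliptical slice sampling update for a zero-mean Gaussian prior.

    f          : current state (1-D array)
    chol_sigma : Cholesky factor of the prior covariance
    log_lik    : function returning the log-likelihood of a state
    """
    nu = chol_sigma @ rng.standard_normal(f.size)   # auxiliary draw from the prior
    log_y = log_lik(f) + np.log(rng.random())       # random slice level
    theta = rng.uniform(0.0, 2.0 * np.pi)           # initial proposal angle
    lo, hi = theta - 2.0 * np.pi, theta
    while True:
        f_new = f * np.cos(theta) + nu * np.sin(theta)
        if log_lik(f_new) > log_y:
            return f_new                            # accepted; loop always terminates
        if theta < 0.0:                             # shrink the bracket toward 0
            lo = theta
        else:
            hi = theta
        theta = rng.uniform(lo, hi)

# Demo: prior N(0, I), toy likelihood pulling the state toward (2, 2);
# the exact posterior mean is (1, 1).
log_lik = lambda f: -0.5 * np.sum((f - 2.0) ** 2)
f, chol = np.zeros(2), np.eye(2)
samples = []
for _ in range(5000):
    f = elliptical_slice_step(f, chol, log_lik)
    samples.append(f)
print(np.mean(samples, axis=0))
```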

  14. Reconstructing baryon oscillations

    OpenAIRE

    Noh, Yookyung; White, Martin; Padmanabhan, Nikhil

    2009-01-01

    The baryon acoustic oscillation (BAO) method for constraining the expansion history is adversely affected by non-linear structure formation, which washes out the correlation function peak created at decoupling. To increase the constraining power of low z BAO experiments, it has been proposed that one use the observed distribution of galaxies to "reconstruct" the acoustic peak. Recently Padmanabhan, White and Cohn provided an analytic formalism for understanding how reconstruction works withi...

  15. Bayesian optimization for materials design

    OpenAIRE

    Frazier, Peter I.; Wang, Jialei

    2015-01-01

    We introduce Bayesian optimization, a technique developed for optimizing time-consuming engineering simulations and for fitting machine learning models on large datasets. Bayesian optimization guides the choice of experiments during materials design and discovery to find good material designs in as few experiments as possible. We focus on the case when materials designs are parameterized by a low-dimensional vector. Bayesian optimization is built on a statistical technique called Gaussian pro...
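
    The loop this abstract describes can be illustrated concretely: refit a Gaussian-process surrogate after each experiment and pick the next design by maximizing an acquisition function such as expected improvement. The sketch below runs that loop on a hypothetical one-dimensional "material property"; the kernel, length-scale and objective are all invented.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)

def rbf(a, b, ls=0.2):
    # Squared-exponential kernel between two 1-D point sets.
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / ls) ** 2)

def gp_posterior(x_tr, y_tr, x_te, noise=1e-6):
    # Standard Gaussian-process regression equations.
    k = rbf(x_tr, x_tr) + noise * np.eye(len(x_tr))
    k_s = rbf(x_tr, x_te)
    sol = np.linalg.solve(k, k_s)
    mu = sol.T @ y_tr
    var = 1.0 - np.sum(k_s * sol, axis=0)
    return mu, np.sqrt(np.maximum(var, 1e-12))

def expected_improvement(mu, sd, best):
    z = (mu - best) / sd
    return (mu - best) * norm.cdf(z) + sd * norm.pdf(z)

f = lambda x: np.sin(6.0 * x) * x + 0.5 * x        # hypothetical material property
x_tr = rng.random(3)                               # three initial experiments
y_tr = f(x_tr)
cand = np.linspace(0.0, 1.0, 400)                  # candidate designs
for _ in range(15):                                # each loop = one new experiment
    mu, sd = gp_posterior(x_tr, y_tr, cand)
    x_next = cand[np.argmax(expected_improvement(mu, sd, y_tr.max()))]
    x_tr = np.append(x_tr, x_next)
    y_tr = np.append(y_tr, f(x_next))
print(f"best design x = {x_tr[y_tr.argmax()]:.3f}, value = {y_tr.max():.3f}")
```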

  16. Reconstructing ancestral ranges in historical biogeography: properties and prospects

    Institute of Scientific and Technical Information of China (English)

    Kristin S. LAMM; Benjamin D. REDELINGS

    2009-01-01

    Recent years have witnessed a proliferation of quantitative methods for biogeographic inference. In particular, novel parametric approaches represent exciting new opportunities for the study of range evolution. Here, we review a selection of current methods for biogeographic analysis and discuss their respective properties. These methods include generalized parsimony approaches, weighted ancestral area analysis, dispersal-vicariance analysis, the dispersal-extinction-cladogenesis model and other maximum likelihood approaches, and Bayesian stochastic mapping of ancestral ranges, including a novel approach to inferring range evolution in the context of island biogeography. Some of these methods were developed specifically for problems of ancestral range reconstruction, whereas others were designed for more general problems of character state reconstruction and subsequently applied to the study of ancestral ranges. Methods for reconstructing ancestral history on a phylogenetic tree differ not only in the types of ancestral range states that are allowed, but also in the various historical events that may change the ancestral ranges. We explore how the form of allowed ancestral ranges and allowed transitions can both affect the outcome of ancestral range estimation. Finally, we mention some promising avenues for future work in the development of model-based approaches to biogeographic analysis.

  17. Bayesian Posteriors Without Bayes' Theorem

    CERN Document Server

    Hill, Theodore P

    2012-01-01

    The classical Bayesian posterior arises naturally as the unique solution of several different optimization problems, without the necessity of interpreting data as conditional probabilities and then using Bayes' Theorem. For example, the classical Bayesian posterior is the unique posterior that minimizes the loss of Shannon information in combining the prior and the likelihood distributions. These results, direct corollaries of recent results about conflations of probability distributions, reinforce the use of Bayesian posteriors, and may help partially reconcile some of the differences between classical and Bayesian statistics.

  18. Computational Imaging for VLBI Image Reconstruction

    OpenAIRE

    Bouman, Katherine L.; Johnson, Michael D; Zoran, Daniel; Fish, Vincent L.; Doeleman, Sheperd S.; Freeman, William T.

    2016-01-01

    Very long baseline interferometry (VLBI) is a technique for imaging celestial radio emissions by simultaneously observing a source from telescopes distributed across Earth. The challenges in reconstructing images from fine angular resolution VLBI data are immense. The data is extremely sparse and noisy, thus requiring statistical image models such as those designed in the computer vision community. In this paper we present a novel Bayesian approach for VLBI image reconstruction. While other m...

  19. Using Bayesian Belief Networks and event trees for volcanic hazard assessment and decision support: reconstruction of past eruptions of La Soufrière volcano, Guadeloupe, and retrospective analysis of the 1975-77 unrest.

    Science.gov (United States)

    Komorowski, Jean-Christophe; Hincks, Thea; Sparks, Steve; Aspinall, Willy; Legendre, Yoann; Boudon, Georges

    2013-04-01

    Since 1992, mild but persistent seismic and fumarolic unrest at La Soufrière de Guadeloupe volcano has prompted renewed concern about hazards and risks, crisis response planning, and has rejuvenated interest in geological studies. Scientists monitoring active volcanoes frequently have to provide science-based decision support to civil authorities during such periods of unrest. In these circumstances, the Bayesian Belief Network (BBN) offers a formalized evidence analysis tool for making inferences about the state of the volcano from different strands of data, allowing associated uncertainties to be treated in a rational and auditable manner, to the extent warranted by the strength of the evidence. To illustrate the principles of the BBN approach, a retrospective analysis is undertaken of the 1975-77 crisis, providing an inferential assessment of the evolving state of the magmatic system and the probability of subsequent eruption. Conditional dependencies and parameters in the BBN are characterized quantitatively by structured expert elicitation. Revisiting data available in 1976 suggests the probability of magmatic intrusion would have been evaluated high at the time, according with subsequent thinking about the volcanological nature of the episode. The corresponding probability of a magmatic eruption therefore would have been elevated in July and August 1976; however, collective uncertainty about the future course of the crisis was great at the time, even if some individual opinions were certain. From this BBN analysis, while the more likely appraised outcome - based on observational trends at 31 August 1976 - might have been 'no eruption' (mean probability 0.5; 5-95 percentile range 0.8), an imminent magmatic eruption (or blast) could have had a probability of about 0.4, almost as substantial. Thus, there was no real scientific basis to assert one scenario was more likely than the other. This retrospective evaluation adds objective probabilistic expression to
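
    For a small discrete example, inference in a Bayesian Belief Network of this kind reduces to multiplying conditional probability tables and renormalizing. The toy below wires two monitoring observables to a hidden "magmatic intrusion" node and reads off an eruption probability; every number in it is invented for illustration and bears no relation to the elicited values for La Soufrière.

```python
# Minimal two-level belief network, evaluated by direct enumeration.
# All probabilities below are invented; they are NOT the expert-elicited
# values used in the La Soufrière study.

p_intrusion = 0.10                                   # prior P(magmatic intrusion)
p_obs = {                                            # P(observation | intrusion?)
    "felt_seismicity": {True: 0.80, False: 0.15},
    "fumarole_activity": {True: 0.70, False: 0.30},
}
p_eruption_given = {True: 0.40, False: 0.02}         # P(eruption | intrusion?)

def posterior(evidence):
    """Return P(intrusion | evidence) and P(eruption | evidence)."""
    joint = {}
    for intr in (True, False):
        prior = p_intrusion if intr else 1.0 - p_intrusion
        lik = 1.0
        for obs, seen in evidence.items():
            p = p_obs[obs][intr]
            lik *= p if seen else 1.0 - p
        joint[intr] = prior * lik
    z = joint[True] + joint[False]                   # normalizing constant
    p_intr = joint[True] / z
    p_erupt = p_eruption_given[True] * p_intr + p_eruption_given[False] * (1 - p_intr)
    return p_intr, p_erupt

p_i, p_e = posterior({"felt_seismicity": True, "fumarole_activity": True})
print(f"P(intrusion | evidence) = {p_i:.2f}, P(eruption | evidence) = {p_e:.2f}")
```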

  20. From the history of the recognitions of the remains to the reconstruction of the face of Dante Alighieri by means of techniques of virtual reality and forensic anthropology

    Directory of Open Access Journals (Sweden)

    Stefano Benazzi

    2007-07-01

    The work consists of the reconstruction of the face of the great poet Dante Alighieri through a multidisciplinary approach that combines the traditional manual techniques of forensic anthropology with digital methodologies originally developed in manufacturing and military fields but increasingly applied to cultural heritage. Since the original skull of Dante could not be obtained, the work started from the data and elements collected by Fabio Frassetto and Giuseppe Sergi, two important anthropologists at the Universities of Bologna and Rome respectively, in an investigation carried out in 1921, the sixth centenary of the poet's death, on his remains preserved in Ravenna. Thanks to this, we have a very accurate description of Dante's bones, including 297 metric measurements covering the whole skeleton, scale photographs of the skull in the various norms and of many other bones, as well as a model of the skull subsequently realized by Frassetto. From this information, a geometric reconstruction of Dante Alighieri's skull, including the jaw, was carried out using virtual reality tools and technologies, and the corresponding physical model was then produced by rapid prototyping. A particularly important aspect of the work is the proposed 3D modelling methodology for the new reconstruction of the jaw (not found during the 1921 recognition), starting from a reference model. The prototyped skull model then serves as the basis for the subsequent stage of facial reconstruction using the traditional techniques of forensic art.

  1. Applied Bayesian modelling

    CERN Document Server

    Congdon, Peter

    2014-01-01

    This book provides an accessible approach to Bayesian computing and data analysis, with an emphasis on the interpretation of real data sets. Following in the tradition of the successful first edition, this book aims to make a wide range of statistical modeling applications accessible using tested code that can be readily adapted to the reader's own applications. The second edition has been thoroughly reworked and updated to take account of advances in the field. A new set of worked examples is included. The novel aspect of the first edition was the coverage of statistical modeling using WinBU

  2. Computationally efficient Bayesian tracking

    Science.gov (United States)

    Aughenbaugh, Jason; La Cour, Brian

    2012-06-01

    In this paper, we describe the progress we have achieved in developing a computationally efficient, grid-based Bayesian fusion tracking system. In our approach, the probability surface is represented by a collection of multidimensional polynomials, each computed adaptively on a grid of cells representing state space. Time evolution is performed using a hybrid particle/grid approach and knowledge of the grid structure, while sensor updates use a measurement-based sampling method with a Delaunay triangulation. We present an application of this system to the problem of tracking a submarine target using a field of active and passive sonar buoys.
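
    A minimal relative of a grid-based Bayesian tracker is the classic histogram filter: represent the posterior as cell probabilities, convolve with a motion kernel for time evolution, and multiply by the measurement likelihood for sensor updates. The one-dimensional sketch below uses invented noise models; the paper's polynomial-per-cell representation and Delaunay-based updates are beyond this illustration.

```python
import numpy as np

rng = np.random.default_rng(3)

# 1-D histogram (grid) Bayes filter: the posterior over target position is a
# vector of cell probabilities rather than a parametric density.
cells = np.linspace(0.0, 100.0, 501)
belief = np.ones_like(cells) / cells.size            # flat prior

motion_kernel = np.exp(-0.5 * (np.arange(-25, 26) / 5.0) ** 2)
motion_kernel /= motion_kernel.sum()                 # random-walk process noise

def predict(belief):
    # Time evolution: convolve the belief with the motion kernel.
    return np.convolve(belief, motion_kernel, mode="same")

def update(belief, z, sigma=3.0):
    # Measurement update: multiply by the likelihood and renormalize.
    lik = np.exp(-0.5 * ((cells - z) / sigma) ** 2)
    belief = belief * lik
    return belief / belief.sum()

true_pos = 40.0
for step in range(10):
    true_pos += rng.normal(0.0, 1.0)                 # target drifts
    z = true_pos + rng.normal(0.0, 3.0)              # noisy sonar-like range
    belief = update(predict(belief), z)
print(f"MAP estimate {cells[belief.argmax()]:.1f} vs true {true_pos:.1f}")
```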

  3. Bayesian nonparametric data analysis

    CERN Document Server

    Müller, Peter; Jara, Alejandro; Hanson, Tim

    2015-01-01

    This book reviews nonparametric Bayesian methods and models that have proven useful in the context of data analysis. Rather than providing an encyclopedic review of probability models, the book’s structure follows a data analysis perspective. As such, the chapters are organized by traditional data analysis problems. In selecting specific nonparametric models, simpler and more traditional models are favored over specialized ones. The discussed methods are illustrated with a wealth of examples, including applications ranging from stylized examples to case studies from recent literature. The book also includes an extensive discussion of computational methods and details on their implementation. R code for many examples is included in on-line software pages.

  4. Bayesian Geostatistical Design

    DEFF Research Database (Denmark)

    Diggle, Peter; Lophaven, Søren Nymand

    2006-01-01

    This paper describes the use of model-based geostatistics for choosing the set of sampling locations, collectively called the design, to be used in a geostatistical analysis. Two types of design situation are considered. These are retrospective design, which concerns the addition of sampling locations to, or deletion of locations from, an existing design, and prospective design, which consists of choosing positions for a new set of sampling locations. We propose a Bayesian design criterion which focuses on the goal of efficient spatial prediction whilst allowing for the fact that model...

  5. Inference in hybrid Bayesian networks

    DEFF Research Database (Denmark)

    Langseth, Helge; Nielsen, Thomas Dyhre; Rumí, Rafael;

    2009-01-01

    Since the 1980s, Bayesian Networks (BNs) have become increasingly popular for building statistical models of complex systems. This is particularly true for boolean systems, where BNs often prove to be a more efficient modelling framework than traditional reliability techniques (like fault trees) ... The paper reviews the last decade's research on inference in hybrid Bayesian networks. The discussions are linked to an example model for estimating human reliability.

  6. Bayesian Inference on Gravitational Waves

    Directory of Open Access Journals (Sweden)

    Asad Ali

    2015-12-01

    The Bayesian approach is increasingly becoming popular among the astrophysics data analysis communities. However, the Pakistan statistics communities are unaware of this fertile interaction between the two disciplines. Bayesian methods have been in use to address astronomical problems since the very birth of Bayes probability in the eighteenth century. Today the Bayesian methods for the detection and parameter estimation of gravitational waves have solid theoretical grounds and hold strong promise for realistic applications. This article aims to introduce the Pakistan statistics communities to the applications of Bayesian Monte Carlo methods in the analysis of gravitational wave data, with an overview of Bayesian signal detection and estimation methods and a demonstration through a couple of simplified examples.

  7. Bayesian inference tools for inverse problems

    Science.gov (United States)

    Mohammad-Djafari, Ali

    2013-08-01

    In this paper, the basics of Bayesian inference with a parametric model of the data are first presented. The necessary extensions are then given for inverse problems, in particular linear models such as deconvolution or image reconstruction in Computed Tomography (CT). The main point of discussion is the prior modeling of signals and images. A classification of these priors is presented: first separable versus Markovian models, then simple versus hierarchical models with hidden variables. For practical applications, we also need to consider the estimation of the hyperparameters. Finally, we see that we have to infer simultaneously on the unknowns, the hidden variables and the hyperparameters. Very often, the expression of this joint posterior law is too complex to be handled directly, and we can rarely obtain analytical solutions for point estimators such as the Maximum A Posteriori (MAP) or the Posterior Mean (PM). Three main tools can then be used: Laplace approximation (LAP), Markov Chain Monte Carlo (MCMC) and Bayesian Variational Approximations (BVA). To illustrate all these aspects, we consider a deconvolution problem where the input signal is known to be sparse and propose a Student-t prior for it. To handle the Bayesian computations with this model, we use the property that the Student-t distribution can be modelled as an infinite mixture of Gaussians, thus introducing hidden variables, namely the variances. The expression of the joint posterior of the input signal samples, the hidden variables (here the inverse variances of those samples) and the hyperparameters of the problem (for example, the variance of the noise) is then given. From this point, we present the joint maximization by alternate optimization and the three possible approximation methods. Finally, the proposed methodology is applied in different applications such as mass spectrometry, spectrum estimation of quasi-periodic biological signals and
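
    The Student-t-as-Gaussian-mixture trick mentioned above leads to a particularly simple alternating scheme: update the hidden per-sample precisions from the current signal estimate, then solve a weighted ridge problem for the signal. The sketch below applies that scheme to a toy sparse deconvolution; the blur kernel, hyperparameters and noise level are all invented, and this is an illustration of the mechanism rather than the paper's full hierarchy.

```python
import numpy as np

rng = np.random.default_rng(4)

# Sparse deconvolution with a Student-t prior written as a Gaussian scale
# mixture: x_i | lam_i ~ N(0, 1/lam_i), lam_i ~ Gamma(a, b). Alternate a
# posterior-mean step for the hidden precisions lam with a MAP (ridge-like)
# solve for x. Problem sizes and hyperparameters are invented.
n = 128
h = np.exp(-0.5 * (np.arange(-6, 7) / 2.0) ** 2)     # blur kernel
H = np.array([np.convolve(np.eye(n)[i], h, mode="same") for i in range(n)]).T

x_true = np.zeros(n)
x_true[rng.choice(n, 5, replace=False)] = rng.normal(0.0, 3.0, 5)
sigma2 = 0.01
y = H @ x_true + rng.normal(0.0, np.sqrt(sigma2), n)

a, b = 1.0, 1e-4                                     # Student-t hyperparameters
x = np.zeros(n)
for _ in range(50):
    lam = (a + 0.5) / (b + 0.5 * x ** 2)             # mean of p(lam_i | x_i)
    A = H.T @ H / sigma2 + np.diag(lam)              # penalized least squares
    x = np.linalg.solve(A, H.T @ y / sigma2)
print("support found:", np.nonzero(np.abs(x) > 0.5)[0],
      "true:", np.nonzero(x_true)[0])
```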

  8. Breast reconstruction: Correlation between different procedures, reconstruction timing and complications

    Directory of Open Access Journals (Sweden)

    Anđelkov Katarina

    2011-01-01

    Introduction. Improved psychophysical condition after breast reconstruction in women has been well documented. Objective. To determine the most optimal technique with minimal morbidity, the authors examined their results and complications based on reconstruction timing (immediate and delayed reconstruction) and three reconstruction methods: TRAM flap, latissimus dorsi flap, and reconstruction with tissue expanders and implants. Methods. Reconstruction was performed in 60 women of mean age 51.1 years. We analyzed the risk factors age, body mass index (BMI), smoking history and radiation therapy in correlation with the timing and method of reconstruction. Complications of all three methods were followed up for 1.5-2 years after reconstruction. All data were statistically analyzed. Results. Only radiation had a significant influence on the occurrence of complications, both before and after reconstruction, while age, smoking and BMI had no considerable influence on the development of complications. There was no statistically significant correlation between the incidence of complications and the timing or method of reconstruction. Conclusion. Any of the aforementioned breast reconstruction techniques can yield good results and a low rate of re-operations. To choose the best method, the patient needs to be as well informed as possible about the options, including the risks and benefits of each method.

  9. Implementing Bayesian Vector Autoregressions

    Directory of Open Access Journals (Sweden)

    Richard M. Todd

    1988-03-01

    This paper discusses how the Bayesian approach can be used to construct a type of multivariate forecasting model known as a Bayesian vector autoregression (BVAR). In doing so, we mainly explain the propositions of Doan, Litterman, and Sims (1984) on how to estimate a BVAR based on a certain family of prior probability distributions indexed by a fairly small set of hyperparameters. There is also a discussion of how to specify a BVAR and set up a BVAR database. A 4-variable model is used to illustrate the BVAR approach.
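
    The essence of the estimation step can be shown with a toy conjugate version: each VAR equation gets a Gaussian prior centred on a random walk (a Minnesota-style prior), and the posterior mean is a ridge-type formula. The hyperparameter values and the assumption of a known residual variance below are simplifications for illustration, not the Doan-Litterman-Sims specification.

```python
import numpy as np

rng = np.random.default_rng(5)

# Minimal Bayesian VAR(1): coefficients shrink toward a random walk
# (own first lag = 1, everything else = 0). Values are illustrative.
k, T = 3, 200
A_true = np.array([[0.7, 0.1, 0.0],
                   [0.0, 0.5, 0.2],
                   [0.1, 0.0, 0.6]])
y = np.zeros((T, k))
for t in range(1, T):
    y[t] = A_true @ y[t - 1] + rng.normal(0.0, 0.5, k)

X, Y = y[:-1], y[1:]                                  # lagged regressors / targets
tightness = 0.2                                       # overall prior tightness
sigma2 = 0.25                                         # residual variance (assumed known here)

A_post = np.zeros((k, k))
for i in range(k):                                    # one ridge regression per equation
    prior_mean = np.eye(k)[i]                         # random-walk prior mean
    prior_prec = np.eye(k) / tightness ** 2           # diagonal prior precision
    post_prec = prior_prec + X.T @ X / sigma2
    A_post[i] = np.linalg.solve(post_prec,
                                prior_prec @ prior_mean + X.T @ Y[:, i] / sigma2)
print(np.round(A_post, 2))                            # shrunken coefficient estimates
```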

  10. Delayed breast implant reconstruction

    DEFF Research Database (Denmark)

    Hvilsom, Gitte B.; Hölmich, Lisbet R.; Steding-Jessen, Marianne;

    2011-01-01

    Studies of complications following reconstructive surgery with implants among women with breast cancer are needed. In what is, to our knowledge, the first prospective long-term study, we evaluated the occurrence of complications following delayed breast reconstruction separately for one- and two-stage procedures. From the Danish Registry for Plastic Surgery of the Breast, which has prospectively registered data for women undergoing breast implantations since 1999, we identified 559 women without a history of radiation therapy undergoing 592 delayed breast reconstructions following breast cancer during ... The risk of reoperation was significantly higher following the one-stage procedure. For both procedures, the majority of reoperations were due to asymmetry or displacement of the implant. In conclusion, non-radiated one- and two-stage delayed breast implant reconstructions are associated with substantial risks...

  11. Character State Reconstruction of Call Diversity in the Neoconocephalus Katydids Reveals High Levels of Convergence.

    Science.gov (United States)

    Frederick, Katy; Schul, Johannes

    2016-01-01

    The katydid genus Neoconocephalus is characterized by high diversity of the acoustic communication system. Both male signals and female preferences have been thoroughly studied in the past. This study used Bayesian character state reconstruction to elucidate the evolutionary history of diverse call traits, based on an existing, well supported phylogenetic hypothesis. The most common male call pattern consisted of continuous calls comprising one fast pulse rate; this pattern is the likely ancestral state in this genus. Three lines of call divergence existed among the species of the genus. First, four species had significantly slower pulse rates. Second, five species had alternating pulse periods, resulting in a double pulse rhythm. Third, several species had discontinuous calls, when pulses were grouped into rhythmically repeated verses. Bayesian character state reconstruction revealed that the double-pulse pattern likely evolved convergently five times; the slow pulse rate also evolved four times independently. Discontinuous calls have evolved twice and occur in two clades; each of which contains reversals to the ancestral continuous calls. Pairwise phylogenetically independent contrast analyses among the three call traits found no significant correlations among the character states of the different traits, supporting the independent evolution of the three call traits. PMID:27110432

  12. Dynamic Bayesian diffusion estimation

    CERN Document Server

    Dedecius, K

    2012-01-01

    The rapidly increasing complexity of (mainly wireless) ad-hoc networks stresses the need for reliable distributed estimation of several variables of interest. The widely used centralized approach, in which the network nodes communicate their data to a single specialized point, suffers from high communication overheads and represents a potentially dangerous concept with a single point of failure needing special treatment. This paper aims to contribute to another quite recent method called diffusion estimation. By decentralizing the operating environment, the network nodes communicate just within a close neighbourhood. We adopt the Bayesian framework for modelling and estimation, which, unlike the traditional approaches, abstracts from a particular model case. This leads to a very scalable and universal method, applicable to a wide class of different models. A particularly interesting case - the Gaussian regressive model - is derived as an example.
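
    For the Gaussian case the diffusion scheme can be sketched directly: each node carries its belief in natural (precision) parameters, performs a local Bayesian measurement update, and then combines parameters across its neighbourhood. The ring topology, combination rule and noise levels below are invented for illustration; the paper develops the general Bayesian treatment.

```python
import numpy as np

rng = np.random.default_rng(6)

# Diffusion estimation of a common scalar theta over an ad-hoc network.
# Each node keeps a Gaussian belief in natural (precision) form, performs a
# local Bayesian update with its own noisy measurement, then averages
# natural parameters over its neighbourhood (illustrative combination rule).
n_nodes, theta, meas_var = 8, 4.2, 1.0
neighbours = {i: [(i - 1) % n_nodes, i, (i + 1) % n_nodes] for i in range(n_nodes)}

prec = np.full(n_nodes, 1e-3)                         # vague priors
eta = np.zeros(n_nodes)                               # eta = prec * mean

for step in range(50):
    # Adaptation: local Bayesian update with a fresh measurement.
    z = theta + rng.normal(0.0, np.sqrt(meas_var), n_nodes)
    prec = prec + 1.0 / meas_var
    eta = eta + z / meas_var
    # Combination: average natural parameters within each neighbourhood.
    prec = np.array([np.mean(prec[neighbours[i]]) for i in range(n_nodes)])
    eta = np.array([np.mean(eta[neighbours[i]]) for i in range(n_nodes)])

print("node means:", np.round(eta / prec, 3), "true:", theta)
```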

  13. Book review: Bayesian analysis for population ecology

    Science.gov (United States)

    Link, William A.

    2011-01-01

    Brian Dennis described the field of ecology as “fertile, uncolonized ground for Bayesian ideas.” He continued: “The Bayesian propagule has arrived at the shore. Ecologists need to think long and hard about the consequences of a Bayesian ecology. The Bayesian outlook is a successful competitor, but is it a weed? I think so.” (Dennis 2004)

  14. Reconstructing the Chronology of Supernovae: Determining Major Variations in the History of the Cosmic-ray Flux Incident on the Earth's Surface by Measuring the Concentration of 22Ne in Halite

    Science.gov (United States)

    Nahill, N. D.; Giegengack, R.; Lande, K.; Omar, G.

    2008-12-01

    We plan to measure the inventory of cosmogenically produced 22Ne atoms preserved in the mineral lattice of halite in deposits of rock salt, and to use that inventory to measure variations in the cosmic-ray flux to enable us to reconstruct the history of supernovae. Bedded rock salt consists almost entirely of the mineral halite (NaCl). Any neon trapped in the halite crystals during precipitation is primarily 20Ne, with a 22Ne concentration of 9% or less. Any neon resulting from cosmic-ray interactions with 23Na is solely 22Ne; therefore, 22Ne atoms in excess of 9% of the total neon are cosmogenic in origin. Measurement of the 22Ne inventory in halite from deposits covering a range of geologic ages may enable us to document the systematic growth of 22Ne through geologic time and, thus, establish the cosmic-ray flux and a chronology of supernovae. The cosmic-ray flux is attenuated in direct proportion to the mass of material overlying a halite deposit. To adjust the 22Ne inventory to account for that attenuation, we must reconstruct the post-depositional history of accumulation and removal of superjacent sediment for each halite deposit we study. As an example of our procedure, we reconstruct here the shielding history of the Permian halite deposit, the Salado Formation, Delaware Basin, New Mexico. The stratigraphy of the Delaware Basin has been well documented via exploration and production wells drilled in search of oil and gas, exploration boreholes associated with potash mining, and comprehensive geologic site assessment of the DOE Waste Isolation Pilot Plant (WIPP). WIPP is a subsurface repository for the permanent disposal of transuranic wastes, located in southeastern New Mexico, 42 km east of Carlsbad and approximately 655 m beneath the surface in the Salado Fm. The Salado Fm is part of the Late Permian Ochoan Series, and consists of 1) a lower member, 2) the McNutt Potash Zone, and 3) an upper member. WIPP lies between marker bed (MB)139 and MB136 in the

  15. Marine Environmental History

    DEFF Research Database (Denmark)

    Poulsen, Bo

    2012-01-01

    This essay provides an overview of recent trends in the historiography of marine environmental history, a sub-field of environmental history which has grown tremendously in scope and size over the last c. 15 years. The object of marine environmental history is the changing relationship between human society and natural marine resources. Within this broad topic, several trends and objectives are discernible. The essay argues that so-called material marine environmental history has as its main focus the attempt to reconstruct the presence, development and environmental impact of past fisheries and whaling operations. This ambition often entails a reconstruction also of how marine life has changed over time. The time frame ranges from the Paleolithic to the present era. The field of marine environmental history also includes a more culturally oriented environmental history, which mainly has come...

  16. Current trends in Bayesian methodology with applications

    CERN Document Server

    Upadhyay, Satyanshu K; Dey, Dipak K; Loganathan, Appaia

    2015-01-01

    Collecting Bayesian material scattered throughout the literature, Current Trends in Bayesian Methodology with Applications examines the latest methodological and applied aspects of Bayesian statistics. The book covers biostatistics, econometrics, reliability and risk analysis, spatial statistics, image analysis, shape analysis, Bayesian computation, clustering, uncertainty assessment, high-energy astrophysics, neural networking, fuzzy information, objective Bayesian methodologies, empirical Bayes methods, small area estimation, and many more topics.Each chapter is self-contained and focuses on

  17. Irregular-Time Bayesian Networks

    CERN Document Server

    Ramati, Michael

    2012-01-01

    In many fields observations are performed irregularly along time, due to either measurement limitations or lack of a constant immanent rate. While discrete-time Markov models (as Dynamic Bayesian Networks) introduce either inefficient computation or an information loss to reasoning about such processes, continuous-time Markov models assume either a discrete state space (as Continuous-Time Bayesian Networks), or a flat continuous state space (as stochastic differential equations). To address these problems, we present a new modeling class called Irregular-Time Bayesian Networks (ITBNs), generalizing Dynamic Bayesian Networks, allowing substantially more compact representations, and increasing the expressivity of the temporal dynamics. In addition, a globally optimal solution is guaranteed when learning temporal systems, provided that they are fully observed at the same irregularly spaced time-points, and a semiparametric subclass of ITBNs is introduced to allow further adaptation to the irregular nature of t...

  18. Inherently irrational? A computational model of escalation of commitment as Bayesian Updating.

    Science.gov (United States)

    Gilroy, Shawn P; Hantula, Donald A

    2016-06-01

    Monte Carlo simulations were performed to analyze the degree to which two-, three- and four-step learning histories of losses and gains correlated with escalation and persistence in extended extinction (continuous loss) conditions. Simulated learning histories were randomly generated at varying lengths and compositions, and warranted probabilities were determined using Bayesian Updating methods. Bayesian Updating predicted instances where particular learning sequences were more likely to engender escalation and persistence under extinction conditions. All simulations revealed greater rates of escalation and persistence in the presence of heterogeneous (e.g., both Wins and Losses) lag sequences, with substantially increased rates of escalation when lags composed predominantly of losses were followed by wins. These methods were then applied to human investment choices from earlier experiments. The Bayesian Updating models corresponded with data obtained from these experiments. These findings suggest that Bayesian Updating can be utilized as a model for understanding how and when individual commitment may escalate and persist despite continued failures.
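
    A minimal stand-in for "warranted probabilities" is conjugate Beta-Bernoulli updating over a win/loss history. Note that this simple model is exchangeable, so it captures the composition of a history but not the sequence effects the paper reports; the threshold decision rule below is also invented for illustration.

```python
# Beta-Bernoulli updating over a win (1) / loss (0) history: start from a
# uniform Beta(1, 1) prior on the success probability and "escalate" while
# the posterior mean stays above a persistence threshold. All numbers and
# the decision rule are hypothetical.
def posterior_mean(history, a=1.0, b=1.0):
    wins = sum(history)
    return (a + wins) / (a + b + len(history))

def simulate(history, extinction_len=10, threshold=0.3):
    # Count how many consecutive losses the agent tolerates before quitting.
    persisted = 0
    for _ in range(extinction_len):
        if posterior_mean(history) < threshold:
            break
        history = history + [0]                       # continuous-loss condition
        persisted += 1
    return persisted

print("two wins, two losses:", simulate([1, 1, 0, 0]))   # persists for a while
print("all losses          :", simulate([0, 0, 0, 0]))   # quits immediately
```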

  19. Neuronanatomy, neurology and Bayesian networks

    OpenAIRE

    Bielza Lozoya, Maria Concepcion

    2014-01-01

    Bayesian networks are data mining models with clear semantics and a sound theoretical foundation. In this keynote talk we will pinpoint a number of neuroscience problems that can be addressed using Bayesian networks. In neuroanatomy, we will show computer simulation models of dendritic trees and classification of neuron types, both based on morphological features. In neurology, we will present the search for genetic biomarkers in Alzheimer's disease and the prediction of health-related qualit...

  20. Analysis of trace elements in the shells of short-necked clam Ruditapes philippinarum (Mollusca: Bivalvia) with respect to reconstruction of individual life history

    International Nuclear Information System (INIS)

    Strontium (Sr) concentration in the shells of short-necked clams collected at different locations (Shirahama, a warm area, and Maizuru, a cold area, Japan) was analyzed by two methods, PIXE and EPMA. The Sr concentration of the external surface of the shell umbo, which is formed during a short term at the early benthic phase, was analyzed by PIXE and ranged from 1000 to 3500 ppm among individuals. The Sr concentration of clams collected at Shirahama showed a positive correlation with shell length (SL) in individuals with SL < 31 mm, whereas clams collected at Maizuru did not show a significant correlation. This result may arise from the difference in spawning seasons between the two areas. The Sr concentration of the cross section of the shell umbo, which thickens continuously during life to form a faint stratum structure, was analyzed by EPMA along a line across the stratum structure. Some surges and long-term waving patterns of the Sr concentration were observed. These results suggest that the life histories of individual clams could be recorded in the shell umbo cross sections as variations of trace elements, and that analyses of trace elements could clarify the histories of individual clams. (author)

  1. Analysis of trace elements in the shells of short-necked clam Ruditapes philippinarum (Mollusca: Bivalvia) with respect to reconstruction of individual life history

    Energy Technology Data Exchange (ETDEWEB)

    Arakawa, Jumpei; Sakamoto, Wataru [Division of Applied Biosciences, Graduate School of Agriculture, Kyoto Univ., Kyoto (Japan)

    1998-07-01

    Strontium (Sr) concentration in the shells of short-necked clams collected at different locations (Shirahama, a warm area, and Maizuru, a cold area, Japan) was analyzed by two methods, PIXE and EPMA. The Sr concentration of the external surface of the shell umbo, which is formed during a short term at the early benthic phase, was analyzed by PIXE and ranged from 1000 to 3500 ppm among individuals. The Sr concentration of clams collected at Shirahama showed a positive correlation with shell length (SL) in individuals with SL < 31 mm, whereas clams collected at Maizuru did not show a significant correlation. This result may arise from the difference in spawning seasons between the two areas. The Sr concentration of the cross section of the shell umbo, which thickens continuously during life to form a faint stratum structure, was analyzed by EPMA along a line across the stratum structure. Some surges and long-term waving patterns of the Sr concentration were observed. These results suggest that the life histories of individual clams could be recorded in the shell umbo cross sections as variations of trace elements, and that analyses of trace elements could clarify the histories of individual clams. (author)

  2. Unfavourable results in thumb reconstruction

    Directory of Open Access Journals (Sweden)

    Samir M Kumta

    2013-01-01

    The history of thumb reconstruction parallels the history of hand surgery. The attributes that make the thumb unique, and that the reconstructive surgeon must assess and try to restore when reconstructing a thumb, are: position, stability, strength, length, motion, sensibility and appearance. Deficiency in any of these attributes can reduce the utility of the reconstructed thumb. A detailed assessment of the patient and his requirements needs to be performed before embarking on a thumb reconstruction. Most unsatisfactory results can be attributed to a wrong choice of procedure. Component defects of the thumb are commonly treated with tissue from adjacent fingers, the hand or the forearm. With refinements in microsurgery, the foot has become a major source of tissue for component replacement in the thumb. Bone lengthening, osteoplastic reconstruction, pollicisation and toe-to-hand transfers are the commonest methods of thumb reconstruction. Unfavourable results can be classified as functional and aesthetic. Some are common to all types of procedures; however, each type of reconstruction has its own unique set of problems. Meticulous planning and execution are essential to produce an aesthetic and functionally useful thumb. Secondary surgeries such as tendon transfers, bone grafting, debulking and arthrodesis may be required to correct deficiencies in the reconstruction. Attention needs to be paid to the donor site as well.

  3. Delayed breast implant reconstruction

    DEFF Research Database (Denmark)

    Hvilsom, Gitte B.; Hölmich, Lisbet R.; Steding-Jessen, Marianne;

    2012-01-01

    We evaluated the association between radiation therapy and severe capsular contracture or reoperation after 717 delayed breast implant reconstruction procedures (288 1-stage and 429 2-stage procedures) identified in the prospective database of the Danish Registry for Plastic Surgery of the Breast during ... A history of radiation therapy was associated with a non-significantly increased risk of reoperation after both 1-stage (HR = 1.4; 95% CI: 0.7-2.5) and 2-stage (HR = 1.6; 95% CI: 0.9-3.1) procedures. Reconstruction failure was highest (13.2%) in the 2-stage procedures with a history of radiation therapy. Breast...

  4. Bayesian Interpretations of Heteroskedastic Consistent Covariance Estimators Using the Informed Bayesian Bootstrap

    OpenAIRE

    Dale Poirier

    2008-01-01

    This paper provides Bayesian rationalizations for White’s heteroskedastic consistent (HC) covariance estimator and various modifications of it. An informed Bayesian bootstrap provides the statistical framework.

  5. Dynamic Batch Bayesian Optimization

    CERN Document Server

    Azimi, Javad; Fern, Xiaoli

    2011-01-01

    Bayesian optimization (BO) algorithms try to optimize an unknown function that is expensive to evaluate using a minimum number of evaluations/experiments. Most of the proposed algorithms in BO are sequential, where only one experiment is selected at each iteration. This can be time-inefficient when each experiment takes a long time and more than one experiment can be run concurrently. On the other hand, requesting a fixed-size batch of experiments at each iteration causes performance inefficiency in BO compared to sequential policies. In this paper, we present an algorithm that asks for a batch of experiments at each time step t, where the batch size p_t is dynamically determined at each step. Our algorithm is based on the observation that the sequence of experiments selected by the sequential policy can sometimes be almost independent from each other. Our algorithm identifies such scenarios and requests those experiments at the same time without degrading the performance. We evaluate our proposed method us...

  6. Nonparametric Bayesian Classification

    CERN Document Server

    Coram, M A

    2002-01-01

    A Bayesian approach to the classification problem is proposed in which random partitions play a central role. It is argued that the partitioning approach has the capacity to take advantage of a variety of large-scale spatial structures, if they are present in the unknown regression function $f_0$. An idealized one-dimensional problem is considered in detail. The proposed nonparametric prior uses random split points to partition the unit interval into a random number of pieces. This prior is found to provide a consistent estimate of the regression function in the $L^p$ topology, for any $1 \leq p < \infty$, and for arbitrary measurable $f_0:[0,1] \rightarrow [0,1]$. A Markov chain Monte Carlo (MCMC) implementation is outlined and analyzed. Simulation experiments are conducted to show that the proposed estimate compares favorably with a variety of conventional estimators. A striking resemblance between the posterior mean estimate and the bagged CART estimate is noted and discussed. For higher dimensions, a ...

  7. Bayesian Kinematic Finite Fault Source Models (Invited)

    Science.gov (United States)

    Minson, S. E.; Simons, M.; Beck, J. L.

    2010-12-01

    Finite fault earthquake source models are inherently under-determined: there is no unique solution to the inverse problem of determining the rupture history at depth as a function of time and space when our data are only limited observations at the Earth's surface. Traditional inverse techniques rely on model constraints and regularization to generate one model from the possibly broad space of all possible solutions. However, Bayesian methods allow us to determine the ensemble of all possible source models which are consistent with the data and our a priori assumptions about the physics of the earthquake source. Until now, Bayesian techniques have been of limited utility because they are computationally intractable for problems with as many free parameters as kinematic finite fault models. We have developed a methodology called Cascading Adaptive Tempered Metropolis In Parallel (CATMIP) which allows us to sample very high-dimensional problems in a parallel computing framework. The CATMIP algorithm combines elements of simulated annealing and genetic algorithms with the Metropolis algorithm to dynamically optimize the algorithm's efficiency as it runs. We will present synthetic performance tests of finite fault models made with this methodology as well as a kinematic source model for the 2007 Mw 7.7 Tocopilla, Chile earthquake. This earthquake was well recorded by multiple ascending and descending interferograms and a network of high-rate GPS stations whose records can be used as near-field seismograms.

  8. History of human activity in last 800 years reconstructed from combined archive data and high-resolution analyses of varved lake sediments from Lake Czechowskie, Northern Poland

    Science.gov (United States)

    Słowiński, Michał; Tyszkowski, Sebastian; Ott, Florian; Obremska, Milena; Kaczmarek, Halina; Theuerkauf, Martin; Wulf, Sabine; Brauer, Achim

    2016-04-01

    The aim of the study was to reconstruct human and landscape development in the Tuchola Pinewoods (Northern Poland) during the last 800 years. We apply an approach that combines historic maps and documents with pollen data. Pollen data were obtained from varved lake sediments at a resolution of 5 years. The chronology of the sediment record is based on varve counting, AMS 14C dating, 137Cs activity concentration measurements and tephrochronology (Askja AD 1875). We applied the REVEALS model to translate pollen percentage data into regional plant abundances. The interpretation of the pollen record is furthermore based on pollen accumulation rate data. The pollen record and historic documents show similar trends in vegetation development. During the first phase (AD 1200-1412), the Lake Czechowskie area was still largely forested with Quercus, Carpinus and Pinus forests. Vegetation was more open during the second phase (AD 1412-1776), and reached maximum openness during the third phase (AD 1776-1905). Furthermore, intensified forest management led to a transformation from mixed to pine-dominated forests during this period. Since the early 20th century, the forest cover increased again, with Scots pine dominating the stands. While pollen and historic data show similar trends, they differ substantially in the degree of openness during the four phases, with pollen data commonly suggesting more open conditions. We discuss potential causes for this discrepancy, which include unsuitable parameter settings in REVEALS and unknown changes in forest structure. Using pollen accumulation data as a third proxy record, we aim to identify the most probable causes. Finally, we discuss the observed vegetation change in relation to the socio-economic development of the area. This study is a contribution to the Virtual Institute of Integrated Climate and Landscape Evolution Analysis (ICLEA) of the Helmholtz Association and the National Science Centre, Poland (grant No. 2011/01/B/ST10

  9. Approximate Bayesian Image Interpretation using Generative Probabilistic Graphics Programs

    OpenAIRE

    Mansinghka, Vikash K.; Kulkarni, Tejas D.; Perov, Yura N.; Tenenbaum, Joshua B.

    2013-01-01

    The idea of computer vision as the Bayesian inverse problem to computer graphics has a long history and an appealing elegance, but it has proved difficult to directly implement. Instead, most vision tasks are approached via complex bottom-up processing pipelines. Here we show that it is possible to write short, simple probabilistic graphics programs that define flexible generative models and to automatically invert them to interpret real-world images. Generative probabilistic graphics program...

  10. Research of Gene Regulatory Network with Multi-Time Delay Based on Bayesian Network

    Institute of Scientific and Technical Information of China (English)

    LIU Bei; MENG Fanjiang; LI Yong; LIU Liyan

    2008-01-01

    The gene regulatory network was reconstructed from time-series microarray data obtained by hybridization of gene chips at different time points, in order to analyze coordination and restriction between genes. An algorithm for modelling the gene expression regulatory network of the whole cell was designed using a Bayesian network, which provides an effective aid for the analysis of gene regulatory networks.

  11. Sparse Bayesian Learning for DOA Estimation with Mutual Coupling

    Directory of Open Access Journals (Sweden)

    Jisheng Dai

    2015-10-01

    Sparse Bayesian learning (SBL) has given renewed interest to the problem of direction-of-arrival (DOA) estimation. It is generally assumed that the measurement matrix in SBL is precisely known. Unfortunately, this assumption may be invalid in practice due to the imperfect manifold caused by unknown or misspecified mutual coupling. This paper describes a modified SBL method for joint estimation of DOAs and mutual coupling coefficients with uniform linear arrays (ULAs). Unlike the existing method that only uses stationary priors, our new approach utilizes a hierarchical form of the Student t prior to enforce the sparsity of the unknown signal more heavily. We also provide a distinct Bayesian inference for the expectation-maximization (EM) algorithm, which can update the mutual coupling coefficients more efficiently. Another difference is that our method uses an additional singular value decomposition (SVD) to reduce the computational complexity of the signal reconstruction process and the sensitivity to the measurement noise.
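
    The stationary-prior baseline that this paper extends is the classic SBL/relevance-vector-machine loop: alternate between the Gaussian posterior of the weights and EM updates of the per-weight precisions, pruning weights whose precisions diverge. The sketch below runs that baseline on a generic sparse recovery problem; it does not include the mutual-coupling estimation, the Student-t hierarchy or the SVD step.

```python
import numpy as np

rng = np.random.default_rng(8)

# Bare-bones sparse Bayesian learning for a generic sparse linear model
# y = Phi @ w + noise, with one precision hyperparameter alpha_i per weight.
n, m, k = 100, 50, 4
Phi = rng.normal(size=(m, n))
w_true = np.zeros(n)
w_true[rng.choice(n, k, replace=False)] = rng.normal(0.0, 2.0, k)
noise_var = 0.01
y = Phi @ w_true + rng.normal(0.0, np.sqrt(noise_var), m)

alpha = np.ones(n)                                    # prior precisions
for _ in range(100):
    Sigma = np.linalg.inv(Phi.T @ Phi / noise_var + np.diag(alpha))
    mu = Sigma @ Phi.T @ y / noise_var                # posterior mean of the weights
    alpha = 1.0 / (mu ** 2 + np.diag(Sigma))          # EM update; large alpha => pruned
    alpha = np.minimum(alpha, 1e10)                   # numerical guard

support = np.nonzero(alpha < 1e4)[0]
print("recovered support:", support, "true:", np.sort(np.nonzero(w_true)[0]))
```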

  12. Evaluation of Bayesian tensor estimation using tensor coherence

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Dae-Jin; Park, Hae-Jeong [Laboratory of Molecular Neuroimaging Technology, Brain Korea 21 Project for Medical Science, Yonsei University, College of Medicine, Seoul (Korea, Republic of); Kim, In-Young [Department of Biomedical Engineering, Hanyang University, Seoul (Korea, Republic of); Jeong, Seok-Oh [Department of Statistics, Hankuk University of Foreign Studies, Yongin (Korea, Republic of)], E-mail: parkhj@yuhs.ac

    2009-06-21

    Fiber tractography, a unique and non-invasive method to estimate axonal fibers within white matter, constructs the putative streamlines from diffusion tensor MRI by interconnecting voxels according to the propagation direction defined by the diffusion tensor. This direction has uncertainties due to the properties of underlying fiber bundles, neighboring structures and image noise. Therefore, robust estimation of the diffusion direction is essential to reconstruct reliable fiber pathways. For this purpose, we propose a tensor estimation method using a Bayesian framework, which includes an a priori probability distribution based on tensor coherence indices, to utilize both the neighborhood direction information and the inertia moment as regularization terms. The reliability of the proposed tensor estimation was evaluated using Monte Carlo simulations in terms of accuracy and precision with four synthetic tensor fields at various SNRs and in vivo human data of brain and calf muscle. Proposed Bayesian estimation demonstrated the relative robustness to noise and the higher reliability compared to the simple tensor regression.

  13. Bayesian tomography and integrated data analysis in fusion diagnostics

    Science.gov (United States)

    Li, Dong; Dong, Y. B.; Deng, Wei; Shi, Z. B.; Fu, B. Z.; Gao, J. M.; Wang, T. B.; Zhou, Yan; Liu, Yi; Yang, Q. W.; Duan, X. R.

    2016-11-01

    In this article, a Bayesian tomography method using a non-stationary Gaussian process as a prior is introduced. The Bayesian formalism allows quantities which bear uncertainty to be expressed in probabilistic form, so that the uncertainty of a final solution can be fully resolved from the confidence interval of a posterior probability. Moreover, a consistency check of that solution can be performed by checking whether the misfits between predicted and measured data lie reasonably within an assumed data error. In particular, the accuracy of reconstructions is significantly improved by using the non-stationary Gaussian process, which can adapt to the varying smoothness of the emission distribution. The method has been implemented for a soft X-ray diagnostic on HL-2A and used to explore relevant physics in equilibrium and MHD instability modes. This project is carried out within a large-scale inference framework, aiming at an integrated analysis of heterogeneous diagnostics.

  14. Bayesian compressive sensing for ultrawideband inverse scattering in random media

    CERN Document Server

    Fouda, A E

    2014-01-01

    We develop an ultrawideband (UWB) inverse scattering technique for reconstructing continuous random media based on Bayesian compressive sensing. In addition to providing maximum a posteriori estimates of the unknown weights, Bayesian inversion provides estimate of the confidence level of the solution, as well as a systematic approach for optimizing subsequent measurement(s) to maximize information gain. We impose sparsity priors directly on spatial harmonics to exploit the spatial correlation exhibited by continuous media, and solve for their posterior probability density functions efficiently using a fast relevance vector machine. We linearize the problem using the first-order Born approximation which enables us to combine, in a single inversion, measurements from multiple transmitters and ultrawideband frequencies. We extend the method to high-contrast media using the distorted-Born iterative method. We apply time-reversal strategies to adaptively focus the inversion effort onto subdomains of interest, and ...

  15. Investigation of limited-view image reconstruction in optoacoustic tomography employing a priori structural information

    Science.gov (United States)

    Huang, Chao; Oraevsky, Alexander A.; Anastasio, Mark A.

    2010-08-01

    Optoacoustic tomography (OAT) is an emerging ultrasound-mediated biophotonic imaging modality that has exciting potential for many biomedical imaging applications. There is great interest in conducting B-mode ultrasound and OAT imaging studies for breast cancer detection using a common transducer. In this situation, the range of tomographic view angles is limited, which can result in distortions in the reconstructed OAT image if conventional reconstruction algorithms are applied to limited-view measurement data. In this work, we investigate an image reconstruction method that utilizes information regarding target boundaries to improve the quality of the reconstructed OAT images. This is accomplished by developing a boundary-constrained image reconstruction algorithm for OAT based on Bayesian image reconstruction theory. The computer-simulation studies demonstrate that the Bayesian approach can effectively reduce the artifact and noise levels and preserve the edges in reconstructed limited-view OAT images as compared to those produced by a conventional OAT reconstruction algorithm.

  16. History of land use in India during 1880-2010: Large-scale land transformations reconstructed from satellite data and historical archives

    Science.gov (United States)

    Tian, Hanqin; Banger, Kamaljit; Bo, Tao; Dadhwal, Vinay K.

    2014-10-01

    In India, the human population has increased six-fold from 200 million to 1200 million, which, coupled with economic growth, has resulted in significant land use and land cover (LULC) changes during 1880-2010. However, large discrepancies in the existing LULC datasets have hindered our efforts to better understand interactions among human activities, climate systems, and ecosystems in India. In this study, we incorporated high-resolution remote sensing datasets from Resourcesat-1 and historical archives at district (N = 590) and state (N = 30) levels to generate LULC datasets at 5 arc minute resolution during 1880-2010 in India. Results have shown that a significant loss of forests (from 89 million ha to 63 million ha) occurred during the study period. Interestingly, the deforestation rate was relatively greater under British rule (1880-1950s) and in the early decades after independence, and then decreased after the 1980s due to government policies to protect the forests. In contrast to forests, cropland area increased from 92 million ha to 140.1 million ha during 1880-2010. Greater cropland expansion occurred during the 1950-1980s, which coincided with the period of farm mechanization, electrification, and introduction of high-yielding crop varieties as a result of government policies to achieve self-sufficiency in food production. The rate of urbanization was slower during 1880-1940 but increased significantly after the 1950s, probably due to rapid increases in population and economic growth in India. Our study provides the most reliable estimates of historical LULC at regional scale in India. This is the first attempt to incorporate newly developed high-resolution remote sensing datasets and inventory archives to reconstruct the time series of LULC records for such a long period in India. The spatial and temporal information on LULC derived from this study could be used by ecosystem, hydrological, and climate modeling as well as by policy makers for assessing the

  17. Reconstruction of the Category System of the History of Chinese Psychological Thought

    Institute of Scientific and Technical Information of China (English)

    彭彦琴

    2001-01-01

    Further elevation of the theoretical level of the history of Chinese psychological thought calls for the construction of a new category system. This paper analyses the shortcomings of existing accounts of categories, proposes several principles for category construction, and finally explains in detail a category system for the history of Chinese psychological thought built on human nature as the meta-category, set against the background of the oneness of heaven and man; this system displays its internal logical vein, profound implications and distinct character.

  18. Bayesian seismic AVO inversion

    Energy Technology Data Exchange (ETDEWEB)

    Buland, Arild

    2002-07-01

    A new linearized AVO inversion technique is developed in a Bayesian framework. The objective is to obtain posterior distributions for P-wave velocity, S-wave velocity and density. Distributions for other elastic parameters can also be assessed, for example acoustic impedance, shear impedance and P-wave to S-wave velocity ratio. The inversion algorithm is based on the convolutional model and a linearized weak contrast approximation of the Zoeppritz equation. The solution is represented by a Gaussian posterior distribution with explicit expressions for the posterior expectation and covariance, hence exact prediction intervals for the inverted parameters can be computed under the specified model. The explicit analytical form of the posterior distribution provides a computationally fast inversion method. Tests on synthetic data show that all inverted parameters were almost perfectly retrieved when the noise approached zero. With realistic noise levels, acoustic impedance was the best determined parameter, while the inversion provided practically no information about the density. The inversion algorithm has also been tested on a real 3-D dataset from the Sleipner Field. The results show good agreement with well logs but the uncertainty is high. The stochastic model includes uncertainties of both the elastic parameters, the wavelet and the seismic and well log data. The posterior distribution is explored by Markov chain Monte Carlo simulation using the Gibbs sampler algorithm. The inversion algorithm has been tested on a seismic line from the Heidrun Field with two wells located on the line. The uncertainty of the estimated wavelet is low. In the Heidrun examples the effect of including uncertainty of the wavelet and the noise level was marginal with respect to the AVO inversion results. We have developed a 3-D linearized AVO inversion method with spatially coupled model parameters where the objective is to obtain posterior distributions for P-wave velocity, S
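
    The closed-form posterior that makes this kind of approach fast is the standard linear-Gaussian update. The sketch below shows it for a generic linear forward model standing in as a placeholder for the convolution/weak-contrast operator; the prior and noise covariances are invented, and the exact prediction intervals come straight from the posterior covariance, as the abstract describes.

```python
import numpy as np

rng = np.random.default_rng(9)

# Generic linear-Gaussian inversion d = G m + e with a Gaussian prior on m:
# the posterior is Gaussian with closed-form mean and covariance. G here is
# a random stand-in, not a convolution/Zoeppritz operator.
n_par, n_obs = 30, 60
G = rng.normal(size=(n_obs, n_par))
m_prior = np.zeros(n_par)
C_m = np.eye(n_par)                                   # prior covariance
C_e = 0.1 * np.eye(n_obs)                             # noise covariance

m_true = rng.multivariate_normal(m_prior, C_m)
d = G @ m_true + rng.multivariate_normal(np.zeros(n_obs), C_e)

# Posterior expectation and covariance (explicit, no iteration needed).
K = C_m @ G.T @ np.linalg.inv(G @ C_m @ G.T + C_e)
m_post = m_prior + K @ (d - G @ m_prior)
C_post = C_m - K @ G @ C_m

sd = np.sqrt(np.diag(C_post))                         # exact prediction intervals
coverage = np.mean(np.abs(m_post - m_true) < 1.96 * sd)
print(f"95% interval coverage on this draw: {coverage:.2f}")
```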

  19. The subjectivity of scientists and the Bayesian statistical approach

    CERN Document Server

    Press, James S

    2001-01-01

    Comparing and contrasting the reality of subjectivity in the work of history's great scientists and the modern Bayesian approach to statistical analysis. Scientists and researchers are taught to analyze their data from an objective point of view, allowing the data to speak for themselves rather than assigning them meaning based on expectations or opinions. But scientists have never behaved fully objectively. Throughout history, some of our greatest scientific minds have relied on intuition, hunches, and personal beliefs to make sense of empirical data - and these subjective influences have often a

  20. Penile reconstruction

    Institute of Scientific and Technical Information of China (English)

    Giulio Garaffa; Salvatore Sansalone; David J Ralph

    2013-01-01

    During the most recent years, a variety of new techniques of penile reconstruction have been described in the literature. This paper focuses on the most recent advances in male genital reconstruction after trauma, excision of benign and malignant disease, in gender reassignment surgery and in aphallia, with emphasis on surgical technique, cosmetic and functional outcome.

  1. Attention in a bayesian framework

    DEFF Research Database (Denmark)

    Whiteley, Louise Emma; Sahani, Maneesh

    2012-01-01

    The behavioral phenomena of sensory attention are thought to reflect the allocation of a limited processing resource, but there is little consensus on the nature of the resource or why it should be limited. Here we argue that a fundamental bottleneck emerges naturally within Bayesian models...... of perception, and use this observation to frame a new computational account of the need for, and action of, attention - unifying diverse attentional phenomena in a way that goes beyond previous inferential, probabilistic and Bayesian models. Attentional effects are most evident in cluttered environments......, and include both selective phenomena, where attention is invoked by cues that point to particular stimuli, and integrative phenomena, where attention is invoked dynamically by endogenous processing. However, most previous Bayesian accounts of attention have focused on describing relatively simple experimental...

  2. Probability biases as Bayesian inference

    Directory of Open Access Journals (Sweden)

    Andre C. R. Martins

    2006-11-01

    Full Text Available In this article, I will show how several observed biases in human probabilistic reasoning can be partially explained as good heuristics for making inferences in an environment where probabilities have uncertainties associated with them. Previous results show that the weight functions and the observed violations of coalescing and stochastic dominance can be understood from a Bayesian point of view. We will review those results and see that Bayesian methods should also be used as part of the explanation behind other known biases. That means that, although the observed errors are still errors under the conditions in which they are observed, they can be understood as adaptations to the solution of real-life problems. Heuristics that allow fast evaluations and mimic a Bayesian inference would be an evolutionary advantage, since they would give us an efficient way of making decisions. In that sense, it should be no surprise that humans reason with probability as it has been observed.

  3. Bayesian Methods and Universal Darwinism

    CERN Document Server

    Campbell, John

    2010-01-01

    Bayesian methods since the time of Laplace have been understood by their practitioners as closely aligned to the scientific method. Indeed a recent champion of Bayesian methods, E. T. Jaynes, titled his textbook on the subject Probability Theory: the Logic of Science. Many philosophers of science including Karl Popper and Donald Campbell have interpreted the evolution of Science as a Darwinian process consisting of a 'copy with selective retention' algorithm abstracted from Darwin's theory of Natural Selection. Arguments are presented for an isomorphism between Bayesian Methods and Darwinian processes. Universal Darwinism, as the term has been developed by Richard Dawkins, Daniel Dennett and Susan Blackmore, is the collection of scientific theories which explain the creation and evolution of their subject matter as due to the operation of Darwinian processes. These subject matters span the fields of atomic physics, chemistry, biology and the social sciences. The principle of Maximum Entropy states that system...

  4. Reconstruction of the Earthquake History of Limestone Fault Scarps in Knidos Fault Zone Using in-situ Chlorine-36 Exposure Dating and "R" Programming Language

    Science.gov (United States)

    Sahin, Sefa; Yildirim, Cengiz; Akif Sarikaya, Mehmet; Tuysuz, Okan; Genc, S. Can; Ersen Aksoy, Murat; Ertekin Doksanalti, Mustafa

    2016-04-01

    Cosmogenic surface exposure dating is based on the production of rare nuclides in exposed rocks, which interact with cosmic rays. Through modelling of measured 36Cl concentrations, we can obtain information about the history of earthquake activity. Yet several factors may affect the production of rare nuclides, such as the geometry of the fault, topography, the geographic location of the study area, temporal variations of the Earth's magnetic field, self-cover, and the denudation rate on the scarp. Recently developed models provide a method to infer the timing of earthquakes and slip rates on limited scales by taking these parameters into account. Our study area, the Knidos Fault Zone, is located on the Datça Peninsula in Southwestern Anatolia and contains several normal fault scarps formed within limestone, which are appropriate for cosmogenic chlorine-36 (36Cl) dating models. Since it has a well-preserved scarp, we have focused on the Mezarlık Segment of the fault zone, which has an average length of 300 m and a height of 12-15 m. 128 continuous samples were collected from the top to the bottom of the fault scarp for analysis of cosmogenic 36Cl isotope concentrations. The main purpose of this study is to analyze the factors affecting the production rates and the amount of cosmogenic 36Cl nuclide concentration. Concentrations of 36Cl isotopes are measured by AMS laboratories. From the local production rates and the concentrations of the cosmic isotopes, we can calculate exposure ages of the samples. Recent research elucidated each step of the application of this method in the Matlab programming language (e.g. Schlagenhauf et al., 2010), which is vitally helpful for generating models of Quaternary activity of normal faults. We, however, wanted to build a user-friendly program in the open source programming language "R" (GNU Project) that might help those without knowledge of complex math programming, making calculations as easy and understandable as
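    At its core, the age calculation inverts the radioactive build-up equation N(t) = (P/λ)(1 - e^(-λt)). A minimal sketch of that zero-erosion case is shown below; it ignores denudation, self-cover, shielding, and the other corrections discussed above, which the full models handle, and the concentration and production-rate numbers are purely illustrative, not values from the Knidos samples:

```python
import numpy as np

CL36_HALF_LIFE_YR = 3.01e5                  # half-life of 36Cl, years
LAMBDA = np.log(2) / CL36_HALF_LIFE_YR      # decay constant, 1/yr

def exposure_age(N, P):
    """Zero-erosion exposure age (yr) from a measured 36Cl concentration N
    (atoms/g) and a local production rate P (atoms/g/yr), inverting
    N(t) = (P / lambda) * (1 - exp(-lambda * t))."""
    return -np.log(1.0 - N * LAMBDA / P) / LAMBDA

# Illustrative numbers only:
print(exposure_age(N=5.0e4, P=20.0))   # about 2.5 kyr for these values
```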

  5. Bayesian modeling using WinBUGS

    CERN Document Server

    Ntzoufras, Ioannis

    2009-01-01

    A hands-on introduction to the principles of Bayesian modeling using WinBUGS Bayesian Modeling Using WinBUGS provides an easily accessible introduction to the use of WinBUGS programming techniques in a variety of Bayesian modeling settings. The author provides an accessible treatment of the topic, offering readers a smooth introduction to the principles of Bayesian modeling with detailed guidance on the practical implementation of key principles. The book begins with a basic introduction to Bayesian inference and the WinBUGS software and goes on to cover key topics, including: Markov Chain Monte Carlo algorithms in Bayesian inference Generalized linear models Bayesian hierarchical models Predictive distribution and model checking Bayesian model and variable evaluation Computational notes and screen captures illustrate the use of both WinBUGS as well as R software to apply the discussed techniques. Exercises at the end of each chapter allow readers to test their understanding of the presented concepts and all ...

  6. Bayesian test and Kuhn's paradigm

    Institute of Scientific and Technical Information of China (English)

    Chen Xiaoping

    2006-01-01

    Kuhn's theory of paradigm reveals a pattern of scientific progress, in which normal science alternates with scientific revolution. But Kuhn underrated too much the function of scientific test in his pattern, because he focuses all his attention on the hypothetico-deductive schema instead of the Bayesian schema. This paper employs the Bayesian schema to re-examine Kuhn's theory of paradigm, to uncover its logical and rational components, and to illustrate the tensional structure of logic and belief, rationality and irrationality, in the process of scientific revolution.

  7. Perception, illusions and Bayesian inference.

    Science.gov (United States)

    Nour, Matthew M; Nour, Joseph M

    2015-01-01

    Descriptive psychopathology makes a distinction between veridical perception and illusory perception. In both cases a perception is tied to a sensory stimulus, but in illusions the perception is of a false object. This article re-examines this distinction in light of new work in theoretical and computational neurobiology, which views all perception as a form of Bayesian statistical inference that combines sensory signals with prior expectations. Bayesian perceptual inference can solve the 'inverse optics' problem of veridical perception and provides a biologically plausible account of a number of illusory phenomena, suggesting that veridical and illusory perceptions are generated by precisely the same inferential mechanisms.
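    A one-dimensional Gaussian example captures the flavor of such perceptual inference: the percept is a precision-weighted compromise between a prior expectation and noisy sensory evidence. This is an illustrative sketch only, not one of the specific models reviewed in the article:

```python
def combine_cues(prior_mean, prior_var, obs, obs_var):
    """Posterior for a Gaussian prior combined with a noisy Gaussian signal."""
    w = prior_var / (prior_var + obs_var)     # weight on the sensory evidence
    post_mean = (1 - w) * prior_mean + w * obs
    post_var = 1.0 / (1.0 / prior_var + 1.0 / obs_var)
    return post_mean, post_var

# A strong prior plus a noisy signal pulls the percept toward the prior,
# which is one way illusions can arise from normatively correct inference.
print(combine_cues(prior_mean=0.0, prior_var=1.0, obs=3.0, obs_var=4.0))
```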

  8. 3D Bayesian contextual classifiers

    DEFF Research Database (Denmark)

    Larsen, Rasmus

    2000-01-01

    We extend a series of multivariate Bayesian 2-D contextual classifiers to 3-D by specifying a simultaneous Gaussian distribution for the feature vectors as well as a prior distribution of the class variables of a pixel and its 6 nearest 3-D neighbours.

  9. Bayesian methods for proteomic biomarker development

    Directory of Open Access Journals (Sweden)

    Belinda Hernández

    2015-12-01

    In this review we provide an introduction to Bayesian inference and demonstrate some of the advantages of using a Bayesian framework. We summarize how Bayesian methods have been used previously in proteomics and other areas of bioinformatics. Finally, we describe some popular and emerging Bayesian models from the statistical literature and provide a worked tutorial including code snippets to show how these methods may be applied for the evaluation of proteomic biomarkers.

  10. A Bayesian Network View on Nested Effects Models

    Directory of Open Access Journals (Sweden)

    Fröhlich Holger

    2009-01-01

    Full Text Available Nested effects models (NEMs) are a class of probabilistic models that were designed to reconstruct a hidden signalling structure from a large set of observable effects caused by active interventions into the signalling pathway. We give a more flexible formulation of NEMs in the language of Bayesian networks. Our framework constitutes a natural generalization of the original NEM model, since it explicitly states the assumptions that are tacitly underlying the original version. Our approach gives rise to new learning methods for NEMs, which have been implemented in the R/Bioconductor package nem. We validate these methods in a simulation study and apply them to a synthetic lethality dataset in yeast.

  11. Iterative image reconstruction in ECT

    International Nuclear Information System (INIS)

    A series of preliminary studies has been performed in the authors' laboratories to explore the use of a priori information in Bayesian image restoration and reconstruction. One piece of a priori information is the fact that intensities of neighboring pixels tend to be similar if they belong to the same region within which similar tissue characteristics are exhibited. This property of local continuity can be modeled by the use of Gibbs priors, as first suggested by Geman and Geman. In their investigation, they also included line sites between each pair of neighboring pixels in the Gibbs prior and used discrete binary numbers to indicate the absence or presence of boundaries between regions. These two features of the a priori model permit averaging within boundaries of homogeneous regions to alleviate the degradation caused by Poisson noise. With the use of this Gibbs prior in combination with the technique of stochastic relaxation, Geman and Geman demonstrated that noise levels can be reduced significantly in 2-D image restoration. The authors have developed a Bayesian method that utilizes a Gibbs prior to describe the spatial correlation of neighboring regions and takes into account the effect of limited spatial resolution as well. The statistical framework of the proposed approach is based on the data augmentation scheme suggested by Tanner and Wong. Briefly outlined here, this Bayesian method is based on Geman and Geman's approach
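    To illustrate schematically how a Gibbs smoothing prior enters a MAP objective, the sketch below takes a plain gradient-ascent step on the Poisson log-likelihood plus a quadratic neighbour penalty in 1-D; this is a simplification for illustration, not the stochastic-relaxation or data-augmentation scheme described above, and the circular neighbourhood is an arbitrary choice:

```python
import numpy as np

def map_gradient_step(lam, y, A, beta, step=1e-3):
    """One gradient-ascent step on the Poisson log-likelihood of y = Poisson(A @ lam)
    plus a quadratic Gibbs-style smoothing prior over neighbouring pixels (1-D)."""
    Alam = A @ lam
    grad_lik = A.T @ (y / Alam) - A.T @ np.ones_like(y)
    # Neighbour differences penalise roughness within homogeneous regions.
    grad_prior = -2.0 * beta * (2 * lam - np.roll(lam, 1) - np.roll(lam, -1))
    return np.clip(lam + step * (grad_lik + grad_prior), 1e-8, None)
```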

  12. Bayesian variable order Markov models: Towards Bayesian predictive state representations

    NARCIS (Netherlands)

    C. Dimitrakakis

    2009-01-01

    We present a Bayesian variable order Markov model that shares many similarities with predictive state representations. The resulting models are compact and much easier to specify and learn than classical predictive state representations. Moreover, we show that they significantly outperform a more st

  13. Bayesian networks and food security - An introduction

    NARCIS (Netherlands)

    Stein, A.

    2004-01-01

    This paper gives an introduction to Bayesian networks. Networks are defined and put into a Bayesian context. Directed acyclical graphs play a crucial role here. Two simple examples from food security are addressed. Possible uses of Bayesian networks for implementation and further use in decision sup

  14. Bayesian Model Averaging for Propensity Score Analysis

    Science.gov (United States)

    Kaplan, David; Chen, Jianshen

    2013-01-01

    The purpose of this study is to explore Bayesian model averaging in the propensity score context. Previous research on Bayesian propensity score analysis does not take into account model uncertainty. In this regard, an internally consistent Bayesian framework for model building and estimation must also account for model uncertainty. The…

  15. A Bayesian Nonparametric Approach to Test Equating

    Science.gov (United States)

    Karabatsos, George; Walker, Stephen G.

    2009-01-01

    A Bayesian nonparametric model is introduced for score equating. It is applicable to all major equating designs, and has advantages over previous equating models. Unlike the previous models, the Bayesian model accounts for positive dependence between distributions of scores from two tests. The Bayesian model and the previous equating models are…

  16. Bayesian Classification of Image Structures

    DEFF Research Database (Denmark)

    Goswami, Dibyendu; Kalkan, Sinan; Krüger, Norbert

    2009-01-01

    In this paper, we describe work on Bayesian classifiers for distinguishing between homogeneous structures, textures, edges and junctions. We build semi-local classifiers from hand-labeled images to distinguish between these four different kinds of structures based on the concept of intrinsic dimensi...

  17. Bayesian Agglomerative Clustering with Coalescents

    OpenAIRE

    Teh, Yee Whye; Daumé III, Hal; Roy, Daniel

    2009-01-01

    We introduce a new Bayesian model for hierarchical clustering based on a prior over trees called Kingman's coalescent. We develop novel greedy and sequential Monte Carlo inferences which operate in a bottom-up agglomerative fashion. We show experimentally the superiority of our algorithms over others, and demonstrate our approach in document clustering and phylolinguistics.

  18. Bayesian NL interpretation and learning

    NARCIS (Netherlands)

    H. Zeevat

    2011-01-01

    Everyday natural language communication is normally successful, even though contemporary computational linguistics has shown that NL is characterised by a very high degree of ambiguity, and the results of stochastic methods are not good enough to explain the high success rate. Bayesian natural language

  19. Differentiated Bayesian Conjoint Choice Designs

    NARCIS (Netherlands)

    Z. Sándor (Zsolt); M. Wedel (Michel)

    2003-01-01

    Previous conjoint choice design construction procedures have produced a single design that is administered to all subjects. This paper proposes to construct a limited set of different designs. The designs are constructed in a Bayesian fashion, taking into account prior uncertainty about

  20. Bayesian inference for Hawkes processes

    DEFF Research Database (Denmark)

    Rasmussen, Jakob Gulddahl

    The Hawkes process is a practically and theoretically important class of point processes, but parameter-estimation for such a process can pose various problems. In this paper we explore and compare two approaches to Bayesian inference. The first approach is based on the so-called conditional...

  1. Bayesian stable isotope mixing models

    Science.gov (United States)

    In this paper we review recent advances in Stable Isotope Mixing Models (SIMMs) and place them into an over-arching Bayesian statistical framework which allows for several useful extensions. SIMMs are used to quantify the proportional contributions of various sources to a mixtur...

  2. Bayesian inference for Hawkes processes

    DEFF Research Database (Denmark)

    Rasmussen, Jakob Gulddahl

    2013-01-01

    The Hawkes process is a practically and theoretically important class of point processes, but parameter-estimation for such a process can pose various problems. In this paper we explore and compare two approaches to Bayesian inference. The first approach is based on the so-called conditional...

  3. 3-D contextual Bayesian classifiers

    DEFF Research Database (Denmark)

    Larsen, Rasmus

    In this paper we will consider extensions of a series of Bayesian 2-D contextual classification procedures proposed by Owen (1984), Hjort & Mohn (1984), Welch & Salter (1971) and Haslett (1985) to 3 spatial dimensions. It is evident that compared to classical pixelwise classification further...

  4. Bayesian image restoration, using configurations

    DEFF Research Database (Denmark)

    Thorarinsdottir, Thordis

    configurations are expressed in terms of the mean normal measure of the random set. These probabilities are used as prior probabilities in a Bayesian image restoration approach. Estimation of the remaining parameters in the model is outlined for salt and pepper noise. The inference in the model is discussed...

  5. Bayesian image restoration, using configurations

    DEFF Research Database (Denmark)

    Thorarinsdottir, Thordis Linda

    2006-01-01

    configurations are expressed in terms of the mean normal measure of the random set. These probabilities are used as prior probabilities in a Bayesian image restoration approach. Estimation of the remaining parameters in the model is outlined for the salt and pepper noise. The inference in the model is discussed...

  6. Bayesian Analysis of Experimental Data

    Directory of Open Access Journals (Sweden)

    Lalmohan Bhar

    2013-10-01

    Full Text Available Analysis of experimental data from a Bayesian point of view has been considered. Appropriate methodology has been developed for application to designed experiments. The Normal-Gamma distribution has been used as the prior distribution. The developed methodology has been applied to real experimental data taken from long-term fertilizer experiments.

  7. Phylodynamic reconstruction of O CATHAY topotype foot-and-mouth disease virus epidemics in the Philippines.

    Science.gov (United States)

    Di Nardo, Antonello; Knowles, Nick J; Wadsworth, Jemma; Haydon, Daniel T; King, Donald P

    2014-01-01

    Reconstructing the evolutionary history, demographic signal and dispersal processes from viral genome sequences contributes to our understanding of the epidemiological dynamics underlying epizootic events. In this study, a Bayesian phylogenetic framework was used to explore the phylodynamics and spatio-temporal dispersion of the O CATHAY topotype of foot-and-mouth disease virus (FMDV) that caused epidemics in the Philippines between 1994 and 2005. Sequences of the FMDV genome encoding the VP1 showed that the O CATHAY FMD epizootic in the Philippines resulted from a single introduction and was characterised by three main transmission hubs in Rizal, Bulacan and Manila Provinces. From a wider regional perspective, phylogenetic reconstruction of all available O CATHAY VP1 nucleotide sequences identified three distinct sub-lineages associated with country-based clusters originating in Hong Kong Special Administrative Region (SAR), the Philippines and Taiwan. The root of this phylogenetic tree was located in Hong Kong SAR, representing the most likely source for the introduction of this lineage into the Philippines and Taiwan. The reconstructed O CATHAY phylodynamics revealed three chronologically distinct evolutionary phases, culminating in a reduction in viral diversity over the final 10 years. The analysis suggests that viruses from the O CATHAY topotype have been continually maintained within swine industries close to Hong Kong SAR, following the extinction of virus lineages from the Philippines and the reduced number of FMD cases in Taiwan.

  8. Human migration patterns in Yemen and implications for reconstructing prehistoric population movements.

    Directory of Open Access Journals (Sweden)

    Aida T Miró-Herrans

    Full Text Available Population migration has played an important role in human evolutionary history and in the patterning of human genetic variation. A deeper and empirically-based understanding of human migration dynamics is needed in order to interpret genetic and archaeological evidence and to accurately reconstruct the prehistoric processes that comprise human evolutionary history. Current empirical estimates of migration include either short time frames (i.e., within one generation) or partial knowledge about migration, such as the proportion of migrants or the distance of migration. An analysis of migration that includes proportion of migrants as well as distance and direction over multiple generations would better inform prehistoric reconstructions. To evaluate human migration, we use GPS coordinates from the place of residence of the Yemeni individuals sampled in our study, their birthplaces, and their parents' and grandparents' birthplaces to calculate the proportion of migrants, as well as the distance and direction of migration events between each generation. We test for differences in these values between the generations and identify factors that influence the probability of migration. Our results show that the proportion and distance of migration between females and males is similar within generations. In contrast, the proportion and distance of migration is significantly lower in the grandparents' generation, most likely reflecting the decreasing effect of technology. Based on our results, we take the proportion of migration events (0.102) and the mean and median distances of migration (96 km and 26 km) for the grandparents' generation to represent early times in human evolution. These estimates can serve to set parameter values of demographic models in model-based methods of prehistoric reconstruction, such as approximate Bayesian computation. Our study provides the first empirically-based estimates of human migration over multiple generations in a developing
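    For instance, the distance of a single migration event between two birthplaces can be computed from their GPS coordinates with the haversine great-circle formula; the coordinates below are illustrative, not data from the study:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two GPS points given in degrees."""
    R = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

# Roughly Sana'a to Aden (illustrative coordinates):
print(haversine_km(15.35, 44.21, 12.79, 45.02))  # about 300 km
```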

  9. Climate Reconstructions

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The NOAA Paleoclimatology Program archives reconstructions of past climatic conditions derived from paleoclimate proxies, in addition to the Program's large...

  10. Topics in Bayesian statistics and maximum entropy

    International Nuclear Information System (INIS)

    Notions of Bayesian decision theory and maximum entropy methods are reviewed with particular emphasis on probabilistic inference and Bayesian modeling. The axiomatic approach is considered as the best justification of Bayesian analysis and maximum entropy principle applied in natural sciences. Particular emphasis is put on solving the inverse problem in digital image restoration and Bayesian modeling of neural networks. Further topics addressed briefly include language modeling, neutron scattering, multiuser detection and channel equalization in digital communications, genetic information, and Bayesian court decision-making. (author)

  11. Laryngopharyngeal reconstruction

    OpenAIRE

    Kazi, Rehan A

    2006-01-01

    There is a high incidence of hypopharyngeal cancer in our country due to the habits of tobacco and alcohol use. Moreover, these cases are often detected in the late stages, thereby making the issue of reconstruction very tedious and unpredictable. There are a number of options for laryngopharyngeal reconstruction available now, including the use of microvascular flaps, depending upon the patient's fitness and motivation, the available technical expertise, and the size and extent of the defect. This article reviews the differ...

  12. Bayesian analysis of rare events

    Science.gov (United States)

    Straub, Daniel; Papaioannou, Iason; Betz, Wolfgang

    2016-06-01

    In many areas of engineering and science there is an interest in predicting the probability of rare events, in particular in applications related to safety and security. Increasingly, such predictions are made through computer models of physical systems in an uncertainty quantification framework. Additionally, with advances in IT, monitoring and sensor technology, an increasing amount of data on the performance of the systems is collected. This data can be used to reduce uncertainty, improve the probability estimates and consequently enhance the management of rare events and associated risks. Bayesian analysis is the ideal method to include the data into the probabilistic model. It ensures a consistent probabilistic treatment of uncertainty, which is central in the prediction of rare events, where extrapolation from the domain of observation is common. We present a framework for performing Bayesian updating of rare event probabilities, termed BUS. It is based on a reinterpretation of the classical rejection-sampling approach to Bayesian analysis, which enables the use of established methods for estimating probabilities of rare events. By drawing upon these methods, the framework makes use of their computational efficiency. These methods include the First-Order Reliability Method (FORM), tailored importance sampling (IS) methods and Subset Simulation (SuS). In this contribution, we briefly review these methods in the context of the BUS framework and investigate their applicability to Bayesian analysis of rare events in different settings. We find that, for some applications, FORM can be highly efficient and is surprisingly accurate, enabling Bayesian analysis of rare events with just a few model evaluations. In a general setting, BUS implemented through IS and SuS is more robust and flexible.
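    The rejection-sampling reinterpretation at the heart of BUS can be sketched in a few lines: draw from the prior, and accept a draw with probability proportional to its likelihood. The toy version below illustrates the idea only; it omits the FORM, importance sampling, and Subset Simulation machinery that makes the approach efficient for genuinely rare events:

```python
import numpy as np

rng = np.random.default_rng(1)

def rejection_posterior(sample_prior, likelihood, c, n=100_000):
    """Rejection-sampling view of Bayesian updating: accept a prior draw
    theta when u < L(theta)/c, with c an upper bound on the likelihood."""
    theta = sample_prior(n)
    u = rng.uniform(size=n)
    return theta[u < likelihood(theta) / c]

# Toy problem: standard normal prior, one observation y = 1.0 with unit noise.
y, sigma = 1.0, 1.0
L = lambda t: np.exp(-0.5 * ((y - t) / sigma) ** 2)   # max at t = y, so c = 1
post = rejection_posterior(lambda n: rng.normal(size=n), L, c=1.0)
print(post.mean())   # close to the analytic posterior mean of 0.5
```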

  13. Bayesian networks for evaluation of evidence from forensic entomology.

    Science.gov (United States)

    Andersson, M Gunnar; Sundström, Anders; Lindström, Anders

    2013-09-01

    In the aftermath of a CBRN incident, there is an urgent need to reconstruct events in order to bring the perpetrators to court and to take preventive actions for the future. The challenge is to discriminate, based on available information, between alternative scenarios. Forensic interpretation is used to evaluate to what extent results from the forensic investigation favor the prosecutors' or the defendants' arguments, using the framework of Bayesian hypothesis testing. Recently, several new scientific disciplines have been used in a forensic context. In the AniBioThreat project, the framework was applied to veterinary forensic pathology, tracing of pathogenic microorganisms, and forensic entomology. Forensic entomology is an important tool for estimating the postmortem interval in, for example, homicide investigations as a complement to more traditional methods. In this article we demonstrate the applicability of the Bayesian framework for evaluating entomological evidence in a forensic investigation through the analysis of a hypothetical scenario involving suspect movement of carcasses from a clandestine laboratory. Probabilities of different findings under the alternative hypotheses were estimated using a combination of statistical analysis of data, expert knowledge, and simulation, and entomological findings are used to update the beliefs about the prosecutors' and defendants' hypotheses and to calculate the value of evidence. The Bayesian framework proved useful for evaluating complex hypotheses using findings from several insect species, accounting for uncertainty about development rate, temperature, and precolonization. The applicability of the forensic statistic approach to evaluating forensic results from a CBRN incident is discussed.
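    The value of evidence referred to above is the likelihood ratio of the findings under the competing hypotheses, often reported on a log10 scale. A minimal sketch, with purely hypothetical probabilities rather than figures from the AniBioThreat scenario:

```python
import math

def log10_likelihood_ratio(p_e_given_hp, p_e_given_hd):
    """Weight of evidence: log10 of P(E | Hp) / P(E | Hd)."""
    return math.log10(p_e_given_hp / p_e_given_hd)

# Hypothetical probabilities of the entomological findings under the
# prosecution (Hp) and defence (Hd) scenarios:
print(log10_likelihood_ratio(0.30, 0.03))  # LR = 10, one order of magnitude for Hp
```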

  14. Exemplar models as a mechanism for performing Bayesian inference.

    Science.gov (United States)

    Shi, Lei; Griffiths, Thomas L; Feldman, Naomi H; Sanborn, Adam N

    2010-08-01

    Probabilistic models have recently received much attention as accounts of human cognition. However, most research in which probabilistic models have been used has been focused on formulating the abstract problems behind cognitive tasks and their optimal solutions, rather than on mechanisms that could implement these solutions. Exemplar models are a successful class of psychological process models in which an inventory of stored examples is used to solve problems such as identification, categorization, and function learning. We show that exemplar models can be used to perform a sophisticated form of Monte Carlo approximation known as importance sampling and thus provide a way to perform approximate Bayesian inference. Simulations of Bayesian inference in speech perception, generalization along a single dimension, making predictions about everyday events, concept learning, and reconstruction from memory show that exemplar models can often account for human performance with only a few exemplars, for both simple and relatively complex prior distributions. These results suggest that exemplar models provide a possible mechanism for implementing at least some forms of Bayesian inference. PMID:20702863
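    The importance-sampling reading of an exemplar model is easy to sketch: the stored exemplars play the role of samples from the prior, and likelihood weights turn them into a posterior estimate. The illustration below uses hypothetical numbers, not the paper's simulations:

```python
import numpy as np

def exemplar_estimate(exemplars, likelihood, f=lambda x: x):
    """Importance-sampling view of an exemplar model: stored examples act as
    draws from the prior, weighted by their likelihood given the data."""
    w = likelihood(exemplars)
    return np.sum(w * f(exemplars)) / np.sum(w)

# Reconstruction from memory: a noisy percept y, a handful of stored exemplars.
rng = np.random.default_rng(2)
memory = rng.normal(10.0, 2.0, size=50)
y, noise = 13.0, 1.5
lik = lambda x: np.exp(-0.5 * ((y - x) / noise) ** 2)
print(exemplar_estimate(memory, lik))   # pulled between y and the memory mean
```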

  15. Bayesian methods for measures of agreement

    CERN Document Server

    Broemeling, Lyle D

    2009-01-01

    Using WinBUGS to implement Bayesian inferences of estimation and testing hypotheses, Bayesian Methods for Measures of Agreement presents useful methods for the design and analysis of agreement studies. It focuses on agreement among the various players in the diagnostic process.The author employs a Bayesian approach to provide statistical inferences based on various models of intra- and interrater agreement. He presents many examples that illustrate the Bayesian mode of reasoning and explains elements of a Bayesian application, including prior information, experimental information, the likelihood function, posterior distribution, and predictive distribution. The appendices provide the necessary theoretical foundation to understand Bayesian methods as well as introduce the fundamentals of programming and executing the WinBUGS software.Taking a Bayesian approach to inference, this hands-on book explores numerous measures of agreement, including the Kappa coefficient, the G coefficient, and intraclass correlation...

  16. Plug & Play object oriented Bayesian networks

    DEFF Research Database (Denmark)

    Bangsø, Olav; Flores, J.; Jensen, Finn Verner

    2003-01-01

    Object oriented Bayesian networks have proven themselves useful in recent years. The idea of applying an object oriented approach to Bayesian networks has extended their scope to larger domains that can be divided into autonomous but interrelated entities. Object oriented Bayesian networks have...... been shown to be quite suitable for dynamic domains as well. However, processing object oriented Bayesian networks in practice does not take advantage of their modular structure. Normally the object oriented Bayesian network is transformed into a Bayesian network, and inference is performed...... by constructing a junction tree from this network. In this paper we propose a method for translating directly from object oriented Bayesian networks to junction trees, avoiding the intermediate translation. We pursue two main purposes: firstly, to maintain the original structure organized in an instance tree...

  17. Flexible Bayesian Nonparametric Priors and Bayesian Computational Methods

    OpenAIRE

    Zhu, Weixuan

    2016-01-01

    The definition of vectors of dependent random probability measures is a topic of interest in Bayesian nonparametrics. They represent dependent nonparametric prior distributions that are useful for modelling observables for which specific covariate values are known. Our first contribution is the introduction of novel multivariate vectors of two-parameter Poisson-Dirichlet processes. The dependence is induced by applying a Lévy copula to the marginal Lévy intensities. Our attenti...

  18. ACG: rapid inference of population history from recombining nucleotide sequences

    Directory of Open Access Journals (Sweden)

    O'Fallon Brendan D

    2013-02-01

    Full Text Available Abstract Background Reconstruction of population history from genetic data often requires Monte Carlo integration over the genealogy of the samples. Among tools that perform such computations, few are able to consider genetic histories including recombination events, precluding their use on most alignments of nuclear DNA. Explicit consideration of recombinations requires modeling the history of the sequences with an Ancestral Recombination Graph (ARG) in place of a simple tree, which presents significant computational challenges. Results ACG is an extensible desktop application that uses a Bayesian Markov chain Monte Carlo procedure to estimate the posterior likelihood of an evolutionary model conditional on an alignment of genetic data. The ancestry of the sequences is represented by an ARG, which is estimated from the data along with other model parameters. Importantly, ACG computes the full Felsenstein likelihood of the ARG, not a pairwise or composite likelihood. Several strategies are used to speed computations, and ACG is roughly 100x faster than a similar, recombination-aware program. Conclusions Modeling the ancestry of the sequences with an ARG allows ACG to estimate the evolutionary history of recombining nucleotide sequences. ACG can accurately estimate the posterior distribution of population parameters such as the (scaled) population size and recombination rate, as well as many aspects of the recombinant history, including the positions of recombination breakpoints, the distribution of time to most recent common ancestor along the sequence, and the non-recombining trees at individual sites. Multiple substitution models and population size models are provided. ACG also provides a richly informative graphical interface that allows users to view the evolution of model parameters and likelihoods in real time.

  19. Bayesian versus 'plain-vanilla Bayesian' multitarget statistics

    Science.gov (United States)

    Mahler, Ronald P. S.

    2004-08-01

    Finite-set statistics (FISST) is a direct generalization of single-sensor, single-target Bayes statistics to the multisensor-multitarget realm, based on random set theory. Various aspects of FISST are being investigated by several research teams around the world. In recent years, however, a few partisans have claimed that a "plain-vanilla Bayesian approach" suffices as down-to-earth, "straightforward," and general "first principles" for multitarget problems. Therefore, FISST is mere mathematical "obfuscation." In this and a companion paper I demonstrate the speciousness of these claims. In this paper I summarize general Bayes statistics, what is required to use it in multisensor-multitarget problems, and why FISST is necessary to make it practical. Then I demonstrate that the "plain-vanilla Bayesian approach" is so heedlessly formulated that it is erroneous and not even Bayesian; that it denigrates FISST concepts while unwittingly assuming them; and that it has resulted in a succession of algorithms afflicted by inherent -- but less than candidly acknowledged -- computational "logjams."

  20. Reconstructing Galaxy Histories from Globular Clusters

    CERN Document Server

    West, Michael J.; Cote, Patrick; Marzke, Ronald O.; Jordan, Andres

    2004-01-01

    Nearly a century after the true nature of galaxies as distant "island universes" was established, their origin and evolution remain great unsolved problems of modern astrophysics. One of the most promising ways to investigate galaxy formation is to study the ubiquitous globular star clusters that surround most galaxies. Recent advances in our understanding of the globular cluster systems of the Milky Way and other galaxies point to a complex picture of galaxy genesis driven by cannibalism, collisions, bursts of star formation and other tumultuous events.

  1. Reconstructing the Limfjord’s history

    DEFF Research Database (Denmark)

    Philippsen, Bente

    is the determination of the radiocarbon reservoir effect. A reservoir effect occurs when a sample’s carbon derives from a reservoir with a different, typically lower, 14C concentration than the atmosphere. Samples that initially contain less 14C than contemporaneous samples in equilibrium with the atmosphere...

  2. Bayesian inference on proportional elections.

    Science.gov (United States)

    Brunello, Gabriel Hideki Vatanabe; Nakano, Eduardo Yoshio

    2015-01-01

    Polls for majoritarian voting systems usually show estimates of the percentage of votes for each candidate. However, proportional voting systems do not necessarily guarantee that the candidate with the highest percentage of votes will be elected. Thus, traditional methods used in majoritarian elections cannot be applied to proportional elections. In this context, the purpose of this paper was to perform Bayesian inference for proportional elections under the Brazilian system of seat distribution. More specifically, a methodology was developed to estimate the probability that a given party will have representation in the Chamber of Deputies. Inferences were made in a Bayesian setting using Monte Carlo simulation, and the developed methodology was applied to data from the 2010 Brazilian elections for Members of the Legislative Assembly and the Federal Chamber of Deputies. A performance rate was also presented to evaluate the efficiency of the methodology. Calculations and simulations were carried out using the free R statistical software. PMID:25786259
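    A minimal sketch of the simulation idea, assuming a flat Dirichlet prior over party vote shares and a plain D'Hondt largest-averages allocation (a simplification for illustration, not the full Brazilian electoral-quotient rules used in the paper):

```python
import numpy as np

rng = np.random.default_rng(3)

def dhondt(votes, seats):
    """Allocate seats by the D'Hondt largest-averages method."""
    alloc = np.zeros(len(votes), dtype=int)
    for _ in range(seats):
        alloc[np.argmax(votes / (alloc + 1))] += 1
    return alloc

def prob_representation(poll_counts, seats, party, n_sims=10_000):
    """Posterior probability (flat Dirichlet prior) that `party` wins a seat."""
    shares = rng.dirichlet(np.asarray(poll_counts) + 1, size=n_sims)
    hits = sum(dhondt(s, seats)[party] > 0 for s in shares)
    return hits / n_sims

# Hypothetical poll counts for five parties, ten seats at stake:
print(prob_representation([400, 300, 200, 60, 40], seats=10, party=3))
```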

  3. Bayesian approach to rough set

    CERN Document Server

    Marwala, Tshilidzi

    2007-01-01

    This paper proposes an approach to training rough set models within a Bayesian framework, using the Markov chain Monte Carlo (MCMC) method. The prior probabilities are constructed from the prior knowledge that good rough set models have fewer rules. MCMC sampling is conducted in the rough set granule space, with the Metropolis algorithm used as the acceptance criterion. The proposed method is tested on estimating the risk of HIV given demographic data. The results show that the proposed approach achieves an average accuracy of 58%, with accuracy up to 66%. In addition, the Bayesian rough set gives the probabilities of the estimated HIV status as well as linguistic rules describing how the demographic parameters drive the risk of HIV.

  4. Bayesian priors for transiting planets

    CERN Document Server

    Kipping, David M

    2016-01-01

    As astronomers push towards discovering ever-smaller transiting planets, it is increasingly common to deal with low signal-to-noise ratio (SNR) events, where the choice of priors plays an influential role in Bayesian inference. In the analysis of exoplanet data, the selection of priors is often treated as a nuisance, with observers typically defaulting to uninformative distributions. Such treatments miss a key strength of the Bayesian framework, especially in the low SNR regime, where even weak a priori information is valuable. When estimating the parameters of a low-SNR transit, two key pieces of information are known: (i) the planet has the correct geometric alignment to transit and (ii) the transit event exhibits sufficient signal-to-noise to have been detected. These represent two forms of observational bias. Accordingly, when fitting transits, the model parameter priors should not follow the intrinsic distributions of said terms, but rather those of both the intrinsic distributions and the observational ...

  5. Bayesian Source Separation and Localization

    CERN Document Server

    Knuth, K H

    1998-01-01

    The problem of mixed signals occurs in many different contexts; one of the most familiar being acoustics. The forward problem in acoustics consists of finding the sound pressure levels at various detectors resulting from sound signals emanating from the active acoustic sources. The inverse problem consists of using the sound recorded by the detectors to separate the signals and recover the original source waveforms. In general, the inverse problem is unsolvable without additional information. This general problem is called source separation, and several techniques have been developed that utilize maximum entropy, minimum mutual information, and maximum likelihood. In previous work, it has been demonstrated that these techniques can be recast in a Bayesian framework. This paper demonstrates the power of the Bayesian approach, which provides a natural means for incorporating prior information into a source model. An algorithm is developed that utilizes information regarding both the statistics of the amplitudes...

  6. Bayesian Inference for Radio Observations

    CERN Document Server

    Lochner, Michelle; Zwart, Jonathan T L; Smirnov, Oleg; Bassett, Bruce A; Oozeer, Nadeem; Kunz, Martin

    2015-01-01

    (Abridged) New telescopes like the Square Kilometre Array (SKA) will push into a new sensitivity regime and expose systematics, such as direction-dependent effects, that could previously be ignored. Current methods for handling such systematics rely on alternating best estimates of instrumental calibration and models of the underlying sky, which can lead to inaccurate uncertainty estimates and biased results because such methods ignore any correlations between parameters. These deconvolution algorithms produce a single image that is assumed to be a true representation of the sky, when in fact it is just one realisation of an infinite ensemble of images compatible with the noise in the data. In contrast, here we report a Bayesian formalism that simultaneously infers both systematics and science. Our technique, Bayesian Inference for Radio Observations (BIRO), determines all parameters directly from the raw data, bypassing image-making entirely, by sampling from the joint posterior probability distribution. Thi...

  7. A Bayesian Nonparametric IRT Model

    OpenAIRE

    Karabatsos, George

    2015-01-01

    This paper introduces a flexible Bayesian nonparametric Item Response Theory (IRT) model, which applies to dichotomous or polytomous item responses, and which can apply to either unidimensional or multidimensional scaling. This is an infinite-mixture IRT model, with person ability and item difficulty parameters, and with a random intercept parameter that is assigned a mixing distribution, with mixing weights a probit function of other person and item parameters. As a result of its flexibility...

  8. Elements of Bayesian experimental design

    Energy Technology Data Exchange (ETDEWEB)

    Sivia, D.S. [Rutherford Appleton Lab., Oxon (United Kingdom)

    1997-09-01

    We consider some elements of the Bayesian approach that are important for optimal experimental design. While the underlying principles used are very general, and are explained in detail in a recent tutorial text, they are applied here to the specific case of characterising the inferential value of different resolution peakshapes. This particular issue was considered earlier by Silver, Sivia and Pynn (1989, 1990a, 1990b), and the following presentation confirms and extends the conclusions of their analysis.

  9. Bayesian kinematic earthquake source models

    Science.gov (United States)

    Minson, S. E.; Simons, M.; Beck, J. L.; Genrich, J. F.; Galetzka, J. E.; Chowdhury, F.; Owen, S. E.; Webb, F.; Comte, D.; Glass, B.; Leiva, C.; Ortega, F. H.

    2009-12-01

    Most coseismic, postseismic, and interseismic slip models are based on highly regularized optimizations which yield one solution which satisfies the data given a particular set of regularizing constraints. This regularization hampers our ability to answer basic questions such as whether seismic and aseismic slip overlap or instead rupture separate portions of the fault zone. We present a Bayesian methodology for generating kinematic earthquake source models with a focus on large subduction zone earthquakes. Unlike classical optimization approaches, Bayesian techniques sample the ensemble of all acceptable models presented as an a posteriori probability density function (PDF), and thus we can explore the entire solution space to determine, for example, which model parameters are well determined and which are not, or what is the likelihood that two slip distributions overlap in space. Bayesian sampling also has the advantage that all a priori knowledge of the source process can be used to mold the a posteriori ensemble of models. Although very powerful, Bayesian methods have up to now been of limited use in geophysical modeling because they are only computationally feasible for problems with a small number of free parameters due to what is called the "curse of dimensionality." However, our methodology can successfully sample solution spaces of many hundreds of parameters, which is sufficient to produce finite fault kinematic earthquake models. Our algorithm is a modification of the tempered Markov chain Monte Carlo (tempered MCMC or TMCMC) method. In our algorithm, we sample a "tempered" a posteriori PDF using many MCMC simulations running in parallel and evolutionary computation in which models which fit the data poorly are preferentially eliminated in favor of models which better predict the data. We present results for both synthetic test problems as well as for the 2007 Mw 7.8 Tocopilla, Chile earthquake, the latter of which is constrained by InSAR, local high

  10. Bayesian Stable Isotope Mixing Models

    OpenAIRE

    Parnell, Andrew C.; Phillips, Donald L.; Bearhop, Stuart; Semmens, Brice X.; Ward, Eric J.; Moore, Jonathan W.; Andrew L Jackson; Inger, Richard

    2012-01-01

    In this paper we review recent advances in Stable Isotope Mixing Models (SIMMs) and place them into an over-arching Bayesian statistical framework which allows for several useful extensions. SIMMs are used to quantify the proportional contributions of various sources to a mixture. The most widely used application is quantifying the diet of organisms based on the food sources they have been observed to consume. At the centre of the multivariate statistical model we propose is a compositional m...
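    At its simplest, a mixing model with two sources and one isotope tracer reduces to inferring a single mixing proportion. The grid-based sketch below, with a uniform prior and illustrative δ-values, conveys the idea but none of the extensions (multiple tracers, concentration dependence, residual structure) the review covers:

```python
import numpy as np

def mixing_posterior(d_mix, d_a, d_b, sigma, grid=None):
    """Grid posterior over the proportion p of source A in a two-source,
    one-tracer mixing model with Gaussian residual error and a uniform prior."""
    if grid is None:
        grid = np.linspace(0.0, 1.0, 501)
    pred = grid * d_a + (1.0 - grid) * d_b          # predicted mixture signature
    post = np.exp(-0.5 * ((d_mix - pred) / sigma) ** 2)
    post /= post.sum() * (grid[1] - grid[0])        # normalise the density
    return grid, post

# Illustrative numbers: mixture at -18 between sources at -25 (A) and -12 (B);
# the posterior mode lands near p = 6/13, about 0.46.
p, post = mixing_posterior(d_mix=-18.0, d_a=-25.0, d_b=-12.0, sigma=1.0)
print(p[np.argmax(post)])
```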

  11. Bayesian Network--Response Regression

    OpenAIRE

    WANG, LU; Durante, Daniele; Dunson, David B.

    2016-01-01

    There is an increasing interest in learning how human brain networks vary with continuous traits (e.g., personality, cognitive abilities, neurological disorders), but flexible procedures to accomplish this goal are limited. We develop a Bayesian semiparametric model, which combines low-rank factorizations and Gaussian process priors to allow flexible shifts of the conditional expectation for a network-valued random variable across the feature space, while including subject-specific random eff...

  12. Bayesian segmentation of hyperspectral images

    CERN Document Server

    Mohammadpour, Adel; Mohammad-Djafari, Ali

    2007-01-01

    In this paper we consider the problem of joint segmentation of hyperspectral images in the Bayesian framework. The proposed approach is based on a Hidden Markov Modeling (HMM) of the images with common segmentation, or equivalently with common hidden classification label variables which is modeled by a Potts Markov Random Field. We introduce an appropriate Markov Chain Monte Carlo (MCMC) algorithm to implement the method and show some simulation results.

  13. Bayesian segmentation of hyperspectral images

    Science.gov (United States)

    Mohammadpour, Adel; Féron, Olivier; Mohammad-Djafari, Ali

    2004-11-01

    In this paper we consider the problem of joint segmentation of hyperspectral images in the Bayesian framework. The proposed approach is based on a Hidden Markov Modeling (HMM) of the images with common segmentation, or equivalently with common hidden classification label variables which is modeled by a Potts Markov Random Field. We introduce an appropriate Markov Chain Monte Carlo (MCMC) algorithm to implement the method and show some simulation results.

  14. Bayesian analysis of contingency tables

    OpenAIRE

    Gómez Villegas, Miguel A.; González Pérez, Beatriz

    2005-01-01

    The display of the data by means of contingency tables is used in different approaches to statistical inference, for example, to broach the test of homogeneity of independent multinomial distributions. We develop a Bayesian procedure to test simple null hypotheses versus bilateral alternatives in contingency tables. Given independent samples of two binomial distributions and taking a mixed prior distribution, we calculate the posterior probability that the proportion of successes in the first...

  15. Bayesian estimation of turbulent motion

    OpenAIRE

    Héas, P.; Herzet, C.; Mémin, E.; Heitz, D.; P. D. Mininni

    2013-01-01

    Based on physical laws describing the multi-scale structure of turbulent flows, this article proposes a regularizer for fluid motion estimation from an image sequence. Regularization is achieved by imposing some scale invariance property between histograms of motion increments computed at different scales. By reformulating this problem from a Bayesian perspective, an algorithm is proposed to jointly estimate motion, regularization hyper-parameters, and to select the ...

  16. Bayesian Kernel Mixtures for Counts

    OpenAIRE

    Canale, Antonio; David B Dunson

    2011-01-01

    Although Bayesian nonparametric mixture models for continuous data are well developed, there is a limited literature on related approaches for count data. A common strategy is to use a mixture of Poissons, which unfortunately is quite restrictive in not accounting for distributions having variance less than the mean. Other approaches include mixing multinomials, which requires finite support, and using a Dirichlet process prior with a Poisson base measure, which does not allow smooth deviatio...

  17. Bayesian second law of thermodynamics.

    Science.gov (United States)

    Bartolotta, Anthony; Carroll, Sean M; Leichenauer, Stefan; Pollack, Jason

    2016-08-01

    We derive a generalization of the second law of thermodynamics that uses Bayesian updates to explicitly incorporate the effects of a measurement of a system at some point in its evolution. By allowing an experimenter's knowledge to be updated by the measurement process, this formulation resolves a tension between the fact that the entropy of a statistical system can sometimes fluctuate downward and the information-theoretic idea that knowledge of a stochastically evolving system degrades over time. The Bayesian second law can be written as ΔH(ρ_{m},ρ)+〈Q〉_{F|m}≥0, where ΔH(ρ_{m},ρ) is the change in the cross entropy between the original phase-space probability distribution ρ and the measurement-updated distribution ρ_{m} and 〈Q〉_{F|m} is the expectation value of a generalized heat flow out of the system. We also derive refined versions of the second law that bound the entropy increase from below by a non-negative number, as well as Bayesian versions of integral fluctuation theorems. We demonstrate the formalism using simple analytical and numerical examples. PMID:27627241

  18. Bayesian second law of thermodynamics

    Science.gov (United States)

    Bartolotta, Anthony; Carroll, Sean M.; Leichenauer, Stefan; Pollack, Jason

    2016-08-01

    We derive a generalization of the second law of thermodynamics that uses Bayesian updates to explicitly incorporate the effects of a measurement of a system at some point in its evolution. By allowing an experimenter's knowledge to be updated by the measurement process, this formulation resolves a tension between the fact that the entropy of a statistical system can sometimes fluctuate downward and the information-theoretic idea that knowledge of a stochastically evolving system degrades over time. The Bayesian second law can be written as ΔH(ρ_{m},ρ)+〈Q〉_{F|m}≥0, where ΔH(ρ_{m},ρ) is the change in the cross entropy between the original phase-space probability distribution ρ and the measurement-updated distribution ρ_{m}, and 〈Q〉_{F|m} is the expectation value of a generalized heat flow out of the system. We also derive refined versions of the second law that bound the entropy increase from below by a non-negative number, as well as Bayesian versions of integral fluctuation theorems. We demonstrate the formalism using simple analytical and numerical examples.

  19. 12th Brazilian Meeting on Bayesian Statistics

    CERN Document Server

    Louzada, Francisco; Rifo, Laura; Stern, Julio; Lauretto, Marcelo

    2015-01-01

    Through refereed papers, this volume focuses on the foundations of the Bayesian paradigm; their comparison to objectivistic or frequentist Statistics counterparts; and the appropriate application of Bayesian foundations. This research in Bayesian Statistics is applicable to data analysis in biostatistics, clinical trials, law, engineering, and the social sciences. EBEB, the Brazilian Meeting on Bayesian Statistics, is held every two years by the ISBrA, the International Society for Bayesian Analysis, one of the most active chapters of the ISBA. The 12th meeting took place March 10-14, 2014 in Atibaia. Interest in foundations of inductive Statistics has grown recently in accordance with the increasing availability of Bayesian methodological alternatives. Scientists need to deal with the ever more difficult choice of the optimal method to apply to their problem. This volume shows how Bayes can be the answer. The examination and discussion on the foundations work towards the goal of proper application of Bayesia...

  20. Bayesian coestimation of phylogeny and sequence alignment

    Directory of Open Access Journals (Sweden)

    Jensen Jens

    2005-04-01

    Full Text Available Abstract Background Two central problems in computational biology are the determination of the alignment and phylogeny of a set of biological sequences. The traditional approach to this problem is to first build a multiple alignment of these sequences, followed by a phylogenetic reconstruction step based on this multiple alignment. However, alignment and phylogenetic inference are fundamentally interdependent, and ignoring this fact leads to biased and overconfident estimations. Whether the main interest be in sequence alignment or phylogeny, a major goal of computational biology is the co-estimation of both. Results We developed a fully Bayesian Markov chain Monte Carlo method for coestimating phylogeny and sequence alignment, under the Thorne-Kishino-Felsenstein model of substitution and single nucleotide insertion-deletion (indel) events. In our earlier work, we introduced a novel and efficient algorithm, termed the "indel peeling algorithm", which includes indels as phylogenetically informative evolutionary events, and resembles Felsenstein's peeling algorithm for substitutions on a phylogenetic tree. For a fixed alignment, our extension analytically integrates out both substitution and indel events within a proper statistical model, without the need for data augmentation at internal tree nodes, allowing for efficient sampling of tree topologies and edge lengths. To additionally sample multiple alignments, we here introduce an efficient partial Metropolized independence sampler for alignments, and combine these two algorithms into a fully Bayesian co-estimation procedure for the alignment and phylogeny problem. Our approach results in estimates for the posterior distribution of evolutionary rate parameters, for the maximum a-posteriori (MAP) phylogenetic tree, and for the posterior decoding alignment. Estimates for the evolutionary tree and multiple alignment are augmented with confidence estimates for each node height and alignment column

  1. Project Reconstruct.

    Science.gov (United States)

    Helisek, Harriet; Pratt, Donald

    1994-01-01

    Presents a project in which students monitor their use of trash, input and analyze information via a database and computerized graphs, and "reconstruct" extinct or endangered animals from recyclable materials. The activity was done with second-grade students over a period of three to four weeks. (PR)

  2. Vaginal reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Lesavoy, M.A.

    1985-05-01

    Vaginal reconstruction can be an uncomplicated and straightforward procedure when attention to detail is maintained. The Abbe-McIndoe procedure of lining the neovaginal canal with split-thickness skin grafts has become standard. The use of the inflatable Heyer-Schulte vaginal stent provides comfort to the patient and ease to the surgeon in maintaining approximation of the skin graft. For large vaginal and perineal defects, myocutaneous flaps such as the gracilis island have been extremely useful for correction of radiation-damaged tissue of the perineum or for the reconstruction of large ablative defects. Minimal morbidity and scarring ensue because the donor site can be closed primarily. With all vaginal reconstruction, a compliant patient is a necessity. The patient must wear a vaginal obturator for a minimum of 3 to 6 months postoperatively and is encouraged to use intercourse as an excellent obturator. In general, vaginal reconstruction can be an extremely gratifying procedure for both the functional and emotional well-being of patients.

  3. Vaginal reconstruction

    International Nuclear Information System (INIS)

    Vaginal reconstruction can be an uncomplicated and straightforward procedure when attention to detail is maintained. The Abbe-McIndoe procedure of lining the neovaginal canal with split-thickness skin grafts has become standard. The use of the inflatable Heyer-Schulte vaginal stent provides comfort to the patient and ease to the surgeon in maintaining approximation of the skin graft. For large vaginal and perineal defects, myocutaneous flaps such as the gracilis island have been extremely useful for correction of radiation-damaged tissue of the perineum or for the reconstruction of large ablative defects. Minimal morbidity and scarring ensue because the donor site can be closed primarily. With all vaginal reconstruction, a compliant patient is a necessity. The patient must wear a vaginal obturator for a minimum of 3 to 6 months postoperatively and is encouraged to use intercourse as an excellent obturator. In general, vaginal reconstruction can be an extremely gratifying procedure for both the functional and emotional well-being of patients

  4. Compiling Relational Bayesian Networks for Exact Inference

    DEFF Research Database (Denmark)

    Jaeger, Manfred; Chavira, Mark; Darwiche, Adnan

    2004-01-01

    We describe a system for exact inference with relational Bayesian networks as defined in the publicly available \\primula\\ tool. The system is based on compiling propositional instances of relational Bayesian networks into arithmetic circuits and then performing online inference by evaluating...... and differentiating these circuits in time linear in their size. We report on experimental results showing the successful compilation, and efficient inference, on relational Bayesian networks whose {\\primula}--generated propositional instances have thousands of variables, and whose jointrees have clusters...

  5. Bayesian Posterior Distributions Without Markov Chains

    OpenAIRE

    Cole, Stephen R.; Chu, Haitao; Greenland, Sander; Hamra, Ghassan; Richardson, David B.

    2012-01-01

    Bayesian posterior parameter distributions are often simulated using Markov chain Monte Carlo (MCMC) methods. However, MCMC methods are not always necessary and do not help the uninitiated understand Bayesian inference. As a bridge to understanding Bayesian inference, the authors illustrate a transparent rejection sampling method. In example 1, they illustrate rejection sampling using 36 cases and 198 controls from a case-control study (1976–1983) assessing the relation between residential ex...
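
    The record above presents rejection sampling as a transparent alternative to MCMC. The snippet below is a minimal sketch in that spirit, using the 36 cases and 198 controls only as a stand-in binomial example with a uniform prior on a proportion (the authors' actual target is an odds-ratio posterior); the envelope constant is bounded numerically on a grid.

```python
import numpy as np

rng = np.random.default_rng(1)

def log_post(p, events=36, n=36 + 198):
    # Unnormalised posterior: binomial likelihood x uniform prior on p
    return events * np.log(p) + (n - events) * np.log(1.0 - p)

# Bound the unnormalised posterior numerically to get the envelope constant M.
grid = np.linspace(1e-4, 1 - 1e-4, 10_000)
log_M = log_post(grid).max()

samples = []
while len(samples) < 5_000:
    p = rng.uniform(1e-9, 1 - 1e-9)         # propose from the (uniform) prior
    if np.log(rng.uniform()) < log_post(p) - log_M:
        samples.append(p)                   # accept with probability f(p)/M

samples = np.asarray(samples)
print(samples.mean(), np.percentile(samples, [2.5, 97.5]))
```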

  6. Bayesian probabilities of earthquake occurrences in Longmenshan fault system (China)

    Science.gov (United States)

    Wang, Ying; Zhang, Keyin; Gan, Qigang; Zhou, Wen; Xiong, Liang; Zhang, Shihua; Liu, Chao

    2015-01-01

    China has a long history of earthquake records, and the Longmenshan fault system (LFS) is a famous earthquake zone. We divided the LFS into three seismogenic zones (north, central, and south) based on the geological structures and the earthquake catalog. We applied the Bayesian probability method, using the extreme-value distribution of earthquake occurrences, to estimate the seismic hazard in the LFS. The seismic moment, slip rate, earthquake recurrence rate, and magnitude were taken as the basic parameters for computing the Bayesian prior estimates of the seismicity. These estimates were then updated in terms of Bayes' theorem and historical estimates of seismicity in the LFS. Generally speaking, the north zone appears quiet compared with the central and south zones. The central zone is the most dangerous; however, the recurrence period of Ms = 8.0 earthquakes is quite long (1,250 to 5,000 years). The choice of upper-bound probable magnitude influences the result, and the upper-bound magnitude of the south zone may be 7.5. We obtained the empirical magnitude-conversion relationship between Ms and ML, the magnitude of completeness Mc (3.5), and the Gutenberg-Richter b value before applying the Bayesian extreme-value distribution of earthquake occurrences method.
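
    A minimal sketch of the kind of Bayesian updating described above: a conjugate Gamma prior on an earthquake occurrence rate is updated with a Poisson-count catalogue, and the posterior predictive gives the probability of at least one event in a time window. All numbers are hypothetical, not the paper's LFS estimates, and the paper's extreme-value machinery is not reproduced.

```python
import numpy as np

# Conjugate Gamma(alpha, beta) prior on the annual occurrence rate of
# large events in one zone (hypothetical values, not the LFS estimates).
alpha0, beta0 = 2.0, 400.0            # prior mean rate: 0.005 events/year

n_events, years = 4, 500.0            # assumed historical catalogue
alpha_n, beta_n = alpha0 + n_events, beta0 + years   # Gamma posterior

mean_rate = alpha_n / beta_n
print("posterior mean recurrence interval:", 1.0 / mean_rate, "years")

# Posterior predictive P(at least one event in t years): with a Gamma
# posterior the count in t years is negative binomial, so
# P(no event in t) = (beta_n / (beta_n + t)) ** alpha_n.
t = 50.0
print("P(event within 50 yr):", 1.0 - (beta_n / (beta_n + t)) ** alpha_n)
```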

  7. Variational bayesian method of estimating variance components.

    Science.gov (United States)

    Arakawa, Aisaku; Taniguchi, Masaaki; Hayashi, Takeshi; Mikawa, Satoshi

    2016-07-01

    We developed a Bayesian analysis approach by using a variational inference method, a so-called variational Bayesian method, to determine the posterior distributions of variance components. This variational Bayesian method and an alternative Bayesian method using Gibbs sampling were compared in estimating genetic and residual variance components from both simulated data and publicly available real pig data. In the simulated data set, we observed strong bias toward overestimation of genetic variance for the variational Bayesian method in the case of low heritability and low population size, and less bias was detected with larger population sizes in both methods examined. The differences in the estimates of variance components between the variational Bayesian and the Gibbs sampling were not found in the real pig data. However, the posterior distributions of the variance components obtained with the variational Bayesian method had shorter tails than those obtained with the Gibbs sampling. Consequently, the posterior standard deviations of the genetic and residual variances of the variational Bayesian method were lower than those of the method using Gibbs sampling. The computing time required was much shorter with the variational Bayesian method than with the method using Gibbs sampling.
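
    For a concrete picture of the variational Bayesian idea, the sketch below runs coordinate-ascent variational inference on the textbook Normal-Gamma model (unknown mean and precision of Gaussian data), which is far simpler than the paper's genetic/residual variance-component model; priors and data are invented.

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(2.0, 1.5, size=50)                 # synthetic observations
N, xbar = len(x), x.mean()

# Priors (assumed): mu | tau ~ N(mu0, 1/(lam0*tau)), tau ~ Gamma(a0, b0)
mu0, lam0, a0, b0 = 0.0, 1e-3, 1e-3, 1e-3

E_tau = 1.0                                       # initial guess for E[tau]
for _ in range(100):                              # CAVI updates
    mu_n = (lam0 * mu0 + N * xbar) / (lam0 + N)   # q(mu) = N(mu_n, 1/lam_n)
    lam_n = (lam0 + N) * E_tau
    a_n = a0 + 0.5 * (N + 1)                      # q(tau) = Gamma(a_n, b_n)
    b_n = b0 + 0.5 * (np.sum((x - mu_n) ** 2) + N / lam_n
                      + lam0 * ((mu_n - mu0) ** 2 + 1.0 / lam_n))
    E_tau = a_n / b_n

print("E[mu] =", mu_n, "  E[variance] =", b_n / (a_n - 1.0))
```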

  8. SYNTHESIZED EXPECTED BAYESIAN METHOD OF PARAMETRIC ESTIMATE

    Institute of Scientific and Technical Information of China (English)

    Ming HAN; Yuanyao DING

    2004-01-01

    This paper develops a new method of parametric estimation, named the "synthesized expected Bayesian method". When samples of products are tested and no failure events occur, the definition of the expected Bayesian estimate is introduced and estimates of the failure probability and failure rate are provided. After some failure information is introduced by making an extra test, a synthesized expected Bayesian method is defined and used to estimate the failure probability, failure rate and some other parameters of the exponential and Weibull distributions of populations. Finally, calculations are performed on practical problems, which show that the synthesized expected Bayesian method is feasible and easy to operate.
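
    One common construction in this zero-failure literature (assumed here, and possibly differing in detail from the paper) places a Beta(1, b) prior on the failure probability with the hyperparameter b itself uniform on (1, c); the expected Bayesian estimate then has the closed form sketched below.

```python
import numpy as np

def e_bayes_failure_prob(n, c=5.0):
    """Expected Bayesian estimate of failure probability p after n
    zero-failure tests, assuming p ~ Beta(1, b) with b ~ Uniform(1, c).

    The Bayes estimate for a given b is 1 / (n + b + 1); averaging over
    the hyperprior gives ln((n + 1 + c) / (n + 2)) / (c - 1).
    """
    return np.log((n + 1 + c) / (n + 2)) / (c - 1)

for n in (10, 20, 50):
    print(n, e_bayes_failure_prob(n))
```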

  9. ECG gated tomographic reconstruction for 3-D rotational coronary angiography

    Science.gov (United States)

    Hu, Yining; Xie, Lizhe; Nunes, Jean Claude; Bellanger, Jean Jacques; Bedossa, Marc; Toumoulin, Christine

    2010-01-01

    A method is proposed for 3-D reconstruction of the coronary tree from a limited number of projections in rotational angiography. A Bayesian maximum a posteriori (MAP) estimation with a Poisson-distributed projection model is applied to reconstruct the 3D coronary tree at a given instant of the cardiac cycle. Several regularizers (L0-, L1-, and L2-norm) are investigated in order to take into account the sparsity of the data. Evaluations are reported on simulated data obtained from a 3D dynamic sequence acquired on a 64-slice GE LightSpeed CT scanner. A performance study is conducted to evaluate the quality of the reconstruction of the structures. PMID:21096844
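
    A minimal sketch of MAP estimation under a Poisson projection model, using a quadratic (L2) regularizer and projected gradient ascent on a toy system matrix; the L0/L1 variants studied in the paper require non-smooth solvers and are not shown. All dimensions and data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(3)
n_proj, n_pix = 24, 16
A = rng.uniform(0.0, 1.0, size=(n_proj, n_pix))   # toy projection operator
x_true = rng.uniform(0.5, 2.0, size=n_pix)
y = rng.poisson(A @ x_true)                        # Poisson-distributed projections

lam, step = 0.1, 5e-3                              # prior weight, step size
x = np.ones(n_pix)
for _ in range(2_000):
    Ax = np.maximum(A @ x, 1e-9)
    # Gradient of [Poisson log-likelihood - lam * ||x||^2]
    grad = A.T @ (y / Ax - 1.0) - 2.0 * lam * x
    x = np.maximum(x + step * grad, 0.0)           # keep the image non-negative

print(np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```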

  10. Denoising Message Passing for X-ray Computed Tomography Reconstruction

    CERN Document Server

    Perelli, Alessandro; Can, Ali; Davies, Mike E

    2016-01-01

    X-ray Computed Tomography (CT) reconstruction from a sparse number of views is becoming a powerful way to reduce either the radiation dose or the acquisition time in CT systems, but it still requires a huge computational time. This paper introduces an approximate Bayesian inference framework for CT reconstruction based on a family of denoising approximate message passing (DCT-AMP) algorithms able to improve both the convergence speed and the reconstruction quality. Approximate Message Passing for Compressed Sensing has been extensively analysed for random linear measurements, but there are still no clear answers on how AMP should be modified and how it performs on real-world problems. In particular, to overcome the convergence issues of DCT-AMP with structured measurement matrices, we propose a disjoint preconditioned version of the algorithm tailored to both the geometric system model and the noise model. In addition, the Bayesian DCT-AMP formulation makes it possible to measure how close the current estimate is to the pr...

  11. Bayesian Methods and Universal Darwinism

    Science.gov (United States)

    Campbell, John

    2009-12-01

    Bayesian methods since the time of Laplace have been understood by their practitioners as closely aligned to the scientific method. Indeed a recent champion of Bayesian methods, E. T. Jaynes, titled his textbook on the subject Probability Theory: the Logic of Science. Many philosophers of science including Karl Popper and Donald Campbell have interpreted the evolution of Science as a Darwinian process consisting of a `copy with selective retention' algorithm abstracted from Darwin's theory of Natural Selection. Arguments are presented for an isomorphism between Bayesian methods and Darwinian processes. Universal Darwinism, as the term has been developed by Richard Dawkins, Daniel Dennett and Susan Blackmore, is the collection of scientific theories which explain the creation and evolution of their subject matter as due to the operation of Darwinian processes. These subject matters span the fields of atomic physics, chemistry, biology and the social sciences. The principle of Maximum Entropy states that systems will evolve to states of highest entropy subject to the constraints of scientific law. This principle may be inverted to provide illumination as to the nature of scientific law. Our best cosmological theories suggest the universe contained much less complexity during the period shortly after the Big Bang than it does at present. The scientific subject matter of atomic physics, chemistry, biology and the social sciences has been created since that time. An explanation is proposed for the existence of this subject matter as due to the evolution of constraints in the form of adaptations imposed on Maximum Entropy. It is argued these adaptations were discovered and instantiated through the operations of a succession of Darwinian processes.

  12. Radio Reconstructions

    OpenAIRE

    Bulley, James; Jones, Daniel

    2013-01-01

    Radio Reconstructions is a sound installation which uses indeterminate radio broadcasts as its raw material. Each piece is structured by a notated score, which controls its rhythm, dynamics and melodic contour over time. The audio elements used to enact this score are selected in real-time from unknown radio transmissions, by an autonomous software system which is continuously scanning the radio waves in search of similar fragments of audio. Using a technique known as audio mosaicing, hund...

  13. Numeracy, frequency, and Bayesian reasoning

    Directory of Open Access Journals (Sweden)

    Gretchen B. Chapman

    2009-02-01

    Full Text Available Previous research has demonstrated that Bayesian reasoning performance is improved if uncertainty information is presented as natural frequencies rather than single-event probabilities. A questionnaire study of 342 college students replicated this effect but also found that the performance-boosting benefits of the natural frequency presentation occurred primarily for participants who scored high in numeracy. This finding suggests that even comprehension and manipulation of natural frequencies requires a certain threshold of numeracy abilities, and that the beneficial effects of natural frequency presentation may not be as general as previously believed.
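
    The natural-frequency effect is easy to see on a classic diagnostic problem. The computation below (with hypothetical rates) gives the same posterior both as Bayes' rule over single-event probabilities and as a count ratio over an imagined population of 1,000 people.

```python
# Hypothetical screening problem: 1% prevalence, 80% sensitivity,
# 9.6% false-positive rate.

# Single-event probability format (Bayes' rule):
prior, sens, fpr = 0.01, 0.80, 0.096
posterior = prior * sens / (prior * sens + (1.0 - prior) * fpr)

# Natural frequency format: think of 1,000 people.
sick, healthy = 10, 990        # 1% of 1,000
true_pos = 8                   # 80% of the 10 sick
false_pos = 95                 # ~9.6% of the 990 healthy
posterior_freq = true_pos / (true_pos + false_pos)

print(round(posterior, 3), round(posterior_freq, 3))   # both ~= 0.078
```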

  14. Bayesian Query-Focused Summarization

    CERN Document Server

    Daumé, Hal

    2009-01-01

    We present BayeSum (for "Bayesian summarization"), a model for sentence extraction in query-focused summarization. BayeSum leverages the common case in which multiple documents are relevant to a single query. Using these documents as reinforcement for query terms, BayeSum is not afflicted by the paucity of information in short queries. We show that approximate inference in BayeSum is possible on large data sets and results in a state-of-the-art summarization system. Furthermore, we show how BayeSum can be understood as a justified query expansion technique in the language modeling for IR framework.

  15. Bayesian Sampling using Condition Indicators

    DEFF Research Database (Denmark)

    Faber, Michael H.; Sørensen, John Dalsgaard

    2002-01-01

    The problem of control quality of components is considered for the special case where the acceptable failure rate is low, the test costs are high and where it may be difficult or impossible to test the condition of interest directly. Based on the classical control theory and the concept...... of condition indicators introduced by Benjamin and Cornell (1970) a Bayesian approach to quality control is formulated. The formulation is then extended to the case where the quality control is based on sampling of indirect information about the condition of the components, i.e. condition indicators...

  16. Clinical Outcome Prediction in Aneurysmal Subarachnoid Hemorrhage Using Bayesian Neural Networks with Fuzzy Logic Inferences

    Directory of Open Access Journals (Sweden)

    Benjamin W. Y. Lo

    2013-01-01

    Full Text Available Objective. The novel clinical prediction approach of Bayesian neural networks with fuzzy logic inferences is created and applied to derive prognostic decision rules in cerebral aneurysmal subarachnoid hemorrhage (aSAH). Methods. The approach of Bayesian neural networks with fuzzy logic inferences was applied to data from five trials of Tirilazad for aneurysmal subarachnoid hemorrhage (3551 patients). Results. Bayesian meta-analyses of observational studies on aSAH prognostic factors gave generalizable posterior distributions of population mean log odds ratios (ORs). Similar trends were noted in Bayesian and linear regression ORs. Significant outcome predictors include normal motor response, cerebral infarction, history of myocardial infarction, cerebral edema, history of diabetes mellitus, fever on day 8, prior subarachnoid hemorrhage, admission angiographic vasospasm, neurological grade, intraventricular hemorrhage, ruptured aneurysm size, history of hypertension, vasospasm day, age and mean arterial pressure. Heteroscedasticity was present in the nontransformed dataset. Artificial neural networks found nonlinear relationships with 11 hidden variables in 1 layer, using the multilayer perceptron model. Fuzzy logic decision rules (centroid defuzzification technique) denoted cut-off points for poor prognosis at greater than 2.5 clusters. Discussion. This aSAH prognostic system makes use of existing knowledge, recognizes unknown areas, incorporates one's clinical reasoning, and compensates for uncertainty in prognostication.

  17. Thermal History Reconstruction and Hydrocarbon Accumulation Period Discrimination of Jinhu Depression in Subei Basin

    Institute of Scientific and Technical Information of China (English)

    李亚军; 李儒峰; 陈莉琼; 宋宁; 方晶

    2011-01-01

    Based on vitrinite reflectance analysis and systematic testing of apatite fission tracks and fluid inclusions, we calculated paleotemperature gradients and reconstructed the thermal history of the west slope and the Bianminyang tectonic zone of the Jinhu depression. From vitrinite reflectance, the paleotemperature in the west slope ranged from 45.6 to 128.4 ℃ with a paleotemperature gradient of 45.5 ℃/km, while in the Bianminyang tectonic zone the paleotemperature was 26.4 to 120.3 ℃ with a gradient of 42.7 ℃/km. From apatite fission tracks, the paleotemperature gradient was 40.7 ℃/km in the west slope and 45.8 ℃/km in the Bianminyang tectonic zone. Comparative analysis of the different tectonic zones of the Jinhu depression shows that the paleotemperature gradient was higher than the present-day geothermal gradient: by 10.4 to 15.2 ℃/km in the west slope and by 12.4 to 15.3 ℃/km in the Bianminyang tectonic zone. Thermal history modeling of typical wells in both zones shows that the paleo-geothermal gradient decreased toward younger strata; the geothermal gradient during K2t to E1f was higher than during E2d to Ny. The paleotemperature of the depression reached its maximum before the uplift and erosion caused by the Sanduo tectonic events. The maturity history indicates that Ro was 0.4% at a depth of 1 000 m in the Jinhu depression, where the source rock was at the low-maturity stage; at a depth of 1 900 m, Ro was 0.65% and the temperature reached 90 ℃, and the hydrocarbon source rocks entered their peak generation phase. The homogenization temperature of fluid inclusion samples was between 62 and 93 ℃ of Well

  18. Using Bayesian Networks to Improve Knowledge Assessment

    Science.gov (United States)

    Millan, Eva; Descalco, Luis; Castillo, Gladys; Oliveira, Paula; Diogo, Sandra

    2013-01-01

    In this paper, we describe the integration and evaluation of an existing generic Bayesian student model (GBSM) into an existing computerized testing system within the Mathematics Education Project (PmatE--Projecto Matematica Ensino) of the University of Aveiro. This generic Bayesian student model had been previously evaluated with simulated…

  19. Bayesian analysis of exoplanet and binary orbits

    OpenAIRE

    Schulze-Hartung, Tim; Launhardt, Ralf; Henning, Thomas

    2012-01-01

    We introduce BASE (Bayesian astrometric and spectroscopic exoplanet detection and characterisation tool), a novel program for the combined or separate Bayesian analysis of astrometric and radial-velocity measurements of potential exoplanet hosts and binary stars. The capabilities of BASE are demonstrated using all publicly available data of the binary Mizar A.

  20. Bayesian credible interval construction for Poisson statistics

    Institute of Scientific and Technical Information of China (English)

    ZHU Yong-Sheng

    2008-01-01

    The construction of the Bayesian credible (confidence) interval for a Poisson observable including both the signal and background, with and without systematic uncertainties, is presented. Introducing the conditional probability satisfying the requirement that the background not be larger than the observed events to construct the Bayesian credible interval is also discussed. A Fortran routine, BPOCI, has been developed to implement the calculation.
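
    A numerical sketch of the construction described above, assuming a Poisson observable with a known expected background, a flat prior on the non-negative signal, and no systematic uncertainties; the interval is read off a gridded posterior CDF rather than computed with the BPOCI routine.

```python
import numpy as np

n_obs, bkg = 5, 2.3            # observed counts, known expected background

s = np.linspace(0.0, 30.0, 30_001)                 # grid over signal s >= 0
log_post = n_obs * np.log(s + bkg) - (s + bkg)     # flat prior on s
post = np.exp(log_post - log_post.max())

ds = s[1] - s[0]
post /= post.sum() * ds                            # normalise on the grid
cdf = np.cumsum(post) * ds

lower = s[np.searchsorted(cdf, 0.05)]              # 90% central credible interval
upper = s[np.searchsorted(cdf, 0.95)]
print(f"90% credible interval for the signal: [{lower:.2f}, {upper:.2f}]")
```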

  1. Modeling Diagnostic Assessments with Bayesian Networks

    Science.gov (United States)

    Almond, Russell G.; DiBello, Louis V.; Moulder, Brad; Zapata-Rivera, Juan-Diego

    2007-01-01

    This paper defines Bayesian network models and examines their applications to IRT-based cognitive diagnostic modeling. These models are especially suited to building inference engines designed to be synchronous with the finer grained student models that arise in skills diagnostic assessment. Aspects of the theory and use of Bayesian network models…

  2. Advances in Bayesian Modeling in Educational Research

    Science.gov (United States)

    Levy, Roy

    2016-01-01

    In this article, I provide a conceptually oriented overview of Bayesian approaches to statistical inference and contrast them with frequentist approaches that currently dominate conventional practice in educational research. The features and advantages of Bayesian approaches are illustrated with examples spanning several statistical modeling…

  3. Learning dynamic Bayesian networks with mixed variables

    DEFF Research Database (Denmark)

    Bøttcher, Susanne Gammelgaard

    This paper considers dynamic Bayesian networks for discrete and continuous variables. We only treat the case where the distribution of the variables is conditional Gaussian. We show how to learn the parameters and structure of a dynamic Bayesian network and also how the Markov order can be learned...

  4. The Bayesian Revolution Approaches Psychological Development

    Science.gov (United States)

    Shultz, Thomas R.

    2007-01-01

    This commentary reviews five articles that apply Bayesian ideas to psychological development, some with psychology experiments, some with computational modeling, and some with both experiments and modeling. The reviewed work extends the current Bayesian revolution into tasks often studied in children, such as causal learning and word learning, and…

  5. Bayesian Network for multiple hypothesis tracking

    NARCIS (Netherlands)

    W.P. Zajdel; B.J.A. Kröse

    2002-01-01

    For a flexible camera-to-camera tracking of multiple objects we model the objects' behavior with a Bayesian network and combine it with the multiple hypothesis framework that associates observations with objects. Bayesian networks offer a possibility to factor complex, joint distributions into a produ

  6. 2nd Bayesian Young Statisticians Meeting

    CERN Document Server

    Bitto, Angela; Kastner, Gregor; Posekany, Alexandra

    2015-01-01

    The Second Bayesian Young Statisticians Meeting (BAYSM 2014) and the research presented here facilitate connections among researchers using Bayesian Statistics by providing a forum for the development and exchange of ideas. WU Vienna University of Business and Economics hosted BAYSM 2014 from September 18th to 19th. The guidance of renowned plenary lecturers and senior discussants is a critical part of the meeting and this volume, which follows publication of contributions from BAYSM 2013. The meeting's scientific program reflected the variety of fields in which Bayesian methods are currently employed or could be introduced in the future. Three brilliant keynote lectures by Chris Holmes (University of Oxford), Christian Robert (Université Paris-Dauphine), and Mike West (Duke University), were complemented by 24 plenary talks covering the major topics Dynamic Models, Applications, Bayesian Nonparametrics, Biostatistics, Bayesian Methods in Economics, and Models and Methods, as well as a lively poster session ...

  7. BAYESIAN BICLUSTERING FOR PATIENT STRATIFICATION.

    Science.gov (United States)

    Khakabimamaghani, Sahand; Ester, Martin

    2016-01-01

    The move from Empirical Medicine towards Personalized Medicine has attracted attention to Stratified Medicine (SM). Some methods are provided in the literature for patient stratification, which is the central task of SM, however, there are still significant open issues. First, it is still unclear if integrating different datatypes will help in detecting disease subtypes more accurately, and, if not, which datatype(s) are most useful for this task. Second, it is not clear how we can compare different methods of patient stratification. Third, as most of the proposed stratification methods are deterministic, there is a need for investigating the potential benefits of applying probabilistic methods. To address these issues, we introduce a novel integrative Bayesian biclustering method, called B2PS, for patient stratification and propose methods for evaluating the results. Our experimental results demonstrate the superiority of B2PS over a popular state-of-the-art method and the benefits of Bayesian approaches. Our results agree with the intuition that transcriptomic data forms a better basis for patient stratification than genomic data.

  8. BAYESIAN BICLUSTERING FOR PATIENT STRATIFICATION.

    Science.gov (United States)

    Khakabimamaghani, Sahand; Ester, Martin

    2016-01-01

    The move from Empirical Medicine towards Personalized Medicine has attracted attention to Stratified Medicine (SM). Some methods are provided in the literature for patient stratification, which is the central task of SM, however, there are still significant open issues. First, it is still unclear if integrating different datatypes will help in detecting disease subtypes more accurately, and, if not, which datatype(s) are most useful for this task. Second, it is not clear how we can compare different methods of patient stratification. Third, as most of the proposed stratification methods are deterministic, there is a need for investigating the potential benefits of applying probabilistic methods. To address these issues, we introduce a novel integrative Bayesian biclustering method, called B2PS, for patient stratification and propose methods for evaluating the results. Our experimental results demonstrate the superiority of B2PS over a popular state-of-the-art method and the benefits of Bayesian approaches. Our results agree with the intuition that transcriptomic data forms a better basis for patient stratification than genomic data. PMID:26776199

  9. Intellectual History, Social History, Cultural History...and Our History.

    Science.gov (United States)

    Nord, David Paul

    1990-01-01

    Defines and explores the links among intellectual, social, and cultural history. Warns that an adequate foundation must be laid in the economic and institutional social history of mass media before communication historians jump into cultural history. (SR)

  10. Modelling of JET diagnostics using Bayesian Graphical Models

    Energy Technology Data Exchange (ETDEWEB)

    Svensson, J. [IPP Greifswald, Greifswald (Germany); Ford, O. [Imperial College, London (United Kingdom); McDonald, D.; Hole, M.; Nessi, G. von; Meakins, A.; Brix, M.; Thomsen, H.; Werner, A.; Sirinelli, A.

    2011-07-01

    The mapping between physics parameters (such as densities, currents, flows, temperatures etc) defining the plasma 'state' under a given model and the raw observations of each plasma diagnostic will 1) depend on the particular physics model used, and 2) be inherently probabilistic, owing to uncertainties in both the observations and instrumental aspects of the mapping, such as calibrations, instrument functions etc. A flexible and principled way of modelling such interconnected probabilistic systems is through so-called Bayesian graphical models. Being an amalgam between graph theory and probability theory, Bayesian graphical models can simulate the complex interconnections between physics models and diagnostic observations from multiple heterogeneous diagnostic systems, making it relatively easy to optimally combine the observations from multiple diagnostics for joint inference on parameters of the underlying physics model, which in itself can be represented as part of the graph. At JET about 10 diagnostic systems have to date been modelled in this way, which has led to a number of new results, including: the reconstruction of the flux surface topology and q-profiles without any specific equilibrium assumption, using information from a number of different diagnostic systems; profile inversions taking into account the uncertainties in the flux surface positions; and a substantial increase in accuracy of JET electron density and temperature profiles, including improved pedestal resolution, through the joint analysis of three diagnostic systems. It is believed that the Bayesian graph approach could potentially be utilised for very large sets of diagnostics, providing a generic data analysis framework for nuclear fusion experiments that would be able to optimally utilize the information from multiple diagnostics simultaneously, and where the explicit graph representation of the connections to underlying physics models could be used for sophisticated model testing. This

  11. Afghanistan Reconstruction

    Institute of Scientific and Technical Information of China (English)

    Fu Xiaoqiang

    2006-01-01

    The Karzai regime has made some progress over the past four and a half years in the post-war reconstruction. However, Taliban destruction and the drug economy are still having serious impacts on the security and stability of Afghanistan. Hence the settlement of these two problems has become a crux affecting the country's future. Moreover, the Karzai regime has yet to handle a series of hot potatoes in the fields of the central government's authority, military and police building-up, and foreign relations as well.

  12. Group Tracking of Space Objects within Bayesian Framework

    Directory of Open Access Journals (Sweden)

    Huang Jian

    2013-03-01

    Full Text Available It is imperative to efficiently track and catalogue dense groups of space objects for space surveillance. As the main instrument for Low Earth Orbit (LEO) space surveillance, a ground-based radar system is usually limited by its resolving power when tracking small, densely populated space debris, so much of the detection and observation information is lost, which makes traditional tracking methods inefficient. We therefore conceived the concept of group tracking: the overall motional tendency of the group objects is the particular focus, while each individual object is in effect tracked simultaneously. The tracking procedure is based on the Bayesian framework. By exploiting the constraints among the group center and the observations of multiple targets, the reconstruction of the number of targets and the estimation of individual trajectories can be greatly improved in accuracy and robustness in the presence of high missed-detection rates. The Markov Chain Monte Carlo Particle (MCMC-Particle) algorithm is utilized to solve the Bayesian integral problem. Finally, a simulation of group space object tracking is carried out to validate the efficiency of the proposed method.
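
    The MCMC-Particle group tracker itself is beyond a few lines, but its Bayesian filtering core can be sketched as a bootstrap particle filter for a single object with a constant-velocity motion model and Gaussian position measurements; all noise levels and measurements below are invented.

```python
import numpy as np

rng = np.random.default_rng(4)
N, dt, q, r = 2_000, 1.0, 0.05, 1.0     # particles, step, process/measurement noise

# State [position, velocity]; bootstrap filter for a single object.
parts = rng.normal([0.0, 1.0], [5.0, 0.5], size=(N, 2))
w = np.full(N, 1.0 / N)

for z in [1.1, 2.0, 2.8, 4.2, 5.1]:     # invented range measurements
    # Predict: constant-velocity motion plus process noise.
    parts[:, 0] += parts[:, 1] * dt + rng.normal(0.0, q, N)
    parts[:, 1] += rng.normal(0.0, q, N)
    # Update: weight by the Gaussian likelihood of the measurement.
    w *= np.exp(-0.5 * ((z - parts[:, 0]) / r) ** 2)
    w /= w.sum()
    # Resample when the effective sample size collapses.
    if 1.0 / np.sum(w ** 2) < N / 2:
        idx = rng.choice(N, size=N, p=w)
        parts, w = parts[idx], np.full(N, 1.0 / N)
    print("posterior-mean position:", np.sum(w * parts[:, 0]))
```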

  13. Tracheal reconstructions.

    Science.gov (United States)

    Srikrishna, S V; Shekar, P S; Shetty, N

    1998-12-01

    Surgical reconstruction of the trachea is a relatively complex procedure. We had 20 cases of tracheal stenosis. We have a modest experience of 16 tracheal reconstructions for acquired tracheal stenosis. Two patients underwent laser treatment while another two died before any intervention. The majority of these cases were a result of prolonged ventilation (14 cases), following organophosphorous poisoning (11 cases), Guillain-Barré syndrome, bullet injury, fat embolism and surprisingly only one tumor, a case of mucoepidermoid carcinoma, who had a very unusual presentation. There were 12 males and 4 females in this series, age ranging from 12-35 years. The duration of ventilation ranged from 1-21 days and the interval from decannulation to development of stridor was between 5-34 days. Six of them were approached by the cervical route, 5 by thoracotomy and cervical approach, 2 via median sternotomy and 3 by thoracotomy alone. Five of them required an additional laryngeal drop and 1 required pericardiotomy and release of pulmonary veins to gain additional length. The excised segments of trachea measured 3 to 5 cms in length. All were end to end anastomosis with interrupted Vicryl sutures. We have had no experience with stents or prosthetic tubes. Three patients developed anastomotic leaks which were controlled conservatively. Almost all of them required postoperative tracheo-bronchial suctioning with fibreoptic bronchoscope. We had one death in this series due to sepsis. PMID:9914459

  14. Bayesian redshift-space distortions correction from galaxy redshift surveys

    CERN Document Server

    Kitaura, Francisco-Shu; Angulo, Raul E; Chuang, Chia-Hsun; Rodriguez-Torres, Sergio; Monteagudo, Carlos Hernandez; Prada, Francisco; Yepes, Gustavo

    2015-01-01

    We present a Bayesian reconstruction method which maps a galaxy distribution from redshift-space to real-space inferring the distances of the individual galaxies. The method is based on sampling density fields assuming a lognormal prior with a likelihood given by the negative binomial distribution function modelling stochastic bias. We assume a deterministic bias given by a power law relating the dark matter density field to the expected halo or galaxy field. Coherent redshift-space distortions are corrected in a Gibbs-sampling procedure by moving the galaxies from redshift-space to real-space according to the peculiar motions derived from the recovered density field using linear theory with the option to include tidal field corrections from second order Lagrangian perturbation theory. The virialised distortions are corrected by sampling candidate real-space positions (being in the neighbourhood of the observations along the line of sight), which are compatible with the bulk flow corrected redshift-space posi...

  15. Bohmian Histories and Decoherent Histories

    OpenAIRE

    Hartle, James B.

    2002-01-01

    The predictions of the Bohmian and the decoherent (or consistent) histories formulations of the quantum mechanics of a closed system are compared for histories -- sequences of alternatives at a series of times. For certain kinds of histories, Bohmian mechanics and decoherent histories may both be formulated in the same mathematical framework within which they can be compared. In that framework, Bohmian mechanics and decoherent histories represent a given history by different operators. Their ...

  16. A Bayesian Reflection on Surfaces

    Directory of Open Access Journals (Sweden)

    David R. Wolf

    1999-10-01

    Full Text Available Abstract: The topic of this paper is a novel Bayesian continuous-basis field representation and inference framework. Within this paper several problems are solved: The maximally informative inference of continuous-basis fields, that is where the basis for the field is itself a continuous object and not representable in a finite manner; the tradeoff between accuracy of representation in terms of information learned, and memory or storage capacity in bits; the approximation of probability distributions so that a maximal amount of information about the object being inferred is preserved; an information theoretic justification for multigrid methodology. The maximally informative field inference framework is described in full generality and denoted the Generalized Kalman Filter. The Generalized Kalman Filter allows the update of field knowledge from previous knowledge at any scale, and new data, to new knowledge at any other scale. An application example instance, the inference of continuous surfaces from measurements (for example, camera image data), is presented.

  17. Quantum Bayesianism at the Perimeter

    CERN Document Server

    Fuchs, Christopher A

    2010-01-01

    The author summarizes the Quantum Bayesian viewpoint of quantum mechanics, developed originally by C. M. Caves, R. Schack, and himself. It is a view crucially dependent upon the tools of quantum information theory. Work at the Perimeter Institute for Theoretical Physics continues the development and is focused on the hard technical problem of finding a good representation of quantum mechanics purely in terms of probabilities, without amplitudes or Hilbert-space operators. The best candidate representation involves a mysterious entity called a symmetric informationally complete quantum measurement. Contemplation of it gives a way of thinking of the Born Rule as an addition to the rules of probability theory, applicable when one gambles on the consequences of interactions with physical systems. The article ends by outlining some directions for future work.

  18. Hedging Strategies for Bayesian Optimization

    CERN Document Server

    Brochu, Eric; de Freitas, Nando

    2010-01-01

    Bayesian optimization with Gaussian processes has become an increasingly popular tool in the machine learning community. It is efficient and can be used when very little is known about the objective function, making it popular in expensive black-box optimization scenarios. It is able to do this by sampling the objective using an acquisition function which incorporates the model's estimate of the objective and the uncertainty at any given point. However, there are several different parameterized acquisition functions in the literature, and it is often unclear which one to use. Instead of using a single acquisition function, we adopt a portfolio of acquisition functions governed by an online multi-armed bandit strategy. We describe the method, which we call GP-Hedge, and show that this method almost always outperforms the best individual acquisition function.
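
    A skeleton of the portfolio idea, assuming three acquisition functions and the standard Hedge (exponential-weights) update; in GP-Hedge proper, each arm's reward is the GP posterior mean at the point that arm nominated, which is faked here with random numbers.

```python
import numpy as np

rng = np.random.default_rng(5)
eta = 0.5                                # Hedge learning rate
gains = np.zeros(3)                      # one arm per acquisition function (PI, EI, UCB)

def pick_arm(gains):
    p = np.exp(eta * (gains - gains.max()))      # softmax over cumulative gains
    p /= p.sum()
    return rng.choice(len(gains), p=p), p

for t in range(50):
    arm, p = pick_arm(gains)
    # Each acquisition nominates a candidate point and the chosen arm's point
    # is evaluated; GP-Hedge then rewards *every* arm with the GP posterior
    # mean at its own nominee -- faked here with random numbers.
    rewards = rng.normal(size=3)
    gains += rewards
```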

  19. Bayesian Networks and Influence Diagrams

    DEFF Research Database (Denmark)

    Kjærulff, Uffe Bro; Madsen, Anders Læsø

    Bayesian Networks and Influence Diagrams: A Guide to Construction and Analysis provides a comprehensive guide for practitioners who wish to understand, construct, and analyze intelligent systems for decision support based on probabilistic networks, for tasks such as troubleshooting and data mining under uncertainty. Intended primarily for practitioners, this book does not require sophisticated mathematical skills or deep understanding of the underlying theory and methods nor does it discuss alternative technologies for reasoning under uncertainty. The theory and methods presented are illustrated through more than 140 examples, and exercises are included for the reader to check his/her level of understanding. The techniques and methods presented for knowledge elicitation, model construction and verification, modeling techniques and tricks, learning models from data, and analyses of models have all been developed and refined...

  20. Bayesian Networks and Influence Diagrams

    DEFF Research Database (Denmark)

    Kjærulff, Uffe Bro; Madsen, Anders Læsø

    Bayesian Networks and Influence Diagrams: A Guide to Construction and Analysis, Second Edition, provides a comprehensive guide for practitioners who wish to understand, construct, and analyze intelligent systems for decision support based on probabilistic networks. This new edition contains six new sections, in addition to fully-updated examples, tables, figures, and a revised appendix. Intended primarily for practitioners, this book does not require sophisticated mathematical skills or deep understanding of the underlying theory and methods nor does it discuss alternative technologies for reasoning under uncertainty. The theory and methods presented are illustrated through more than 140 examples, and exercises are included for the reader to check his or her level of understanding. The techniques and methods presented on model construction and verification, modeling techniques and tricks, learning...

  1. State Information in Bayesian Games

    CERN Document Server

    Cuff, Paul

    2009-01-01

    Two-player zero-sum repeated games are well understood. Computing the value of such a game is straightforward. Additionally, if the payoffs are dependent on a random state of the game known to one, both, or neither of the players, the resulting value of the game has been analyzed under the framework of Bayesian games. This investigation considers the optimal performance in a game when a helper is transmitting state information to one of the players. Encoding information for an adversarial setting (game) requires a different result than rate-distortion theory provides. Game theory has accentuated the importance of randomization (mixed strategy), which does not find a significant role in most communication modems and source coding codecs. Higher rates of communication, used in the right way, allow the message to include the necessary random component useful in games.

  2. Multiview Bayesian Correlated Component Analysis

    DEFF Research Database (Denmark)

    Kamronn, Simon Due; Poulsen, Andreas Trier; Hansen, Lars Kai

    2015-01-01

    Correlated component analysis as proposed by Dmochowski, Sajda, Dias, and Parra (2012) is a tool for investigating brain process similarity in the responses to multiple views of a given stimulus. Correlated components are identified under the assumption that the involved spatial networks...... are identical. Here we propose a hierarchical probabilistic model that can infer the level of universality in such multiview data, from completely unrelated representations, corresponding to canonical correlation analysis, to identical representations as in correlated component analysis. This new model, which...... we denote Bayesian correlated component analysis, evaluates favorably against three relevant algorithms in simulated data. A well-established benchmark EEG data set is used to further validate the new model and infer the variability of spatial representations across multiple subjects....

  3. Bayesian anti-sparse coding

    CERN Document Server

    Elvira, Clément; Dobigeon, Nicolas

    2015-01-01

    Sparse representations have proven their efficiency in solving a wide class of inverse problems encountered in signal and image processing. Conversely, enforcing the information to be spread uniformly over representation coefficients exhibits relevant properties in various applications such as digital communications. Anti-sparse regularization can be naturally expressed through an $\\ell_{\\infty}$-norm penalty. This paper derives a probabilistic formulation of such representations. A new probability distribution, referred to as the democratic prior, is first introduced. Its main properties as well as three random variate generators for this distribution are derived. Then this probability distribution is used as a prior to promote anti-sparsity in a Gaussian linear inverse problem, yielding a fully Bayesian formulation of anti-sparse coding. Two Markov chain Monte Carlo (MCMC) algorithms are proposed to generate samples according to the posterior distribution. The first one is a standard Gibbs sampler. The seco...

  4. Nonparametric Bayesian inference in biostatistics

    CERN Document Server

    Müller, Peter

    2015-01-01

    As chapters in this book demonstrate, BNP has important uses in clinical sciences and inference for issues like unknown partitions in genomics. Nonparametric Bayesian approaches (BNP) play an ever expanding role in biostatistical inference from use in proteomics to clinical trials. Many research problems involve an abundance of data and require flexible and complex probability models beyond the traditional parametric approaches. As this book's expert contributors show, BNP approaches can be the answer. Survival Analysis, in particular survival regression, has traditionally used BNP, but BNP's potential is now very broad. This applies to important tasks like arrangement of patients into clinically meaningful subpopulations and segmenting the genome into functionally distinct regions. This book is designed to both review and introduce application areas for BNP. While existing books provide theoretical foundations, this book connects theory to practice through engaging examples and research questions. Chapters c...

  5. Bayesian Kernel Mixtures for Counts.

    Science.gov (United States)

    Canale, Antonio; Dunson, David B

    2011-12-01

    Although Bayesian nonparametric mixture models for continuous data are well developed, there is a limited literature on related approaches for count data. A common strategy is to use a mixture of Poissons, which unfortunately is quite restrictive in not accounting for distributions having variance less than the mean. Other approaches include mixing multinomials, which requires finite support, and using a Dirichlet process prior with a Poisson base measure, which does not allow smooth deviations from the Poisson. As a broad class of alternative models, we propose to use nonparametric mixtures of rounded continuous kernels. An efficient Gibbs sampler is developed for posterior computation, and a simulation study is performed to assess performance. Focusing on the rounded Gaussian case, we generalize the modeling framework to account for multivariate count data, joint modeling with continuous and categorical variables, and other complications. The methods are illustrated through applications to a developmental toxicity study and marketing data. This article has supplementary material online. PMID:22523437
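
    A sketch of the rounding idea, assuming the simple scheme y = max(0, ⌈x⌉) applied to a latent Gaussian (the paper's thresholds may differ): counts drawn from a two-component mixture of rounded Gaussians, whose components need not be equidispersed like a Poisson.

```python
import numpy as np

rng = np.random.default_rng(6)

def rounded_gaussian(mu, sigma, size):
    # One simple rounding scheme: y = max(0, ceil(x)), x ~ N(mu, sigma^2).
    return np.maximum(np.ceil(rng.normal(mu, sigma, size)), 0).astype(int)

# Two-component mixture of rounded Gaussian kernels.
n = 10_000
z = rng.uniform(size=n) < 0.3
y = np.where(z, rounded_gaussian(1.0, 0.4, n), rounded_gaussian(8.0, 1.2, n))

print("mean:", y.mean(), "variance:", y.var())
```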

  6. Bayesian networks in educational assessment

    CERN Document Server

    Almond, Russell G; Steinberg, Linda S; Yan, Duanli; Williamson, David M

    2015-01-01

    Bayesian inference networks, a synthesis of statistics and expert systems, have advanced reasoning under uncertainty in medicine, business, and social sciences. This innovative volume is the first comprehensive treatment exploring how they can be applied to design and analyze innovative educational assessments. Part I develops Bayes nets’ foundations in assessment, statistics, and graph theory, and works through the real-time updating algorithm. Part II addresses parametric forms for use with assessment, model-checking techniques, and estimation with the EM algorithm and Markov chain Monte Carlo (MCMC). A unique feature is the volume’s grounding in Evidence-Centered Design (ECD) framework for assessment design. This “design forward” approach enables designers to take full advantage of Bayes nets’ modularity and ability to model complex evidentiary relationships that arise from performance in interactive, technology-rich assessments such as simulations. Part III describes ECD, situates Bayes nets as ...

  7. Bayesian and maximum likelihood phylogenetic analyses of protein sequence data under relative branch-length differences and model violation

    Directory of Open Access Journals (Sweden)

    Harlow Timothy J

    2005-01-01

    Full Text Available Abstract Background Bayesian phylogenetic inference holds promise as an alternative to maximum likelihood, particularly for large molecular-sequence data sets. We have investigated the performance of Bayesian inference with empirical and simulated protein-sequence data under conditions of relative branch-length differences and model violation. Results With empirical protein-sequence data, Bayesian posterior probabilities provide more-generous estimates of subtree reliability than does the nonparametric bootstrap combined with maximum likelihood inference, reaching 100% posterior probability at bootstrap proportions around 80%. With simulated 7-taxon protein-sequence datasets, Bayesian posterior probabilities are somewhat more generous than bootstrap proportions, but do not saturate. Compared with likelihood, Bayesian phylogenetic inference can be as or more robust to relative branch-length differences for datasets of this size, particularly when among-sites rate variation is modeled using a gamma distribution. When the (known) correct model was used to infer trees, Bayesian inference recovered the (known) correct tree in 100% of instances in which one or two branches were up to 20-fold longer than the others. At ratios more extreme than 20-fold, topological accuracy of reconstruction degraded only slowly when only one branch was of relatively greater length, but more rapidly when there were two such branches. Under an incorrect model of sequence change, inaccurate trees were sometimes observed at less extreme branch-length ratios, and (particularly for trees with single long branches) such trees tended to be more inaccurate. The effect of model violation on accuracy of reconstruction for trees with two long branches was more variable, but gamma-corrected Bayesian inference nonetheless yielded more-accurate trees than did either maximum likelihood or uncorrected Bayesian inference across the range of conditions we examined. Assuming an exponential

  8. Divergence history of the Carpathian and smooth newts modelled in space and time.

    Science.gov (United States)

    Zieliński, P; Nadachowska-Brzyska, K; Dudek, K; Babik, W

    2016-08-01

    Information about demographic history is essential for the understanding of the processes of divergence and speciation. Patterns of genetic variation within and between closely related species provide insights into the history of their interactions. Here, we investigated historical demography and genetic exchange between the Carpathian (Lissotriton montandoni, Lm) and smooth (L. vulgaris, Lv) newts. We combine an extensive geographical sampling and multilocus nuclear sequence data with the approximate Bayesian computation framework to test alternative scenarios of divergence and reconstruct the temporal and spatial pattern of gene flow between species. A model of recent (last glacial period) interspecific gene flow was favoured over alternative models. Thus, despite the relatively old divergence (4-6 mya) and presumably long periods of isolation, the species have retained the ability to exchange genes. Nevertheless, the low migration rates (ca. 10^-6 per gene copy per generation) are consistent with strong reproductive isolation between the species. Models allowing demographic changes were favoured, suggesting that the effective population sizes of both species at least doubled during divergence, reaching the current ca. 0.2 million in Lm and 1 million in Lv. We found asymmetry in rates of interspecific gene flow between Lm and one evolutionary lineage of Lv. We suggest that intraspecific polymorphism for hybrid incompatibilities segregating within Lv could explain this pattern and propose further tests to distinguish between alternative explanations. Our study highlights the importance of incorporating intraspecific genetic structure into the models investigating the history of divergence.

  9. Divergence history of the Carpathian and smooth newts modelled in space and time.

    Science.gov (United States)

    Zieliński, P; Nadachowska-Brzyska, K; Dudek, K; Babik, W

    2016-08-01

    Information about demographic history is essential for the understanding of the processes of divergence and speciation. Patterns of genetic variation within and between closely related species provide insights into the history of their interactions. Here, we investigated historical demography and genetic exchange between the Carpathian (Lissotriton montandoni, Lm) and smooth (L. vulgaris, Lv) newts. We combine an extensive geographical sampling and multilocus nuclear sequence data with the approximate Bayesian computation framework to test alternative scenarios of divergence and reconstruct the temporal and spatial pattern of gene flow between species. A model of recent (last glacial period) interspecific gene flow was favoured over alternative models. Thus, despite the relatively old divergence (4-6 mya) and presumably long periods of isolation, the species have retained the ability to exchange genes. Nevertheless, the low migration rates (ca. 10^-6 per gene copy per generation) are consistent with strong reproductive isolation between the species. Models allowing demographic changes were favoured, suggesting that the effective population sizes of both species at least doubled during divergence, reaching the current ca. 0.2 million in Lm and 1 million in Lv. We found asymmetry in rates of interspecific gene flow between Lm and one evolutionary lineage of Lv. We suggest that intraspecific polymorphism for hybrid incompatibilities segregating within Lv could explain this pattern and propose further tests to distinguish between alternative explanations. Our study highlights the importance of incorporating intraspecific genetic structure into the models investigating the history of divergence. PMID:27288862

  10. Bayesian models a statistical primer for ecologists

    CERN Document Server

    Hobbs, N Thompson

    2015-01-01

    Bayesian modeling has become an indispensable tool for ecological research because it is uniquely suited to deal with complexity in a statistically coherent way. This textbook provides a comprehensive and accessible introduction to the latest Bayesian methods-in language ecologists can understand. Unlike other books on the subject, this one emphasizes the principles behind the computations, giving ecologists a big-picture understanding of how to implement this powerful statistical approach. Bayesian Models is an essential primer for non-statisticians. It begins with a definition of probabili

  11. Compiling Relational Bayesian Networks for Exact Inference

    DEFF Research Database (Denmark)

    Jaeger, Manfred; Darwiche, Adnan; Chavira, Mark

    2006-01-01

    We describe in this paper a system for exact inference with relational Bayesian networks as defined in the publicly available PRIMULA tool. The system is based on compiling propositional instances of relational Bayesian networks into arithmetic circuits and then performing online inference...... by evaluating and differentiating these circuits in time linear in their size. We report on experimental results showing successful compilation and efficient inference on relational Bayesian networks, whose PRIMULA--generated propositional instances have thousands of variables, and whose jointrees have clusters...

  12. Tracking composite material damage evolution using Bayesian filtering and flash thermography data

    Science.gov (United States)

    Gregory, Elizabeth D.; Holland, Steve D.

    2016-05-01

    We propose a method for tracking the condition of a composite part using Bayesian filtering of flash thermography data over the lifetime of the part. In this demonstration, composite panels were fabricated; impacted to induce subsurface delaminations; and loaded in compression over multiple time steps, causing the delaminations to grow in size. Flash thermography data was collected between each damage event to serve as a time history of the part. The flash thermography indicated some areas of damage but provided little additional information as to the exact nature or depth of the damage. Computed tomography (CT) data was also collected after each damage event and provided a high-resolution volume model of damage that acted as truth. After each cycle, the condition estimate, from the flash thermography data and the Bayesian filter, was compared to 'ground truth'. The Bayesian process builds on the lifetime history of flash thermography scans and can give better estimates of material condition as compared to the most recent scan alone, which is common practice in the aerospace industry. Bayesian inference provides probabilistic estimates of damage condition that are updated as each new set of data becomes available. The method was tested on simulated data and then on an experimental data set.
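
    The filtering step can be pictured with a discrete Bayes filter over a handful of damage states: a predict step through an assumed degradation matrix, then a multiply-and-normalise update with the likelihood of the thermography reading. States, transition probabilities, and detection rates below are all hypothetical.

```python
import numpy as np

states = ("intact", "delaminated", "grown")       # hypothetical damage states
belief = np.array([0.90, 0.08, 0.02])             # prior belief after manufacture

# Assumed degradation model: damage never heals, may worsen per load event.
T = np.array([[0.80, 0.18, 0.02],
              [0.00, 0.85, 0.15],
              [0.00, 0.00, 1.00]])

# Assumed probability that flash thermography shows a flaw indication
# in each state (illustrative, not calibrated detection rates).
p_ind = np.array([0.05, 0.60, 0.95])

def bayes_update(belief, indication):
    belief = T.T @ belief                          # predict across the load event
    like = p_ind if indication else 1.0 - p_ind    # measurement likelihood
    belief = like * belief
    return belief / belief.sum()

for obs in (False, True, True):                    # scans after successive loads
    belief = bayes_update(belief, obs)
    print(dict(zip(states, np.round(belief, 3))))
```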

  13. The Diagnosis of Reciprocating Machinery by Bayesian Networks

    Institute of Scientific and Technical Information of China (English)

    2003-01-01

    A Bayesian network is a reasoning tool based on probability theory and has many advantages that other reasoning tools do not have. This paper discusses the basic theory of Bayesian networks and studies the problems that arise in constructing them. The paper also constructs a Bayesian diagnosis network for a reciprocating compressor. The example supports the conclusion that Bayesian diagnosis networks can diagnose reciprocating machinery effectively.

  14. Bayesian Uncertainty Analyses Via Deterministic Model

    Science.gov (United States)

    Krzysztofowicz, R.

    2001-05-01

    Rational decision-making requires that the total uncertainty about a variate of interest (a predictand) be quantified in terms of a probability distribution, conditional on all available information and knowledge. Suppose the state-of-knowledge is embodied in a deterministic model, which is imperfect and outputs only an estimate of the predictand. Fundamentals are presented of three Bayesian approaches to producing a probability distribution of the predictand via any deterministic model. The Bayesian Processor of Output (BPO) quantifies the total uncertainty in terms of a posterior distribution, conditional on model output. The Bayesian Processor of Ensemble (BPE) quantifies the total uncertainty in terms of a posterior distribution, conditional on an ensemble of model output. The Bayesian Forecasting System (BFS) decomposes the total uncertainty into input uncertainty and model uncertainty, which are characterized independently and then integrated into a predictive distribution.
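    As a toy illustration of the BPO idea (my own linear-Gaussian sketch, not Krzysztofowicz's meta-Gaussian formulation), one can calibrate the conditional distribution of the predictand given the deterministic model output from historical pairs and then issue a full predictive distribution for a new output:

```python
import numpy as np

# Toy linear-Gaussian "processor of output": all data here are synthetic.
rng = np.random.default_rng(0)
truth = rng.normal(10.0, 3.0, size=200)            # historical predictand values
output = truth + rng.normal(0.5, 1.5, size=200)    # imperfect model outputs

# Calibrate predictand | output ~ N(a + b * output, s2) by joint-Gaussian fit.
b = np.cov(truth, output)[0, 1] / np.var(output)
a = truth.mean() - b * output.mean()
s2 = np.var(truth - (a + b * output))              # residual variance

new_output = 12.0                                  # today's deterministic estimate
print("predictive mean:", a + b * new_output, "predictive sd:", np.sqrt(s2))
```

    The point of the exercise: the deterministic output is treated as data, and the total uncertainty about the predictand is carried by the calibrated conditional distribution.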

  15. Learning Bayesian networks for discrete data

    KAUST Repository

    Liang, Faming

    2009-02-01

Bayesian networks have received much attention in the recent literature. In this article, we propose an approach to learning Bayesian networks using the stochastic approximation Monte Carlo (SAMC) algorithm. Our approach has two nice features. First, it possesses a self-adjusting mechanism and thus essentially avoids the local-trap problem suffered by conventional MCMC simulation-based approaches to learning Bayesian networks. Second, it falls into the class of dynamic importance sampling algorithms; the network features can be inferred by dynamically weighted averaging of the samples generated in the learning process, and the resulting estimates can have much lower variation than single-model-based estimates. The numerical results indicate that our approach can mix much faster over the space of Bayesian networks than conventional MCMC simulation-based approaches. © 2008 Elsevier B.V. All rights reserved.

  16. A Bayesian approach to model uncertainty

    International Nuclear Information System (INIS)

    A Bayesian approach to model uncertainty is taken. For the case of a finite number of alternative models, the model uncertainty is equivalent to parameter uncertainty. A derivation based on Savage's partition problem is given
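    Stated as a formula (the standard finite model-averaging identity the abstract alludes to, in my notation): for candidate models M_1, ..., M_K with parameters θ,

```latex
p(\theta \mid y) \;=\; \sum_{k=1}^{K} p(\theta \mid y, M_k)\, p(M_k \mid y),
\qquad
p(M_k \mid y) \;=\; \frac{p(y \mid M_k)\, p(M_k)}{\sum_{j=1}^{K} p(y \mid M_j)\, p(M_j)},
```

    so the discrete model index behaves exactly like one more unknown parameter.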

  17. Bayesian Control for Concentrating Mixed Nuclear Waste

    OpenAIRE

    Welch, Robert L.; Smith, Clayton

    2013-01-01

    A control algorithm for batch processing of mixed waste is proposed based on conditional Gaussian Bayesian networks. The network is compiled during batch staging for real-time response to sensor input.

  18. An Intuitive Dashboard for Bayesian Network Inference

    International Nuclear Information System (INIS)

Current Bayesian network software packages provide a good graphical interface for users who design and develop Bayesian networks for various applications. However, the intended end-users of these networks may not necessarily find such an interface appealing, and at times it can be overwhelming, particularly when the number of nodes in the network is large. To circumvent this problem, this paper presents an intuitive dashboard, which provides an additional layer of abstraction, enabling the end-users to easily perform inferences over the Bayesian networks. Unlike most software packages, which display the nodes and arcs of the network, the developed tool organises the nodes based on the cause-and-effect relationship, making the user interaction more intuitive and friendly. In addition to performing various types of inferences, the users can conveniently use the tool to verify the behaviour of the developed Bayesian network. The tool has been developed using the Qt and SMILE libraries in C++.

  19. An Intuitive Dashboard for Bayesian Network Inference

    Science.gov (United States)

    Reddy, Vikas; Charisse Farr, Anna; Wu, Paul; Mengersen, Kerrie; Yarlagadda, Prasad K. D. V.

    2014-03-01

Current Bayesian network software packages provide a good graphical interface for users who design and develop Bayesian networks for various applications. However, the intended end-users of these networks may not necessarily find such an interface appealing, and at times it can be overwhelming, particularly when the number of nodes in the network is large. To circumvent this problem, this paper presents an intuitive dashboard, which provides an additional layer of abstraction, enabling the end-users to easily perform inferences over the Bayesian networks. Unlike most software packages, which display the nodes and arcs of the network, the developed tool organises the nodes based on the cause-and-effect relationship, making the user interaction more intuitive and friendly. In addition to performing various types of inferences, the users can conveniently use the tool to verify the behaviour of the developed Bayesian network. The tool has been developed using the Qt and SMILE libraries in C++.

  20. Radiation dose reduction in computed tomography perfusion using spatial-temporal Bayesian methods

    Science.gov (United States)

    Fang, Ruogu; Raj, Ashish; Chen, Tsuhan; Sanelli, Pina C.

    2012-03-01

In current computed tomography (CT) examinations, the associated X-ray radiation dose is of significant concern to patients and operators, especially in CT perfusion (CTP) imaging, which incurs a higher radiation dose due to its cine scanning technique. A simple and cost-effective means to perform the examinations is to lower the milliampere-seconds (mAs) parameter as far as reasonably achievable in data acquisition. However, lowering the mAs parameter will unavoidably increase data noise and greatly degrade the CT perfusion maps if no adequate noise control is applied during image reconstruction. To capture the essential dynamics of CT perfusion, a simple spatial-temporal Bayesian method that uses a piecewise parametric model of the residual function is used, and the model parameters are then estimated from a Bayesian formulation with prior smoothness constraints on the perfusion parameters. From the fitted residual function, reliable CTP parameter maps are obtained from low-dose CT data. The merit of this scheme lies in the combination of an analytical piecewise residual function with a Bayesian framework using a simple spatial prior constraint for the CT perfusion application. On a dataset of 22 patients, this dynamic spatial-temporal Bayesian model yielded an increase in signal-to-noise ratio (SNR) of 78% and a decrease in mean-square error (MSE) of 40% at a low radiation dose of 43 mA.

  1. Bayesian approach to inverse problems for functions with a variable-index Besov prior

    Science.gov (United States)

    Jia, Junxiong; Peng, Jigen; Gao, Jinghuai

    2016-08-01

    The Bayesian approach has been adopted to solve inverse problems that reconstruct a function from noisy observations. Prior measures play a key role in the Bayesian method. Hence, many probability measures have been proposed, among which total variation (TV) is a well-known prior measure that can preserve sharp edges. However, it has two drawbacks, the staircasing effect and a lack of the discretization-invariant property. The variable-index TV prior has been proposed and analyzed in the area of image analysis for the former, and the Besov prior has been employed recently for the latter. To overcome both issues together, in this paper, we present a variable-index Besov prior measure, which is a non-Gaussian measure. Some useful properties of this new prior measure have been proven for functions defined on a torus. We have also generalized Bayesian inverse theory in infinite dimensions for our new setting. Finally, this theory has been applied to integer- and fractional-order backward diffusion problems. To the best of our knowledge, this is the first time that the Bayesian approach has been used for the fractional-order backward diffusion problem, which provides an opportunity to quantify its uncertainties.

  2. CURRENT CONCEPTS IN ACL RECONSTRUCTION

    Directory of Open Access Journals (Sweden)

    Freddie H. Fu

    2008-09-01

Full Text Available Current Concepts in ACL Reconstruction is a complete reference text composed of the most thorough collection of topics on the ACL and its surgical reconstruction, with contributions from some of the world's foremost experts and most experienced ACL surgeons. Various procedures mentioned throughout the text are also demonstrated in an accompanying video CD-ROM. PURPOSE By composing a single, comprehensive information source on the ACL, including basic sciences, clinical issues, the latest concepts and surgical techniques, from evaluation to outcome and from history to future, the editors and contributors aim to keep the audience abreast of the latest concepts and techniques for the evaluation and treatment of ACL injuries. FEATURES The text is composed of 27 chapters in 6 sections. The first section covers basic sciences as well as the history of the ACL, imaging, and the clinical approach to adolescent and pediatric patients. The second section covers graft choices and arthroscopy portals for ACL reconstruction. The third section addresses the technique and outcome of single-bundle ACL reconstruction, and the fourth the techniques and outcome of double-bundle ACL reconstruction. The fifth section covers revision, navigation technology, rehabilitation and the evaluation of the outcome of ACL reconstruction. The sixth and last section looks at future advances: What We Have Learned and the Future of ACL Reconstruction. AUDIENCE Orthopedic residents, sports traumatology and knee surgery fellows, and orthopedic surgeons, as well as scientists in basic sciences or clinicians studying or planning research on the ACL, form the audience of this book. ASSESSMENT This is the latest, most complete and comprehensive textbook of ACL reconstruction, produced by the editorial work of two pioneers and masters, "Freddie H. Fu MD and Steven B. Cohen MD", with the contribution of world

  3. Nomograms for Visualization of Naive Bayesian Classifier

    OpenAIRE

    Možina, Martin; Demšar, Janez; Michael W Kattan; Zupan, Blaz

    2004-01-01

    Besides good predictive performance, the naive Bayesian classifier can also offer a valuable insight into the structure of the training data and effects of the attributes on the class probabilities. This structure may be effectively revealed through visualization of the classifier. We propose a new way to visualize the naive Bayesian model in the form of a nomogram. The advantages of the proposed method are simplicity of presentation, clear display of the effects of individual attribute value...

  4. Subjective Bayesian Analysis: Principles and Practice

    OpenAIRE

    Goldstein, Michael

    2006-01-01

    We address the position of subjectivism within Bayesian statistics. We argue, first, that the subjectivist Bayes approach is the only feasible method for tackling many important practical problems. Second, we describe the essential role of the subjectivist approach in scientific analysis. Third, we consider possible modifications to the Bayesian approach from a subjectivist viewpoint. Finally, we address the issue of pragmatism in implementing the subjectivist approach.

  5. Bayesian Analysis of Multivariate Probit Models

    OpenAIRE

    Siddhartha Chib; Edward Greenberg

    1996-01-01

    This paper provides a unified simulation-based Bayesian and non-Bayesian analysis of correlated binary data using the multivariate probit model. The posterior distribution is simulated by Markov chain Monte Carlo methods, and maximum likelihood estimates are obtained by a Markov chain Monte Carlo version of the E-M algorithm. Computation of Bayes factors from the simulation output is also considered. The methods are applied to a bivariate data set, to a 534-subject, four-year longitudinal dat...

  6. Fitness inheritance in the Bayesian optimization algorithm

    OpenAIRE

    Pelikan, Martin; Sastry, Kumara

    2004-01-01

    This paper describes how fitness inheritance can be used to estimate fitness for a proportion of newly sampled candidate solutions in the Bayesian optimization algorithm (BOA). The goal of estimating fitness for some candidate solutions is to reduce the number of fitness evaluations for problems where fitness evaluation is expensive. Bayesian networks used in BOA to model promising solutions and generate the new ones are extended to allow not only for modeling and sampling candidate solutions...

  7. Kernel Bayesian Inference with Posterior Regularization

    OpenAIRE

    Song, Yang; Jun ZHU; Ren, Yong

    2016-01-01

    We propose a vector-valued regression problem whose solution is equivalent to the reproducing kernel Hilbert space (RKHS) embedding of the Bayesian posterior distribution. This equivalence provides a new understanding of kernel Bayesian inference. Moreover, the optimization problem induces a new regularization for the posterior embedding estimator, which is faster and has comparable performance to the squared regularization in kernel Bayes' rule. This regularization coincides with a former th...

  8. Bayesian Classification in Medicine: The Transferability Question *

    OpenAIRE

    Zagoria, Ronald J.; Reggia, James A.; Price, Thomas R.; Banko, Maryann

    1981-01-01

    Using probabilities derived from a geographically distant patient population, we applied Bayesian classification to categorize stroke patients by etiology. Performance was assessed both by error rate and with a new linear accuracy coefficient. This approach to patient classification was found to be surprisingly accurate when compared to classification by two neurologists and to classification by the Bayesian method using “low cost” local and subjective probabilities. We conclude that for some...

  9. Bayesian target tracking based on particle filter

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

To be able to deal with nonlinear or non-Gaussian problems, particle filters have been studied by many researchers. Building on the particle filter, the extended Kalman filter (EKF) proposal function is applied to Bayesian target tracking. Novel techniques such as the Markov chain Monte Carlo (MCMC) move step and resampling are also introduced into Bayesian target tracking, and the simulation results confirm that the particle filter improved with these techniques outperforms the basic one.
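    For orientation, here is a minimal bootstrap particle filter in Python (a sketch with the plain transition-prior proposal; the EKF-proposal and MCMC-move variants in the abstract refine this baseline). The state and measurement models and the data are toy choices of mine.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 1000                                         # number of particles

def f(x):  return 0.5 * x + 25 * x / (1 + x**2)  # toy state transition
def h(x):  return x**2 / 20                      # toy measurement model

particles = rng.normal(0.0, 2.0, N)
weights = np.full(N, 1.0 / N)
for z in [2.1, 4.7, 1.3, 0.2]:                   # made-up measurements
    particles = f(particles) + rng.normal(0.0, 1.0, N)     # propagate (proposal)
    weights *= np.exp(-0.5 * (z - h(particles))**2 / 0.5)  # weight by likelihood
    weights /= weights.sum()
    if 1.0 / np.sum(weights**2) < N / 2:         # resample when degenerate
        idx = rng.choice(N, size=N, p=weights)
        particles, weights = particles[idx], np.full(N, 1.0 / N)
    print("state estimate:", np.sum(weights * particles))
```

    Replacing the transition-prior proposal with an EKF-derived Gaussian proposal, as the paper does, concentrates particles where the likelihood is high and reduces weight degeneracy.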

  10. Bayesian Variable Selection in Spatial Autoregressive Models

    OpenAIRE

    Jesus Crespo Cuaresma; Philipp Piribauer

    2015-01-01

    This paper compares the performance of Bayesian variable selection approaches for spatial autoregressive models. We present two alternative approaches which can be implemented using Gibbs sampling methods in a straightforward way and allow us to deal with the problem of model uncertainty in spatial autoregressive models in a flexible and computationally efficient way. In a simulation study we show that the variable selection approaches tend to outperform existing Bayesian model averaging tech...

  11. Fuzzy Functional Dependencies and Bayesian Networks

    Institute of Scientific and Technical Information of China (English)

    LIU WeiYi(刘惟一); SONG Ning(宋宁)

    2003-01-01

Bayesian networks have become a popular technique for representing and reasoning with probabilistic information. The fuzzy functional dependency is an important kind of data dependency in relational databases with fuzzy values. The purpose of this paper is to establish a connection between these data dependencies and Bayesian networks. The connection is made through a set of methods that enable one to extract the most conditional-independence information from fuzzy functional dependencies.

  12. Bayesian Models of Brain and Behaviour

    OpenAIRE

    Penny, William

    2012-01-01

    This paper presents a review of Bayesian models of brain and behaviour. We first review the basic principles of Bayesian inference. This is followed by descriptions of sampling and variational methods for approximate inference, and forward and backward recursions in time for inference in dynamical models. The review of behavioural models covers work in visual processing, sensory integration, sensorimotor integration, and collective decision making. The review of brain models covers a range of...

  13. Bayesian Modeling of a Human MMORPG Player

    CERN Document Server

    Synnaeve, Gabriel

    2010-01-01

This paper describes an application of Bayesian programming to the control of an autonomous avatar in a multiplayer role-playing game (the example is based on World of Warcraft). We model a particular task, which consists of choosing what to do and which target to select in a situation where allies and foes are present. We explain the model in Bayesian programming and show how we could learn the conditional probabilities from data gathered during human-played sessions.

  14. Bayesian Modeling of a Human MMORPG Player

    Science.gov (United States)

    Synnaeve, Gabriel; Bessière, Pierre

    2011-03-01

This paper describes an application of Bayesian programming to the control of an autonomous avatar in a multiplayer role-playing game (the example is based on World of Warcraft). We model a particular task, which consists of choosing what to do and which target to select in a situation where allies and foes are present. We explain the model in Bayesian programming and show how we could learn the conditional probabilities from data gathered during human-played sessions.

  15. Computational Imaging for VLBI Image Reconstruction

    CERN Document Server

    Bouman, Katherine L; Zoran, Daniel; Fish, Vincent L; Doeleman, Sheperd S; Freeman, William T

    2015-01-01

    Very long baseline interferometry (VLBI) is a technique for imaging celestial radio emissions by simultaneously observing a source from telescopes distributed across Earth. The challenges in reconstructing images from fine angular resolution VLBI data are immense. The data is extremely sparse and noisy, thus requiring statistical image models such as those designed in the computer vision community. In this paper we present a novel Bayesian approach for VLBI image reconstruction. While other methods require careful tuning and parameter selection for different types of images, our method is robust and produces good results under different settings such as low SNR or extended emissions. The success of our method is demonstrated on realistic synthetic experiments as well as publicly available real data. We present this problem in a way that is accessible to members of the computer vision community, and provide a dataset website (vlbiimaging.csail.mit.edu) to allow for controlled comparisons across algorithms. Thi...

  16. Quantum State Reconstruction From Incomplete Data

    CERN Document Server

    Buzek, V; Derka, R; Adam, G; Wiedemann, H

    1998-01-01

Knowing and guessing, these are two essential epistemological pillars in the theory of quantum-mechanical measurement. As formulated, quantum mechanics is a statistical theory. In general, a priori unknown states can be completely determined only when measurements on infinite ensembles of identically prepared quantum systems are performed. But how can one estimate (guess) a quantum state when only incomplete data are available (known)? What is the most reliable estimation based on given measured data? What is the optimal measurement when only a finite number of identically prepared quantum objects is available? These are some of the questions we address. We present several schemes for the reconstruction of states of quantum systems from measured data: (1) We show how the maximum entropy (MaxEnt) principle can be efficiently used for an estimation of quantum states on incomplete observation levels. (2) We show how Bayesian inference can be used for reconstruction of quantum states when only a finite number ...

  17. Reconstructive Urology

    Directory of Open Access Journals (Sweden)

    Fikret Fatih Önol

    2014-11-01

Full Text Available In the treatment of urethral stricture, buccal mucosa graft (BMG) reconstruction is applied with different patch techniques. In this frequently preferred approach, the BMG is placed as a "ventral onlay" in bulbar urethral strictures, where the thicker spongiosum supports and nourishes the graft, and as a "dorsal inlay" anastomosis in the pendulous urethra, where the spongiosum is thinner. In their study, Cordon et al. compared conventional BMG "onlay" urethroplasty with "pseudo-spongioplasty", which relies on closing periurethral vascular tissues onto the graft to nourish it. In the repair of anterior urethras where spongiosal supportive tissue is insufficient, this method is defined as mobilizing the surrounding dartos and Buck's fascia and joining them over the BMG patch. Between 2007 and 2012, 56 patients treated with conventional "ventral onlay" BMG urethroplasty and 46 patients treated with "pseudo-spongioplasty" were reported to have similar success rates (80% vs. 84%) over a 3.5-year average follow-up. While 74% of the patients who underwent pseudo-spongioplasty had disease of the distal urethra (pendulous, bulbopendulous), 82% of the patients who underwent conventional onlay urethroplasty had strictures of the proximal (bulbar) urethra. The stricture length in the pseudo-spongioplasty group was also longer, in a statistically significant way (5.8 cm vs. 4.7 cm on average, p=0.028). This study by Cordon et al. shows that, in conditions where conventional spongioplasty is not possible, periurethral vascular tissues are adequate to nourish the BMG. Even though it is an important technique that brings a new point of view to today's practice, data about complications that may appear after the use of pseudo-spongioplasty on long distal strictures (e.g. the appearance of urethral diverticula) were not reported. Along with this, we think that, providing an opportunity to patch directly

  18. ABCtoolbox: a versatile toolkit for approximate Bayesian computations

    Directory of Open Access Journals (Sweden)

    Neuenschwander Samuel

    2010-03-01

Full Text Available Abstract Background The estimation of demographic parameters from genetic data often requires the computation of likelihoods. However, the likelihood function is computationally intractable for many realistic evolutionary models, and the use of Bayesian inference has therefore been limited to very simple models. The situation changed recently with the advent of Approximate Bayesian Computation (ABC) algorithms allowing one to obtain parameter posterior distributions based on simulations not requiring likelihood computations. Results Here we present ABCtoolbox, a series of open source programs to perform Approximate Bayesian Computations (ABC). It implements various ABC algorithms including rejection sampling, MCMC without likelihood, a particle-based sampler and ABC-GLM. ABCtoolbox is bundled with, but not limited to, a program that allows parameter inference in a population genetics context and the simultaneous use of different types of markers with different ploidy levels. In addition, ABCtoolbox can also interact with most simulation and summary statistics computation programs. The usability of ABCtoolbox is demonstrated by inferring the evolutionary history of two evolutionary lineages of Microtus arvalis. Using nuclear microsatellites and mitochondrial sequence data in the same estimation procedure enabled us to infer sex-specific population sizes and migration rates and to find that males show smaller population sizes but much higher levels of migration than females. Conclusion ABCtoolbox allows a user to perform all the necessary steps of a full ABC analysis, from parameter sampling from prior distributions, data simulations, computation of summary statistics, estimation of posterior distributions, model choice, and validation of the estimation procedure, to visualization of the results.
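    The simplest of the algorithms listed, rejection-sampling ABC, fits in a few lines. The sketch below is a generic Python illustration (toy model, summary statistics, and tolerance of my own choosing), not ABCtoolbox itself:

```python
import numpy as np

# Rejection-sampling ABC on a toy one-parameter model (illustrative).
rng = np.random.default_rng(2)
observed = rng.normal(3.0, 1.0, 100)                 # stand-in observed data
s_obs = np.array([observed.mean(), observed.std()])  # summary statistics

accepted = []
for _ in range(100_000):
    mu = rng.uniform(-10, 10)                        # draw from the prior
    sim = rng.normal(mu, 1.0, 100)                   # simulate a data set
    s_sim = np.array([sim.mean(), sim.std()])
    if np.linalg.norm(s_sim - s_obs) < 0.3:          # tolerance epsilon
        accepted.append(mu)                          # approx. posterior draw
print("approximate posterior mean:", np.mean(accepted))
```

    Accepted parameter draws are those whose simulated summaries land within the tolerance of the observed summaries; they approximate draws from the posterior without any likelihood evaluation.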

  19. Iterative PET Image Reconstruction Using Translation Invariant Wavelet Transform.

    OpenAIRE

    Zhou, Jian; Senhadji, Lotfi; Coatrieux, Jean-Louis; Luo, Limin

    2009-01-01

    The present work describes a Bayesian maximum a posteriori (MAP) method using a statistical multiscale wavelet prior model. Rather than using the orthogonal discrete wavelet transform (DWT), this prior is built on the translation invariant wavelet transform (TIWT). The statistical modeling of wavelet coefficients relies on the generalized Gaussian distribution. Image reconstruction is performed in spatial domain with a fast block sequential iteration algorithm. We study theoretically the TIWT...

  20. New insights into the hepatitis E virus genotype 3 phylodynamics and evolutionary history.

    Science.gov (United States)

    Mirazo, Santiago; Mir, Daiana; Bello, Gonzalo; Ramos, Natalia; Musto, Héctor; Arbiza, Juan

    2016-09-01

Hepatitis E virus (HEV) is an emergent hepatotropic virus endemic mainly in Asia and other developing areas. However, in the last decade it has been increasingly reported in high-income countries. Human-infecting HEV strains are currently classified into four genotypes (1-4). Genotype 3 (HEV-3) is the prevalent virus genotype and the one most associated with autochthonous and sporadic cases of HEV in developed areas. The evolutionary history of HEV worldwide remains largely unknown. In this study we reconstructed the spatiotemporal and population dynamics of HEV-3 at global scale, with particular emphasis on South America, where case reports have increased dramatically in recent years. To achieve this, we applied a Bayesian coalescent-based approach to a comprehensive data set comprising 97 GenBank HEV-3 sequences for which the location and sampling date were documented. Our phylogenetic analyses suggest that the worldwide genetic diversity of HEV-3 can be grouped into two main clades (I and II) with a time to the most recent common ancestor (tMRCA) dated to approximately 320 years ago (95% HPD: 236-420 years), and that a unique independent introduction of HEV-3 seems to have occurred in Uruguay, where most of the human HEV cases in South America have been described. The phylodynamic inference indicates that the population size of this virus underwent substantial temporal variations after the second half of the 20th century. In this sense, and conversely to what has been postulated to date, we suggest that the worldwide effective population size of HEV-3 is not decreasing, and that sources of error in its estimates frequently stem from the assumption that the analyzed sequences are derived from a single panmictic population. Novel insights into the global population dynamics of HEV are given. Additionally, this work constitutes an attempt to further describe, in a Bayesian coalescent framework, the phylodynamics and evolutionary history of HEV-3 in the South American region. PMID:27264728

  1. Mid-Holocene decline in African buffalos inferred from Bayesian coalescence-based analyses of microsatellites and mitochondrial DNA

    DEFF Research Database (Denmark)

    Heller, Rasmus; Lorenzen, Eline D.; Okello, J.B.A;

    2008-01-01

    pandemic in the late 1800s, but little is known about the earlier demographic history of the species. We analysed genetic variation at 17 microsatellite loci and a 302-bp fragment of the mitochondrial DNA control region to infer past demographic changes in buffalo populations from East Africa. Two Bayesian...

  2. Bayesian statistical ionospheric tomography improved by incorporating ionosonde measurements

    Science.gov (United States)

    Norberg, Johannes; Virtanen, Ilkka I.; Roininen, Lassi; Vierinen, Juha; Orispää, Mikko; Kauristie, Kirsti; Lehtinen, Markku S.

    2016-04-01

    We validate two-dimensional ionospheric tomography reconstructions against EISCAT incoherent scatter radar measurements. Our tomography method is based on Bayesian statistical inversion with prior distribution given by its mean and covariance. We employ ionosonde measurements for the choice of the prior mean and covariance parameters and use the Gaussian Markov random fields as a sparse matrix approximation for the numerical computations. This results in a computationally efficient tomographic inversion algorithm with clear probabilistic interpretation. We demonstrate how this method works with simultaneous beacon satellite and ionosonde measurements obtained in northern Scandinavia. The performance is compared with results obtained with a zero-mean prior and with the prior mean taken from the International Reference Ionosphere 2007 model. In validating the results, we use EISCAT ultra-high-frequency incoherent scatter radar measurements as the ground truth for the ionization profile shape. We find that in comparison to the alternative prior information sources, ionosonde measurements improve the reconstruction by adding accurate information about the absolute value and the altitude distribution of electron density. With an ionosonde at continuous disposal, the presented method enhances stand-alone near-real-time ionospheric tomography for the given conditions significantly.
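    The prior-mean/prior-covariance machinery described here is the standard linear-Gaussian inversion. As a generic sketch (toy forward operator and data, not the paper's GMRF implementation):

```python
import numpy as np

# Linear-Gaussian Bayesian inversion: y = A m + e, prior m ~ N(m0, C),
# noise e ~ N(0, R). All matrices and data below are synthetic stand-ins.
rng = np.random.default_rng(3)
n, k = 50, 20                          # measurements, unknowns
A = rng.normal(size=(n, k))            # forward (ray-geometry) operator
m0 = np.zeros(k)                       # prior mean (e.g. ionosonde-based profile)
C = np.eye(k)                          # prior covariance
R = 0.1 * np.eye(n)                    # measurement-noise covariance
y = A @ rng.normal(size=k) + rng.multivariate_normal(np.zeros(n), R)

S = A @ C @ A.T + R                    # data-space covariance
K = C @ A.T @ np.linalg.inv(S)         # gain
m_post = m0 + K @ (y - A @ m0)         # posterior mean (the reconstruction)
C_post = C - K @ A @ C                 # posterior covariance (uncertainty)
```

    In the paper, the prior mean and covariance parameters come from ionosonde measurements, and the covariance is approximated with a sparse Gaussian Markov random field so the solve scales to tomographic problem sizes.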

  3. Bayesian redshift-space distortions correction from galaxy redshift surveys

    Science.gov (United States)

    Kitaura, Francisco-Shu; Ata, Metin; Angulo, Raul E.; Chuang, Chia-Hsun; Rodríguez-Torres, Sergio; Monteagudo, Carlos Hernández; Prada, Francisco; Yepes, Gustavo

    2016-03-01

    We present a Bayesian reconstruction method which maps a galaxy distribution from redshift- to real-space inferring the distances of the individual galaxies. The method is based on sampling density fields assuming a lognormal prior with a likelihood modelling non-linear stochastic bias. Coherent redshift-space distortions are corrected in a Gibbs-sampling procedure by moving the galaxies from redshift- to real-space according to the peculiar motions derived from the recovered density field using linear theory. The virialized distortions are corrected by sampling candidate real-space positions along the line of sight, which are compatible with the bulk flow corrected redshift-space position adding a random dispersion term in high-density collapsed regions (defined by the eigenvalues of the Hessian). This approach presents an alternative method to estimate the distances to galaxies using the three-dimensional spatial information, and assuming isotropy. Hence the number of applications is very broad. In this work, we show the potential of this method to constrain the growth rate up to k ˜ 0.3 h Mpc-1. Furthermore it could be useful to correct for photometric redshift errors, and to obtain improved baryon acoustic oscillations (BAO) reconstructions.

  4. Bayesian inference for OPC modeling

    Science.gov (United States)

    Burbine, Andrew; Sturtevant, John; Fryer, David; Smith, Bruce W.

    2016-03-01

The use of optical proximity correction (OPC) demands increasingly accurate models of the photolithographic process. Model building and inference techniques in the data science community have seen great strides in the past two decades, making better use of available information. This paper aims to demonstrate the predictive power of Bayesian inference as a method for parameter selection in lithographic models by quantifying the uncertainty associated with model inputs and wafer data. Specifically, the method combines the model builder's prior information about each modelling assumption with the maximization of each observation's likelihood as a Student's t-distributed random variable. Through the use of a Markov chain Monte Carlo (MCMC) algorithm, a model's parameter space is explored to find the most credible parameter values. During parameter exploration, the parameters' posterior distributions are generated by applying Bayes' rule, using a likelihood function and the a priori knowledge supplied. The MCMC algorithm used, an affine invariant ensemble sampler (AIES), is implemented by initializing many walkers which semi-independently explore the space. The convergence of these walkers to global maxima of the likelihood volume determines the parameter values' highest density intervals (HDI), revealing champion models. We show that this method of parameter selection provides insights into the data that traditional methods do not, and outline continued experiments to vet the method.
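    The AIES is the algorithm popularized by the emcee Python package. A toy version of the setup described (a single parameter, Student's t likelihood, flat prior, synthetic data, all my own choices rather than the paper's model) might look like:

```python
import numpy as np
import emcee                     # affine-invariant ensemble sampler (AIES)
from scipy import stats

# Synthetic data: a single offset parameter observed with heavy-tailed noise.
rng = np.random.default_rng(4)
data = 2.0 + rng.standard_t(df=5, size=50)

def log_prob(theta):
    if abs(theta[0]) > 100:      # flat prior on [-100, 100]
        return -np.inf
    return stats.t.logpdf(data - theta[0], df=5).sum()  # Student's t likelihood

nwalkers, ndim = 32, 1
p0 = rng.normal(0.0, 1.0, size=(nwalkers, ndim))        # initialize the walkers
sampler = emcee.EnsembleSampler(nwalkers, ndim, log_prob)
sampler.run_mcmc(p0, 2000, progress=False)
samples = sampler.get_chain(discard=500, flat=True)     # drop burn-in, flatten
print("posterior median:", np.median(samples))
```

    The flattened chain can then be summarized into highest-density intervals for each parameter, which is how the paper reports its champion models.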

  5. Bayesian analysis of volcanic eruptions

    Science.gov (United States)

    Ho, Chih-Hsiang

    1990-10-01

The simple Poisson model generally gives a good fit to many volcanoes for volcanic eruption forecasting. Nonetheless, empirical evidence suggests that volcanic activity in successive equal time periods tends to be more variable than a simple Poisson process with constant eruptive rate. An alternative model is therefore examined in which the eruptive rate (λ) for a given volcano or cluster(s) of volcanoes is described by a gamma distribution (prior) rather than treated as a constant value, as in the assumptions of the simple Poisson model. Bayesian analysis is performed to link the two distributions together to give the aggregate behavior of the volcanic activity. When the Poisson process is expanded to accommodate a gamma mixing distribution on λ, a consequence of this mixed (or compound) Poisson model is that the frequency distribution of eruptions in any given time period of equal length follows the negative binomial distribution (NBD). Applications of the proposed model and comparisons between the generalized model and the simple Poisson model are discussed based on the historical eruptive count data of volcanoes Mauna Loa (Hawaii) and Etna (Italy). Several relevant facts lead to the conclusion that the generalized model is preferable for practical use both in space and time.
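    For concreteness, the gamma-Poisson mixture integrates out λ as follows (assuming a Gamma(α, β) prior in the shape-rate parameterization, my notation, and an observation window of length t):

```latex
P(N=n) \;=\; \int_0^{\infty} \frac{(\lambda t)^{n} e^{-\lambda t}}{n!}\,
\frac{\beta^{\alpha}}{\Gamma(\alpha)}\,\lambda^{\alpha-1} e^{-\beta\lambda}\, d\lambda
\;=\; \frac{\Gamma(n+\alpha)}{n!\,\Gamma(\alpha)}
\left(\frac{\beta}{\beta+t}\right)^{\!\alpha}
\left(\frac{t}{\beta+t}\right)^{\!n},
```

    which is exactly the negative binomial distribution (NBD) mentioned in the abstract; its extra variance relative to the Poisson reflects the uncertainty in the eruptive rate.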

  6. BAYESIAN APPROACH OF DECISION PROBLEMS

    Directory of Open Access Journals (Sweden)

    DRAGOŞ STUPARU

    2010-01-01

Full Text Available Management is nowadays a basic vector of economic development, a concept frequently used in our country as well as all over the world. Regardless of the hierarchical level at which the managerial process takes place, decision represents its essential moment, the supreme act of managerial activity. Decisions can be met in all fields of activity, with a practically unlimited degree of coverage, and in all the functions of management. It is common knowledge that the activity of any manager, no matter the hierarchical level he occupies, represents a chain of interdependent decisions, their aim being the elimination or limitation of the influence of disturbing factors that may endanger the achievement of predetermined objectives, and that the quality of managerial decisions conditions the progress and viability of any enterprise. Therefore, one of the principal characteristics of a successful manager is the ability to adopt optimal, high-quality decisions. The quality of managerial decisions is conditioned by the manager's general level of education and specialization, the manner in which he is preoccupied with assimilating the latest information and innovations in the theory and practice of management, and the application of modern managerial methods and techniques in management activity. Below we present an analysis of decision problems under hazard conditions in terms of Bayesian theory, a theory that uses probabilistic calculus.

  7. Bayesian Inference of Reticulate Phylogenies under the Multispecies Network Coalescent.

    Science.gov (United States)

    Wen, Dingqiao; Yu, Yun; Nakhleh, Luay

    2016-05-01

    The multispecies coalescent (MSC) is a statistical framework that models how gene genealogies grow within the branches of a species tree. The field of computational phylogenetics has witnessed an explosion in the development of methods for species tree inference under MSC, owing mainly to the accumulating evidence of incomplete lineage sorting in phylogenomic analyses. However, the evolutionary history of a set of genomes, or species, could be reticulate due to the occurrence of evolutionary processes such as hybridization or horizontal gene transfer. We report on a novel method for Bayesian inference of genome and species phylogenies under the multispecies network coalescent (MSNC). This framework models gene evolution within the branches of a phylogenetic network, thus incorporating reticulate evolutionary processes, such as hybridization, in addition to incomplete lineage sorting. As phylogenetic networks with different numbers of reticulation events correspond to points of different dimensions in the space of models, we devise a reversible-jump Markov chain Monte Carlo (RJMCMC) technique for sampling the posterior distribution of phylogenetic networks under MSNC. We implemented the methods in the publicly available, open-source software package PhyloNet and studied their performance on simulated and biological data. The work extends the reach of Bayesian inference to phylogenetic networks and enables new evolutionary analyses that account for reticulation. PMID:27144273

  8. Analysing uncertainties: Towards comparing Bayesian and interval probabilities

    Science.gov (United States)

    Blockley, David

    2013-05-01

Two assumptions, commonly made in risk and reliability studies, have a long history. The first is that uncertainty is either aleatory or epistemic. The second is that standard probability theory is sufficient to express uncertainty. The purposes of this paper are to provide a conceptual analysis of uncertainty and to compare Bayesian approaches with interval approaches, with an example relevant to research on climate change. The analysis reveals that the categorisation of uncertainty as either aleatory or epistemic is unsatisfactory for practical decision making. It is argued that uncertainty emerges from three conceptually distinctive and orthogonal attributes, FIR, i.e., fuzziness, incompleteness (epistemic) and randomness (aleatory). Characterisations of uncertainty, such as ambiguity, dubiety and conflict, are complex mixes of interactions in an FIR space. To manage future risks in complex systems it will be important to recognise the extent to which we 'don't know' about possible unintended and unwanted consequences, or unknown-unknowns. In this way we may be more alert to unexpected hazards. The Bayesian approach is compared with an interval probability approach to show one way in which conflict due to incomplete information can be managed.

  9. Bayesian Approach for Reliability Assessment of Sunshield Deployment on JWST

    Science.gov (United States)

    Kaminskiy, Mark P.; Evans, John W.; Gallo, Luis D.

    2013-01-01

Deployable subsystems are essential to the mission success of most spacecraft. These subsystems enable critical functions including power, communications and thermal control. The loss of any of these functions will generally result in loss of the mission. These subsystems and their components often consist of unique designs and applications, for which various standardized data sources are not applicable for estimating reliability and for assessing risks. In this study, a Bayesian approach for reliability estimation of spacecraft deployment was developed for this purpose. This approach was then applied to the James Webb Space Telescope (JWST) Sunshield subsystem, a unique design intended for thermal control of the observatory's telescope and science instruments. In order to collect prior information on deployable systems, detailed studies of "heritage information" were conducted, extending over 45 years of spacecraft launches. The NASA Goddard Space Flight Center (GSFC) Spacecraft Operational Anomaly and Reporting System (SOARS) data were then used to estimate the parameters of the conjugate beta prior distribution for anomaly and failure occurrence, as the most consistent set of available data that could be matched to launch histories. This allows for an empirical Bayesian prediction of the risk of an anomaly occurring during the complex Sunshield deployment, with credibility limits, using prior deployment data and test information.
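    The conjugate update behind such a prediction is a one-liner. As a sketch with invented numbers (the study's actual heritage counts are not reproduced here):

```python
# Conjugate beta-binomial updating of an anomaly probability (illustrative
# numbers only; assumed prior and test counts, not the JWST study's data).
from scipy import stats

a0, b0 = 2.0, 40.0          # prior: roughly a 5% anomaly rate from heritage data
n_tests, anomalies = 12, 1  # hypothetical ground-test deployments

a_post, b_post = a0 + anomalies, b0 + n_tests - anomalies
posterior = stats.beta(a_post, b_post)
print("posterior mean anomaly probability:", posterior.mean())
print("90% credibility interval:", posterior.ppf([0.05, 0.95]))
```

    Because the beta prior is conjugate to the binomial likelihood, each new test campaign simply adds its counts to the prior parameters, which is what makes the sequential heritage-then-test updating tractable.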

  10. A Bayesian Framework for Reliability Analysis of Spacecraft Deployments

    Science.gov (United States)

    Evans, John W.; Gallo, Luis; Kaminsky, Mark

    2012-01-01

Deployable subsystems are essential to the mission success of most spacecraft. These subsystems enable critical functions including power, communications and thermal control. The loss of any of these functions will generally result in loss of the mission. These subsystems and their components often consist of unique designs and applications for which various standardized data sources are not applicable for estimating reliability and for assessing risks. In this study, a two-stage sequential Bayesian framework for reliability estimation of spacecraft deployment was developed for this purpose. This process was then applied to the James Webb Space Telescope (JWST) Sunshield subsystem, a unique design intended for thermal control of the Optical Telescope Element. Initially, detailed studies of NASA deployment history, "heritage information", were conducted, extending over 45 years of spacecraft launches. This information was then coupled to a non-informative prior and a binomial likelihood function to create a posterior distribution for deployments of various subsystems using Markov chain Monte Carlo sampling. Select distributions were then coupled to a subsequent analysis, using test data and anomaly occurrences on successive ground-test deployments of scale-model test articles of JWST hardware, to update the NASA heritage data. This allowed for a realistic prediction of the reliability of the complex Sunshield deployment, with credibility limits, within this two-stage Bayesian framework.

  11. Bayesian predictive modeling for genomic based personalized treatment selection.

    Science.gov (United States)

    Ma, Junsheng; Stingo, Francesco C; Hobbs, Brian P

    2016-06-01

    Efforts to personalize medicine in oncology have been limited by reductive characterizations of the intrinsically complex underlying biological phenomena. Future advances in personalized medicine will rely on molecular signatures that derive from synthesis of multifarious interdependent molecular quantities requiring robust quantitative methods. However, highly parameterized statistical models when applied in these settings often require a prohibitively large database and are sensitive to proper characterizations of the treatment-by-covariate interactions, which in practice are difficult to specify and may be limited by generalized linear models. In this article, we present a Bayesian predictive framework that enables the integration of a high-dimensional set of genomic features with clinical responses and treatment histories of historical patients, providing a probabilistic basis for using the clinical and molecular information to personalize therapy for future patients. Our work represents one of the first attempts to define personalized treatment assignment rules based on large-scale genomic data. We use actual gene expression data acquired from The Cancer Genome Atlas in the settings of leukemia and glioma to explore the statistical properties of our proposed Bayesian approach for personalizing treatment selection. The method is shown to yield considerable improvements in predictive accuracy when compared to penalized regression approaches. PMID:26575856

  12. Bayesian demography 250 years after Bayes.

    Science.gov (United States)

    Bijak, Jakub; Bryant, John

    2016-01-01

    Bayesian statistics offers an alternative to classical (frequentist) statistics. It is distinguished by its use of probability distributions to describe uncertain quantities, which leads to elegant solutions to many difficult statistical problems. Although Bayesian demography, like Bayesian statistics more generally, is around 250 years old, only recently has it begun to flourish. The aim of this paper is to review the achievements of Bayesian demography, address some misconceptions, and make the case for wider use of Bayesian methods in population studies. We focus on three applications: demographic forecasts, limited data, and highly structured or complex models. The key advantages of Bayesian methods are the ability to integrate information from multiple sources and to describe uncertainty coherently. Bayesian methods also allow for including additional (prior) information next to the data sample. As such, Bayesian approaches are complementary to many traditional methods, which can be productively re-expressed in Bayesian terms. PMID:26902889

  14. Dengue on islands: a Bayesian approach to understanding the global ecology of dengue viruses

    OpenAIRE

    Feldstein, Leora R.; John S Brownstein; Brady, Oliver J.; Simon I Hay; Johansson, Michael A.

    2015-01-01

    Background: Transmission of dengue viruses (DENV), the most common arboviral pathogens globally, is influenced by many climatic and socioeconomic factors. However, the relative contributions of these factors on a global scale are unclear. Methods: We randomly selected 94 islands stratified by socioeconomic and geographic characteristics. With a Bayesian model, we assessed factors contributing to the probability of islands having a history of any dengue outbreaks and of having frequent outbrea...

  15. Contribution of Epidemiological Predictors in Unraveling the Phylogeographic History of HIV-1 Subtype C in Brazil

    Science.gov (United States)

    Vrancken, Bram; Maletich Junqueira, Dennis; de Medeiros, Rúbia Marília; Suchard, Marc A.; Lemey, Philippe; Esteves de Matos Almeida, Sabrina; Pinto, Aguinaldo Roberto

    2015-01-01

    ABSTRACT The phylogeographic history of the Brazilian HIV-1 subtype C (HIV-1C) epidemic is still unclear. Previous studies have mainly focused on the capital cities of Brazilian federal states, and the fact that HIV-1C infections increase at a higher rate than subtype B infections in Brazil calls for a better understanding of the process of spatial spread. A comprehensive sequence data set sampled across 22 Brazilian locations was assembled and analyzed. A Bayesian phylogeographic generalized linear model approach was used to reconstruct the spatiotemporal history of HIV-1C in Brazil, considering several potential explanatory predictors of the viral diffusion process. Analyses were performed on several subsampled data sets in order to mitigate potential sample biases. We reveal a central role for the city of Porto Alegre, the capital of the southernmost state, in the Brazilian HIV-1C epidemic (HIV-1C_BR), and the northward expansion of HIV-1C_BR could be linked to source populations with higher HIV-1 burdens and larger proportions of HIV-1C infections. The results presented here bring new insights to the continuing discussion about the HIV-1C epidemic in Brazil and raise an alternative hypothesis for its spatiotemporal history. The current work also highlights how sampling bias can confound phylogeographic analyses and demonstrates the importance of incorporating external information to protect against this. IMPORTANCE Subtype C is responsible for the largest HIV infection burden worldwide, but our understanding of its transmission dynamics remains incomplete. Brazil witnessed a relatively recent introduction of HIV-1C compared to HIV-1B, but it swiftly spread throughout the south, where it now circulates as the dominant variant. The northward spread has been comparatively slow, and HIV-1B still prevails in that region. While epidemiological data and viral genetic analyses have both independently shed light on the dynamics of spread in isolation, their combination

  16. Bayesian-based Wavelet Shrinkage for SAR Image Despeckling Using Cycle Spinning

    Institute of Scientific and Technical Information of China (English)

    ZHANG De-xiang; GAO Qing-wei; CHEN Jun-ning

    2006-01-01

A novel and efficient speckle noise reduction algorithm based on Bayesian wavelet shrinkage using cycle spinning is proposed. First, the sub-band decompositions of non-logarithmically transformed SAR images are shown. Then, a Bayesian wavelet shrinkage factor is applied to the decomposed data to estimate noise-free wavelet coefficients. The method is based on Mixture Gaussian Distribution (MGD) modeling of the sub-band coefficients. Finally, multi-resolution wavelet coefficients are reconstructed by wavelet thresholding using cycle spinning. Experimental results show that the proposed despeckling algorithm achieves an excellent balance between suppressing speckle effectively and preserving as many image details and as much sharpness as possible. The new method showed higher performance than other speckle noise reduction techniques while minimizing the effect of pseudo-Gibbs phenomena.
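    A minimal sketch of the cycle-spinning idea in Python, assuming the PyWavelets package and using plain soft thresholding as a stand-in for the paper's Bayesian MGD shrinkage factor (shown in 1-D for brevity):

```python
import numpy as np
import pywt  # PyWavelets

def shrink_once(x, wavelet="db4", level=3, thr=0.5):
    """Wavelet shrinkage with soft thresholding (stand-in for MGD shrinkage)."""
    coeffs = pywt.wavedec(x, wavelet, level=level)
    coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(x)]

def cycle_spin(x, n_shifts=8, **kw):
    """Average shrinkage results over circular shifts to suppress the
    pseudo-Gibbs artifacts of a single, shift-dependent decomposition."""
    out = np.zeros_like(x, dtype=float)
    for s in range(n_shifts):
        out += np.roll(shrink_once(np.roll(x, s), **kw), -s)
    return out / n_shifts

noisy = np.sin(np.linspace(0, 6 * np.pi, 256)) + 0.3 * np.random.randn(256)
clean = cycle_spin(noisy)
```

    Because the discrete wavelet transform is not shift-invariant, averaging over shifts approximates the translation-invariant transform and is what removes the ringing near edges.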

  17. BioEM: GPU-accelerated computing of Bayesian inference of electron microscopy images

    CERN Document Server

    Cossio, Pilar; Baruffa, Fabio; Rampp, Markus; Lindenstruth, Volker; Hummer, Gerhard

    2016-01-01

    In cryo-electron microscopy (EM), molecular structures are determined from large numbers of projection images of individual particles. To harness the full power of this single-molecule information, we use the Bayesian inference of EM (BioEM) formalism. By ranking structural models using posterior probabilities calculated for individual images, BioEM in principle addresses the challenge of working with highly dynamic or heterogeneous systems not easily handled in traditional EM reconstruction. However, the calculation of these posteriors for large numbers of particles and models is computationally demanding. Here we present highly parallelized, GPU-accelerated computer software that performs this task efficiently. Our flexible formulation employs CUDA, OpenMP, and MPI parallelization combined with both CPU and GPU computing. The resulting BioEM software scales nearly ideally both on pure CPU and on CPU+GPU architectures, thus enabling Bayesian analysis of tens of thousands of images in a reasonable time. The g...

    Bayesians versus frequentists: a philosophical debate on statistical reasoning

    CERN Document Server

    Vallverdú, Jordi

    2016-01-01

    This book analyzes the origins of statistical thinking as well as its related philosophical questions, such as causality, determinism or chance. Bayesian and frequentist approaches are subjected to a historical, cognitive and epistemological analysis, making it possible to not only compare the two competing theories, but to also find a potential solution. The work pursues a naturalistic approach, proceeding from the existence of numerosity in natural environments to the existence of contemporary formulas and methodologies to heuristic pragmatism, a concept introduced in the book’s final section. This monograph will be of interest to philosophers and historians of science and students in related fields. Despite the mathematical nature of the topic, no statistical background is required, making the book a valuable read for anyone interested in the history of statistics and human cognition.

  19. The life history of Pseudometagea schwarzii, with a discussion of the evolution of endoparasitism and koinobiosis in Eucharitidae and Perilampidae (Chalcidoidea)

    Directory of Open Access Journals (Sweden)

    John Heraty

    2013-10-01

Full Text Available The immature stages and behavior of Pseudometagea schwarzii (Ashmead) (Hymenoptera: Eucharitidae: Eucharitini) are described, and the presence of an endoparasitic planidium that undergoes growth-feeding in the larva of the host ant (Lasius neoniger Emery) is confirmed. Bayesian inference and parsimony ancestral state reconstruction are used to map the evolution of endoparasitism across the eucharitid-perilampid clade. Endoparasitism is proposed to have evolved independently three times within Eucharitidae, including once in Pseudometagea Ashmead, and at least twice in Perilampus Latreille. Endoparasitism is an evolutionary trait independent of other life history traits such as differences in growth and development of the first-instar larva, hypermetamorphic larval morphology, and other biological traits, including koinobiosis.

  20. Climate history of the Southern Hemisphere Westerlies belt during the last glacial-interglacial transition revealed from lake water oxygen isotope reconstruction of Laguna Potrok Aike (52° S, Argentina)

    Directory of Open Access Journals (Sweden)

    J. Zhu

    2014-05-01

Full Text Available The Southern Hemisphere westerly winds (SHW) play a crucial role in the large-scale ocean circulation and global carbon cycling. Accordingly, the reconstruction of the belt's latitudinal position and intensity is essential for understanding global climatic fluctuations during the last glacial cycle. The southernmost part of the South American continent is of great importance for paleoclimate studies as the only continental mass intersecting a large part of the SHW belt. However, continuous proxy records back to the last Glacial are rare in southern Patagonia, owing to the Patagonian Ice Sheets expanding from the Andean area and the scarcity of continuous paleoclimate archives in extra-Andean Patagonia. Here, we present an oxygen isotope record from cellulose and purified bulk organic matter of aquatic moss shoots from the last glacial-interglacial transition preserved in the sediments of Laguna Potrok Aike (52° S, 70° W), a deep maar lake located in semi-arid, extra-Andean Patagonia. The highly significant correlation between oxygen isotope values of aquatic mosses and their host waters, and the abundant well-preserved moss remains, allow a high-resolution oxygen isotope reconstruction of lake water (δ18Olw) for this lake. Long-term δ18Olw variations are mainly determined by δ18O changes of the source water of the lake, surface air temperature and evaporative 18O enrichment. Under permafrost conditions during the Glacial, the groundwater may not have been recharged by regional precipitation. The isolated groundwater could have had much less negative δ18O values than glacial precipitation. The less 18O-depleted source water and the prolonged lake water residence time caused by reduced interchange between in- and outflows could have resulted in the reconstructed glacial δ18Olw being only ca. 3‰ lower than modern values. The significant two-step rise in reconstructed δ18Olw during the last deglaciation demonstrates the response of the isotope composition of lake

  1. Computationally efficient Bayesian inference for inverse problems.

    Energy Technology Data Exchange (ETDEWEB)

    Marzouk, Youssef M.; Najm, Habib N.; Rahn, Larry A.

    2007-10-01

    Bayesian statistics provides a foundation for inference from noisy and incomplete data, a natural mechanism for regularization in the form of prior information, and a quantitative assessment of uncertainty in the inferred results. Inverse problems - representing indirect estimation of model parameters, inputs, or structural components - can be fruitfully cast in this framework. Complex and computationally intensive forward models arising in physical applications, however, can render a Bayesian approach prohibitive. This difficulty is compounded by high-dimensional model spaces, as when the unknown is a spatiotemporal field. We present new algorithmic developments for Bayesian inference in this context, showing strong connections with the forward propagation of uncertainty. In particular, we introduce a stochastic spectral formulation that dramatically accelerates the Bayesian solution of inverse problems via rapid evaluation of a surrogate posterior. We also explore dimensionality reduction for the inference of spatiotemporal fields, using truncated spectral representations of Gaussian process priors. These new approaches are demonstrated on scalar transport problems arising in contaminant source inversion and in the inference of inhomogeneous material or transport properties. We also present a Bayesian framework for parameter estimation in stochastic models, where intrinsic stochasticity may be intermingled with observational noise. Evaluation of a likelihood function may not be analytically tractable in these cases, and thus several alternative Markov chain Monte Carlo (MCMC) schemes, operating on the product space of the observations and the parameters, are introduced.

  2. Tactile length contraction as Bayesian inference.

    Science.gov (United States)

    Tong, Jonathan; Ngo, Vy; Goldreich, Daniel

    2016-08-01

    To perceive, the brain must interpret stimulus-evoked neural activity. This is challenging: The stochastic nature of the neural response renders its interpretation inherently uncertain. Perception would be optimized if the brain used Bayesian inference to interpret inputs in light of expectations derived from experience. Bayesian inference would improve perception on average but cause illusions when stimuli violate expectation. Intriguingly, tactile, auditory, and visual perception are all prone to length contraction illusions, characterized by the dramatic underestimation of the distance between punctate stimuli delivered in rapid succession; the origin of these illusions has been mysterious. We previously proposed that length contraction illusions occur because the brain interprets punctate stimulus sequences using Bayesian inference with a low-velocity expectation. A novel prediction of our Bayesian observer model is that length contraction should intensify if stimuli are made more difficult to localize. Here we report a tactile psychophysical study that tested this prediction. Twenty humans compared two distances on the forearm: a fixed reference distance defined by two taps with 1-s temporal separation and an adjustable comparison distance defined by two taps with temporal separation t ≤ 1 s. We observed significant length contraction: As t was decreased, participants perceived the two distances as equal only when the comparison distance was made progressively greater than the reference distance. Furthermore, the use of weaker taps significantly enhanced participants' length contraction. These findings confirm the model's predictions, supporting the view that the spatiotemporal percept is a best estimate resulting from a Bayesian inference process. PMID:27121574
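
    The low-velocity expectation lends itself to a small Gaussian observer sketch (our simplification, not the authors' full model): a zero-mean prior on speed induces a prior on length with standard deviation proportional to t, so the posterior mean contracts the measured length, more strongly for short t or noisy localization. All numbers below are illustrative.

    ```python
    # Gaussian sketch of a low-velocity-prior observer (illustrative only):
    # the brain measures the distance between two taps with sensory noise
    # sigma_s and expects low speeds (prior std sigma_v on velocity, hence
    # std sigma_v * t on length). The posterior mean contracts the measured
    # length -- more strongly for short t or noisy localization.
    import numpy as np

    def perceived_length(L_measured, t, sigma_s=1.0, sigma_v=10.0):
        prior_var = (sigma_v * t) ** 2          # prior variance on length
        shrink = prior_var / (prior_var + sigma_s**2)
        return shrink * L_measured              # posterior mean estimate

    L = 10.0  # cm between taps (made-up value)
    for t in (1.0, 0.4, 0.1):
        print(f"t={t:.1f}s  strong taps: {perceived_length(L, t):5.2f} cm   "
              f"weak taps: {perceived_length(L, t, sigma_s=2.0):5.2f} cm")
    # Contraction intensifies as t shrinks and as localization noise grows,
    # matching the psychophysical predictions described above.
    ```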

  3. Dimensionality reduction in Bayesian estimation algorithms

    Directory of Open Access Journals (Sweden)

    G. W. Petty

    2013-03-01

    Full Text Available An idealized synthetic database loosely resembling 3-channel passive microwave observations of precipitation against a variable background is employed to examine the performance of a conventional Bayesian retrieval algorithm. For this dataset, algorithm performance is found to be poor owing to an irreconcilable conflict between the need to find matches in the dependent database versus the need to exclude inappropriate matches. It is argued that the likelihood of such conflicts increases sharply with the dimensionality of the observation space of real satellite sensors, which may utilize 9 to 13 channels to retrieve precipitation, for example. An objective method is described for distilling the relevant information content from N real channels into a much smaller number (M of pseudochannels while also regularizing the background (geophysical plus instrument noise component. The pseudochannels are linear combinations of the original N channels obtained via a two-stage principal component analysis of the dependent dataset. Bayesian retrievals based on a single pseudochannel applied to the independent dataset yield striking improvements in overall performance. The differences between the conventional Bayesian retrieval and reduced-dimensional Bayesian retrieval suggest that a major potential problem with conventional multichannel retrievals – whether Bayesian or not – lies in the common but often inappropriate assumption of diagonal error covariance. The dimensional reduction technique described herein avoids this problem by, in effect, recasting the retrieval problem in a coordinate system in which the desired covariance is lower-dimensional, diagonal, and unit magnitude.
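
    The following sketch conveys the flavor of such a two-stage reduction (our reading, with illustrative data, not the paper's exact procedure): the first stage whitens the background noise covariance, the second rotates onto the leading principal components of the whitened dependent dataset, yielding M pseudochannels with diagonal, unit-magnitude noise.

    ```python
    # Illustrative two-stage reduction of N channels to M pseudochannels:
    # stage 1 whitens the background (geophysical + instrument) noise,
    # stage 2 rotates onto the leading principal components of the whitened
    # signal-bearing data. All data here are synthetic stand-ins.
    import numpy as np

    def make_pseudochannels(X_signal, X_background, M):
        """X_*: (n_samples, N) arrays; returns a (N, M) projection matrix."""
        # Stage 1: whiten with the background covariance.
        C_bg = np.cov(X_background, rowvar=False)
        evals, evecs = np.linalg.eigh(C_bg)
        W = evecs @ np.diag(evals ** -0.5) @ evecs.T      # C_bg^{-1/2}
        # Stage 2: PCA of the whitened signal-bearing data.
        Xw = (X_signal - X_signal.mean(0)) @ W
        _, _, Vt = np.linalg.svd(Xw, full_matrices=False)
        return W @ Vt[:M].T                               # N -> M linear map

    rng = np.random.default_rng(1)
    N = 9
    bg = rng.normal(size=(5000, N))                       # background scenes
    sig = bg + np.outer(rng.gamma(2.0, size=5000), np.linspace(1, 2, N))
    P = make_pseudochannels(sig, bg, M=1)
    pseudo = sig @ P                                      # retrieval now 1-D
    print(pseudo.shape)  # (5000, 1): diagonal, unit-noise pseudochannel space
    ```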

  4. A Novel Prediction Algorithm of DR Position Error Based on Bayesian Regularization Back-propagation Neural Network

    OpenAIRE

    Li Honglian; Fang Hong; Tang Ju; Zhang Jun; Zhang Jing

    2013-01-01

    It is difficult to accurately reckon vehicle position for a vehicle navigation system (VNS) during GPS outages, so a novel prediction algorithm for dead-reckoning (DR) position error is put forward, based on a Bayesian regularization back-propagation (BRBP) neural network. DR and GPS position data are first de-noised and compared at different stationary wavelet transformation (SWT) decomposition levels, and DR position error data are acquired after the SWT coefficient differences are reconstructed. A...

  5. Physalis and physaloids: A recent and complex evolutionary history.

    Science.gov (United States)

    Zamora-Tavares, María Del Pilar; Martínez, Mahinda; Magallón, Susana; Guzmán-Dávalos, Laura; Vargas-Ponce, Ofelia

    2016-07-01

    The complex evolutionary history of the subtribe Physalinae is reflected in the poor resolution of the relationships of Physalis and the physaloid genera. We hypothesize that this low resolution is caused by recent evolutionary history in a complex geographic setting. The aims of this study were twofold: (1) to determine the phylogenetic relationships of the current genera recognized in Physalinae in order to identify monophyletic groups and resolve the physaloid grade; and (2) to determine the probable causes of the recent divergence in Physalinae. We conducted phylogenetic analyses with maximum likelihood (ML) and Bayesian inference with 50 Physalinae species and 19 others as outgroups, using morphological and molecular data from five plastid and two nuclear regions. A relaxed molecular clock was obtained from the ML topology and ancestral area reconstruction was conducted using the DEC model. The genera Chamaesaracha, Leucophysalis, and Physalis subgenus Rydbergis were recovered as monophyletic. Three clades, Alkekengi-Calliphysalis, Schraderanthus-Tzeltalia, and Witheringia-Brachistus, also received good support. However, even with morphological data and DNA data from seven regions, the tree was not completely resolved and many clades remained unsupported. Physalinae diverged at the end of the Miocene (∼9.22 Mya), with one trend indicating that the greatest diversification within the subtribe occurred during the last 5 My. The Neotropical region presented the highest probability (45%) of being the ancestral area of Physalinae, followed by the Mexican Transition Zone (35%). During the Pliocene and Pleistocene, the geographical areas where species were found experienced significant geological and climatic changes, giving rise to rapid and relatively recent diversification events in Physalinae. Thus, recent origin, high diversification, and morphological complexity have contributed, at least with the currently available methods, to the inability to completely

  6. Re-telling, Re-evaluating and Re-constructing

    Directory of Open Access Journals (Sweden)

    Gorana Tolja

    2013-11-01

    Full Text Available Graphic History: Essays on Graphic Novels and/as History (2012) is a collection of 14 unique essays, edited by scholar Richard Iadonisi, that explores a variety of complex issues within the graphic novel medium as a means of historical narration. The essays address the issues of accuracy in re-counting history, history as re-constructed, and the ethics surrounding historical narration.

  7. A Sparse Bayesian Imaging Technique for Efficient Recovery of Reservoir Channels With Time-Lapse Seismic Measurements

    KAUST Repository

    Sana, Furrukh

    2016-06-01

    Subsurface reservoir flow channels are characterized by high-permeability values and serve as preferred pathways for fluid propagation. Accurate estimation of their geophysical structures is thus of great importance for the oil industry. The ensemble Kalman filter (EnKF) is a widely used statistical technique for estimating subsurface reservoir model parameters. However, accurate reconstruction of the subsurface geological features with the EnKF is challenging because of the limited measurements available from the wells and the smoothing effects imposed by the $\ell_2$-norm nature of its update step. A new EnKF scheme based on sparse domain representation was introduced by Sana et al. (2015) to incorporate useful prior structural information in the estimation process for efficient recovery of subsurface channels. In this paper, we extend this work in two ways: 1) investigate the effects of incorporating time-lapse seismic data on the channel reconstruction; and 2) explore a Bayesian sparse reconstruction algorithm with the potential ability to reduce the computational requirements. Numerical results suggest that the performance of the new sparse Bayesian based EnKF scheme is enhanced with the availability of seismic measurements, leading to further improvement in the recovery of flow channel structures. The sparse Bayesian approach further provides a computationally efficient framework for enforcing a sparse solution, especially with the possibility of using high sparsity rates through the inclusion of seismic data.
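
    For reference, a generic stochastic EnKF analysis step, the baseline that such sparse extensions modify, can be written in a few lines (dimensions and noise levels below are illustrative; the sparse-domain and Bayesian-sparsity machinery is not shown):

    ```python
    # Generic stochastic EnKF analysis step (the baseline the paper builds on;
    # the sparse-domain and Bayesian-sparsity extensions are not shown here).
    import numpy as np

    def enkf_update(X, H, y, R, rng):
        """X: (n_state, n_ens) forecast ensemble; H: (n_obs, n_state) operator;
        y: (n_obs,) observations; R: (n_obs, n_obs) obs-error covariance."""
        n_state, n_ens = X.shape
        A = X - X.mean(axis=1, keepdims=True)            # ensemble anomalies
        HA = H @ A
        # Kalman gain from ensemble covariances: K = P H^T (H P H^T + R)^-1
        K = (A @ HA.T) @ np.linalg.inv(HA @ HA.T + (n_ens - 1) * R)
        # Perturbed observations keep the analysis spread statistically correct.
        Y = y[:, None] + rng.multivariate_normal(np.zeros(len(y)), R, n_ens).T
        return X + K @ (Y - H @ X)

    rng = np.random.default_rng(2)
    n_state, n_obs, n_ens = 50, 5, 40
    X = rng.normal(size=(n_state, n_ens))                # forecast ensemble
    H = np.zeros((n_obs, n_state))
    H[np.arange(n_obs), np.arange(n_obs) * 10] = 1.0     # observe every 10th cell
    x_true = np.sin(np.linspace(0, 2 * np.pi, n_state))
    R = 0.01 * np.eye(n_obs)
    y = H @ x_true + rng.multivariate_normal(np.zeros(n_obs), R)
    Xa = enkf_update(X, H, y, R, rng)
    print("posterior std at observed cells:", (H @ Xa).std(axis=1))
    ```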

  8. Cosmic expansion history from SN Ia data via information field theory

    CERN Document Server

    Porqueres, Natàlia; Greiner, Maksim; Böhm, Vanessa; Dorn, Sebastian; Ruiz-Lapuente, Pilar; Manrique, Alberto

    2016-01-01

    We present a novel inference algorithm that reconstructs the cosmic expansion history as encoded in the Hubble parameter $H(z)$ from SNe Ia data. The novelty of the approach lies in the usage of information field theory, a statistical field theory that is very well suited for the construction of optimal signal recovery algorithms. The algorithm infers non-parametrically $s(a)=\ln(\rho(a)/\rho_{\mathrm{crit}0})$, the density evolution which determines $H(z)$, without assuming an analytical form of $\rho(a)$, only its smoothness with the scale factor $a=(1+z)^{-1}$. The inference problem of recovering the signal $s(a)$ from the data is formulated in a fully Bayesian way. In detail, we rewrite the signal as the sum of a background cosmology and a perturbation. This allows us to determine the maximum a posteriori estimate of the signal by an iterative Wiener filter method. Applying this method to the Union2.1 supernova compilation, we recover a cosmic expansion history that is fully compatible with the standard $...
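
    The Wiener filter at the core of such schemes has a compact closed form: for linear data $d = Rs + n$ with Gaussian signal prior covariance $S$ and noise covariance $N$, the posterior mean is $m = (S^{-1} + R^T N^{-1} R)^{-1} R^T N^{-1} d$. A toy numerical version (our illustration on a 1-D grid, unrelated to the supernova application) follows:

    ```python
    # Toy Wiener filter (the building block of the iterative scheme above):
    # for data d = R s + n with Gaussian signal prior S and noise covariance N,
    # the posterior mean is m = (S^-1 + R^T N^-1 R)^-1 R^T N^-1 d.
    import numpy as np

    rng = np.random.default_rng(3)
    n_grid, n_data = 100, 30

    # Smoothness prior: correlated Gaussian signal on a 1-D grid.
    x = np.linspace(0.0, 1.0, n_grid)
    S = np.exp(-0.5 * ((x[:, None] - x[None, :]) / 0.1) ** 2) + 1e-6 * np.eye(n_grid)
    s_true = rng.multivariate_normal(np.zeros(n_grid), S)

    R = np.zeros((n_data, n_grid))                  # point-sampling response
    R[np.arange(n_data), rng.choice(n_grid, n_data, replace=False)] = 1.0
    N = 0.05 * np.eye(n_data)                       # noise covariance
    d = R @ s_true + rng.multivariate_normal(np.zeros(n_data), N)

    # Posterior covariance D and Wiener filter mean m.
    D = np.linalg.inv(np.linalg.inv(S) + R.T @ np.linalg.inv(N) @ R)
    m = D @ R.T @ np.linalg.inv(N) @ d
    print("rms reconstruction error:", np.sqrt(np.mean((m - s_true) ** 2)))
    ```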

  9. Morphological homoplasy, life history evolution, and historical biogeography of plethodontid salamanders inferred from complete mitochondrial genomes

    Energy Technology Data Exchange (ETDEWEB)

    Mueller, Rachel Lockridge; Macey, J. Robert; Jaekel, Martin; Wake, David B.; Boore, Jeffrey L.

    2004-08-01

    The evolutionary history of the largest salamander family (Plethodontidae) is characterized by extreme morphological homoplasy. Analysis of the mechanisms generating such homoplasy requires an independent, molecular phylogeny. To this end, we sequenced 24 complete mitochondrial genomes (22 plethodontids and two outgroup taxa), added data for three species from GenBank, and performed partitioned and unpartitioned Bayesian, ML, and MP phylogenetic analyses. We explored four dataset partitioning strategies to account for evolutionary process heterogeneity among genes and codon positions, all of which yielded increased model likelihoods and decreased numbers of supported nodes in the topologies (PP > 0.95) relative to the unpartitioned analysis. Our phylogenetic analyses yielded congruent trees that contrast with the traditional morphology-based taxonomy; the monophyly of three out of four major groups is rejected. Reanalysis of current hypotheses in light of these new evolutionary relationships suggests that (1) a larval life history stage re-evolved from a direct-developing ancestor multiple times, (2) there is no phylogenetic support for the "Out of Appalachia" hypothesis of plethodontid origins, and (3) novel scenarios must be reconstructed for the convergent evolution of projectile tongues, reduction in toe number, and specialization for defensive tail loss. Some of these novel scenarios imply morphological transformation series that proceed in the opposite direction than was previously thought. In addition, they suggest surprising evolutionary lability in traits previously interpreted to be conservative.

  10. Bayesian Methods for Radiation Detection and Dosimetry

    CERN Document Server

    Groer, Peter G

    2002-01-01

    We performed work in three areas: radiation detection, and external and internal radiation dosimetry. In radiation detection we developed Bayesian techniques to estimate the net activity of high- and low-activity radioactive samples. These techniques have the advantage that the remaining uncertainty about the net activity is described by probability densities, and graphs of the densities show the uncertainty in pictorial form. We applied stochastic processes to obtain Bayesian estimates of 222Rn-daughter products from observed counting rates. In external radiation dosimetry we studied and developed Bayesian methods to estimate radiation doses to an individual from radiation-induced chromosome aberrations. We analyzed chromosome aberrations after exposure to gammas and neutrons and developed a method for dose estimation after criticality accidents. The research in internal radiation dosimetry focused on parameter estimation for compartmental models from observed comp...
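
    The "uncertainty as a probability density" idea for net activity can be sketched with a simple grid posterior (our toy example with made-up counts, not the report's method): gross counts are Poisson in the source-plus-background rate, background counts are Poisson in the background rate, and the background is marginalized out.

    ```python
    # Hedged illustration of a posterior density for a net count rate:
    # gross counts g ~ Poisson((s + b) t_g), background counts k ~ Poisson(b t_b);
    # grid posterior for the net rate s (flat prior, s >= 0, b marginalized).
    import numpy as np
    from scipy.stats import poisson

    g, t_g = 120, 60.0      # gross counts, live time (s)   -- made-up numbers
    k, t_b = 75, 120.0      # background counts, live time  -- made-up numbers

    s_grid = np.linspace(0.0, 3.0, 600)    # net count rate (1/s)
    b_grid = np.linspace(1e-3, 3.0, 600)   # background rate (1/s)
    S, B = np.meshgrid(s_grid, b_grid, indexing="ij")

    like = poisson.pmf(g, (S + B) * t_g) * poisson.pmf(k, B * t_b)
    post = like.sum(axis=1)                # marginalize the background rate
    ds = s_grid[1] - s_grid[0]
    post /= post.sum() * ds                # normalized density for s

    mean = (s_grid * post).sum() * ds
    print(f"net rate posterior mean ~ {mean:.3f} 1/s")
    ```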

  11. Adaptive approximate Bayesian computation for complex models

    CERN Document Server

    Lenormand, Maxime; Deffuant, Guillaume

    2011-01-01

    Approximate Bayesian computation (ABC) is a family of computational techniques in Bayesian statistics. These techniques allow a model to be fitted to data without relying on the computation of the model likelihood; instead, they require the model to be simulated a large number of times. A number of refinements to the original rejection-based ABC scheme have been proposed, including the sequential improvement of posterior distributions. This technique decreases the number of model simulations required, but it still presents several shortcomings which are particularly problematic for complex models that are costly to simulate. Here we provide a new algorithm to perform adaptive approximate Bayesian computation, which is shown to perform better on both a toy example and a complex social model.
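
    For orientation, the original rejection-based ABC scheme that these adaptive variants improve upon fits in a dozen lines (toy Gaussian model and tolerance chosen for illustration):

    ```python
    # Minimal rejection-ABC sketch (the baseline that the adaptive/sequential
    # variants above improve upon): keep parameters whose simulated summary
    # statistics fall within a tolerance of the observed ones.
    import numpy as np

    rng = np.random.default_rng(4)

    def simulate(theta, n=100):
        """Likelihood-free model: we can sample from it but not evaluate it."""
        return rng.normal(theta, 1.0, size=n)

    data = simulate(2.0)                       # pretend this is observed data
    s_obs = np.array([data.mean(), data.std()])

    accepted = []
    for _ in range(50000):
        theta = rng.uniform(-10, 10)           # draw from the prior
        x = simulate(theta)
        s_sim = np.array([x.mean(), x.std()])
        if np.linalg.norm(s_sim - s_obs) < 0.2:          # tolerance epsilon
            accepted.append(theta)

    print(len(accepted), "accepted; posterior mean ~", np.mean(accepted))
    ```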

  12. Learning Bayesian Networks from Correlated Data

    Science.gov (United States)

    Bae, Harold; Monti, Stefano; Montano, Monty; Steinberg, Martin H.; Perls, Thomas T.; Sebastiani, Paola

    2016-05-01

    Bayesian networks are probabilistic models that represent complex distributions in a modular way and have become very popular in many fields. There are many methods to build Bayesian networks from a random sample of independent and identically distributed observations. However, many observational studies are designed using some form of clustered sampling that introduces correlations between observations within the same cluster and ignoring this correlation typically inflates the rate of false positive associations. We describe a novel parameterization of Bayesian networks that uses random effects to model the correlation within sample units and can be used for structure and parameter learning from correlated data without inflating the Type I error rate. We compare different learning metrics using simulations and illustrate the method in two real examples: an analysis of genetic and non-genetic factors associated with human longevity from a family-based study, and an example of risk factors for complications of sickle cell anemia from a longitudinal study with repeated measures.

  13. Bayesian Fusion of Multi-Band Images

    CERN Document Server

    Wei, Qi; Tourneret, Jean-Yves

    2013-01-01

    In this paper, a Bayesian fusion technique for remotely sensed multi-band images is presented. The observed images are related to the high spectral and high spatial resolution image to be recovered through physical degradations, e.g., spatial and spectral blurring and/or subsampling defined by the sensor characteristics. The fusion problem is formulated within a Bayesian estimation framework. An appropriate prior distribution exploiting geometrical considerations is introduced. To compute the Bayesian estimator of the scene of interest from its posterior distribution, a Markov chain Monte Carlo algorithm is designed to generate samples asymptotically distributed according to the target distribution. To efficiently sample from this high-dimensional distribution, a Hamiltonian Monte Carlo step is introduced in the Gibbs sampling strategy. The efficiency of the proposed fusion method is evaluated with respect to several state-of-the-art fusion techniques. In particular, low spatial resolution hyperspectral and mult...

  14. Dynamic Bayesian Combination of Multiple Imperfect Classifiers

    CERN Document Server

    Simpson, Edwin; Psorakis, Ioannis; Smith, Arfon

    2012-01-01

    Classifier combination methods need to make best use of the outputs of multiple, imperfect classifiers to enable higher accuracy classifications. In many situations, such as when human decisions need to be combined, the base decisions can vary enormously in reliability. A Bayesian approach to such uncertain combination allows us to infer the differences in performance between individuals and to incorporate any available prior knowledge about their abilities when training data is sparse. In this paper we explore Bayesian classifier combination, using the computationally efficient framework of variational Bayesian inference. We apply the approach to real data from a large citizen science project, Galaxy Zoo Supernovae, and show that our method far outperforms other established approaches to imperfect decision combination. We go on to analyse the putative community structure of the decision makers, based on their inferred decision making strategies, and show that natural groupings are formed. Finally we present ...

  15. Bayesian inference of the metazoan phylogeny

    DEFF Research Database (Denmark)

    Glenner, Henrik; Hansen, Anders J; Sørensen, Martin V;

    2004-01-01

    been the only feasible combined approach but is highly sensitive to long-branch attraction. Recent development of stochastic models for discrete morphological characters and computationally efficient methods for Bayesian inference has enabled combined molecular and morphological data analysis...... with rigorous statistical approaches less prone to such inconsistencies. We present the first statistically founded analysis of a metazoan data set based on a combination of morphological and molecular data and compare the results with a traditional parsimony analysis. Interestingly, the Bayesian analyses...... such as the ecdysozoans and lophotrochozoans. Parsimony, on the contrary, shows conflicting results, with morphology being congruent to the Bayesian results and the molecular data set producing peculiarities that are largely reflected in the combined analysis....

  16. Variational Bayesian Inference of Line Spectra

    DEFF Research Database (Denmark)

    Badiu, Mihai Alin; Hansen, Thomas Lundgaard; Fleury, Bernard Henri

    2016-01-01

    In this paper, we address the fundamental problem of line spectral estimation in a Bayesian framework. We target model order and parameter estimation via variational inference in a probabilistic model in which the frequencies are continuous-valued, i.e., not restricted to a grid, and the coefficients are governed by a Bernoulli-Gaussian prior model turning model order selection into binary sequence detection. Unlike earlier works which retain only point estimates of the frequencies, we undertake a more complete Bayesian treatment by estimating the posterior probability density functions (pdfs...

  17. Event generator tuning using Bayesian optimization

    CERN Document Server

    Ilten, Philip; Yang, Yunjie

    2016-01-01

    Monte Carlo event generators contain a large number of parameters that must be determined by comparing the output of the generator with experimental data. Generating enough events with a fixed set of parameter values to enable making such a comparison is extremely CPU intensive, which prohibits performing a simple brute-force grid-based tuning of the parameters. Bayesian optimization is a powerful method designed for such black-box tuning applications. In this article, we show that Monte Carlo event generator parameters can be accurately obtained using Bayesian optimization and minimal expert-level physics knowledge. A tune of the PYTHIA 8 event generator using $e^+e^-$ events, where 20 parameters are optimized, can be run on a modern laptop in just two days. Combining the Bayesian optimization approach with expert knowledge should enable producing better tunes in the future, by making it faster and easier to study discrepancies between Monte Carlo and experimental data.
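
    A generic Bayesian optimization loop, a Gaussian process surrogate plus an expected-improvement acquisition, captures the black-box tuning pattern described above (this is an illustrative sketch minimizing a stand-in objective, not the actual PYTHIA 8 tune):

    ```python
    # Sketch of black-box tuning with Bayesian optimization: a GP surrogate
    # with an expected-improvement acquisition, minimizing a cheap stand-in
    # for the "generator vs. data" discrepancy (which is expensive in reality).
    import numpy as np
    from scipy.stats import norm
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import Matern

    rng = np.random.default_rng(5)
    objective = lambda x: (x - 0.3) ** 2 + 0.05 * np.sin(20 * x)

    X = rng.uniform(0, 1, 4).reshape(-1, 1)          # initial design points
    y = objective(X).ravel()

    for _ in range(20):
        gp = GaussianProcessRegressor(Matern(nu=2.5), normalize_y=True).fit(X, y)
        cand = rng.uniform(0, 1, 2000).reshape(-1, 1)
        mu, sd = gp.predict(cand, return_std=True)
        best = y.min()
        z = (best - mu) / np.maximum(sd, 1e-12)
        ei = (best - mu) * norm.cdf(z) + sd * norm.pdf(z)  # expected improvement
        x_next = cand[np.argmax(ei)]
        X = np.vstack([X, x_next])
        y = np.append(y, objective(x_next[0]))

    print("best parameter:", X[np.argmin(y)].item(), " objective:", y.min())
    ```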

  18. Hessian PDF reweighting meets the Bayesian methods

    CERN Document Server

    Paukkunen, Hannu

    2014-01-01

    We discuss the Hessian PDF reweighting - a technique intended to estimate the effects that new measurements have on a set of PDFs. The method stems straightforwardly from considering new data in a usual $\chi^2$-fit and it also naturally incorporates non-zero values for the tolerance, $\Delta\chi^2>1$. In comparison to the contemporary Bayesian reweighting techniques, there is no need to generate large ensembles of PDF Monte Carlo replicas, and the observables need to be evaluated only with the central and the error sets of the original PDFs. In spite of the apparently rather different methodologies, we find that the Hessian and the Bayesian techniques are actually equivalent if the $\Delta\chi^2$ criterion is properly included in the Bayesian likelihood function, which is a simple exponential.

  19. A Large Sample Study of the Bayesian Bootstrap

    OpenAIRE

    Lo, Albert Y.

    1987-01-01

    An asymptotic justification of the Bayesian bootstrap is given. Large-sample Bayesian bootstrap probability intervals for the mean, the variance and bands for the distribution, the smoothed density and smoothed rate function are also provided.
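
    The mechanism is compact enough to show directly: instead of resampling the data with replacement, the Bayesian bootstrap draws Dirichlet(1, ..., 1) weights over the n observations, and each weight vector yields one posterior draw of the functional of interest (here the mean; the data are simulated for illustration):

    ```python
    # The Bayesian bootstrap in a few lines: draw Dirichlet(1, ..., 1) weights
    # over the n observations; each draw yields one posterior sample of the
    # functional of interest (here, the mean).
    import numpy as np

    rng = np.random.default_rng(6)
    data = rng.lognormal(size=200)                    # any observed sample

    W = rng.dirichlet(np.ones(len(data)), size=4000)  # (4000, n) weight vectors
    post_mean = W @ data                              # posterior draws of the mean
    lo, hi = np.percentile(post_mean, [2.5, 97.5])
    print(f"Bayesian bootstrap 95% interval for the mean: [{lo:.3f}, {hi:.3f}]")
    ```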

  1. PROPRIOCEPTION, BODY BALANCE AND FUNCTIONALITY IN INDIVIDUALS WITH ACL RECONSTRUCTION

    OpenAIRE

    Furlanetto, Tássia Silveira; Peyré-Tartaruga, Leonardo Alexandre; do Pinho, Alexandre Severo; Bernardes, Emanuele da Silva; Zaro, Milton Antonio

    2016-01-01

    Objective: To evaluate and compare proprioception, body balance and knee functionality of individuals with or without unilateral anterior cruciate ligament (ACL) reconstruction. Methods: Forty individuals were divided into two groups: an experimental group of 20 individuals at six months after ACL reconstruction, and a control group of 20 individuals with no history of lower limb pathologies. In the experimental group, we assessed the lower limbs with reconstructed ACL and the contralateral limb;...

  2. Reconstructing the Alcatraz escape

    Science.gov (United States)

    Baart, F.; Hoes, O.; Hut, R.; Donchyts, G.; van Leeuwen, E.

    2014-12-01

    On the night of June 12, 1962, three inmates used a raft made of raincoats to escape the maximum-security prison island of Alcatraz in San Francisco, United States. History is unclear about what happened to the escapees. At what time did they step into the water, did they survive, and if so, where did they reach land? The fate of the escapees has been the subject of much debate: did they make landfall on Angel Island, or did the current sweep them out of the bay and into the cold Pacific Ocean? In this presentation, we try to shed light on this historic case using a visualization of a high-resolution hydrodynamic simulation of the San Francisco Bay, combined with historical tidal records. By reconstructing the hydrodynamic conditions and using a particle-based simulation of the escapees we show possible scenarios. The interactive model is visualized using both a 3D photorealistic and a web-based visualization. The "Escape from Alcatraz" scenario demonstrates the capabilities of the 3Di platform. This platform is normally used for overland flooding (1D/2D). The model engine uses a quadtree structure, resulting in an order of magnitude speedup. The subgrid approach takes detailed bathymetry information into account. The inter-model variability is tested by comparing the results with the DFlow Flexible Mesh (DFlowFM) San Francisco Bay model. Interactivity is implemented by converting the models from static programs to interactive libraries, adhering to the Basic Model Interface (BMI). Interactive models are more suitable for answering exploratory research questions such as this reconstruction effort. Although these hydrodynamic simulations only provide circumstantial evidence for solving the mystery of what happened during the foggy dark night of June 12, 1962, they can be used as guidance and provide an interesting test case for applying interactive modelling.

  3. Length Scales in Bayesian Automatic Adaptive Quadrature

    Directory of Open Access Journals (Sweden)

    Adam Gh.

    2016-01-01

    Full Text Available Two conceptual developments in the Bayesian automatic adaptive quadrature approach to the numerical solution of one-dimensional Riemann integrals [Gh. Adam, S. Adam, Springer LNCS 7125, 1–16 (2012)] are reported. First, it is shown that a numerical quadrature which avoids overcomputing and minimizes the hidden floating-point loss of precision requires the consideration of three classes of integration domain lengths, each endowed with specific quadrature sums: microscopic (trapezoidal rule), mesoscopic (Simpson rule), and macroscopic (quadrature sums of high algebraic degrees of precision). Second, sensitive diagnostic tools for the Bayesian inference on macroscopic ranges, coming from the use of Clenshaw-Curtis quadrature, are derived.

  4. Bayesian estimation and tracking a practical guide

    CERN Document Server

    Haug, Anton J

    2012-01-01

    A practical approach to estimating and tracking dynamic systems in real-world applications. Much of the literature on performing estimation for non-Gaussian systems is short on practical methodology, while Gaussian methods often lack a cohesive derivation. Bayesian Estimation and Tracking addresses the gap in the field on both accounts, providing readers with a comprehensive overview of methods for estimating both linear and nonlinear dynamic systems driven by Gaussian and non-Gaussian noises. Featuring a unified approach to Bayesian estimation and tracking, the book emphasizes the derivation
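
    The simplest member of this family, a one-dimensional Kalman filter for a random walk observed in Gaussian noise, already shows the predict-update structure such books build on (parameters below are illustrative):

    ```python
    # A 1-D Kalman filter, the simplest member of the Bayesian tracking family
    # (linear dynamics, Gaussian noise); illustrative only.
    import numpy as np

    rng = np.random.default_rng(7)
    q, r = 0.01, 0.5               # process and measurement noise variances
    x_true, x_est, p = 0.0, 0.0, 1.0

    for t in range(50):
        x_true = x_true + rng.normal(0, np.sqrt(q))     # latent random walk
        z = x_true + rng.normal(0, np.sqrt(r))          # noisy measurement
        # Predict: propagate mean and variance through the dynamics.
        p = p + q
        # Update: Bayesian fusion of prediction and measurement.
        k = p / (p + r)                                 # Kalman gain
        x_est = x_est + k * (z - x_est)
        p = (1 - k) * p

    print(f"truth {x_true:+.3f}, estimate {x_est:+.3f}, posterior var {p:.4f}")
    ```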

  5. Bayesian Optimisation Algorithm for Nurse Scheduling

    CERN Document Server

    Li, Jingpeng

    2008-01-01

    Our research has shown that schedules can be built mimicking a human scheduler by using a set of rules that involve domain knowledge. This chapter presents a Bayesian Optimization Algorithm (BOA) for the nurse scheduling problem that chooses suitable scheduling rules from a set for each nurse's assignment. Based on the idea of using probabilistic models, the BOA builds a Bayesian network for the set of promising solutions and samples these networks to generate new candidate solutions. Computational results from 52 real data instances demonstrate the success of this approach. It is also suggested that the learning mechanism in the proposed algorithm may be suitable for other scheduling problems.

  6. A Bayesian Analysis of Spectral ARMA Model

    Directory of Open Access Journals (Sweden)

    Manoel I. Silvestre Bezerra

    2012-01-01

    Full Text Available Bezerra et al. (2008) proposed a new method, based on the Yule-Walker equations, to estimate the ARMA spectral model. In this paper, a Bayesian approach is developed for this model by using the noninformative prior proposed by Jeffreys (1967). The Bayesian computations, via Markov chain Monte Carlo (MCMC) simulation, are carried out, and characteristics of the marginal posterior distributions, such as the Bayes estimator and confidence intervals for the parameters of the ARMA model, are derived. Both methods are also compared with the traditional least squares and maximum likelihood approaches, and a numerical illustration with two examples of the ARMA model is presented to evaluate the performance of the procedures.

  7. A Bayesian Concept Learning Approach to Crowdsourcing

    DEFF Research Database (Denmark)

    Viappiani, Paolo Renato; Zilles, Sandra; Hamilton, Howard J.;

    2011-01-01

    We develop a Bayesian approach to concept learning for crowdsourcing applications. A probabilistic belief over possible concept definitions is maintained and updated according to (noisy) observations from experts, whose behaviors are modeled using discrete types. We propose recommendation...... techniques, inference methods, and query selection strategies to assist a user charged with choosing a configuration that satisfies some (partially known) concept. Our model is able to simultaneously learn the concept definition and the types of the experts. We evaluate our model with simulations, showing...... that our Bayesian strategies are effective even in large concept spaces with many uninformative experts....

  8. Comparison of the Bayesian and Frequentist Approach to the Statistics

    OpenAIRE

    Hakala, Michal

    2015-01-01

    The thesis provides an introduction to Bayesian statistics and compares the Bayesian approach with the frequentist approach to statistics. Bayesian statistics is a modern branch of statistics which provides a comprehensive alternative theory to the frequentist approach. Bayesian concepts provide solutions for problems that are not solvable by frequentist theory. The thesis compares definitions, concepts and the quality of statistical inference. The main interest is focused on point estimation, an in...
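
    A standard contrast of the two schools on point estimation (our illustrative example, not necessarily the one used in the thesis) is the Bernoulli parameter: the frequentist MLE is k/n, while a Beta prior yields a posterior-mean estimator that shrinks toward the prior for small samples:

    ```python
    # Tiny point-estimation contrast: for k successes in n Bernoulli trials,
    # the frequentist MLE is k/n, while a Beta(a, b) prior gives the
    # posterior mean (k + a) / (n + a + b).
    k, n = 3, 10
    a, b = 1.0, 1.0                      # uniform prior
    mle = k / n
    bayes = (k + a) / (n + a + b)        # posterior mean under Beta-Binomial
    print(f"MLE: {mle:.3f}   Bayesian posterior mean: {bayes:.3f}")
    # With little data the prior pulls the estimate toward 1/2; as n grows,
    # the two estimators converge -- a standard Bayesian-vs-frequentist contrast.
    ```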

  9. A population-based Bayesian approach to the minimal model of glucose and insulin homeostasis

    DEFF Research Database (Denmark)

    Andersen, Kim Emil; Højbjerre, Malene

    2005-01-01

    -posed estimation problem, where the reconstruction most often has been done by non-linear least squares techniques separately for each entity. The minimal model was originally specified for a single individual and does not combine several individuals with the advantage of estimating the metabolic portrait...... to a population-based model. The estimation of the parameters is efficiently implemented in a Bayesian approach where posterior inference is made through the use of Markov chain Monte Carlo techniques. Hereby we obtain a powerful and flexible modelling framework for regularizing the ill-posed estimation problem...

  10. A Bayesian Super-Resolution Approach to Demosaicing of Blurred Images

    Directory of Open Access Journals (Sweden)

    Molina Rafael

    2006-01-01

    Full Text Available Most of the available digital color cameras use a single image sensor with a color filter array (CFA) in acquiring an image. In order to produce a visible color image, a demosaicing process must be applied, which produces undesirable artifacts. An additional problem appears when the observed color image is also blurred. This paper addresses the problem of deconvolving color images observed with a single charge-coupled device (CCD) from the super-resolution point of view. Utilizing the Bayesian paradigm, an estimate of the reconstructed image and the model parameters is generated. The proposed method is tested on real images.

  11. Microwave imaging from experimental data within a Bayesian framework with realistic random noise

    Science.gov (United States)

    Eyraud, C.; Litman, A.; Hérique, A.; Kofman, W.

    2009-02-01

    This paper deals with the reconstruction of three-dimensional targets from experimental multiple-frequency data measured in the anechoic chamber of the Institut Fresnel (Marseille, France). An inverse iterative scheme is implemented with an adequate choice of the cost functional. Indeed, a Bayesian approach is considered in order to take into account the random noise which is present in the experiment. This leads to the definition of an adequate cost functional, where the weighting coefficients are changing with the frequency, the incidence angle and the receiving angle. The inversion scheme turns out to be more robust and accurate.

  12. Microwave imaging from experimental data within a Bayesian framework with realistic random noise

    International Nuclear Information System (INIS)

    This paper deals with the reconstruction of three-dimensional targets from experimental multiple-frequency data measured in the anechoic chamber of the Institut Fresnel (Marseille, France). An inverse iterative scheme is implemented with an adequate choice of the cost functional. Indeed, a Bayesian approach is considered in order to take into account the random noise which is present in the experiment. This leads to the definition of an adequate cost functional, where the weighting coefficients are changing with the frequency, the incidence angle and the receiving angle. The inversion scheme turns out to be more robust and accurate

  13. From Silence to Speech: On History Writing and Reconstruction in My Son's Story

    Institute of Scientific and Technical Information of China (English)

    姜梦

    2014-01-01

    The history of Africa was mostly written by the colonial powers, and censorship during the apartheid era pushed African history writing further into a state of aphasia. In My Son's Story, Gordimer departs from the traditional narrative person and mode: a young black man serves as the main narrator, and free indirect speech from characters of different races is woven into the third-person narration, dismantling the hegemony of imperial history writing. At the same time, the black women represented by Aila and Baby overturn the patriarchal and colonized fate assigned to third-world women and write the history of African women. The dissolution of imperial capital-H History and the construction of black female subjectivity rewrite the history of South Africa while deconstructing white centrism.

  14. A Fast Iterative Bayesian Inference Algorithm for Sparse Channel Estimation

    DEFF Research Database (Denmark)

    Pedersen, Niels Lovmand; Manchón, Carles Navarro; Fleury, Bernard Henri

    2013-01-01

    representation of the Bessel K probability density function; a highly efficient, fast iterative Bayesian inference method is then applied to the proposed model. The resulting estimator outperforms other state-of-the-art Bayesian and non-Bayesian estimators, either by yielding lower mean squared estimation error...

  15. A default Bayesian hypothesis test for ANOVA designs

    NARCIS (Netherlands)

    R. Wetzels; R.P.P.P. Grasman; E.J. Wagenmakers

    2012-01-01

    This article presents a Bayesian hypothesis test for analysis of variance (ANOVA) designs. The test is an application of standard Bayesian methods for variable selection in regression models. We illustrate the effect of various g-priors on the ANOVA hypothesis test. The Bayesian test for ANOVA desig

  16. A Gentle Introduction to Bayesian Analysis : Applications to Developmental Research

    NARCIS (Netherlands)

    Van de Schoot, Rens; Kaplan, David; Denissen, Jaap; Asendorpf, Jens B.; Neyer, Franz J.; van Aken, Marcel A G

    2014-01-01

    Bayesian statistical methods are becoming ever more popular in applied and fundamental research. In this study a gentle introduction to Bayesian analysis is provided. It is shown under what circumstances it is attractive to use Bayesian estimation, and how to interpret properly the results. First, t

  17. Bayesian Just-So Stories in Psychology and Neuroscience

    Science.gov (United States)

    Bowers, Jeffrey S.; Davis, Colin J.

    2012-01-01

    According to Bayesian theories in psychology and neuroscience, minds and brains are (near) optimal in solving a wide range of tasks. We challenge this view and argue that more traditional, non-Bayesian approaches are more promising. We make 3 main arguments. First, we show that the empirical evidence for Bayesian theories in psychology is weak.…

  18. Reconstruction of CT images by the Bayes- back projection method

    CERN Document Server

    Haruyama, M; Takase, M; Tobita, H

    2002-01-01

    In the course of research on the quantitative assay of non-destructive measurement of radioactive waste, the authors have developed a unique program based on Bayesian theory for the reconstruction of transmission computed tomography (TCT) images. The reconstruction of cross-section images in CT technology usually employs the Filtered Back Projection method. The new image reconstruction program reported here is based on the Bayesian Back Projection method, and it iteratively improves the image with every step of measurement. Namely, this method can promptly display a cross-section image corresponding to the angled projection data from each measurement. Hence, it is possible to observe an improved cross-section view reflecting each projection datum in almost real time. From the basic theory of the Bayesian Back Projection method, it can be applied not only to CT of the 1st, 2nd, and 3rd generations. This report deals with a reconstruction program of cross-section images in the CT of ...
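
    The report's program is not shown here, but the flavor of iteratively refining a cross-section image as projection data arrive is shared by the standard MLEM update, sketched on a random toy system matrix (our stand-in, not the Bayesian Back Projection algorithm itself):

    ```python
    # Not the report's program: a standard MLEM update (emission-CT style),
    # shown because it shares the spirit of iteratively refining the image as
    # projection data are folded in. A: system matrix, y: measured sinogram.
    import numpy as np

    rng = np.random.default_rng(8)
    n_pix, n_rays = 64, 120
    A = rng.uniform(0, 1, (n_rays, n_pix)) * (rng.uniform(size=(n_rays, n_pix)) < 0.1)
    x_true = rng.uniform(0.5, 1.5, n_pix)
    y = rng.poisson(A @ x_true)                      # Poisson projection counts

    x = np.ones(n_pix)                               # flat initial image
    sens = A.sum(axis=0)                             # sensitivity normalization
    for _ in range(100):
        proj = A @ x
        ratio = np.where(proj > 0, y / np.maximum(proj, 1e-12), 0.0)
        x *= (A.T @ ratio) / np.maximum(sens, 1e-12) # multiplicative MLEM step

    print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
    ```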

  19. Learning history

    NARCIS (Netherlands)

    Peters, Richard

    2015-01-01

    Hilary speaks here with Professor Rik Peters, a historian-philosopher who develops the Learning History as a method for engaging with socially relevant action in Holland. Thinking of history as a way to bridge to practical scenario planning, he is helping cities in Holland grapple with integration o

  20. Intellectual History

    DEFF Research Database (Denmark)

    In the 5 Questions book series, this volume presents a range of leading scholars in Intellectual History and the History of Ideas through their answers to a brief questionnaire. Respondents include Michael Friedman, Jacques le Goff, Hans Ulrich Gumbrecht, Jonathan Israel, Phiip Pettit, John Pocock...