WorldWideScience

Sample records for bayesian history reconstruction

  1. Bayesian History Reconstruction of Complex Human Gene Clusters on a Phylogeny

    CERN Document Server

    Vinař, Tomáš; Song, Giltae; Siepel, Adam

    2009-01-01

    Clusters of genes that have evolved by repeated segmental duplication present difficult challenges throughout genomic analysis, from sequence assembly to functional analysis. Improved understanding of these clusters is of utmost importance, since they have been shown to be the source of evolutionary innovation, and have been linked to multiple diseases, including HIV and a variety of cancers. Previously, Zhang et al. (2008) developed an algorithm for reconstructing parsimonious evolutionary histories of such gene clusters, using only human genomic sequence data. In this paper, we propose a probabilistic model for the evolution of gene clusters on a phylogeny, and an MCMC algorithm for reconstruction of duplication histories from genomic sequences in multiple species. Several projects are underway to obtain high quality BAC-based assemblies of duplicated clusters in multiple species, and we anticipate that our method will be useful in analyzing these valuable new data sets.

  2. Bayesian tomographic reconstruction of microsystems

    Science.gov (United States)

    Salem, Sofia Fekih; Vabre, Alexandre; Mohammad-Djafari, Ali

    2007-11-01

    X-ray transmission microtomography plays an increasingly dominant role in the study and understanding of microsystems. Within this framework, an experimental setup for high-resolution X-ray microtomography was developed at CEA-List to quantify the physical parameters related to fluid flow in microsystems. Several difficulties arise from the nature of the experimental data collected on this setup: increased measurement errors due to various physical phenomena occurring during image formation (diffusion, beam hardening), and specificities of the setup (limited angle, partial view of the object, weak contrast). To reconstruct the object we must solve an inverse problem. This inverse problem is known to be ill-posed. It therefore needs to be regularized by introducing prior information. The main prior information we account for is that the object is composed of a finite, known number of different materials distributed in compact regions. This a priori information is introduced via a Gauss-Markov field for the contrast distributions, with a hidden Potts-Markov field for the material classes, in the Bayesian estimation framework. The computations are done using an appropriate Markov Chain Monte Carlo (MCMC) technique. In this paper, we first present the basic steps of the proposed algorithms. Then we focus on one of the main steps in any iterative reconstruction method: the computation of the forward and adjoint operators (projection and backprojection). A fast implementation of these two operators is crucial for the real application of the method. We give some details on the fast computation of these steps and show some preliminary simulation results.

  3. Bayesian image reconstruction: Application to emission tomography

    Energy Technology Data Exchange (ETDEWEB)

    Nunez, J.; Llacer, J.

    1989-02-01

    In this paper we propose a Maximum a Posteriori (MAP) method of image reconstruction in the Bayesian framework for the Poisson noise case. We use entropy to define the prior probability and the likelihood to define the conditional probability. The method uses sharpness parameters which can be theoretically computed or adjusted, allowing us to obtain MAP reconstructions without the problem of the "grey" reconstructions associated with pre-Bayesian reconstructions. We have developed several ways to solve the reconstruction problem and propose a new iterative algorithm which is stable, maintains positivity and converges to feasible images faster than the Maximum Likelihood Estimate method. We have successfully applied the new method to the case of Emission Tomography, both with simulated and real data. 41 refs., 4 figs., 1 tab.
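
    A minimal sketch of the type of objective such a method optimizes, on a toy one-dimensional problem: the Poisson negative log-likelihood plus an entropy prior weighted by a sharpness parameter. This is not the authors' iterative algorithm; the system matrix, data, and the value of beta are all illustrative assumptions.

```python
# Sketch: MAP reconstruction with an entropy prior for Poisson data.
# Toy 1-D problem; the system matrix A, the data and beta are illustrative.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n, m = 16, 32                       # image size, number of measurements
A = rng.uniform(0.0, 1.0, (m, n))   # toy projection matrix
x_true = rng.uniform(0.5, 2.0, n)
y = rng.poisson(A @ x_true)         # Poisson-distributed measurements
beta = 0.1                          # "sharpness" (regularization) parameter

def neg_log_posterior(x):
    lam = A @ x + 1e-12
    nll = np.sum(lam - y * np.log(lam))       # Poisson neg. log-likelihood
    entropy = -np.sum(x * np.log(x + 1e-12))  # entropy prior on the image
    return nll - beta * entropy               # MAP objective to minimize

res = minimize(neg_log_posterior, np.ones(n), method="L-BFGS-B",
               bounds=[(1e-9, None)] * n)     # positivity constraint
x_map = res.x
```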

  4. Bayesian Image Reconstruction Based on Voronoi Diagrams

    CERN Document Server

    Cabrera, G F; Hitschfeld, N

    2007-01-01

    We present a Bayesian Voronoi image reconstruction technique (VIR) for interferometric data. Bayesian analysis applied to the inverse problem allows us to derive the a posteriori probability of a novel parameterization of interferometric images. We use a variable Voronoi diagram as our model in place of the usual fixed pixel grid. A quantization of the intensity field allows us to calculate the likelihood function and a priori probabilities. The Voronoi image is optimized including the number of polygons as free parameters. We apply our algorithm to deconvolve simulated interferometric data. Residuals, restored images and chi^2 values are used to compare our reconstructions with fixed grid models. VIR has the advantage of modeling the image with few parameters, obtaining a better image from a Bayesian point of view.
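
    To make the parameterization concrete, here is a hedged sketch of how a Voronoi-diagram image model can be rendered: each pixel takes the intensity of its nearest generator point, so the free parameters are the generator positions and per-cell intensities. This illustrates the model class only, not the VIR optimization itself.

```python
# Sketch: rendering an image parameterized by a Voronoi diagram.
# Each pixel takes the intensity of its nearest generator point, so the
# free parameters are the generator positions and per-cell intensities.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(1)
n_poly, size = 20, 64
seeds = rng.uniform(0, size, (n_poly, 2))  # polygon generator positions
intensities = rng.uniform(0, 1, n_poly)    # one intensity per polygon

yy, xx = np.mgrid[0:size, 0:size]
pixels = np.column_stack([xx.ravel(), yy.ravel()])
_, nearest = cKDTree(seeds).query(pixels)  # nearest generator per pixel
image = intensities[nearest].reshape(size, size)
```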

  5. Structure-based Bayesian sparse reconstruction

    KAUST Repository

    Quadeer, Ahmed Abdul

    2012-12-01

    Sparse signal reconstruction algorithms have attracted research attention due to their wide applications in various fields. In this paper, we present a simple Bayesian approach that utilizes the sparsity constraint and a priori statistical information (Gaussian or otherwise) to obtain near optimal estimates. In addition, we make use of the rich structure of the sensing matrix encountered in many signal processing applications to develop a fast sparse recovery algorithm. The computational complexity of the proposed algorithm is very low compared with the widely used convex relaxation methods as well as greedy matching pursuit techniques, especially at high sparsity. © 1991-2012 IEEE.

  6. Food Reconstruction Using Isotopic Transferred Signals (FRUITS): A Bayesian Model for Diet Reconstruction

    OpenAIRE

    Fernandes, Ricardo; Millard, Andrew R.; Brabec, Marek; Nadeau, Marie-Josée; Grootes, Pieter

    2014-01-01

    Human and animal diet reconstruction studies that rely on tissue chemical signatures aim at providing estimates of the relative intake of potential food groups. However, several sources of uncertainty need to be considered when handling data. Bayesian mixing models provide a natural platform for handling diverse sources of uncertainty while allowing the user to contribute prior expert information. The Bayesian mixing model FRUITS (Food Reconstruction Using Isotopic Transferred Signals) was ...

  7. Hierarchical Bayesian sparse image reconstruction with application to MRFM

    CERN Document Server

    Dobigeon, Nicolas; Tourneret, Jean-Yves

    2008-01-01

    This paper presents a hierarchical Bayesian model to reconstruct sparse images when the observations are obtained from linear transformations and corrupted by an additive white Gaussian noise. Our hierarchical Bayes model is well suited to such naturally sparse image applications as it seamlessly accounts for properties such as sparsity and positivity of the image via appropriate Bayes priors. We propose a prior that is based on a weighted mixture of a positive exponential distribution and a mass at zero. The prior has hyperparameters that are tuned automatically by marginalization over the hierarchical Bayesian model. To overcome the complexity of the posterior distribution, a Gibbs sampling strategy is proposed. The Gibbs samples can be used to estimate the image to be recovered, e.g. by maximizing the estimated posterior distribution. In our fully Bayesian approach the posteriors of all the parameters are available. Thus our algorithm provides more information than other previously proposed sparse reconstr...

  8. Reconstruction of elongated bubbles fusing the information from multiple optical probes through a Bayesian inference technique

    Science.gov (United States)

    Chakraborty, Shubhankar; Roy Chaudhuri, Partha; Das, Prasanta Kr.

    2016-07-01

    In this communication, a novel optical technique has been proposed for the reconstruction of the shape of a Taylor bubble using measurements from multiple arrays of optical sensors. The deviation of an optical beam passing through the bubble depends on the contour of the bubble surface. A theoretical model of the deviation of a beam during the traverse of a Taylor bubble through it has been developed. Using this model and the time history of the deviation captured by the sensor array, the bubble shape has been reconstructed. The reconstruction has been performed using an inverse algorithm based on a Bayesian inference technique and a Markov chain Monte Carlo sampling algorithm. The reconstructed nose shape has been compared with the true shape, extracted through image processing of high speed images. Finally, an error analysis has been performed to pinpoint the sources of the errors.
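
    As a rough illustration of this kind of Bayesian inversion, the sketch below fits contour parameters to noisy deviation data with a random-walk Metropolis-Hastings sampler. The forward model, noise level, and parameterization are placeholder assumptions, not the paper's optical model.

```python
# Sketch: random-walk Metropolis-Hastings inversion of contour parameters.
# `forward` is a placeholder for the paper's optical beam-deviation model.
import numpy as np

rng = np.random.default_rng(2)

def forward(theta, t):
    # Placeholder forward model: deviation vs. time for parameters theta.
    return theta[0] * np.exp(-theta[1] * t)

t = np.linspace(0, 1, 50)
data = forward(np.array([1.0, 3.0]), t) + 0.05 * rng.standard_normal(t.size)

def log_post(theta):
    if np.any(theta <= 0):
        return -np.inf                        # flat prior on theta > 0
    r = data - forward(theta, t)
    return -0.5 * np.sum(r**2) / 0.05**2      # Gaussian likelihood

theta, samples = np.array([0.5, 1.0]), []
for _ in range(5000):
    prop = theta + 0.05 * rng.standard_normal(2)  # random-walk proposal
    if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
        theta = prop
    samples.append(theta)
theta_hat = np.mean(samples[1000:], axis=0)       # posterior-mean estimate
```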

  9. Food Reconstruction Using Isotopic Transferred Signals (FRUITS): A Bayesian Model for Diet Reconstruction

    Czech Academy of Sciences Publication Activity Database

    Fernandes, R.; Millard, A.R.; Brabec, Marek; Nadeau, M.J.; Grootes, P.

    2014-01-01

    Vol. 9, No. 2 (2014), Art. no. e87436. E-ISSN 1932-6203. Institutional support: RVO:67985807. Keywords: ancient diet reconstruction * stable isotope measurements * mixture model * Bayesian estimation * Dirichlet prior. Subject RIV: BB - Applied Statistics, Operational Research. Impact factor: 3.234, year: 2014

  10. A Nonparametric Bayesian Approach For Emission Tomography Reconstruction

    International Nuclear Information System (INIS)

    We introduce a PET reconstruction algorithm following a nonparametric Bayesian (NPB) approach. In contrast with Expectation Maximization (EM), the proposed technique does not rely on any space discretization. Namely, the activity distribution--normalized emission intensity of the spatial Poisson process--is considered as a spatial probability density and observations are the projections of random emissions whose distribution has to be estimated. This approach is nonparametric in the sense that the quantity of interest belongs to the set of probability measures on R^k (for reconstruction in k dimensions) and it is Bayesian in the sense that we define a prior directly on this spatial measure. In this context, we propose to model the nonparametric probability density as an infinite mixture of multivariate normal distributions. As a prior for this mixture we consider a Dirichlet Process Mixture (DPM) with a Normal-Inverse Wishart (NIW) model as base distribution of the Dirichlet Process. As in EM-family reconstruction, we use a data augmentation scheme where the set of hidden variables are the emission locations for each observed line of response in the continuous object space. Thanks to the data augmentation, we propose a Markov Chain Monte Carlo (MCMC) algorithm (Gibbs sampler) which is able to generate draws from the posterior distribution of the spatial intensity. A difference with EM is that one step of the Gibbs sampler corresponds to the generation of emission locations while only the expected number of emissions per pixel/voxel is used in EM. Another key difference is that the estimated spatial intensity is a continuous function such that there is no need to compute a projection matrix. Finally, draws from the intensity posterior distribution allow the estimation of posterior functionals like the variance or confidence intervals. Results are presented for simulated data based on a 2D brain phantom and compared to Bayesian MAP-EM.
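
    For intuition, the sketch below fits a truncated variational approximation of a Dirichlet process Gaussian mixture (scikit-learn's BayesianGaussianMixture) to simulated emission locations. The paper samples a DPM with a Gibbs sampler and a Normal-Inverse Wishart base distribution; this stand-in only illustrates the infinite-mixture density idea.

```python
# Sketch: a truncated variational Dirichlet-process Gaussian mixture as a
# stand-in for the infinite mixture described above. The paper samples a
# DPM with a Gibbs sampler; here we only fit simulated 2-D emission points.
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(3)
points = np.vstack([rng.normal([0, 0], 0.3, (500, 2)),    # two "hot"
                    rng.normal([2, 1], 0.5, (500, 2))])   # regions

dpm = BayesianGaussianMixture(
    n_components=20,                                   # truncation level
    weight_concentration_prior_type="dirichlet_process",
    covariance_type="full",
).fit(points)
density = np.exp(dpm.score_samples(points))  # continuous intensity estimate
```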

  11. A Nonparametric Bayesian Approach For Emission Tomography Reconstruction

    Science.gov (United States)

    Barat, Éric; Dautremer, Thomas

    2007-11-01

    We introduce a PET reconstruction algorithm following a nonparametric Bayesian (NPB) approach. In contrast with Expectation Maximization (EM), the proposed technique does not rely on any space discretization. Namely, the activity distribution—normalized emission intensity of the spatial Poisson process—is considered as a spatial probability density and observations are the projections of random emissions whose distribution has to be estimated. This approach is nonparametric in the sense that the quantity of interest belongs to the set of probability measures on R^k (for reconstruction in k dimensions) and it is Bayesian in the sense that we define a prior directly on this spatial measure. In this context, we propose to model the nonparametric probability density as an infinite mixture of multivariate normal distributions. As a prior for this mixture we consider a Dirichlet Process Mixture (DPM) with a Normal-Inverse Wishart (NIW) model as base distribution of the Dirichlet Process. As in EM-family reconstruction, we use a data augmentation scheme where the set of hidden variables are the emission locations for each observed line of response in the continuous object space. Thanks to the data augmentation, we propose a Markov Chain Monte Carlo (MCMC) algorithm (Gibbs sampler) which is able to generate draws from the posterior distribution of the spatial intensity. A difference with EM is that one step of the Gibbs sampler corresponds to the generation of emission locations while only the expected number of emissions per pixel/voxel is used in EM. Another key difference is that the estimated spatial intensity is a continuous function such that there is no need to compute a projection matrix. Finally, draws from the intensity posterior distribution allow the estimation of posterior functionals like the variance or confidence intervals. Results are presented for simulated data based on a 2D brain phantom and compared to Bayesian MAP-EM.

  12. Bayesian 3D velocity field reconstruction with VIRBIUS

    Science.gov (United States)

    Lavaux, Guilhem

    2016-03-01

    I describe a new Bayesian algorithm to infer the full three-dimensional velocity field from observed distances and spectroscopic galaxy catalogues. In addition to the velocity field itself, the algorithm reconstructs true distances, some cosmological parameters and specific non-linearities in the velocity field. The algorithm takes care of selection effects, miscalibration issues and can be easily extended to handle direct fitting of e.g. the inverse Tully-Fisher relation. I first describe the algorithm in detail alongside its performance. This algorithm is implemented in the VIRBIUS (VelocIty Reconstruction using Bayesian Inference Software) software package. I then test it on different mock distance catalogues with a varying complexity of observational issues. The model proved to give robust measurements of velocities for mock catalogues of 3000 galaxies. I expect the core of the algorithm to scale to tens of thousands of galaxies. It holds the promise of giving a better handle on future large and deep distance surveys for which individual errors on distance would impede velocity field inference.

  13. Bayesian 3D velocity field reconstruction with VIRBIuS

    CERN Document Server

    Lavaux, G

    2015-01-01

    I describe a new Bayesian algorithm to infer the full three-dimensional velocity field from observed distances and spectroscopic galaxy catalogues. In addition to the velocity field itself, the algorithm reconstructs true distances, some cosmological parameters and specific non-linearities in the velocity field. The algorithm takes care of selection effects, miscalibration issues and can be easily extended to handle direct fitting of, e.g., the inverse Tully-Fisher relation. I first describe the algorithm in detail alongside its performance. This algorithm is implemented in the VIRBIuS (VelocIty Reconstruction using Bayesian Inference Software) software package. I then test it on different mock distance catalogues with a varying complexity of observational issues. The model proved to give robust measurements of velocities for mock catalogues of 3,000 galaxies. I expect the core of the algorithm to scale to tens of thousands of galaxies. It holds the promise of giving a better handle on future large and d...

  14. A Bayesian Method for Estimating Evolutionary History

    OpenAIRE

    Kim, Joungyoun; Anthony, Nicola M; Larget, Bret R.

    2012-01-01

    Phylogeography is the study of evolutionary history among populations in a species associated with geographic genetic variation. This paper examines the phylogeography of three African gorilla subspecies based on two types of DNA sequence data. One type is HV1, the first hyper-variable region in the control region of the mitochondrial genome. The other type is nuclear mitochondrial DNA (Numt DNA), which results from the introgression of a copy of HV1 from the mitochondrial genome into the nuc...

  15. A new Bayesian approach to the reconstruction of spectral functions

    CERN Document Server

    Burnier, Yannis

    2013-01-01

    We present a novel approach for the reconstruction of spectra from Euclidean correlator data that makes close contact to modern Bayesian concepts. It is based upon an axiomatically justified dimensionless prior distribution, which in the case of constant prior function $m(\omega)$ only imprints smoothness on the reconstructed spectrum. In addition we are able to analytically integrate out the only relevant overall hyper-parameter $\alpha$ in the prior, removing the necessity for Gaussian approximations found e.g. in the Maximum Entropy Method. Using a quasi-Newton minimizer and high-precision arithmetic, we are then able to find the unique global extremum of $P[\rho|D]$ in the full $N_\omega\gg N_\tau$ dimensional search space. The method actually yields gradually improving reconstruction results if the quality of the supplied input data increases, without introducing artificial peak structures, often encountered in the MEM. To support these statements we present mock data analyses for the case of zero width ...
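
    A heavily simplified sketch of the optimization setup described above: a chi-squared misfit to correlator data plus a prior term around a default model m(omega), minimized over many more frequency bins than data points. The quadratic prior used here is illustrative only; the paper's dimensionless prior and its analytic integration over alpha are not reproduced.

```python
# Sketch: reconstructing rho(omega) from correlator data D(tau) = K rho by
# minimizing a chi^2 misfit plus a prior term around a default model m.
# The quadratic prior is illustrative; the paper's dimensionless prior and
# its analytic alpha-integration are not reproduced.
import numpy as np
from scipy.optimize import minimize

n_tau, n_w = 12, 200                  # N_tau data points, N_omega >> N_tau
tau = np.linspace(0.05, 0.6, n_tau)
w = np.linspace(0.1, 10.0, n_w)
K = np.exp(-np.outer(tau, w))         # toy kernel K(tau, omega)
rho_true = np.exp(-0.5 * ((w - 3.0) / 0.5) ** 2)
D = K @ rho_true + 1e-4 * np.random.default_rng(4).standard_normal(n_tau)

alpha, m = 1.0, np.full(n_w, 0.1)     # hyperparameter, default model

def objective(rho):
    chi2 = 0.5 * np.sum((K @ rho - D) ** 2) / 1e-8   # noise variance 1e-8
    prior = alpha * np.sum((rho - m) ** 2)           # illustrative prior
    return chi2 + prior

res = minimize(objective, m.copy(), method="L-BFGS-B",
               bounds=[(0, None)] * n_w)
rho_rec = res.x
```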

  16. Reconstruction of Zeff profiles at TEXTOR through Bayesian source separation

    International Nuclear Information System (INIS)

    We describe work in progress on the reconstruction of radial profiles of the ion effective charge Zeff on the TEXTOR tokamak, using statistical data analysis techniques. We introduce our diagnostic for the measurement of Bremsstrahlung emissivity signals. Zeff profiles can be determined by Abel inversion of line-integrated measurements of the Bremsstrahlung emissivity (εff) from the plasma and of the plasma electron density (ne) and temperature (Te). However, at the plasma edge only estimated values are routinely used for ne and Te, which are moreover determined at different toroidal locations. These various uncertainties hinder the interpretation of a Zeff profile outside the central plasma. In order to circumvent this problem, we propose several scenarios meant to allow the extraction by (Bayesian) Blind Source Separation techniques of either (line-integrated) Zeff wave shapes or absolutely calibrated signals from (line-integrated) emissivity signals, also using density and temperature signals as required. (authors)

  17. A Bayesian hierarchical model for reconstructing relative sea level: from raw data to rates of change

    OpenAIRE

    Cahill, N.; Kemp, A. C.; Horton, B. P.; Parnell, A.C.

    2015-01-01

    We present a holistic Bayesian hierarchical model for reconstructing the continuous and dynamic evolution of relative sea-level (RSL) change with fully quantified uncertainty. The reconstruction is produced from biological (foraminifera) and geochemical (δ13C) sea-level indicators preserved in dated cores of salt-marsh sediment. Our model is comprised of three modules: (1) A Bayesian transfer function for the calibration of foraminifera into tidal elevation,...

  18. A Bayesian hierarchical model for reconstructing relative sea level: from raw data to rates of change

    OpenAIRE

    Cahill, Niamh; Kemp, Andrew C.; Horton, Benjamin P.; Andrew C Parnell

    2016-01-01

    We present a Bayesian hierarchical model for reconstructing the continuous and dynamic evolution of relative sea-level (RSL) change with quantified uncertainty. The reconstruction is produced from biological (foraminifera) and geochemical (δ13C) sea-level indicators preserved in dated cores of salt-marsh sediment. Our model is comprised of three modules: (1) a new Bayesian transfer (B-TF) function for the calibration of biological indicators into tidal elevation, which is fl...

  19. Hominin life history: reconstruction and evolution.

    Science.gov (United States)

    Robson, Shannen L; Wood, Bernard

    2008-04-01

    In this review we attempt to reconstruct the evolutionary history of hominin life history from extant and fossil evidence. We utilize demographic life history theory and distinguish life history variables, traits such as weaning, age at sexual maturity, and life span, from life history-related variables such as body mass, brain growth, and dental development. The latter are either linked with, or can be used to make inferences about, life history, thus providing an opportunity for estimating life history parameters in fossil taxa. We compare the life history variables of modern great apes and identify traits that are likely to be shared by the last common ancestor of Pan-Homo and those likely to be derived in hominins. All great apes exhibit slow life histories and we infer this to be true of the last common ancestor of Pan-Homo and the stem hominin. Modern human life histories are even slower, exhibiting distinctively long post-menopausal life spans and later ages at maturity, pointing to a reduction in adult mortality since the Pan-Homo split. We suggest that lower adult mortality, distinctively short interbirth intervals, and early weaning characteristic of modern humans are derived features resulting from cooperative breeding. We evaluate the fidelity of three life history-related variables, body mass, brain growth and dental development, with the life history parameters of living great apes. We found that body mass is the best predictor of great ape life history events. Brain growth trajectories and dental development and eruption are weakly related proxies and inferences from them should be made with caution. We evaluate the evidence of life history-related variables available for extinct species and find that prior to the transitional hominins there is no evidence of any hominin taxon possessing a body size, brain size or aspects of dental development much different from what we assume to be the primitive life history pattern for the Pan-Homo clade. Data for

  20. Efficient reconstruction of contaminant release history

    Energy Technology Data Exchange (ETDEWEB)

    Alexander, Francis [Los Alamos National Laboratory]; Anghel, Marian [Los Alamos National Laboratory]; Gulbahce, Natali [NON LANL]; Tartakovsky, Daniel [NON LANL]

    2009-01-01

    We present a generalized hybrid Monte Carlo (GHMC) method for fast, statistically optimal reconstruction of release histories of reactive contaminants. The approach is applicable to large-scale, strongly nonlinear systems with parametric uncertainties and data corrupted by measurement errors. The use of discrete adjoint equations facilitates numerical implementation of GHMC, without putting any restrictions on the degree of nonlinearity of the advection-dispersion-reaction equations that are used to describe contaminant transport in the subsurface. To demonstrate the salient features of the proposed algorithm, we identify the spatial extent of a distributed source of contamination from concentration measurements of a reactive solute.

  21. Bayesian inference of the demographic history of chimpanzees.

    Science.gov (United States)

    Wegmann, Daniel; Excoffier, Laurent

    2010-06-01

    Due to an almost complete absence of a fossil record, the evolutionary history of chimpanzees has only recently been studied on the basis of genetic data. Although the general topology of the chimpanzee phylogeny is well established, uncertainties remain concerning the size of current and past populations, the occurrence of bottlenecks or population expansions, and divergence times and migration rates between subspecies. Here, we present a novel attempt at globally inferring the detailed evolution of the Pan genus based on approximate Bayesian computation, an approach preferentially applied to complex models where the likelihood cannot be computed analytically. Based on two microsatellite and DNA sequence data sets and adjusting simulated data for local levels of inbreeding and patterns of missing data, we find support for several new features of chimpanzee evolution as compared with previous studies based on smaller data sets and simpler evolutionary models. We find that the central chimpanzees are certainly the oldest population of all P. troglodytes subspecies and that the other two P. t. subspecies diverged from the central chimpanzees by founder events. We also find an older divergence time (1.6 million years [My]) between common chimpanzees and bonobos than previous studies (0.9-1.3 My), but this divergence appears to have been very gradual, with the maintenance of relatively high levels of gene flow between the ancestral chimpanzee population and the bonobos. Finally, we could also confirm the existence of strong unidirectional gene flow from the western into the central chimpanzees. These results show that interesting and innovative features of chimpanzee history emerge when considering their whole evolutionary history in a single analysis, rather than relying on simpler models involving several comparisons of pairs of populations. PMID:20118191

  22. Reconstructing the history of dark energy using maximum entropy

    OpenAIRE

    Zunckel, C.; Trotta, R.

    2007-01-01

    We present a Bayesian technique based on a maximum entropy method to reconstruct the dark energy equation of state $w(z)$ in a non-parametric way. This MaxEnt technique allows us to incorporate relevant prior information while adjusting the degree of smoothing of the reconstruction in response to the structure present in the data. After demonstrating the method on synthetic data, we apply it to current cosmological data, separately analysing type Ia supernovae measurements from the HST/GOODS pro...

  23. Theoretical evaluation of the detectability of random lesions in Bayesian emission reconstruction

    International Nuclear Information System (INIS)

    Detecting cancerous lesions is an important task in positron emission tomography (PET). Bayesian methods based on the maximum a posteriori principle (also called penalized maximum likelihood methods) have been developed to deal with the low signal-to-noise ratio in the emission data. Similar to the filter cut-off frequency in the filtered backprojection method, the prior parameters in Bayesian reconstruction control the resolution and noise trade-off and hence affect the detectability of lesions in reconstructed images. Bayesian reconstructions are difficult to analyze because the resolution and noise properties are nonlinear and object-dependent. Most research has been based on Monte Carlo simulations, which are very time consuming. Building on recent progress in the theoretical analysis of image properties of statistical reconstructions and the development of numerical observers, here we develop a theoretical approach for fast computation of lesion detectability in Bayesian reconstruction. The results can be used to choose the optimum hyperparameter for the maximum lesion detectability. New in this work is the use of theoretical expressions that explicitly model the statistical variation of the lesion and background without assuming that the object variation is (locally) stationary. The theoretical results are validated using Monte Carlo simulations. The comparisons show good agreement between the theoretical predictions and the Monte Carlo results.

  24. A novel reconstruction algorithm for bioluminescent tomography based on Bayesian compressive sensing

    Science.gov (United States)

    Wang, Yaqi; Feng, Jinchao; Jia, Kebin; Sun, Zhonghua; Wei, Huijun

    2016-03-01

    Bioluminescence tomography (BLT) is becoming a promising tool because it can resolve the biodistribution of bioluminescent reporters associated with cellular and subcellular function through several millimeters to centimeters of tissue in vivo. However, BLT reconstruction is an ill-posed problem. By incorporating sparse a priori information about the bioluminescent source, enhanced image quality is obtained with sparsity-based reconstruction algorithms. Therefore, sparsity-based BLT reconstruction algorithms have great potential. Here, we proposed a novel reconstruction method based on Bayesian compressive sensing and investigated its feasibility and effectiveness with a heterogeneous phantom. The results demonstrate the potential and merits of the proposed algorithm.

  25. Sparse reconstruction using distribution agnostic Bayesian matching pursuit

    KAUST Repository

    Masood, Mudassir

    2013-11-01

    A fast matching pursuit method using a Bayesian approach is introduced for sparse signal recovery. This method performs Bayesian estimates of sparse signals even when the signal prior is non-Gaussian or unknown. It is agnostic on signal statistics and utilizes a priori statistics of additive noise and the sparsity rate of the signal, which are shown to be easily estimated from data if not available. The method utilizes a greedy approach and order-recursive updates of its metrics to find the most dominant sparse supports to determine the approximate minimum mean-square error (MMSE) estimate of the sparse signal. Simulation results demonstrate the power and robustness of our proposed estimator. © 2013 IEEE.
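
    A hedged sketch of the greedy flavor of such methods: orthogonal-matching-pursuit-style support selection combined with a Gaussian MMSE estimate on the selected support. The paper's order-recursive metric updates and distribution-agnostic refinements are not reproduced here.

```python
# Sketch: greedy support selection with a Gaussian MMSE estimate on the
# chosen support (an OMP-style loop). The paper's order-recursive metric
# updates and support-search refinements are not reproduced.
import numpy as np

rng = np.random.default_rng(5)
n, m, k = 64, 24, 4
A = rng.standard_normal((m, n)) / np.sqrt(m)
x = np.zeros(n)
x[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
sigma2 = 1e-4
y = A @ x + np.sqrt(sigma2) * rng.standard_normal(m)

support, residual = [], y.copy()
for _ in range(k):
    j = int(np.argmax(np.abs(A.T @ residual)))  # most correlated column
    support.append(j)
    As = A[:, support]
    # MMSE estimate on the support under a unit Gaussian prior on x_S:
    xs = np.linalg.solve(As.T @ As + sigma2 * np.eye(len(support)), As.T @ y)
    residual = y - As @ xs

x_hat = np.zeros(n)
x_hat[support] = xs
```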

  26. Comparing Nonparametric Bayesian Tree Priors for Clonal Reconstruction of Tumors

    OpenAIRE

    Deshwar, Amit G; Vembu, Shankar; Morris, Quaid

    2014-01-01

    Statistical machine learning methods, especially nonparametric Bayesian methods, have become increasingly popular to infer clonal population structure of tumors. Here we describe the treeCRP, an extension of the Chinese restaurant process (CRP), a popular construction used in nonparametric mixture models, to infer the phylogeny and genotype of major subclonal lineages represented in the population of cancer cells. We also propose new split-merge updates tailored to the subclonal reconstructio...

  27. Texture-preserving Bayesian image reconstruction for low-dose CT

    Science.gov (United States)

    Zhang, Hao; Han, Hao; Hu, Yifan; Liu, Yan; Ma, Jianhua; Li, Lihong; Moore, William; Liang, Zhengrong

    2016-03-01

    Markov random field (MRF) model has been widely used in Bayesian image reconstruction to reconstruct piecewise smooth images in the presence of noise, such as in low-dose X-ray computed tomography (LdCT). While it can preserve edge sharpness via edge-preserving potential function, its regional smoothing may sacrifice tissue image textures, which have been recognized as useful imaging biomarkers, and thus it compromises clinical tasks such as differentiating malignant vs. benign lesions, e.g., lung nodule or colon polyp. This study aims to shift the edge preserving regional noise smoothing paradigm to texture-preserving framework for LdCT image reconstruction while retaining the advantage of MRF's neighborhood system on edge preservation. Specifically, we adapted the MRF model to incorporate the image textures of lung, bone, fat, muscle, etc. from previous full-dose CT scan as a priori knowledge for texture-preserving Bayesian reconstruction of current LdCT images. To show the feasibility of proposed reconstruction framework, experiments using clinical patient scans (with lung nodule or colon polyp) were conducted. The experimental outcomes showed noticeable gain by the a priori knowledge for LdCT image reconstruction with the well-known Haralick texture measures. Thus, it is conjectured that texture-preserving LdCT reconstruction has advantages over edge-preserving regional smoothing paradigm for texture-specific clinical applications.

  28. A novel Bayesian approach to spectral function reconstruction

    CERN Document Server

    Burnier, Yannis

    2013-01-01

    We present a novel approach to the inference of spectral functions from Euclidean time correlator data that makes close contact with modern Bayesian concepts. Our method differs significantly from the maximum entropy method (MEM). A new set of axioms is postulated for the prior probability, leading to an improved expression, which is devoid of the asymptotically flat directions present in the Shannon-Jaynes entropy. Hyperparameters are integrated out explicitly, liberating us from the Gaussian approximations underlying the evidence approach of the MEM. We present a realistic test of our method in the context of the non-perturbative extraction of the heavy quark potential. Based on hard-thermal-loop correlator mock data, we establish firm requirements in the number of data points and their accuracy for a successful extraction of the potential from lattice QCD. An improved potential estimation from previously investigated quenched lattice QCD correlators is provided.

  29. Automated comparison of Bayesian reconstructions of experimental profiles with physical models

    International Nuclear Information System (INIS)

    In this work we developed an expert system that carries out, in an integrated and fully automated way, (i) a reconstruction of plasma profiles from the measurements using Bayesian analysis, (ii) a prediction of the reconstructed quantities according to some models, and (iii) an intelligent comparison of the first two steps. This system includes systematic checking of the internal consistency of the reconstructed quantities, enables automated model validation and, if a well-validated model is used, can be applied to help detect interesting new physics in an experiment. The work shows three applications of this quite general system. The expert system can successfully detect failures in the automated plasma reconstruction and provide (on successful reconstruction cases) statistics of agreement of the models with the experimental data, i.e. information on the model validity. (author)

  30. Milestones in the History of Ear Reconstruction.

    Science.gov (United States)

    Berghaus, Alexander; Nicoló, Marion San

    2015-12-01

    The reconstruction of ear deformities has challenged plastic surgeons for centuries. However, it was only in the 19th century that reports on partial and total ear reconstruction began to increase. In the quest for an aesthetically pleasing and natural-looking result, surgeons worked on the perfect framework and skin coverage. Different materials and flap techniques have evolved. Some were abandoned out of frustration, while others kept evolving over the years. In this article, we discuss the milestones in ear reconstruction, from ancient times and early attempts in Western civilization to the key chapters of ear reconstruction in the 20th century leading to the current techniques. PMID:26667630

  31. Bayesian PET image reconstruction incorporating anato-functional joint entropy

    Science.gov (United States)

    Tang, Jing; Rahmim, Arman

    2009-12-01

    We developed a maximum a posteriori (MAP) reconstruction method for positron emission tomography (PET) image reconstruction incorporating magnetic resonance (MR) image information, with the joint entropy between the PET and MR image features serving as the regularization constraint. A non-parametric method was used to estimate the joint probability density of the PET and MR images. Using realistically simulated PET and MR human brain phantoms, the quantitative performance of the proposed algorithm was investigated. Incorporation of the anatomic information via this technique, after parameter optimization, was seen to dramatically improve the noise versus bias tradeoff in every region of interest, compared to the result from using conventional MAP reconstruction. In particular, hot lesions in the FDG PET image, which had no anatomical correspondence in the MR image, also had improved contrast versus noise tradeoff. Corrections were made to figures 3, 4 and 6, and to the second paragraph of section 3.1 on 13 November 2009. The corrected electronic version is identical to the print version.
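
    The regularizer at the heart of this method is the joint entropy of the PET and MR images. A minimal sketch, using a simple 2-D histogram in place of the paper's non-parametric density estimate:

```python
# Sketch: histogram-based joint entropy of two images, the quantity used
# as the regularizer above (the paper estimates the joint density with a
# non-parametric kernel method; a histogram is the simplest stand-in).
import numpy as np

def joint_entropy(img_a, img_b, bins=32):
    hist, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    p = hist / hist.sum()          # joint probability estimate
    p = p[p > 0]
    return -np.sum(p * np.log(p))

rng = np.random.default_rng(6)
mr = rng.uniform(0, 1, (64, 64))
pet = 0.7 * mr + 0.3 * rng.uniform(0, 1, (64, 64))  # correlated toy images
H = joint_entropy(pet, mr)
```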

  32. A Bayesian hierarchical model for reconstructing relative sea level: from raw data to rates of change

    Directory of Open Access Journals (Sweden)

    N. Cahill

    2015-10-01

    Full Text Available We present a holistic Bayesian hierarchical model for reconstructing the continuous and dynamic evolution of relative sea-level (RSL) change with fully quantified uncertainty. The reconstruction is produced from biological (foraminifera) and geochemical (δ13C) sea-level indicators preserved in dated cores of salt-marsh sediment. Our model is comprised of three modules: (1) A Bayesian transfer function for the calibration of foraminifera into tidal elevation, which is flexible enough to formally accommodate additional proxies (in this case bulk-sediment δ13C values), (2) A chronology developed from an existing Bchron age-depth model, and (3) An existing errors-in-variables integrated Gaussian process (EIV-IGP) model for estimating rates of sea-level change. We illustrate our approach using a case study of Common Era sea-level variability from New Jersey, USA. We develop a new Bayesian transfer function (B-TF), with and without the δ13C proxy, and compare our results to those from a widely-used weighted-averaging transfer function (WA-TF). The formal incorporation of a second proxy into the B-TF model results in smaller vertical uncertainties and improved accuracy for reconstructed RSL. The vertical uncertainty from the multi-proxy B-TF is ∼ 28 % smaller on average compared to the WA-TF. When evaluated against historic tide-gauge measurements, the multi-proxy B-TF most accurately reconstructs the RSL changes observed in the instrumental record (MSE = 0.003 m2). The holistic model provides a single, unifying framework for reconstructing and analysing sea level through time. This approach is suitable for reconstructing other paleoenvironmental variables using biological proxies.

  33. A Bayesian hierarchical model for reconstructing relative sea level: from raw data to rates of change

    Science.gov (United States)

    Cahill, Niamh; Kemp, Andrew C.; Horton, Benjamin P.; Parnell, Andrew C.

    2016-02-01

    We present a Bayesian hierarchical model for reconstructing the continuous and dynamic evolution of relative sea-level (RSL) change with quantified uncertainty. The reconstruction is produced from biological (foraminifera) and geochemical (δ13C) sea-level indicators preserved in dated cores of salt-marsh sediment. Our model is comprised of three modules: (1) a new Bayesian transfer (B-TF) function for the calibration of biological indicators into tidal elevation, which is flexible enough to formally accommodate additional proxies; (2) an existing chronology developed using the Bchron age-depth model, and (3) an existing Errors-In-Variables integrated Gaussian process (EIV-IGP) model for estimating rates of sea-level change. Our approach is illustrated using a case study of Common Era sea-level variability from New Jersey, USA. We develop a new B-TF using foraminifera, with and without the additional (δ13C) proxy and compare our results to those from a widely used weighted-averaging transfer function (WA-TF). The formal incorporation of a second proxy into the B-TF model results in smaller vertical uncertainties and improved accuracy for reconstructed RSL. The vertical uncertainty from the multi-proxy B-TF is ∼ 28 % smaller on average compared to the WA-TF. When evaluated against historic tide-gauge measurements, the multi-proxy B-TF most accurately reconstructs the RSL changes observed in the instrumental record (mean square error = 0.003 m2). The Bayesian hierarchical model provides a single, unifying framework for reconstructing and analyzing sea-level change through time. This approach is suitable for reconstructing other paleoenvironmental variables (e.g., temperature) using biological proxies.

  34. Reconstruction of Neutral Hydrogen Density Profiles in HANBIT Magnetic Mirror Device Using Bayesian Probability Theory

    International Nuclear Information System (INIS)

    Hydrogen is the main constituent of plasmas in the HANBIT magnetic mirror device; therefore, measurement of the emission from excited levels of hydrogen atoms is an important diagnostic tool. From the emissivity of Hα radiation one can derive quantities such as the neutral hydrogen density and the source rate. An unbiased and consistent approach based on probability theory, within the framework of Bayesian inference, is applied to the reconstruction of Hα emissivity profiles and neutral hydrogen density profiles in the HANBIT magnetic mirror device.

  35. Optimization of Bayesian Emission tomographic reconstruction for region of interest quantitation

    Energy Technology Data Exchange (ETDEWEB)

    Qi, Jinyi

    2003-01-10

    Region of interest (ROI) quantitation is an important task in emission tomography (e.g., positron emission tomography and single photon emission computed tomography). It is essential for exploring clinical factors such as tumor activity, growth rate, and the efficacy of therapeutic interventions. Bayesian methods based on the maximum a posteriori principle (also called penalized maximum likelihood methods) have been developed for emission image reconstructions to deal with the low signal-to-noise ratio of the emission data. Similar to the filter cut-off frequency in the filtered backprojection method, the smoothing parameter of the image prior in Bayesian reconstruction controls the resolution and noise trade-off and hence affects ROI quantitation. In this paper we present an approach for choosing the optimum smoothing parameter in Bayesian reconstruction for ROI quantitation. Bayesian reconstructions are difficult to analyze because the resolution and noise properties are nonlinear and object-dependent. Building on recent progress in deriving approximate expressions for the local impulse response function and the covariance matrix, we derived simplified theoretical expressions for the bias, the variance, and the ensemble mean squared error (EMSE) of the ROI quantitation. One problem in evaluating ROI quantitation is that the truth is often required for calculating the bias. This is overcome by using the ensemble distribution of the activity inside the ROI and computing the average EMSE. The resulting expressions allow fast evaluation of the image quality for different smoothing parameters. The optimum smoothing parameter of the image prior can then be selected to minimize the EMSE.

  36. Optimization of Bayesian Emission tomographic reconstruction for region of interest quantitation

    International Nuclear Information System (INIS)

    Region of interest (ROI) quantitation is an important task in emission tomography (e.g., positron emission tomography and single photon emission computed tomography). It is essential for exploring clinical factors such as tumor activity, growth rate, and the efficacy of therapeutic interventions. Bayesian methods based on the maximum a posteriori principle (also called penalized maximum likelihood methods) have been developed for emission image reconstructions to deal with the low signal-to-noise ratio of the emission data. Similar to the filter cut-off frequency in the filtered backprojection method, the smoothing parameter of the image prior in Bayesian reconstruction controls the resolution and noise trade-off and hence affects ROI quantitation. In this paper we present an approach for choosing the optimum smoothing parameter in Bayesian reconstruction for ROI quantitation. Bayesian reconstructions are difficult to analyze because the resolution and noise properties are nonlinear and object-dependent. Building on recent progress in deriving approximate expressions for the local impulse response function and the covariance matrix, we derived simplified theoretical expressions for the bias, the variance, and the ensemble mean squared error (EMSE) of the ROI quantitation. One problem in evaluating ROI quantitation is that the truth is often required for calculating the bias. This is overcome by using the ensemble distribution of the activity inside the ROI and computing the average EMSE. The resulting expressions allow fast evaluation of the image quality for different smoothing parameters. The optimum smoothing parameter of the image prior can then be selected to minimize the EMSE.

  37. Technical Note: Probabilistically constraining proxy age–depth models within a Bayesian hierarchical reconstruction model

    Directory of Open Access Journals (Sweden)

    J. P. Werner

    2015-03-01

    Full Text Available Reconstructions of the late-Holocene climate rely heavily upon proxies that are assumed to be accurately dated by layer counting, such as measurements of tree rings, ice cores, and varved lake sediments. Considerable advances could be achieved if time-uncertain proxies could be included within these multiproxy reconstructions, and if time uncertainties were recognized and correctly modeled for proxies commonly treated as free of age model errors. Current approaches for accounting for time uncertainty are generally limited to repeating the reconstruction using each one of an ensemble of age models, thereby inflating the final estimated uncertainty – in effect, each possible age model is given equal weighting. Uncertainties can be reduced by exploiting the inferred space–time covariance structure of the climate to re-weight the possible age models. Here, we demonstrate how Bayesian hierarchical climate reconstruction models can be augmented to account for time-uncertain proxies. Critically, although a priori all age models are given equal probability of being correct, the probabilities associated with the age models are formally updated within the Bayesian framework, thereby reducing uncertainties. Numerical experiments show that updating the age model probabilities decreases uncertainty in the resulting reconstructions, as compared with the current de facto standard of sampling over all age models, provided there is sufficient information from other data sources in the spatial region of the time-uncertain proxy. This approach can readily be generalized to non-layer-counted proxies, such as those derived from marine sediments.
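
    A toy sketch of the key idea, updating age-model probabilities rather than weighting them equally. Here the ensemble of age models is reduced to a set of candidate time shifts, and the likelihood is a Gaussian match to a well-dated reference series; both are illustrative assumptions, not the paper's hierarchical model.

```python
# Sketch: updating the probabilities of an ensemble of age models rather
# than weighting them equally. The age models are reduced to candidate time
# shifts and the likelihood is a Gaussian match to a well-dated reference.
import numpy as np

rng = np.random.default_rng(7)
t = np.arange(100)
reference = np.sin(2 * np.pi * t / 50)           # well-dated signal
proxy = np.roll(reference, 3) + 0.2 * rng.standard_normal(t.size)

shifts = np.arange(-10, 11)                      # candidate age models
log_like = np.array(
    [-0.5 * np.sum((np.roll(proxy, -s) - reference) ** 2) / 0.2 ** 2
     for s in shifts])
weights = np.exp(log_like - log_like.max())
weights /= weights.sum()                         # updated age-model probs
best_shift = shifts[np.argmax(weights)]          # should recover 3
```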

  38. Bayesian image reconstruction for emission tomography based on median root prior

    International Nuclear Information System (INIS)

    The aim of the present study was to investigate a new type of Bayesian one-step-late reconstruction method which utilizes a median root prior (MRP). The method favours images which have locally monotonous radioactivity concentrations. The new reconstruction algorithm was applied to ideal simulated data, phantom data and some patient examinations with PET. The same projection data were reconstructed with filtered back-projection (FBP) and maximum likelihood-expectation maximization (ML-EM) methods for comparison. The MRP method provided good-quality images with a similar resolution to the FBP method with a ramp filter, and at the same time the noise properties were as good as with Hann-filtered FBP images. The typical artefacts seen in FBP reconstructed images outside of the object were completely removed, as was the grainy noise inside the object. Quantitatively, the resulting average regional radioactivity concentrations in a large region of interest in images produced by the MRP method corresponded to the FBP and ML-EM results, but at the pixel-by-pixel level the MRP method proved to be the most accurate of the tested methods. In contrast to other iterative reconstruction methods, e.g. ML-EM, the MRP method was sensitive neither to the number of iterations nor to the adjustment of reconstruction parameters. Only the Bayesian parameter β had to be set. The proposed MRP method is much simpler to calculate than the methods described previously, both with regard to the parameter settings and in terms of general use. The new MRP reconstruction method was shown to produce high-quality quantitative emission images with only one parameter setting in addition to the number of iterations. (orig.)
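
    A minimal sketch of a one-step-late EM iteration with a median root prior on a toy 1-D problem: the penalty pulls each pixel toward the median of its neighbourhood, favouring locally monotonic solutions. The system matrix, data and beta are illustrative assumptions, and details may differ from the paper's implementation.

```python
# Sketch: one-step-late EM iteration with a median root prior on a toy 1-D
# problem. The penalty pulls each pixel toward the median of its
# neighbourhood, favouring locally monotonic images.
import numpy as np
from scipy.ndimage import median_filter

rng = np.random.default_rng(8)
n, m = 64, 96
A = rng.uniform(0, 1, (m, n))            # toy system matrix
lam_true = np.ones(n)
lam_true[20:40] = 3.0                    # a "hot" region
y = rng.poisson(A @ lam_true)
beta = 0.3                               # the single Bayesian parameter

lam = np.ones(n)
sens = A.T @ np.ones(m)                  # sensitivity image
for _ in range(50):
    med = median_filter(lam, size=3)
    ratio = A.T @ (y / (A @ lam + 1e-12))            # ML-EM update term
    penalty = beta * (lam - med) / (med + 1e-12)     # MRP penalty
    lam = np.maximum(lam * ratio / (sens + penalty), 1e-12)
```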

  39. Bayesian inference of population size history from multiple loci

    Directory of Open Access Journals (Sweden)

    Drummond, Alexei J

    2008-10-01

    Full Text Available Background: Effective population size (Ne) is related to genetic variability and is a basic parameter in many models of population genetics. A number of methods for inferring current and past population sizes from genetic data have been developed since JFC Kingman introduced the n-coalescent in 1982. Here we present the Extended Bayesian Skyline Plot, a non-parametric Bayesian Markov chain Monte Carlo algorithm that extends a previous coalescent-based method in several ways, including the ability to analyze multiple loci. Results: Through extensive simulations we show the accuracy and limitations of inferring population size as a function of the amount of data, including recovering information about evolutionary bottlenecks. We also analyzed two real data sets to demonstrate the behavior of the new method; a single-gene Hepatitis C virus data set sampled from Egypt and a 10-locus Drosophila ananassae data set representing 16 different populations. Conclusion: The results demonstrate the essential role of multiple loci in recovering population size dynamics. Multi-locus data from a small number of individuals can precisely recover past bottlenecks in population size which cannot be characterized by analysis of a single locus. We also demonstrate that sequence data quality is important because even moderate levels of sequencing errors result in a considerable decrease in estimation accuracy for realistic levels of population genetic variability.

  40. A Bayesian Hierarchical Model for Reconstructing Sea Levels: From Raw Data to Rates of Change

    CERN Document Server

    Cahill, Niamh; Kemp, Andrew C.; Horton, Benjamin P.; Parnell, Andrew C.

    2015-01-01

    We present a holistic Bayesian hierarchical model for reconstructing the continuous and dynamic evolution of relative sea-level (RSL) change with fully quantified uncertainty. The reconstruction is produced from biological (foraminifera) and geochemical (δ13C) sea-level indicators preserved in dated cores of salt-marsh sediment. Our model is comprised of three modules: (1) A Bayesian transfer function for the calibration of foraminifera into tidal elevation, which is flexible enough to formally accommodate additional proxies (in this case bulk-sediment δ13C values); (2) A chronology developed from an existing Bchron age-depth model, and (3) An existing errors-in-variables integrated Gaussian process (EIV-IGP) model for estimating rates of sea-level change. We illustrate our approach using a case study of Common Era sea-level variability from New Jersey, U.S.A. We develop a new Bayesian transfer function (B-TF), with and without the δ13C proxy and compare our results to those from a widely...

  41. Reconstructing the evolutionary history of natural languages

    Energy Technology Data Exchange (ETDEWEB)

    Warnow, T.; Ringe, D.; Taylor, A. [Univ. of Pennsylvania, Philadelphia, PA (United States)

    1996-12-31

    In this paper we present a new methodology for determining the evolutionary history of related languages. Our methodology uses linguistic information encoded as qualitative characters, and provides much greater precision than previous methods. Our analysis of Indo-European (IE) languages resolves questions that have troubled scholars for over a century.

  42. Bayesian Multi-Energy Computed Tomography reconstruction approaches based on decomposition models

    International Nuclear Information System (INIS)

    Multi-Energy Computed Tomography (MECT) makes it possible to obtain multiple fractions of basis materials without segmentation. In medical applications, one is the soft-tissue-equivalent water fraction and the other is the hard-matter-equivalent bone fraction. Practical MECT measurements are usually obtained with polychromatic X-ray beams. Existing reconstruction approaches based on linear forward models that do not account for beam polychromaticity fail to estimate the correct decomposition fractions and result in Beam-Hardening Artifacts (BHA). The existing BHA correction approaches either need to refer to calibration measurements or suffer from the noise amplification caused by the negative-log pre-processing and the water and bone separation problem. To overcome these problems, statistical DECT reconstruction approaches based on non-linear forward models that account for beam polychromaticity show great potential for giving accurate fraction images. This work proposes a full-spectral Bayesian reconstruction approach which allows the reconstruction of high-quality fraction images from ordinary polychromatic measurements. This approach is based on a Gaussian noise model with unknown variance assigned directly to the projections without taking the negative log. Referring to Bayesian inference, the decomposition fractions and observation variance are estimated by using the joint Maximum A Posteriori (MAP) estimation method. Subject to an adaptive prior model assigned to the variance, the joint estimation problem is then simplified into a single estimation problem. It transforms the joint MAP estimation problem into a minimization problem with a non-quadratic cost function. To solve it, the use of a monotone Conjugate Gradient (CG) algorithm with suboptimal descent steps is proposed. The performance of the proposed approach is analyzed with both simulated and experimental data. The results show that the proposed Bayesian approach is robust to noise and materials. It is also
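
    To see why the forward model becomes non-linear, the sketch below computes a polychromatic projection: the detected intensity integrates the source spectrum attenuated by the energy-dependent coefficients of the two basis materials. The spectrum and attenuation curves are toy assumptions, not calibrated values.

```python
# Sketch: a non-linear polychromatic forward model. The detected intensity
# integrates the source spectrum attenuated by the energy-dependent
# coefficients of the two basis materials; all curves are toy values.
import numpy as np

E = np.linspace(20, 120, 50)              # energy grid (keV)
S = np.exp(-0.5 * ((E - 60) / 20) ** 2)   # toy source spectrum
S /= S.sum()
mu_water = 0.3 * (60 / E)                 # toy attenuation curves (1/cm)
mu_bone = 0.9 * (60 / E) ** 1.5

def polychromatic_projection(a_water, a_bone):
    # a_*: line integrals of the water / bone fractions along the ray (cm)
    return np.sum(S * np.exp(-(a_water * mu_water + a_bone * mu_bone)))

intensity = polychromatic_projection(a_water=5.0, a_bone=1.0)
```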

  43. Accurate reconstruction of insertion-deletion histories by statistical phylogenetics.

    Directory of Open Access Journals (Sweden)

    Oscar Westesson

    Full Text Available The Multiple Sequence Alignment (MSA) is a computational abstraction that represents a partial summary either of indel history, or of structural similarity. Taking the former view (indel history), it is possible to use formal automata theory to generalize the phylogenetic likelihood framework for finite substitution models (Dayhoff's probability matrices and Felsenstein's pruning algorithm) to arbitrary-length sequences. In this paper, we report results of a simulation-based benchmark of several methods for reconstruction of indel history. The methods tested include a relatively new algorithm for statistical marginalization of MSAs that sums over a stochastically sampled ensemble of the most probable evolutionary histories. For mammalian evolutionary parameters on several different trees, the single most likely history sampled by our algorithm appears less biased than histories reconstructed by other MSA methods. The algorithm can also be used for alignment-free inference, where the MSA is explicitly summed out of the analysis. As an illustration of our method, we discuss reconstruction of the evolutionary histories of human protein-coding genes.

  44. Bayesian three-dimensional reconstruction of toothed whale trajectories: Passive acoustics assisted with visual and tagging measurements

    OpenAIRE

    Laplanche, Christophe

    2012-01-01

    The author describes and evaluates a Bayesian method to reconstruct three-dimensional toothed whale trajectories from a series of echolocation signals. Localization by using passive acoustic data (time of arrival of source signals at receptors) is assisted by using visual data (coordinates of the whale when diving and resurfacing) and tag information (movement statistics). The efficiency of the Bayesian method is compared to the standard minimum mean squared error st...

  45. A hierarchical Bayesian approach for reconstructing the Initial Mass Function of Single Stellar Populations

    CERN Document Server

    Dries, M; Koopmans, L V E

    2016-01-01

    Recent studies based on the integrated light of distant galaxies suggest that the initial mass function (IMF) might not be universal. Variations of the IMF with galaxy type and/or formation time may have important consequences for our understanding of galaxy evolution. We have developed a new stellar population synthesis (SPS) code specifically designed to reconstruct the IMF. We implement a novel approach combining regularization with hierarchical Bayesian inference. Within this approach we use a parametrized IMF prior to regulate a direct inference of the IMF. This direct inference gives more freedom to the IMF and allows the model to deviate from parametrized models when demanded by the data. We use Markov Chain Monte Carlo sampling techniques to reconstruct the best parameters for the IMF prior, the age, and the metallicity of a single stellar population. We present our code and apply our model to a number of mock single stellar populations with different ages, metallicities, and IMFs. When systematic unc...

  6. Improving the quality of small animal brain pinhole SPECT imaging by Bayesian reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Sohlberg, Antti [Department of Clinical Physiology and Nuclear Medicine, Kuopio University Hospital, P.O. Box 1777, 70211, Kuopio (Finland); Lensu, Sanna [Department of Pharmacology and Toxicology, University of Kuopio, Kuopio (Finland); Department of Environmental Health, National Public Health Institute, Kuopio (Finland); Jolkkonen, Jukka [Department of Neuroscience and Neurology, University of Kuopio, Kuopio (Finland); Tuomisto, Leena [Department of Pharmacology and Toxicology, University of Kuopio, Kuopio (Finland); Ruotsalainen, Ulla [Institute of Signal Processing, DMI, Tampere University of Technology, Tampere (Finland); Kuikka, Jyrki T. [Department of Clinical Physiology and Nuclear Medicine, Kuopio University Hospital, P.O. Box 1777, 70211, Kuopio (Finland); Niuvanniemi Hospital, Kuopio (Finland)

    2004-07-01

    The possibility of using existing hardware makes pinhole single-photon emission computed tomography (SPECT) attractive when pursuing the ultra-high resolution required for small animal brain imaging. Unfortunately, the poor sensitivity and the heavy weight of the collimator hamper the use of pinhole SPECT in animal studies by generating noisy and misaligned projections. To improve the image quality we have developed a new Bayesian reconstruction method, pinhole median root prior (PH-MRP), which prevents the excessive noise accumulation from the projections to the reconstructed image. The PH-MRP algorithm was used to reconstruct data acquired with our small animal rotating device, which was designed to reduce the rotation orbit misalignments. Phantom experiments were performed to test the device and compare the PH-MRP with the conventional Feldkamp-Davis-Kress (FDK) and pinhole ordered subsets maximum likelihood expectation maximisation (PH-OSEM) reconstruction algorithms. The feasibility of the system for small animal brain imaging was studied with Han-Wistar rats injected with 123I-epidepride and 99mTc-hydroxy methylene diphosphonate. Considering all the experiments, no shape distortions due to orbit misalignments were encountered and remarkable improvements in noise characteristics and also in overall image quality were observed when the PH-MRP was applied instead of the FDK or PH-OSEM. In addition, the proposed methods utilise existing hardware and require only a certain amount of construction and programming work, making them easy to implement. (orig.)

  7. Improving the quality of small animal brain pinhole SPECT imaging by Bayesian reconstruction

    International Nuclear Information System (INIS)

    The possibility of using existing hardware makes pinhole single-photon emission computed tomography (SPECT) attractive when pursuing the ultra-high resolution required for small animal brain imaging. Unfortunately, the poor sensitivity and the heavy weight of the collimator hamper the use of pinhole SPECT in animal studies by generating noisy and misaligned projections. To improve the image quality we have developed a new Bayesian reconstruction method, pinhole median root prior (PH-MRP), which prevents the excessive noise accumulation from the projections to the reconstructed image. The PH-MRP algorithm was used to reconstruct data acquired with our small animal rotating device, which was designed to reduce the rotation orbit misalignments. Phantom experiments were performed to test the device and compare the PH-MRP with the conventional Feldkamp-Davis-Kress (FDK) and pinhole ordered subsets maximum likelihood expectation maximisation (PH-OSEM) reconstruction algorithms. The feasibility of the system for small animal brain imaging was studied with Han-Wistar rats injected with 123I-epidepride and 99mTc-hydroxy methylene diphosphonate. Considering all the experiments, no shape distortions due to orbit misalignments were encountered and remarkable improvements in noise characteristics and also in overall image quality were observed when the PH-MRP was applied instead of the FDK or PH-OSEM. In addition, the proposed methods utilise existing hardware and require only a certain amount of construction and programming work, making them easy to implement. (orig.)

  8. Improving the quality of small animal brain pinhole SPECT imaging by Bayesian reconstruction.

    Science.gov (United States)

    Sohlberg, Antti; Lensu, Sanna; Jolkkonen, Jukka; Tuomisto, Leena; Ruotsalainen, Ulla; Kuikka, Jyrki T

    2004-07-01

    The possibility of using existing hardware makes pinhole single-photon emission computed tomography (SPECT) attractive when pursuing the ultra-high resolution required for small animal brain imaging. Unfortunately, the poor sensitivity and the heavy weight of the collimator hamper the use of pinhole SPECT in animal studies by generating noisy and misaligned projections. To improve the image quality we have developed a new Bayesian reconstruction method, pinhole median root prior (PH-MRP), which prevents the excessive noise accumulation from the projections to the reconstructed image. The PH-MRP algorithm was used to reconstruct data acquired with our small animal rotating device, which was designed to reduce the rotation orbit misalignments. Phantom experiments were performed to test the device and compare the PH-MRP with the conventional Feldkamp-Davis-Kress (FDK) and pinhole ordered subsets maximum likelihood expectation maximisation (PH-OSEM) reconstruction algorithms. The feasibility of the system for small animal brain imaging was studied with Han-Wistar rats injected with (123)I-epidepride and (99m)Tc-hydroxy methylene diphosphonate. Considering all the experiments, no shape distortions due to orbit misalignments were encountered and remarkable improvements in noise characteristics and also in overall image quality were observed when the PH-MRP was applied instead of the FDK or PH-OSEM. In addition, the proposed methods utilise existing hardware and require only a certain amount of construction and programming work, making them easy to implement. PMID:14991246
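
    The median root prior used in the three records above has a simple one-step-late form: divide the ML-EM update by a factor that pulls each pixel towards the median of its neighbourhood. A toy Python sketch of that idea (not the PH-MRP implementation; the pinhole geometry is omitted and the system matrix below is random):

        import numpy as np
        from scipy.ndimage import median_filter

        def mrp_em(A, y, shape, n_iter=50, beta=0.3):
            """One-step-late ML-EM with a median root prior (MRP).
            A: (n_bins, n_pixels) system matrix, y: measured counts."""
            x = np.ones(A.shape[1])
            sens = A.sum(axis=0)                       # sensitivity image
            for _ in range(n_iter):
                med = median_filter(x.reshape(shape), size=3).ravel()
                proj = A @ x
                ratio = np.where(proj > 0, y / proj, 0.0)
                x_em = x * (A.T @ ratio) / np.maximum(sens, 1e-12)   # ML-EM step
                # MRP penalty: pull each pixel towards its local median
                x = x_em / (1.0 + beta * (x - med) / np.maximum(med, 1e-12))
            return x

        # Toy usage: random projector and Poisson data from a flat 16x16 phantom.
        rng = np.random.default_rng(0)
        A = rng.random((300, 256))
        y = rng.poisson(A @ np.ones(256))
        x_hat = mrp_em(A, y, shape=(16, 16))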

  9. An Improved Approximate-Bayesian Model-choice Method for Estimating Shared Evolutionary History

    OpenAIRE

    Oaks, Jamie R.

    2014-01-01

    Background: To understand biological diversification, it is important to account for large-scale processes that affect the evolutionary history of groups of co-distributed populations of organisms. Such events predict temporally clustered divergence times, a pattern that can be estimated using genetic data from co-distributed species. I introduce a new approximate-Bayesian method for comparative phylogeographical model-choice that estimates the temporal distribution of divergences across taxa...

  10. Inferring population history with DIYABC: a user-friendly approach to Approximate Bayesian Computation

    OpenAIRE

    Cornuet, Jean-Marie; Santos, Filipe; Beaumont, Mark A; Robert, Christian P.; Marin, Jean-Michel; Balding, David J.; Guillemaud, Thomas; Estoup, Arnaud

    2008-01-01

    Summary: Genetic data obtained on population samples convey information about their evolutionary history. Inference methods can extract part of this information but they require sophisticated statistical techniques that have been made available to the biologist community (through computer programs) only for simple and standard situations typically involving a small number of samples. We propose here a computer program (DIY ABC) for inference based on approximate Bayesian computation (ABC), in...
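
    The rejection flavour of ABC that underlies programs of this kind can be stated in a few lines: draw parameters from the prior, simulate data, and keep the draws whose summary statistics land close to the observed ones. A minimal Python sketch with an invented toy model (not DIYABC itself):

        import numpy as np

        def abc_rejection(observed_stat, simulate, prior_sample,
                          n_draws=100_000, tolerance=0.05):
            """Rejection ABC: accept prior draws whose simulated summary
            statistic lies within `tolerance` of the observed one."""
            accepted = []
            for _ in range(n_draws):
                theta = prior_sample()
                if abs(simulate(theta) - observed_stat) < tolerance:
                    accepted.append(theta)
            return np.array(accepted)     # sample from the approximate posterior

        # Toy population-genetics flavour: infer a mutation parameter theta from
        # the mean number of pairwise differences, crudely simulated as Poisson.
        rng = np.random.default_rng(1)
        posterior = abc_rejection(
            observed_stat=4.2,
            simulate=lambda th: rng.poisson(th, size=50).mean(),
            prior_sample=lambda: rng.uniform(0.0, 20.0),
        )
        print("posterior mean:", posterior.mean())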

  11. Contaminant source reconstruction by empirical Bayes and Akaike's Bayesian Information Criterion

    Science.gov (United States)

    Zanini, Andrea; Woodbury, Allan D.

    2016-02-01

    The objective of the paper is to present an empirical Bayesian method combined with Akaike's Bayesian Information Criterion (ABIC) to estimate the contaminant release history of a source in groundwater, starting from a few concentration measurements in space and/or in time. From the Bayesian point of view, the ABIC considers prior information on the unknown function, such as the prior distribution (assumed Gaussian) and the covariance function. The unknown statistical quantities, such as the noise variance and the covariance function parameters, are computed through the process; moreover, the method also quantifies the estimation error through confidence intervals. The methodology was successfully tested on three test cases: the classic Skaggs and Kabala release function, three sharp releases (both these cases concern transport in a one-dimensional homogeneous medium), and data collected from laboratory equipment that consists of a two-dimensional homogeneous unconfined aquifer. The performance of the method was tested with two different covariance functions (Gaussian and exponential) and also with large measurement error. The obtained results were discussed and compared to the geostatistical approach of Kitanidis (1995).
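
    In the linear-Gaussian setting described here, the marginal likelihood of the data given the hyperparameters is available in closed form, and ABIC amounts to minimizing -2 log(marginal likelihood) + 2 x (number of hyperparameters). A schematic Python sketch under those assumptions (the transfer matrix, grids and covariance choices are illustrative, not the paper's test cases):

        import numpy as np

        def neg2_log_marginal(y, A, sigma2, prior_cov):
            """-2 log p(y | hyperparameters) for y = A s + noise, with
            s ~ N(0, prior_cov) and noise ~ N(0, sigma2 I), so that
            marginally y ~ N(0, A prior_cov A^T + sigma2 I)."""
            C = A @ prior_cov @ A.T + sigma2 * np.eye(len(y))
            _, logdet = np.linalg.slogdet(C)
            return logdet + y @ np.linalg.solve(C, y) + len(y) * np.log(2 * np.pi)

        def abic_select(y, A, t_grid, sigma2_grid, ell_grid):
            """Grid search over (noise variance, correlation length);
            ABIC = -2 log ML + 2 * (number of hyperparameters)."""
            best = None
            for sigma2 in sigma2_grid:
                for ell in ell_grid:
                    # Gaussian covariance between release times, geostatistics-style
                    K = np.exp(-((t_grid[:, None] - t_grid[None, :]) / ell) ** 2)
                    abic = neg2_log_marginal(y, A, sigma2, K) + 2 * 2
                    if best is None or abic < best[0]:
                        best = (abic, sigma2, ell)
            return best

        # Toy usage: a 1D release history observed through a smoothing kernel.
        rng = np.random.default_rng(0)
        t = np.linspace(0, 1, 30)
        A = np.exp(-((t[:, None] - t[None, :]) / 0.1) ** 2)
        y = A @ np.exp(-((t - 0.4) / 0.1) ** 2) + 0.05 * rng.standard_normal(30)
        print(abic_select(y, A, t, sigma2_grid=[1e-3, 1e-2, 1e-1],
                          ell_grid=[0.05, 0.1, 0.2]))

    With the selected hyperparameters, the posterior mean of the release history then follows from the usual Gaussian conditioning.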

  12. Bayesian reconstruction of the velocity distribution of weakly interacting massive particles from direct dark matter detection data

    International Nuclear Information System (INIS)

    In this paper, we extend our earlier work on the reconstruction of the (time-averaged) one-dimensional velocity distribution of Galactic Weakly Interacting Massive Particles (WIMPs) and introduce a Bayesian fitting procedure for the theoretically predicted velocity distribution functions. In this reconstruction process, the (rough) velocity distribution reconstructed by directly using raw data from direct Dark Matter detection experiments, i.e. measured recoil energies, with one or more different target materials, has been used as "reconstructed-input" information. By assuming a fitting velocity distribution function and scanning the parameter space based on the Bayesian analysis, the astronomical characteristic parameters, e.g. the Solar and Earth's Galactic velocities, can be pinned down as the output results

  13. Bayesian three-dimensional reconstruction of toothed whale trajectories: passive acoustics assisted with visual and tagging measurements.

    Science.gov (United States)

    Laplanche, Christophe

    2012-11-01

    The author describes and evaluates a Bayesian method to reconstruct three-dimensional toothed whale trajectories from a series of echolocation signals. Localization by using passive acoustic data (time of arrival of source signals at receptors) is assisted by using visual data (coordinates of the whale when diving and resurfacing) and tag information (movement statistics). The efficiency of the Bayesian method is compared to the standard minimum mean squared error statistical approach by comparing the reconstruction results of 48 simulated sperm whale (Physeter macrocephalus) trajectories. The use of the advanced Bayesian method reduces bias (standard deviation) with respect to the standard method up to a factor of 8.9 (13.6). The author provides open-source software which is functional with acoustic data which would be collected in the field from any three-dimensional receptor array design. This approach renews passive acoustics as a valuable tool to study the underwater behavior of toothed whales. PMID:23145606

  14. Modified gravity and its reconstruction from the universe expansion history

    CERN Document Server

    Nojiri, S; Nojiri, Shin'ichi; Odintsov, Sergei D.

    2006-01-01

    We develop the reconstruction program for a number of modified gravities: scalar-tensor theory, $f(R)$, $F(G)$ and string-inspired, scalar-Gauss-Bonnet gravity. The known (classical) universe expansion history is used for the explicit and successful reconstruction of some versions (of special form or with specific potentials) of all the above modified gravities. It is demonstrated that the cosmological sequence of matter dominance, deceleration-acceleration transition and acceleration era may always emerge as cosmological solutions of such theory. Moreover, the late-time dark energy FRW universe may have the approximate or exact $\Lambda$CDM form consistent with three-year WMAP data. The principal possibility to extend this reconstruction scheme to include the radiation dominated era and inflation is briefly mentioned. Finally, it is indicated how even modified gravity which does not describe the matter-dominated epoch may have such a solution before the acceleration era at the price of the introduction of compensa...

  15. Evolutionary History Reconstruction for Mammalian Complex Gene Clusters

    OpenAIRE

    Zhang, Yu; Song, Giltae; Vinař, Tomáš; Green, Eric D; Siepel, Adam; Miller, Webb

    2009-01-01

    Clusters of genes that evolved from single progenitors via repeated segmental duplications present significant challenges to the generation of a truly complete human genome sequence. Such clusters can confound both accurate sequence assembly and downstream computational analysis, yet they represent a hotbed of functional innovation, making them of extreme interest. We have developed an algorithm for reconstructing the evolutionary history of gene clusters using only human genomic sequence data...

  16. Reconstructing the modular recombination history of Staphylococcus aureus phages

    OpenAIRE

    Swenson, Krister M; Guertin, Paul; Deschênes, Hugo; Bergeron, Anne

    2013-01-01

    Background: Viruses that infect bacteria, called phages, are well-known for their extreme mosaicism, in which an individual genome shares many different parts with many others. The mechanisms for creating these mosaics are largely unknown but are believed to be recombinations, either illegitimate, or partly homologous. In order to reconstruct the history of these recombinations, we need to identify the positions where recombinations may have occurred, and develop algorithms to generate and exp...

  17. Reconstructing the history of dark energy using maximum entropy

    Science.gov (United States)

    Zunckel, Caroline; Trotta, Roberto

    2007-09-01

    We present a Bayesian technique based on a maximum-entropy method to reconstruct the dark energy equation of state (EOS) w(z) in a non-parametric way. This Maximum Entropy (MaxEnt) technique allows us to incorporate relevant prior information while adjusting the degree of smoothing of the reconstruction in response to the structure present in the data. After demonstrating the method on synthetic data, we apply it to current cosmological data, separately analysing Type Ia supernova measurements from the HST/GOODS programme and the first-year Supernovae Legacy Survey (SNLS), complemented by cosmic microwave background and baryonic acoustic oscillation data. We find that the SNLS data are compatible with w(z) = -1 at all redshifts 0 ≤ z ≲ 1100, with error bars of order 20 per cent for the most constraining choice of priors and model. The HST/GOODS data exhibit a slight (about 1σ significance) preference for w > -1 at z ~ 0.5 and a drift towards w > -1 at larger redshifts which, however, is not robust with respect to changes in our prior specifications. We employ both a constant EOS prior model and a slowly varying w(z) and find that our conclusions are only mildly dependent on this choice at high redshifts. Our method highlights the danger of employing parametric fits for the unknown EOS, that can potentially miss or underestimate real structure in the data.

  18. Reconstructing the history of dark energy using maximum entropy

    CERN Document Server

    Zunckel, C

    2007-01-01

    We present a Bayesian technique based on a maximum entropy method to reconstruct the dark energy equation of state $w(z)$ in a non-parametric way. This MaxEnt technique allows us to incorporate relevant prior information while adjusting the degree of smoothing of the reconstruction in response to the structure present in the data. After demonstrating the method on synthetic data, we apply it to current cosmological data, separately analysing type Ia supernova measurements from the HST/GOODS program and the first year Supernovae Legacy Survey (SNLS), complemented by cosmic microwave background and baryonic acoustic oscillations data. We find that the SNLS data are compatible with $w(z) = -1$ at all redshifts $0 \leq z \lesssim 1100$, with error bars of order 20% for the most constraining choice of priors and model. The HST/GOODS data exhibit a slight (about $1\sigma$ significance) preference for $w>-1$ at $z\sim 0.5$ and a drift towards $w>-1$ at larger redshifts, which however is not robust with respect to changes ...

  19. Bayesian 3D X-ray computed tomography image reconstruction with a scaled Gaussian mixture prior model

    International Nuclear Information System (INIS)

    In order to improve the quality of 3D X-ray tomography reconstruction for Non Destructive Testing (NDT), we investigate in this paper hierarchical Bayesian methods. In NDT, useful prior information on the volume, like the limited number of materials or the presence of homogeneous areas, can be included in the iterative reconstruction algorithms. In hierarchical Bayesian methods, not only the volume is estimated thanks to the prior model of the volume, but also the hyperparameters of this prior. This additional complexity in the reconstruction methods, when applied to large volumes (from 512³ to 8192³ voxels), results in an increasing computational cost. To reduce it, the hierarchical Bayesian methods investigated in this paper lead to an algorithm acceleration by Variational Bayesian Approximation (VBA) [1] and hardware acceleration thanks to projection and back-projection operators parallelized on many-core processors like GPUs [2]. In this paper, we consider a Student-t prior on the gradient of the image, implemented in a hierarchical way [3, 4, 1]. Operators H (forward or projection) and Ht (adjoint or back-projection) implemented on multi-GPU [2] have been used in this study. Different methods are evaluated on the synthetic 'Shepp and Logan' volume in terms of quality and time of reconstruction. We used several simple regularizations of order 1 and order 2. Other prior models also exist [5]. Sometimes, for a discrete image, segmentation and reconstruction can be done at the same time; the reconstruction can then be done with fewer projections
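
    A Student-t prior on image gradients, as used above, can be written hierarchically as a Gaussian whose per-pixel precision is a hidden Gamma variable; alternating updates then reduce to reweighted quadratic problems. A small 1D denoising sketch of that construction (illustrative only; the paper's VBA machinery and GPU projectors are far beyond this scope):

        import numpy as np

        def student_t_denoise(y, n_iter=30, nu=3.0, lam=5.0):
            """MAP denoising with a Student-t prior on first differences, solved
            by iteratively reweighted least squares (IRLS). The t prior is a
            Gaussian with a hidden Gamma-distributed precision per difference."""
            n = len(y)
            D = np.eye(n) - np.eye(n, k=-1)        # finite-difference operator
            D = D[1:]                              # drop the boundary row
            x = y.copy()
            for _ in range(n_iter):
                g = D @ x
                # update of hidden precisions: heavier tails than a Gaussian
                w = (nu + 1.0) / (nu + g ** 2)
                # weighted quadratic problem: (I + lam D^T W D) x = y
                x = np.linalg.solve(np.eye(n) + lam * D.T @ (w[:, None] * D), y)
            return x

        # Toy usage: denoise a noisy step; the edge survives thanks to heavy tails.
        rng = np.random.default_rng(2)
        signal = np.concatenate([np.zeros(50), np.ones(50)])
        x_hat = student_t_denoise(signal + 0.1 * rng.standard_normal(100))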

  20. Application of Bayesian Neural Networks to Energy Reconstruction in EAS Experiments for ground-based TeV Astrophysics

    CERN Document Server

    Bai, Ying; Lan, JieQin; Gao, WeiWei

    2016-01-01

    A toy detector array has been designed to simulate the detection of cosmic rays in Extensive Air Shower (EAS) experiments for ground-based TeV astrophysics. The primary energies of protons from the Monte-Carlo simulation have been reconstructed by the algorithm of Bayesian neural networks (BNNs) and a standard method like the LHAASO experiment, respectively. The result of the energy reconstruction using BNNs has been compared with the one using the standard method. Compared to the standard method, the energy resolutions are significantly improved using BNNs, and the improvement is more pronounced for high-energy protons than for low-energy ones.

  1. Benchmarking the Bayesian reconstruction of the non-perturbative heavy $Q\\bar{Q}$ potential

    CERN Document Server

    Burnier, Yannis

    2013-01-01

    The extraction of the finite temperature heavy quark potential from lattice QCD relies on a spectral analysis of the real-time Wilson loop. Through its position and shape, the lowest lying spectral peak encodes the real and imaginary part of this complex potential. We benchmark this extraction strategy using leading order hard-thermal loop (HTL) calculations. That is, we analytically calculate the Wilson loop and determine the corresponding spectrum. By fitting its lowest lying peak we obtain the real and imaginary part and confirm that the knowledge of the lowest peak alone is sufficient for obtaining the potential. We apply a novel Bayesian approach for the reconstruction of spectral functions to HTL correlators in Euclidean time and observe how well the known spectral function and values for the real and imaginary part are reproduced. Finally we apply the method to quenched lattice QCD data and perform an improved estimate of both real and imaginary part of the non-perturbative heavy $Q\bar{Q}$ potential.

  2. A Bayesian approach to PET reconstruction using image-modeling Gibbs priors: Implementation and comparison

    International Nuclear Information System (INIS)

    We demonstrate that (i) classical methods of image reconstruction from projections can be improved upon by considering the output of such a method as a distorted version of the original image and applying a Bayesian approach to estimate from it the original image (based on a model of distortion and on a Gibbs distribution as the prior) and (ii) by selecting an "image-modeling" prior distribution (i.e., one which is such that it is likely that a random sample from it shares important characteristics of the images of the application area) one can improve over another Gibbs prior formulated using only pairwise interactions. We illustrate our approach using simulated Positron Emission Tomography (PET) data from realistic brain phantoms. Since algorithm performance ultimately depends on the diagnostic task being performed, we examine a number of different medically relevant figures of merit to give a fair comparison. Based on a training-and-testing evaluation strategy, we demonstrate that statistically significant improvements can be obtained using the proposed approach

  3. A Bayesian approach to PET reconstruction using image-modeling Gibbs priors: Implementation and comparison

    Energy Technology Data Exchange (ETDEWEB)

    Chan, M.T. [Univ. of Southern California, Los Angeles, CA (United States); Herman, G.T. [Univ. of Pennsylvania, Philadelphia, PA (United States); Levitan, E. [Technion, Haifa (Israel)

    1996-12-31

    We demonstrate that (i) classical methods of image reconstruction from projections can be improved upon by considering the output of such a method as a distorted version of the original image and applying a Bayesian approach to estimate from it the original image (based on a model of distortion and on a Gibbs distribution as the prior) and (ii) by selecting an "image-modeling" prior distribution (i.e., one which is such that it is likely that a random sample from it shares important characteristics of the images of the application area) one can improve over another Gibbs prior formulated using only pairwise interactions. We illustrate our approach using simulated Positron Emission Tomography (PET) data from realistic brain phantoms. Since algorithm performance ultimately depends on the diagnostic task being performed, we examine a number of different medically relevant figures of merit to give a fair comparison. Based on a training-and-testing evaluation strategy, we demonstrate that statistically significant improvements can be obtained using the proposed approach.

  4. Gene regulatory network reconstruction by Bayesian integration of prior knowledge and/or different experimental conditions.

    Science.gov (United States)

    Werhli, Adriano V; Husmeier, Dirk

    2008-06-01

    There have been various attempts to improve the reconstruction of gene regulatory networks from microarray data by the systematic integration of biological prior knowledge. Our approach is based on pioneering work by Imoto et al. where the prior knowledge is expressed in terms of energy functions, from which a prior distribution over network structures is obtained in the form of a Gibbs distribution. The hyperparameters of this distribution represent the weights associated with the prior knowledge relative to the data. We have derived and tested a Markov chain Monte Carlo (MCMC) scheme for sampling networks and hyperparameters simultaneously from the posterior distribution, thereby automatically learning how to trade off information from the prior knowledge and the data. We have extended this approach to a Bayesian coupling scheme for learning gene regulatory networks from a combination of related data sets, which were obtained under different experimental conditions and are therefore potentially associated with different active subpathways. The proposed coupling scheme is a compromise between (1) learning networks from the different subsets separately, whereby no information between the different experiments is shared; and (2) learning networks from a monolithic fusion of the individual data sets, which does not provide any mechanism for uncovering differences between the network structures associated with the different experimental conditions. We have assessed the viability of all proposed methods on data related to the Raf signaling pathway, generated both synthetically and in cytometry experiments. PMID:18574862
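
    The structure-MCMC idea at the core of this approach can be sketched compactly: propose single-edge changes and accept them with a Metropolis-Hastings ratio that combines the data score with the Gibbs-distribution prior over structures. A toy Python version with stand-in score and energy functions (the BGe/BDe scores, acyclicity checks and hyperparameter moves of the real method are omitted):

        import numpy as np

        def structure_mcmc(score, n_genes, prior_energy, beta=1.0,
                           n_steps=10_000, seed=0):
            """Metropolis-Hastings over directed graphs (adjacency matrices).
            score(G): log marginal likelihood of the data given structure G.
            prior_energy(G): energy from prior knowledge; prior = exp(-beta * E).
            Note: a real implementation also samples beta and rejects cycles."""
            rng = np.random.default_rng(seed)
            G = np.zeros((n_genes, n_genes), dtype=int)
            log_post = score(G) - beta * prior_energy(G)
            samples = []
            for _ in range(n_steps):
                i, j = rng.integers(n_genes, size=2)
                if i == j:
                    continue
                G_new = G.copy()
                G_new[i, j] ^= 1                  # flip one edge
                log_post_new = score(G_new) - beta * prior_energy(G_new)
                if np.log(rng.random()) < log_post_new - log_post:
                    G, log_post = G_new, log_post_new
                samples.append(G.copy())
            return samples                        # posterior sample of structures

        # Toy usage: a stand-in score favouring sparse graphs, with an energy
        # penalising edges absent from a (hypothetical) prior-knowledge matrix B.
        B = np.zeros((4, 4))
        samples = structure_mcmc(score=lambda G: -G.sum(), n_genes=4,
                                 prior_energy=lambda G: np.abs(G - B).sum())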

  5. A Bayesian fusion model for space-time reconstruction of finely resolved velocities in turbulent flows from low resolution measurements

    CERN Document Server

    Van Nguyen, Linh; Chainais, Pierre

    2015-01-01

    The study of turbulent flows calls for measurements with high resolution both in space and in time. We propose a new approach to reconstruct High-Temporal-High-Spatial resolution velocity fields by combining two sources of information that are well-resolved either in space or in time, the Low-Temporal-High-Spatial (LTHS) and the High-Temporal-Low-Spatial (HTLS) resolution measurements. In the framework of co-conception between sensing and data post-processing, this work extensively investigates a Bayesian reconstruction approach using a simulated database. A Bayesian fusion model is developed to solve the inverse problem of data reconstruction. The model uses a Maximum A Posteriori estimate, which yields the most probable field knowing the measurements. The DNS of a wall-bounded turbulent flow at moderate Reynolds number is used to validate and assess the performances of the present approach. Low resolution measurements are subsampled in time and space from the fully resolved data. Reconstructed velocities ar...

  6. Reconstructing the invasion history of Heracleum persicum (Apiaceae) into Europe

    Czech Academy of Sciences Publication Activity Database

    Rijal, D. P.; Alm, T.; Jahodová, Šárka; Stenoien, H. K.; Alsos, I. G.

    2015-01-01

    Vol. 24, No. 22 (2015), pp. 5522-5543. ISSN 0962-1083 Institutional support: RVO:67985939 Keywords: approximate Bayesian computation * genetic variation * population genetics Subject RIV: EH - Ecology, Behaviour Impact factor: 6.494, year: 2014

  7. Reconstructing the Population Genetic History of the Caribbean

    Science.gov (United States)

    Moreno-Estrada, Andrés; Gravel, Simon; Zakharia, Fouad; McCauley, Jacob L.; Byrnes, Jake K.; Gignoux, Christopher R.; Ortiz-Tello, Patricia A.; Martínez, Ricardo J.; Hedges, Dale J.; Morris, Richard W.; Eng, Celeste; Sandoval, Karla; Acevedo-Acevedo, Suehelay; Norman, Paul J.; Layrisse, Zulay; Parham, Peter; Martínez-Cruzado, Juan Carlos; Burchard, Esteban González; Cuccaro, Michael L.; Martin, Eden R.; Bustamante, Carlos D.

    2013-01-01

    The Caribbean basin is home to some of the most complex interactions in recent history among previously diverged human populations. Here, we investigate the population genetic history of this region by characterizing patterns of genome-wide variation among 330 individuals from three of the Greater Antilles (Cuba, Puerto Rico, Hispaniola), two mainland (Honduras, Colombia), and three Native South American (Yukpa, Bari, and Warao) populations. We combine these data with a unique database of genomic variation in over 3,000 individuals from diverse European, African, and Native American populations. We use local ancestry inference and tract length distributions to test different demographic scenarios for the pre- and post-colonial history of the region. We develop a novel ancestry-specific PCA (ASPCA) method to reconstruct the sub-continental origin of Native American, European, and African haplotypes from admixed genomes. We find that the most likely source of the indigenous ancestry in Caribbean islanders is a Native South American component shared among inland Amazonian tribes, Central America, and the Yucatan peninsula, suggesting extensive gene flow across the Caribbean in pre-Columbian times. We find evidence of two pulses of African migration. The first pulse—which today is reflected by shorter, older ancestry tracts—consists of a genetic component more similar to coastal West African regions involved in early stages of the trans-Atlantic slave trade. The second pulse—reflected by longer, younger tracts—is more similar to present-day West-Central African populations, supporting historical records of later transatlantic deportation. Surprisingly, we also identify a Latino-specific European component that has significantly diverged from its parental Iberian source populations, presumably as a result of small European founder population size. We demonstrate that the ancestral components in admixed genomes can be traced back to distinct sub-continental source populations.

  8. A full-spectral Bayesian reconstruction approach based on the material decomposition model applied in dual-energy computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Cai, C. [CEA, LIST, 91191 Gif-sur-Yvette, France and CNRS, SUPELEC, UNIV PARIS SUD, L2S, 3 rue Joliot-Curie, 91192 Gif-sur-Yvette (France); Rodet, T.; Mohammad-Djafari, A. [CNRS, SUPELEC, UNIV PARIS SUD, L2S, 3 rue Joliot-Curie, 91192 Gif-sur-Yvette (France); Legoupil, S. [CEA, LIST, 91191 Gif-sur-Yvette (France)

    2013-11-15

    Purpose: Dual-energy computed tomography (DECT) makes it possible to get two fractions of basis materials without segmentation. One is the soft-tissue equivalent water fraction and the other is the hard-matter equivalent bone fraction. Practical DECT measurements are usually obtained with polychromatic x-ray beams. Existing reconstruction approaches based on linear forward models without counting the beam polychromaticity fail to estimate the correct decomposition fractions and result in beam-hardening artifacts (BHA). The existing BHA correction approaches either need to refer to calibration measurements or suffer from the noise amplification caused by the negative-log preprocessing and the ill-conditioned water and bone separation problem. To overcome these problems, statistical DECT reconstruction approaches based on nonlinear forward models counting the beam polychromaticity show great potential for giving accurate fraction images. Methods: This work proposes a full-spectral Bayesian reconstruction approach which allows the reconstruction of high quality fraction images from ordinary polychromatic measurements. This approach is based on a Gaussian noise model with unknown variance assigned directly to the projections without taking negative-log. Referring to Bayesian inferences, the decomposition fractions and observation variance are estimated by using the joint maximum a posteriori (MAP) estimation method. Subject to an adaptive prior model assigned to the variance, the joint estimation problem is then simplified into a single estimation problem. It transforms the joint MAP estimation problem into a minimization problem with a nonquadratic cost function. To solve it, the use of a monotone conjugate gradient algorithm with suboptimal descent steps is proposed. Results: The performance of the proposed approach is analyzed with both simulated and experimental data. The results show that the proposed Bayesian approach is robust to noise and materials. It is also...
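
    Once the variance is marginalized out under its adaptive prior, the joint MAP reduces to minimizing a nonquadratic cost in the fractions alone, of the broad form J(x) = (M/2) log ||y - h(x)||² + R(x). A schematic Python minimizer for a cost of that shape (the forward model h and prior below are invented stand-ins, and plain gradient descent replaces the paper's monotone CG):

        import numpy as np

        def cost_and_grad(x, y, h, h_jac, reg_grad, lam):
            """J(x) = (M/2) log ||y - h(x)||^2 + lam * R(x): the log-residual term
            is what remains after marginalizing an unknown Gaussian noise variance."""
            r = y - h(x)
            M = len(y)
            J = 0.5 * M * np.log(r @ r)
            g = -(M / (r @ r)) * (h_jac(x).T @ r) + lam * reg_grad(x)
            return J, g

        def descend(x0, y, h, h_jac, reg_grad, lam=1e-2, step=1e-3, n_iter=500):
            """Plain gradient descent as a stand-in for the monotone CG."""
            x = x0.copy()
            for _ in range(n_iter):
                _, g = cost_and_grad(x, y, h, h_jac, reg_grad, lam)
                x = x - step * g
            return x

        # Toy exponential-attenuation forward model for a single energy bin:
        A = np.random.default_rng(3).random((40, 10))
        h = lambda x: np.exp(-A @ x)                 # Beer-Lambert-like nonlinearity
        h_jac = lambda x: -A * np.exp(-A @ x)[:, None]
        reg_grad = lambda x: 2 * x                   # gradient of a quadratic prior
        y = h(np.full(10, 0.5))
        x_hat = descend(np.zeros(10), y, h, h_jac, reg_grad)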

  9. Application of Bayesian neural networks to energy reconstruction in EAS experiments for ground-based TeV astrophysics

    Science.gov (United States)

    Bai, Y.; Xu, Y.; Pan, J.; Lan, J. Q.; Gao, W. W.

    2016-07-01

    A toy detector array is designed to detect a shower generated by the interaction between a TeV cosmic ray and the atmosphere. In the present paper, the primary energies of showers detected by the detector array are reconstructed with the algorithm of Bayesian neural networks (BNNs) and a standard method like the LHAASO experiment [1], respectively. Compared to the standard method, the energy resolutions are significantly improved using the BNNs. And the improvement is more obvious for the high energy showers than the low energy ones.

  10. Typical reconstruction performance for distributed compressed sensing based on ℓ2,1-norm regularized least square and Bayesian optimal reconstruction: influences of noise

    Science.gov (United States)

    Shiraki, Yoshifumi; Kabashima, Yoshiyuki

    2016-06-01

    A signal model called joint sparse model 2 (JSM-2) or the multiple measurement vector problem, in which all sparse signals share their support, is important for dealing with practical signal processing problems. In this paper, we investigate the typical reconstruction performance of noisy measurement JSM-2 problems for ℓ2,1-norm regularized least square reconstruction and the Bayesian optimal reconstruction scheme in terms of mean square error. Employing the replica method, we show that these schemes, which exploit the knowledge of the sharing of the signal support, can recover the signals more precisely as the number of channels increases. In addition, we compare the reconstruction performance of two different ensembles of observation matrices: one is composed of independent and identically distributed random Gaussian entries and the other is designed so that row vectors are orthogonal to one another. As reported for the single-channel case in earlier studies, our analysis indicates that the latter ensemble offers better performance than the former ones for the noisy JSM-2 problem. The results of numerical experiments with a computationally feasible approximation algorithm we developed for this study agree with the theoretical estimation.
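
    The ℓ2,1-regularized least-square problem for JSM-2 is commonly solved by proximal gradient steps, in which the proximal operator of the ℓ2,1 norm shrinks each row of the signal matrix jointly across channels. A minimal ISTA-style Python sketch (shapes and data are illustrative):

        import numpy as np

        def prox_l21(X, tau):
            """Row-wise group soft-thresholding: prox of tau * sum_i ||X[i, :]||_2."""
            norms = np.linalg.norm(X, axis=1, keepdims=True)
            scale = np.maximum(1.0 - tau / np.maximum(norms, 1e-12), 0.0)
            return scale * X

        def ista_l21(Y, A, lam=0.1, n_iter=300):
            """min_X 0.5 ||Y - A X||_F^2 + lam ||X||_{2,1} for JSM-2: the columns
            of X are the per-channel signals, sharing one row support."""
            L = np.linalg.norm(A, 2) ** 2            # Lipschitz constant of gradient
            X = np.zeros((A.shape[1], Y.shape[1]))
            for _ in range(n_iter):
                grad = A.T @ (A @ X - Y)
                X = prox_l21(X - grad / L, lam / L)
            return X

        # Toy JSM-2 instance: 3 channels sharing a 5-row support out of 100.
        rng = np.random.default_rng(4)
        A = rng.standard_normal((40, 100)) / np.sqrt(40)
        X_true = np.zeros((100, 3))
        X_true[rng.choice(100, 5, replace=False)] = rng.standard_normal((5, 3))
        Y = A @ X_true + 0.01 * rng.standard_normal((40, 3))
        X_hat = ista_l21(Y, A)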

  11. Avoiding spurious feedback loops in the reconstruction of gene regulatory networks with dynamic bayesian networks

    OpenAIRE

    Grzegorczyk, M.; Husmeier, D.

    2009-01-01

    Feedback loops and recurrent structures are essential to the regulation and stable control of complex biological systems. The application of dynamic as opposed to static Bayesian networks is promising in that, in principle, these feedback loops can be learned. However, we show that the widely applied BGe score is susceptible to learning spurious feedback loops, which are a consequence of non-linear regulation and autocorrelation in the data. We propose a non-linear generalisation of the BGe m...

  12. Reconstruction of large-scale gene regulatory networks using Bayesian model averaging.

    Science.gov (United States)

    Kim, Haseong; Gelenbe, Erol

    2012-09-01

    Gene regulatory networks provide the systematic view of molecular interactions in a complex living system. However, constructing large-scale gene regulatory networks is one of the most challenging problems in systems biology. Moreover, large sets of biological data require a proper integration technique for reliable gene regulatory network construction. Here we present a new reverse engineering approach based on Bayesian model averaging which attempts to combine all the appropriate models describing interactions among genes. This Bayesian approach with a prior based on the Gibbs distribution provides an efficient means to integrate multiple sources of biological data. In a simulation study with a maximum of 2000 genes, our method shows better sensitivity than previous elastic-net and Gaussian graphical models, with a fixed specificity of 0.99. The study also shows that the proposed method outperforms the other standard methods for a DREAM dataset generated by nonlinear stochastic models. In brain tumor data analysis, three large-scale networks consisting of 4422 genes were built using the gene expression of non-tumor, low and high grade tumor mRNA expression samples, along with DNA-protein binding affinity information. We found that genes having a large variation of degree distribution among the three tumor networks are the ones that are most involved in regulatory and developmental processes, which possibly gives a novel insight concerning conventional differentially expressed gene analysis. PMID:22987132

  13. Accurate Reconstruction of Insertion-Deletion Histories by Statistical Phylogenetics

    OpenAIRE

    Westesson, O; Lunter, G.; Paten, B; Holmes, I

    2012-01-01

    The Multiple Sequence Alignment (MSA) is a computational abstraction that represents a partial summary either of indel history, or of structural similarity. Taking the former view (indel history), it is possible to use formal automata theory to generalize the phylogenetic likelihood framework for finite substitution models (Dayhoff's probability matrices and Felsenstein's pruning algorithm) to arbitrary-length sequences. In this paper, we report results of a simulation-based benchmark of seve...

  14. Hierarchical Bayesian Model for Simultaneous EEG Source and Forward Model Reconstruction (SOFOMORE)

    DEFF Research Database (Denmark)

    Stahlhut, Carsten; Mørup, Morten; Winther, Ole; Hansen, Lars Kai

    In this paper we propose an approach to handle forward model uncertainty for EEG source reconstruction. A stochastic forward model is motivated by the many uncertain contributions that form the forward propagation model including the tissue conductivity distribution, the cortical surface, and...

  15. Reconstructing a School's Past Using Oral Histories and GIS Mapping.

    Science.gov (United States)

    Alibrandi, Marsha; Beal, Candy; Thompson, Ann; Wilson, Anna

    2000-01-01

    Describes an interdisciplinary project that incorporated language arts, social studies, instructional technology, and science where middle school students were involved in oral history, Geographic Information System (GIS) mapping, architectural research, the science of dendrochronology, and the creation of an archival school Web site. (CMK)

  16. Pollen-Climate Calibration, Characterization of Statistical Uncertainty, and Forward Modeling for Integration Into Bayesian Hierarchical Climate Reconstruction

    Science.gov (United States)

    Wahl, E. R.

    2008-12-01

    A strict process model for pollen as a climate proxy is currently not approachable beyond localized spatial scales; more generally, the canonical model for vegetation-pollen registration itself requires assimilation of empirically-derived information. In this paper, a taxonomically "reduced-space" climate-pollen forward model is developed, based on the performance of a parallel inverse model. The goal is inclusion of the forward model in a Bayesian climate reconstruction framework, following a 4-step process. (1) Ratios of pollen types calibrated to temperature are examined to determine if they can equal or surpass the skill of multi-taxonomic calibrations using the modern analog technique (MAT) optimized with receiver operating characteristic (ROC) analysis. The first phase of this examination, using modern pollen data from SW N America, demonstrates that the ratio method can give calibrations as skillful as the MAT when vegetation representation (and associated climate gradients) are characterized by two dominant pollen taxa, in this case pine and oak. Paleotemperature reconstructions using the ratio method also compare well to MAT reconstructions, showing very minor differences. [Ratio values are defined as pine/(pine + oak), so they vary between 0 and 1.] (2) Uncertainty analysis is carried out in independent steps, which are combined to give overall probabilistic confidence ranges. Monte Carlo (MC) analysis utilizing Poisson distributions to model the inherent variability of pollen representation in relation to climate (assuming defined temperature normals at the modern calibration sites) allows independent statistical estimation of this component of uncertainty, for both the modern calibration and fossil pollen data sets. In turn, MC analysis utilizing normal distributions allows independent estimation of the addition to overall uncertainty from climate variation itself. (3) Because the quality tests in (1) indicate the ratio method has the capacity to carry
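
    Step (2) of the scheme is easy to make concrete: treat the observed pine and oak counts as Poisson draws and propagate the count noise through the ratio. A minimal Python sketch of that Monte Carlo component (the counts below are invented for illustration):

        import numpy as np

        def ratio_uncertainty(pine_count, oak_count, n_draws=10_000, seed=0):
            """MC uncertainty for r = pine / (pine + oak) under Poisson count noise."""
            rng = np.random.default_rng(seed)
            pine = rng.poisson(pine_count, n_draws)
            oak = rng.poisson(oak_count, n_draws)
            r = pine / np.maximum(pine + oak, 1)
            return r.mean(), np.percentile(r, [2.5, 97.5])

        # Hypothetical sample with 180 pine and 60 oak grains counted:
        mean_r, (lo, hi) = ratio_uncertainty(180, 60)
        print(f"ratio = {mean_r:.3f}, 95% interval [{lo:.3f}, {hi:.3f}]")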

  17. Holocene fire history reconstruction using Tibetan lacustrine sediments

    Science.gov (United States)

    Callegaro, Alice; Kirchgeorg, Torben; Battistel, Dario; Bird, Broxton; Barbante, Carlo

    2016-04-01

    The important role that biomass burning plays in influencing the Holocene's climate is still under discussion. The present work gives information about past biomass burning events on the Tibetan Plateau and helps to increase the understanding of the interaction between climate, humans and fire activity during the Holocene. Asia is one of the centers of the advent of agriculture and pastoralism, and a strategic area for understanding the interaction between humans and fire during the Holocene. We reconstructed past biomass burning events and vegetation from sediments collected from Lake Paru Co, a small moraine-dammed lake located on the Tibetan Plateau at 4845 m above sea level. We extracted lake sediment samples by accelerated solvent extraction and analysed different organic molecular proxies by GC-MS and IC-MS. We used monosaccharide anhydrides, levoglucosan and its isomers, as proxies for biomass burning. These are specific molecular markers originating from the pyrolysis of cellulose; they flag significant fire events and indicate changes in burned fuel. Furthermore, we analysed polycyclic aromatic hydrocarbons (PAHs) as additional combustion proxies. For a better understanding of changes in vegetation and of human habitation at the lake shore we analysed n-alkanes and sterols. Comparing the data of this multi-proxy approach with climatic and meteorological literature data for the studied area, reconstruction and contextualization of past fire events are possible: we can see the agreement between dry climate periods and the presence of more intense fire events, especially in the Early Holocene.

  18. Free Radicals in Organic Matter for Thermal History Reconstruction of Carbonate Succession

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Geothermometry is one of the most useful methods for reconstructing the thermal history of sedimentary basins. This paper introduces the application of the free-radical concentration of organic matter as a thermal indicator in the thermal history reconstruction of carbonate successions, based on anhydrous thermal simulation results for type Ⅰ and Ⅱ1 kerogen. A series of free-radical data were obtained under thermal simulation at different heating temperatures and times, and quantitative models between the free-radical concentration (Ng) of organic matter and the time-temperature index (TTI) for type Ⅰ and type Ⅱ1 kerogen were also obtained. This Ng-TTI relation was used to model the Ordovician thermal gradients of Well TZ12 in the Tarim Basin. The modeling result corresponds to the results obtained from apatite fission track data and published data. This new method of thermal history reconstruction will benefit hydrocarbon generation and accumulation studies and resource assessment of carbonate successions.

  19. A Bayesian approach for suppression of limited angular sampling artifacts in single particle 3D reconstruction.

    Science.gov (United States)

    Moriya, Toshio; Acar, Erman; Cheng, R Holland; Ruotsalainen, Ulla

    2015-09-01

    In single particle reconstruction, the initial 3D structure often suffers from the limited angular sampling artifact. Selecting 2D class averages of particle images generally improves the accuracy and efficiency of reference-free 3D angle estimation, but causes insufficient angular sampling to fill the information of the target object in the 3D frequency space. Similarly, the initial 3D structure from random-conical tilt reconstruction has the well-known "missing cone" artifact. Here, we attempted to solve the limited angular sampling problem by sequentially applying a maximum a posteriori estimate with the expectation-maximization algorithm (sMAP-EM). Using both simulated and experimental cryo-electron microscope images, the sMAP-EM was compared to the direct Fourier method on the basis of reconstruction error and resolution. To establish selection criteria for the final regularization weight of the sMAP-EM, the effects of noise level and sampling sparseness on the reconstructions were examined with evenly distributed sampling simulations. The frequency information filled in the missing cone of the conical tilt sampling simulations was assessed by developing new quantitative measurements. All the results of visual and numerical evaluations showed that the sMAP-EM performed better than the direct Fourier method, regardless of the sampling method, noise level, and sampling sparseness. Furthermore, the frequency domain analysis demonstrated that the sMAP-EM can fill in meaningful information in the unmeasured angular space without detailed a priori knowledge of the objects. The current research demonstrated that the sMAP-EM has a high potential to facilitate the determination of 3D protein structures at near atomic resolution. PMID:26193484

  20. MAGIC: Exact Bayesian Covariance Estimation and Signal Reconstruction for Gaussian Random Fields

    OpenAIRE

    Wandelt, Benjamin D.

    2004-01-01

    In this talk I describe MAGIC, an efficient approach to covariance estimation and signal reconstruction for Gaussian random fields (MAGIC Allows Global Inference of Covariance). It solves a long-standing problem in the field of cosmic microwave background (CMB) data analysis but is in fact a general technique that can be applied to noisy, contaminated and incomplete or censored measurements of either spatial or temporal Gaussian random fields. In this talk I will phrase the method in a way th...

  1. Collage as a Way to Reconstruct the History of Education

    OpenAIRE

    Elena Penskaja

    2012-01-01

    Elena Penskaja, D.Sc. in Philology, Dean of the Faculty of Philology, Head of the Literature Department, National Research University - Higher School of Economics, Moscow, Russian Federation. The paper reviews the methods used to describe historical processes in education and analyzes examples of historical material falsification. The lack of qualitative studies in the history of Russian education is a great hindrance to educational reforms. The author identifies and descr...

  2. Reconstructing the history of structure formation using redshift distortions

    OpenAIRE

    Song, Yong-Seon; Percival, Will

    2008-01-01

    Measuring the statistics of galaxy peculiar velocities using redshift-space distortions is an excellent way of probing the history of structure formation. Because galaxies are expected to act as test particles within the flow of matter, this method avoids uncertainties due to an unknown galaxy density bias. We show that the parameter combination measured by redshift-space distortions, $f\sigma_8^{\rm mass}$, provides a good test of dark energy models, even without the knowledge of bias or $\si...

  3. Bayesian approach to spectral function reconstruction for Euclidean quantum field theories.

    Science.gov (United States)

    Burnier, Yannis; Rothkopf, Alexander

    2013-11-01

    We present a novel approach to the inference of spectral functions from Euclidean time correlator data that makes close contact with modern Bayesian concepts. Our method differs significantly from the maximum entropy method (MEM). A new set of axioms is postulated for the prior probability, leading to an improved expression, which is devoid of the asymptotically flat directions present in the Shannon-Jaynes entropy. Hyperparameters are integrated out explicitly, liberating us from the Gaussian approximations underlying the evidence approach of the maximum entropy method. We present a realistic test of our method in the context of the nonperturbative extraction of the heavy quark potential. Based on hard-thermal-loop correlator mock data, we establish firm requirements in the number of data points and their accuracy for a successful extraction of the potential from lattice QCD. Finally we reinvestigate quenched lattice QCD correlators from a previous study and provide an improved potential estimation at T = 2.33 T_C. PMID:24237510

  4. Lattice NRQCD study on in-medium bottomonium spectra using a novel Bayesian reconstruction approach

    International Nuclear Information System (INIS)

    We present recent results on the in-medium modification of S- and P-wave bottomonium states around the deconfinement transition. Our study uses lattice QCD with Nf = 2 + 1 light quark flavors to describe the non-perturbative thermal QCD medium between 140 MeV < T < 249 MeV and deploys lattice regularized non-relativistic QCD (NRQCD) effective field theory to capture the physics of heavy quark bound states immersed therein. The spectral functions of the ³S₁ (ϒ) and ³P₁ (χb1) bottomonium states are extracted from Euclidean time Monte Carlo simulations using a novel Bayesian prescription, which provides higher accuracy than the Maximum Entropy Method. Based on a systematic comparison of interacting and free spectral functions we conclude that the ground states of both the S-wave (ϒ) and P-wave (χb1) channels survive up to T = 249 MeV. Stringent upper limits on the size of the in-medium modification of bottomonium masses and widths are provided

  5. A Comparison of Spatio-Temporal Bayesian Models for Reconstruction of Rainfall Fields in a Cloud Seeding Experiment

    Directory of Open Access Journals (Sweden)

    Sujit K. Sahu

    2005-01-01

    Full Text Available In response to the drought experienced in southern Italy, a rain seeding project was set up and developed during the years 1989-1994. The initiative was taken with the purpose of applying existing methods of rain enhancement technology to regions of southern Italy, including Puglia. The aim of this study is to provide statistical support for the evaluation of the experimental part of the project. In particular, our aim is to reconstruct rainfall fields by combining two data sources: rainfall intensity as measured by ground raingauges and radar reflectivity. A difficulty in modeling the rainfall data here comes from the rounding of many recorded raingauge measurements. The rounding of the rainfall measurements makes the data essentially discrete, and models based on continuous distributions are not suitable for modeling these discrete data. In this study we extend two recently developed spatio-temporal models for continuous data to accommodate rounded rainfall measurements taking discrete values with positive probabilities. We use MCMC methods to implement the models and obtain forecasts in space and time together with their standard errors. We compare the two models using predictive Bayesian methods. The benefits of our modeling extensions are seen in accurate predictions of dry periods with no positive prediction standard errors.

  6. Diet reconstruction and resource partitioning of a Caribbean marine mesopredator using stable isotope bayesian modelling.

    Directory of Open Access Journals (Sweden)

    Alexander Tilley

    Full Text Available The trophic ecology of epibenthic mesopredators is not well understood in terms of prey partitioning with sympatric elasmobranchs or their effects on prey communities, yet the importance of omnivores in community trophic dynamics is being increasingly realised. This study used stable isotope analysis of (15)N and (13)C to model diet composition of wild southern stingrays Dasyatis americana and compare trophic niche space to nurse sharks Ginglymostoma cirratum and Caribbean reef sharks Carcharhinus perezi on Glovers Reef Atoll, Belize. Bayesian stable isotope mixing models were used to investigate prey choice as well as viable Diet-Tissue Discrimination Factors for use with stingrays. Stingray δ(15)N values showed the greatest variation and a positive relationship with size, with an isotopic niche width approximately twice that of sympatric species. Shark species exhibited comparatively restricted δ(15)N values and greater δ(13)C variation, with very little overlap of stingray niche space. Mixing models suggest bivalves and annelids are proportionally more important prey in the stingray diet than crustaceans and teleosts at Glovers Reef, in contrast to all but one published diet study using stomach contents from other locations. Incorporating gut contents information from the literature, we suggest diet-tissue discrimination factors values of Δ(15)N ≈ 2.7‰ and Δ(13)C ≈ 0.9‰ for stingrays in the absence of validation experiments. The wide trophic niche and lower trophic level exhibited by stingrays compared to sympatric sharks supports their putative role as important base stabilisers in benthic systems, with the potential to absorb trophic perturbations through numerous opportunistic prey interactions.

  7. Diet reconstruction and resource partitioning of a Caribbean marine mesopredator using stable isotope bayesian modelling.

    Science.gov (United States)

    Tilley, Alexander; López-Angarita, Juliana; Turner, John R

    2013-01-01

    The trophic ecology of epibenthic mesopredators is not well understood in terms of prey partitioning with sympatric elasmobranchs or their effects on prey communities, yet the importance of omnivores in community trophic dynamics is being increasingly realised. This study used stable isotope analysis of (15)N and (13)C to model diet composition of wild southern stingrays Dasyatis americana and compare trophic niche space to nurse sharks Ginglymostoma cirratum and Caribbean reef sharks Carcharhinus perezi on Glovers Reef Atoll, Belize. Bayesian stable isotope mixing models were used to investigate prey choice as well as viable Diet-Tissue Discrimination Factors for use with stingrays. Stingray δ(15)N values showed the greatest variation and a positive relationship with size, with an isotopic niche width approximately twice that of sympatric species. Shark species exhibited comparatively restricted δ(15)N values and greater δ(13)C variation, with very little overlap of stingray niche space. Mixing models suggest bivalves and annelids are proportionally more important prey in the stingray diet than crustaceans and teleosts at Glovers Reef, in contrast to all but one published diet study using stomach contents from other locations. Incorporating gut contents information from the literature, we suggest diet-tissue discrimination factors values of Δ(15)N ≈ 2.7‰ and Δ(13)C ≈ 0.9‰ for stingrays in the absence of validation experiments. The wide trophic niche and lower trophic level exhibited by stingrays compared to sympatric sharks supports their putative role as important base stabilisers in benthic systems, with the potential to absorb trophic perturbations through numerous opportunistic prey interactions. PMID:24236144
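
    The mixing model at the heart of such analyses is linear: the consumer's discrimination-corrected signature is a proportion-weighted average of the source signatures. A deliberately minimal Bayesian version with a Dirichlet prior over diet proportions and a Gaussian likelihood (all isotope values below are invented except the Δ(15)N ≈ 2.7‰ and Δ(13)C ≈ 0.9‰ discrimination factors quoted above; real analyses use MixSIR/SIAR-class models):

        import numpy as np

        def mixing_posterior(mix, sources, tdf, sd=0.5, n_draws=200_000, seed=0):
            """Self-normalized importance sampler for diet proportions p with a
            Dirichlet prior: mix ~ N(sum_k p_k (sources[k] + tdf), sd^2) per isotope."""
            rng = np.random.default_rng(seed)
            p = rng.dirichlet(np.ones(len(sources)), size=n_draws)  # prior draws
            pred = p @ (sources + tdf)               # predicted mixture signatures
            log_w = -0.5 * np.sum(((mix - pred) / sd) ** 2, axis=1) # Gaussian log-lik
            w = np.exp(log_w - log_w.max())
            return (p * w[:, None]).sum(axis=0) / w.sum()           # posterior mean

        # Toy two-isotope (d15N, d13C) example with three prey groups:
        sources = np.array([[8.0, -18.0],    # bivalves
                            [9.5, -16.5],    # annelids
                            [11.0, -15.0]])  # crustaceans
        tdf = np.array([2.7, 0.9])           # discrimination factors as in the study
        print(mixing_posterior(np.array([12.0, -16.0]), sources, tdf))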

  8. Bayesian field theoretic reconstruction of bond potential and bond mobility in single molecule force spectroscopy

    CERN Document Server

    Chang, Joshua C; Chou, Tom

    2015-01-01

    Quantifying the forces between and within macromolecules is a necessary first step in understanding the mechanics of molecular structure, protein folding, and enzyme function and performance. In such macromolecular settings, dynamic single-molecule force spectroscopy (DFS) has been used to distort bonds. The resulting responses, in the form of rupture forces, work applied, and trajectories of displacements, have been used to reconstruct bond potentials. Such approaches often rely on simple parameterizations of one-dimensional bond potentials, assumptions on equilibrium starting states, and/or large amounts of trajectory data. Parametric approaches typically fail at inferring complex-shaped bond potentials with multiple minima, while piecewise estimation may not guarantee smooth results with the appropriate behavior at large distances. Existing techniques, particularly those based on work theorems, also do not address spatial variations in the diffusivity that may arise from spatially inhomogeneous coupling to...

  9. Reconstruction of the History of the Photoelectric Effect and Its Implications for General Physics Textbooks

    Science.gov (United States)

    Niaz, Mansoor; Klassen, Stephen; McMillan, Barbara; Metz, Don

    2010-01-01

    The photoelectric effect is an important part of general physics textbooks. To study the presentation of this phenomenon, we have reconstructed six essential, history and philosophy of science (HPS)-related aspects of the events that culminated in Einstein proposing his hypothesis of light quanta and the ensuing controversy within the scientific…

  10. Reconstruction to Progressivism: Booklet 3. Critical Thinking in American History. [Student Manual].

    Science.gov (United States)

    O'Reilly, Kevin

    One of a series of curriculum materials in U.S. history designed to teach critical thinking skills systematically, this teacher's guide presents high school students with supplementary lessons on the Reconstruction period, industrialism, labor and immigration, and progressivism and populism. The student booklet begins with a guide to critical…

  11. Reconstruction to Progressivism: Booklet 3. Critical Thinking in American History. Teacher's Guide and Source Materials Envelope.

    Science.gov (United States)

    O'Reilly, Kevin

    One of a series of curriculum materials in U.S. history designed to teach critical thinking skills systematically, this teacher's guide presents supplementary lesson plans for teaching high school students about the Reconstruction period, industrialism, labor and immigration, and progressivism and populism. The booklet begins with a guide to…

  12. Whole-History Rating: A Bayesian Rating System for Players of Time-Varying Strength

    OpenAIRE

    Coulom, Rémi

    2008-01-01

    Whole-History Rating (WHR) is a new method to estimate the time-varying strengths of players involved in paired comparisons. Like many variations of the Elo rating system, the whole-history approach is based on the dynamic Bradley-Terry model. But, instead of using incremental approximations, WHR directly computes the exact maximum a posteriori over the whole rating history of all players. This additional accuracy comes at a higher computational cost than traditional methods, but computation ...
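
    Schematically, WHR combines a Bradley-Terry likelihood with a Wiener-process prior over each player's rating trajectory and finds the maximum a posteriori rating history. The toy sketch below is only an illustration of that idea (a single player against opponents of known fixed strength, plain gradient ascent instead of Coulom's Newton updates, and an assumed Wiener variance w2):

      import numpy as np

      # Toy data: one player, 4 periods; each game = (period, opponent_rating, won?)
      games = [(0, 0.0, 1), (0, 0.5, 0), (1, 0.5, 1),
               (2, 1.0, 1), (3, 1.0, 0), (3, 0.0, 1)]
      T, w2 = 4, 0.2          # number of rating periods, assumed Wiener variance

      def grad(r):
          g = np.zeros(T)
          for t, opp, won in games:                 # Bradley-Terry likelihood terms
              p_win = 1.0 / (1.0 + np.exp(-(r[t] - opp)))
              g[t] += won - p_win
          for t in range(T - 1):                    # Wiener-process smoothing prior
              d = (r[t + 1] - r[t]) / w2
              g[t] += d
              g[t + 1] -= d
          return g

      r = np.zeros(T)
      for _ in range(2000):                         # plain gradient ascent to the MAP
          r += 0.05 * grad(r)

      print("MAP rating history:", r.round(2))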

  13. Reconstructing the history of structure formation using redshift distortions

    International Nuclear Information System (INIS)

    Measuring the statistics of galaxy peculiar velocities using redshift-space distortions is an excellent way of probing the history of structure formation. Because galaxies are expected to act as test particles within the flow of matter, this method avoids uncertainties due to an unknown galaxy density bias. We show that the parameter combination measured by redshift-space distortions, fσ8mass, provides a good test of dark energy models, even without the knowledge of bias or σ8mass required to extract f from this measurement (here f is the logarithmic derivative of the linear growth factor, and σ8mass is the root-mean-square mass fluctuation in spheres with radius 8h−1Mpc). We argue that redshift-space distortion measurements will help to determine the physics behind the cosmic acceleration, testing whether it is related to dark energy or modified gravity, and will provide an opportunity to test possible dark energy clumping or coupling between dark energy and dark matter. If we can measure galaxy bias in addition, simultaneous measurement of both the overdensity and velocity fields can be used to test the validity of the equivalence principle, through the continuity equation.
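
    For reference, the standard definitions behind this parameter combination (textbook relations, not specific to this record): the growth rate is $f(z)\equiv d\ln D/d\ln a$, with $D$ the linear growth factor and $a$ the scale factor, and in general relativity $f(z)\approx\Omega_m(z)^{0.55}$. Redshift-space distortions measure $\beta = f/b$, while the galaxy clustering amplitude measures $b\,\sigma_{8,\mathrm{mass}}$, so the product $\beta\cdot b\,\sigma_{8,\mathrm{mass}} = f\sigma_{8,\mathrm{mass}}$ is independent of the unknown bias $b$.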

  14. Metagenomic reconstructions of bacterial CRISPR loci constrain population histories.

    Science.gov (United States)

    Sun, Christine L; Thomas, Brian C; Barrangou, Rodolphe; Banfield, Jillian F

    2016-04-01

    Bacterial CRISPR-Cas systems provide insight into recent population history because they rapidly incorporate, in a unidirectional manner, short fragments (spacers) from coexisting infective virus populations into host chromosomes. Immunity is achieved by sequence identity between transcripts of spacers and their targets. Here, we used metagenomics to study the stability and dynamics of the type I-E CRISPR-Cas locus of Leptospirillum group II bacteria in biofilms sampled over 5 years from an acid mine drainage (AMD) system. Despite recovery of 452,686 spacers from CRISPR amplicons and metagenomic data, rarefaction curves of spacers show no saturation. The vast repertoire of spacers is attributed to phage/plasmid population diversity and retention of old spacers, despite rapid evolution of the targeted phage/plasmid genome regions (proto-spacers). The oldest spacers (spacers found at the trailer end) are conserved for at least 5 years, and 12% of these retain perfect or near-perfect matches to proto-spacer targets. The majority of proto-spacer regions contain an AAG proto-spacer adjacent motif (PAM). Spacers throughout the locus target the same phage population (AMDV1), but there are blocks of consecutive spacers without AMDV1 target sequences. Results suggest long-term coexistence of Leptospirillum with AMDV1 and periods when AMDV1 was less dominant. Metagenomics can be applied to millions of cells in a single sample to provide an extremely large spacer inventory, allow identification of phage/plasmids and enable analysis of previous phage/plasmid exposure. Thus, this approach can provide insights into prior bacterial environment and genetic interplay between hosts and their viruses. PMID:26394009

  15. Reconstructing the history of marriage strategies in Indo-European-speaking societies: monogamy and polygyny.

    Science.gov (United States)

    Fortunato, Laura

    2011-02-01

    Explanations for the emergence of monogamous marriage have focused on the cross-cultural distribution of marriage strategies, thus failing to account for their history. In this paper I reconstruct the pattern of change in marriage strategies in the history of societies speaking Indo-European languages, using cross-cultural data in the systematic and explicitly historical framework afforded by the phylogenetic comparative approach. The analysis provides evidence in support of Proto-Indo-European monogamy, and that this pattern may have extended back to Proto-Indo-Hittite. These reconstructions push the origin of monogamous marriage into prehistory, well beyond the earliest instances documented in the historical record; this, in turn, challenges notions that the cross-cultural distribution of monogamous marriage reflects features of social organization typically associated with Eurasian societies, and with "societal complexity" and "modernization" more generally. I discuss implications of these findings in the context of the archaeological and genetic evidence on prehistoric social organization. PMID:21453006

  16. Memory, History and Narrative: Shifts of Meaning when (Re)constructing the Past

    Directory of Open Access Journals (Sweden)

    Ignacio Brescó de Luna

    2012-05-01

    Full Text Available This paper is devoted to the examination of some socio-cultural dimensions of memory, focusing on narratives as a mediational tool (Vygotsky, 1978) for the construction of past events and attribution of meaning. The five elements of Kenneth Burke’s Grammar of Motives (1969) are taken as a framework for the examination of reconstructions of the past, and particularly of histories, namely: (1) the interpretative and reconstructive action of (2) a positioned agent operating (3) through narrative means (4) addressed to particular purposes (5) within a concrete social and temporal scenery. The reflexive character of such an approach opens the ground for considering remembering as a kind of act performed within the context of a set of on-going actions, so that remembrances play a directive role for action and thus have an unavoidable moral dimension. This is particularly relevant for some kinds of social memory, such as history teaching, and their effects upon identity.

  17. Revised palaeogeographical reconstruction and avulsion history of the Holocene Rhine-Meuse delta, The Netherlands

    OpenAIRE

    Stouthamer, E.; Cohen, K.M.; Hoek, W.Z.; Pierik, H.J.; Taal, L.J.; Hijma, M.P.; Bos, I.J.

    2013-01-01

    In the Holocene Rhine-Meuse delta, the geography, architecture, and chronology of the channel belts and their flood basins are known in exceptionally high detail. This is due to a long history of intensive geological, geomorphological, and archaeological research by various universities, knowledge institutes, and archaeological consultancy companies. A first reconstruction showing the build-up and palaeogeographical development of the delta in 500 year time-slices was published in 2001 by Beren...

  18. Reconstructing a website's lost past: Methodological issues concerning the history of www.unibo.it

    OpenAIRE

    Nanni, Federico

    2016-01-01

    This paper describes how born digital primary sources could be used to reconstruct the recent history of scientific institutions. The case study is an analysis of the first 25 years online of the University of Bologna. The focus of this work is primarily methodological: several different issues are presented, starting with the fact that the University of Bologna website has been excluded for thirteen years from the Internet Archive's Wayback Machine, and possible solutions are proposed and ap...

  19. Approximate Bayesian Computation: a useful approach for inferring population history and other parameters

    Czech Academy of Sciences Publication Activity Database

    Konečný, A.; Bryja, Josef

    Brno: Ústav biologie obratlovců AV ČR, 2012 - (Bryja, J.; Albrechtová, J.; Tkadlec, E.). s. 97 ISBN 978-80-87189-11-5. [Zoologické dny. 09.02.2012-10.02.2012, Olomouc] R&D Projects: GA ČR GAP506/10/0983 Institutional research plan: CEZ:AV0Z60930519 Keywords : colonization history * black rat Subject RIV: EG - Zoology

  20. msBayes: Pipeline for testing comparative phylogeographic histories using hierarchical approximate Bayesian computation

    Directory of Open Access Journals (Sweden)

    Takebayashi Naoki

    2007-07-01

    Full Text Available Abstract Background Although testing for simultaneous divergence (vicariance) across different population-pairs that span the same barrier to gene flow is of central importance to evolutionary biology, researchers often equate the gene tree and population/species tree, thereby ignoring stochastic coalescent variance in their conclusions of temporal incongruence. In contrast to other available phylogeographic software packages, msBayes is the only one that analyses data from multiple species/population pairs under a hierarchical model. Results msBayes employs approximate Bayesian computation (ABC) under a hierarchical coalescent model to test for simultaneous divergence (TSD) in multiple co-distributed population-pairs. Simultaneous isolation is tested by estimating three hyper-parameters that characterize the degree of variability in divergence times across co-distributed population pairs, while allowing for variation in various within-population-pair demographic parameters (sub-parameters) that can affect the coalescent. msBayes is a software package consisting of several C and R programs that are run with a Perl "front-end". Conclusion The method reasonably distinguishes simultaneous isolation from temporal incongruence in the divergence of co-distributed population pairs, even with sparse sampling of individuals. Because the estimation step is decoupled from the simulation step, one can rapidly evaluate different ABC acceptance/rejection conditions and the choice of summary statistics. Given the complex and idiosyncratic nature of testing multi-species biogeographic hypotheses, we envision msBayes as a powerful and flexible tool for tackling a wide array of difficult research questions that use population genetic data from multiple co-distributed species. The msBayes pipeline is available for download at http://msbayes.sourceforge.net/ under an open source license (GNU Public License).
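
    The rejection flavor of ABC that underlies pipelines like this one is simple to sketch. The toy model below (pairwise sequence differences Poisson-distributed around 2*mu*t) is illustrative only, not msBayes's hierarchical coalescent; mu, the prior range, and the tolerance eps are assumptions:

      import numpy as np

      rng = np.random.default_rng(0)

      # Toy model: pairwise sequence differences ~ Poisson(2*mu*t) per locus.
      mu, n_loci = 1e-3, 50
      obs = rng.poisson(2 * mu * 40000, n_loci)    # pretend these are observed data
      s_obs = obs.mean()                           # observed summary statistic

      def simulate(t):
          return rng.poisson(2 * mu * t, n_loci).mean()

      # ABC rejection: draw divergence time t from its prior, keep draws whose
      # simulated summary statistic lands within tolerance eps of the observed one.
      prior_draws = rng.uniform(0, 100000, 200000)
      eps = 2.0
      accepted = [t for t in prior_draws if abs(simulate(t) - s_obs) < eps]

      print(f"accepted {len(accepted)} draws; "
            f"posterior mean t = {np.mean(accepted):.0f} generations")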

  1. Reconstruction of Exposure to m-Xylene from Human Biomonitoring Data Using PBPK Modelling, Bayesian Inference, and Markov Chain Monte Carlo Simulation

    Directory of Open Access Journals (Sweden)

    Kevin McNally

    2012-01-01

    Full Text Available There are numerous biomonitoring programs, both recent and ongoing, to evaluate environmental exposure of humans to chemicals. Due to the lack of exposure and kinetic data, correlating biomarker levels with exposure concentrations is difficult, which limits the use of biomonitoring data for deriving biological guidance values. Exposure reconstruction or reverse dosimetry is the retrospective interpretation of external exposure consistent with biomonitoring data. We investigated the integration of physiologically based pharmacokinetic modelling, global sensitivity analysis, Bayesian inference, and Markov chain Monte Carlo simulation to obtain a population estimate of inhalation exposure to m-xylene. We used exhaled breath and venous blood m-xylene and urinary 3-methylhippuric acid measurements from a controlled human volunteer study in order to evaluate the ability of our computational framework to predict known inhalation exposures. We also investigated the importance of model structure and dimensionality with respect to its ability to reconstruct exposure.
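
    A schematic of the reverse-dosimetry computation (a toy one-compartment surrogate stands in for the full PBPK model; all constants, the prior range, and the lognormal error level are invented for illustration):

      import numpy as np

      rng = np.random.default_rng(0)

      def biomarker(c_air):
          # Toy one-compartment surrogate for a PBPK model: steady-state blood
          # concentration proportional to inhaled concentration (made-up constants).
          uptake, clearance = 0.6, 2.0
          return uptake * c_air / clearance

      c_true = 50.0                                 # "unknown" exposure (ppm)
      y_obs = biomarker(c_true) * np.exp(0.1 * rng.standard_normal(5))  # noisy data

      def log_post(c):
          if c <= 0 or c > 500:                     # uniform prior on (0, 500] ppm
              return -np.inf
          resid = np.log(y_obs) - np.log(biomarker(c))
          return -0.5 * np.sum((resid / 0.1) ** 2)  # lognormal measurement error

      c, chain = 100.0, []
      for it in range(20000):                       # random-walk Metropolis
          c_new = c + 5.0 * rng.standard_normal()
          if np.log(rng.random()) < log_post(c_new) - log_post(c):
              c = c_new
          if it > 5000:
              chain.append(c)

      print(f"reconstructed exposure: {np.mean(chain):.1f} ppm (true {c_true})")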

  2. Understanding the recent colonization history of a plant pathogenic fungus using population genetic tools and Approximate Bayesian Computation.

    Science.gov (United States)

    Barrès, B; Carlier, J; Seguin, M; Fenouillet, C; Cilas, C; Ravigné, V

    2012-11-01

    Understanding the processes by which new diseases are introduced in previously healthy areas is of major interest in elaborating prevention and management policies, as well as in understanding the dynamics of pathogen diversity at large spatial scale. In this study, we aimed to decipher the dispersal processes that have led to the emergence of the plant pathogenic fungus Microcyclus ulei, which is responsible for the South American Leaf Blight (SALB). This fungus has devastated rubber tree plantations across Latin America since the beginning of the twentieth century. As only imprecise historical information is available, the study of population evolutionary history based on population genetics appeared most appropriate. The distribution of genetic diversity in a continental sampling of four countries (Brazil, Ecuador, Guatemala and French Guiana) was studied using a set of 16 microsatellite markers developed specifically for this purpose. A very strong genetic structure was found (F(st)=0.70), demonstrating that there has been no regular gene flow between Latin American M. ulei populations. Strong bottlenecks probably occurred at the foundation of each population. The most likely scenario of colonization identified by the Approximate Bayesian Computation (ABC) method implemented in DIYABC suggested two independent sources from the Amazonian endemic area. The Brazilian, Ecuadorian and Guatemalan populations might stem from serial introductions through human-mediated movement of infected plant material from an unsampled source population, whereas the French Guiana population seems to have arisen from an independent colonization event through spore dispersal. PMID:22828899

  3. Gene Transfer and the Reconstruction of Life's Early History from Genomic Data

    Science.gov (United States)

    Gogarten, J. Peter; Fournier, Gregory; Zhaxybayeva, Olga

    2008-03-01

    The metaphor of the unique and strictly bifurcating tree of life, suggested by Charles Darwin, needs to be replaced (or at least amended) to reflect and include processes that lead to the merging of and communication between independent lines of descent. Gene histories include and reflect processes such as gene transfer, symbioses and lineage fusion. No single molecule can serve as a proxy for the tree of life. Individual gene histories can be reconstructed from the growing molecular databases containing sequence and structural information. With some simplifications these gene histories can be represented by furcating trees; however, merging these gene histories into web-like organismal histories, including the transfer of metabolic pathways and cell biological innovations from now-extinct lineages, has yet to be accomplished. Because of these difficulties in interpreting the record retained in molecular sequences, correlations with biochemical fossils and with the geological record need to be interpreted with caution. Advances to detect and pinpoint transfer events promise to untangle at least a few of the intertwined histories of individual genes within organisms and trace them to the organismal ancestors. Furthermore, analysis of the shape of molecular phylogenetic trees may point towards organismal radiations that might reflect early mass extinction events that occurred on a planetary scale.

  4. Inferring Population Size History from Large Samples of Genome-Wide Molecular Data - An Approximate Bayesian Computation Approach.

    Directory of Open Access Journals (Sweden)

    Simon Boitard

    2016-03-01

    Full Text Available Inferring the ancestral dynamics of effective population size is a long-standing question in population genetics, which can now be tackled much more accurately thanks to the massive genomic data available in many species. Several promising methods that take advantage of whole-genome sequences have been recently developed in this context. However, they can only be applied to rather small samples, which limits their ability to estimate recent population size history. Besides, they can be very sensitive to sequencing or phasing errors. Here we introduce a new approximate Bayesian computation approach named PopSizeABC that allows estimating the evolution of the effective population size through time, using a large sample of complete genomes. This sample is summarized using the folded allele frequency spectrum and the average zygotic linkage disequilibrium at different bins of physical distance, two classes of statistics that are widely used in population genetics and can be easily computed from unphased and unpolarized SNP data. Our approach provides accurate estimations of past population sizes, from the very first generations before present back to the expected time to the most recent common ancestor of the sample, as shown by simulations under a wide range of demographic scenarios. When applied to samples of 15 or 25 complete genomes in four cattle breeds (Angus, Fleckvieh, Holstein and Jersey), PopSizeABC revealed a series of population declines, related to historical events such as domestication or modern breed creation. We further highlight that our approach is robust to sequencing errors, provided summary statistics are computed from SNPs with common alleles.
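
    The first of the two summary-statistic classes mentioned, the folded allele frequency spectrum, is straightforward to compute from unpolarized genotype data. A minimal sketch (the genotype matrix is simulated and folded_afs is a hypothetical helper of my own naming, not PopSizeABC code):

      import numpy as np

      def folded_afs(genotypes):
          """Folded allele frequency spectrum from a (n_individuals x n_snps)
          diploid genotype matrix coded 0/1/2; no ancestral allele needed."""
          n_chrom = 2 * genotypes.shape[0]
          counts = genotypes.sum(axis=0)                # alt-allele count per SNP
          minor = np.minimum(counts, n_chrom - counts)  # fold to minor-allele count
          return np.bincount(minor, minlength=n_chrom // 2 + 1)[1:]

      rng = np.random.default_rng(0)
      g = rng.binomial(2, rng.uniform(0.05, 0.5, 1000), size=(25, 1000))
      print(folded_afs(g))   # entries: #SNPs with minor-allele count 1, 2, ...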

  5. Reconstructing the history of residence strategies in Indo-European-speaking societies: neo-, uxori-, and virilocality.

    Science.gov (United States)

    Fortunato, Laura

    2011-02-01

    Linguists and archaeologists have used reconstructions of early Indo-European residence strategies to constrain hypotheses about the homeland and trajectory of dispersal of Indo-European languages; however, these reconstructions are largely based on unsystematic and ahistorical use of the linguistic and ethnographic evidence, coupled with substantial bias in interpretation. Here I use cross-cultural data in a phylogenetic comparative framework to reconstruct the pattern of change in residence strategies in the history of societies speaking Indo-European languages. The analysis provides evidence in support of prevailing virilocality with alternative neolocality for Proto-Indo-European, and that this pattern may have extended back to Proto-Indo-Hittite. These findings bolster interpretations of the archaeological evidence that emphasize the "non-matricentric" structure of early Indo-European society; however, they also counter the notion that early Indo-European society was strongly "patricentric." I discuss implications of these findings in the context of the archaeological and genetic evidence on prehistoric social organization. PMID:21453007

  6. An Object-Based Approach for Fire History Reconstruction by Using Three Generations of Landsat Sensors

    Directory of Open Access Journals (Sweden)

    Thomas Katagis

    2014-06-01

    Full Text Available In this study, the capability of geographic object-based image analysis (GEOBIA) in the reconstruction of the recent fire history of a typical Mediterranean area was investigated. More specifically, a semi-automated GEOBIA procedure was developed and tested on archived and newly acquired Landsat Multispectral Scanner (MSS), Thematic Mapper (TM), and Operational Land Imager (OLI) images in order to accurately map burned areas in the Mediterranean island of Thasos. The developed GEOBIA ruleset was built with the use of the TM image and then applied to the other two images. This process of transferring the ruleset did not require substantial adjustments or any replacement of the initially selected features used for the classification, thus, displaying reduced complexity in processing the images. As a result, burned area maps of very high accuracy (over 94% overall) were produced. In addition to the standard error matrix, the employment of additional measures of agreement between the produced maps and the reference data revealed that “spatial misplacement” was the main source of classification error. It can be concluded that the proposed approach can be potentially used for reconstructing the recent (40-year) fire history in the Mediterranean, based on extended time series of Landsat or similar data.

  7. Bayesian prediction and adaptive sampling algorithms for mobile sensor networks online environmental field reconstruction in space and time

    CERN Document Server

    Xu, Yunfei; Dass, Sarat; Maiti, Tapabrata

    2016-01-01

    This brief introduces a class of problems and models for the prediction of the scalar field of interest from noisy observations collected by mobile sensor networks. It also introduces the problem of optimal coordination of robotic sensors to maximize the prediction quality subject to communication and mobility constraints either in a centralized or distributed manner. To solve such problems, fully Bayesian approaches are adopted, allowing various sources of uncertainties to be integrated into an inferential framework effectively capturing all aspects of variability involved. The fully Bayesian approach also allows the most appropriate values for additional model parameters to be selected automatically by data, and the optimal inference and prediction for the underlying scalar field to be achieved. In particular, spatio-temporal Gaussian process regression is formulated for robotic sensors to fuse multifactorial effects of observations, measurement noise, and prior distributions for obtaining the predictive di...
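
    The computational core of such frameworks is Gaussian process posterior prediction over space-time inputs. A minimal sketch with fixed, assumed hyperparameters (the fully Bayesian treatment described above would instead integrate over them; all data and length-scales below are invented):

      import numpy as np

      def sqexp(a, b, ls, var):
          # Squared-exponential kernel over (x, y, t) inputs.
          d2 = ((a[:, None, :] - b[None, :, :]) / ls) ** 2
          return var * np.exp(-0.5 * d2.sum(-1))

      rng = np.random.default_rng(0)
      X = rng.uniform(0, 10, (40, 3))                 # sensor samples at (x, y, t)
      f = np.sin(X[:, 0]) + 0.5 * X[:, 2]             # latent scalar field (toy)
      y = f + 0.1 * rng.standard_normal(40)           # noisy sensor readings

      ls = np.array([2.0, 2.0, 5.0])                  # assumed length-scales
      K = sqexp(X, X, ls, 1.0) + 0.1**2 * np.eye(40)  # kernel + noise variance
      Xs = np.array([[5.0, 5.0, 10.0]])               # where/when to predict
      Ks = sqexp(Xs, X, ls, 1.0)
      Kss = sqexp(Xs, Xs, ls, 1.0)

      mean = Ks @ np.linalg.solve(K, y)               # GP posterior mean
      var = Kss - Ks @ np.linalg.solve(K, Ks.T)       # GP posterior variance
      print(f"prediction {mean[0]:.2f} +/- {np.sqrt(var[0, 0]):.2f}")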

  8. A general reconstruction of the recent expansion history of the universe

    Science.gov (United States)

    Vitenti, S. D. P.; Penna-Lima, M.

    2015-09-01

    Distance measurements are currently the most powerful tool to study the expansion history of the universe without specifying its matter content nor any theory of gravitation. Assuming only an isotropic, homogeneous and flat universe, in this work we introduce a model-independent method to reconstruct directly the deceleration function via a piecewise function. Including a penalty factor, we are able to vary continuously the complexity of the deceleration function from a linear case to an arbitrary (n+1)-knots spline interpolation. We carry out a Monte Carlo (MC) analysis to determine the best penalty factor, evaluating the bias-variance trade-off, given the uncertainties of the SDSS-II and SNLS supernova combined sample (JLA), compilations of baryon acoustic oscillation (BAO) and H(z) data. The bias-variance analysis is done for three fiducial models with different features in the deceleration curve. We perform the MC analysis generating mock catalogs and computing their best-fit. For each fiducial model, we test different reconstructions using, in each case, more than 10^4 catalogs in a total of about 5 × 10^5. This investigation proved to be essential in determining the best reconstruction to study these data. We show that, evaluating a single fiducial model, the conclusions about the bias-variance ratio are misleading. We determine the reconstruction method in which the bias represents at most 10% of the total uncertainty. In all statistical analyses, we fit the coefficients of the deceleration function along with four nuisance parameters of the supernova astrophysical model. For the full sample, we also fit H0 and the sound horizon rs(zd) at the drag redshift. The bias-variance trade-off analysis shows that, apart from the deceleration function, all other estimators are unbiased. Finally, we apply the Ensemble Sampler Markov Chain Monte Carlo (ESMCMC) method to explore the posterior of the deceleration function up to redshift 1.3 (using only JLA) and 2.3 (JLA
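
    The standard kinematic relations that make such a reconstruction possible (textbook results, consistent with the abstract): with $q(z)\equiv -\ddot a\,a/\dot a^2$, the Hubble function follows as $H(z)=H_0\exp\left[\int_0^z \frac{1+q(z')}{1+z'}\,dz'\right]$, and in a flat universe the comoving distance is $D_C(z)=c\int_0^z dz'/H(z')$; a spline parameterization of $q(z)$ therefore fixes the SN, BAO and $H(z)$ observables up to $H_0$ and $r_s(z_d)$.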

  9. Evolutionary history of assassin bugs (Insecta: Hemiptera: Reduviidae): insights from divergence dating and ancestral state reconstruction.

    Directory of Open Access Journals (Sweden)

    Wei Song Hwang

    Full Text Available Assassin bugs are one of the most successful clades of predatory animals based on their species numbers (∼6,800 spp.) and wide distribution in terrestrial ecosystems. Various novel prey capture strategies and remarkable prey specializations contribute to their appeal as a model to study evolutionary pathways involved in predation. Here, we reconstruct the most comprehensive reduviid phylogeny to date (178 taxa, 18 subfamilies) based on molecular data (5 markers). This phylogeny tests current hypotheses on reduviid relationships, emphasizing the polyphyletic Reduviinae and the blood-feeding, disease-vectoring Triatominae, and allows us, for the first time in assassin bugs, to reconstruct ancestral states of prey associations and microhabitats. Using a fossil-calibrated molecular tree, we estimated divergence times for key events in the evolutionary history of Reduviidae. Our results indicate that the polyphyletic Reduviinae fall into 11-14 separate clades. Triatominae are paraphyletic with respect to the reduviine genus Opisthacidius in the maximum likelihood analyses; this result is in contrast to prior hypotheses that found Triatominae to be monophyletic or polyphyletic and may be due to the more comprehensive taxon and character sampling in this study. The evolution of blood-feeding may thus have occurred once or twice independently among predatory assassin bugs. All prey specialists evolved from generalist ancestors, with multiple evolutionary origins of termite and ant specializations. A bark-associated life style on tree trunks is ancestral for most of the lineages of Higher Reduviidae; living on foliage has evolved at least six times independently. Reduviidae originated in the Middle Jurassic (178 Ma), but significant lineage diversification only began in the Late Cretaceous (97 Ma). The integration of molecular phylogenetics with fossil and life history data as presented in this paper provides insights into the evolutionary history of Reduviidae.

  10. Archaeology of fire: Methodological aspects of reconstructing fire history of prehistoric archaeological sites

    Science.gov (United States)

    Alperson-Afil, Nira

    2012-07-01

    Concepts which are common in the reconstruction of fire histories are employed here for the purpose of interpreting fires identified at archaeological sites. When attempting to evaluate the fire history of ancient occupations we are limited by the amount and quality of the available data. Furthermore, the identification of archaeological burned materials, such as stone, wood, and charcoal, is adequate for the general assumption of a "fire history", but the agent responsible - anthropogenic or natural - cannot be inferred from the mere presence of burned items. The large body of scientific data that has accumulated, primarily through efforts to prevent future fire disasters, enables us to reconstruct scenarios of past natural fires. Adopting this line of thought, this paper attempts to evaluate the circumstances in which a natural fire may have ignited and spread at the 0.79 Ma occupation site of Gesher Benot Ya'aqov (Israel), resulting in burned wood and burned flint within the archaeological layers. At Gesher Benot Ya'aqov, possible remnants of hearths are explored through analyses of the spatial distribution of burned flint-knapping waste products. These occur in dense clusters in each of the archaeological occupations throughout the long stratigraphic sequence. In this study, the combination of the spatial analysis results, paleoenvironmental information, and the various factors involved in the complex processes of fire ignition, combustion, and behavior has enabled the firm rejection of recurrent natural fires as the agent responsible for the burned materials. In addition, it suggests that, mainly at early sites where evidence for burning is present yet scarce, data on fire ecology can be particularly useful when considered in relation to paleoenvironmental information.

  11. The ASTRA (Ancient instruments Sound/Timbre Reconstruction Application) Project brings history to life!

    Science.gov (United States)

    Avanzo, Salvatore; Barbera, Roberto; de Mattia, Francesco; Rocca, Giuseppe La; Sorrentino, Mariapaola; Vicinanza, Domenico

    ASTRA (Ancient instruments Sound/Timbre Reconstruction Application) is a project coordinated at the Conservatory of Music of Parma which aims to bring history to life. Ancient musical instruments can now be heard for the first time in hundreds of years, thanks to the successful synergy between art/humanities and science. The Epigonion, an instrument of the past, has been digitally recreated using gLite, an advanced middleware developed in the context of the EGEE project, and research networks such as GÉANT2 in Europe and EUMEDCONNECT2 in the Mediterranean region. GÉANT2 and EUMEDCONNECT2, by connecting enormous and heterogeneous computing resources, provided the infrastructures needed to speed up the overall computation time and enable the computer-intensive modeling of musical sounds. This paper summarizes the most recent outcomes of the project, underlining how the Grid aspect of the computation can support the Cultural Heritage community.

  12. A phylogenetic reconstruction of the epidemiological history of canine rabies virus variants in Colombia.

    Science.gov (United States)

    Hughes, Gareth J; Páez, Andrés; Bóshell, Jorge; Rupprecht, Charles E

    2004-03-01

    Historically, canine rabies in Colombia has been caused by two geographically distinct canine variants of rabies virus (RV) which between 1992 and 2002 accounted for approximately 95% of Colombian rabies cases. Genetic variant 1 (GV1) has been isolated up until 1997 in the Central Region and the Department of Arauca, and is now considered extinct through a successful vaccination program. Genetic variant 2 (GV2) has been isolated from the northern Caribbean Region and continues to circulate at present. Here we have analyzed two sets of sequence data based upon either a 147 nucleotide region of the glycoprotein (G) gene or a 258 nucleotide region that combines a fragment of the non-coding intergenic region and a fragment of the polymerase gene. Using both maximum likelihood (ML) and Markov chain Monte Carlo (MCMC) methods we have estimated the time of the most recent common ancestor (MRCA) of the two variants to be between 1983 and 1988. Reconstructions of the population history suggest that GV2 has been circulating in Colombia since the 1960s and that GV1 evolved as a separate lineage from GV2. Estimations of the effective population size at present show the GV2 outbreak to be approximately 20 times greater than that of GV1. Demographic reconstructions were unable to detect a decrease in population size concurrent with the elimination of GV1. We find a raised rate of nucleotide substitution for GV1 gene sequences when compared to that of GV2, although all estimates have wide confidence limits. We demonstrate that phylogenetic reconstructions and sequence analysis can be used to support incidence data from the field in the assessment of RV epidemiology. PMID:15019589

  13. An alternative approach to reconstructing organic matter accumulation with contrasting watershed disturbance histories from lake sediments

    International Nuclear Information System (INIS)

    A number of proxies, including the carbon to nitrogen ratio (C:N) and stable isotopes (δ13C and δ15N), have been used to reconstruct organic matter (OM) profiles from lake sediments, but these proxies, individually or in combination, cannot clearly discriminate different sources. Here we present an alternative approach to elucidate this problem from lake sediments as a function of watershed-scale land use changes. Stable isotope signatures of defined OM sources from the study watersheds, Shawnigan Lake (SHL) and Elk Lake (ELL), were compared with sedimentary proxy records. Results from this study reveal that terrestrial inputs and catchment soil, coinciding with the watershed disturbance histories, probably contributed to the recent trophic enrichment in SHL. In contrast, cultural eutrophication in ELL was partially a result of input from catchment soil (agricultural activities), with significant input from lake primary production as well. Results were consistent in both IsoSource (version 1.2, a Visual Basic program used for source separation; http://www.epa.gov/wed/pages/models/isosource/isosource.htm) and discriminant analysis (a statistical classification technique). - The study shows an alternative approach to reconstruct organic matter accumulation using stable isotopes from lake sediments

  14. A general reconstruction of the recent expansion history of the universe

    CERN Document Server

    Vitenti, S D P

    2015-01-01

    Distance measurements are currently the most powerful tool to study the expansion history of the universe without specifying its matter content nor any theory of gravitation. Assuming only an isotropic, homogeneous and flat universe, in this work we introduce a model-independent method to reconstruct directly the deceleration function via a piecewise function. Including a penalty factor, we are able to vary continuously the complexity of the deceleration function from a linear case to an arbitrary $(n+1)$-knots spline interpolation. We carry out a Monte Carlo analysis to determine the best penalty factor, evaluating the bias-variance trade-off, given the uncertainties of the SDSS-II and SNLS supernova combined sample (JLA), compilations of baryon acoustic oscillation (BAO) and $H(z)$ data. We show that, evaluating a single fiducial model, the conclusions about the bias-variance ratio are misleading. We determine the reconstruction method in which the bias represents at most $10\\%$ of the total uncertainty. In...

  15. Reconstruction of atmospheric soot history in inland regions from lake sediments over the past 150 years

    Science.gov (United States)

    Han, Y. M.; Wei, C.; Huang, R.-J.; Bandowe, B. A. M.; Ho, S. S. H.; Cao, J. J.; Jin, Z. D.; Xu, B. Q.; Gao, S. P.; Tie, X. X.; An, Z. S.; Wilcke, W.

    2016-01-01

    Historical reconstruction of atmospheric black carbon (BC, in the form of char and soot) remains poorly constrained for inland areas. Here we determined and compared the past 150-yr records of BC and polycyclic aromatic compounds (PACs) in sediments from two representative lakes, Huguangyan (HGY) and Chaohu (CH), in eastern China. HGY only receives atmospheric deposition while CH is influenced by riverine input. BC, char, and soot have similar vertical concentration profiles as PACs in both lakes. Abrupt increases in concentrations and mass accumulation rates (MARs) of soot have mainly occurred since ~1950, after the establishment of the People’s Republic of China, when energy usage shifted toward fossil fuels, as reflected by the variations in char/soot concentration ratios and individual PACs. In HGY, soot MARs increased by ~7.7 times in the period 1980-2012 relative to the period 1850-1950. Similar increases (~6.7 times) were observed in CH. The increase in soot MARs is also in line with the emission inventory records in the literature and the fact that submicrometer-sized soot particles can be dispersed regionally. The study provides an alternative method to reconstruct the atmospheric soot history in populated inland areas.

  16. Confederate Immigration to Brazil: A Cross-Cultural Approach to Reconstruction and Public History

    Directory of Open Access Journals (Sweden)

    Karina Esposito

    2015-12-01

    Full Text Available Given the interconnectedness of the contemporary world, it is imperative that historians place their studies within a global context, connecting domestic and foreign events in order to offer a thorough picture of the past. As historians, we should aim at exploring transnational connections in our published research and incorporating the same methodologies in the classroom, as well as in the field of Public History. Cross-cultural collaboration and transnational studies are challenging, but exceptionally effective approaches to developing a comprehensive understanding of the past and connecting people to their history. Important recent scholarship has placed the American Civil War in a broad international and transnational context. This article argues for the importance of continuing this trend, pointing to a unique case study: the Confederate migration to Brazil during and after the Civil War. This episode can help us understand the international impact of the War in the Western Hemisphere. These Confederates attempted to preserve some aspects of their Southern society by migrating to Brazil, one of the remaining slaveholding societies in the hemisphere at the time. Moreover, the descendants that remained in Brazil have engaged in a unique process of remembering and commemorating their heritage over the years. Exploring this migration will enhance Civil War and Reconstruction historiography, as well as commemoration, heritage and memory studies.

  17. Phylogenetic relationships and demographic histories of the Atherinidae in the Eastern Atlantic and Mediterranean Sea re-examined by Bayesian inference.

    Science.gov (United States)

    Pujolar, J M; Zane, L; Congiu, L

    2012-06-01

    The aim of our study is to examine the phylogenetic relationships, divergence times and demographic histories of the five closely related Mediterranean and North-eastern Atlantic species/forms of Atherina using the full Bayesian framework for species tree estimation recently implemented in *BEAST. The inference is made possible by multilocus data using three mitochondrial genes (12S rRNA, 16S rRNA, control region) and one nuclear gene (rhodopsin) from multiple individuals per species available in GenBank. Bayesian phylogenetic analysis of the complete gene dataset produced a tree with strong support for the monophyly of each species, as well as high support for higher level nodes. An old origin of the Atherina group was suggested (19.2 MY), with deep split events within the Atherinidae predating the Messinian Salinity Crisis. Regional genetic substructuring was observed among populations of A. boyeri, with AMOVA and MultiDimensional Scaling suggesting the existence of five groupings (Atlantic/West Mediterranean, Adriatic, Greece, Black Sea and Tunis). The level of subdivision found might be a consequence of the hydrographic isolation within the Mediterranean Sea. Bayesian inference of past demographic histories showed a clear signature of demographic expansion for the European coast populations of A. presbyter, possibly linked to post-glacial colonizations, but not for the Azores/Canary Islands, which is expected in isolated populations because of the impossibility of finding new habitats. Within the Mediterranean, signatures of recent demographic expansion were only found for the Adriatic population of A. boyeri, which could be associated with the relatively recent emergence of the Adriatic Sea. PMID:22425706

  18. Denudation History and Paleogeographic Reconstruction of the Phanerozoic of southern Mantiqueira Province, Brazil

    Science.gov (United States)

    Jelinek, A. R.; Chemale, F., Jr.

    2012-12-01

    In this work we deal with the Phanerozoic history of the southern Mantiqueira Province and adjacent areas after the orogenic collapse of the Brasiliano orogenic mountains in southern Brazil and Uruguay, based on thermochronological data (fission track and U-Th/He on apatite) and thermal history modelling. During the Paleozoic, intraplate sedimentary basins formed mainly bordering the orogenic systems, and thus these regions have not been overprinted by younger orogenic processes. In the Meso-Cenozoic this region was affected by later fragmentation and dispersal due to the separation of South America and Africa. The denudation history of both margins, quantified on the basis of thermal history modelling of apatite fission track thermochronology, indicates that the margin of southeastern Brazil and Uruguay experienced a minimum of 3.5 to 4.5 km of denudation, which included the main exposure area of the Brasiliano orogenic belts and adjacent areas. The Phanerozoic evolution of West Gondwana is thus recorded first by the orogenic collapses of the Brasiliano and Pan-African belts, which formed a single mountain system in the Cambrian-Ordovician period. Subsequently, intraplate basins formed, such as the Paraná in southeastern Brazil and the Congo, with some records of the Table Mountains Group and the upper section of Karoo units in southwestern Africa. In the Permo-Triassic period, the collision of the Cape Fold Belt and Sierra de la Ventana Belt at the margins of the West Gondwana supercontinent resulted in elastic deformation in the cratonic areas, where intraplate basin deposition occurred, and also in subsidence and uplift of the already established Pan-African-Brasiliano belts. Younger denudation events, due to continental margin uplift and basin subsidence, occurred during the rifting and dispersal of the South America and Africa plates, which can be very well defined by the integration of the passive-margin sedimentation of the Pelotas and Santos basins and apatite fission

  19. The Bayesian reconstruction of the in-medium heavy quark potential from lattice QCD and its stability

    Science.gov (United States)

    Burnier, Yannis; Kaczmarek, Olaf; Rothkopf, Alexander

    2016-01-01

    We report recent results of a non-perturbative determination of the static heavy-quark potential in quenched and dynamical lattice QCD at finite temperature. The real and imaginary part of this complex quantity are extracted from the spectral function of Wilson line correlators in Coulomb gauge. To obtain spectral information from Euclidean time numerical data, our study relies on a novel Bayesian prescription that differs from the Maximum Entropy Method. We perform simulations on quenched 32³ × Nτ (β = 7.0, ξ = 3.5) lattices with Nτ = 24, …, 96, which cover 839 MeV ≥ T ≥ 210 MeV. To investigate the potential in a quark-gluon plasma with light u, d and s quarks we utilize Nf = 2 + 1 ASQTAD lattices with ml = ms/20 by the HotQCD collaboration, giving access to temperatures between 286 MeV ≥ T ≥ 148 MeV. The real part of the potential exhibits a clean transition from a linear, confining behavior in the hadronic phase to a Debye screened form above deconfinement. Interestingly its values lie close to the color singlet free energies in Coulomb gauge at all temperatures. We estimate the imaginary part on quenched lattices and find that it is of the same order of magnitude as in hard-thermal loop perturbation theory. From among all the systematic checks carried out in our study, we discuss explicitly the dependence of the result on the default model and the number of datapoints.
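
    Schematically, the inversion at stake: the Euclidean correlator and the spectral function are related by $D(\tau)=\int_0^\infty d\omega\,\rho(\omega)\,e^{-\omega\tau}$, which is ill-posed for discrete, noisy $D(\tau)$; the Bayesian approach selects the $\rho$ that maximizes $P(\rho|D,I)\propto P(D|\rho)\,P(\rho|I)$, where the prior $P(\rho|I)$ encodes the default model, and it is the choice of this prior that distinguishes the novel prescription from the Maximum Entropy Method.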

  20. The Bayesian reconstruction of the in-medium heavy quark potential from lattice QCD and its stability

    International Nuclear Information System (INIS)

    We report recent results of a non-perturbative determination of the static heavy-quark potential in quenched and dynamical lattice QCD at finite temperature. The real and imaginary part of this complex quantity are extracted from the spectral function of Wilson line correlators in Coulomb gauge. To obtain spectral information from Euclidean time numerical data, our study relies on a novel Bayesian prescription that differs from the Maximum Entropy Method. We perform simulations on quenched 32³ × Nτ (β = 7.0, ξ = 3.5) lattices with Nτ = 24, …, 96, which cover 839 MeV ≥ T ≥ 210 MeV. To investigate the potential in a quark-gluon plasma with light u, d and s quarks we utilize Nf = 2 + 1 ASQTAD lattices with ml = ms/20 by the HotQCD collaboration, giving access to temperatures between 286 MeV ≥ T ≥ 148 MeV. The real part of the potential exhibits a clean transition from a linear, confining behavior in the hadronic phase to a Debye screened form above deconfinement. Interestingly its values lie close to the color singlet free energies in Coulomb gauge at all temperatures. We estimate the imaginary part on quenched lattices and find that it is of the same order of magnitude as in hard-thermal loop perturbation theory. From among all the systematic checks carried out in our study, we discuss explicitly the dependence of the result on the default model and the number of datapoints.

  1. Gene regulatory network reconstruction using Bayesian networks, the Dantzig Selector, the Lasso and their meta-analysis.

    Directory of Open Access Journals (Sweden)

    Matthieu Vignes

    Full Text Available Modern technologies and especially next generation sequencing facilities are giving cheaper access to genotype and genomic data measured on the same sample at once. This creates an ideal situation for multifactorial experiments designed to infer gene regulatory networks. The fifth "Dialogue for Reverse Engineering Assessments and Methods" (DREAM5) challenges are aimed at assessing methods and associated algorithms devoted to the inference of biological networks. Challenge 3 on "Systems Genetics" proposed to infer causal gene regulatory networks from different genetical genomics data sets. We investigated a wide panel of methods ranging from Bayesian networks to penalised linear regressions to analyse such data, and proposed a simple yet very powerful meta-analysis, which combines these inference methods. We present results of the Challenge as well as a more in-depth analysis of predicted networks in terms of structure and reliability. The developed meta-analysis was ranked first among the 16 teams participating in Challenge 3A. It paves the way for future extensions of our inference method and more accurate gene network estimates in the context of genetical genomics.
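
    A minimal sketch of the penalised-regression side of such an analysis: regress each gene on all others with an L1 penalty and read surviving coefficients as candidate regulatory edges (illustrative only, not the DREAM5 pipeline; the planted dependency and the alpha value are arbitrary):

      import numpy as np
      from sklearn.linear_model import Lasso

      rng = np.random.default_rng(0)
      n_samples, n_genes = 200, 10
      X = rng.standard_normal((n_samples, n_genes))
      # Plant one regulatory relationship: genes 0 and 1 drive gene 3.
      X[:, 3] = 0.8 * X[:, 0] - 0.6 * X[:, 1] + 0.1 * rng.standard_normal(n_samples)

      edges = set()
      for target in range(n_genes):
          mask = np.arange(n_genes) != target
          model = Lasso(alpha=0.1).fit(X[:, mask], X[:, target])
          for coef, reg in zip(model.coef_, np.arange(n_genes)[mask]):
              if abs(coef) > 1e-3:
                  edges.add((reg, target))

      # Recovers the links involving gene 3; note that plain regression does not
      # orient edges, which is one reason meta-analysis across methods helps.
      print(sorted(edges))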

  2. The Bayesian reconstruction of the in-medium heavy quark potential from lattice QCD and its stability

    CERN Document Server

    Burnier, Yannis; Rothkopf, Alexander

    2014-01-01

    We report recent results of a non-perturbative determination of the static heavy-quark potential in quenched and dynamical lattice QCD at finite temperature. The real and imaginary part of this complex quantity are extracted from the spectral function of Wilson line correlators in Coulomb gauge. To obtain spectral information from Euclidean time numerical data, our study relies on a novel Bayesian prescription that differs from the Maximum Entropy Method. We perform simulations on quenched $32^3\\times N_\\tau$ $(\\beta=7.0,\\xi=3.5)$ lattices with $N_\\tau=24,...,96$, which cover $839{\\rm MeV} \\geq T\\geq 210 {\\rm MeV}$. To investigate the potential in a quark-gluon plasma with light u,d and s quarks we utilize $N_f=2+1$ ASQTAD lattices with $m_l=m_s/20$ by the HotQCD collaboration, giving access to temperatures between $286 {\\rm MeV} \\geq T\\geq 148{\\rm MeV}$. The real part of the potential exhibits a clean transition from a linear, confining behavior in the hadronic phase to a Debye screened form above deconfinem...

  3. Recent Forest Disturbance History in the Greater Yellowstone Ecosystem Reconstructed using Remote Sensing and Management Record

    Science.gov (United States)

    Zhao, F.; Huang, C.; Zhu, Z.

    2014-12-01

    The Greater Yellowstone Ecosystem (GYE), located in the Central Rocky Mountains of the United States, has complex ecological and land-management histories along with different land ownerships. What are the effects of the different land management practices (such as those of national parks vs. national forests) on ecosystem disturbances and carbon balance? We present here the methods and results of a study of forest disturbance history over the GYE from 1984 to 2010, reconstructed from Landsat time series stacks and local management records. Annual forest fire, harvest and other disturbances were tracked and separated by integrating a model called the Vegetation Change Tracker and the Support Vector Machine algorithm. Local management records were separated into training and validation data for the disturbance maps. Area statistics and rates of disturbances were quantified and compared across GYE land ownerships over the multi-decade period and interpreted for implications of these changes for forest management and carbon analysis. Our results indicate that during the study interval (1984-2010), GYE National Parks (NPs) and Wilderness Areas (WA) had higher percentages of forest area disturbed compared to GYE National Forests (NF). Within the GYE NPs, over 45% of the forest lands were disturbed at least once during the study period, the majority (37%) by wildfire. For the GYE wilderness areas, the total disturbance was 30% of forest, with 19.4% by wildfire and 10.6% by other disturbances. In the Bridger-Teton NF, 14.7% of forest was disturbed, with 3.6%, 0.5% and 10.6% of forest disturbed by fire, harvest and other disturbances, respectively. For the Caribou-Targhee NF, 25% of total forest was disturbed during this time interval, with 1.5%, 6.4% and 17.1% of forest disturbed by fire, harvest and other disturbances, respectively.

  4. RECONSTRUCTING THE PHOTOMETRIC LIGHT CURVES OF EARTH AS A PLANET ALONG ITS HISTORY

    International Nuclear Information System (INIS)

    By utilizing satellite-based estimations of the distribution of clouds, we have studied Earth's large-scale cloudiness behavior according to latitude and surface types (ice, water, vegetation, and desert). These empirical relationships are used here to reconstruct the possible cloud distribution of historical epochs of Earth's history such as the Late Cretaceous (90 Ma ago), the Late Triassic (230 Ma ago), the Mississippian (340 Ma ago), and the Late Cambrian (500 Ma ago), when the landmass distributions were different from today's. With this information, we have been able to simulate the globally integrated photometric variability of the planet at these epochs. We find that our simple model reproduces well the observed cloud distribution and albedo variability of the modern Earth. Moreover, the model suggests that the photometric variability of the Earth was probably much larger in past epochs. This enhanced photometric variability could improve the chances for the difficult determination of the rotational period and the identification of continental landmasses for distant planets.

  5. Quantitative study on pollen-based reconstructions of vegetation history from central Canada

    Institute of Scientific and Technical Information of China (English)

    HART, Catherina; VETTER, Mary; SAUCHYN, David

    2008-01-01

    Based on high-resolution pollen records from lake cores in central Canada, the present study assigned pollen taxa to ecosystem groups, applied the modern analogue technique, reported major results of quantitative reconstructions of vegetation history during the last 1000 years, and discussed the validation of the simulated vegetation. The results showed that in central North America (115°-95°W, 40°-60°N), the best analogue of the modern vegetation is 81% for boreal forest, 72% for parkland, and 94% for grassland-parkland, which is consistent with the vegetation distributions of the North American Ecosystem II. Simulations of the past vegetation from the sedimentary pollen showed climate changes during the past 1000 years: it was warm and dry in the Medieval Warm Period, cold and wet in the earlier period of the Little Ice Age, and cold and dry in its later period. Warming and drying intensified markedly in the 20th century. These studies provide a scientific basis for understanding vegetation and climate changes during the last 1000 years in a characteristic region and at 10-100 year time scales.

  6. Reconstructing the complex evolutionary history of mobile plasmids in red algal genomes.

    Science.gov (United States)

    Lee, JunMo; Kim, Kyeong Mi; Yang, Eun Chan; Miller, Kathy Ann; Boo, Sung Min; Bhattacharya, Debashish; Yoon, Hwan Su

    2016-01-01

    The integration of foreign DNA into algal and plant plastid genomes is a rare event, with only a few known examples of horizontal gene transfer (HGT). Plasmids, which are well-studied drivers of HGT in prokaryotes, have been reported previously in red algae (Rhodophyta). However, the distribution of these mobile DNA elements and their sites of integration into the plastid (ptDNA), mitochondrial (mtDNA), and nuclear genomes of Rhodophyta remain unknown. Here we reconstructed the complex evolutionary history of plasmid-derived DNAs in red algae. Comparative analysis of 21 rhodophyte ptDNAs, including new genome data for 5 species, turned up 22 plasmid-derived open reading frames (ORFs) that showed syntenic and copy number variation among species, but were conserved within different individuals in three lineages. Several plasmid-derived homologs were found not only in ptDNA but also in mtDNA and in the nuclear genome of green plants, stramenopiles, and rhizarians. Phylogenetic and plasmid-derived ORF analyses showed that the majority of plasmid DNAs originated within red algae, whereas others were derived from cyanobacteria, other bacteria, and viruses. Our results elucidate the evolution of plasmid DNAs in red algae and suggest that they spread as parasitic genetic elements. This hypothesis is consistent with their sporadic distribution within Rhodophyta. PMID:27030297

  7. Bayesian phylogeography finds its roots.

    Directory of Open Access Journals (Sweden)

    Philippe Lemey

    2009-09-01

    Full Text Available As a key factor in endemic and epidemic dynamics, the geographical distribution of viruses has been frequently interpreted in the light of their genetic histories. Unfortunately, inference of historical dispersal or migration patterns of viruses has mainly been restricted to model-free heuristic approaches that provide little insight into the temporal setting of the spatial dynamics. The introduction of probabilistic models of evolution, however, offers unique opportunities to engage in this statistical endeavor. Here we introduce a Bayesian framework for inference, visualization and hypothesis testing of phylogeographic history. By implementing character mapping in a Bayesian software that samples time-scaled phylogenies, we enable the reconstruction of timed viral dispersal patterns while accommodating phylogenetic uncertainty. Standard Markov model inference is extended with a stochastic search variable selection procedure that identifies the parsimonious descriptions of the diffusion process. In addition, we propose priors that can incorporate geographical sampling distributions or characterize alternative hypotheses about the spatial dynamics. To visualize the spatial and temporal information, we summarize inferences using virtual globe software. We describe how Bayesian phylogeography compares with previous parsimony analysis in the investigation of the influenza A H5N1 origin and H5N1 epidemiological linkage among sampling localities. Analysis of rabies in West African dog populations reveals how virus diffusion may enable endemic maintenance through continuous epidemic cycles. From these analyses, we conclude that our phylogeographic framework will be an important asset in molecular epidemiology that can be easily generalized to infer biogeography from genetic data for many organisms.

  8. The Bayesian reconstruction of the in-medium heavy quark potential from lattice QCD and its stability

    Energy Technology Data Exchange (ETDEWEB)

    Burnier, Yannis [Institut de Théorie des Phénomènes Physiques, Ecole Polytechnique Fédérale de Lausanne, CH-1015, Lausanne (Switzerland); Kaczmarek, Olaf [Fakultät für Physik, Universität Bielefeld, D-33615 Bielefeld (Germany); Rothkopf, Alexander [Institute for Theoretical Physics, Heidelberg University, Philosophenweg 16, D-69120 Heidelberg (Germany)

    2016-01-22

    We report recent results of a non-perturbative determination of the static heavy-quark potential in quenched and dynamical lattice QCD at finite temperature. The real and imaginary part of this complex quantity are extracted from the spectral function of Wilson line correlators in Coulomb gauge. To obtain spectral information from numerical Euclidean-time data, our study relies on a novel Bayesian prescription that differs from the Maximum Entropy Method. We perform simulations on quenched 32{sup 3} × N{sub τ} (β = 7.0, ξ = 3.5) lattices with N{sub τ} = 24, …, 96, which cover 839 MeV ≥ T ≥ 210 MeV. To investigate the potential in a quark-gluon plasma with light u, d and s quarks we utilize N{sub f} = 2 + 1 ASQTAD lattices with m{sub l} = m{sub s}/20 by the HotQCD collaboration, giving access to temperatures 286 MeV ≥ T ≥ 148 MeV. The real part of the potential exhibits a clean transition from a linear, confining behavior in the hadronic phase to a Debye-screened form above deconfinement. Interestingly, its values lie close to the color singlet free energies in Coulomb gauge at all temperatures. We estimate the imaginary part on quenched lattices and find that it is of the same order of magnitude as in hard-thermal-loop perturbation theory. From among all the systematic checks carried out in our study, we discuss explicitly the dependence of the result on the default model and the number of datapoints.
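
    The inverse problem behind this extraction is generic: an Euclidean correlator D(τ) = ∫ dω K(τ, ω) ρ(ω) must be inverted for the spectral function ρ(ω). The Python sketch below illustrates the structure of such a Bayesian reconstruction with a prior of the Burnier-Rothkopf type; the mock data, kernel discretization, default model m(ω) and hyperparameter α are all illustrative assumptions, not the paper's setup:

```python
# Hedged sketch of Bayesian spectral reconstruction: recover rho(omega) from
# correlator data by maximizing a posterior with a non-MEM (BR-type) prior.
import numpy as np
from scipy.optimize import minimize

omega = np.linspace(0.1, 10.0, 60); dw = omega[1] - omega[0]
tau = np.linspace(0.05, 1.0, 20)
K = np.exp(-np.outer(tau, omega))                # kernel K(tau, omega)

rho_true = np.exp(-0.5 * ((omega - 3.0) / 0.4) ** 2) + 0.1   # mock spectrum
rng = np.random.default_rng(0)
D = K @ rho_true * dw
sigma = 1e-3 * D
D_noisy = D + rng.normal(0.0, sigma)

m = np.full_like(omega, 0.5)                     # default model (assumed)
alpha = 1.0                                      # prior weight (hyperparameter)

def neg_posterior(x):
    rho = np.exp(x)                              # positivity via log parameters
    L = 0.5 * np.sum(((K @ rho * dw - D_noisy) / sigma) ** 2)
    S = np.sum(1.0 - rho / m + np.log(rho / m)) * dw   # BR-type prior (<= 0)
    return L - alpha * S

res = minimize(neg_posterior, np.log(m), method="L-BFGS-B",
               options={"maxiter": 200})
rho_rec = np.exp(res.x)
print("residual chi^2:",
      0.5 * np.sum(((K @ rho_rec * dw - D_noisy) / sigma) ** 2))
```

    Varying m and α, and repeating the fit with fewer data points, mimics the default-model and datapoint-number checks the abstract mentions.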

  9. Bayesian probabilistic sensitivity analysis of Markov models for natural history of a disease: an application for cervical cancer

    Directory of Open Access Journals (Sweden)

    Giulia Carreras

    2012-09-01

    Full Text Available

    Background: parameter uncertainty in the Markov model’s description of a disease course was addressed. Probabilistic sensitivity analysis (PSA) is now considered the only tool that properly permits the examination of parameter uncertainty. It consists of sampling values from the parameters’ probability distributions.

    Methods: Markov models fitted by microsimulation were considered, and methods for carrying out a PSA on transition probabilities were studied. Two Bayesian solutions were developed: for each row of the modeled transition matrix, the prior distribution was assumed to be either a product of Beta distributions or a single Dirichlet distribution. The two solutions differ in the source of information: several different sources for each transition in the Beta approach, and a single source for all transitions from a given health state in the Dirichlet approach. The two methods were applied to a simple model of cervical cancer.

    Results: differences between posterior estimates from the two methods were negligible. Results showed that the prior variability strongly influences the posterior distribution.

    Conclusions: the novelty of this work is the Bayesian approach that combines the two prior distributions with a likelihood given by a product of Binomial distributions. Such methods could also be applied to cohort data, and their application to more complex models could be useful and unique in the cervical cancer context, as well as in other disease modeling.
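
    Because the Dirichlet prior is conjugate to the product-of-Binomials (multinomial) likelihood, one PSA iteration reduces to drawing each row of the transition matrix from its Dirichlet posterior. A minimal Python sketch of that step, with invented counts and a flat prior (the paper's priors and data differ):

```python
# Dirichlet-variant PSA draw: each row of the transition matrix is sampled
# from Dirichlet(prior + counts); counts and prior weights are illustrative.
import numpy as np

rng = np.random.default_rng(42)

# rows: from-state; columns: to-state (e.g. healthy, pre-cancerous, cancer)
counts = np.array([[900,  90,  10],
                   [ 60, 420,  20],
                   [  0,   0, 100]])
prior = np.ones_like(counts)                 # flat Dirichlet prior (assumed)

def sample_transition_matrix():
    """One PSA draw: row i ~ Dirichlet(prior[i] + counts[i]), the conjugate
    posterior under a product-of-Binomials likelihood."""
    return np.vstack([rng.dirichlet(prior[i] + counts[i])
                      for i in range(counts.shape[0])])

# e.g. 1000 PSA iterations, each feeding a microsimulation of the Markov model
draws = np.array([sample_transition_matrix() for _ in range(1000)])
print("posterior mean transition matrix:\n", draws.mean(axis=0).round(3))
```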

  10. Significance of "stretched" mineral inclusions for reconstructing P-T exhumation history

    Science.gov (United States)

    Ashley, Kyle T.; Darling, Robert S.; Bodnar, Robert J.; Law, Richard D.

    2015-06-01

    Analysis of mineral inclusions in chemically and physically resistant hosts has proven to be valuable for reconstructing the P-T exhumation history of high-grade metamorphic rocks. The occurrence of cristobalite-bearing inclusions in garnets from Gore Mountain, New York, is unexpected because the peak metamorphic conditions reached are well removed (>600 °C too cold) from the stability field of this low-density silica polymorph that typically forms in high-temperature volcanic environments. A previous study of samples from this area interpreted polymineralic inclusions consisting of cristobalite, albite and ilmenite as representing crystallized droplets of melt generated during a garnet-in reaction, followed by water loss from the inclusion to explain the reduction in inclusion pressure that drove the transformation of quartz to cristobalite. However, the recent discovery of monomineralic inclusions of cristobalite from the nearby Hooper Mine cannot be explained by this process. For these inclusions, we propose that the volume response to pressure and temperature changes during exhumation to Earth's surface resulted in large tensile stresses within the silica phase that would be sufficient to cause transformation to the low-density (low-pressure) form. Elastic modeling of other common inclusion-host systems suggests that this quartz-to-cristobalite example may not be a unique case. The aluminosilicate polymorph kyanite also has the capacity to retain tensile stresses if exhumed to Earth's surface after being trapped as an inclusion in plagioclase at P-T conditions within the kyanite stability field, with the stresses developed during exhumation sufficient to produce a transformation to andalusite. These results highlight the elastic environment that may arise during exhumation and provide a potential explanation of observed inclusions whose stability fields are well removed from P-T paths followed during exhumation.

  11. Reconstructing the Solar Wind from Its Early History to Current Epoch

    Science.gov (United States)

    Airapetian, Vladimir S.; Usmanov, Arcadi V.

    2016-02-01

    Stellar winds from active solar-type stars can play a crucial role in the removal of stellar angular momentum and the erosion of planetary atmospheres. However, major wind properties except for mass-loss rates cannot be directly derived from observations. We employed a three-dimensional magnetohydrodynamic Alfvén wave driven solar wind model, ALF3D, to reconstruct the solar wind parameters including the mass-loss rate, terminal velocity, and wind temperature at 0.7, 2, and 4.65 Gyr. Our model treats the wind thermal electrons, protons, and pickup protons as separate fluids and incorporates turbulence transport, eddy viscosity, turbulent resistivity, and turbulent heating to properly describe proton and electron temperatures of the solar wind. To study the evolution of the solar wind, we specified three input model parameters, the plasma density, Alfvén wave amplitude, and the strength of the dipole magnetic field at the wind base for each of three solar wind evolution models that are consistent with observational constraints. Our model results show that in the Sun's early history, at 0.7 Gyr, the solar wind at 1 AU was twice as fast, ∼50 times denser and 2 times hotter than at the current epoch. The theoretical calculations of mass-loss rate appear to be in agreement with the empirically derived values for stars of various ages. These results can provide realistic constraints for wind dynamic pressures on magnetospheres of (exo)planets around the young Sun and other active stars, which is crucial in realistic assessment of the Joule heating of their ionospheres and corresponding effects of atmospheric erosion.

  12. Reconstructing the history of Mesoamerican populations through the study of the mitochondrial DNA control region.

    Directory of Open Access Journals (Sweden)

    Amaya Gorostiza

    Full Text Available The study of genetic information can reveal a reconstruction of human populations' history. We sequenced the entire mtDNA control region (positions 16024 to 576, following the Cambridge Reference Sequence, CRS) of 605 individuals from seven previously defined Mesoamerican indigenous groups and one Aridoamerican group from the Greater Southwest, all of them in present-day Mexico. Samples were collected directly from the indigenous populations; the application of an individual survey made it possible to exclude samples from related individuals or those with other origins. Diversity indices and demographic estimates were calculated, and AMOVAs were computed according to different criteria. An MDS plot, based on FST distances, was also built. We constructed individual networks for the four Amerindian haplogroups detected. Finally, barrier software was applied to detect genetic boundaries among populations. The results suggest a common origin of the indigenous groups, a small degree of European admixture, and inter-ethnic gene flow. The process of Mesoamerica's human settlement took place quickly, influenced by the region's orography, which facilitated the development of genetic and cultural differences. We find that the existing genetic structure is related to the region's geography rather than to cultural parameters such as language. The human population gradually became fragmented, though they remained relatively isolated, and differentiated due to small population sizes and different survival strategies. Genetic differences were detected between Aridoamerica and Mesoamerica, which can be subdivided into "East", "Center", "West" and "Southeast". The fragmentation process occurred mainly during the Mesoamerican Pre-Classic period, with the Otomí being one of the oldest groups. With an increased number of populations studied adding previously published data, there is no change in the conclusions, although significant genetic heterogeneity can be detected in Pima and

  13. Reconstructing the geological history of the Egyptian Nile. Aswan - Kom Ombo phase

    International Nuclear Information System (INIS)

    Complete text of publication follows. The Nile is the longest river in the world, stretching north for approximately 4,000 miles from East Africa to the Mediterranean. Over the past several million years the Nile has gradually changed its location and size. Reconstructing the geological history of the Nile, and identifying the location of abandoned and now buried paleo-channels and deltas, is an essential step in constructing land use maps. An initial study area between Aswan and Kom Ombo, Egypt, was selected for a geologic and geophysical field survey supported with interpretation of Landsat TM, ASTER and radar SIR-C/X-SAR images. Simultaneously, gravity and magnetotelluric data were acquired along two traverses: one following Wadi Abu Subbaira, east of the Nile, and the other crossing the Wadi Kubania pre-Nile drainage system, to the west. Gravity data were collected using a Scintrex CG-5 gravimeter and a differential GPS, whereas the magnetotelluric data were collected using a controlled-source audio-magnetotelluric Stratagem system. Integration of geologic field mapping, geophysical investigations, and interpretation of different types of remote sensing images was used to construct an improved geological and structural map of the study area. The constructed map reveals: 1) this area is strongly controlled by NE-SW, NW-SE and N-S trending basement structures, largely faults; 2) the evolution of these distinct fault sets was largely controlled by the Red Sea tectonics which started ∼22 Ma ago; separation of the Arabian plate from the African plate provided NE-SW extension which subsequently resulted in the development of deep NW-SE grabens (e.g. the Kubania graben) where strain was closely localized because of the presence of older NW-SE trending Precambrian structures; 3) there is a prominent pre-Nile drainage system dominated by W- and NW-trending drainages emerging from the uplifted Red Sea Hills prior to opening of the Red Sea; 4) Wadi Abu Subbaira

  14. Reconstructing the tectonic history of Fennoscandia from its margins: The past 100 million years

    Energy Technology Data Exchange (ETDEWEB)

    Muir Wood, R. [EQE International Ltd (United Kingdom)]

    1995-12-01

    In the absence of onland late Mesozoic and Cenozoic geological formations, the tectonic history of the Baltic Shield over the past 100 million years can be reconstructed from the thick sedimentary basins that surround Fennoscandia on three sides. Tectonic activity around Fennoscandia through this period has been diverse but can be divided into four main periods: a. pre North Atlantic spreading ridge (100-60 Ma), when transpressional deformation on the southern margins of Fennoscandia and transtensional activity to the west was associated with a NNE-SSW maximum compressive stress direction; b. the creation of the spreading ridge (60-45 Ma), when there was rifting along the western margin; c. the re-arrangement of spreading axes (45-25 Ma), when there was radial compression around Fennoscandia; and d. the re-emergence of the Iceland hot-spot (25-0 Ma), when the stress-field has come to accord with ridge or plume 'push'. Since 60 Ma the Alpine plate boundary has had little influence on Fennoscandia. The highest levels of deformation on the margins of Fennoscandia were achieved around 85 Ma and 60-55 Ma, with strain-rates around 10{sup -9}/year. Within the Baltic Shield long-term strain rates have been around 10{sup -11}/year, with little evidence for significant deformations passing into the shield from the margins. Fennoscandian Border Zone activity, which was prominent from 90-60 Ma, was largely abandoned following the creation of the Norwegian Sea spreading ridge, and with the exception of the Lofoten margin, there is subsequently little evidence for deformation passing into Fennoscandia. Renewal of modest compressional deformation in the Voering Basin suggests that the 'Current Tectonic Regime' is of Quaternary age, although the orientation of the major stress axis has remained consistent since around 10 Ma. The past pattern of changes suggests that in the geological near-future, variations are to be anticipated in the magnitude rather than the orientation of stresses.

  15. Reconstructing the tectonic history of Fennoscandia from its margins: The past 100 million years

    International Nuclear Information System (INIS)

    In the absence of onland late Mesozoic and Cenozoic geological formations, the tectonic history of the Baltic Shield over the past 100 million years can most readily be reconstructed from the thick sedimentary basins that surround Fennoscandia on three sides. Tectonic activity around Fennoscandia through this period has been diverse but can be divided into four main periods: a. pre North Atlantic spreading ridge (100-60 Ma), when transpressional deformation on the southern margins of Fennoscandia and transtensional activity to the west was associated with a NNE-SSW maximum compressive stress direction; b. the creation of the spreading ridge (60-45 Ma), when there was rifting along the western margin; c. the re-arrangement of spreading axes (45-25 Ma), when there was radial compression around Fennoscandia; and d. the re-emergence of the Iceland hot-spot (25-0 Ma), when the stress-field has come to accord with ridge or plume 'push'. Since 60 Ma the Alpine plate boundary has had little influence on Fennoscandia. The highest levels of deformation on the margins of Fennoscandia were achieved around 85 Ma and 60-55 Ma, with strain-rates around 10^-9/year. Within the Baltic Shield long-term strain rates have been around 10^-11/year, with little evidence for significant deformations passing into the shield from the margins. Fennoscandian Border Zone activity, which was prominent from 90-60 Ma, was largely abandoned following the creation of the Norwegian Sea spreading ridge, and with the exception of the Lofoten margin, there is subsequently very little evidence for deformation passing into Fennoscandia. Renewal of modest compressional deformation in the Voering Basin suggests that the 'Current Tectonic Regime' is of Quaternary age, although the orientation of the major stress axis has remained approximately consistent since around 10 Ma. The past pattern of changes suggests that in the geological near-future variations are to be anticipated in the magnitude rather than

  16. A unique opportunity to reconstruct the volcanic history of the island of Nevis, Lesser Antilles

    Science.gov (United States)

    Saginor, I.; Gazel, E.

    2012-12-01

    We report twelve new ICP-MS analyses and two 40Ar/39Ar ages for the Caribbean island of Nevis, located in the Lesser Antilles. These data show a very strong fractionation trend, suggesting that along-strike variations may be primarily controlled by the interaction of rising magma with the upper plate. If this fractionation trend is shown to correlate with age, it may suggest that underplating of the crust is responsible for variations in the makeup of erupted lava over time, particularly with respect to silica content. We have recently been given permission to sample a series of cores being drilled by a geothermal company with the goal of reconstructing the volcanic history of the island. Drilling is often cost-prohibitive, making this a truly unique opportunity. Nevis has received little recent attention from researchers because it has not been active for at least 100,000 years and because of its proximity to the highly active Montserrat, which boasts its very own volcano observatory. However, there are a number of good reasons that make this region, and Nevis in particular, an ideal location for further analysis. First, and most importantly, is the access to thousands of meters of drill cores provided by a local geothermal company. Second, a robust earthquake catalog exists (Bengoubou-Valerius et al., 2008), so the dip and depth to the subducting slab are well known. These are fundamental parameters that influence the mechanics of a subduction zone; it would therefore be difficult to proceed if they were poorly constrained. Third, prior sampling of Nevis has been limited since Hutton and Nockolds (1978) published the only extensive petrologic study ever performed on the island. This paper contained only 43 geochemical analyses and 6 K-Ar ages, which are less reliable than more modern Ar-Ar ages. Subsequent studies tended to focus on water geochemistry (GeothermEx, 2005), geothermal potential (Geotermica Italiana, 1992; Huttrer, 1998

  17. Inverse method for reconstruction of ground surface temperature history from borehole temperatures

    Czech Academy of Sciences Publication Activity Database

    Šafanda, Jan; Correia, A.; Majorowicz, J.; Rajver, D.

    Matsuyama, 2003 - (Yamano, M.; Nagao, T.; Sweda, T.), s. 163-178 [Geothermal/dendrochronological paleoclimate reconstruction across eastern margin of Eurasia. Matsuyama (JP), 28.11.2002-30.11.2002] R&D Projects: GA AV ČR IAA3012005 Institutional research plan: CEZ:AV0Z3012916 Keywords: borehole temperature profiles * inversion method * climate reconstruction Subject RIV: DC - Seismology, Volcanology, Earth Structure

  18. Holocene local forest history at two sites in Småland, southern Sweden - insights from quantitative reconstructions using the Landscape Reconstruction Algorithm

    Science.gov (United States)

    Cui, Qiaoyu; Gaillard, Marie-José; Lemdahl, Geoffrey; Olsson, Fredrik; Sugita, Shinya

    2010-05-01

    Quantitative reconstruction of past vegetation from fossil pollen has long been problematic. It is well known that pollen percentages and pollen accumulation rates do not represent vegetation abundance properly, because pollen values are influenced by many factors, of which inter-taxonomic differences in pollen productivity and vegetation structure are the most important. It is also recognized that pollen assemblages from large sites (lakes or bogs) record the characteristics of the regional vegetation, while pollen assemblages from small sites record local features. Based on the theoretical understanding of the factors and mechanisms that affect pollen representation of vegetation, Sugita (2007a and b) proposed the Landscape Reconstruction Algorithm (LRA) to estimate vegetation abundance in percentage cover at well-defined spatial scales. The LRA includes two models, REVEALS and LOVE. REVEALS estimates regional vegetation abundance at a spatial scale of 100 km x 100 km. LOVE estimates local vegetation abundance at the spatial scale of the relevant source area of pollen (RSAP sensu Sugita 1993) of the pollen site. REVEALS estimates are needed to apply LOVE in order to calculate the RSAP and the vegetation cover within the RSAP. The two models have been validated theoretically and empirically. Two small bogs in southern Sweden were studied for pollen, plant macrofossils, charcoal, and Coleoptera in order to reconstruct the local Holocene forest and fire history (e.g. Greisman and Gaillard 2009; Olsson et al. 2009). We applied the LOVE model in order to 1) compare the LOVE estimates with pollen percentages for a better understanding of the local forest history; 2) obtain more precise information on the local vegetation to explain between-site differences in fire history. We used pollen records from two large lakes in Småland to obtain REVEALS estimates for twelve continuous 500-yr time windows. Following the strategy of the Swedish VR LANDCLIM project (see Gaillard
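
    At its core, the REVEALS step corrects raw pollen counts for taxon-specific pollen productivity before normalizing to percentage cover. The Python sketch below shows only that correction idea, with invented taxa, counts and productivity estimates, and omits the dispersal-deposition terms that the full REVEALS and LOVE models include:

```python
# Very simplified REVEALS-style correction (illustrative only): divide each
# taxon's pollen count by its relative pollen productivity, then normalize.
def reveals_simplified(pollen_counts, ppe):
    corrected = {t: pollen_counts[t] / ppe[t] for t in pollen_counts}
    total = sum(corrected.values())
    return {t: c / total for t, c in corrected.items()}

# Invented counts and productivity estimates (relative to Poaceae = 1.0)
pollen = {"Pinus": 420, "Betula": 310, "Quercus": 90, "Poaceae": 180}
ppe = {"Pinus": 6.4, "Betula": 8.9, "Quercus": 5.8, "Poaceae": 1.0}

cover = reveals_simplified(pollen, ppe)
for taxon, frac in sorted(cover.items(), key=lambda kv: -kv[1]):
    print(f"{taxon:8s} {frac:.2%}")   # estimated vegetation cover fractions
```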

  19. Reconstructing Iconic Experiments in Electrochemistry: Experiences from a History of Science Course

    Science.gov (United States)

    Eggen, Per-Odd; Kvittingen, Lise; Lykknes, Annette; Wittje, Roland

    2012-01-01

    The decomposition of water by electricity, and the voltaic pile as a means of generating electricity, have both held an iconic status in the history of science as well as in the history of science teaching. These experiments featured in chemistry and physics textbooks, as well as in classroom teaching, throughout the nineteenth and twentieth…

  20. Bayesian deterministic decision making: A normative account of the operant matching law and heavy-tailed reward history dependency of choices

    Directory of Open Access Journals (Sweden)

    Hiroshi Saito

    2014-03-01

    Full Text Available The decision-making behaviors of humans and animals adapt and then satisfy an "operant matching law" in certain types of tasks. This was first pointed out by Herrnstein in his foraging experiments on pigeons. The matching law has been a landmark for elucidating the underlying processes of decision making and its learning in the brain. An interesting question is whether decisions are made deterministically or probabilistically. Conventional learning models of the matching law are based on the latter idea; they assume that subjects learn choice probabilities of the respective alternatives and decide stochastically according to those probabilities. However, it is unknown whether the matching law can be accounted for by a deterministic strategy. To answer this question, we propose several deterministic Bayesian decision-making models that hold certain incorrect beliefs about the environment. We claim that a simple model produces behavior satisfying the matching law in static settings of a foraging task but not in dynamic settings. We found that a model holding the belief that the environment is volatile works well in the dynamic foraging task and exhibits undermatching, a slight deviation from the matching law observed in many experiments. This model also demonstrates the double-exponential reward-history dependency of choice and a heavier-tailed run-length distribution, as has recently been reported in experiments on monkeys.
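
    To make the idea concrete, here is a toy Python sketch in the spirit of (but not identical to) the authors' models: a deterministic agent keeps Beta posteriors over two baited alternatives, discounts old evidence because it believes the world is volatile, and always chooses the posterior-mean maximizer. Comparing choice and income fractions probes the matching relation; the task parameters and decay rate are invented:

```python
# Toy deterministic Bayesian forager on a baited two-alternative task.
import numpy as np

rng = np.random.default_rng(1)
p_bait = np.array([0.15, 0.05])       # per-step baiting probabilities (assumed)
gamma = 0.99                          # evidence decay: belief in a volatile world
a = np.ones(2); b = np.ones(2)        # Beta posterior parameters per alternative

T = 50000
choices = np.zeros(T, dtype=int); rewards = np.zeros(T)
baited = np.zeros(2, dtype=bool)
for t in range(T):
    baited |= rng.random(2) < p_bait          # reward persists until collected
    c = int(np.argmax(a / (a + b)))           # deterministic greedy choice
    r = float(baited[c]); baited[c] = False
    a *= gamma; b *= gamma                    # forget old evidence
    a[c] += r; b[c] += 1.0 - r                # Bayesian update for the chosen arm
    choices[t] = c; rewards[t] = r

# Under matching, each arm's choice fraction tracks its income fraction.
for arm in (0, 1):
    frac_choice = (choices == arm).mean()
    frac_income = rewards[choices == arm].sum() / rewards.sum()
    print(f"arm {arm}: choice fraction {frac_choice:.2f}, "
          f"income fraction {frac_income:.2f}")
```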

  1. Reconstruction of bomb 14C time history recorded in the stalagmite from Postojna Cave

    International Nuclear Information System (INIS)

    Karstic caves provide valuable archives for reconstructing past environmental conditions on the continent. This is possible due to the great stability of climatic conditions within a cave. Secondary minerals deposited in caves, known as speleothems, preserve records of long-term climatic and environmental changes at the site of their deposition and in its vicinity. The purity of speleothems and their chemical and physical stability make them exceptionally well suited for detailed geochemical and isotopic analysis

  2. Between history and cultural psychology: Some reflections on mediation and normativity when reconstructing the past

    DEFF Research Database (Denmark)

    Brescó, Ignacio

    2016-01-01

    Innis’ and Brinkmann’s papers (this issue) tackle two key aspects in cultural psychology: the mediating role played by the different systems of meanings throughout history in making sense of the world, and the normative role of those systems, including psychology itself. This paper offers a … reflection on these two issues. It begins by highlighting the contribution of psychology and history, as emerging disciplines in the 19th Century, to the creation of a normative framework for the subject of modernity according to the needs of modern nation states. It also alludes to both disciplines’ common … pursuit of a reference point in natural science in order to achieve a status that is on a par with the latter’s. It is argued that this resulted in an objectivist stance that equates the study of memory and history with an accurate reproduction of the past, thus concealing the mediated nature of past…

  3. Reconstructions of subducted ocean floor along the Andes: a framework for assessing Magmatic and Ore Deposit History

    Science.gov (United States)

    Sdrolias, M.; Müller, R.

    2006-05-01

    The South American-Antarctic margin has been characterised by numerous episodes of volcanic arc activity and ore deposit formation throughout much of the Mesozoic and Cenozoic. Although its Cenozoic subduction history is relatively well known, placing the Mesozoic arc-related volcanics and the emplacement of ore bodies in their plate tectonic context remains poorly constrained. We use a merged moving hotspot (Late Cretaceous-present) and palaeomagnetic/fixed hotspot (Early Cretaceous) reference frame, coupled with reconstructed spreading histories of the Pacific, Phoenix and Farallon plates, to understand the convergence history of the South American and Antarctic margins. We compute the age-area distribution of oceanic lithosphere through time, including subducting oceanic lithosphere, and estimate convergence rates along the margin. Additionally, we map the location and migration of spreading ridges along the margin and relate this to processes on the overriding plate. The South American-Antarctic margin in the late Jurassic-early Cretaceous was dominated by rapid convergence, the subduction of relatively young oceanic lithosphere, and the opening of the "Rocas Verdes" back-arc basins in southern South America. The speed of subduction increased again along the South American-Antarctic margin at ~105 Ma after another change in tectonic regime. Newly created crust from the Farallon-Phoenix ridge continued to be subducted along southern South America until the cessation of the Farallon-Phoenix ridge in the latest Cretaceous / beginning of the Cenozoic. The age of the subducting oceanic lithosphere along the South American-Antarctic margin has increased steadily through time.

  4. Reconstruction of exposure histories of meteorites from Antarctica and the Sahara

    International Nuclear Information System (INIS)

    10Be, 14C, and 26Al were analyzed in H-, L-, and LL-chondrites from the Acfer region in the Algerian Sahara and from the Allan Hills/Antarctica. Exposure histories and terrestrial ages could be determined. (author) 3 figs., 2 refs

  5. Reconstruction of burial history, temperature, source rock maturity and hydrocarbon generation in the northwestern Dutch offshore

    NARCIS (Netherlands)

    Abdul Fattah, R.; Verweij, J.M.; Witmans, N.; Veen, J.H. ten

    2012-01-01

    3D basin modelling is used to investigate the history of maturation and hydrocarbon generation on the main platforms in the northwestern part of the offshore area of the Netherlands. The study area covers the Cleaverbank and Elbow Spit Platforms. Recently compiled maps and data are used to build the

  6. Reconstructing Iconic Experiments in Electrochemistry: Experiences from a History of Science Course

    Science.gov (United States)

    Eggen, Per-Odd; Kvittingen, Lise; Lykknes, Annette; Wittje, Roland

    2011-04-01

    The decomposition of water by electricity, and the voltaic pile as a means of generating electricity, have both held an iconic status in the history of science as well as in the history of science teaching. These experiments featured in chemistry and physics textbooks, as well as in classroom teaching, throughout the nineteenth and twentieth centuries. This paper deals with our experiences in restaging the decomposition of water as part of a history of science course at the Norwegian University of Science and Technology, Trondheim, Norway. For the experiment we used an apparatus from our historical teaching collection and built a replica of a voltaic pile. We also traced the uses and meanings of decomposition of water within science and science teaching in schools and higher education in local institutions. Building the pile, and carrying out the experiments, held a few surprises that we did not anticipate through our study of written sources. The exercise gave us valuable insight into the nature of the devices and the experiment, and our students appreciated an experience of a different kind in a history of science course.

  7. Reconstruction of exposure histories of meteorites from Antarctica and the Sahara

    Energy Technology Data Exchange (ETDEWEB)

    Neupert, U.; Neumann, S.; Leya, I.; Michel, R. [Hannover Univ. (Germany). Zentraleinrichtung fuer Strahlenschutz (ZfS); Kubik, P.W. [Paul Scherrer Inst. (PSI), Villigen (Switzerland); Bonani, G.; Hajdas, I.; Suter, M. [Eidgenoessische Technische Hochschule, Zurich (Switzerland)

    1997-09-01

    {sup 10}Be, {sup 14}C, and {sup 26}Al were analyzed in H-, L-, and LL-chondrites from the Acfer region in the Algerian Sahara and from the Allan Hills/Antarctica. Exposure histories and terrestrial ages could be determined. (author) 3 figs., 2 refs.

  8. Bayesian biostatistics

    CERN Document Server

    Lesaffre, Emmanuel

    2012-01-01

    The growth of biostatistics has been phenomenal in recent years and has been marked by considerable technical innovation in both methodology and computational practicality. One area that has experienced significant growth is Bayesian methods. The growing use of Bayesian methodology has taken place partly due to an increasing number of practitioners valuing the Bayesian paradigm as matching that of scientific discovery. In addition, computational advances have allowed for more complex models to be fitted routinely to realistic data sets. Through examples, exercises and a combination of introd

  9. Bayesian statistics

    OpenAIRE

    Draper, D.

    2001-01-01

    Article Outline: Glossary; Definition of the Subject and Introduction; The Bayesian Statistical Paradigm; Three Examples; Comparison with the Frequentist Statistical Paradigm; Future Directions; Bibliography

  10. Cosmology with hybrid expansion law: scalar field reconstruction of cosmic history and observational constraints

    International Nuclear Information System (INIS)

    In this paper, we consider a simple form of the expansion history of the Universe referred to as the hybrid expansion law - a product of power-law and exponential functions. By construction, the ansatz mimics the power-law and de Sitter cosmologies as special cases, but it also provides an elegant description of the transition from deceleration to cosmic acceleration. We point out the Brans-Dicke realization of the cosmic history under consideration. We construct potentials for quintessence, phantom and tachyon fields, which can give rise to the hybrid expansion law in general relativity. We investigate observational constraints on the model with the hybrid expansion law applied to late-time acceleration as well as to the early Universe a la nucleosynthesis
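
    For concreteness, a common way to write such a law (notation assumed here, not necessarily the paper's) and the kinematic quantities that follow from it:

```latex
% Hybrid expansion law: power-law times exponential
a(t) = a_0 \left(\frac{t}{t_0}\right)^{\alpha}
       \exp\!\left[\beta\left(\frac{t}{t_0}-1\right)\right],
\qquad \alpha, \beta \ge 0,
% with Hubble rate and deceleration parameter
H(t) = \frac{\dot a}{a} = \frac{\alpha}{t} + \frac{\beta}{t_0},
\qquad
q(t) = -1 - \frac{\dot H}{H^2}
     = -1 + \frac{\alpha\, t_0^{2}}{(\alpha\, t_0 + \beta\, t)^{2}}.
```

    Setting β = 0 recovers the power law (q = 1/α − 1) and α = 0 the de Sitter case (q = −1); for α, β > 0 the deceleration parameter decreases monotonically toward −1, realizing the transition from deceleration to acceleration.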

  11. Merged or monolithic? Using machine-learning to reconstruct the dynamical history of simulated star clusters

    OpenAIRE

    Pasquato, Mario; Chung, Chul

    2016-01-01

    Context. Machine-Learning (ML) solves problems by learning patterns from data, with limited or no human guidance. In Astronomy, it is mainly applied to large observational datasets, e.g. for morphological galaxy classification. Aims. We apply ML to gravitational N-body simulations of star clusters that are either formed by merging two progenitors or evolved in isolation, planning to later identify Globular Clusters (GCs) that may have a history of merging from observational data. Methods. We ...

  12. Reconstructing the star formation history of the Milky Way disc(s) from chemical abundances

    CERN Document Server

    Snaith, O; Di Matteo, P; Lehnert, M D; Combes, F; Katz, D; Gómez, A

    2014-01-01

    We develop a chemical evolution model in order to study the star formation history of the Milky Way. Our model assumes that the Milky Way is formed from a closed box-like system in the inner regions, while the outer parts of the disc experience some accretion. Unlike the usual procedure, we do not fix the star formation prescription (e.g. Kennicutt law) in order to reproduce the chemical abundance trends. Instead, we fit the abundance trends with age in order to recover the star formation history of the Galaxy. Our method enables one to recover with unprecedented accuracy the star formation history of the Milky Way in the first Gyrs, in both the inner (R < 7-8 kpc) and outer (R > 9-10 kpc) discs as sampled in the solar vicinity. We show that, in the inner disc, half of the stellar mass formed during the thick disc phase, in the first 4-5 Gyr. This phase was followed by a significant dip in the star formation activity (at 8-9 Gyr) and a period of roughly constant lower level star formation for the remaining 8 Gyr. The thick disc phase ha...

  13. Probing the Expansion history of the Universe by Model-Independent Reconstruction from Supernovae and Gamma-Ray Bursts Measurements

    CERN Document Server

    Feng, Chao-Jun

    2016-01-01

    To probe the late evolution history of the Universe, we adopt two kinds of optimal basis systems. One of them is constructed by performing principal component analysis (PCA) and the other is built by taking the multidimensional scaling (MDS) approach. Cosmological observables such as the luminosity distance can be decomposed into these basis systems. The bases are optimized for different kinds of cosmological models that are based on different physical assumptions, and even for a mixture model of them. Therefore, the so-called feature space projected from the basis systems is cosmological-model independent, and it provides a parameterization for studying and reconstructing the Hubble expansion rate from the supernova luminosity distance and even gamma-ray burst (GRB) data with self-calibration. The circular problem when using GRBs as cosmological candles is naturally eliminated in this procedure. By using the Levenberg-Marquardt (LM) technique and the Markov Chain Monte Carlo (MCMC) method, we perform an ...
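
    The PCA ingredient is straightforward to illustrate: build a basis from a family of model distance-redshift curves spanning different physical assumptions, then reconstruct a noisy observed curve from its leading components. In the Python sketch below the model family, noise level, and number of retained components are all invented for illustration:

```python
# PCA basis for luminosity-distance curves (illustrative, c/H0 = 1 units).
import numpy as np

z = np.linspace(0.01, 2.0, 100)

def luminosity_distance(om, w, n=2048):
    """Flat dark-energy model: d_L = (1+z) * integral of dz'/E(z')."""
    zz = np.linspace(0.0, z[-1], n)
    E = np.sqrt(om * (1 + zz) ** 3 + (1 - om) * (1 + zz) ** (3 * (1 + w)))
    chi = np.interp(z, zz, np.cumsum(1.0 / E) * (zz[1] - zz[0]))
    return (1 + z) * chi

# Training family spanning different physical assumptions
models = np.array([luminosity_distance(om, w)
                   for om in np.linspace(0.2, 0.4, 9)
                   for w in np.linspace(-1.4, -0.6, 9)])
mean = models.mean(axis=0)
U, s, Vt = np.linalg.svd(models - mean, full_matrices=False)  # rows of Vt = basis

rng = np.random.default_rng(3)
d_obs = luminosity_distance(0.3, -1.0) + rng.normal(0, 0.02, z.size)  # mock data

k = 3                                            # keep leading components
coeff = Vt[:k] @ (d_obs - mean)                  # project data onto basis
d_rec = mean + Vt[:k].T @ coeff                  # reconstructed distance curve
print("rms residual:",
      np.sqrt(np.mean((d_rec - luminosity_distance(0.3, -1.0)) ** 2)))
```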

  14. Reconstructing the Solar Wind From Its Early History To Current Epoch

    CERN Document Server

    Airapetian, Vladimir S

    2016-01-01

    Stellar winds from active solar type stars can play a crucial role in removal of stellar angular momentum and erosion of planetary atmospheres. However, major wind properties except for mass loss rates cannot be directly derived from observations. We employed a three dimensional magnetohydrodynamic Alfven wave driven solar wind model, ALF3D, to reconstruct the solar wind parameters including the mass loss rate, terminal velocity and wind temperature at 0.7, 2 and 4.65 Gyr. Our model treats the wind thermal electrons, protons and pickup protons as separate fluids and incorporates turbulence transport, eddy viscosity, turbulent resistivity, and turbulent heating to properly describe proton and electron temperatures of the solar wind. To study the evolution of the solar wind, we specified three input model parameters, the plasma density, Alfven wave amplitude and the strength of the dipole magnetic field at the wind base for each of three solar wind evolution models that are consistent with observational constra...

  15. Demographic History of the Genus Pan Inferred from Whole Mitochondrial Genome Reconstructions

    Science.gov (United States)

    Tucci, Serena; de Manuel, Marc; Ghirotto, Silvia; Benazzo, Andrea; Prado-Martinez, Javier; Lorente-Galdos, Belen; Nam, Kiwoong; Dabad, Marc; Hernandez-Rodriguez, Jessica; Comas, David; Navarro, Arcadi; Schierup, Mikkel H.; Andres, Aida M.; Barbujani, Guido; Hvilsom, Christina; Marques-Bonet, Tomas

    2016-01-01

    The genus Pan is the closest genus to our own and it includes two species, Pan paniscus (bonobos) and Pan troglodytes (chimpanzees). The latter comprises four subspecies, all highly endangered. The study of the genus Pan has been persistently complicated by the intricate relationships among subspecies and the statistical limitations imposed by the small number of samples or genomic markers analyzed. Here, we present a new method to reconstruct complete mitochondrial genomes (mitogenomes) from whole genome shotgun (WGS) datasets, mtArchitect, showing that its reconstructions are highly accurate and consistent with long-range PCR mitogenomes. We used this approach to build the mitochondrial genomes of 20 newly sequenced samples which, together with available genomes, allowed us to analyze the hitherto most complete Pan mitochondrial genome dataset, including 156 chimpanzee and 44 bonobo individuals, with a proportional contribution from all chimpanzee subspecies. We estimated the separation time between chimpanzees and bonobos around 1.15 million years ago (Mya) [0.81–1.49]. Further, we found that under the most probable genealogical model the two clades of chimpanzees, Western + Nigeria-Cameroon and Central + Eastern, separated at 0.59 Mya [0.41–0.78] with further internal separations at 0.32 Mya [0.22–0.43] and 0.16 Mya [0.17–0.34], respectively. Finally, for a subset of our samples, we compared nuclear versus mitochondrial genomes and we found that chimpanzee subspecies have different patterns of nuclear and mitochondrial diversity, which could be a result of either processes affecting the mitochondrial genome, such as hitchhiking or background selection, or a result of population dynamics. PMID:27345955

  16. Reconstructing the star formation history of the Milky Way disc(s) from chemical abundances

    Science.gov (United States)

    Snaith, O.; Haywood, M.; Di Matteo, P.; Lehnert, M. D.; Combes, F.; Katz, D.; Gómez, A.

    2015-06-01

    We develop a chemical evolution model to study the star formation history of the Milky Way. Our model assumes that the Milky Way has formed from a closed-box-like system in the inner regions, while the outer parts of the disc have experienced some accretion. Unlike the usual procedure, we do not fix the star formation prescription (e.g. Kennicutt law) to reproduce the chemical abundance trends. Instead, we fit the abundance trends with age to recover the star formation history of the Galaxy. Our method enables us to recover the star formation history of the Milky Way in the first Gyrs with unprecedented accuracy in the inner (R < 7-8 kpc) and outer (R > 9-10 kpc) discs, as sampled in the solar vicinity. We show that half the stellar mass formed during the thick-disc phase in the inner galaxy during the first 4-5 Gyr. This phase was followed by a significant dip in star formation activity (at 8-9 Gyr) and a period of roughly constant lower-level star formation for the remaining 8 Gyr. The thick-disc phase has produced as many metals in 4 Gyr as the thin-disc phase in the remaining 8 Gyr. Our results suggest that a closed-box model is able to fit all the available constraints in the inner disc. A closed-box system is qualitatively equivalent to a regime where the accretion rate maintains a high gas fraction in the inner disc at high redshift. In these conditions the SFR is mainly governed by the high turbulence of the interstellar medium. By z ~ 1 it is possible that most of the accretion takes place in the outer disc, while the star formation activity in the inner disc is mostly sustained by the gas that is not consumed during the thick-disc phase and the continuous ejecta from earlier generations of stars. The outer disc follows a star formation history very similar to that of the inner disc, although initiated at z ~ 2, about 2 Gyr before the onset of the thin-disc formation in the inner disc.

  17. A Blacker and Browner Shade of Pale: Reconstructing Punk Rock History

    OpenAIRE

    Pietschmann, Franziska

    2010-01-01

    Embedded in the transatlantic history of rock ‘n’ roll, punk rock has not only been regarded as a watershed moment in terms of music, aesthetics and music-related cultural practices, it has also been perceived as a subversive white cultural phenomenon. A Blacker and Browner Shade of Pale challenges this widespread and shortsighted assumption. People of color, particularly black Americans and Britons, and Latina/os have pro-actively contributed to punk’s evolution and shaped punk music culture...

  18. Reconstructing and analyzing China's fifty-nine year (1951–2009) drought history using hydrological model simulation

    Directory of Open Access Journals (Sweden)

    Z. Y. Wu

    2011-09-01

    Full Text Available The 1951–2009 drought history of China is reconstructed using daily soil moisture values generated by the Variable Infiltration Capacity (VIC) land surface macroscale hydrology model. VIC is applied over a grid of 10 458 points with a spatial resolution of 30 km × 30 km, and is driven by observed daily maximum and minimum air temperature and precipitation from 624 long-term meteorological stations. The VIC soil moisture is used to calculate the Soil Moisture Anomaly Percentage Index (SMAPI), which can be used as a measure of the severity of agricultural drought on a global basis. We have developed a SMAPI-based drought identification procedure for practical use in the identification of both grid-point and regional drought events. As a result, a total of 325 regional drought events varying in time and strength are identified from China's nine drought study regions. These drought events can thus be assessed quantitatively at different spatial and temporal scales. The result shows that the severe drought events of 1978, 2000 and 2006 are well reconstructed, which indicates that the SMAPI is capable of identifying the onset of a drought event, its progression, as well as its termination. Spatial and temporal variations of droughts in China's nine drought study regions are studied. Our results show that, on average, up to 30% of the total area of China is prone to drought. Regionally, an upward trend in drought-affected areas has been detected in three regions (Inner Mongolia, Northeast and North) from 1951–2009. However, the decadal variability of droughts has been weak in the remaining five regions (South, Southwest, East, Northwest, and Tibet). Xinjiang has even become steadily wetter since the 1950s. Two regional dry centres were discovered in China as the result of a combined analysis of the occurrence of drought events from both grid points and drought study regions. The first centre is located in the area partially covered by the North
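
    Assuming the common definition SMAPI = (θ − θ_clim)/θ_clim × 100%, a drought identification step of the kind described reduces to thresholding the index and extracting contiguous runs. A minimal Python sketch with an invented threshold, minimum duration, and mock soil-moisture series (the paper's exact criteria may differ):

```python
# SMAPI computation and run-based drought event extraction (illustrative).
import numpy as np

def smapi(theta, theta_clim):
    """Soil Moisture Anomaly Percentage Index, in percent."""
    return (theta - theta_clim) / theta_clim * 100.0

def drought_events(index, threshold=-5.0, min_len=3):
    """Contiguous runs with SMAPI below threshold for at least min_len steps."""
    events, start = [], None
    for t, v in enumerate(index):
        if v < threshold and start is None:
            start = t
        elif v >= threshold and start is not None:
            if t - start >= min_len:
                events.append((start, t - 1, index[start:t].min()))
            start = None
    if start is not None and len(index) - start >= min_len:
        events.append((start, len(index) - 1, index[start:].min()))
    return events  # (onset, termination, peak severity)

rng = np.random.default_rng(7)
theta_clim = 0.30                               # climatological soil moisture
theta = theta_clim * (1 + 0.1 * rng.standard_normal(120))
theta[40:55] *= 0.8                             # implant a dry spell
for ev in drought_events(smapi(theta, theta_clim)):
    print("onset=%d termination=%d peak SMAPI=%.1f%%" % ev)
```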

  19. Testing sex and gender in sports; reinventing, reimagining and reconstructing histories.

    Science.gov (United States)

    Heggie, Vanessa

    2010-12-01

    Most international sports organisations work on the premise that human beings come in one of two genders: male or female. Consequently, all athletes, including intersex and transgender individuals, must be assigned to compete in one or other category. Since the 1930s (not, as is popularly suggested, the 1960s) these organisations have relied on scientific and medical professionals to provide an 'objective' judgement of an athlete's eligibility to compete in women's national and international sporting events. The changing nature of these judgements reflects a great deal about our cultural, social and national prejudices, while the matter of testing itself has become a site of conflict for feminists and human rights activists. Because of the sensitive nature of this subject, histories of sex testing are difficult to write and research; this has led to the repetition of inaccurate information and false assertions about gender fraud, particularly in relation to the 'classic' cases of Stella Walsh and Heinrich/Hermann/Dora Ratjen. As historians, we need to be extremely careful to differentiate between mythologies and histories. PMID:20980057

  20. Reconstructing Jewish Identity on the Foundations of Hellenistic History: Azariah de' Rossi's Me'or 'Enayim in Late 16th Century Northern Italy

    OpenAIRE

    Rosenberg-Wohl, David Michael

    2014-01-01

    Abstract: Reconstructing Jewish Identity on the Foundations of Hellenistic History: Azariah de' Rossi's Me'or `Enayim in Late 16th Century Northern Italy, by David Michael Rosenberg-Wohl, Doctor of Philosophy in Jewish Studies and the Graduate Theological Union; Professor Erich S. Gruen, Chair. Me'or `Enayim is conventionally considered to be early modern Jewish history. Recent scholarship tends to consider the work Renaissance historiography, Counter-Reformation apology or some combination of the two. The...

  1. Reconstructing the history of 14C discharges from Sellafield: Part 1--atmospheric discharges

    International Nuclear Information System (INIS)

    14C specific activities, above ambient background levels, were determined in individual tree rings (corresponding to the years 1950-1999) sectioned from an oak tree that was felled in autumn 1999 at a location 1.5 km east of the Sellafield nuclear fuel reprocessing plant in Cumbria, north-west England. The data were used to produce a new, improved reconstruction of Sellafield's annual atmospheric 14C discharges between 1951 and 1999, using the most reliable discharge data set (1994-1999) as the primary basis for the determination of a new calibration factor that relates excess 14C activity in individual tree rings to the annual discharge during the corresponding year. The results indicate that the current British Nuclear Fuels plc (BNFL) estimate of total 14C discharges to the atmosphere prior to 1978 is significantly overestimated, while the current estimate of total 14C discharges after 1978 is very similar to that determined in this study. In this study, the total activity of 14C discharged to the atmosphere from Sellafield between 1951 and 1999 is estimated to be 259±63 TBq (at 2 std. dev.). The current BNFL estimate is 360 TBq
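
    The calibration logic is simple to sketch: fit a proportionality factor on the years with reliable discharge records, then invert it for the earlier rings. In the Python sketch below every number is an invented placeholder, not data from the study:

```python
# Tree-ring / discharge calibration through the origin (illustrative values).
import numpy as np

# (excess 14C in ring [Bq/kg C], known discharge [TBq]) for reliable years
ring_excess_recent = np.array([1.8, 2.1, 1.5, 1.9, 2.4, 2.0])
discharge_recent   = np.array([4.5, 5.3, 3.7, 4.8, 6.1, 5.0])

# Least-squares fit of excess = k * discharge
k = (ring_excess_recent @ discharge_recent) / (discharge_recent @ discharge_recent)

ring_excess_old = np.array([0.6, 1.1, 0.9])      # e.g. rings from earlier years
reconstructed = ring_excess_old / k               # estimated annual discharges
print("calibration factor k = %.3f (Bq/kg C per TBq)" % k)
print("reconstructed discharges [TBq]:", reconstructed.round(2))
```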

  2. A complete ancient RNA genome: identification, reconstruction and evolutionary history of archaeological Barley Stripe Mosaic Virus.

    Science.gov (United States)

    Smith, Oliver; Clapham, Alan; Rose, Pam; Liu, Yuan; Wang, Jun; Allaby, Robin G

    2014-01-01

    The origins of many plant diseases appear to be recent and associated with the rise of domestication, the spread of agriculture or recent global movements of crops. Distinguishing between these possibilities is problematic because of the difficulty of determining rates of molecular evolution over short time frames. Heterochronous approaches using recent and historical samples show that plant viruses exhibit highly variable and often rapid rates of molecular evolution. The accuracy of estimated evolution rates and ages of origin can be greatly improved by the inclusion of older molecular data from archaeological material. Here we present the first reconstruction of an archaeological RNA genome, that of Barley Stripe Mosaic Virus (BSMV), isolated from barley grain ~750 years old. Phylogenetic analysis of BSMV that includes this genome indicates the divergence of BSMV and its closest relative prior to this time, most likely around 2000 years ago. However, exclusion of the archaeological data results in an apparently much more recent origin of the virus that postdates even the archaeological sample. We conclude that this viral lineage originated in the Near East or North Africa, and spread to North America and East Asia with its hosts along historical trade routes. PMID:24499968

  3. Regional reconstruction of flash flood history in the Guadarrama range (Central System, Spain).

    Science.gov (United States)

    Rodriguez-Morata, C; Ballesteros-Cánovas, J A; Trappmann, D; Beniston, M; Stoffel, M

    2016-04-15

    Flash floods are a common natural hazard in Mediterranean mountain environments and are responsible for serious economic losses and human casualties. The study of flash flood dynamics and their triggers is a key issue; however, the retrieval of historical information is often limited in mountain regions as a result of short time series and a systematic lack of records. In this study, we attempt to overcome this data deficiency by supplementing existing records with dendrogeomorphic techniques, employed in seven mountain streams along the northern slopes of the Guadarrama Mountain range. Here we present results derived from the tree-ring analysis of 117 samples from 63 Pinus sylvestris L. trees injured by flash floods, to complement existing flash flood records covering the last ~200 years, and comment on their hydro-meteorological triggers. To understand the varying number of reconstructed flash flood events in each of the catchments, we also performed a comparative analysis of geomorphic catchment characteristics, land use evolution and forest management. Furthermore, we discuss the limitations of dendrogeomorphic techniques applied in managed forests. PMID:26845178

  4. Reconstruction of Disturbance History in Naples Bay, Florida: A Combined Radiometric/Geochemical Approach

    Science.gov (United States)

    van Eaton, A. R.; Zimmerman, A.; Brenner, M.; Kenney, W.; Jaeger, J. M.

    2006-12-01

    Historical reconstructions of aquatic systems have commonly depended on short-lived radioisotopes (e.g. Pb-210 and Cs-137) to provide a temporal framework for disturbances over the past 100 years. However, applications of these radiotracers to highly variable systems such as estuaries are often problematic. Hydrologic systems prone to rapid shifts in sediment composition and grain size distribution may yield low and erratic isotopic activities with depth in sediment. Additionally, the marine influence on coastal systems and the preferential adsorption of radionuclides by organic matter may violate assumptions of the CIC and CRS dating models. Whereas such sediment cores are often deemed "undateable", we propose a modeling technique that accounts for textural and compositional variation, providing insight into the depositional patterns and disturbance records of these dynamic environments. Here, the technique is applied to sediment cores collected from five regions of the Naples Bay estuary in southwest Florida. The significant positive correlation between excess Pb-210 activities and organic matter content in each core provides evidence for strong lithologic control on radioisotope scavenging, supporting the use of organic matter-normalized excess Pb-210 activity profiles when modeling sediment accumulation rates in predominantly sandy estuaries. Using this approach, episodes of increased sedimentation rate were established that correspond to periods of heightened anthropogenic disturbance (canal dredging and development) in the Naples Bay watershed during the mid-1900s.
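
    For reference, the CRS ("constant rate of supply") model that such profiles feed into assigns ages from the cumulative excess Pb-210 inventory, t(z) = (1/λ) ln[I(0)/I(z)]. A toy Python sketch with an invented profile (in the study, organic-matter normalization is applied before this step):

```python
# CRS age model on a mock excess 210Pb profile (consistent arbitrary units).
import numpy as np

LAMBDA = np.log(2) / 22.3                    # 210Pb decay constant [1/yr]

depth = np.arange(0.0, 30.0, 2.0)            # cm, section midpoints
excess = 120.0 * np.exp(-0.15 * depth)       # excess 210Pb activity, mock
rho = np.full_like(depth, 0.4)               # dry bulk density, mock

inventory = excess * rho * 2.0               # per-section inventory
I_below = inventory[::-1].cumsum()[::-1]     # cumulative inventory below z
I0 = I_below[0]                              # total inventory
age = np.log(I0 / I_below) / LAMBDA          # CRS ages [yr]

for z, t in zip(depth, age):
    print(f"depth {z:4.1f} cm -> age {t:6.1f} yr")
```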

  5. Bayesian methods to restore and reconstruct images: application to gammagraphy and photofission tomography; Methodes bayesiennes pour la restauration et la reconstruction d'images application a la gammagraphie et a la tomographie par photofissions

    Energy Technology Data Exchange (ETDEWEB)

    Stawinski, G

    1998-10-26

    Bayesian algorithms are developed to solve inverse problems in gamma imaging and photofission tomography. The first part of this work is devoted to the modeling of our measurement systems. Two models have been found for both applications: the first one is a simple conventional model and the second one is a cascaded point-process model. EM and MCMC Bayesian algorithms for image restoration and image reconstruction have been developed for these models and compared. The cascaded point-process model does not significantly improve the results previously obtained by the classical model. Two original approaches have been proposed, which improve on the results previously obtained. The first approach uses an inhomogeneous Markov Random Field as a prior law and lets the regularization parameter vary spatially. However, the problem of the estimation of hyper-parameters has not been solved. In the case of the deconvolution of point sources, a second approach has been proposed, which introduces a high-level prior model. The picture is modeled as a list of objects, whose parameters and number are unknown. The results obtained with this method are more accurate than those obtained with the conventional Markov Random Field prior model and require lower computational costs. (author)
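
    As a reference point for the EM ingredient named above, the classical ML-EM (Richardson-Lucy) update for Poisson data is easy to sketch; the thesis's Bayesian variants add Markov Random Field or object-based priors on top of it. In the Python sketch below the 1-D blur kernel, point-source image and background are all invented:

```python
# ML-EM deconvolution of point sources from Poisson counts (illustrative).
import numpy as np

rng = np.random.default_rng(0)
n = 64
x_true = np.zeros(n); x_true[[20, 35, 45]] = [50.0, 80.0, 30.0]   # point sources

t = np.arange(-8, 9)
h = np.exp(-0.5 * (t / 2.0) ** 2); h /= h.sum()                   # blur response
A = np.array([np.roll(np.pad(h, (0, n - h.size)), i - 8) for i in range(n)])

bg = 1.0
y = rng.poisson(A @ x_true + bg)             # observed Poisson counts

x = np.ones(n)                               # flat, positive start
for _ in range(500):                         # EM iterations preserve positivity
    x *= (A.T @ (y / (A @ x + bg))) / A.sum(axis=0)

print("brightest reconstructed bins:", np.sort(np.argsort(x)[-3:]))
```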

  6. How to make judicious use of current physics in reconstructing its history

    Science.gov (United States)

    Janssen, Michel

    2013-04-01

    Using three concrete examples, I illustrate both benefits and pitfalls of approaching the history of relativity and quantum theory with current textbook knowledge of these subjects. First, I show how knowing something about energy-momentum tensors in special relativity makes it easy to see that special relativity did not, as has been suggested, simply kill the program of Abraham and others at the beginning of the 20th century to reduce all of physics to electrodynamics, but co-opted key elements of it. Second, I show how knowing something about coordinate conditions in general relativity can be an obstacle to seeing why Einstein initially rejected field equations based on the Ricci tensor. Third, I show how knowing something about Hilbert space can be an obstacle to seeing the logic behind Jordan's statistical transformation theory. These three examples suggest that knowledge of modern physics is beneficial for historians, but only when used judiciously.

  7. Merged or monolithic? Using machine-learning to reconstruct the dynamical history of simulated star clusters

    CERN Document Server

    Pasquato, Mario

    2016-01-01

    Context. Machine-Learning (ML) solves problems by learning patterns from data, with limited or no human guidance. In Astronomy, it is mainly applied to large observational datasets, e.g. for morphological galaxy classification. Aims. We apply ML to gravitational N-body simulations of star clusters that are either formed by merging two progenitors or evolved in isolation, planning to later identify Globular Clusters (GCs) that may have a history of merging from observational data. Methods. We create mock-observations from simulated GCs, from which we measure a set of parameters (also called features in the machine-learning field). After dimensionality reduction on the feature space, the resulting datapoints are fed to various classification algorithms. Using repeated random subsampling validation we check whether the groups identified by the algorithms correspond to the underlying physical distinction between mergers and monolithically evolved simulations. Results. The three algorithms we considered (C5.0 tree...

  8. Reconstructing the Life Histories of Spanish Primary School Teachers: A Novel Approach for the Study of the Teaching Profession and School Culture

    Science.gov (United States)

    Mahamud, Kira; Martínez Ruiz-Funes, María José

    2014-01-01

    This paper describes a study dealing with the reconstruction of the lives of two Spanish primary school teachers during the Franco dictatorship (1939-1975), in order to learn to what extent such a field of research can contribute to the history of education. Two family archives provide extraordinary and unique documentation to track down their…

  9. A Reconstruction of Development of the Periodic Table Based on History and Philosophy of Science and Its Implications for General Chemistry Textbooks

    Science.gov (United States)

    Brito, Angmary; Rodriguez, Maria A.; Niaz, Mansoor

    2005-01-01

    The objectives of this study are: (a) elaboration of a history and philosophy of science (HPS) framework based on a reconstruction of the development of the periodic table; (b) formulation of seven criteria based on the framework; and (c) evaluation of 57 freshman college-level general chemistry textbooks with respect to the presentation of the…

  10. Accelerating fDOT image reconstruction based on path-history fluorescence Monte Carlo model by using three-level parallel architecture.

    Science.gov (United States)

    Jiang, Xu; Deng, Yong; Luo, Zhaoyang; Luo, Qingming

    2015-10-01

    The excessive time required by fluorescence diffuse optical tomography (fDOT) image reconstruction based on the path-history fluorescence Monte Carlo model is its primary limiting factor. Herein, we present a method that accelerates fDOT image reconstruction. We employ a three-level parallel architecture comprising multiple nodes in a cluster, multiple cores in the central processing unit (CPU), and multiple streaming multiprocessors in the graphics processing unit (GPU). Different GPU memories are selectively used, the data-writing time is effectively eliminated, and the data transport per iteration is minimized. Simulation experiments demonstrated that this method can utilize general-purpose computing platforms to efficiently implement and accelerate fDOT image reconstruction, thus providing a practical means of using the path-history-based fluorescence Monte Carlo model for fDOT imaging. PMID:26480115
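
    As a rough illustration of the middle (multi-core CPU) level of such a scheme, the sketch below distributes batches of toy photon histories across worker processes with Python's multiprocessing; the cluster (node) and GPU levels, and the paper's actual path-history fluorescence model, are omitted, and the tally kernel is an invented stand-in.

        # Minimal sketch: CPU-level parallelism only, with an invented toy kernel.
        import numpy as np
        from multiprocessing import Pool

        def run_photon_batch(seed, n=100_000):
            """Toy stand-in for one batch of Monte Carlo photon histories."""
            rng = np.random.default_rng(seed)
            # mean total path length of photons taking 5 exponential steps
            return rng.exponential(scale=1.0, size=(n, 5)).sum(axis=1).mean()

        if __name__ == "__main__":
            with Pool(processes=4) as pool:              # one worker per CPU core
                tallies = pool.map(run_photon_batch, range(8))
            print("combined tally:", np.mean(tallies))   # ~5.0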

  11. Reconstruction of the paleothermal history of the sub-basin, Lower Guajira, Colombia

    International Nuclear Information System (INIS)

    The paleothermal history of the Lower Guajira sub-basin was reconstructed using three methods: 1) calculation of the present geothermal gradient and heat flow, 2) vitrinite reflectance, and 3) fission-track analysis on apatite and zircon grains. New analytical data from vitrinite reflectance and fission tracks allowed four thermal events to be identified, with the following features: the oldest thermal event took place during the Late Cretaceous, between 95 and 65 Ma. The second thermal event occurred during the late Eocene, between 40 and 35 Ma. The third thermal event occurred in the early and middle Miocene, between 22 and 15 Ma. The fourth and last thermal event took place in the late Miocene, between 10 and 5 Ma. The cooling events coincide with four unconformities previously identified: 1) the Late Cretaceous unconformity at the top of the Guaralamai Formation, 2) the late Eocene unconformity at the base of the Siamana Formation, 3) the early to middle Miocene unconformity on the top of the Siamana Formation, and 4) the late Miocene unconformity on the top of the Castilletes Formation.

  12. Merged or monolithic? Using machine-learning to reconstruct the dynamical history of simulated star clusters

    Science.gov (United States)

    Pasquato, Mario; Chung, Chul

    2016-05-01

    Context. Machine-learning (ML) solves problems by learning patterns from data with limited or no human guidance. In astronomy, ML is mainly applied to large observational datasets, e.g. for morphological galaxy classification. Aims: We apply ML to gravitational N-body simulations of star clusters that are either formed by merging two progenitors or evolved in isolation, planning to later identify globular clusters (GCs) that may have a history of merging from observational data. Methods: We create mock-observations from simulated GCs, from which we measure a set of parameters (also called features in the machine-learning field). After carrying out dimensionality reduction on the feature space, the resulting datapoints are fed into various classification algorithms. Using repeated random subsampling validation, we check whether the groups identified by the algorithms correspond to the underlying physical distinction between mergers and monolithically evolved simulations. Results: The three algorithms we considered (C5.0 trees, k-nearest neighbour, and support-vector machines) all achieve a test misclassification rate of about 10% without parameter tuning, with support-vector machines slightly outperforming the others. The first principal component of feature space correlates with cluster concentration. If we exclude it from the regression, the performance of the algorithms is only slightly reduced.
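
    A minimal sketch of the pipeline just described (features, dimensionality reduction, classification, repeated random subsampling validation), using scikit-learn; the feature matrix and merger/monolithic labels below are random placeholders rather than the paper's simulated-cluster measurements, so the printed error rate only demonstrates the mechanics.

        # Sketch of: features -> PCA -> SVM, validated by repeated random subsampling.
        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.model_selection import train_test_split
        from sklearn.svm import SVC

        rng = np.random.default_rng(0)
        X = rng.normal(size=(200, 10))      # 200 mock clusters, 10 measured features
        y = rng.integers(0, 2, size=200)    # 0 = monolithic, 1 = merger (placeholder)

        X_red = PCA(n_components=3).fit_transform(X)   # dimensionality reduction

        errors = []
        for seed in range(50):              # repeated random subsampling validation
            X_tr, X_te, y_tr, y_te = train_test_split(
                X_red, y, test_size=0.3, random_state=seed)
            clf = SVC().fit(X_tr, y_tr)
            errors.append(1.0 - clf.score(X_te, y_te))
        print("mean test misclassification rate:", np.mean(errors))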

  13. The reconstructive study in archaeology: case histories in the communication issues

    Directory of Open Access Journals (Sweden)

    Francesco Gabellone

    2011-09-01

    Full Text Available. The most significant results obtained by the Information Technologies Lab (IBAM CNR - ITLab) in the construction of VR-based knowledge platforms have been achieved in projects such as ByHeriNet, Archeotour, Interadria, Interreg Greece-Italy, Iraq Virtual Museum, etc. These projects were guided by the belief that, in order to be effective, the process of communicating Cultural Heritage to the wider public should be as free as possible from the sterile old VR interfaces of the 1990s. In operational terms, this translates into solutions that are as lifelike as possible and guarantee the maximum emotional involvement of the viewer, adopting the same techniques as are used in modern cinema. Communication thus becomes entertainment and a vehicle for high-quality content, aimed at the widest possible public and produced with the help of interdisciplinary tools and methods. In this context, high-end technologies are no longer the goal of research; rather they are the invisible engine of an unstoppable process that is making it harder and harder to distinguish between computer images and real objects. An emblematic case in this regard is the reconstructive study of ancient contexts, where three-dimensional graphics compensate for the limited expressive potential of two-dimensional drawings and allow for interpretative and representative solutions that were unimaginable a few years ago. The virtual space thus becomes an important opportunity for reflection and study, as well as constituting a revolutionary way to learn for the wider public.

  14. Probing the Expansion History of the Universe by Model-independent Reconstruction from Supernovae and Gamma-Ray Burst Measurements

    Science.gov (United States)

    Feng, Chao-Jun; Li, Xin-Zhou

    2016-04-01

    To probe the late evolution history of the universe, we adopt two kinds of optimal basis systems. One of them is constructed by performing the principal component analysis, and the other is built by taking the multidimensional scaling approach. Cosmological observables such as the luminosity distance can be decomposed into these basis systems. These basis systems are optimized for different kinds of cosmological models that are based on different physical assumptions, even for a mixture model of them. Therefore, the so-called feature space that is projected from the basis systems is cosmological model independent, and it provides a parameterization for studying and reconstructing the Hubble expansion rate from the supernova luminosity distance and even gamma-ray burst (GRB) data with self-calibration. The circular problem when using GRBs as cosmological candles is naturally eliminated in this procedure. By using the Levenberg–Marquardt technique and the Markov Chain Monte Carlo method, we perform an observational constraint on this kind of parameterization. The data we used include the “joint light-curve analysis” data set that consists of 740 Type Ia supernovae and 109 long GRBs with the well-known Amati relation.
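
    The basis-building step can be sketched as follows: generate distance-like curves from a family of toy models, extract principal components by singular value decomposition, and project any curve onto the resulting basis. The two-parameter curve family below is invented for illustration and is unrelated to the JLA supernova or GRB data.

        # Sketch of building an optimal basis by principal component analysis (SVD).
        import numpy as np

        z = np.linspace(0.01, 2.0, 50)
        # invented two-parameter family of luminosity-distance-like curves
        curves = np.array([(1 + w) * np.log1p(z) + 0.1 * a * z
                           for w in np.linspace(-0.2, 0.2, 9)
                           for a in np.linspace(-1.0, 1.0, 9)])
        mean = curves.mean(axis=0)
        _, s, vt = np.linalg.svd(curves - mean, full_matrices=False)
        basis = vt[:3]                          # first three principal components

        coeffs = (curves[0] - mean) @ basis.T   # project one curve onto the basis
        recon = mean + coeffs @ basis
        print("max reconstruction error:", np.abs(recon - curves[0]).max())  # ~1e-15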

  15. Late Quaternary vegetation, fire and climate history reconstructed from two cores at Cerro Toledo, Podocarpus National Park, southeastern Ecuadorian Andes

    Science.gov (United States)

    Brunschön, Corinna; Behling, Hermann

    2009-11-01

    The last ca. 20,000 yr of palaeoenvironmental conditions in Podocarpus National Park in the southeastern Ecuadorian Andes have been reconstructed from two pollen records from Cerro Toledo (04°22'28.6"S, 79°06'41.5"W) at 3150 m and 3110 m elevation. Páramo vegetation with high proportions of Plantago rigida characterised the last glacial maximum (LGM), reflecting cold and wet conditions. The upper forest line was at markedly lower elevations than present. After ca. 16,200 cal yr BP, páramo vegetation decreased slightly while mountain rainforest developed, suggesting rising temperatures. The trend of increasing temperatures and mountain rainforest expansion continued until ca. 8500 cal yr BP, while highest temperatures probably occurred from 9300 to 8500 cal yr BP. From ca. 8500 cal yr BP, páramo vegetation re-expanded with dominance of Poaceae, suggesting a change to cooler conditions. During the late Holocene after ca. 1800 cal yr BP, a decrease in páramo indicates a change to warmer conditions. Anthropogenic impact near the study site is indicated for times after 2300 cal yr BP. The regional environmental history indicates that through time the eastern Andean Cordillera in South Ecuador was influenced by eastern Amazonian climates rather than western Pacific climates.

  16. Bayesian Monitoring.

    OpenAIRE

    Kirstein, Roland

    2005-01-01

    This paper presents a modification of the inspection game: the "Bayesian Monitoring" model rests on the assumption that judges are interested in enforcing compliant behavior and making correct decisions. They may base their judgements on an informative but imperfect signal which can be generated costlessly. In the original inspection game, monitoring is costly and generates a perfectly informative signal. While the inspection game has only one mixed strategy equilibrium, three Perfect Bayesia...

  17. The Holocene environmental history of the Verkhoyansk Mountains region (northeastern Siberia, Russia) reconstructed from high-resolution pollen data

    Science.gov (United States)

    Müller, S.; Tarasov, P. E.; Andreev, A. A.; Diekmann, B.

    2009-04-01

    The study presented here is part of the IPY project 106 "Lake Records of late Quaternary Climate Variability in northeastern Siberia" and the German Research Foundation project RI 809/17-1,2 "Late Quaternary environmental history of interstadial and interglacial periods in the Arctic reconstructed from bioindicators in permafrost sequences in NE Siberia". Both projects focus on generating high-resolution vegetation and climate proxy records mainly from lacustrine sediments along a north-south transect from Yakutia, Republic of Russia. This region is known for its climate extremes, with the Verkhoyansk Mountain Range being the coldest area in the Northern Hemisphere - "Pole of Cold". Radiocarbon-dated pollen records from Lake Billyakh (65°17'N, 126°47'E; 340 m a.s.l.) located in the central part of the Verkhoyansk Mountains were used to reconstruct vegetation and climate changes. The longest and oldest sediment core from the lake reaches back to >30 kyr BP, thus covering the last two Late Pleistocene Interstadials in Siberia. The pollen record and pollen-based biome reconstruction of the core PG 1756, which covers the last 15 kyr BP, suggest that open cool steppe and grass and sedge tundra communities with Poaceae, Cyperaceae, Artemisia, Chenopodiaceae, Caryophyllaceae and Selaginella rupestris dominated the area from 15 to 13.5 kyr BP. On the other hand, the constant presence of Larix pollen in quantities comparable to today's values points to the constant presence of boreal deciduous conifer trees in the regional vegetation during the last glaciation. A major spread of shrub tundra communities, including birch (Betula sect. Nanae), alder (Duschekia fruticosa) and willow (Salix) species, is dated to 13.5-12.7 kyr BP, indicating a noticeable increase in precipitation toward the end of the last glaciation, particularly during the Allerød Interstadial. Between 12.7 and 11.4 kyr BP pollen percentages of herbaceous taxa rapidly increased, whereas shrub taxa

  18. A new method to reconstruct hydrocarbon-generating histories of source rocks in a petroleum-bearing basin: the method of geological and geochemical sections

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    By investigating typical Palaeozoic and Mesozoic petroleum-bearing basins in China and using thermal maturation theories of organic matter to improve the conventional Karweil method, a new method to reconstruct the hydrocarbon-generating histories of source rocks is suggested. This method, which combines geological background with geochemical information, makes the calculated vitrinite reflectance (VRo) closer to the measured one. Moreover, it enables us to make clear the hydrocarbon-generation trend of source rocks during geological history. The method has the merits of simple calculation and objective presentation, and is especially suitable for basins whose sedimentation and tectonic movements are complicated.

  19. Reconstructing land use history from Landsat time-series. Case study of a swidden agriculture system in Brazil

    Science.gov (United States)

    Dutrieux, Loïc P.; Jakovac, Catarina C.; Latifah, Siti H.; Kooistra, Lammert

    2016-05-01

    We developed a method to reconstruct land use history from Landsat image time-series. The method uses a breakpoint detection framework derived from the econometrics field and applicable to time-series regression models. The Breaks For Additive Season and Trend (BFAST) framework is used for defining the time-series regression models, which may contain trend and phenology, hence appropriately modelling vegetation intra- and inter-annual dynamics. All available Landsat data are used for a selected study area, and the time-series are partitioned into segments delimited by breakpoints. Segments can be associated with land use regimes, while the breakpoints then correspond to shifts in land use regimes. In order to further characterize these shifts, we classified the unlabelled breakpoints returned by the algorithm into their corresponding processes. We used a Random Forest classifier, trained from a set of visually interpreted time-series profiles, to infer the processes and assign labels to the breakpoints. The whole approach was applied to quantifying the number of cultivation cycles in a swidden agriculture system in Brazil (state of Amazonas). The number and frequency of cultivation cycles is of particular ecological relevance in these systems, since they largely affect the capacity of the forest to regenerate after land abandonment. We applied the method to a Landsat time-series of the Normalized Difference Moisture Index (NDMI) spanning the 1984-2015 period and derived from it the number of cultivation cycles during that period at the individual field scale. Agricultural field boundaries used to apply the method were derived using a multi-temporal segmentation approach. We validated the number of cultivation cycles predicted by the method against in-situ information collected from farmer interviews, resulting in a Normalized Residual Mean Squared Error (NRMSE) of 0.25. Overall the method performed well, producing maps with coherent spatial patterns. We identified
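
    The core idea, partitioning a time-series into segments delimited by breakpoints, can be sketched with a single-break search that minimizes the residual sum of squares of two piecewise linear fits. This is a toy version in the spirit of the breakpoint framework, not the BFAST implementation itself, and the NDMI-like series below is synthetic.

        # Sketch: find the breakpoint that minimizes the residual sum of squares.
        import numpy as np

        rng = np.random.default_rng(1)
        t = np.arange(120.0)                                   # e.g. monthly steps
        y = (0.6 - 0.002 * t + 0.05 * np.sin(2 * np.pi * t / 12)
             + 0.02 * rng.normal(size=120))
        y[70:] -= 0.3                                          # abrupt regime shift

        def rss_with_break(b):
            """RSS of two independent linear fits split at index b."""
            rss = 0.0
            for seg in (slice(0, b), slice(b, len(t))):
                A = np.column_stack([np.ones(t[seg].size), t[seg]])
                coef = np.linalg.lstsq(A, y[seg], rcond=None)[0]
                rss += np.sum((y[seg] - A @ coef) ** 2)
            return rss

        best_b = min(range(10, 110), key=rss_with_break)
        print("estimated breakpoint index:", best_b)           # near 70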

  20. Characterizing and Communicating Risk with Exposure Reconstruction and Bayesian Analysis: Historical Locomotive Maintenance/Repair Associated with Asbestos Woven Tape Pipe Lagging.

    Science.gov (United States)

    Boelter, Fred W; Persky, Jacob D; Podraza, Daniel M; Bullock, William H

    2016-02-01

    Our reconstructed historical work scenarios incorporating a vintage 1950s locomotive can assist in better understanding the historical asbestos exposures associated with past maintenance and repairs and fill a literature data gap. Air sampling data collected during the exposure scenarios and analyzed by NIOSH 7400 (PCM) and 7402 (PCME) methodologies show that personal breathing zone asbestiform fiber exposures were below the current OSHA eight-hour TWA permissible exposure limit (PEL) of 0.1 f/cc. The locomotive contained woven tape lagging that may have been chrysotile asbestos and that was handled, removed, and reinstalled during repair and maintenance activities. We reconstructed historical work scenarios involving asbestos woven tape pipe lagging that have not been characterized in the published literature. The historical work scenarios were conducted by a retired railroad pipefitter with 37 years of experience working with these materials and locomotives. PMID:26255644

  1. Paleosol charcoal: Reconstructing vegetation history in relation to agro-pastoral activities since the Neolithic. A case study in the Eastern French Pyrenees.

    OpenAIRE

    Bal, Marie; Bal, Marie-Claude; Rendu, Christine; Ruas, Marie-Pierre; Campmajo, Pierre

    2010-01-01

    This article uses a method that combines pedoanthracological and pedo-archaeological approaches to terraces, complemented with archaeological pastoral data, in order to reconstruct the history of ancient agricultural terraces on a slope of the Enveitg Mountain in the French Pyrenees. Four excavations revealed two stages of terrace construction that have been linked with vegetation dynamics, which had been established by analyses of charcoal from the paleosols and soi...

  2. Extracting information from previous full-dose CT scan for knowledge-based Bayesian reconstruction of current low-dose CT images

    OpenAIRE

    Zhang, Hao; Han, Hao; Liang, Zhengrong; Hu, Yifan; Liu, Yan; Moore, William; Ma, Jianhua; Lu, Hongbing

    2015-01-01

    The Markov random field (MRF) model has been widely employed as an edge-preserving regional noise smoothing penalty to reconstruct piece-wise smooth images in the presence of noise, such as in low-dose computed tomography (LdCT). While it preserves edge sharpness, its regional smoothing may sacrifice tissue image textures, which have been recognized as useful imaging biomarkers, and thus it may compromise clinical tasks such as differentiating malignant vs. benign lesions, e.g., lung nodules or colo...
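
    The flavour of an MRF-regularized reconstruction can be conveyed with a stripped-down example: MAP denoising of an image under a quadratic neighbourhood penalty, minimizing ||x - y||² + β Σ (x_i - x_j)² by gradient descent. The CT system matrix, the texture information extracted from a previous full-dose scan, and all scanner specifics are omitted; the phantom, β, and step size are arbitrary illustrative choices.

        # Sketch: MAP denoising with a quadratic MRF smoothing prior.
        import numpy as np

        rng = np.random.default_rng(2)
        truth = np.zeros((64, 64)); truth[16:48, 16:48] = 1.0   # piece-wise smooth
        y = truth + 0.3 * rng.normal(size=truth.shape)          # noisy observation

        x, beta, step = y.copy(), 2.0, 0.1
        for _ in range(200):
            # prior gradient: 4x minus the 4-neighbourhood sum (discrete Laplacian)
            lap = 4 * x - (np.roll(x, 1, 0) + np.roll(x, -1, 0)
                           + np.roll(x, 1, 1) + np.roll(x, -1, 1))
            x -= step * ((x - y) + beta * lap)                  # data term + prior
        print("noise std before/after:", (y - truth).std(), (x - truth).std())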

  3. Bayesian programming

    CERN Document Server

    Bessiere, Pierre; Ahuactzin, Juan Manuel; Mekhnacha, Kamel

    2013-01-01

    Probability as an Alternative to Boolean Logic. While logic is the mathematical foundation of rational reasoning and the fundamental principle of computing, it is restricted to problems where information is both complete and certain. However, many real-world problems, from financial investments to email filtering, are incomplete or uncertain in nature. Probability theory and Bayesian computing together provide an alternative framework to deal with incomplete and uncertain data. Decision-Making Tools and Methods for Incomplete and Uncertain Data. Emphasizing probability as an alternative to Boolean

  4. Bayesian methods outperform parsimony but at the expense of precision in the estimation of phylogeny from discrete morphological data.

    Science.gov (United States)

    O'Reilly, Joseph E; Puttick, Mark N; Parry, Luke; Tanner, Alastair R; Tarver, James E; Fleming, James; Pisani, Davide; Donoghue, Philip C J

    2016-04-01

    Different analytical methods can yield competing interpretations of evolutionary history and, currently, there is no definitive method for phylogenetic reconstruction using morphological data. Parsimony has been the primary method for analysing morphological data, but there has been a resurgence of interest in the likelihood-based Mk-model. Here, we test the performance of the Bayesian implementation of the Mk-model relative to both equal and implied-weight implementations of parsimony. Using simulated morphological data, we demonstrate that the Mk-model outperforms equal-weights parsimony in terms of topological accuracy, and implied-weights performs the most poorly. However, the Mk-model produces phylogenies that have less resolution than parsimony methods. This difference in the accuracy and precision of parsimony and Bayesian approaches to topology estimation needs to be considered when selecting a method for phylogeny reconstruction. PMID:27095266

  5. Character State Reconstruction of Call Diversity in the Neoconocephalus Katydids Reveals High Levels of Convergence

    OpenAIRE

    Frederick, Katy; Schul, Johannes

    2016-01-01

    The katydid genus Neoconocephalus is characterized by high diversity of the acoustic communication system. Both male signals and female preferences have been thoroughly studied in the past. This study used Bayesian character state reconstruction to elucidate the evolutionary history of diverse call traits, based on an existing, well supported phylogenetic hypothesis. The most common male call pattern consisted of continuous calls comprising one fast pulse rate; this pattern is the likely ance...

  6. Bayesian large-scale structure inference and cosmic web analysis

    CERN Document Server

    Leclercq, Florent

    2015-01-01

    Surveys of the cosmic large-scale structure carry opportunities for building and testing cosmological theories about the origin and evolution of the Universe. This endeavor requires appropriate data assimilation tools, for establishing the contact between survey catalogs and models of structure formation. In this thesis, we present an innovative statistical approach for the ab initio simultaneous analysis of the formation history and morphology of the cosmic web: the BORG algorithm infers the primordial density fluctuations and produces physical reconstructions of the dark matter distribution that underlies observed galaxies, by assimilating the survey data into a cosmological structure formation model. The method, based on Bayesian probability theory, provides accurate means of uncertainty quantification. We demonstrate the application of BORG to the Sloan Digital Sky Survey data and describe the primordial and late-time large-scale structure in the observed volume. We show how the approach has led to the fi...

  7. Numts help to reconstruct the demographic history of the ocellated lizard (Lacerta lepida) in a secondary contact zone

    DEFF Research Database (Denmark)

    Miraldo, Andreia; Hewitt, Godfrey M; Dear, Paul H; Paulo, Octavio S; Emerson, Brent C

    2012-01-01

    numts in L3 after secondary contact occurred prior to, or coincident with, the northward expansion of L3. This study shows that, in the context of phylogeographic analysis, numts can provide evidence for past demographic events and can be useful tools for the reconstruction of complex evolutionary...

  8. Bayesian artificial intelligence

    CERN Document Server

    Korb, Kevin B

    2010-01-01

    Updated and expanded, Bayesian Artificial Intelligence, Second Edition provides a practical and accessible introduction to the main concepts, foundation, and applications of Bayesian networks. It focuses on both the causal discovery of networks and Bayesian inference procedures. Adopting a causal interpretation of Bayesian networks, the authors discuss the use of Bayesian networks for causal modeling. They also draw on their own applied research to illustrate various applications of the technology. New to the Second Edition: a new chapter on Bayesian network classifiers; a new section on object-oriente

  9. Bayesian Graphical Models

    DEFF Research Database (Denmark)

    Jensen, Finn Verner; Nielsen, Thomas Dyhre

    2016-01-01

    Mathematically, a Bayesian graphical model is a compact representation of the joint probability distribution for a set of variables. The most frequently used type of Bayesian graphical models are Bayesian networks. The structural part of a Bayesian graphical model is a graph consisting of nodes and ... largely due to the availability of efficient inference algorithms for answering probabilistic queries about the states of the variables in the network. Furthermore, to support the construction of Bayesian network models, learning algorithms are also available. We give an overview of the Bayesian network...

  10. 210Pb-derived ages for the reconstruction of terrestrial contaminant history into the Mexican Pacific coast: Potential and limitations

    International Nuclear Information System (INIS)

    210Pb is widely used for dating recent sediments in the aquatic environment; however, our experience working in shallow coastal environments on the Pacific coast of Mexico has demonstrated that the potential of 210Pb for reliable historical reconstructions might be limited by the low 210Pb atmospheric fallout, sediment mixing, the abundance of coarse sediments, and the lack of a 137Cs signal for 210Pb corroboration. This work discusses the difficulties in obtaining adequate sedimentary records for geochronological reconstruction in such active and complex settings, including examples of 210Pb geochronologies based on sediment profiles collected in two contrasting coastal areas (mudflats associated with coastal lagoons of Sinaloa State and the continental shelf of the Gulf of Tehuantepec), in which geochemical data were used to support the temporal framework established and the changes in sediment supply recorded in the sediment cores, which were related to the development of land-based activities during the last century.
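
    For readers unfamiliar with the dating step itself, the constant initial concentration (CIC) variant of 210Pb dating reduces to t = ln(A0/Az)/λ, with λ fixed by the 22.3-year half-life of 210Pb. The activities below are invented round numbers, not data from the cores discussed above.

        # Sketch: 210Pb ages under the constant initial concentration (CIC) model.
        import numpy as np

        lam = np.log(2) / 22.3                    # decay constant (1/yr), T1/2 = 22.3 yr
        A0 = 120.0                                # excess 210Pb activity at surface (Bq/kg)
        A = np.array([120.0, 80.0, 45.0, 20.0])   # activities down-core (Bq/kg)
        ages = np.log(A0 / A) / lam               # years before core collection
        print(np.round(ages, 1))                  # -> [ 0.  13.  31.6  57.6] (years)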

  11. Reconstructing the Phylogenetic History of Long-Term Effective Population Size and Life-History Traits Using Patterns of Amino Acid Replacement in Mitochondrial Genomes of Mammals and Birds

    Science.gov (United States)

    Nabholz, Benoit; Lartillot, Nicolas

    2013-01-01

    The nearly neutral theory, which proposes that most mutations are deleterious or close to neutral, predicts that the ratio of nonsynonymous over synonymous substitution rates (dN/dS), and potentially also the ratio of radical over conservative amino acid replacement rates (Kr/Kc), are negatively correlated with effective population size. Previous empirical tests, using life-history traits (LHT) such as body-size or generation-time as proxies for population size, have been consistent with these predictions. This suggests that large-scale phylogenetic reconstructions of dN/dS or Kr/Kc might reveal interesting macroevolutionary patterns in the variation in effective population size among lineages. In this work, we further develop an integrative probabilistic framework for phylogenetic covariance analysis introduced previously, so as to estimate the correlation patterns between dN/dS, Kr/Kc, and three LHT, in mitochondrial genomes of birds and mammals. Kr/Kc displays stronger and more stable correlations with LHT than does dN/dS, which we interpret as a greater robustness of Kr/Kc, compared with dN/dS, the latter being confounded by the high saturation of the synonymous substitution rate in mitochondrial genomes. The correlation of Kr/Kc with LHT was robust when controlling for the potentially confounding effects of nucleotide compositional variation between taxa. The positive correlation of the mitochondrial Kr/Kc with LHT is compatible with previous reports, and with a nearly neutral interpretation, although alternative explanations are also possible. The Kr/Kc model was finally used for reconstructing life-history evolution in birds and mammals. This analysis suggests a fairly large-bodied ancestor in both groups. In birds, life-history evolution seems to have occurred mainly through size reduction in Neoavian birds, whereas in placental mammals, body mass evolution shows disparate trends across subclades. Altogether, our work represents a further step toward a more

  12. Bayesian data analysis

    CERN Document Server

    Gelman, Andrew; Stern, Hal S; Dunson, David B; Vehtari, Aki; Rubin, Donald B

    2013-01-01

    FUNDAMENTALS OF BAYESIAN INFERENCE: Probability and Inference; Single-Parameter Models; Introduction to Multiparameter Models; Asymptotics and Connections to Non-Bayesian Approaches; Hierarchical Models. FUNDAMENTALS OF BAYESIAN DATA ANALYSIS: Model Checking; Evaluating, Comparing, and Expanding Models; Modeling Accounting for Data Collection; Decision Analysis. ADVANCED COMPUTATION: Introduction to Bayesian Computation; Basics of Markov Chain Simulation; Computationally Efficient Markov Chain Simulation; Modal and Distributional Approximations. REGRESSION MODELS: Introduction to Regression Models; Hierarchical Linear

  13. Bayesian Mediation Analysis

    OpenAIRE

    Yuan, Ying; MacKinnon, David P.

    2009-01-01

    This article proposes Bayesian analysis of mediation effects. Compared to conventional frequentist mediation analysis, the Bayesian approach has several advantages. First, it allows researchers to incorporate prior information into the mediation analysis, thus potentially improving the efficiency of estimates. Second, under the Bayesian mediation analysis, inference is straightforward and exact, which makes it appealing for studies with small samples. Third, the Bayesian approach is conceptua...

  14. 3D Surface Reconstruction and Automatic Camera Calibration

    Science.gov (United States)

    Jalobeanu, Andre

    2004-01-01

    This view-graph presentation illustrates a Bayesian approach to 3D surface reconstruction and camera calibration. Existing methods, surface analysis and modeling, preliminary surface reconstruction results, and potential applications are addressed.

  15. Bayesian Games with Intentions

    OpenAIRE

    Bjorndahl, Adam; Halpern, Joseph Y.; Pass, Rafael

    2016-01-01

    We show that standard Bayesian games cannot represent the full spectrum of belief-dependent preferences. However, by introducing a fundamental distinction between intended and actual strategies, we remove this limitation. We define Bayesian games with intentions, generalizing both Bayesian games and psychological games, and prove that Nash equilibria in psychological games correspond to a special class of equilibria as defined in our setting.

  16. Investigating Efficiency of Time Domain Curve fitters Versus Filtering for Rectification of Displacement Histories Reconstructed from Acceleration Measurements

    DEFF Research Database (Denmark)

    Sichani, Mahdi Teimouri; Brincker, Rune

    2008-01-01

    Computing the displacements of a structure from its measured accelerations has been a major concern in some fields of engineering, such as earthquake engineering. In vibration engineering, too, displacements are occasionally preferred to acceleration histories, e.g. in the determination of forces applied on
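
    A minimal sketch of the filtering route that such studies weigh against time-domain curve fitting: high-pass filter the measured acceleration to suppress low-frequency drift, then integrate twice. The synthetic record below is a 1.5 Hz sinusoid of 1 cm displacement amplitude; the cutoff frequency and filter order are arbitrary illustrative choices.

        # Sketch: displacement from acceleration via high-pass filtering + integration.
        import numpy as np
        from scipy.integrate import cumulative_trapezoid
        from scipy.signal import butter, filtfilt

        fs = 200.0                                   # sampling rate (Hz)
        t = np.arange(0.0, 10.0, 1.0 / fs)
        w = 2 * np.pi * 1.5                          # 1.5 Hz motion
        acc = -w**2 * 0.01 * np.sin(w * t)           # acceleration of x = 0.01 sin(wt)

        b, a = butter(4, 0.2 / (fs / 2), btype="highpass")   # remove drift < 0.2 Hz
        vel = cumulative_trapezoid(filtfilt(b, a, acc), t, initial=0.0)
        disp = cumulative_trapezoid(filtfilt(b, a, vel), t, initial=0.0)
        print("peak displacement (m):", np.abs(disp).max())  # close to 0.01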

  17. Sclerochronology - a highly versatile tool for mariculture and reconstruction of life history traits of the queen conch, Strombus gigas (Gastropoda)

    Science.gov (United States)

    Radermacher, Pascal; Schöne, Bernd R.; Gischler, Eberhard; Oschmann, Wolfgang; Thébault, Julien; Fiebig, Jens

    2010-05-01

    The shell of the queen conch Strombus gigas provides a rapidly growing palaeoenvironmental proxy archive, allowing the detailed reconstruction of important life-history traits such as ontogeny, growth rate and growth seasonality. In this study, modern sclerochronological methods are used to cross-date the palaeotemperatures derived from the shell with local sea surface temperature (SST) records. The growth history of the shell suggests a bimodal seasonality in growth, with the growing season confined to the interval between April and November. In Glovers Reef, offshore Belize, the queen conch accreted shell carbonate at rates of up to 6 mm per day during the spring (April-June) and autumn (September-November). However, a reduced period of growth occurred during the mid-summer months (July-August). The shell growth patterns indicate a positive response to annual seasonality with regards to precipitation. It seems likely that when precipitation levels are high, food availability is increased as the result of nutrient input to the ecosystem in correspondence with an increase in coastal runoff. Slow growth rates occur when precipitation, and as a consequence riverine runoff, is low. The SST, however, appears to influence growth only on a secondary level. Despite the bimodal growing season and the winter cessation in growth, the growth rates reconstructed here from two S. gigas shells are among the fastest yet reported for this species. The S. gigas specimens from Belize reached their final shell height (of 22.7 and 23.5 cm in distance between the apex and the siphonal notch) at the transition to adulthood in just 2 years. The extremely rapid growth observed in this species permits detailed, high-resolution reconstructions of life-history traits, where sub-daily resolutions can be achieved with ease. The potential for future studies has yet to be fully explored. Queen conch sclerochronology provides an opportunity to recover extremely high-resolution palaeotemperature

  18. (No) Limits to Anglo-American Accounting? Reconstructing the History of the International Accounting Standards Committee ; A Review Article

    OpenAIRE

    Botzem, S.; S Quack

    2009-01-01

    The development of the current International Accounting Standards Board (IASB) from the earlier International Accounting Standards Committee (IASC) provides insight into many issues of international financial reporting, among them the characteristics of international accounting standards themselves. This article reviews Camfferman and Zeff’s [Camfferman, K., & Zeff, S. A. (2007). Financial reporting and global capital markets. A history of the international accounting standards committee 1973...

  19. Impact of Quaternary climatic changes and interspecific competition on the demographic history of a highly mobile generalist carnivore, the coyote

    OpenAIRE

    Koblmüller, S; Robert K Wayne; Leonard, Jennifer A.

    2012-01-01

    Recurrent cycles of climatic change during the Quaternary period have dramatically affected the population genetic structure of many species. We reconstruct the recent demographic history of the coyote (Canis latrans) through the use of Bayesian techniques to examine the effects of Late Quaternary climatic perturbations on the genetic structure of a highly mobile generalist species. Our analysis reveals a lack of phylogeographic structure throughout the range but past population size changes ...

  20. The importance of natural history and research collections to environmental reconstruction and remediation, and the establishment of shifting baselines

    Science.gov (United States)

    Roopnarine, P. D.; Anderson, L.; Roopnarine, D.; Gillikin, D. P.; Leal, J.

    2012-12-01

    The Earth's environments are changing more rapidly today than at almost any time in the Phanerozoic. These changes are driven by human activities, and include climate change, landscape alteration, fragmentation and destruction, environmental pollution, species overexploitation, and invasive species. The rapidity of the changes challenges our best efforts to document what is changing, how it has changed, and what has been lost. Central to these efforts, therefore, is the proper documentation, archiving and curation of past environments. Natural history and other research collections form the core of this documentation, and have proven vital to recent studies of environmental change. Those collections are, however, generally under-utilized and under-appreciated by the general research community. Also, their utility is hampered by insufficient availability of the data, and the very nature of what has been collected in the past. Past collections emphasized a typological approach, placing emphasis on individual specimens and diversity, whether geological or biological, while what is needed today is greater emphasis on archiving entire environments. The concept of shifting baselines establishes that even on historical time scales, the notion of what constitutes an unaltered environment is biased by a lack of documentation and understanding of environments in the recent past. Baselines are necessary, however, for the proper implementation of mitigating procedures, for environmental restoration or remediation, and for predicting the near-term future. Here we present results from a study of impacts of the Deepwater Horizon oil spill (DWH) on the American oyster Crassostrea virginica. Natural history collections of specimens from the Gulf and elsewhere have been crucial to this effort, and serve as an example of how important such collections are to current events. We are examining the effects of spill exposure on shell growth and tissue development, as well as the potential

  1. Reconstructing the sedimentation history of the Bengal Delta Plain by means of geochemical and stable isotopic data

    International Nuclear Information System (INIS)

    The purpose of this study is to examine the sedimentation history of the central floodplain area of the Bengal Delta Plain in West Bengal, India. Sediments from two boreholes were analyzed regarding lithology, geochemistry and the stable isotopic composition of embedded organic matter. Different lithofacies were distinguished that reflect frequent changes in the prevailing sedimentary depositional environment of the study area. The lowest facies comprises poorly sorted fluvial sediments composed of fine gravel to clay pointing at high transport energy and intense relocation processes. This facies is considered to belong to an early Holocene lowstand systems tract that followed the last glacial maximum. Fine to medium sands above it mark a gradual change towards a transgressive systems tract. Upwards increasing proportions of silt and the stable isotopic composition of embedded organic matter both indicate a gradual change from fluvial channel infill sediments towards more estuarine and marine influenced deposits. Youngest sediments are composed of clayey and silty overbank deposits of the Hooghly River that have formed a vast low-relief delta-floodplain. Close to the surface, small concretions of secondary Mn-oxides and Fe-(oxyhydr)oxides occur and mark the fluctuation range of the unsaturated zone. These concretions are accompanied by relatively high contents of trace elements such as Zn, Ni, Cu, and As. To sum up, the outcomes of this study provide new insights into the complex sedimentation history of the barely investigated central floodplain area of West Bengal

  2. Reconstructing lake evaporation history and the isotopic composition of precipitation by a coupled δ18O-δ2H biomarker approach

    Science.gov (United States)

    Hepp, Johannes; Tuthorn, Mario; Zech, Roland; Mügler, Ines; Schlütz, Frank; Zech, Wolfgang; Zech, Michael

    2015-10-01

    Over the past decades, δ18O and δ2H analyses of lacustrine sediments became an invaluable tool in paleohydrology and paleolimnology for reconstructing the isotopic composition of past lake water and precipitation. However, based on δ18O or δ2H records alone, it can be challenging to distinguish between changes of the precipitation signal and changes caused by evaporation. Here we propose a coupled δ18O-δ2H biomarker approach that makes it possible to disentangle these two factors. The isotopic compositions of long-chain n-alkanes (n-C25, n-C27, n-C29, n-C31) were analyzed in order to establish a 16 ka Late Glacial and Holocene δ2H record for the sediment archive of Lake Panch Pokhari in the High Himalaya, Nepal. The δ2Hn-alkane record generally corroborates a previously established δ18Osugar record reporting on high values characterizing the deglaciation and the Older and the Younger Dryas, and low values characterizing the Bølling and the Allerød periods. Since the investigated n-alkane and sugar biomarkers are considered to be primarily of aquatic origin, they were used to reconstruct the isotopic composition of lake water. The reconstructed deuterium excess of lake water ranges from +57‰ to -85‰ and is shown to serve as a proxy for the evaporation history of Lake Panch Pokhari. Lake desiccation during the deglaciation, the Older Dryas and the Younger Dryas is affirmed by a multi-proxy approach using the Hydrogen Index (HI) and the carbon to nitrogen ratio (C/N) as additional proxies for lake sediment organic matter mineralization. Furthermore, the coupled δ18O and δ2H approach allows disentangling the lake water isotopic enrichment from variations of the isotopic composition of precipitation. The reconstructed 16 ka δ18Oprecipitation record of Lake Panch Pokhari is well in agreement with the δ18O records of Chinese speleothems and presumably reflects the Indian Summer Monsoon variability.
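
    The quantity doing the work in such a coupled approach is the deuterium excess, d = δ2H - 8·δ18O (Dansgaard's definition): unevaporated meteoric water plots near d = +10‰, while evaporative enrichment drives d strongly negative. A minimal sketch with invented values, not the Lake Panch Pokhari data:

        # Sketch: deuterium excess as an evaporation indicator.
        def d_excess(delta2h, delta18o):
            """d-excess in per mil, from delta values in per mil (Dansgaard)."""
            return delta2h - 8.0 * delta18o

        print(d_excess(-50.0, -7.5))   # +10: on the global meteoric water line
        print(d_excess(-60.0, -3.0))   # -36: strongly evaporated lake water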

  3. The Bayesian Bootstrap

    OpenAIRE

    Rubin, Donald B.

    1981-01-01

    The Bayesian bootstrap is the Bayesian analogue of the bootstrap. Instead of simulating the sampling distribution of a statistic estimating a parameter, the Bayesian bootstrap simulates the posterior distribution of the parameter; operationally and inferentially the methods are quite similar. Because both methods of drawing inferences are based on somewhat peculiar model assumptions and the resulting inferences are generally sensitive to these assumptions, neither method should be applied wit...
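
    Operationally, Rubin's scheme replaces resampling of the data with Dirichlet(1, ..., 1) weights on the observed values; each weight draw yields one posterior draw of the statistic. A minimal sketch for the mean, with invented data:

        # Sketch: Bayesian bootstrap posterior draws for the mean.
        import numpy as np

        rng = np.random.default_rng(3)
        data = np.array([2.1, 3.4, 2.9, 4.0, 3.3, 2.7])

        draws = [rng.dirichlet(np.ones(data.size)) @ data   # one weighted mean
                 for _ in range(5000)]
        print("posterior mean and sd of the mean:", np.mean(draws), np.std(draws))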

  4. Combining livestock production information in a process-based vegetation model to reconstruct the history of grassland management

    Science.gov (United States)

    Chang, Jinfeng; Ciais, Philippe; Herrero, Mario; Havlik, Petr; Campioli, Matteo; Zhang, Xianzhou; Bai, Yongfei; Viovy, Nicolas; Joiner, Joanna; Wang, Xuhui; Peng, Shushi; Yue, Chao; Piao, Shilong; Wang, Tao; Hauglustaine, Didier A.; Soussana, Jean-Francois; Peregon, Anna; Kosykh, Natalya; Mironycheva-Tokareva, Nina

    2016-06-01

    Grassland management type (grazed or mown) and intensity (intensive or extensive) play a crucial role in the greenhouse gas balance and surface energy budget of this biome, both at field scale and at large spatial scale. However, global gridded historical information on grassland management intensity is not available. Combining modelled grass-biomass productivity with statistics of the grass-biomass demand by livestock, we reconstruct gridded maps of grassland management intensity from 1901 to 2012. These maps include the minimum area of managed vs. maximum area of unmanaged grasslands and the fraction of mown vs. grazed area at a resolution of 0.5° by 0.5°. The grass-biomass demand is derived from a livestock dataset for 2000, extended to cover the period 1901-2012. The grass-biomass supply (i.e. forage grass from mown grassland and biomass grazed) is simulated by the process-based model ORCHIDEE-GM driven by historical climate change, rising CO2 concentration, and changes in nitrogen fertilization. The global area of managed grassland obtained in this study increases from 6.1 × 10⁶ km² in 1901 to 12.3 × 10⁶ km² in 2000, although the expansion pathway varies between different regions. ORCHIDEE-GM also simulated increases in global mean productivity and herbage-use efficiency over managed grassland during the 20th century, indicating a general intensification of grassland management at global scale but with regional differences. The gridded grassland management intensity maps are model dependent because they depend on modelled productivity. Thus specific attention was given to the evaluation of modelled productivity against a series of observations ranging from site-level net primary productivity (NPP) measurements to two global satellite products of gross primary productivity (GPP) (MODIS-GPP and SIF data). Generally, ORCHIDEE-GM captures the spatial pattern, seasonal cycle, and interannual variability of grassland productivity at global scale well and thus is

  5. Contribution of analytical nuclear techniques in the reconstruction of the Brazilian pre-history analysing archaeological ceramics of Tupiguarani tradition

    Energy Technology Data Exchange (ETDEWEB)

    Faria, Gleikam Lopes de Oliveira; Menezes, Maria Angela de B.C.; Silva, Maria Aparecida, E-mail: menezes@cdtn.br, E-mail: cida@cdtn.br [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil). Servico de Reator e Tecnicas Analiticas. Laboratorio de Ativacao Neutronica; Sabino, Claudia de V.S. [PUC-Minas, Belo Horizonte, MG (Brazil)

    2011-07-01

    Due to the high importance of material vestiges for the culture of a nation, the Brazilian Council for the Environment determined that the license to establish new enterprises is subject to a technical report concerning environmental impact, including archaeological sites affected by the enterprise. Therefore, answering the report related to the Program for Prospection and Rescue of the Archaeological Patrimony of the Areas impacted by the installation of the Second Line of the Samarco Mining Pipeline, archaeological interventions were carried out along the coast of Espirito Santo. Tupi-Guarani Tradition vestiges were found there, the main evidence being an interesting ceramic ware. Archaeology can fill the gap between ancient populations and modern society by elucidating the evidence found in archaeological sites. In this context, several ceramic fragments found in the archaeological sites - Hiuton and Bota-Fora - were analyzed by the neutron activation technique, k0-standardization method, at CDTN using the TRIGA MARK I IPR-R1 reactor, in order to characterize their elemental composition. The elements As, Ba, Br, Ce, Co, Cr, Cs, Eu, Fe, Ga, Hf, K, La, Na, Nd, Rb, Sb, Sc, Sm, Ta, Tb, Th, U, Yb, Zn and Zr were determined. Applying a robust multivariate statistical analysis in the R software, the results pointed out that the pottery from the sites was made with clay from different sources. X-ray powder diffraction analyses were carried out to determine the mineral composition, and Mössbauer spectroscopy was applied to provide information on both the degree of burning and the firing atmosphere, in order to reconstruct the firing temperatures and strategies the Indians used in pottery production. (author)

  6. Bayesian statistics an introduction

    CERN Document Server

    Lee, Peter M

    2012-01-01

    Bayesian Statistics is the school of thought that combines prior beliefs with the likelihood of a hypothesis to arrive at posterior beliefs. The first edition of Peter Lee’s book appeared in 1989, but the subject has moved ever onwards, with increasing emphasis on Monte Carlo based techniques. This new fourth edition looks at recent techniques such as variational methods, Bayesian importance sampling, approximate Bayesian computation and Reversible Jump Markov Chain Monte Carlo (RJMCMC), providing a concise account of the way in which the Bayesian approach to statistics develops as wel

  7. Understanding Computational Bayesian Statistics

    CERN Document Server

    Bolstad, William M

    2011-01-01

    A hands-on introduction to computational statistics from a Bayesian point of view Providing a solid grounding in statistics while uniquely covering the topics from a Bayesian perspective, Understanding Computational Bayesian Statistics successfully guides readers through this new, cutting-edge approach. With its hands-on treatment of the topic, the book shows how samples can be drawn from the posterior distribution when the formula giving its shape is all that is known, and how Bayesian inferences can be based on these samples from the posterior. These ideas are illustrated on common statistic

  8. An introduction to Gaussian Bayesian networks.

    Science.gov (United States)

    Grzegorczyk, Marco

    2010-01-01

    The extraction of regulatory networks and pathways from postgenomic data is important for drug discovery and development, as the extracted pathways reveal how genes or proteins regulate each other. Following up on the seminal paper of Friedman et al. (J Comput Biol 7:601-620, 2000), Bayesian networks have been widely applied as a popular tool to this end in systems biology research. Their popularity stems from the tractability of the marginal likelihood of the network structure, which is a consistent scoring scheme in the Bayesian context. This score is based on an integration over the entire parameter space, for which highly expensive computational procedures have to be applied when using more complex models based on differential equations; for example, see (Bioinformatics 24:833-839, 2008). This chapter gives an introduction to reverse engineering regulatory networks and pathways with Gaussian Bayesian networks, that is, Bayesian networks with the probabilistic BGe scoring metric [see (Geiger and Heckerman 235-243, 1995)]. In the BGe model, the data are assumed to stem from a Gaussian distribution and a normal-Wishart prior is assigned to the unknown parameters. Gaussian Bayesian network methodology for analysing static observational, static interventional, as well as dynamic (observational) time series data will be described in detail in this chapter. Finally, we apply these Bayesian network inference methods (1) to observational and interventional flow cytometry (protein) data from the well-known RAF pathway to evaluate the global network reconstruction accuracy of Bayesian network inference and (2) to dynamic gene expression time series data of nine circadian genes in Arabidopsis thaliana to reverse engineer the unknown regulatory network topology for this domain. PMID:20824469
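
    The generative assumption behind the BGe metric, each node a linear Gaussian function of its parents, can be made concrete with a three-node chain X -> Y -> Z: sample ancestrally, then verify that X and Z decorrelate once the linear effect of Y is removed. The coefficients and noise levels below are arbitrary illustrative choices.

        # Sketch: a linear Gaussian Bayesian network X -> Y -> Z.
        import numpy as np

        rng = np.random.default_rng(4)
        n = 10_000
        x = rng.normal(size=n)                    # root node
        y = 0.8 * x + 0.5 * rng.normal(size=n)    # Y | X
        z = -0.6 * y + 0.5 * rng.normal(size=n)   # Z | Y

        print("corr(X, Z):", np.corrcoef(x, z)[0, 1])          # clearly non-zero
        resid = z - np.polyfit(y, z, 1)[0] * y                 # remove linear effect of Y
        print("corr(X, Z | Y):", np.corrcoef(x, resid)[0, 1])  # near zero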

  9. Reconstructing southern Greenland Ice Sheet history during the intensification of Northern Hemisphere glaciation: Insights from IODP Site U1307

    Science.gov (United States)

    Blake-Mizen, Keziah; Bailey, Ian; Carlson, Anders; Stoner, Joe; Hatfield, Rob; Xuan, Chuang; Lawrence, Kira

    2016-04-01

    Should it ever melt entirely, the Greenland Ice Sheet (GIS) would contribute ~7 metres of global sea-level rise. Understanding how the GIS might respond to anthropogenic-induced global warming over the coming century is therefore important. Central to this goal is constraining how this ice sheet has responded to radiative forcing during both warmer- and colder-than-present climate states in the geological past. Little is known in detail, however, about the GIS prior to the Late Pleistocene, and large uncertainty exists in our understanding of its history across the last great climate transition of the Cenozoic, the intensification of Northern Hemisphere glaciation (iNHG; ~3.6-2.4 Ma). This time encompasses two intervals of interest: (1) the mid-Piacenzian warm period (mPWP, ~3.3-3 Ma), widely considered an analogue for a future equilibrium climate state when atmospheric CO2 levels were comparable to modern (~400 ppmv) and sea-level and global temperatures were elevated relative to today (by ~25 metres and ~2-3°C), and (2) a subsequent gradual deterioration in global climate and decline in atmospheric CO2 that led to the development of Quaternary-magnitude glaciations from ~2.5 Ma. Important unresolved questions include: to what extent did the southern GIS retreat during the mPWP, and when did a modern-day sized GIS first develop during iNHG? To tackle these issues our project focuses on the southern GIS history that can be extracted from Eirik Drift IODP Site U1307 between ~3.3 and 2.2 Ma. To achieve this we are developing an independent orbital-resolution age model, one of the first for high-latitude marine sediments deposited during iNHG, by producing a relative paleointensity (RPI) record for Site U1307; and generating multi-proxy geochemical and sedimentological datasets that track the provenance of the sand and bulk terrigenous sediment fraction glacially eroded by the southern GIS and delivered to the study site by both ice-rafting and the Western

  10. Space Shuttle RTOS Bayesian Network

    Science.gov (United States)

    Morris, A. Terry; Beling, Peter A.

    2001-01-01

    With shrinking budgets and the requirements to increase reliability and operational life of the existing orbiter fleet, NASA has proposed various upgrades for the Space Shuttle that are consistent with national space policy. The cockpit avionics upgrade (CAU), a high priority item, has been selected as the next major upgrade. The primary functions of cockpit avionics include flight control, guidance and navigation, communication, and orbiter landing support. Secondary functions include the provision of operational services for non-avionics systems such as data handling for the payloads and caution and warning alerts to the crew. Recently, a process to select the optimal commercial-off-the-shelf (COTS) real-time operating system (RTOS) for the CAU was conducted by United Space Alliance (USA) Corporation, which is a joint venture between Boeing and Lockheed Martin, the prime contractor for space shuttle operations. In order to independently assess the RTOS selection, NASA has used the Bayesian network-based scoring methodology described in this paper. Our two-stage methodology addresses the issue of RTOS acceptability by incorporating functional, performance and non-functional software measures related to reliability, interoperability, certifiability, efficiency, correctness, business, legal, product history, cost and life cycle. The first stage of the methodology involves obtaining scores for the various measures using a Bayesian network. The Bayesian network incorporates the causal relationships between the various and often competing measures of interest while also assisting the inherently complex decision analysis process with its ability to reason under uncertainty. The structure and selection of prior probabilities for the network is extracted from experts in the field of real-time operating systems. Scores for the various measures are computed using Bayesian probability. In the second stage, multi-criteria trade-off analyses are performed between the scores
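
    At its smallest scale, the scoring step described above is Bayesian updating of a quality node by an expert-elicited likelihood. The numbers below are invented for illustration and are not the elicited values used in the CAU assessment:

        # Sketch: one Bayesian update of a binary "RTOS meets target" node.
        p_good = 0.5              # prior that the RTOS meets the reliability target
        p_pass_good = 0.9         # elicited P(test passes | good)
        p_pass_bad = 0.3          # elicited P(test passes | bad)

        evidence = p_pass_good * p_good + p_pass_bad * (1 - p_good)
        posterior = p_pass_good * p_good / evidence
        print("P(good | test passed) =", posterior)   # 0.75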

  11. Galaxy formation in the Planck cosmology III: star-formation histories and post-processing magnitude reconstruction

    CERN Document Server

    Shamshiri, Sorour; Henriques, Bruno M; Tojeiro, Rita; Lemson, Gerard; Oliver, Seb J; Wilkins, Stephen

    2015-01-01

    We adapt the L-Galaxies semi-analytic model to follow the star-formation histories (SFH) of galaxies -- by which we mean a record of the formation time and metallicities of the stars that are present in each galaxy at a given time. We use these to construct stellar spectra in post-processing, which offers large efficiency savings and allows user-defined spectral bands and dust models to be applied to data stored in the Millennium data repository. We contrast model SFHs from the Millennium Simulation with observed ones from the VESPA algorithm as applied to the SDSS-7 catalogue. The overall agreement is good, with both simulated and SDSS galaxies showing a steeper SFH with increased stellar mass. The SFHs of blue and red galaxies, however, show poor agreement between data and simulations, which may indicate that the termination of star formation is too abrupt in the models. The mean star-formation rate (SFR) of model galaxies is well-defined and is accurately modelled by a double power law at all redshifts: SF...

  12. Reconstructing the aqueous history within the southwestern Melas basin, Mars: Clues from stratigraphic and morphometric analyses of fans

    Science.gov (United States)

    Williams, Rebecca M. E.; Weitz, Catherine M.

    2014-11-01

    New details of the aqueous history of the southwestern Melas Chasma elevated basin have been revealed from analysis of high-resolution image, topographic and spectral datasets. We have identified eleven fan-shaped landforms that reflect various depositional environments. A distinctive marker bed with inferred indurated aeolian bedforms is within the stratigraphic record of presumed lacustrine deposits. This observation, taken together with the stratigraphic succession of fan-shaped deposits, indicates fluctuating lake levels with, at a minimum, early and late-stage lake highstands. The tributary drainage pattern in the western valley network changed from a dendritic to a meandering system, recording a shift in fluvial activity that is consistent with fluctuating lake levels. Only a few hydrated minerals have been detected in the study region, the most common being opal, which appears to represent younger alteration and deposition within the basin. Landform scale was used to estimate average discharge (~30 m³/s), formative discharge (200-300 m³/s), and fan formation timescale, which further inform the duration of lacustrine activity within the basin. Warm surface conditions and precipitation recharge of source basins are required to generate and sustain this long-lived lake over periods of at least centuries to millennia during the Late Hesperian to Early Amazonian.

  13. Reconstructing the evolutionary history of gypsy retrotransposons in the Périgord black truffle (Tuber melanosporum Vittad.).

    Science.gov (United States)

    Payen, Thibaut; Murat, Claude; Martin, Francis

    2016-08-01

    Truffles are ascomycete fungi belonging to the genus Tuber, and they form ectomycorrhizal associations with trees and shrubs. Transposable elements, mainly class 1 gypsy retrotransposons, constitute more than 50 % of the black Périgord truffle (Tuber melanosporum) genome, but their impact on its genome is unknown. The aims of this study are to investigate the diversity of gypsy retrotransposons in this species and their evolutionary history by analysing the reference genome and six resequenced genomes of different geographic accessions. Using the reverse transcriptase sequences, six different gypsy retrotransposon clades were identified. Tmt1 and Tmt6 are the most abundant transposable elements, representing 14 and 13 % of the T. melanosporum genome, respectively. Tmt6 showed a major burst of proliferation between 1 and 4 million years ago, but evidence of more recent transposition was observed. Except for Tmt2, the other clades tend to aggregate, and their mode of transposition excludes the master copy model. This suggests that each new copy has the same probability of transposing as other copies. This study provides a better view of the diversity and dynamic nature of gypsy retrotransposons in T. melanosporum. Even if the major gypsy retrotransposon bursts are old, some elements seem to have transposed recently, suggesting that they may continue to shape the truffle genomes. PMID:27025914

  14. New directions in hydro-climatic histories: observational data recovery, proxy records and the atmospheric circulation reconstructions over the earth (ACRE) initiative in Southeast Asia

    Science.gov (United States)

    Williamson, Fiona; Allan, Rob; Switzer, Adam D.; Chan, Johnny C. L.; Wasson, Robert James; D'Arrigo, Rosanne; Gartner, Richard

    2015-12-01

    The value of historic observational weather data for reconstructing long-term climate patterns and the detailed analysis of extreme weather events has long been recognized (Le Roy Ladurie, 1972; Lamb, 1977). In some regions, however, observational data have not been kept regularly over time, or their preservation and archiving has not been considered a priority by governmental agencies. This has been a particular problem in Southeast Asia, where there has been no systematic country-by-country method of keeping or preserving such data, where record keeping reaches back only a few decades, or where instability has threatened the survival of historic records. As a result, past observational data are fragmentary, scattered, or even absent altogether, and the further we go back in time, the more obvious the gaps. Observational data can be complemented, however, by historical documentary or proxy records of extreme events such as floods, droughts and other climatic anomalies. This review article highlights recent initiatives in sourcing, recovering, and preserving historical weather data and the potential for integrating the same with proxy (and other) records. In so doing, it focuses on regional initiatives for data research and recovery - particularly the work of the Southeast Asian regional arm (ACRE SEA) of the international Atmospheric Circulation Reconstructions over the Earth (ACRE) initiative - and the latter's role in bringing together disparate, but interrelated, projects working within this region. The overarching goal of the ACRE SEA initiative is to connect regional efforts and to build capacity within Southeast Asian institutions, agencies and National Meteorological and Hydrological Services (NMHS) to improve and extend historical instrumental, documentary and proxy databases of Southeast Asian hydroclimate, in order to contribute to the generation of high-quality, high-resolution historical hydroclimatic reconstructions (reanalyses) and to build linkages with humanities researchers

  15. On Fuzzy Bayesian Inference

    OpenAIRE

    Frühwirth-Schnatter, Sylvia

    1990-01-01

    In the paper at hand we apply fuzzy set theory to Bayesian statistics to obtain "Fuzzy Bayesian Inference". In the subsequent sections we discuss a fuzzy-valued likelihood function, Bayes' theorem for both fuzzy data and fuzzy priors, a fuzzy Bayes' estimator, fuzzy predictive densities and distributions, and fuzzy H.P.D. regions. (author's abstract)

  16. Bayesian Mediation Analysis

    Science.gov (United States)

    Yuan, Ying; MacKinnon, David P.

    2009-01-01

    In this article, we propose Bayesian analysis of mediation effects. Compared with conventional frequentist mediation analysis, the Bayesian approach has several advantages. First, it allows researchers to incorporate prior information into the mediation analysis, thus potentially improving the efficiency of estimates. Second, under the Bayesian…

  17. Reconstruction of pollution history of organic contaminants in the upper Gulf of Thailand by using sediment cores: First report from Tropical Asia Core (TACO) project

    International Nuclear Information System (INIS)

    This paper reports the first reconstruction of a pollution history in tropical Asia from sediment cores. Four sediment core samples were collected from an offshore transect in the upper Gulf of Thailand and were analyzed for organic micropollutants. The cores were dated by measurement of 137Cs and geochronometric molecular markers (linear alkylbenzenes, LABs; and tetrapropylene-type alkylbenzenes, TABs). Polychlorinated biphenyl (PCB) concentrations showed a subsurface maximum in layers corresponding to the 1970s, indicating the effectiveness of regulation of PCBs in Thailand. LAB concentrations increased over time, indicating the increase in input of sewage into the Gulf during the last 30 years. Hopanes, biomarkers of petroleum pollution, also increased over time, indicating that the inputs of automobile-derived hydrocarbons to the coastal zone have been increasing owing to the increased number of cars in Thailand since the 1950s. Polycyclic aromatic hydrocarbons (PAHs) increased in the layers corresponding to the 1950s and 1960s, probably because of the increased inputs of automobile-derived PAHs. PAH concentrations in the upper layers corresponding to the 1970s and later remained constant or increased. The absence of a subsurface maximum of PAHs contrasts with results observed in industrialized countries. This can be explained by the facts that the Thai economy did not depend on coal as an energy source in the 1960s and that economic growth has continued since the 1970s to the present. The deposition flux of PAHs and hopanes showed a dramatic offshore decrease, whereas that of LABs was uniform.

  18. Inverse problems in the Bayesian framework

    International Nuclear Information System (INIS)

    The history of Bayesian methods dates back to the original works of Reverend Thomas Bayes and Pierre-Simon Laplace: the former laid down some of the basic principles of inverse probability in his classic article ‘An essay towards solving a problem in the doctrine of chances’, read posthumously to the Royal Society in 1763. Laplace, on the other hand, in his ‘Memoirs on inverse probability’ of 1774 developed the idea of updating beliefs and wrote down the celebrated Bayes’ formula in the form we know today. Although it had not yet been identified as a framework for investigating inverse problems, Laplace used the formalism very much in the spirit it is used today in the context of inverse problems, e.g., in his study of the distribution of comets. With the evolution of computational tools, Bayesian methods have become increasingly popular in all fields of human knowledge in which conclusions need to be drawn based on incomplete and noisy data. Needless to say, inverse problems, almost by definition, fall into this category. Systematic work on developing a Bayesian inverse problem framework can arguably be traced back to the 1980s (the original first edition being published by Elsevier in 1987), although articles on Bayesian methodology applied to inverse problems, in particular in geophysics, had appeared much earlier. Today, as testified by the articles in this special issue, the Bayesian methodology as a framework for considering inverse problems has gained a lot of popularity, and it has integrated very successfully with many traditional inverse problems ideas and techniques, providing novel ways to interpret and implement traditional procedures in numerical analysis, computational statistics, signal analysis and data assimilation. The range of applications where the Bayesian framework has been fundamental goes from geophysics, engineering and imaging to astronomy, life sciences and economics, and continues to grow. There is no question that Bayesian

  19. A Sparse Bayesian Estimation Framework for Conditioning Prior Geologic Models to Nonlinear Flow Measurements

    CERN Document Server

    Li, Lianlin

    2009-01-01

    We present a Bayesian framework for reconstruction of subsurface hydraulic properties from nonlinear dynamic flow data by imposing sparsity on the distribution of the solution coefficients in a compression transform domain.

  20. SOFOMORE: Combined EEG source and forward model reconstruction

    DEFF Research Database (Denmark)

    Stahlhut, Carsten; Mørup, Morten; Winther, Ole;

    2009-01-01

    We propose a new EEG source localization method that simultaneously performs source and forward model reconstruction (SOFOMORE) in a hierarchical Bayesian framework. Reconstruction of the forward model is motivated by the many uncertainties involved in the forward model, including the representation of the cortical surface, conductivity distribution, and electrode positions. We demonstrate in both simulated and real EEG data that reconstruction of the forward model improves localization of the underlying sources.

  1. Bayesian molecular phylogenetics: estimation of divergence dates and hypothesis testing

    OpenAIRE

    Aris-Brosou, S.

    2002-01-01

    With the advent of automated sequencing, sequence data are now available to help us understand the functioning of our genome, as well as its history. To date, powerful methods such as maximum likelihood have been used to estimate its mode and tempo of evolution and its branching pattern. However, these methods appear to have some limitations. The purpose of this thesis is to examine these issues in light of Bayesian modelling, taking advantage of some recent advances in Bayesian compu...

  2. Practical Bayesian Tomography

    CERN Document Server

    Granade, Christopher; Cory, D G

    2015-01-01

    In recent years, Bayesian methods have been proposed as a solution to a wide range of issues in quantum state and process tomography. State-of-the-art Bayesian tomography solutions suffer from three problems: numerical intractability, a lack of informative prior distributions, and an inability to track time-dependent processes. Here, we solve all three problems. First, we use modern statistical methods, as pioneered by Huszár and Houlsby and by Ferrie, to make Bayesian tomography numerically tractable. Our approach allows for practical computation of Bayesian point and region estimators for quantum states and channels. Second, we propose the first informative priors on quantum states and channels. Finally, we develop a method that allows online tracking of time-dependent states and estimates the drift and diffusion processes affecting a state. We provide source code and animated visual examples for our methods.

  3. Bayesian exploratory factor analysis

    OpenAIRE

    Gabriella Conti; Sylvia Frühwirth-Schnatter; James Heckman; Rémi Piatek

    2014-01-01

    This paper develops and applies a Bayesian approach to Exploratory Factor Analysis that improves on ad hoc classical approaches. Our framework relies on dedicated factor models and simultaneously determines the number of factors, the allocation of each measurement to a unique factor, and the corresponding factor loadings. Classical identifi cation criteria are applied and integrated into our Bayesian procedure to generate models that are stable and clearly interpretable. A Monte Carlo study c...

  4. Bayesian Exploratory Factor Analysis

    OpenAIRE

    Conti, Gabriella; Frühwirth-Schnatter, Sylvia; Heckman, James J.; Piatek, Rémi

    2014-01-01

    This paper develops and applies a Bayesian approach to Exploratory Factor Analysis that improves on ad hoc classical approaches. Our framework relies on dedicated factor models and simultaneously determines the number of factors, the allocation of each measurement to a unique factor, and the corresponding factor loadings. Classical identification criteria are applied and integrated into our Bayesian procedure to generate models that are stable and clearly interpretable. A Monte Carlo study co...

  5. Bayesian Exploratory Factor Analysis

    OpenAIRE

    Gabriella Conti; Sylvia Fruehwirth-Schnatter; Heckman, James J.; Remi Piatek

    2014-01-01

    This paper develops and applies a Bayesian approach to Exploratory Factor Analysis that improves on ad hoc classical approaches. Our framework relies on dedicated factor models and simultaneously determines the number of factors, the allocation of each measurement to a unique factor, and the corresponding factor loadings. Classical identification criteria are applied and integrated into our Bayesian procedure to generate models that are stable and clearly interpretable. A Monte Carlo s...

  6. Bayesian exploratory factor analysis

    OpenAIRE

    Conti, Gabriella; Frühwirth-Schnatter, Sylvia; Heckman, James J.; Piatek, Rémi

    2014-01-01

    This paper develops and applies a Bayesian approach to Exploratory Factor Analysis that improves on ad hoc classical approaches. Our framework relies on dedicated factor models and simultaneously determines the number of factors, the allocation of each measurement to a unique factor, and the corresponding factor loadings. Classical identification criteria are applied and integrated into our Bayesian procedure to generate models that are stable and clearly interpretable. A Monte Carlo st...

  7. Bayesian exploratory factor analysis

    OpenAIRE

    Conti, Gabriella; Frühwirth-Schnatter, Sylvia; Heckman, James; Piatek, Rémi

    2014-01-01

    This paper develops and applies a Bayesian approach to Exploratory Factor Analysis that improves on ad hoc classical approaches. Our framework relies on dedicated factor models and simultaneously determines the number of factors, the allocation of each measurement to a unique factor, and the corresponding factor loadings. Classical identification criteria are applied and integrated into our Bayesian procedure to generate models that are stable and clearly interpretable. A Monte Carlo study co...

  8. Nonparametric Bayesian Logic

    OpenAIRE

    Carbonetto, Peter; Kisynski, Jacek; De Freitas, Nando; Poole, David L

    2012-01-01

    The Bayesian Logic (BLOG) language was recently developed for defining first-order probability models over worlds with unknown numbers of objects. It handles important problems in AI, including data association and population estimation. This paper extends BLOG by adopting generative processes over function spaces - known as nonparametrics in the Bayesian literature. We introduce syntax for reasoning about arbitrary collections of objects, and their properties, in an intuitive manner. By expl...

  9. Bayesian default probability models

    OpenAIRE

    Andrlíková, Petra

    2014-01-01

    This paper proposes a methodology for default probability estimation for low default portfolios, where the statistical inference may become troublesome. The author suggests using logistic regression models with the Bayesian estimation of parameters. The piecewise logistic regression model and Box-Cox transformation of credit risk score is used to derive the estimates of probability of default, which extends the work by Neagu et al. (2009). The paper shows that the Bayesian models are more acc...

  10. Effects of Surgery and Chemotherapy on Metastatic Progression of Prostate Cancer: Evidence from the Natural History of the Disease Reconstructed through Mathematical Modeling

    Directory of Open Access Journals (Sweden)

    Leonid Hanin

    2011-09-01

    This article brings mathematical modeling to bear on the reconstruction of the natural history of prostate cancer and assessment of the effects of treatment on metastatic progression. We present a comprehensive, entirely mechanistic mathematical model of cancer progression accounting for primary tumor latency, shedding of metastases, their dormancy and growth at secondary sites. Parameters of the model were estimated from the following data collected from 12 prostate cancer patients: (1) age and volume of the primary tumor at presentation; and (2) volumes of detectable bone metastases surveyed at a later time. This allowed us to estimate, for each patient, the age at cancer onset and inception of the first metastasis, the expected metastasis latency time and the rates of growth of the primary tumor and metastases before and after the start of treatment. We found that for all patients: (1) inception of the first metastasis occurred when the primary tumor was undetectable; (2) inception of all or most of the surveyed metastases occurred before the start of treatment; (3) the rate of metastasis shedding is essentially constant in time regardless of the size of the primary tumor and so it is only marginally affected by treatment; and most importantly, (4) surgery, chemotherapy and possibly radiation bring about a dramatic increase (by dozens or hundreds of times for most patients) in the average rate of growth of metastases. Our analysis supports the notion of metastasis dormancy and the existence of prostate cancer stem cells. The model is applicable to all metastatic solid cancers, and our conclusions agree well with the results of a similar analysis based on a simpler model applied to a case of metastatic breast cancer.

  11. Reconstruction of pollution history of organic contaminants in the upper Gulf of Thailand by using sediment cores: First report from Tropical Asia Core (TACO) project

    Energy Technology Data Exchange (ETDEWEB)

    Boonyatumanond, Ruchaya [Environmental Research and Training Center, Pathumthani 12120 (Thailand); Wattayakorn, Gullaya [Department of Marine Science, Faculty of Science, Chulalongkorn University, Bangkok 10330 (Thailand); Amano, Atsuko [Graduate School of Science and Engineering, Ehime University, 2-5 Bunkyou-cho, Matsuyama (Japan); Inouchi, Yoshio [Center for Marine Environmental Studies, Ehime University, 2-5 Bunkyou-cho, Matsuyama (Japan); Takada, Hideshige [Laboratory of Organic Geochemistry, Tokyo University of Agriculture and Technology, Fuchu, Tokyo 183-8509 (Japan)]. E-mail: shige@cc.tuat.ac.jp

    2007-05-15

    This paper reports the first reconstruction of a pollution history in tropical Asia from sediment cores. Four sediment core samples were collected from an offshore transect in the upper Gulf of Thailand and were analyzed for organic micropollutants. The cores were dated by measurement of 137Cs and geochronometric molecular markers (linear alkylbenzenes, LABs; and tetrapropylene-type alkylbenzenes, TABs). Polychlorinated biphenyl (PCB) concentrations showed a subsurface maximum in layers corresponding to the 1970s, indicating the effectiveness of regulation of PCBs in Thailand. LAB concentrations increased over time, indicating the increase in input of sewage into the Gulf during the last 30 years. Hopanes, biomarkers of petroleum pollution, also increased over time, indicating that the inputs of automobile-derived hydrocarbons to the coastal zone have been increasing owing to the increased number of cars in Thailand since the 1950s. Polycyclic aromatic hydrocarbons (PAHs) increased in the layers corresponding to the 1950s and 1960s, probably because of the increased inputs of automobile-derived PAHs. PAH concentrations in the upper layers corresponding to the 1970s and later remained constant or increased. The absence of a subsurface maximum of PAHs contrasts with results observed in industrialized countries. This can be explained by the facts that the Thai economy did not depend on coal as an energy source in the 1960s and that economic growth has continued since the 1970s to the present. The deposition flux of PAHs and hopanes showed a dramatic offshore decrease, whereas that of LABs was uniform.

  12. Effects of Surgery and Chemotherapy on Metastatic Progression of Prostate Cancer: Evidence from the Natural History of the Disease Reconstructed through Mathematical Modeling

    International Nuclear Information System (INIS)

    This article brings mathematical modeling to bear on the reconstruction of the natural history of prostate cancer and assessment of the effects of treatment on metastatic progression. We present a comprehensive, entirely mechanistic mathematical model of cancer progression accounting for primary tumor latency, shedding of metastases, their dormancy and growth at secondary sites. Parameters of the model were estimated from the following data collected from 12 prostate cancer patients: (1) age and volume of the primary tumor at presentation; and (2) volumes of detectable bone metastases surveyed at a later time. This allowed us to estimate, for each patient, the age at cancer onset and inception of the first metastasis, the expected metastasis latency time and the rates of growth of the primary tumor and metastases before and after the start of treatment. We found that for all patients: (1) inception of the first metastasis occurred when the primary tumor was undetectable; (2) inception of all or most of the surveyed metastases occurred before the start of treatment; (3) the rate of metastasis shedding is essentially constant in time regardless of the size of the primary tumor and so it is only marginally affected by treatment; and most importantly, (4) surgery, chemotherapy and possibly radiation bring about a dramatic increase (by dozens or hundreds of times for most patients) in the average rate of growth of metastases. Our analysis supports the notion of metastasis dormancy and the existence of prostate cancer stem cells. The model is applicable to all metastatic solid cancers, and our conclusions agree well with the results of a similar analysis based on a simpler model applied to a case of metastatic breast cancer.

  13. Bayesian Vision for Shape Recovery

    Science.gov (United States)

    Jalobeanu, Andre

    2004-01-01

    We present a new Bayesian vision technique that aims at recovering a shape from two or more noisy observations taken under similar lighting conditions. The shape is parametrized by a piecewise linear height field, textured by a piecewise linear irradiance field, and we assume Gaussian Markovian priors for both shape vertices and irradiance variables. The observation process, also known as rendering, is modeled by a non-affine projection (e.g. perspective projection), followed by a convolution with a piecewise linear point spread function and contamination by additive Gaussian noise. We assume that the observation parameters are calibrated beforehand. The major novelty of the proposed method consists of marginalizing out the irradiances, considered as nuisance parameters, which is achieved by Laplace approximations. This reduces the inference to minimizing an energy that only depends on the shape vertices, and therefore allows an efficient Iterated Conditional Mode (ICM) optimization scheme to be implemented. A Gaussian approximation of the posterior shape density is computed, thus providing estimates of both the geometry and its uncertainty. We illustrate the effectiveness of the new method by shape reconstruction results in a 2D case. A 3D version is currently under development and aims at recovering a surface from multiple images, reconstructing the topography by marginalizing out both albedo and shading.
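
    The marginalization trick above is easy to show in miniature. The following numpy sketch is our illustration, not the authors' code: the Gaussian-bump forward operator, the parameter names and the grid scan are all invented. It integrates out a Gaussian nuisance field r from observations y = A(s) r + noise, leaving an evidence over the shape parameter s alone; for this linear-Gaussian toy the Laplace approximation used in the paper is exact.

        import numpy as np

        def marginal_neg_log_like(A, y, sigma2=0.01, tau2=1.0):
            # -log p(y | s) with nuisance r ~ N(0, tau2 I) integrated out; A = A(s).
            n = A.shape[0]
            C = sigma2 * np.eye(n) + tau2 * A @ A.T      # marginal covariance of y
            _, logdet = np.linalg.slogdet(C)
            return 0.5 * (y @ np.linalg.solve(C, y) + logdet + n * np.log(2 * np.pi))

        rng = np.random.default_rng(0)
        xs = np.linspace(0.0, 1.0, 40)
        centers = np.linspace(0.0, 1.0, 8)

        def design(s):                                   # hypothetical forward operator A(s)
            return np.exp(-(xs[:, None] - centers[None, :]) ** 2 / s ** 2)

        r_true = rng.normal(size=8)
        y = design(0.15) @ r_true + 0.1 * rng.normal(size=40)

        # Crude stand-in for the ICM optimisation over shape parameters: a grid scan.
        s_grid = np.linspace(0.05, 0.5, 60)
        nll = [marginal_neg_log_like(design(s), y) for s in s_grid]
        print("estimated width parameter:", s_grid[int(np.argmin(nll))])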

  14. The NIFTY way of Bayesian signal inference

    International Nuclear Information System (INIS)

    We introduce NIFTY, 'Numerical Information Field Theory', a software package for the development of Bayesian signal inference algorithms that operate independently from any underlying spatial grid and its resolution. A large number of Bayesian and Maximum Entropy methods for 1D signal reconstruction, 2D imaging, as well as 3D tomography, appear formally similar, but one often finds individualized implementations that are neither flexible nor easily transferable. Signal inference in the framework of NIFTY can be done in an abstract way, such that algorithms, prototyped in 1D, can be applied to real world problems in higher-dimensional settings. NIFTY as a versatile library is applicable and already has been applied in 1D, 2D, 3D and spherical settings. A recent application is the D3PO algorithm targeting the non-trivial task of denoising, deconvolving, and decomposing photon observations in high energy astronomy
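
    As a concrete stand-in for what such a package automates (plain numpy on one fixed grid, deliberately not NIFTY's own API; all names and numbers here are invented for illustration), the canonical linear example is the Wiener filter: for data d = R s + n with Gaussian prior s ~ N(0, S) and noise n ~ N(0, N), the posterior is Gaussian with precision S^-1 + R^T N^-1 R and mean m = D R^T N^-1 d.

        import numpy as np

        rng = np.random.default_rng(1)
        npix = 200
        x = np.linspace(0, 1, npix)
        # Smooth stationary prior covariance (squared-exponential kernel) plus jitter.
        S = np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2 / 0.05 ** 2)
        S += 1e-8 * np.eye(npix)

        s_true = rng.multivariate_normal(np.zeros(npix), S)   # a signal from the prior
        R = np.eye(npix)[::4]                 # instrument responds to every 4th pixel
        noise_var = 0.05
        d = R @ s_true + rng.normal(scale=noise_var ** 0.5, size=R.shape[0])

        D_inv = np.linalg.inv(S) + R.T @ R / noise_var   # posterior precision
        m = np.linalg.solve(D_inv, R.T @ d / noise_var)  # posterior mean (Wiener filter)
        print("rms reconstruction error:", np.sqrt(np.mean((m - s_true) ** 2)))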

  15. Dynamic Bayesian Network Model Based Golf Swing 3D Reconstruction Using Simple Depth Imaging Device

    Institute of Scientific and Technical Information of China (English)

    2015-01-01

    Motion capture systems based on simple depth imaging devices have attracted growing attention because they are cheaper and easier to use than traditional equipment. However, such devices offer low image resolution and suffer from occlusion and mixing up of body parts, so they lack the basic data needed for 3D motion reconstruction. This paper proposes a Dynamic Bayesian Network (DBN) model describing the spatial and temporal characteristics of human body joints, based on fusing the parent-child relationships of the joints with the multi-order Markov property of each joint during motion. Building on this DBN model and the similarity of golf swings, a swing capture and 3D reconstruction system, DBN-Motion (DBN-based Motion reconstruction system), is presented that uses a simple depth imaging device, Kinect, as the capture device. The proposed system effectively solves the problem of occlusion and mixing up of body parts, and successfully captures and reconstructs golf swings in 3D space. Experimental results show that the system achieves reconstruction accuracy comparable to a commercial optical motion capture system.

  16. Preliminary investigation of a Bayesian network for mammographic diagnosis of breast cancer.

    OpenAIRE

    Kahn, C. E.; Roberts, L. M.; K. Wang; Jenks, D.; Haddawy, P.

    1995-01-01

    Bayesian networks use the techniques of probability theory to reason under conditions of uncertainty. We investigated the use of Bayesian networks for radiological decision support. A Bayesian network for the interpretation of mammograms (MammoNet) was developed based on five patient-history features, two physical findings, and 15 mammographic features extracted by experienced radiologists. Conditional-probability data, such as sensitivity and specificity, were derived from peer-reviewed jour...

  17. Development and sustainability of NSF-funded climate change education efforts: lessons learned and strategies used to develop the Reconstructing Earth's Climate History (REaCH) curriculum (Invited)

    Science.gov (United States)

    St John, K. K.; Jones, M. H.; Leckie, R. M.; Pound, K. S.; Krissek, L. A.

    2013-12-01

    develop detailed instructor guides to accompany each module. After careful consideration of dissemination options, we chose to publish the full suite of exercise modules as a commercially available book, Reconstructing Earth's Climate History, while also providing open online access to a subset of modules. Its current use in undergraduate paleoclimatology courses, and the availability of select modules for use in other courses, demonstrate that creative, hybrid options can be found for lasting dissemination, and thus sustainability. In achieving our goal of making science accessible, we believe we have followed a curriculum development process and sustainability path that can be used by others to meet needs in earth, ocean, and atmospheric science education. Next steps for REaCH include exploration of its use in blended learning classrooms and at minority-serving institutions.

  18. Bayesian least squares deconvolution

    Science.gov (United States)

    Asensio Ramos, A.; Petit, P.

    2015-11-01

    Aims: We develop a fully Bayesian least squares deconvolution (LSD) that can be applied to the reliable detection of magnetic signals in noise-limited stellar spectropolarimetric observations using multiline techniques. Methods: We consider LSD under the Bayesian framework and we introduce a flexible Gaussian process (GP) prior for the LSD profile. This prior allows the result to automatically adapt to the presence of signal. We exploit several linear algebra identities to accelerate the calculations. The final algorithm can deal with thousands of spectral lines in a few seconds. Results: We demonstrate the reliability of the method with synthetic experiments and we apply it to real spectropolarimetric observations of magnetic stars. We are able to recover the magnetic signals using a small number of spectral lines, together with the uncertainty at each velocity bin. This allows the user to consider if the detected signal is reliable. The code to compute the Bayesian LSD profile is freely available.
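
    A toy version of the linear-algebra core (our sketch under simplifying assumptions: a known line-pattern matrix W, known noise level, squared-exponential GP kernel; all variable names invented) treats the spectrum as y = W z + noise and returns the GP-regularised posterior mean and per-bin uncertainty of the LSD profile z:

        import numpy as np

        rng = np.random.default_rng(2)
        nprof, nspec, nlines = 40, 1000, 50
        v = np.linspace(-1, 1, nprof)
        # GP prior covariance for the common profile z.
        K = np.exp(-0.5 * (v[:, None] - v[None, :]) ** 2 / 0.15 ** 2) + 1e-8 * np.eye(nprof)

        # Random line positions/weights define the "line pattern" (convolution) matrix.
        W = np.zeros((nspec, nprof))
        for pos, wgt in zip(rng.integers(0, nspec - nprof, nlines),
                            rng.uniform(0.2, 1.0, nlines)):
            W[pos:pos + nprof, :] += wgt * np.eye(nprof)

        z_true = np.exp(-0.5 * (v / 0.2) ** 2) * np.sin(8 * v)   # hidden common profile
        sigma = 0.5
        y = W @ z_true + rng.normal(scale=sigma, size=nspec)

        A = W.T @ W / sigma ** 2 + np.linalg.inv(K)      # posterior precision
        z_mean = np.linalg.solve(A, W.T @ y / sigma ** 2)  # Bayesian LSD profile
        z_err = np.sqrt(np.diag(np.linalg.inv(A)))       # per-velocity-bin uncertainty
        print("max |error| vs truth:", np.abs(z_mean - z_true).max())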

  19. Bayesian least squares deconvolution

    CERN Document Server

    Ramos, A Asensio

    2015-01-01

    Aims. To develop a fully Bayesian least squares deconvolution (LSD) that can be applied to the reliable detection of magnetic signals in noise-limited stellar spectropolarimetric observations using multiline techniques. Methods. We consider LSD under the Bayesian framework and we introduce a flexible Gaussian Process (GP) prior for the LSD profile. This prior allows the result to automatically adapt to the presence of signal. We exploit several linear algebra identities to accelerate the calculations. The final algorithm can deal with thousands of spectral lines in a few seconds. Results. We demonstrate the reliability of the method with synthetic experiments and we apply it to real spectropolarimetric observations of magnetic stars. We are able to recover the magnetic signals using a small number of spectral lines, together with the uncertainty at each velocity bin. This allows the user to consider if the detected signal is reliable. The code to compute the Bayesian LSD profile is freely available.

  20. Bayesian Adaptive Exploration

    CERN Document Server

    Loredo, T J

    2004-01-01

    I describe a framework for adaptive scientific exploration based on iterating an Observation-Inference-Design cycle that allows adjustment of hypotheses and observing protocols in response to the results of observation on-the-fly, as data are gathered. The framework uses a unified Bayesian methodology for the inference and design stages: Bayesian inference to quantify what we have learned from the available data and predict future data, and Bayesian decision theory to identify which new observations would teach us the most. When the goal of the experiment is simply to make inferences, the framework identifies a computationally efficient iterative "maximum entropy sampling" strategy as the optimal strategy in settings where the noise statistics are independent of signal properties. Results of applying the method to two "toy" problems with simulated data (measuring the orbit of an extrasolar planet, and locating a hidden one-dimensional object) show the approach can significantly improve observational eff...

  1. Bayesian and frequentist inequality tests

    OpenAIRE

    David M. Kaplan; Zhuo, Longhao

    2016-01-01

    Bayesian and frequentist criteria are fundamentally different, but often posterior and sampling distributions are asymptotically equivalent (and normal). We compare Bayesian and frequentist hypothesis tests of inequality restrictions in such cases. For finite-dimensional parameters, if the null hypothesis is that the parameter vector lies in a certain half-space, then the Bayesian test has (frequentist) size α; if the null hypothesis is any other convex subspace, then the Bayesian test...

  2. Bayesian multiple target tracking

    CERN Document Server

    Streit, Roy L

    2013-01-01

    This second edition has undergone substantial revision from the 1999 first edition, recognizing that a lot has changed in the multiple target tracking field. One of the most dramatic changes is in the widespread use of particle filters to implement nonlinear, non-Gaussian Bayesian trackers. This book views multiple target tracking as a Bayesian inference problem. Within this framework it develops the theory of single target tracking, multiple target tracking, and likelihood ratio detection and tracking. In addition to providing a detailed description of a basic particle filter that implements

  3. Bayesian Exploratory Factor Analysis

    DEFF Research Database (Denmark)

    Conti, Gabriella; Frühwirth-Schnatter, Sylvia; Heckman, James J.;

    2014-01-01

    This paper develops and applies a Bayesian approach to Exploratory Factor Analysis that improves on ad hoc classical approaches. Our framework relies on dedicated factor models and simultaneously determines the number of factors, the allocation of each measurement to a unique factor, and the corresponding factor loadings. Classical identification criteria are applied and integrated into our Bayesian procedure to generate models that are stable and clearly interpretable. A Monte Carlo study confirms the validity of the approach. The method is used to produce interpretable low dimensional aggregates...

  4. Bayesian Geostatistical Design

    DEFF Research Database (Denmark)

    Diggle, Peter; Lophaven, Søren Nymand

    2006-01-01

    locations to, or deletion of locations from, an existing design, and prospective design, which consists of choosing positions for a new set of sampling locations. We propose a Bayesian design criterion which focuses on the goal of efficient spatial prediction whilst allowing for the fact that model...

  5. Bayesian Filters in Practice

    Czech Academy of Sciences Publication Activity Database

    Krejsa, Jiří; Věchet, S.

    Bratislava: Slovak University of Technology in Bratislava, 2010, s. 217-222. ISBN 978-80-227-3353-3. [Robotics in Education. Bratislava (SK), 16.09.2010-17.09.2010] Institutional research plan: CEZ:AV0Z20760514 Keywords : mobile robot localization * bearing only beacons * Bayesian filters Subject RIV: JD - Computer Applications, Robotics

  6. Subjective Bayesian Beliefs

    DEFF Research Database (Denmark)

    Antoniou, Constantinos; Harrison, Glenn W.; Lau, Morten I.;

    2015-01-01

    A large literature suggests that many individuals do not apply Bayes’ Rule when making decisions that depend on them correctly pooling prior information and sample data. We replicate and extend a classic experimental study of Bayesian updating from psychology, employing the methods of experimenta...

  7. Bayesian Independent Component Analysis

    DEFF Research Database (Denmark)

    Winther, Ole; Petersen, Kaare Brandt

    2007-01-01

    In this paper we present an empirical Bayesian framework for independent component analysis. The framework provides estimates of the sources, the mixing matrix and the noise parameters, and is flexible with respect to choice of source prior and the number of sources and sensors. Inside the engine...

  8. Noncausal Bayesian Vector Autoregression

    DEFF Research Database (Denmark)

    Lanne, Markku; Luoto, Jani

    We propose a Bayesian inferential procedure for the noncausal vector autoregressive (VAR) model that is capable of capturing nonlinearities and incorporating effects of missing variables. In particular, we devise a fast and reliable posterior simulator that yields the predictive distribution as a...

  9. Bayesian Adaptive Exploration

    Science.gov (United States)

    Loredo, Thomas J.

    2004-04-01

    I describe a framework for adaptive scientific exploration based on iterating an Observation-Inference-Design cycle that allows adjustment of hypotheses and observing protocols in response to the results of observation on-the-fly, as data are gathered. The framework uses a unified Bayesian methodology for the inference and design stages: Bayesian inference to quantify what we have learned from the available data and predict future data, and Bayesian decision theory to identify which new observations would teach us the most. When the goal of the experiment is simply to make inferences, the framework identifies a computationally efficient iterative "maximum entropy sampling" strategy as the optimal strategy in settings where the noise statistics are independent of signal properties. Results of applying the method to two "toy" problems with simulated data (measuring the orbit of an extrasolar planet, and locating a hidden one-dimensional object) show the approach can significantly improve observational efficiency in settings that have well-defined nonlinear models. I conclude with a list of open issues that must be addressed to make Bayesian adaptive exploration a practical and reliable tool for optimizing scientific exploration.

  10. Bayesian logistic regression analysis

    NARCIS (Netherlands)

    Van Erp, H.R.N.; Van Gelder, P.H.A.J.M.

    2012-01-01

    In this paper we present a Bayesian logistic regression analysis. It is found that if one wishes to derive the posterior distribution of the probability of some event, then, together with the traditional Bayes Theorem and the integrating out of nuisance parameters, the Jacobian transformation is an

  11. Sparse Bayesian learning in ISAR tomography imaging

    Institute of Scientific and Technical Information of China (English)

    SU Wu-ge; WANG Hong-qiang; DENG Bin; WANG Rui-jun; QIN Yu-liang

    2015-01-01

    Inverse synthetic aperture radar (ISAR) imaging can be regarded as a narrow-band version of computer aided tomography (CT). The traditional CT imaging algorithms for ISAR, including the polar format algorithm (PFA) and the convolution back projection algorithm (CBP), usually suffer from high sidelobes and low resolution. Here, ISAR tomography image reconstruction is considered within a sparse Bayesian framework. Firstly, the sparse ISAR tomography imaging model is established in light of CT imaging theory. Then, by using the compressed sensing (CS) principle, a high resolution ISAR image can be achieved with a limited number of pulses. Since the performance of existing CS-based ISAR imaging algorithms is sensitive to a user-set parameter, these algorithms are inconvenient to use in practice. It is well known that the Bayesian formalism of the recovery algorithm named sparse Bayesian learning (SBL) acts as an effective tool in regression and classification: it uses an efficient expectation maximization procedure to estimate the necessary parameters, and retains the preferable properties of the l0-norm diversity measure. Motivated by this, a fully automated ISAR tomography imaging algorithm based on SBL is proposed. Experimental results based on simulated and electromagnetic (EM) data illustrate the effectiveness and the superiority of the proposed algorithm over existing algorithms.
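
    For readers unfamiliar with SBL, a minimal Tipping-style evidence-maximisation loop (a generic sketch, not the authors' ISAR code; the dictionary, sizes and thresholds are invented) shows the mechanism by which most prior precisions alpha_i diverge and prune their coefficients, leaving a sparse solution:

        import numpy as np

        rng = np.random.default_rng(3)
        n_obs, n_dict, k = 60, 128, 5
        Phi = rng.normal(size=(n_obs, n_dict)) / np.sqrt(n_obs)   # random dictionary
        x_true = np.zeros(n_dict)
        x_true[rng.choice(n_dict, k, replace=False)] = rng.normal(scale=3, size=k)
        sigma2 = 0.01
        y = Phi @ x_true + rng.normal(scale=np.sqrt(sigma2), size=n_obs)

        alpha = np.ones(n_dict)             # prior precisions, x_i ~ N(0, 1/alpha_i)
        for _ in range(200):
            Sigma = np.linalg.inv(np.diag(alpha) + Phi.T @ Phi / sigma2)  # posterior cov
            mu = Sigma @ Phi.T @ y / sigma2                               # posterior mean
            gamma = 1 - alpha * np.diag(Sigma)    # "well-determinedness" factors
            alpha = np.clip(gamma / (mu ** 2 + 1e-12), 1e-6, 1e12)        # fixed point
        support = np.where(alpha < 1e4)[0]
        print("recovered support:", support, " true:", np.sort(np.where(x_true)[0]))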

  12. A Reconstruction of Structure of the Atom and Its Implications for General Physics Textbooks: A History and Philosophy of Science Perspective

    Science.gov (United States)

    Rodriguez, Maria A.; Niaz, Mansoor

    2004-01-01

    Recent research in science education has recognized the importance of history and philosophy of science. The objective of this study is to evaluate the presentation of the Thomson, Rutherford, and Bohr models of the atom in general physics textbooks based on criteria derived from history and philosophy of science. Forty-one general physics…

  13. Bayesian analysis of cosmic structures

    CERN Document Server

    Kitaura, Francisco-Shu

    2011-01-01

    We review the Bayesian inference steps required to analyse the cosmological large-scale structure. Here we place special emphasis on the complications which arise due to the non-Gaussian character of the galaxy and matter distribution. In particular we investigate the advantages and limitations of the Poisson-lognormal model and discuss how to extend this work. With the lognormal prior, using the Hamiltonian sampling technique and on scales of about 4 h^{-1} Mpc, we find that the over-dense regions are excellently reconstructed; however, under-dense regions (void statistics) are quantitatively poorly recovered. Contrary to the maximum a posteriori (MAP) solution, which was shown to over-estimate the density in the under-dense regions, we obtain lower densities than in N-body simulations. This is due to the fact that the MAP solution is conservative whereas the full posterior yields samples which are consistent with the prior statistics. The lognormal prior is not able to capture the full non-linear regime at scales ...

  14. Uncertainty estimation in reconstructed deformable models

    Energy Technology Data Exchange (ETDEWEB)

    Hanson, K.M.; Cunningham, G.S.; McKee, R.

    1996-12-31

    One of the hallmarks of the Bayesian approach to modeling is the posterior probability, which summarizes all uncertainties regarding the analysis. Using a Markov Chain Monte Carlo (MCMC) technique, it is possible to generate a sequence of objects that represent random samples drawn from the posterior distribution. We demonstrate this technique for reconstructions of two-dimensional objects from noisy projections taken from two directions. The reconstructed object is modeled in terms of a deformable geometrically-defined boundary with a constant interior density yielding a nonlinear reconstruction problem. We show how an MCMC sequence can be used to estimate uncertainties in the location of the edge of the reconstructed object.
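
    The idea scales down to one dimension. In the hedged sketch below (ours, not the authors' deformable-boundary code; all names and numbers invented), the "object" is a unit step of known height at unknown edge position theta; random-walk Metropolis samples from the posterior give both an edge estimate and its uncertainty:

        import numpy as np

        rng = np.random.default_rng(4)
        x = np.linspace(0, 1, 100)
        theta_true, sigma = 0.62, 0.3
        data = (x < theta_true).astype(float) + rng.normal(scale=sigma, size=x.size)

        def log_post(theta):
            if not 0 < theta < 1:                   # flat prior on (0, 1)
                return -np.inf
            model = (x < theta).astype(float)
            return -0.5 * np.sum((data - model) ** 2) / sigma ** 2

        samples, theta = [], 0.5
        lp = log_post(theta)
        for _ in range(20000):                      # random-walk Metropolis
            prop = theta + 0.05 * rng.normal()
            lp_prop = log_post(prop)
            if np.log(rng.uniform()) < lp_prop - lp:
                theta, lp = prop, lp_prop
            samples.append(theta)
        samples = np.array(samples[5000:])          # drop burn-in
        print(f"edge located at {samples.mean():.3f} +/- {samples.std():.3f}")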

  15. Reconstructing baryon oscillations

    OpenAIRE

    Noh, Yookyung; White, Martin; Padmanabhan, Nikhil

    2009-01-01

    The baryon acoustic oscillation (BAO) method for constraining the expansion history is adversely affected by non-linear structure formation, which washes out the correlation function peak created at decoupling. To increase the constraining power of low z BAO experiments, it has been proposed that one use the observed distribution of galaxies to "reconstruct" the acoustic peak. Recently Padmanabhan, White and Cohn provided an analytic formalism for understanding how reconstruction works withi...

  16. Probability and Bayesian statistics

    CERN Document Server

    1987-01-01

    This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics" which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985 the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are published especially by the relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows growing importance of probability as coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability across psychological aspects of formulating subjective probability statements, abstract measure theoretical considerations, contributions to theoretical statistics an...

  17. Bayesian Magic in Asteroseismology

    Science.gov (United States)

    Kallinger, T.

    2015-09-01

    Only a few years ago asteroseismic observations were so rare that scientists had plenty of time to work on individual data sets. They could tune their algorithms in any possible way to squeeze out the last bit of information. Nowadays this is impossible. With missions like MOST, CoRoT, and Kepler we basically drown in new data every day. To handle this in a sufficient way, statistical methods become more and more important. This is why Bayesian techniques have begun their triumphal march across asteroseismology. I will go with you on a journey through Bayesian Magic Land that brings us to the sea of granulation background, the forest of peakbagging, and the stony alley of model comparison.

  18. Bayesian Nonparametric Graph Clustering

    OpenAIRE

    Banerjee, Sayantan; Akbani, Rehan; Baladandayuthapani, Veerabhadran

    2015-01-01

    We present clustering methods for multivariate data exploiting the underlying geometry of the graphical structure between variables. As opposed to standard approaches that assume known graph structures, we first estimate the edge structure of the unknown graph using Bayesian neighborhood selection approaches, wherein we account for the uncertainty of graphical structure learning through model-averaged estimates of the suitable parameters. Subsequently, we develop a nonparametric graph cluster...

  19. Approximate Bayesian recursive estimation

    Czech Academy of Sciences Publication Activity Database

    Kárný, Miroslav

    2014-01-01

    Roč. 285, č. 1 (2014), s. 100-111. ISSN 0020-0255 R&D Projects: GA ČR GA13-13502S Institutional support: RVO:67985556 Keywords : Approximate parameter estimation * Bayesian recursive estimation * Kullback–Leibler divergence * Forgetting Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 4.038, year: 2014 http://library.utia.cas.cz/separaty/2014/AS/karny-0425539.pdf

  20. Bayesian Benchmark Dose Analysis

    OpenAIRE

    Fang, Qijun; Piegorsch, Walter W.; Barnes, Katherine Y.

    2014-01-01

    An important objective in environmental risk assessment is estimation of minimum exposure levels, called Benchmark Doses (BMDs) that induce a pre-specified Benchmark Response (BMR) in a target population. Established inferential approaches for BMD analysis typically involve one-sided, frequentist confidence limits, leading in practice to what are called Benchmark Dose Lower Limits (BMDLs). Appeal to Bayesian modeling and credible limits for building BMDLs is far less developed, however. Indee...

  1. Bayesian Generalized Rating Curves

    OpenAIRE

    Helgi Sigurðarson

    2014-01-01

    A rating curve is a curve or model that describes the relationship between water elevation, or stage, and discharge at an observation site in a river. The rating curve is fit from paired observations of stage and discharge, and it then predicts discharge given observations of stage. This methodology is applied because stage is substantially easier to observe directly than discharge. In this thesis a statistical rating curve model is proposed, working within the framework of Bayesian...
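
    The classical starting point is the power-law rating curve Q = a (h - c)^b with lognormal errors. A minimal sketch of a Bayesian fit (ours, with invented data and weak flat priors; the thesis' actual model and sampler are more elaborate) via random-walk Metropolis:

        import numpy as np

        rng = np.random.default_rng(5)
        a_t, b_t, c_t = 5.0, 1.8, 0.3                 # "true" curve for synthetic data
        stage = rng.uniform(0.5, 3.0, 40)
        Q_obs = a_t * (stage - c_t) ** b_t * np.exp(rng.normal(scale=0.1, size=40))

        def log_post(p):
            la, b, c = p                              # la = log a
            if not (0 < b < 5 and 0 <= c < stage.min()):   # flat priors on a box
                return -np.inf
            resid = np.log(Q_obs) - (la + b * np.log(stage - c))
            return -0.5 * np.sum(resid ** 2) / 0.1 ** 2    # lognormal error model

        p = np.array([1.0, 1.0, 0.0])
        lp, draws = log_post(p), []
        for _ in range(30000):                        # random-walk Metropolis
            prop = p + rng.normal(scale=[0.02, 0.02, 0.005])
            lpp = log_post(prop)
            if np.log(rng.uniform()) < lpp - lp:
                p, lp = prop, lpp
            draws.append(p.copy())
        draws = np.array(draws[10000:])
        print("posterior means (log a, b, c):", draws.mean(axis=0))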

  2. Heteroscedastic Treed Bayesian Optimisation

    OpenAIRE

    Assael, John-Alexander M.; Wang, Ziyu; Shahriari, Bobak; De Freitas, Nando

    2014-01-01

    Optimising black-box functions is important in many disciplines, such as tuning machine learning models, robotics, finance and mining exploration. Bayesian optimisation is a state-of-the-art technique for the global optimisation of black-box functions which are expensive to evaluate. At the core of this approach is a Gaussian process prior that captures our belief about the distribution over functions. However, in many cases a single Gaussian process is not flexible enough to capture non-stat...

  3. Efficient Bayesian Phase Estimation

    Science.gov (United States)

    Wiebe, Nathan; Granade, Chris

    2016-07-01

    We introduce a new method called rejection filtering that we use to perform adaptive Bayesian phase estimation. Our approach has several advantages: it is classically efficient, easy to implement, achieves Heisenberg limited scaling, resists depolarizing noise, tracks time-dependent eigenstates, recovers from failures, and can be run on a field programmable gate array. It also outperforms existing iterative phase estimation algorithms such as Kitaev's method.
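
    Stripped of the engineering, rejection filtering can be sketched in a few lines (our toy version, not the authors' implementation; the outcome model and the experiment-design heuristic M ~ 1/sigma are simplifications): keep a Gaussian approximation to the phase posterior, accept prior samples with probability equal to the likelihood of the observed bit, and refit the Gaussian.

        import numpy as np

        rng = np.random.default_rng(6)
        phi_true = 1.234
        mu, sigma = 0.0, np.pi            # broad Gaussian prior over the phase

        def likelihood(d, phi, M, theta):  # standard iterative-PE outcome model
            return (1 + (-1) ** d * np.cos(M * (phi - theta))) / 2

        for _ in range(60):
            M, theta = 1.0 / max(sigma, 1e-9), mu          # heuristic experiment design
            d = int(rng.uniform() < likelihood(1, phi_true, M, theta))  # simulated bit
            samples = rng.normal(mu, sigma, 4000)
            accept = rng.uniform(size=4000) < likelihood(d, samples, M, theta)
            keep = samples[accept]
            if keep.size > 10:                             # refit Gaussian posterior
                mu, sigma = keep.mean(), keep.std() + 1e-12
        print(f"estimate {mu:.4f} +/- {sigma:.1e}; truth {phi_true}")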

  4. Bayesian Word Sense Induction

    OpenAIRE

    Brody, Samuel; Lapata, Mirella

    2009-01-01

    Sense induction seeks to automatically identify word senses directly from a corpus. A key assumption underlying previous work is that the context surrounding an ambiguous word is indicative of its meaning. Sense induction is thus typically viewed as an unsupervised clustering problem where the aim is to partition a word’s contexts into different classes, each representing a word sense. Our work places sense induction in a Bayesian context by modeling the contexts of the ambiguous word as samp...

  5. Bayesian Neural Word Embedding

    OpenAIRE

    Barkan, Oren

    2016-01-01

    Recently, several works in the domain of natural language processing presented successful methods for word embedding. Among them, the Skip-gram (SG) with negative sampling, known also as Word2Vec, advanced the state-of-the-art of various linguistics tasks. In this paper, we propose a scalable Bayesian neural word embedding algorithm that can be beneficial to general item similarity tasks as well. The algorithm relies on a Variational Bayes solution for the SG objective and a detailed step by ...

  6. Bayesian Attractor Learning

    Science.gov (United States)

    Wiegerinck, Wim; Schoenaker, Christiaan; Duane, Gregory

    2016-04-01

    Recently, methods for model fusion by dynamically combining model components in an interactive ensemble have been proposed. In these proposals, fusion parameters have to be learned from data. One can view these systems as parametrized dynamical systems. We address the question of learnability of dynamical systems with respect to both short term (vector field) and long term (attractor) behavior. In particular we are interested in learning in the imperfect model class setting, in which the ground truth has a higher complexity than the models, e.g. due to unresolved scales. We take a Bayesian point of view and we define a joint log-likelihood that consists of two terms: one is the vector field error and the other is the attractor error, for which we take the L1 distance between the stationary distributions of the model and the assumed ground truth. In the context of linear models (such as so-called weighted supermodels), and assuming a Gaussian error model in the vector fields, vector field learning leads to a tractable Gaussian solution. This solution can then be used as a prior for the next step, Bayesian attractor learning, in which the attractor error is used as a log-likelihood term. Bayesian attractor learning is implemented by elliptical slice sampling, a sampling method for systems with a Gaussian prior and a non-Gaussian likelihood. Simulations with a partially observed driven Lorenz 63 system illustrate the approach.
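
    Elliptical slice sampling itself is compact enough to quote in full (a generic implementation after Murray, Adams & MacKay 2010, with an invented toy likelihood; in the application above the Gaussian prior would be the vector-field solution and log_L the attractor-error term):

        import numpy as np

        def ess_step(f, chol_sigma, log_L, rng):
            # One elliptical slice sampling transition from state f (prior mean zero).
            nu = chol_sigma @ rng.normal(size=f.size)   # prior draw defines the ellipse
            log_y = log_L(f) + np.log(rng.uniform())    # slice level
            theta = rng.uniform(0, 2 * np.pi)
            lo, hi = theta - 2 * np.pi, theta
            while True:
                f_prop = f * np.cos(theta) + nu * np.sin(theta)
                if log_L(f_prop) > log_y:
                    return f_prop
                if theta < 0:
                    lo = theta
                else:
                    hi = theta
                theta = rng.uniform(lo, hi)             # shrink the bracket

        # Tiny demo: Gaussian prior N(0, I), deliberately non-Gaussian likelihood.
        rng = np.random.default_rng(7)
        log_L = lambda f: -0.5 * np.sum((np.abs(f) - 1.0) ** 2) / 0.1
        f, draws = np.zeros(2), []
        for _ in range(5000):
            f = ess_step(f, np.eye(2), log_L, rng)
            draws.append(f.copy())
        print("mean |f| over draws:", np.mean(np.abs(draws)))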

  7. Bayesian theory and applications

    CERN Document Server

    Dellaportas, Petros; Polson, Nicholas G; Stephens, David A

    2013-01-01

    The development of hierarchical models and Markov chain Monte Carlo (MCMC) techniques forms one of the most profound advances in Bayesian analysis since the 1970s and provides the basis for advances in virtually all areas of applied and theoretical Bayesian statistics. This volume guides the reader along a statistical journey that begins with the basic structure of Bayesian theory, and then provides details on most of the past and present advances in this field. The book has a unique format. There is an explanatory chapter devoted to each conceptual advance followed by journal-style chapters that provide applications or further advances on the concept. Thus, the volume is both a textbook and a compendium of papers covering a vast range of topics. It is appropriate for a well-informed novice interested in understanding the basic approach, methods and recent applications. Because of its advanced chapters and recent work, it is also appropriate for a more mature reader interested in recent applications and devel...

  8. Delayed breast implant reconstruction

    DEFF Research Database (Denmark)

    Hvilsom, Gitte B.; Hölmich, Lisbet R.; Steding-Jessen, Marianne;

    2011-01-01

    Studies of complications following reconstructive surgery with implants among women with breast cancer are needed. As the first prospective long-term study, to our knowledge, we evaluated the occurrence of complications following delayed breast reconstruction separately for one- and two-stage procedures. From the Danish Registry for Plastic Surgery of the Breast, which has prospectively registered data for women undergoing breast implantations since 1999, we identified 559 women without a history of radiation therapy undergoing 592 delayed breast reconstructions following breast cancer during the...

  9. Three-dimensional simulation and inversion of borehole temperatures for reconstructing past climate in complex settings

    Science.gov (United States)

    Hopcroft, Peter O.; Gallagher, Kerry; Pain, Christopher C.; Fang, Fangxin

    2009-06-01

    The majority of inversion methods used for inferring past ground surface temperatures (GST) from borehole temperature-depth profiles rely on the assumption that heat flow is in the vertical direction only. This means that accounting for certain effects caused by the local terrain of a borehole is not possible and consequently, many borehole profiles cannot be used with confidence. Here, we describe a methodology to avoid this problem by solving the heat conduction forward problem in 3-D using finite elements (FE). In order to make the inversion approach computationally tractable, we reduce the dimensions of this FE model using proper orthogonal decomposition. The inverse problem is cast in a probabilistic Bayesian framework for which the posterior probability distribution of the past GSTs is sampled using a reversible jump Markov chain Monte Carlo algorithm. This allows the resolution of the GST history over time to be explored by varying the parameterization of the GST model. Synthetic examples calculated with moderate topographies demonstrate the efficacy of the Bayesian 3-D inversion method, and the results are compared with those using a 1-D approach. For moderate topography, the latter can lead to spurious GST reconstructions. A further synthetic example demonstrates that the effect of incorrectly assuming lateral geological homogeneity is negligible. The inversion method is also compared with a more standard inversion method. A significant advantage of the Bayesian approach is that uncertainties in all of the model parameters can be accounted for, leading to a more realistic interpretation of the range of GST histories supported by the data. The methods presented here should allow a broader range of geothermal data to be used for paleoclimate reconstruction purposes in the future.
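
    A one-dimensional analogue makes the inverse problem concrete (our sketch with invented numbers; the paper's point is precisely that 3-D terrain effects break this 1-D assumption). A surface warming of amplitude dT beginning t years ago perturbs a linear steady-state profile by dT * erfc(z / (2 sqrt(kappa t))), and Metropolis sampling of (dT, t) yields a posterior over the GST history:

        import numpy as np
        from scipy.special import erfc

        kappa = 1e-6 * 3.15e7            # thermal diffusivity, ~31.5 m^2/yr
        z = np.linspace(20, 300, 30)     # sensor depths, m

        def profile(dT, t_yr):           # steady state + step-warming perturbation
            return 8.0 + 0.02 * z + dT * erfc(z / (2 * np.sqrt(kappa * t_yr)))

        rng = np.random.default_rng(8)
        obs = profile(1.0, 100.0) + rng.normal(scale=0.02, size=z.size)

        def log_post(dT, t_yr):
            if not (0 < t_yr < 1000 and -5 < dT < 5):   # flat priors on a box
                return -np.inf
            return -0.5 * np.sum((obs - profile(dT, t_yr)) ** 2) / 0.02 ** 2

        p = np.array([0.5, 300.0])
        lp, draws = log_post(*p), []
        for _ in range(40000):                          # random-walk Metropolis
            prop = p + rng.normal(scale=[0.05, 10.0])
            lpp = log_post(*prop)
            if np.log(rng.uniform()) < lpp - lp:
                p, lp = prop, lpp
            draws.append(p.copy())
        draws = np.array(draws[10000:])
        print("posterior mean (dT, onset yr BP):", draws.mean(axis=0))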

  10. Computational Imaging for VLBI Image Reconstruction

    OpenAIRE

    Bouman, Katherine L.; Johnson, Michael D; Zoran, Daniel; Fish, Vincent L.; Doeleman, Sheperd S.; Freeman, William T.

    2016-01-01

    Very long baseline interferometry (VLBI) is a technique for imaging celestial radio emissions by simultaneously observing a source from telescopes distributed across Earth. The challenges in reconstructing images from fine angular resolution VLBI data are immense. The data is extremely sparse and noisy, thus requiring statistical image models such as those designed in the computer vision community. In this paper we present a novel Bayesian approach for VLBI image reconstruction. While other m...

  11. Computational Imaging for VLBI Image Reconstruction

    OpenAIRE

    Bouman, Katherine L.; Johnson, Michael D; Zoran, Daniel; Fish, Vincent L.; Doeleman, Sheperd Samuel; Freeman, William T.

    2016-01-01

    Very long baseline interferometry (VLBI) is a technique for imaging celestial radio emissions by simultaneously observing a source from telescopes distributed across Earth. The challenges in reconstructing images from fine angular resolution VLBI data are immense. The data is extremely sparse and noisy, thus requiring statistical image models such as those designed in the computer vision community. In this paper we present a novel Bayesian approach for VLBI image reconstruction. While other ...

  12. Unbounded Bayesian Optimization via Regularization

    OpenAIRE

    Shahriari, Bobak; Bouchard-Côté, Alexandre; De Freitas, Nando

    2015-01-01

    Bayesian optimization has recently emerged as a popular and efficient tool for global optimization and hyperparameter tuning. Currently, the established Bayesian optimization practice requires a user-defined bounding box which is assumed to contain the optimizer. However, when little is known about the probed objective function, it can be difficult to prescribe such bounds. In this work we modify the standard Bayesian optimization framework in a principled way to allow automatic resizing of t...

  13. Bayesian optimization for materials design

    OpenAIRE

    Frazier, Peter I.; Wang, Jialei

    2015-01-01

    We introduce Bayesian optimization, a technique developed for optimizing time-consuming engineering simulations and for fitting machine learning models on large datasets. Bayesian optimization guides the choice of experiments during materials design and discovery to find good material designs in as few experiments as possible. We focus on the case when materials designs are parameterized by a low-dimensional vector. Bayesian optimization is built on a statistical technique called Gaussian pro...
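
    A minimal Bayesian optimisation loop (a generic sketch, not the authors' code: squared-exponential GP surrogate, expected-improvement acquisition, and an invented one-dimensional "material property" objective) looks like this:

        import numpy as np
        from scipy.stats import norm

        def kern(a, b, ell=0.2):                     # squared-exponential kernel
            return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell ** 2)

        def objective(x):                            # stand-in for a costly experiment
            return np.sin(6 * x) + 0.5 * x

        rng = np.random.default_rng(9)
        X = list(rng.uniform(0, 1, 3))
        y = [objective(x) for x in X]
        grid = np.linspace(0, 1, 200)
        for _ in range(15):
            Xa, ya = np.array(X), np.array(y)
            K = kern(Xa, Xa) + 1e-6 * np.eye(len(Xa))
            Ks = kern(grid, Xa)
            mu = Ks @ np.linalg.solve(K, ya)                       # GP posterior mean
            var = 1.0 - np.einsum('ij,ji->i', Ks, np.linalg.solve(K, Ks.T))
            sd = np.sqrt(np.clip(var, 1e-12, None))                # GP posterior std
            best = ya.max()
            zz = (mu - best) / sd
            ei = (mu - best) * norm.cdf(zz) + sd * norm.pdf(zz)    # expected improvement
            x_next = grid[np.argmax(ei)]                           # next experiment
            X.append(x_next)
            y.append(objective(x_next))
        print("best x, f(x):", X[int(np.argmax(y))], max(y))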

  14. From the history of the recognitions of the remains to the reconstruction of the face of Dante Alighieri by means of techniques of virtual reality and forensic anthropology

    Directory of Open Access Journals (Sweden)

    Stefano Benazzi

    2007-07-01

The work consists of the reconstruction of the face of the great poet Dante Alighieri through a multidisciplinary approach that combines the traditional manual techniques of forensic anthropology with digital methodologies originally developed in manufacturing and military fields but increasingly applied to the cultural heritage. Since the original skull of Dante could not be obtained, the work started from the data and elements collected by Fabio Frassetto and Giuseppe Sergi, two important anthropologists at the Universities of Bologna and Rome respectively, in an investigation carried out in 1921, the sixth centenary of the poet's death, on his remains in Ravenna. Thanks to this, we have a very accurate description of Dante's bones, including 297 metric data points covering the whole skeleton, scale photographs of the skull in the various norms and of many other bones, as well as a model of the skull subsequently realized by Frassetto. Based on this information, a geometric reconstruction of Dante Alighieri's skull, including the jaw, was carried out by integrating virtual reality tools and technologies, and a physical model was then produced from it by rapid prototyping. A particularly important aspect of the work concerns the 3D modelling methodology proposed for the new reconstruction of the jaw (not found during the 1921 recognition), starting from a reference model. The prototyped skull model then serves as the basis for the subsequent stage of facial reconstruction using the traditional techniques of forensic art.

  15. Old Czechs were hefty heroes: the construction and reconstruction of Czech national history in its relationship to the "great" medieval past

    Czech Academy of Sciences Publication Activity Database

    Šmahel, František

    Basingstoke, Hampshire: Palgrave Macmillan, 2011 - (Evans, R.; Marchal, G.), s. 245-258 ISBN 978-0-230-57602-5 Institutional research plan: CEZ:AV0Z90090514 Keywords : Historiography * hussitism * John Žižka * counterfeit manuscripts * Emperor Charles IV Subject RIV: AB - History

  16. Bayesian nonparametric data analysis

    CERN Document Server

    Müller, Peter; Jara, Alejandro; Hanson, Tim

    2015-01-01

    This book reviews nonparametric Bayesian methods and models that have proven useful in the context of data analysis. Rather than providing an encyclopedic review of probability models, the book’s structure follows a data analysis perspective. As such, the chapters are organized by traditional data analysis problems. In selecting specific nonparametric models, simpler and more traditional models are favored over specialized ones. The discussed methods are illustrated with a wealth of examples, including applications ranging from stylized examples to case studies from recent literature. The book also includes an extensive discussion of computational methods and details on their implementation. R code for many examples is included in on-line software pages.

  17. Decentralized Distributed Bayesian Estimation

    Czech Academy of Sciences Publication Activity Database

    Dedecius, Kamil; Sečkárová, Vladimíra

    Praha: ÚTIA AVČR, v.v.i, 2011 - (Janžura, M.; Ivánek, J.). s. 16-16 [7th International Workshop on Data–Algorithms–Decision Making. 27.11.2011-29.11.2011, Mariánská] R&D Projects: GA ČR 102/08/0567; GA ČR GA102/08/0567 Institutional research plan: CEZ:AV0Z10750506 Keywords : estimation * distributed estimation * model Subject RIV: BB - Applied Statistics, Operational Research http://library.utia.cas.cz/separaty/2011/AS/dedecius-decentralized distributed bayesian estimation.pdf

  18. Applied Bayesian modelling

    CERN Document Server

    Congdon, Peter

    2014-01-01

This book provides an accessible approach to Bayesian computing and data analysis, with an emphasis on the interpretation of real data sets. Following in the tradition of the successful first edition, this book aims to make a wide range of statistical modeling applications accessible using tested code that can be readily adapted to the reader's own applications. The second edition has been thoroughly reworked and updated to take account of advances in the field. A new set of worked examples is included. The novel aspect of the first edition was the coverage of statistical modeling using WinBUGS…

  19. Computationally efficient Bayesian tracking

    Science.gov (United States)

    Aughenbaugh, Jason; La Cour, Brian

    2012-06-01

    In this paper, we describe the progress we have achieved in developing a computationally efficient, grid-based Bayesian fusion tracking system. In our approach, the probability surface is represented by a collection of multidimensional polynomials, each computed adaptively on a grid of cells representing state space. Time evolution is performed using a hybrid particle/grid approach and knowledge of the grid structure, while sensor updates use a measurement-based sampling method with a Delaunay triangulation. We present an application of this system to the problem of tracking a submarine target using a field of active and passive sonar buoys.
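
    A grid-based Bayesian tracker can be sketched in a few lines, though the paper's adaptive polynomial grids and Delaunay-based measurement updates are far richer. The following 1-D predict/update loop over a discretized belief vector is a hypothetical toy, not the system described above.

```python
import numpy as np

# Minimal grid-based Bayes filter: the posterior over target position is a
# discretized probability vector; prediction convolves with a motion kernel,
# the update multiplies by a measurement likelihood. All values are invented.
grid = np.linspace(0, 100, 501)                  # state-space grid (position)
belief = np.full(grid.size, 1.0 / grid.size)     # uniform prior

motion_kernel = np.exp(-0.5 * (np.arange(-25, 26) * 0.2 / 2.0) ** 2)
motion_kernel /= motion_kernel.sum()             # Gaussian process noise, sd = 2

rng = np.random.default_rng(2)
true_pos = 30.0
for t in range(20):
    true_pos += 1.0 + rng.normal(0, 0.5)         # target drifts to the right
    # Predict: shift the belief by the nominal motion (5 cells = 1.0), then blur.
    belief = np.convolve(np.roll(belief, 5), motion_kernel, mode='same')
    # Update: noisy position measurement with sd = 3 (a crude sonar stand-in).
    z = true_pos + rng.normal(0, 3.0)
    belief *= np.exp(-0.5 * ((grid - z) / 3.0) ** 2)
    belief /= belief.sum()                       # renormalize the posterior

print(true_pos, grid[np.argmax(belief)])         # truth vs MAP estimate
```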

  20. Improved iterative Bayesian unfolding

    CERN Document Server

    D'Agostini, G

    2010-01-01

    This paper reviews the basic ideas behind a Bayesian unfolding published some years ago and improves their implementation. In particular, uncertainties are now treated at all levels by probability density functions and their propagation is performed by Monte Carlo integration. Thus, small numbers are better handled and the final uncertainty does not rely on the assumption of normality. Theoretical and practical issues concerning the iterative use of the algorithm are also discussed. The new program, implemented in the R language, is freely available, together with sample scripts to play with toy models.
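
    The iterative procedure the abstract refers to is straightforward to sketch. Below is a minimal, assumption-laden Python version of D'Agostini-style iterative Bayesian unfolding (point estimates only; the paper's improvement, full uncertainty propagation by Monte Carlo, is omitted).

```python
import numpy as np

# Iterative Bayesian unfolding sketch. R[j, i] is the response matrix
# P(observed bin j | true bin i); eff[i] is the efficiency of true bin i.
def unfold(R, observed, n_iter=4):
    n_true = R.shape[1]
    eff = R.sum(axis=0)                           # P(observed at all | true i)
    prior = np.full(n_true, 1.0 / n_true)         # flat starting prior
    for _ in range(n_iter):
        joint = R * prior[None, :]                # Bayes' theorem:
        post = joint / joint.sum(axis=1, keepdims=True)   # P(true i | obs j)
        est = (post * observed[:, None]).sum(axis=0) / eff
        prior = est / est.sum()                   # new prior for next iteration
    return est

# Toy check: smear a known truth, then unfold the observed counts.
R = np.array([[0.80, 0.15, 0.00],
              [0.15, 0.70, 0.15],
              [0.00, 0.10, 0.80]])                # toy response matrix
truth = np.array([100.0, 300.0, 50.0])
observed = R @ truth
print(unfold(R, observed))                        # approaches the true spectrum
```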

  1. Bayesian Inference on Gravitational Waves

    Directory of Open Access Journals (Sweden)

    Asad Ali

    2015-12-01

The Bayesian approach is increasingly becoming popular among the astrophysics data analysis communities. However, the Pakistan statistics communities are unaware of this fertile interaction between the two disciplines. Bayesian methods have been in use to address astronomical problems since the very birth of Bayes probability in the eighteenth century. Today the Bayesian methods for the detection and parameter estimation of gravitational waves have solid theoretical grounds with a strong promise for realistic applications. This article aims to introduce the Pakistan statistics communities to the applications of Bayesian Monte Carlo methods in the analysis of gravitational wave data, with an overview of the Bayesian signal detection and estimation methods and a demonstration by a couple of simplified examples.
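
    In the same spirit as the article's simplified demonstrations, here is a toy Bayesian parameter estimation with a random-walk Metropolis sampler in Python: recovering the frequency of a sinusoid buried in Gaussian noise. The signal model and all tuning values are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)
t = np.linspace(0, 1, 400)
f_true, sigma = 7.3, 1.0
data = np.sin(2 * np.pi * f_true * t) + rng.normal(0, sigma, t.size)

def log_post(f):                                  # flat prior on (0, 20)
    if not 0.0 < f < 20.0:
        return -np.inf
    resid = data - np.sin(2 * np.pi * f * t)
    return -0.5 * np.sum(resid ** 2) / sigma ** 2

# Initialize near the peak of a coarse grid scan (a crude "template search").
fgrid = np.linspace(0.1, 19.9, 400)
f = fgrid[np.argmax([log_post(x) for x in fgrid])]
lp = log_post(f)

samples = []
for i in range(20000):
    f_prop = f + rng.normal(0, 0.05)              # random-walk proposal
    lp_prop = log_post(f_prop)
    if np.log(rng.uniform()) < lp_prop - lp:      # Metropolis accept rule
        f, lp = f_prop, lp_prop
    samples.append(f)

post = np.array(samples[5000:])                   # drop burn-in
print(post.mean(), post.std())                    # posterior for the frequency
```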

  2. Marine Environmental History

    DEFF Research Database (Denmark)

    Poulsen, Bo

    2012-01-01

This essay provides an overview of recent trends in the historiography of marine environmental history, a sub-field of environmental history which has grown tremendously in scope and size over the last c. 15 years. The object of marine environmental history is the changing relationship between human society and natural marine resources. Within this broad topic, several trends and objectives are discernable. The essay argues that the so-called material marine environmental history has its main focus on trying to reconstruct the presence, development and environmental impact of past fisheries and whaling operations. This ambition often entails a reconstruction also of how marine life has changed over time. The time frame ranges from the Paleolithic to the present era. The field of marine environmental history also includes a more culturally oriented environmental history, which mainly has come…

  3. Adaptive Dynamic Bayesian Networks

    Energy Technology Data Exchange (ETDEWEB)

    Ng, B M

    2007-10-26

    A discrete-time Markov process can be compactly modeled as a dynamic Bayesian network (DBN)--a graphical model with nodes representing random variables and directed edges indicating causality between variables. Each node has a probability distribution, conditional on the variables represented by the parent nodes. A DBN's graphical structure encodes fixed conditional dependencies between variables. But in real-world systems, conditional dependencies between variables may be unknown a priori or may vary over time. Model errors can result if the DBN fails to capture all possible interactions between variables. Thus, we explore the representational framework of adaptive DBNs, whose structure and parameters can change from one time step to the next: a distribution's parameters and its set of conditional variables are dynamic. This work builds on recent work in nonparametric Bayesian modeling, such as hierarchical Dirichlet processes, infinite-state hidden Markov networks and structured priors for Bayes net learning. In this paper, we will explain the motivation for our interest in adaptive DBNs, show how popular nonparametric methods are combined to formulate the foundations for adaptive DBNs, and present preliminary results.

  4. Bayesian analysis toolkit - BAT

    International Nuclear Information System (INIS)

Statistical treatment of data is an essential part of any data analysis and interpretation. Different statistical methods and approaches can be used; however, the implementation of these approaches is complicated and at times inefficient. The Bayesian analysis toolkit (BAT) is a software package developed in C++ that facilitates the statistical analysis of data using Bayes' theorem. The tool evaluates the posterior probability distributions for models and their parameters using Markov chain Monte Carlo, which in turn provides straightforward parameter estimation, limit setting and uncertainty propagation. Additional algorithms, such as simulated annealing, allow extraction of the global mode of the posterior. BAT provides a well-tested environment for flexible model definition and also includes a set of predefined models for standard statistical problems. The package is interfaced to other software packages commonly used in high energy physics, such as ROOT, Minuit, RooStats and CUBA. We present a general overview of BAT and its algorithms. A few physics examples are shown to introduce the spectrum of its applications. In addition, new developments and features are summarized.

  5. Bayesian inference tools for inverse problems

    Science.gov (United States)

    Mohammad-Djafari, Ali

    2013-08-01

In this paper, the basics of Bayesian inference with a parametric model of the data are presented first. Then, the extensions needed when dealing with inverse problems are given, in particular for linear models such as deconvolution or image reconstruction in computed tomography (CT). The main point discussed then is the prior modeling of signals and images. A classification of these priors is presented: first into separable and Markovian models, and then into simple models or hierarchical models with hidden variables. For practical applications, we also need to consider the estimation of the hyperparameters. Finally, we see that we have to infer simultaneously the unknowns, the hidden variables and the hyperparameters. Very often, the expression of this joint posterior law is too complex to be handled directly; indeed, we can rarely obtain analytical solutions for point estimators such as the maximum a posteriori (MAP) or posterior mean (PM). Three main tools can then be used: Laplace approximation (LAP), Markov chain Monte Carlo (MCMC) and Bayesian variational approximations (BVA). To illustrate all these aspects, we consider a deconvolution problem where we know that the input signal is sparse and propose to use a Student-t prior for it. Then, to handle the Bayesian computations with this model, we use the property that the Student-t distribution can be modelled via an infinite mixture of Gaussians, thus introducing hidden variables which are the variances. The expression of the joint posterior of the input signal samples, the hidden variables (here the inverse variances of those samples) and the hyperparameters of the problem (for example the variance of the noise) is then given. From this point, we present the joint maximization by alternate optimization and the three possible approximation methods. Finally, the proposed methodology is applied in different applications such as mass spectrometry, spectrum estimation of quasi-periodic biological signals and
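
    The Student-t-as-Gaussian-mixture construction described above can be sketched concretely. The following Python toy alternates between the hidden inverse variances and the signal (joint MAP by alternate optimization) for a sparse-spike deconvolution; the sizes, blur kernel and noise level are assumptions, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 120
x_true = np.zeros(n); x_true[[20, 55, 90]] = [4.0, -3.0, 5.0]   # sparse spikes
h = np.exp(-0.5 * (np.arange(-6, 7) / 2.0) ** 2)                # blur kernel
H = np.array([np.convolve(np.eye(n)[i], h, mode='same') for i in range(n)]).T
sigma = 0.3
y = H @ x_true + rng.normal(0, sigma, n)          # blurred, noisy observation

# Student-t prior on x written as a Gaussian scale mixture: x_i ~ N(0, 1/v_i)
# with Gamma-distributed hidden precisions v_i. Alternate the two MAP updates.
nu = 2.0                                          # degrees of freedom
x = np.zeros(n)
for _ in range(30):
    v = (nu - 1.0) / (x ** 2 + nu)                # maximize over hidden precisions
    A = H.T @ H / sigma ** 2 + np.diag(v)         # maximize over the signal:
    x = np.linalg.solve(A, H.T @ y / sigma ** 2)  # a weighted ridge solve

print(np.round(x[[20, 55, 90]], 2))               # estimated spike amplitudes
```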

  6. Character State Reconstruction of Call Diversity in the Neoconocephalus Katydids Reveals High Levels of Convergence.

    Science.gov (United States)

    Frederick, Katy; Schul, Johannes

    2016-01-01

    The katydid genus Neoconocephalus is characterized by high diversity of the acoustic communication system. Both male signals and female preferences have been thoroughly studied in the past. This study used Bayesian character state reconstruction to elucidate the evolutionary history of diverse call traits, based on an existing, well supported phylogenetic hypothesis. The most common male call pattern consisted of continuous calls comprising one fast pulse rate; this pattern is the likely ancestral state in this genus. Three lines of call divergence existed among the species of the genus. First, four species had significantly slower pulse rates. Second, five species had alternating pulse periods, resulting in a double pulse rhythm. Third, several species had discontinuous calls, when pulses were grouped into rhythmically repeated verses. Bayesian character state reconstruction revealed that the double-pulse pattern likely evolved convergently five times; the slow pulse rate also evolved four times independently. Discontinuous calls have evolved twice and occur in two clades; each of which contains reversals to the ancestral continuous calls. Pairwise phylogenetically independent contrast analyses among the three call traits found no significant correlations among the character states of the different traits, supporting the independent evolution of the three call traits. PMID:27110432

  7. Book review: Bayesian analysis for population ecology

    Science.gov (United States)

    Link, William A.

    2011-01-01

    Brian Dennis described the field of ecology as “fertile, uncolonized ground for Bayesian ideas.” He continued: “The Bayesian propagule has arrived at the shore. Ecologists need to think long and hard about the consequences of a Bayesian ecology. The Bayesian outlook is a successful competitor, but is it a weed? I think so.” (Dennis 2004)

  8. Bayesian grid matching

    DEFF Research Database (Denmark)

    Hartelius, Karsten; Carstensen, Jens Michael

    2003-01-01

A method for locating distorted grid structures in images is presented. The method is based on the theories of template matching and Bayesian image restoration. The grid is modeled as a deformable template. Prior knowledge of the grid is described through a Markov random field (MRF) model which represents the spatial coordinates of the grid nodes. Knowledge of how grid nodes are depicted in the observed image is described through the observation model. The prior consists of a node prior and an arc (edge) prior, both modeled as Gaussian MRFs. The node prior models variations in the positions of grid nodes and the arc prior models variations in row and column spacing across the grid. Grid matching is done by placing an initial rough grid over the image and applying an ensemble annealing scheme to maximize the posterior distribution of the grid. The method can be applied to noisy images with missing…

  9. Analysis of trace elements in the shells of short-necked clam Ruditapes philippinarum (Mollusca: Bivalvia) with respect to reconstruction of individual life history

    International Nuclear Information System (INIS)

Strontium (Sr) concentrations in the shells of short-necked clams collected at different locations (Shirahama, a warm area, and Maizuru, a cold area, Japan) were analyzed by two methods, PIXE and EPMA. The Sr concentration of the external surface of the shell umbo, which is formed during a short term at the early benthic phase, was analyzed by PIXE and ranged from 1000 to 3500 ppm among individuals. The Sr concentration of clams collected at Shirahama showed a positive correlation with shell length (SL) in individuals with SL < 31 mm, whereas clams collected at Maizuru did not show a significant correlation. This result may be caused by the difference in spawning seasons between the two areas. The Sr concentration of the cross section of the shell umbo, which thickens continuously during the clam's life to form a faint stratum structure, was analyzed by EPMA along a line across the stratum structure. Some surges and long-term waving patterns of the Sr concentration were observed. These results suggest that the life histories of individual clams could be recorded in the shell umbo cross sections as variations in trace elements, and that analyses of trace elements could clarify the histories of individual clams. (author)

  10. Current trends in Bayesian methodology with applications

    CERN Document Server

    Upadhyay, Satyanshu K; Dey, Dipak K; Loganathan, Appaia

    2015-01-01

    Collecting Bayesian material scattered throughout the literature, Current Trends in Bayesian Methodology with Applications examines the latest methodological and applied aspects of Bayesian statistics. The book covers biostatistics, econometrics, reliability and risk analysis, spatial statistics, image analysis, shape analysis, Bayesian computation, clustering, uncertainty assessment, high-energy astrophysics, neural networking, fuzzy information, objective Bayesian methodologies, empirical Bayes methods, small area estimation, and many more topics.Each chapter is self-contained and focuses on

  11. Reconstruction of the Transmission History of RNA Virus Outbreaks Using Full Genome Sequences: Foot-and-Mouth Disease Virus in Bulgaria in 2011

    DEFF Research Database (Denmark)

    Valdazo-González, Begoña; Polihronova, Lilyana; Alexandrov, Tsviatko;

    2012-01-01

… the origin and transmission history of the FMD outbreaks which occurred during 2011 in Burgas Province, Bulgaria, a country that had previously been FMD-free without vaccination since 1996. Nineteen full genome sequences (FGS) of FMD virus (FMDV) were generated and analysed, including eight representative … identified in wild boar. The closest relative from outside of Bulgaria was an FMDV collected during 2010 in Bursa (Anatolia, Turkey). Within Bulgaria, two discrete genetic clusters were detected that corresponded to two episodes of outbreaks that occurred during January and March-April 2011. The number of nucleotide substitutions that were present between, and within, these separate clusters provided evidence that undetected FMDV infection had occurred. These conclusions are supported by laboratory data that subsequently identified three additional FMDV-infected livestock premises by serosurveillance, as well…

  12. Unfavourable results in thumb reconstruction

    Directory of Open Access Journals (Sweden)

    Samir M Kumta

    2013-01-01

The history of thumb reconstruction parallels the history of hand surgery. The attributes that make the thumb unique, and that the reconstructive surgeon must assess and try to restore when reconstructing a thumb, are: position, stability, strength, length, motion, sensibility and appearance. Deficiency in any of these attributes can reduce the utility of the reconstructed thumb. A detailed assessment of the patient and his requirements needs to be performed before embarking on a thumb reconstruction. Most unsatisfactory results can be attributed to the wrong choice of procedure. Component defects of the thumb are commonly treated with tissue from adjacent fingers, the hand or the forearm. With refinements in microsurgery, the foot has become a major source of tissue for component replacement in the thumb. Bone lengthening, osteoplastic reconstruction, pollicisation and toe-to-hand transfers are the commonest methods of thumb reconstruction. Unfavourable results can be classified as functional and aesthetic. Some are common to all types of procedures; however, each type of reconstruction has its own unique set of problems. Meticulous planning and execution are essential to give an aesthetic and functionally useful thumb. Secondary surgeries such as tendon transfers, bone grafting, debulking and arthrodesis may be required to correct deficiencies in the reconstruction. Attention needs to be paid to the donor site as well.

  13. History of human activity in last 800 years reconstructed from combined archive data and high-resolution analyses of varved lake sediments from Lake Czechowskie, Northern Poland

    Science.gov (United States)

    Słowiński, Michał; Tyszkowski, Sebastian; Ott, Florian; Obremska, Milena; Kaczmarek, Halina; Theuerkauf, Martin; Wulf, Sabine; Brauer, Achim

    2016-04-01

The aim of the study was to reconstruct human and landscape development in the Tuchola Pinewoods (Northern Poland) during the last 800 years. We apply an approach that combines historic maps and documents with pollen data. Pollen data were obtained from varved lake sediments at a resolution of 5 years. The chronology of the sediment record is based on varve counting, AMS 14C dating, 137Cs activity concentration measurements and tephrochronology (Askja AD 1875). We applied the REVEALS model to translate pollen percentage data into regional plant abundances. The interpretation of the pollen record is furthermore based on pollen accumulation rate data. The pollen record and historic documents show similar trends in vegetation development. During the first phase (AD 1200-1412), the Lake Czechowskie area was still largely forested with Quercus, Carpinus and Pinus forests. Vegetation was more open during the second phase (AD 1412-1776), and reached maximum openness during the third phase (AD 1776-1905). Furthermore, intensified forest management led to a transformation from mixed to pine-dominated forests during this period. Since the early 20th century, the forest cover has increased again, with Scots pine dominating the stands. While pollen and historic data show similar trends, they differ substantially in the degree of openness during the four phases, with pollen data commonly suggesting more open conditions. We discuss potential causes for this discrepancy, which include unsuitable parameter settings in REVEALS and unknown changes in forest structure. Using pollen accumulation data as a third proxy record, we aim to identify the most probable causes. Finally, we discuss the observed vegetation change in relation to the socio-economic development of the area. This study is a contribution to the Virtual Institute of Integrated Climate and Landscape Evolution Analysis - ICLEA - of the Helmholtz Association and National Science Centre, Poland (grant No. 2011/01/B/ST10

  14. Bayesian Methods and Universal Darwinism

    OpenAIRE

    Campbell, John

    2010-01-01

    Bayesian methods since the time of Laplace have been understood by their practitioners as closely aligned to the scientific method. Indeed a recent champion of Bayesian methods, E. T. Jaynes, titled his textbook on the subject Probability Theory: the Logic of Science. Many philosophers of science including Karl Popper and Donald Campbell have interpreted the evolution of Science as a Darwinian process consisting of a 'copy with selective retention' algorithm abstracted from Darwin's theory of...

  15. Portfolio Allocation for Bayesian Optimization

    OpenAIRE

    Brochu, Eric; Hoffman, Matthew W.; De Freitas, Nando

    2010-01-01

    Bayesian optimization with Gaussian processes has become an increasingly popular tool in the machine learning community. It is efficient and can be used when very little is known about the objective function, making it popular in expensive black-box optimization scenarios. It uses Bayesian methods to sample the objective efficiently using an acquisition function which incorporates the model's estimate of the objective and the uncertainty at any given point. However, there are several differen...

  16. Neuroanatomy, neurology and Bayesian networks

    OpenAIRE

    Bielza Lozoya, Maria Concepcion

    2014-01-01

    Bayesian networks are data mining models with clear semantics and a sound theoretical foundation. In this keynote talk we will pinpoint a number of neuroscience problems that can be addressed using Bayesian networks. In neuroanatomy, we will show computer simulation models of dendritic trees and classification of neuron types, both based on morphological features. In neurology, we will present the search for genetic biomarkers in Alzheimer's disease and the prediction of health-related qualit...

  17. Bayesian Networks and Influence Diagrams

    DEFF Research Database (Denmark)

    Kjærulff, Uffe Bro; Madsen, Anders Læsø

 Probabilistic networks, also known as Bayesian networks and influence diagrams, have become one of the most promising technologies in the area of applied artificial intelligence, offering intuitive, efficient, and reliable methods for diagnosis, prediction, decision making, classification, troubleshooting, and data mining under uncertainty. Bayesian Networks and Influence Diagrams: A Guide to Construction and Analysis provides a comprehensive guide for practitioners who wish to understand, construct, and analyze intelligent systems for decision support based on probabilistic networks. Intended…

  18. Bayesian Interpretations of Heteroskedastic Consistent Covariance Estimators Using the Informed Bayesian Bootstrap

    OpenAIRE

    Dale Poirier

    2008-01-01

    This paper provides Bayesian rationalizations for White’s heteroskedastic consistent (HC) covariance estimator and various modifications of it. An informed Bayesian bootstrap provides the statistical framework.
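
    A minimal sketch of the idea, with invented data and no claim to match the paper's derivations: Dirichlet-weighted (Bayesian bootstrap) regressions produce a posterior spread for the coefficients that tracks White's HC0 sandwich standard errors under heteroskedastic noise.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 200
x = rng.uniform(0, 2, n)
y = 1.0 + 2.0 * x + rng.normal(0, 0.5 + 0.5 * x)   # heteroskedastic noise
X = np.column_stack([np.ones(n), x])

# Bayesian bootstrap: Dirichlet(1,...,1) weights on observations, weighted LS.
draws = []
for _ in range(2000):
    w = rng.dirichlet(np.ones(n))
    beta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y))
    draws.append(beta)
draws = np.array(draws)

# Compare with White's HC0 sandwich estimator of the OLS covariance.
beta_ols = np.linalg.solve(X.T @ X, X.T @ y)
e = y - X @ beta_ols
XtX_inv = np.linalg.inv(X.T @ X)
hc0 = XtX_inv @ (X.T @ (e[:, None] ** 2 * X)) @ XtX_inv
print(draws.std(axis=0), np.sqrt(np.diag(hc0)))    # similar standard errors
```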

  19. Approximate Bayesian Image Interpretation using Generative Probabilistic Graphics Programs

    OpenAIRE

    Mansinghka, Vikash K.; Kulkarni, Tejas D.; Perov, Yura N.; Tenenbaum, Joshua B.

    2013-01-01

    The idea of computer vision as the Bayesian inverse problem to computer graphics has a long history and an appealing elegance, but it has proved difficult to directly implement. Instead, most vision tasks are approached via complex bottom-up processing pipelines. Here we show that it is possible to write short, simple probabilistic graphics programs that define flexible generative models and to automatically invert them to interpret real-world images. Generative probabilistic graphics program...

  20. Paleoenvironmental reconstruction of a downslope accretion history: From coralgal-coralline sponge rubble to mud mound deposits (Eocene, Ainsa Basin, Spain)

    Science.gov (United States)

    Rodríguez-Martínez, Marta; Reitner, Joachim

    2015-12-01

In the Lutetian intraslope Ainsa sub-basin, small, sub-spherical carbonate mud mounds occur in association with hemipelagic marls and mixed gravity-flow deposits. The studied mud mounds consist of a mixture of allochthonous, parautochthonous and autochthonous components that show evidence of reworking, bioerosion and accretion by different fossil assemblages at different growth stages. Crusts of microbial-lithistid sponges played an important role in stabilizing the rubble of coralgal-coralline sponges and formed low-relief benthic patches in a dominantly marly soft slope environment. These accidental hard substrates turned into suitable initiation/nucleation sites for automicrite production (dense and peloidal automicrites), on which the small mud mounds, dominated by opportunistic epi- and infaunal heterozoan assemblages, grew. Detailed microfacies mapping and paleoenvironmental analysis reveal a multi-episodic downslope accretion history featuring demosponges (coralline and lithistid sponges), agariciid corals, calcareous red algae, putative microbial benthic communities and diverse sclerobionts from the upper slope to the middle slope. The analyzed mud mound microfacies are compared with similar fossil assemblages and growth fabrics described in many fossil mud mounds, and with recent deep-water fore-reef and cave environments.

  1. Reconstruction of the Category System of the History of Chinese Psychological Thought

    Institute of Scientific and Technical Information of China (English)

    彭彦琴

    2001-01-01

Further raising the theoretical level of the history of Chinese psychological thought calls for the construction of a new category system. This paper analyses the shortcomings of existing accounts of categories, proposes several principles for category construction, and finally explains in detail a category system for the history of Chinese psychological thought constructed with human nature as its meta-category against the background of the oneness of heaven and man, displaying its internal logical vein, profound implications and distinct character.

  2. Reconstructing reticulation history in a phylogenetic framework and the potential of allopatric speciation driven by polyploidy in an agamic complex in Crataegus (Rosaceae).

    Science.gov (United States)

    Lo, Eugenia Y Y; Stefanović, Saša; Dickinson, Timothy A

    2010-12-01

    Polyploidy plays a prominent role in the speciation process in plants. Many species are known to be part of agamic complexes comprising sexual diploids and more or less exclusively asexual polyploids. However, polyploid formation has been studied in very few cases, primarily because of the challenges in examining these cases phylogenetically. In this study, we demonstrate the use of a variety of phylogenetic approaches to unravel origins and infer reticulation history in a diploid-polyploid complex of black-fruited Crataegus. The tree approaches are shown to be useful in testing alternative hypotheses and in revealing genealogies of nuclear genes, particularly in polyploid organisms that may contain multiple copies. Compared to trees, network approaches provide a better indication of reticulate relationships among recently diverged taxa. Taken together, our data point to both the autopolyploid and allopolyploid origins of triploids in natural populations of Crataegus suksdorfii, whereas tetraploids are formed via a triploid bridge, involving the backcross of allotriploid offspring with their diploid C. suksdorfii parent, followed by gene introgression from sympatric C. douglasii. Our findings provide empirical evidence for different pathways of polyploid formation that are all likely to occur within natural populations and the allopatric establishment of neopolyploids subsequent to their formation. PMID:20561052

  3. Bayesian Kinematic Finite Fault Source Models (Invited)

    Science.gov (United States)

    Minson, S. E.; Simons, M.; Beck, J. L.

    2010-12-01

    Finite fault earthquake source models are inherently under-determined: there is no unique solution to the inverse problem of determining the rupture history at depth as a function of time and space when our data are only limited observations at the Earth's surface. Traditional inverse techniques rely on model constraints and regularization to generate one model from the possibly broad space of all possible solutions. However, Bayesian methods allow us to determine the ensemble of all possible source models which are consistent with the data and our a priori assumptions about the physics of the earthquake source. Until now, Bayesian techniques have been of limited utility because they are computationally intractable for problems with as many free parameters as kinematic finite fault models. We have developed a methodology called Cascading Adaptive Tempered Metropolis In Parallel (CATMIP) which allows us to sample very high-dimensional problems in a parallel computing framework. The CATMIP algorithm combines elements of simulated annealing and genetic algorithms with the Metropolis algorithm to dynamically optimize the algorithm's efficiency as it runs. We will present synthetic performance tests of finite fault models made with this methodology as well as a kinematic source model for the 2007 Mw 7.7 Tocopilla, Chile earthquake. This earthquake was well recorded by multiple ascending and descending interferograms and a network of high-rate GPS stations whose records can be used as near-field seismograms.
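
    The tempering-plus-resampling idea behind CATMIP can be illustrated on a toy posterior (the real algorithm adapts its ladder and proposals and runs in parallel). The following Python sketch moves an ensemble from the prior to the posterior through fixed tempering stages; the 2-D "misfit" and all tuning constants are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(6)

def log_like(theta):                              # pretend data misfit, 2-D model
    return -0.5 * np.sum((theta - np.array([1.0, -2.0])) ** 2 / 0.1 ** 2, axis=-1)

N = 2000
theta = rng.normal(0, 3, size=(N, 2))             # ensemble drawn from the prior
betas = np.linspace(0, 1, 21)                     # tempering ladder, prior -> posterior

for b0, b1 in zip(betas[:-1], betas[1:]):
    # Reweight by the likelihood increment and resample the ensemble.
    logw = (b1 - b0) * log_like(theta)
    w = np.exp(logw - logw.max()); w /= w.sum()
    theta = theta[rng.choice(N, size=N, p=w)]
    # Rejuvenate with a few random-walk Metropolis steps at temperature b1
    # (target: prior x likelihood^b1; prior is N(0, 3^2) per component).
    for _ in range(5):
        prop = theta + rng.normal(0, 0.2, theta.shape)
        log_a = b1 * (log_like(prop) - log_like(theta)) \
                + 0.5 * (np.sum(theta ** 2, axis=1) - np.sum(prop ** 2, axis=1)) / 9.0
        accept = np.log(rng.uniform(size=N)) < log_a
        theta[accept] = prop[accept]

print(theta.mean(axis=0))                         # close to the "true" [1, -2]
```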

  4. History of land use in India during 1880-2010: Large-scale land transformations reconstructed from satellite data and historical archives

    Science.gov (United States)

    Tian, Hanqin; Banger, Kamaljit; Bo, Tao; Dadhwal, Vinay K.

    2014-10-01

In India, the human population has increased six-fold from 200 million to 1200 million, which, coupled with economic growth, has resulted in significant land use and land cover (LULC) changes during 1880-2010. However, large discrepancies in the existing LULC datasets have hindered our efforts to better understand interactions among human activities, climate systems, and ecosystems in India. In this study, we incorporated high-resolution remote sensing datasets from Resourcesat-1 and historical archives at district (N = 590) and state (N = 30) levels to generate LULC datasets at 5 arc minute resolution during 1880-2010 in India. Results show that a significant loss of forests (from 89 million ha to 63 million ha) occurred during the study period. Interestingly, the deforestation rate was relatively greater under British rule (1880-1950s) and in the early decades after independence, and then decreased after the 1980s due to government policies to protect the forests. In contrast to forests, cropland area increased from 92 million ha to 140.1 million ha during 1880-2010. Greater cropland expansion occurred during the 1950-1980s, coinciding with the period of farm mechanization, electrification, and the introduction of high-yielding crop varieties as a result of government policies to achieve self-sufficiency in food production. The rate of urbanization was slower during 1880-1940 but increased significantly after the 1950s, probably due to rapid population increase and economic growth in India. Our study provides the most reliable estimates of historical LULC at the regional scale in India. This is the first attempt to incorporate newly developed high-resolution remote sensing datasets and inventory archives to reconstruct the time series of LULC records for such a long period in India. The spatial and temporal information on LULC derived from this study could be used for ecosystem, hydrological, and climate modeling, as well as by policy makers for assessing the

  5. Research of Gene Regulatory Network with Multi-Time Delay Based on Bayesian Network

    Institute of Scientific and Technical Information of China (English)

    LIU Bei; MENG Fanjiang; LI Yong; LIU Liyan

    2008-01-01

The gene regulatory network was reconstructed from time-series microarray data obtained from hybridizations of gene chips at different times, in order to analyze coordination and restriction between genes. An algorithm for inferring the gene expression regulatory network of the whole cell was designed using Bayesian networks, which provides an effective aid for the analysis of gene regulatory networks.

  6. Evaluation of Bayesian tensor estimation using tensor coherence

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Dae-Jin; Park, Hae-Jeong [Laboratory of Molecular Neuroimaging Technology, Brain Korea 21 Project for Medical Science, Yonsei University, College of Medicine, Seoul (Korea, Republic of); Kim, In-Young [Department of Biomedical Engineering, Hanyang University, Seoul (Korea, Republic of); Jeong, Seok-Oh [Department of Statistics, Hankuk University of Foreign Studies, Yongin (Korea, Republic of)], E-mail: parkhj@yuhs.ac

    2009-06-21

Fiber tractography, a unique and non-invasive method to estimate axonal fibers within white matter, constructs putative streamlines from diffusion tensor MRI by interconnecting voxels according to the propagation direction defined by the diffusion tensor. This direction has uncertainties due to the properties of the underlying fiber bundles, neighboring structures and image noise. Therefore, robust estimation of the diffusion direction is essential for reconstructing reliable fiber pathways. For this purpose, we propose a tensor estimation method using a Bayesian framework, which includes an a priori probability distribution based on tensor coherence indices, to utilize both the neighborhood direction information and the inertia moment as regularization terms. The reliability of the proposed tensor estimation was evaluated using Monte Carlo simulations in terms of accuracy and precision, with four synthetic tensor fields at various SNRs and in vivo human data of brain and calf muscle. The proposed Bayesian estimation demonstrated relative robustness to noise and higher reliability compared to simple tensor regression.

  7. Sparse Bayesian Learning for DOA Estimation with Mutual Coupling

    Directory of Open Access Journals (Sweden)

    Jisheng Dai

    2015-10-01

Sparse Bayesian learning (SBL) has brought renewed interest to the problem of direction-of-arrival (DOA) estimation. It is generally assumed that the measurement matrix in SBL is precisely known. Unfortunately, this assumption may be invalid in practice due to the imperfect manifold caused by unknown or misspecified mutual coupling. This paper describes a modified SBL method for joint estimation of DOAs and mutual coupling coefficients with uniform linear arrays (ULAs). Unlike the existing method that only uses stationary priors, our new approach utilizes a hierarchical form of the Student-t prior to enforce the sparsity of the unknown signal more heavily. We also provide a distinct Bayesian inference for the expectation-maximization (EM) algorithm, which can update the mutual coupling coefficients more efficiently. Another difference is that our method uses an additional singular value decomposition (SVD) to reduce the computational complexity of the signal reconstruction process and the sensitivity to the measurement noise.
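
    Setting aside the mutual-coupling extension, the SBL core is compact. Here is a minimal EM-style sparse Bayesian learning sketch in Python for y = A s + n; the dictionary, sparsity and noise level are invented, and the update rules are the textbook SBL ones rather than this paper's modified inference.

```python
import numpy as np

# SBL: per-coefficient Gaussian priors s_i ~ N(0, 1/alpha_i); EM drives most
# precisions alpha_i to infinity, pruning the corresponding coefficients.
rng = np.random.default_rng(7)
m, n = 30, 60
A = rng.normal(size=(m, n)) / np.sqrt(m)          # overcomplete dictionary
s_true = np.zeros(n); s_true[[5, 17, 42]] = [1.0, -1.5, 2.0]
sigma2 = 0.01
y = A @ s_true + rng.normal(0, np.sqrt(sigma2), m)

alpha = np.ones(n)                                # hyperparameters (precisions)
for _ in range(50):
    Sigma = np.linalg.inv(A.T @ A / sigma2 + np.diag(alpha))
    mu = Sigma @ A.T @ y / sigma2                 # posterior mean of s
    alpha = 1.0 / (mu ** 2 + np.diag(Sigma))      # EM update for the precisions

print(np.round(mu[[5, 17, 42]], 2))               # large coefficients survive
print(np.sum(np.abs(mu) > 0.1))                   # roughly 3 nonzero entries
```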

  8. Nonparametric Bayesian Classification

    CERN Document Server

    Coram, M A

    2002-01-01

A Bayesian approach to the classification problem is proposed in which random partitions play a central role. It is argued that the partitioning approach has the capacity to take advantage of a variety of large-scale spatial structures, if they are present in the unknown regression function $f_0$. An idealized one-dimensional problem is considered in detail. The proposed nonparametric prior uses random split points to partition the unit interval into a random number of pieces. This prior is found to provide a consistent estimate of the regression function in the $L^p$ topology, for any $1 \leq p < \infty$, and for arbitrary measurable $f_0:[0,1] \rightarrow [0,1]$. A Markov chain Monte Carlo (MCMC) implementation is outlined and analyzed. Simulation experiments are conducted to show that the proposed estimate compares favorably with a variety of conventional estimators. A striking resemblance between the posterior mean estimate and the bagged CART estimate is noted and discussed. For higher dimensions, a ...

  9. BAT - Bayesian Analysis Toolkit

    International Nuclear Information System (INIS)

One of the most vital steps in any data analysis is the statistical analysis and comparison with the prediction of a theoretical model. The many uncertainties associated with the theoretical model and the observed data require a robust statistical analysis tool. The Bayesian Analysis Toolkit (BAT) is a powerful statistical analysis software package based on Bayes' theorem, developed to evaluate the posterior probability distribution for models and their parameters. It implements Markov chain Monte Carlo to obtain the full posterior probability distribution, which in turn provides straightforward parameter estimation, limit setting and uncertainty propagation. Additional algorithms, such as simulated annealing, allow evaluation of the global mode of the posterior. BAT is developed in C++ and allows for a flexible definition of models. A set of predefined models covering standard statistical cases is also included in BAT. It has been interfaced to other commonly used software packages such as ROOT, Minuit, RooStats and CUBA. An overview of the software and its algorithms is provided, along with several physics examples covering a range of applications of this statistical tool. Future plans, new features and recent developments are briefly discussed.

  10. Genetic characterisation of Porcine circovirus type 2 (PCV2) strains from feral pigs in the Brazilian Pantanal: An opportunity to reconstruct the history of PCV2 evolution.

    Science.gov (United States)

    Franzo, Giovanni; Cortey, Martí; de Castro, Alessandra Marnie Martins Gomes; Piovezan, Ubiratan; Szabo, Matias Pablo Juan; Drigo, Michele; Segalés, Joaquim; Richtzenhain, Leonardo José

    2015-07-01

Since its discovery, Porcine circovirus type 2 has emerged as one of the most relevant swine infectious diseases, causing substantial economic losses for the pig industry. While four genotypes have been identified, only three (PCV2a, PCV2b and PCV2d) are currently circulating and display a worldwide distribution. Another genotype, PCV2c, has been described only once, in Danish archive samples collected between 1980 and 1990. In addition to commercial pigs, PCV2 has been demonstrated to infect wild boars and other wild species, which can potentially serve as a reservoir for domestic populations. In this study, eight sequences obtained from feral pigs in the Pantanal region (Mato Grosso do Sul State, Brazil) were compared with reference sequences and other Brazilian sequences, and the results revealed remarkable genetic diversity, with all four currently recognised genotypes being detected (PCV2a, PCV2b, PCV2c and PCV2d). This finding represents a remarkable discovery, as it is the first detection of PCV2c since 1990 and the first-ever detection of PCV2c in live animals. The peculiar population history and ecological scenario of feral pigs in the Pantanal, coupled with the complex and still only partially known relationship of feral pigs with other PCV2-susceptible species (i.e., domestic pigs, wild boars and peccaries), open exciting questions concerning PCV2 origin and evolution. Overall, the results of the present study led us to the following hypothesis: the PCV2 strains found in feral pigs may be the last descendants of the strains that circulated among European pigs in the past, or they may have infected these feral pigs more recently through a bridge species. PMID:25975522

  11. Reconstruction of the Earthquake History of Limestone Fault Scarps in Knidos Fault Zone Using in-situ Chlorine-36 Exposure Dating and "R" Programming Language

    Science.gov (United States)

    Sahin, Sefa; Yildirim, Cengiz; Akif Sarikaya, Mehmet; Tuysuz, Okan; Genc, S. Can; Ersen Aksoy, Murat; Ertekin Doksanalti, Mustafa

    2016-04-01

Cosmogenic surface exposure dating is based on the production of rare nuclides in exposed rocks that interact with cosmic rays. Through modelling of measured 36Cl concentrations, we can obtain information about the history of earthquake activity. Yet there are several factors which may affect the production of rare nuclides, such as the geometry of the fault, topography, the geographic location of the study area, temporal variations of the Earth's magnetic field, self-cover and the denudation rate on the scarp. Recently developed models provide a method to infer the timing of earthquakes and slip rates on limited scales by taking these parameters into account. Our study area, the Knidos Fault Zone, is located on the Datça Peninsula in Southwestern Anatolia and contains several normal fault scarps formed within limestone, which are suitable for cosmogenic chlorine-36 (36Cl) dating models. Since it has a well-preserved scarp, we have focused on the Mezarlık Segment of the fault zone, which has an average length of 300 m and a height of 12-15 m. 128 continuous samples were collected from top to bottom of the fault scarp to analyze cosmogenic 36Cl concentrations. The main purpose of this study is to analyze the factors affecting the production rates and the amount of cosmogenic 36Cl nuclide concentration. 36Cl concentrations are measured by AMS laboratories. From the local production rates and the measured concentrations of the cosmogenic isotopes, we can calculate exposure ages of the samples. Recent research has elucidated each step of the application of this method in the Matlab programming language (e.g. Schlagenhauf et al., 2010), which is very helpful for generating models of the Quaternary activity of normal faults. We, however, wanted to build a user-friendly program in the open-source programming language "R" (GNU Project) that might help those without knowledge of complex mathematical programming, making calculations as easy and understandable as
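
    The exposure-age step can be sketched from the standard radionuclide buildup equation N(t) = (P/λ)(1 − e^(−λt)), ignoring the erosion, self-cover and scarp-geometry scaling that full models such as Schlagenhauf et al.'s handle. The production rate below is a placeholder, not a calibrated site value.

```python
import numpy as np

# Back-of-the-envelope exposure age from cosmogenic radionuclide buildup.
# Assumes constant production P (atoms/g/yr), no erosion, no inheritance.
LAMBDA_CL36 = np.log(2) / 3.01e5          # 36Cl decay constant (half-life ~301 kyr)

def exposure_age(n_atoms_per_g, prod_rate):
    """Invert the buildup equation: t = -ln(1 - N*lam/P) / lam, in years."""
    return -np.log(1.0 - n_atoms_per_g * LAMBDA_CL36 / prod_rate) / LAMBDA_CL36

# Example: 2e5 atoms/g measured, assumed local production of 25 atoms/g/yr.
print(f"{exposure_age(2e5, 25.0):,.0f} yr")        # ~8,100 yr for these numbers
```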

  12. Complementarity of statistical treatments to reconstruct worldwide routes of invasion: The case of the Asian ladybird Harmonia axyridis

    Science.gov (United States)

Molecular markers can provide clear insight into the introduction history of invasive species. However, inferences about recent introduction histories remain challenging because of the stochastic demographic processes often involved. Approximate Bayesian computation (ABC) can help…

  13. Coenocline reconstruction using graph theory and Bayesian probability data generator

    Czech Academy of Sciences Publication Activity Database

    Čejchan, Petr

Krakow : Royal Society, 2007. s. 1-2. [United Kingdom -Visegrad Frontiers of Science Symposium /4./. 21.02.2007-23.02.2007, Krakow] Institutional research plan: CEZ:AV0Z30130516 Keywords : coenocline * gradient analysis * palaeoecology Subject RIV: DB - Geology ; Mineralogy

  14. Reconstructing the Tengger calendar

    Directory of Open Access Journals (Sweden)

    Ian Proudfoot

    2008-12-01

The survival of an Indic calendar among the Tengger people of the Brama highlands in east Java opens a window on Java’s calendar history. Its hybrid form reflects accommodations between this non-Muslim Javanese group and the increasingly dominant Muslim Javanese culture. Reconstruction is challenging because of this hybridity, because of inconsistencies in practice, and because the historical evidence is sketchy and often difficult to interpret.

  16. The subjectivity of scientists and the Bayesian statistical approach

    CERN Document Server

    Press, James S

    2001-01-01

Comparing and contrasting the reality of subjectivity in the work of history's great scientists and the modern Bayesian approach to statistical analysis. Scientists and researchers are taught to analyze their data from an objective point of view, allowing the data to speak for themselves rather than assigning them meaning based on expectations or opinions. But scientists have never behaved fully objectively. Throughout history, some of our greatest scientific minds have relied on intuition, hunches, and personal beliefs to make sense of empirical data-and these subjective influences have often a

  17. Bayesian seismic AVO inversion

    Energy Technology Data Exchange (ETDEWEB)

    Buland, Arild

    2002-07-01

    A new linearized AVO inversion technique is developed in a Bayesian framework. The objective is to obtain posterior distributions for P-wave velocity, S-wave velocity and density. Distributions for other elastic parameters can also be assessed, for example acoustic impedance, shear impedance and P-wave to S-wave velocity ratio. The inversion algorithm is based on the convolutional model and a linearized weak contrast approximation of the Zoeppritz equation. The solution is represented by a Gaussian posterior distribution with explicit expressions for the posterior expectation and covariance, hence exact prediction intervals for the inverted parameters can be computed under the specified model. The explicit analytical form of the posterior distribution provides a computationally fast inversion method. Tests on synthetic data show that all inverted parameters were almost perfectly retrieved when the noise approached zero. With realistic noise levels, acoustic impedance was the best determined parameter, while the inversion provided practically no information about the density. The inversion algorithm has also been tested on a real 3-D dataset from the Sleipner Field. The results show good agreement with well logs but the uncertainty is high. The stochastic model includes uncertainties of both the elastic parameters, the wavelet and the seismic and well log data. The posterior distribution is explored by Markov chain Monte Carlo simulation using the Gibbs sampler algorithm. The inversion algorithm has been tested on a seismic line from the Heidrun Field with two wells located on the line. The uncertainty of the estimated wavelet is low. In the Heidrun examples the effect of including uncertainty of the wavelet and the noise level was marginal with respect to the AVO inversion results. We have developed a 3-D linearized AVO inversion method with spatially coupled model parameters where the objective is to obtain posterior distributions for P-wave velocity, S
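
    The closed-form posterior the abstract mentions is the standard linear-Gaussian result. A minimal Python sketch with a random toy operator in place of the convolutional AVO forward model (the model sizes and covariances are assumptions):

```python
import numpy as np

# Linear forward model with Gaussian prior and noise:
#   m ~ N(mu0, S0),  d = G m + e,  e ~ N(0, Se)  =>  m | d ~ N(mu_post, S_post)
rng = np.random.default_rng(8)
n_m, n_d = 3, 40                                  # e.g. [Vp, Vs, rho] contrasts
G = rng.normal(size=(n_d, n_m))                   # toy linearized forward operator
mu0 = np.zeros(n_m)
S0 = np.diag([0.5, 0.5, 0.2]) ** 2                # prior covariance
Se = 0.1 ** 2 * np.eye(n_d)                       # noise covariance

m_true = np.array([0.3, -0.2, 0.05])
d = G @ m_true + rng.normal(0, 0.1, n_d)          # synthetic data

K = S0 @ G.T @ np.linalg.inv(G @ S0 @ G.T + Se)   # "Kalman gain" form
mu_post = mu0 + K @ (d - G @ mu0)                 # posterior expectation
S_post = S0 - K @ G @ S0                          # posterior covariance
print(np.round(mu_post, 3))                       # near m_true
print(np.round(np.sqrt(np.diag(S_post)), 3))      # exact posterior std devs
```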

  18. Penile reconstruction

    OpenAIRE

    Garaffa, Giulio; Sansalone, Salvatore; Ralph, David J.

    2012-01-01

    During the most recent years, a variety of new techniques of penile reconstruction have been described in the literature. This paper focuses on the most recent advances in male genital reconstruction after trauma, excision of benign and malignant disease, in gender reassignment surgery and aphallia with emphasis on surgical technique, cosmetic and functional outcome.

  20. Iterative image reconstruction in ECT

    International Nuclear Information System (INIS)

A series of preliminary studies has been performed in the authors' laboratories to explore the use of a priori information in Bayesian image restoration and reconstruction. One piece of a priori information is the fact that intensities of neighboring pixels tend to be similar if they belong to the same region within which similar tissue characteristics are exhibited. This property of local continuity can be modeled by the use of Gibbs priors, as first suggested by Geman and Geman. In their investigation, they also included line sites between each pair of neighboring pixels in the Gibbs prior and used discrete binary numbers to indicate the absence or presence of boundaries between regions. These two features of the a priori model permit averaging within boundaries of homogeneous regions to alleviate the degradation caused by Poisson noise. With the use of this Gibbs prior in combination with the technique of stochastic relaxation, Geman and Geman demonstrated that noise levels can be reduced significantly in 2-D image restoration. They have developed a Bayesian method that utilizes a Gibbs prior to describe the spatial correlation of neighboring regions and takes into account the effect of limited spatial resolution as well. The statistical framework of the proposed approach is based on the data augmentation scheme suggested by Tanner and Wong. Briefly outlined here, this Bayesian method is based on Geman and Geman's approach
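
    One concrete way to combine an EM-type emission reconstruction with a Gibbs smoothing prior is Green's one-step-late (OSL) update; this is a related standard method, not necessarily the exact scheme described above. A 1-D Python toy, with a blur standing in for the ECT system matrix:

```python
import numpy as np

# One-step-late MAP-EM for Poisson data with a quadratic Gibbs prior between
# neighboring pixels. All sizes, the "system matrix" and beta are invented.
rng = np.random.default_rng(9)
n = 64
truth = np.zeros(n); truth[20:30] = 10.0; truth[40:45] = 6.0   # two hot regions
h = np.array([0.25, 0.5, 0.25])
A = np.array([np.convolve(np.eye(n)[i], h, mode='same') for i in range(n)]).T
y = rng.poisson(A @ truth)                        # Poisson projection data

beta = 0.3                                        # prior strength
lam = np.ones(n)                                  # positive initial image
sens = A.sum(axis=0)                              # sensitivity A^T 1
for _ in range(100):
    ratio = A.T @ (y / np.maximum(A @ lam, 1e-9))
    # Gradient of the Gibbs energy U = 0.5 * sum over neighbors (lam_i - lam_j)^2
    dU = 2 * lam - np.roll(lam, 1) - np.roll(lam, -1)
    lam = lam * ratio / np.maximum(sens + beta * dU, 1e-9)   # OSL update

print(np.round(lam[18:32], 1))                    # smoothed hot region recovered
```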

  1. Bayesian modeling using WinBUGS

    CERN Document Server

    Ntzoufras, Ioannis

    2009-01-01

A hands-on introduction to the principles of Bayesian modeling using WinBUGS Bayesian Modeling Using WinBUGS provides an easily accessible introduction to the use of WinBUGS programming techniques in a variety of Bayesian modeling settings. The author provides an accessible treatment of the topic, offering readers a smooth introduction to the principles of Bayesian modeling with detailed guidance on the practical implementation of key principles. The book begins with a basic introduction to Bayesian inference and the WinBUGS software and goes on to cover key topics, including: Markov Chain Monte Carlo algorithms in Bayesian inference Generalized linear models Bayesian hierarchical models Predictive distribution and model checking Bayesian model and variable evaluation Computational notes and screen captures illustrate the use of both WinBUGS and R software to apply the discussed techniques. Exercises at the end of each chapter allow readers to test their understanding of the presented concepts and all ...

  2. Probability biases as Bayesian inference

    Directory of Open Access Journals (Sweden)

André C. R. Martins

    2006-11-01

In this article, I will show how several observed biases in human probabilistic reasoning can be partially explained as good heuristics for making inferences in an environment where probabilities have uncertainties associated with them. Previous results show that the weight functions and the observed violations of coalescing and stochastic dominance can be understood from a Bayesian point of view. We will review those results and see that Bayesian methods should also be used as part of the explanation behind other known biases. That means that, although the observed errors are still errors, they can be understood as adaptations to the solution of real-life problems. Heuristics that allow fast evaluations and mimic a Bayesian inference would be an evolutionary advantage, since they would give us an efficient way of making decisions. In that sense, it should be no surprise that humans reason with probability as has been observed.

  3. Bayesian Methods and Universal Darwinism

    CERN Document Server

    Campbell, John

    2010-01-01

    Bayesian methods since the time of Laplace have been understood by their practitioners as closely aligned to the scientific method. Indeed a recent champion of Bayesian methods, E. T. Jaynes, titled his textbook on the subject Probability Theory: the Logic of Science. Many philosophers of science including Karl Popper and Donald Campbell have interpreted the evolution of Science as a Darwinian process consisting of a 'copy with selective retention' algorithm abstracted from Darwin's theory of Natural Selection. Arguments are presented for an isomorphism between Bayesian Methods and Darwinian processes. Universal Darwinism, as the term has been developed by Richard Dawkins, Daniel Dennett and Susan Blackmore, is the collection of scientific theories which explain the creation and evolution of their subject matter as due to the operation of Darwinian processes. These subject matters span the fields of atomic physics, chemistry, biology and the social sciences. The principle of Maximum Entropy states that system...

  4. Bayesian methods for proteomic biomarker development

    Directory of Open Access Journals (Sweden)

    Belinda Hernández

    2015-12-01

    In this review we provide an introduction to Bayesian inference and demonstrate some of the advantages of using a Bayesian framework. We summarize how Bayesian methods have been used previously in proteomics and other areas of bioinformatics. Finally, we describe some popular and emerging Bayesian models from the statistical literature and provide a worked tutorial including code snippets to show how these methods may be applied for the evaluation of proteomic biomarkers.

  5. A Bayesian Network View on Nested Effects Models

    Directory of Open Access Journals (Sweden)

    Fröhlich Holger

    2009-01-01

    Full Text Available Nested effects models (NEMs are a class of probabilistic models that were designed to reconstruct a hidden signalling structure from a large set of observable effects caused by active interventions into the signalling pathway. We give a more flexible formulation of NEMs in the language of Bayesian networks. Our framework constitutes a natural generalization of the original NEM model, since it explicitly states the assumptions that are tacitly underlying the original version. Our approach gives rise to new learning methods for NEMs, which have been implemented in the R/Bioconductor package nem. We validate these methods in a simulation study and apply them to a synthetic lethality dataset in yeast.

  6. Bayesian test and Kuhn's paradigm

    Institute of Scientific and Technical Information of China (English)

    Chen Xiaoping

    2006-01-01

    Kuhn's theory of paradigm reveals a pattern of scientific progress, in which normal science alternates with scientific revolution. But Kuhn greatly underrated the function of scientific testing in his pattern, because he focused all his attention on the hypothetico-deductive schema instead of the Bayesian schema. This paper employs the Bayesian schema to re-examine Kuhn's theory of paradigm, to uncover its logical and rational components, and to illustrate the tensional structure of logic and belief, rationality and irrationality, in the process of scientific revolution.

  7. 3D Bayesian contextual classifiers

    DEFF Research Database (Denmark)

    Larsen, Rasmus

    2000-01-01

    We extend a series of multivariate Bayesian 2-D contextual classifiers to 3-D by specifying a simultaneous Gaussian distribution for the feature vectors as well as a prior distribution of the class variables of a pixel and its 6 nearest 3-D neighbours.

  8. Bayesian Model Averaging for Propensity Score Analysis

    Science.gov (United States)

    Kaplan, David; Chen, Jianshen

    2013-01-01

    The purpose of this study is to explore Bayesian model averaging in the propensity score context. Previous research on Bayesian propensity score analysis does not take into account model uncertainty. In this regard, an internally consistent Bayesian framework for model building and estimation must also account for model uncertainty. The…

  9. Bayesian networks and food security - An introduction

    NARCIS (Netherlands)

    Stein, A.

    2004-01-01

    This paper gives an introduction to Bayesian networks. Networks are defined and put into a Bayesian context. Directed acyclical graphs play a crucial role here. Two simple examples from food security are addressed. Possible uses of Bayesian networks for implementation and further use in decision sup

  10. Bayesian variable order Markov models: Towards Bayesian predictive state representations

    NARCIS (Netherlands)

    C. Dimitrakakis

    2009-01-01

    We present a Bayesian variable order Markov model that shares many similarities with predictive state representations. The resulting models are compact and much easier to specify and learn than classical predictive state representations. Moreover, we show that they significantly outperform a more st

  11. Human migration patterns in Yemen and implications for reconstructing prehistoric population movements.

    Directory of Open Access Journals (Sweden)

    Aida T Miró-Herrans

    Full Text Available Population migration has played an important role in human evolutionary history and in the patterning of human genetic variation. A deeper and empirically-based understanding of human migration dynamics is needed in order to interpret genetic and archaeological evidence and to accurately reconstruct the prehistoric processes that comprise human evolutionary history. Current empirical estimates of migration include either short time frames (i.e. within one generation or partial knowledge about migration, such as proportion of migrants or distance of migration. An analysis of migration that includes both proportion of migrants and distance, and direction over multiple generations would better inform prehistoric reconstructions. To evaluate human migration, we use GPS coordinates from the place of residence of the Yemeni individuals sampled in our study, their birthplaces and their parents' and grandparents' birthplaces to calculate the proportion of migrants, as well as the distance and direction of migration events between each generation. We test for differences in these values between the generations and identify factors that influence the probability of migration. Our results show that the proportion and distance of migration between females and males is similar within generations. In contrast, the proportion and distance of migration is significantly lower in the grandparents' generation, most likely reflecting the decreasing effect of technology. Based on our results, we calculate the proportion of migration events (0.102 and mean and median distances of migration (96 km and 26 km for the grandparent's generation to represent early times in human evolution. These estimates can serve to set parameter values of demographic models in model-based methods of prehistoric reconstruction, such as approximate Bayesian computation. Our study provides the first empirically-based estimates of human migration over multiple generations in a developing

  12. Belief-propagation reconstruction for discrete tomography

    International Nuclear Information System (INIS)

    We consider the reconstruction of a two-dimensional discrete image from a set of tomographic measurements corresponding to the Radon projection. Assuming that the image has a structure where neighbouring pixels have a larger probability of taking the same value, we follow a Bayesian approach and introduce a fast message-passing reconstruction algorithm based on belief propagation. For numerical results, we specialize to the case of binary tomography. We test the algorithm on binary synthetic images with different length scales and compare our results against a more usual convex optimization approach. We investigate the reconstruction error as a function of the number of tomographic measurements, corresponding to the number of projection angles. The belief-propagation algorithm turns out to be more efficient than the convex-optimization algorithm, both in terms of recovery bounds for noise-free projections and reconstruction quality when moderate Gaussian noise is added to the projections. (paper)
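
    The message-passing idea can be illustrated on the smoothness prior alone. The sketch below runs sum-product loopy belief propagation for a binary image on a periodic grid with i.i.d. flip noise; it deliberately omits the Radon-projection factors that the paper's algorithm handles, and the function name, coupling strength, and noise level are illustrative assumptions.

```python
import numpy as np

def loopy_bp_denoise(obs, coupling=0.8, flip_prob=0.2, n_iters=20):
    """Sum-product loopy BP on a grid MRF with binary pixels in {0,1}:
    pairwise potentials favour equal neighbours, unary potentials model
    flip noise. np.roll gives periodic boundaries, a simplification."""
    H, W = obs.shape
    pair = np.array([[np.exp(coupling), np.exp(-coupling)],
                     [np.exp(-coupling), np.exp(coupling)]])
    # unary[i, j, s] = P(obs[i, j] | true pixel = s)
    unary = np.where(obs[..., None] == np.arange(2), 1 - flip_prob, flip_prob)
    # msgs[d, i, j, s]: message into (i, j) from the neighbour in direction d
    msgs = np.full((4, H, W, 2), 0.5)
    shifts = [(-1, 0), (1, 0), (0, -1), (0, 1)]  # up, down, left, right
    for _ in range(n_iters):
        new = np.empty_like(msgs)
        for d, (di, dj) in enumerate(shifts):
            opp = d ^ 1  # opposite direction index (0<->1, 2<->3)
            # sender's belief, excluding the message it got from the receiver
            prod = unary * np.prod(msgs, axis=0) / np.clip(msgs[opp], 1e-12, None)
            out = prod @ pair                  # marginalise the sender state
            # shift so each outgoing message lands on its receiving pixel
            new[d] = np.roll(out, shift=(-di, -dj), axis=(0, 1))
            new[d] /= new[d].sum(axis=-1, keepdims=True)
        msgs = new
    belief = unary * np.prod(msgs, axis=0)
    return np.argmax(belief, axis=-1)

# toy usage: a binary square corrupted by 20% pixel flips
rng = np.random.default_rng(0)
truth = np.zeros((24, 24), dtype=int); truth[6:18, 6:18] = 1
noisy = np.where(rng.uniform(size=truth.shape) < 0.2, 1 - truth, truth)
print("errors before/after:", (noisy != truth).sum(),
      (loopy_bp_denoise(noisy) != truth).sum())
```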

  13. Climate Reconstructions

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The NOAA Paleoclimatology Program archives reconstructions of past climatic conditions derived from paleoclimate proxies, in addition to the Program's large...

  14. Laryngopharyngeal reconstruction

    OpenAIRE

    Kazi, Rehan A

    2006-01-01

    There is a high incidence of hypopharyngeal cancer in our country due to the habits of tobacco and alcohol use. Moreover, these cases are often detected in the late stages, making the issue of reconstruction very tedious and unpredictable. There are a number of options for laryngopharyngeal reconstruction available now, including the use of microvascular flaps, depending upon the patient's fitness, motivation, technical expertise, and the size and extent of the defect. This article reviews the differ...

  15. Bayesian Analysis of Experimental Data

    Directory of Open Access Journals (Sweden)

    Lalmohan Bhar

    2013-10-01

    Full Text Available Analysis of experimental data from a Bayesian point of view has been considered. Appropriate methodology has been developed for application to designed experiments. A Normal-Gamma distribution has been considered for the prior distribution. The developed methodology has been applied to real experimental data taken from long-term fertilizer experiments.

  16. Bayesian image restoration, using configurations

    DEFF Research Database (Denmark)

    Thorarinsdottir, Thordis Linda

    2006-01-01

    configurations are expressed in terms of the mean normal measure of the random set. These probabilities are used as prior probabilities in a Bayesian image restoration approach. Estimation of the remaining parameters in the model is outlined for the salt and pepper noise. The inference in the model is discussed...

  17. Bayesian image restoration, using configurations

    DEFF Research Database (Denmark)

    Thorarinsdottir, Thordis

    configurations are expressed in terms of the mean normal measure of the random set. These probabilities are used as prior probabilities in a Bayesian image restoration approach. Estimation of the remaining parameters in the model is outlined for salt and pepper noise. The inference in the model is discussed in...

  18. ANALYSIS OF BAYESIAN CLASSIFIER ACCURACY

    Directory of Open Access Journals (Sweden)

    Felipe Schneider Costa

    2013-01-01

    Full Text Available The naïve Bayes classifier is considered one of the most effective classification algorithms today, competing with more modern and sophisticated classifiers. Despite being based on the unrealistic (naïve) assumption that all variables are independent given the output class, the classifier provides proper results. However, depending on the scenario (network structure, number of samples or training cases, number of variables), the network may not provide appropriate results. This study uses a variable selection process based on the chi-squared test to verify the existence of dependence between variables in the data model, in order to identify the reasons that prevent a Bayesian network from providing good performance. A detailed analysis of the data is also proposed, unlike other existing work, as well as adjustments in the case of limit values between two adjacent classes. Furthermore, variable weights, calculated with a mutual information function, are used in the calculation of a posteriori probabilities. Tests were applied to both a naïve Bayesian network and a hierarchical Bayesian network. After testing, a significant reduction in error rate was observed: the naïve Bayesian network's error rate dropped from twenty-five percent to five percent relative to the initial classification results, while in the hierarchical network the error rate not only dropped by fifteen percent but the final result also reached zero.
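
    As a point of reference for the variable-selection step, the sketch below screens features with a chi-squared test before fitting a naive Bayes classifier in scikit-learn. The dataset, the choice of GaussianNB, and k=10 are illustrative assumptions; the study's mutual-information weighting of a posteriori probabilities is not reproduced.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.feature_selection import SelectKBest, chi2
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import MinMaxScaler

# chi2 screening requires non-negative features, hence the MinMaxScaler
X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

plain = GaussianNB().fit(X_tr, y_tr)
screened = make_pipeline(MinMaxScaler(),
                         SelectKBest(chi2, k=10),
                         GaussianNB()).fit(X_tr, y_tr)

print("all 30 features :", plain.score(X_te, y_te))
print("10 chi2-selected:", screened.score(X_te, y_te))
```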

  19. Bayesian Agglomerative Clustering with Coalescents

    OpenAIRE

    Teh, Yee Whye; Daumé III, Hal; Roy, Daniel

    2009-01-01

    We introduce a new Bayesian model for hierarchical clustering based on a prior over trees called Kingman's coalescent. We develop novel greedy and sequential Monte Carlo inferences which operate in a bottom-up agglomerative fashion. We show experimentally the superiority of our algorithms over others, and demonstrate our approach in document clustering and phylolinguistics.

  20. Bayesian Networks and Influence Diagrams

    DEFF Research Database (Denmark)

    Kjærulff, Uffe Bro; Madsen, Anders Læsø

    Bayesian Networks and Influence Diagrams: A Guide to Construction and Analysis, Second Edition, provides a comprehensive guide for practitioners who wish to understand, construct, and analyze intelligent systems for decision support based on probabilistic networks. This new edition contains six new...

  1. Topics in Bayesian statistics and maximum entropy

    International Nuclear Information System (INIS)

    Notions of Bayesian decision theory and maximum entropy methods are reviewed with particular emphasis on probabilistic inference and Bayesian modeling. The axiomatic approach is considered as the best justification of Bayesian analysis and maximum entropy principle applied in natural sciences. Particular emphasis is put on solving the inverse problem in digital image restoration and Bayesian modeling of neural networks. Further topics addressed briefly include language modeling, neutron scattering, multiuser detection and channel equalization in digital communications, genetic information, and Bayesian court decision-making. (author)

  2. Bayesian analysis of rare events

    Science.gov (United States)

    Straub, Daniel; Papaioannou, Iason; Betz, Wolfgang

    2016-06-01

    In many areas of engineering and science there is an interest in predicting the probability of rare events, in particular in applications related to safety and security. Increasingly, such predictions are made through computer models of physical systems in an uncertainty quantification framework. Additionally, with advances in IT, monitoring and sensor technology, an increasing amount of data on the performance of the systems is collected. This data can be used to reduce uncertainty, improve the probability estimates and consequently enhance the management of rare events and associated risks. Bayesian analysis is the ideal method to include the data into the probabilistic model. It ensures a consistent probabilistic treatment of uncertainty, which is central in the prediction of rare events, where extrapolation from the domain of observation is common. We present a framework for performing Bayesian updating of rare event probabilities, termed BUS. It is based on a reinterpretation of the classical rejection-sampling approach to Bayesian analysis, which enables the use of established methods for estimating probabilities of rare events. By drawing upon these methods, the framework makes use of their computational efficiency. These methods include the First-Order Reliability Method (FORM), tailored importance sampling (IS) methods and Subset Simulation (SuS). In this contribution, we briefly review these methods in the context of the BUS framework and investigate their applicability to Bayesian analysis of rare events in different settings. We find that, for some applications, FORM can be highly efficient and is surprisingly accurate, enabling Bayesian analysis of rare events with just a few model evaluations. In a general setting, BUS implemented through IS and SuS is more robust and flexible.
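
    The rejection-sampling reading of Bayes' rule that BUS builds on fits in a few lines: draw parameters from the prior, accept each draw with probability proportional to its likelihood, and estimate the rare-event probability from the accepted posterior samples. The sketch below is illustrative only; the Gaussian measurement model, the event threshold, and the use of the empirical likelihood maximum as the scaling constant are assumptions, and the FORM, importance sampling, and Subset Simulation machinery of the paper is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(1)
data = np.array([2.1, 1.8, 2.4])   # noisy observations of the demand theta
sigma = 0.5                        # assumed measurement standard deviation

prior = rng.normal(0.0, 2.0, size=100_000)   # samples from the prior
log_L = -0.5 * ((data[None, :] - prior[:, None]) ** 2).sum(axis=1) / sigma**2

# rejection step: accept theta with probability L(theta) / L_max
# (the empirical maximum as scaling constant is a simplification)
accept = rng.uniform(size=prior.size) < np.exp(log_L - log_L.max())
posterior = prior[accept]

# posterior probability of the rare event {theta > 3}
p_rare = np.mean(posterior > 3.0)
print(f"{posterior.size} accepted samples, P(theta > 3 | data) = {p_rare:.2e}")
```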

  3. Bayesian methods for measures of agreement

    CERN Document Server

    Broemeling, Lyle D

    2009-01-01

    Using WinBUGS to implement Bayesian inferences of estimation and testing hypotheses, Bayesian Methods for Measures of Agreement presents useful methods for the design and analysis of agreement studies. It focuses on agreement among the various players in the diagnostic process.The author employs a Bayesian approach to provide statistical inferences based on various models of intra- and interrater agreement. He presents many examples that illustrate the Bayesian mode of reasoning and explains elements of a Bayesian application, including prior information, experimental information, the likelihood function, posterior distribution, and predictive distribution. The appendices provide the necessary theoretical foundation to understand Bayesian methods as well as introduce the fundamentals of programming and executing the WinBUGS software.Taking a Bayesian approach to inference, this hands-on book explores numerous measures of agreement, including the Kappa coefficient, the G coefficient, and intraclass correlation...

  4. Plug & Play object oriented Bayesian networks

    DEFF Research Database (Denmark)

    Bangsø, Olav; Flores, J.; Jensen, Finn Verner

    2003-01-01

    Object oriented Bayesian networks have proven themselves useful in recent years. The idea of applying an object oriented approach to Bayesian networks has extended their scope to larger domains that can be divided into autonomous but interrelated entities. Object oriented Bayesian networks have been shown to be quite suitable for dynamic domains as well. However, processing object oriented Bayesian networks in practice does not take advantage of their modular structure. Normally the object oriented Bayesian network is transformed into a Bayesian network and inference is performed by constructing a junction tree from this network. In this paper we propose a method for translating directly from object oriented Bayesian networks to junction trees, avoiding the intermediate translation. We pursue two main purposes: firstly, to maintain the original structure organized in an instance tree...

  5. Reconstructing Galaxy Histories from Globular Clusters

    CERN Document Server

    West, M J; Marzke, R O; Jordan, A; West, Michael J.; Cote, Patrick; Marzke, Ronald O.; Jordan, Andres

    2004-01-01

    Nearly a century after the true nature of galaxies as distant "island universes" was established, their origin and evolution remain great unsolved problems of modern astrophysics. One of the most promising ways to investigate galaxy formation is to study the ubiquitous globular star clusters that surround most galaxies. Recent advances in our understanding of the globular cluster systems of the Milky Way and other galaxies point to a complex picture of galaxy genesis driven by cannibalism, collisions, bursts of star formation and other tumultuous events.

  6. RECONSTRUCTING THE HISTORY OF HARAPPAN CIVILIZATION

    OpenAIRE

    Vahia, Mayank; Yadav, Nisha

    2011-01-01

    The Harappan Civilization (HC) was spread over large parts of western region of the Indian Subcontinent. Its earliest roots can be found from 7000 BC in Mehrgarh but its peak urban period is around 2500 to 1900 BC. It declined completely by 1300 BC. At its peak, it covered more than 30 per cent of the present landmass of the Indian Subcontinent. Its entire evidence is archaeological. It is classified as proto-historic since in the absence of deciphered written records it is not possible to cr...

  7. Reconstructing the Limfjord’s history

    DEFF Research Database (Denmark)

    Philippsen, Bente

    dating of shells covering the period from c. 6000 BC to c. 1000 AD. An age model for the sediment core, based on radiocarbon dating of terrestrial material, is constructed, assigning an age to each centimetre of depth. d13C values and C/N ratios of bulk sediment are measured. The correlation between ... water varied with time, the reservoir age for the Limfjord is also expected to vary. Radiocarbon datings of shells, compared to the age model based on terrestrial samples, resulted in reservoir ages that differed from the model ocean by between -150 and +320 years. In addition to using this information as an indicator for the Limfjord's environment, our results may furthermore shed light on the magnitude of reservoir age corrections to be applied when dating marine derived archaeological samples in this region. The stable isotope and radiocarbon data will briefly be compared to the results from ongoing ...

  8. Western Tasmania - A reconstructed history of widespread aerial pollution in a formerly 'pristine' area - the use of 210Pb and 226Ra in retrospective monitoring of the environment

    International Nuclear Information System (INIS)

    Full text: Using nuclear dating techniques and trace metal analysis of sediment cores an environmental history of Western Tasmania was reconstructed. Seven sites were selected to encompass a range of environments from highly human impacted to relatively pristine. They include sub-alpine tarns and coastal lowland lakes. Disturbed areas have been impacted by activities associated with logging, mining and colonial settlement while the near-pristine sites were located in areas with little disturbance, such as the Tasmanian Wilderness World Heritage Area. Lead-210 (210Pb) and radium-226 (226Ra), both naturally occurring radioisotopes, were used to determine sediment accumulation rates and establish chronologies. Sediment cores collected from near pristine lakes were expected to reveal low and relatively constant trace metal concentrations consistent with areas subject to little to no human impact. However, evidence from these sediment cores revealed trace metal concentrations peaked in the 1960s and then began to decrease in the 1980s. This trend was also discovered, to a greater extent, in sediment cores collected from human impacted sites particularly those surrounding the Central Western mining area. Of all the metals investigated, lead (Pb), arsenic (As), tin (Sn) and copper (Cu) were found to show the most marked increases. Temporal increases in metal concentrations were found to be a result of mining activities in Central Western Tasmania. Evidence for the most significant increase as shown by the trace metal profile coincided with the escalation of open cut mining while decreases in metal concentrations around 1980 coincided with the cessation of mining. Spatially, the dispersal was predominantly due to aerial pollution as concentrations of Pb, As, Sn and Cu were highest close to the mining areas although sites as far as 150 kilometres away showed marked metal concentration increases above background levels around 1960. (author)

  9. Flexible Bayesian Nonparametric Priors and Bayesian Computational Methods

    OpenAIRE

    Zhu, Weixuan

    2016-01-01

    The definition of vectors of dependent random probability measures is a topic of interest in Bayesian nonparametrics. They represent dependent nonparametric prior distributions that are useful for modelling observables for which specific covariate values are known. Our first contribution is the introduction of novel multivariate vectors of two-parameter Poisson-Dirichlet processes. The dependence is induced by applying a Lévy copula to the marginal Lévy intensities. Our attenti...

  10. Delayed breast implant reconstruction

    DEFF Research Database (Denmark)

    Hvilsom, Gitte B.; Hölmich, Lisbet R.; Steding-Jessen, Marianne;

    2012-01-01

    We evaluated the association between radiation therapy and severe capsular contracture or reoperation after 717 delayed breast implant reconstruction procedures (288 1-stage and 429 2-stage procedures) identified in the prospective database of the Danish Registry for Plastic Surgery of the Breast during the period between 1999 and 2006. A history of radiation therapy was associated with increased risk of severe capsular contracture for 1- and 2-stage procedures, with adjusted hazard ratios (HR) of 3.3 (95% confidence interval [CI]: 0.9-12.4) and 7.2 (95% CI: 2.4-21.4), respectively. Similarly, a history of radiation therapy was associated with a non-significantly increased risk of reoperation after both 1-stage (HR = 1.4; 95% CI: 0.7-2.5) and 2-stage (HR = 1.6; 95% CI: 0.9-3.1) procedures. Reconstruction failure was highest (13.2%) in the 2-stage procedures with a history of radiation therapy...

  11. Paleoenvironmental reconstruction using microfossils : terrestrial

    International Nuclear Information System (INIS)

    The following section provides a brief summary of the use of pollen and chironomids in terrestrial paleoenvironmental reconstruction and lists some key references and general texts addressing the history, methods and development of these techniques in Quaternary paleoclimate research and their application in New Zealand. (author). 52 refs., 8 figs

  12. Bayesian approach to rough set

    CERN Document Server

    Marwala, Tshilidzi

    2007-01-01

    This paper proposes an approach to training rough set models using a Bayesian framework trained with the Markov Chain Monte Carlo (MCMC) method. The prior probabilities are constructed from the prior knowledge that good rough set models have fewer rules. Markov Chain Monte Carlo sampling is conducted by sampling in the rough set granule space, and the Metropolis algorithm is used as the acceptance criterion. The proposed method is tested on estimating the risk of HIV given demographic data. The results obtained show that the proposed approach is able to achieve an average accuracy of 58%, with the accuracy varying up to 66%. In addition, the Bayesian rough set gives the probabilities of the estimated HIV status as well as the linguistic rules describing how the demographic parameters drive the risk of HIV.
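
    A minimal sketch of the Metropolis sampling loop described above, with a one-dimensional toy posterior standing in for the rough-set granule space; the target, proposal scale, and iteration counts are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

def log_posterior(theta):
    """Toy log-posterior standing in for the rough-set model score:
    standard normal prior times a Gaussian likelihood centred at 1.5."""
    return -0.5 * theta**2 - 0.5 * (theta - 1.5) ** 2 / 0.3**2

samples, theta = [], 0.0
for _ in range(10_000):
    proposal = theta + rng.normal(scale=0.5)   # random-walk proposal
    # Metropolis acceptance criterion
    if np.log(rng.uniform()) < log_posterior(proposal) - log_posterior(theta):
        theta = proposal
    samples.append(theta)

print("posterior mean =", np.mean(samples[2000:]))  # discard burn-in
```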

  13. Attention in a bayesian framework

    DEFF Research Database (Denmark)

    Whiteley, Louise Emma; Sahani, Maneesh

    2012-01-01

    The behavioral phenomena of sensory attention are thought to reflect the allocation of a limited processing resource, but there is little consensus on the nature of the resource or why it should be limited. Here we argue that a fundamental bottleneck emerges naturally within Bayesian models of ... include both selective phenomena, where attention is invoked by cues that point to particular stimuli, and integrative phenomena, where attention is invoked dynamically by endogenous processing. However, most previous Bayesian accounts of attention have focused on describing relatively simple experimental settings, where cues shape expectations about a small number of upcoming stimuli and thus convey "prior" information about clearly defined objects. While operationally consistent with the experiments it seeks to describe, this view of attention as prior seems to miss many essential elements of both its ...

  14. Bayesian Sampling using Condition Indicators

    DEFF Research Database (Denmark)

    Faber, Michael H.; Sørensen, John Dalsgaard

    2002-01-01

    The problem of control quality of components is considered for the special case where the acceptable failure rate is low, the test costs are high and where it may be difficult or impossible to test the condition of interest directly. Based on the classical control theory and the concept of condition indicators introduced by Benjamin and Cornell (1970), a Bayesian approach to quality control is formulated. The formulation is then extended to the case where the quality control is based on sampling of indirect information about the condition of the components, i.e. condition indicators. This allows for a Bayesian formulation of the indicators whereby the experience and expertise of the inspection personnel may be fully utilized and consistently updated as frequentistic information is collected. The approach is illustrated on an example considering a concrete structure subject to corrosion.

  15. BAYESIAN IMAGE RESTORATION, USING CONFIGURATIONS

    Directory of Open Access Journals (Sweden)

    Thordis Linda Thorarinsdottir

    2011-05-01

    Full Text Available In this paper, we develop a Bayesian procedure for removing noise from images that can be viewed as noisy realisations of random sets in the plane. The procedure utilises recent advances in configuration theory for noise free random sets, where the probabilities of observing the different boundary configurations are expressed in terms of the mean normal measure of the random set. These probabilities are used as prior probabilities in a Bayesian image restoration approach. Estimation of the remaining parameters in the model is outlined for salt and pepper noise. The inference in the model is discussed in detail for 3 X 3 and 5 X 5 configurations and examples of the performance of the procedure are given.

  16. Bayesian Seismology of the Sun

    CERN Document Server

    Gruberbauer, Michael

    2013-01-01

    We perform a Bayesian grid-based analysis of the solar l=0,1,2 and 3 p modes obtained via BiSON in order to deliver the first Bayesian asteroseismic analysis of the solar composition problem. We do not find decisive evidence to prefer either of the contending chemical compositions, although the revised solar abundances (AGSS09) are more probable in general. We do find indications for systematic problems in standard stellar evolution models, unrelated to the consequences of inadequate modelling of the outer layers on the higher-order modes. The seismic observables are best fit by solar models that are several hundred million years older than the meteoritic age of the Sun. Similarly, meteoritic age calibrated models do not adequately reproduce the observed seismic observables. Our results suggest that these problems will affect any asteroseismic inference that relies on a calibration to the Sun.

  17. Bayesian priors for transiting planets

    CERN Document Server

    Kipping, David M

    2016-01-01

    As astronomers push towards discovering ever-smaller transiting planets, it is increasingly common to deal with low signal-to-noise ratio (SNR) events, where the choice of priors plays an influential role in Bayesian inference. In the analysis of exoplanet data, the selection of priors is often treated as a nuisance, with observers typically defaulting to uninformative distributions. Such treatments miss a key strength of the Bayesian framework, especially in the low SNR regime, where even weak a priori information is valuable. When estimating the parameters of a low-SNR transit, two key pieces of information are known: (i) the planet has the correct geometric alignment to transit and (ii) the transit event exhibits sufficient signal-to-noise to have been detected. These represent two forms of observational bias. Accordingly, when fitting transits, the model parameter priors should not follow the intrinsic distributions of said terms, but rather those of both the intrinsic distributions and the observational ...

  18. Bayesian Inference for Radio Observations

    CERN Document Server

    Lochner, Michelle; Zwart, Jonathan T L; Smirnov, Oleg; Bassett, Bruce A; Oozeer, Nadeem; Kunz, Martin

    2015-01-01

    (Abridged) New telescopes like the Square Kilometre Array (SKA) will push into a new sensitivity regime and expose systematics, such as direction-dependent effects, that could previously be ignored. Current methods for handling such systematics rely on alternating best estimates of instrumental calibration and models of the underlying sky, which can lead to inaccurate uncertainty estimates and biased results because such methods ignore any correlations between parameters. These deconvolution algorithms produce a single image that is assumed to be a true representation of the sky, when in fact it is just one realisation of an infinite ensemble of images compatible with the noise in the data. In contrast, here we report a Bayesian formalism that simultaneously infers both systematics and science. Our technique, Bayesian Inference for Radio Observations (BIRO), determines all parameters directly from the raw data, bypassing image-making entirely, by sampling from the joint posterior probability distribution. Thi...

  19. Bayesian inference on proportional elections.

    Science.gov (United States)

    Brunello, Gabriel Hideki Vatanabe; Nakano, Eduardo Yoshio

    2015-01-01

    Polls for majoritarian voting systems usually show estimates of the percentage of votes for each candidate. However, proportional vote systems do not necessarily guarantee the candidate with the most percentage of votes will be elected. Thus, traditional methods used in majoritarian elections cannot be applied on proportional elections. In this context, the purpose of this paper was to perform a Bayesian inference on proportional elections considering the Brazilian system of seats distribution. More specifically, a methodology to answer the probability that a given party will have representation on the chamber of deputies was developed. Inferences were made on a Bayesian scenario using the Monte Carlo simulation technique, and the developed methodology was applied on data from the Brazilian elections for Members of the Legislative Assembly and Federal Chamber of Deputies in 2010. A performance rate was also presented to evaluate the efficiency of the methodology. Calculations and simulations were carried out using the free R statistical software. PMID:25786259
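
    A hedged sketch of this kind of computation: vote shares are drawn from a Dirichlet posterior built on poll counts, an election is simulated, and seats are allocated with the D'Hondt highest-averages rule as a stand-in for the Brazilian seat-distribution rules (which differ in detail). Party counts, seat numbers, and the flat prior are illustrative assumptions, and the example uses Python rather than the paper's R code.

```python
import numpy as np

rng = np.random.default_rng(3)

def dhondt(votes, n_seats):
    """Allocate seats by the D'Hondt highest-averages method."""
    seats = np.zeros(votes.size, dtype=int)
    for _ in range(n_seats):
        quotients = votes / (seats + 1)
        seats[np.argmax(quotients)] += 1
    return seats

poll = np.array([420, 310, 180, 90])   # poll counts for parties A-D
n_seats, n_sims = 10, 20_000
wins_seat = np.zeros(poll.size)

for _ in range(n_sims):
    shares = rng.dirichlet(poll + 1)            # posterior draw (uniform prior)
    votes = rng.multinomial(1_000_000, shares)  # one simulated election
    wins_seat += dhondt(votes.astype(float), n_seats) > 0

print("P(at least one seat):", wins_seat / n_sims)
```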

  20. A Bayesian Nonparametric IRT Model

    OpenAIRE

    Karabatsos, George

    2015-01-01

    This paper introduces a flexible Bayesian nonparametric Item Response Theory (IRT) model, which applies to dichotomous or polytomous item responses, and which can apply to either unidimensional or multidimensional scaling. This is an infinite-mixture IRT model, with person ability and item difficulty parameters, and with a random intercept parameter that is assigned a mixing distribution, with mixing weights a probit function of other person and item parameters. As a result of its flexibility...

  1. Bayesian segmentation of hyperspectral images

    CERN Document Server

    Mohammadpour, Adel; Mohammad-Djafari, Ali

    2007-01-01

    In this paper we consider the problem of joint segmentation of hyperspectral images in the Bayesian framework. The proposed approach is based on a Hidden Markov Modeling (HMM) of the images with common segmentation, or equivalently with common hidden classification label variables which is modeled by a Potts Markov Random Field. We introduce an appropriate Markov Chain Monte Carlo (MCMC) algorithm to implement the method and show some simulation results.

  2. Bayesian segmentation of hyperspectral images

    Science.gov (United States)

    Mohammadpour, Adel; Féron, Olivier; Mohammad-Djafari, Ali

    2004-11-01

    In this paper we consider the problem of joint segmentation of hyperspectral images in the Bayesian framework. The proposed approach is based on a Hidden Markov Modeling (HMM) of the images with common segmentation, or equivalently with common hidden classification label variables which is modeled by a Potts Markov Random Field. We introduce an appropriate Markov Chain Monte Carlo (MCMC) algorithm to implement the method and show some simulation results.

  3. Bayesian Stable Isotope Mixing Models

    OpenAIRE

    Parnell, Andrew C.; Phillips, Donald L.; Bearhop, Stuart; Semmens, Brice X.; Ward, Eric J.; Moore, Jonathan W.; Andrew L Jackson; Inger, Richard

    2012-01-01

    In this paper we review recent advances in Stable Isotope Mixing Models (SIMMs) and place them into an over-arching Bayesian statistical framework which allows for several useful extensions. SIMMs are used to quantify the proportional contributions of various sources to a mixture. The most widely used application is quantifying the diet of organisms based on the food sources they have been observed to consume. At the centre of the multivariate statistical model we propose is a compositional m...

  4. Bayesian Network--Response Regression

    OpenAIRE

    WANG, LU; Durante, Daniele; Dunson, David B.

    2016-01-01

    There is an increasing interest in learning how human brain networks vary with continuous traits (e.g., personality, cognitive abilities, neurological disorders), but flexible procedures to accomplish this goal are limited. We develop a Bayesian semiparametric model, which combines low-rank factorizations and Gaussian process priors to allow flexible shifts of the conditional expectation for a network-valued random variable across the feature space, while including subject-specific random eff...

  5. Bayesian estimation of turbulent motion

    OpenAIRE

    Héas, P.; Herzet, C.; Mémin, E.; Heitz, D.; P. D. Mininni

    2013-01-01

    Based on physical laws describing the multi-scale structure of turbulent flows, this article proposes a regularizer for fluid motion estimation from an image sequence. Regularization is achieved by imposing some scale invariance property between histograms of motion increments computed at different scales. By reformulating this problem from a Bayesian perspective, an algorithm is proposed to jointly estimate motion, regularization hyper-parameters, and to select the ...

  6. Elements of Bayesian experimental design

    Energy Technology Data Exchange (ETDEWEB)

    Sivia, D.S. [Rutherford Appleton Lab., Oxon (United Kingdom)

    1997-09-01

    We consider some elements of the Bayesian approach that are important for optimal experimental design. While the underlying principles used are very general, and are explained in detail in a recent tutorial text, they are applied here to the specific case of characterising the inferential value of different resolution peakshapes. This particular issue was considered earlier by Silver, Sivia and Pynn (1989, 1990a, 1990b), and the following presentation confirms and extends the conclusions of their analysis.

  7. Skill Rating by Bayesian Inference

    OpenAIRE

    Di Fatta, Giuseppe; Haworth, Guy McCrossan; Regan, Kenneth W.

    2009-01-01

    Systems Engineering often involves computer modelling the behaviour of proposed systems and their components. Where a component is human, fallibility must be modelled by a stochastic agent. The identification of a model of decision-making over quantifiable options is investigated using the game-domain of Chess. Bayesian methods are used to infer the distribution of players’ skill levels from the moves they play rather than from their competitive results. The approach is used on large sets of ...

  8. Topics in Nonparametric Bayesian Statistics

    OpenAIRE

    2003-01-01

    The intersection set of Bayesian and nonparametric statistics was almost empty until about 1973, but now seems to be growing at a healthy rate. This chapter gives an overview of various theoretical and applied research themes inside this field, partly complementing and extending recent reviews of Dey, Müller and Sinha (1998) and Walker, Damien, Laud and Smith (1999). The intention is not to be complete or exhaustive, but rather to touch on research areas of interest, partly by example.

  9. Cover Tree Bayesian Reinforcement Learning

    OpenAIRE

    Tziortziotis, Nikolaos; Dimitrakakis, Christos; Blekas, Konstantinos

    2013-01-01

    This paper proposes an online tree-based Bayesian approach for reinforcement learning. For inference, we employ a generalised context tree model. This defines a distribution on multivariate Gaussian piecewise-linear models, which can be updated in closed form. The tree structure itself is constructed using the cover tree method, which remains efficient in high dimensional spaces. We combine the model with Thompson sampling and approximate dynamic programming to obtain effective exploration po...

  10. Bayesian kinematic earthquake source models

    Science.gov (United States)

    Minson, S. E.; Simons, M.; Beck, J. L.; Genrich, J. F.; Galetzka, J. E.; Chowdhury, F.; Owen, S. E.; Webb, F.; Comte, D.; Glass, B.; Leiva, C.; Ortega, F. H.

    2009-12-01

    Most coseismic, postseismic, and interseismic slip models are based on highly regularized optimizations which yield one solution which satisfies the data given a particular set of regularizing constraints. This regularization hampers our ability to answer basic questions such as whether seismic and aseismic slip overlap or instead rupture separate portions of the fault zone. We present a Bayesian methodology for generating kinematic earthquake source models with a focus on large subduction zone earthquakes. Unlike classical optimization approaches, Bayesian techniques sample the ensemble of all acceptable models presented as an a posteriori probability density function (PDF), and thus we can explore the entire solution space to determine, for example, which model parameters are well determined and which are not, or what is the likelihood that two slip distributions overlap in space. Bayesian sampling also has the advantage that all a priori knowledge of the source process can be used to mold the a posteriori ensemble of models. Although very powerful, Bayesian methods have up to now been of limited use in geophysical modeling because they are only computationally feasible for problems with a small number of free parameters due to what is called the "curse of dimensionality." However, our methodology can successfully sample solution spaces of many hundreds of parameters, which is sufficient to produce finite fault kinematic earthquake models. Our algorithm is a modification of the tempered Markov chain Monte Carlo (tempered MCMC or TMCMC) method. In our algorithm, we sample a "tempered" a posteriori PDF using many MCMC simulations running in parallel and evolutionary computation in which models which fit the data poorly are preferentially eliminated in favor of models which better predict the data. We present results for both synthetic test problems as well as for the 2007 Mw 7.8 Tocopilla, Chile earthquake, the latter of which is constrained by InSAR, local high
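
    To give a feel for tempered sampling, the sketch below runs a toy parallel-tempering sampler, a simpler cousin of the tempered MCMC (TMCMC) described above (TMCMC additionally resamples and evolves the temperature schedule, which is not shown). The bimodal one-dimensional target stands in for a multimodal slip-model posterior; the temperature ladder, proposal scale, and target are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)

def log_target(x):
    """Bimodal toy posterior with modes at -3 and +3."""
    return np.logaddexp(-0.5 * (x - 3) ** 2, -0.5 * (x + 3) ** 2)

betas = np.array([1.0, 0.5, 0.25, 0.1])   # temperature ladder
chains = np.zeros(betas.size)
samples = []

for step in range(20_000):
    # within-chain Metropolis updates at each temperature
    for k, beta in enumerate(betas):
        prop = chains[k] + rng.normal(scale=1.5)
        if np.log(rng.uniform()) < beta * (log_target(prop) - log_target(chains[k])):
            chains[k] = prop
    # propose swapping the states of a random adjacent temperature pair
    k = rng.integers(betas.size - 1)
    log_ratio = (betas[k] - betas[k + 1]) * (log_target(chains[k + 1]) - log_target(chains[k]))
    if np.log(rng.uniform()) < log_ratio:
        chains[k], chains[k + 1] = chains[k + 1], chains[k]
    samples.append(chains[0])   # the beta = 1 chain targets the posterior

samples = np.array(samples)
print("mass near each mode:", np.mean(samples > 0), np.mean(samples < 0))
```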

  11. Bayesian Kernel Mixtures for Counts

    OpenAIRE

    Canale, Antonio; David B Dunson

    2011-01-01

    Although Bayesian nonparametric mixture models for continuous data are well developed, there is a limited literature on related approaches for count data. A common strategy is to use a mixture of Poissons, which unfortunately is quite restrictive in not accounting for distributions having variance less than the mean. Other approaches include mixing multinomials, which requires finite support, and using a Dirichlet process prior with a Poisson base measure, which does not allow smooth deviatio...

  12. Bayesian Optimization for Adaptive MCMC

    OpenAIRE

    Mahendran, Nimalan; Wang, Ziyu; Hamze, Firas; De Freitas, Nando

    2011-01-01

    This paper proposes a new randomized strategy for adaptive MCMC using Bayesian optimization. This approach applies to non-differentiable objective functions and trades off exploration and exploitation to reduce the number of potentially costly objective function evaluations. We demonstrate the strategy in the complex setting of sampling from constrained, discrete and densely connected probabilistic graphical models where, for each variation of the problem, one needs to adjust the parameters o...

  13. Inference in hybrid Bayesian networks

    DEFF Research Database (Denmark)

    Langseth, Helge; Nielsen, Thomas Dyhre; Rumí, Rafael;

    2009-01-01

    and reliability block diagrams). However, limitations in the BNs' calculation engine have prevented BNs from becoming equally popular for domains containing mixtures of both discrete and continuous variables (so-called hybrid domains). In this paper we focus on these difficulties, and summarize some of the last decade's research on inference in hybrid Bayesian networks. The discussions are linked to an example model for estimating human reliability.

  14. Quantile pyramids for Bayesian nonparametrics

    OpenAIRE

    2009-01-01

    Pólya trees fix partitions and use random probabilities in order to construct random probability measures. With quantile pyramids we instead fix probabilities and use random partitions. For nonparametric Bayesian inference we use a prior which supports piecewise linear quantile functions, based on the need to work with a finite set of partitions, yet we show that the limiting version of the prior exists. We also discuss and investigate an alternative model based on the so-called substitut...

  15. Bayesian analysis of contingency tables

    OpenAIRE

    Gómez Villegas, Miguel A.; González Pérez, Beatriz

    2005-01-01

    The display of the data by means of contingency tables is used in different approaches to statistical inference, for example, to broach the test of homogeneity of independent multinomial distributions. We develop a Bayesian procedure to test simple null hypotheses versus bilateral alternatives in contingency tables. Given independent samples of two binomial distributions and taking a mixed prior distribution, we calculate the posterior probability that the proportion of successes in the first...

  16. Bayesian Credit Ratings (new version)

    OpenAIRE

    Paola Cerchiello; Paolo Giudici

    2013-01-01

    In this contribution we aim at improving ordinal variable selection in the context of causal models. In this regard, we propose an approach that provides a formal inferential tool to compare the explanatory power of each covariate, and, therefore, to select an effective model for classification purposes. Our proposed model is Bayesian nonparametric, and, thus, keeps the amount of model specification to a minimum. We consider the case in which information from the covariates is at the ordinal ...

  17. Bayesian second law of thermodynamics

    Science.gov (United States)

    Bartolotta, Anthony; Carroll, Sean M.; Leichenauer, Stefan; Pollack, Jason

    2016-08-01

    We derive a generalization of the second law of thermodynamics that uses Bayesian updates to explicitly incorporate the effects of a measurement of a system at some point in its evolution. By allowing an experimenter's knowledge to be updated by the measurement process, this formulation resolves a tension between the fact that the entropy of a statistical system can sometimes fluctuate downward and the information-theoretic idea that knowledge of a stochastically evolving system degrades over time. The Bayesian second law can be written as ΔH(ρ_m, ρ) + ⟨Q⟩_{F|m} ≥ 0, where ΔH(ρ_m, ρ) is the change in the cross entropy between the original phase-space probability distribution ρ and the measurement-updated distribution ρ_m, and ⟨Q⟩_{F|m} is the expectation value of a generalized heat flow out of the system. We also derive refined versions of the second law that bound the entropy increase from below by a non-negative number, as well as Bayesian versions of integral fluctuation theorems. We demonstrate the formalism using simple analytical and numerical examples.

  18. Quantum Inference on Bayesian Networks

    Science.gov (United States)

    Yoder, Theodore; Low, Guang Hao; Chuang, Isaac

    2014-03-01

    Because quantum physics is naturally probabilistic, it seems reasonable to expect physical systems to describe probabilities and their evolution in a natural fashion. Here, we use quantum computation to speed up sampling from a graphical probability model, the Bayesian network. A specialization of this sampling problem is approximate Bayesian inference, where the distribution on query variables is sampled given the values e of evidence variables. Inference is a key part of modern machine learning and artificial intelligence tasks, but is known to be NP-hard. Classically, a single unbiased sample is obtained from a Bayesian network on n variables with at most m parents per node in time O(nm P(e)^(-1)), depending critically on P(e), the probability that the evidence occurs in the first place. However, by implementing a quantum version of rejection sampling, we obtain a square-root speedup, taking O(n 2^m P(e)^(-1/2)) time per sample. The speedup is the result of amplitude amplification, which is proving to be broadly applicable in sampling and machine learning tasks. In particular, we provide an explicit and efficient circuit construction that implements the algorithm without the need for oracle access.

  19. Reconstruction within the Zeldovich approximation

    CERN Document Server

    White, Martin

    2015-01-01

    The Zeldovich approximation, 1st order Lagrangian perturbation theory, provides a good description of the clustering of matter and galaxies on large scales. The acoustic feature in the large-scale correlation function of galaxies imprinted by sound waves in the early Universe has been successfully used as a `standard ruler' to constrain the expansion history of the Universe. The standard ruler can be improved if a process known as density field reconstruction is employed. In this paper we develop the Zeldovich formalism to compute the correlation function of biased tracers in both real- and redshift-space using the simplest reconstruction algorithm with a Gaussian kernel and compare to N-body simulations. The model qualitatively describes the effects of reconstruction on the simulations, though its quantitative success depends upon how redshift-space distortions are handled in the reconstruction algorithm.

  20. Adaptive decoding for brain-machine interfaces through Bayesian parameter updates.

    Science.gov (United States)

    Li, Zheng; O'Doherty, Joseph E; Lebedev, Mikhail A; Nicolelis, Miguel A L

    2011-12-01

    Brain-machine interfaces (BMIs) transform the activity of neurons recorded in motor areas of the brain into movements of external actuators. Representation of movements by neuronal populations varies over time, during both voluntary limb movements and movements controlled through BMIs, due to motor learning, neuronal plasticity, and instability in recordings. To ensure accurate BMI performance over long time spans, BMI decoders must adapt to these changes. We propose the Bayesian regression self-training method for updating the parameters of an unscented Kalman filter decoder. This novel paradigm uses the decoder's output to periodically update its neuronal tuning model in a Bayesian linear regression. We use two previously known statistical formulations of Bayesian linear regression: a joint formulation, which allows fast and exact inference, and a factorized formulation, which allows the addition and temporary omission of neurons from updates but requires approximate variational inference. To evaluate these methods, we performed offline reconstructions and closed-loop experiments with rhesus monkeys implanted cortically with microwire electrodes. Offline reconstructions used data recorded in areas M1, S1, PMd, SMA, and PP of three monkeys while they controlled a cursor using a handheld joystick. The Bayesian regression self-training updates significantly improved the accuracy of offline reconstructions compared to the same decoder without updates. We performed 11 sessions of real-time, closed-loop experiments with a monkey implanted in areas M1 and S1. These sessions spanned 29 days. The monkey controlled the cursor using the decoder with and without updates. The updates maintained control accuracy and did not require information about monkey hand movements, assumptions about desired movements, or knowledge of the intended movement goals as training signals. These results indicate that Bayesian regression self-training can maintain BMI control accuracy over long
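
    The core of the self-training idea, a conjugate Bayesian linear regression update of a neuronal tuning model in which the decoder's own output serves as the regression target, can be sketched as follows. Dimensions, priors, and the known noise precision are illustrative assumptions; neither the unscented Kalman filter nor the paper's joint and factorized variational formulations is reproduced.

```python
import numpy as np

rng = np.random.default_rng(5)

# tuning model: firing_rates = W @ cursor_state + Gaussian noise
n_neurons, n_state = 20, 4
W_true = rng.normal(size=(n_neurons, n_state))

prior_mean = np.zeros((n_neurons, n_state))
prior_prec = np.eye(n_state) * 1.0   # prior precision on each neuron's weights
noise_prec = 2.0                     # assumed observation precision

def bayesian_update(prior_mean, prior_prec, X, Y):
    """Posterior over tuning weights given decoded states X (T x n_state)
    and firing rates Y (T x n_neurons): conjugate Gaussian update."""
    post_prec = prior_prec + noise_prec * X.T @ X
    post_cov = np.linalg.inv(post_prec)
    post_mean = (post_cov @ (prior_prec @ prior_mean.T
                             + noise_prec * X.T @ Y)).T
    return post_mean, post_prec

# one "update period": decoder output X stands in for true hand movement
X = rng.normal(size=(500, n_state))
Y = X @ W_true.T + rng.normal(scale=1 / np.sqrt(noise_prec),
                              size=(500, n_neurons))
post_mean, post_prec = bayesian_update(prior_mean, prior_prec, X, Y)
print("mean weight-recovery error:", np.abs(post_mean - W_true).mean())
```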

  1. Project Reconstruct.

    Science.gov (United States)

    Helisek, Harriet; Pratt, Donald

    1994-01-01

    Presents a project in which students monitor their use of trash, input and analyze information via a database and computerized graphs, and "reconstruct" extinct or endangered animals from recyclable materials. The activity was done with second-grade students over a period of three to four weeks. (PR)

  2. Vaginal reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Lesavoy, M.A.

    1985-05-01

    Vaginal reconstruction can be an uncomplicated and straightforward procedure when attention to detail is maintained. The Abbe-McIndoe procedure of lining the neovaginal canal with split-thickness skin grafts has become standard. The use of the inflatable Heyer-Schulte vaginal stent provides comfort to the patient and ease to the surgeon in maintaining approximation of the skin graft. For large vaginal and perineal defects, myocutaneous flaps such as the gracilis island have been extremely useful for correction of radiation-damaged tissue of the perineum or for the reconstruction of large ablative defects. Minimal morbidity and scarring ensue because the donor site can be closed primarily. With all vaginal reconstruction, a compliant patient is a necessity. The patient must wear a vaginal obturator for a minimum of 3 to 6 months postoperatively and is encouraged to use intercourse as an excellent obturator. In general, vaginal reconstruction can be an extremely gratifying procedure for both the functional and emotional well-being of patients.

  3. Vaginal reconstruction

    International Nuclear Information System (INIS)

    Vaginal reconstruction can be an uncomplicated and straightforward procedure when attention to detail is maintained. The Abbe-McIndoe procedure of lining the neovaginal canal with split-thickness skin grafts has become standard. The use of the inflatable Heyer-Schulte vaginal stent provides comfort to the patient and ease to the surgeon in maintaining approximation of the skin graft. For large vaginal and perineal defects, myocutaneous flaps such as the gracilis island have been extremely useful for correction of radiation-damaged tissue of the perineum or for the reconstruction of large ablative defects. Minimal morbidity and scarring ensue because the donor site can be closed primarily. With all vaginal reconstruction, a compliant patient is a necessity. The patient must wear a vaginal obturator for a minimum of 3 to 6 months postoperatively and is encouraged to use intercourse as an excellent obturator. In general, vaginal reconstruction can be an extremely gratifying procedure for both the functional and emotional well-being of patients

  4. 12th Brazilian Meeting on Bayesian Statistics

    CERN Document Server

    Louzada, Francisco; Rifo, Laura; Stern, Julio; Lauretto, Marcelo

    2015-01-01

    Through refereed papers, this volume focuses on the foundations of the Bayesian paradigm; their comparison to objectivistic or frequentist Statistics counterparts; and the appropriate application of Bayesian foundations. This research in Bayesian Statistics is applicable to data analysis in biostatistics, clinical trials, law, engineering, and the social sciences. EBEB, the Brazilian Meeting on Bayesian Statistics, is held every two years by the ISBrA, the International Society for Bayesian Analysis, one of the most active chapters of the ISBA. The 12th meeting took place March 10-14, 2014 in Atibaia. Interest in foundations of inductive Statistics has grown recently in accordance with the increasing availability of Bayesian methodological alternatives. Scientists need to deal with the ever more difficult choice of the optimal method to apply to their problem. This volume shows how Bayes can be the answer. The examination and discussion on the foundations work towards the goal of proper application of Bayesia...

  5. Denoising Message Passing for X-ray Computed Tomography Reconstruction

    CERN Document Server

    Perelli, Alessandro; Can, Ali; Davies, Mike E

    2016-01-01

    X-ray Computed Tomography (CT) reconstruction from a sparse number of views is becoming a powerful way to reduce either the radiation dose or the acquisition time in CT systems, but it still requires huge computational time. This paper introduces an approximate Bayesian inference framework for CT reconstruction based on a family of denoising approximate message passing (DCT-AMP) algorithms able to improve both the convergence speed and the reconstruction quality. Approximate Message Passing for Compressed Sensing has been extensively analysed for random linear measurements, but there are still no clear solutions on how AMP should be modified and how it performs with real world problems. In particular, to overcome the convergence issues of DCT-AMP with structured measurement matrices, we propose a disjoint preconditioned version of the algorithm tailored for both the geometric system model and the noise model. In addition the Bayesian DCT-AMP formulation allows one to measure how the current estimate is close to the pr...

  6. ACL reconstruction - discharge

    Science.gov (United States)

    Anterior cruciate ligament reconstruction - discharge; ACL reconstruction - discharge ... had surgery to reconstruct your anterior cruciate ligament (ACL). The surgeon drilled holes in the bones of ...

  7. Bayesian probabilities of earthquake occurrences in Longmenshan fault system (China)

    Science.gov (United States)

    Wang, Ying; Zhang, Keyin; Gan, Qigang; Zhou, Wen; Xiong, Liang; Zhang, Shihua; Liu, Chao

    2015-01-01

    China has a long history of earthquake records, and the Longmenshan fault system (LFS) is a famous earthquake zone. We believe that the LFS can be divided into three seismogenic zones (north, central, and south) based on the geological structures and the earthquake catalog. We applied the Bayesian probability method using an extreme-value distribution of earthquake occurrences to estimate the seismic hazard in the LFS. The seismic moment, slip rate, earthquake recurrence rate, and magnitude were considered as the basic parameters for computing the Bayesian prior estimates of the seismicity. These estimates were then updated in terms of Bayes' theorem and historical estimates of seismicity in the LFS. Generally speaking, the north zone appears quiet compared with the central and south zones. The central zone is the most dangerous; however, the periodicity of earthquake occurrences for Ms = 8.0 is quite long (1,250 to 5,000 years). The selection of the upper bound probable magnitude influences the result, and the upper bound magnitude of the south zone may be 7.5. We obtained the empirical relationship for converting between Ms and ML, the magnitude of completeness Mc (3.5), and the Gutenberg-Richter b value before applying the Bayesian extreme-value method to earthquake occurrences.
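
    The flavor of such Bayesian seismicity updating can be sketched with a conjugate Gamma-Poisson model: a Gamma prior on the occurrence rate, informed by geological estimates such as slip rate, is updated by catalog counts, and the posterior yields the probability of at least one event in a future window. All numbers below are illustrative assumptions, not the Longmenshan values, and the extreme-value machinery of the paper is not reproduced.

```python
from scipy import stats

# Gamma prior on the annual rate of M >= 6 events, from geological estimates
prior_alpha, prior_beta = 2.0, 100.0    # prior mean 0.02 events/yr

# historical catalog: n_events observed over t_years
n_events, t_years = 8, 300.0

# conjugate Gamma-Poisson update
post_alpha = prior_alpha + n_events
post_beta = prior_beta + t_years

rate = stats.gamma(post_alpha, scale=1 / post_beta)
print(f"posterior mean rate: {rate.mean():.4f} events/yr")

# P(at least one event in the next 50 years), integrating over rate
# uncertainty: P(N = 0) = (beta / (beta + t))^alpha for a Gamma posterior
p_none = (post_beta / (post_beta + 50.0)) ** post_alpha
print(f"P(at least one event in 50 yr) = {1 - p_none:.3f}")
```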

  8. Fracture prediction of cardiac lead medical devices using Bayesian networks

    International Nuclear Information System (INIS)

    A novel Bayesian network methodology has been developed to enable the prediction of fatigue fracture of cardiac lead medical devices. The methodology integrates in-vivo device loading measurements, patient demographics, patient activity level, in-vitro fatigue strength measurements, and cumulative damage modeling techniques. Many plausible combinations of these variables can be simulated within a Bayesian network framework to generate a family of fatigue fracture survival curves, enabling sensitivity analyses and the construction of confidence bounds on reliability predictions. The method was applied to the prediction of conductor fatigue fracture near the shoulder for two market-released cardiac defibrillation leads which had different product performance histories. The case study used recently published data describing the in-vivo curvature conditions and the in-vitro fatigue strength. The prediction results from the methodology aligned well with the observed qualitative ranking of field performance, as well as the quantitative field survival from fracture. This initial success suggests that study of further extension of this method to other medical device applications is warranted. - Highlights: • A new method to simulate the fatigue experience of an implanted cardiac lead. • Fatigue strength and use conditions are incorporated within a Bayesian network. • Confidence bounds reflect the uncertainty in all input parameters. • A case study is presented using market released cardiac leads

  9. Bayesian Posterior Distributions Without Markov Chains

    OpenAIRE

    Cole, Stephen R.; Chu, Haitao; Greenland, Sander; Hamra, Ghassan; Richardson, David B.

    2012-01-01

    Bayesian posterior parameter distributions are often simulated using Markov chain Monte Carlo (MCMC) methods. However, MCMC methods are not always necessary and do not help the uninitiated understand Bayesian inference. As a bridge to understanding Bayesian inference, the authors illustrate a transparent rejection sampling method. In example 1, they illustrate rejection sampling using 36 cases and 198 controls from a case-control study (1976–1983) assessing the relation between residential ex...
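
    The rejection-sampling idea is simple enough to sketch directly (a toy version for a binomial proportion with a uniform prior, not the authors' case-control example): draw parameter values from the prior and accept each with probability proportional to its likelihood.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        k, n = 36, 234                 # assumed toy data: 36 successes in 234 trials

        candidates = rng.uniform(0, 1, size=200_000)        # draws from the prior
        like = stats.binom.pmf(k, n, candidates)
        accept = rng.uniform(0, 1, size=candidates.size) < like / like.max()
        posterior_draws = candidates[accept]                # draws from the posterior

        print(posterior_draws.mean(), np.percentile(posterior_draws, [2.5, 97.5]))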

  10. Bayesian networks with applications in reliability analysis

    OpenAIRE

    Langseth, Helge

    2002-01-01

    A common goal of the papers in this thesis is to propose, formalize and exemplify the use of Bayesian networks as a modelling tool in reliability analysis. The papers span work in which Bayesian networks are merely used as a modelling tool (Paper I), work where models are specially designed to utilize the inference algorithms of Bayesian networks (Paper II and Paper III), and work where the focus has been on extending the applicability of Bayesian networks to very large domains (Paper IV and ...

  11. Thermal History Reconstruction and Hydrocarbon Accumulation Period Discrimination of Jinhu Depression in Subei Basin

    Institute of Scientific and Technical Information of China (English)

    李亚军; 李儒峰; 陈莉琼; 宋宁; 方晶

    2011-01-01

    Based on analysis of vitrinite reflectance and apatite fission-track and fluid-inclusion measurements, we calculated paleotemperature gradients and reconstructed the thermal history, identifying the paleotemperature gradients of the west slope and the Bianminyang tectonic zone of the Jinhu depression. From vitrinite reflectance, the paleotemperature in the west slope ranged from 45.6 to 128.4℃ with a paleotemperature gradient of 45.5℃/km; in the Bianminyang tectonic zone the paleotemperature was 26.4 to 120.3℃ with a gradient of 42.7℃/km. From apatite fission tracks, the paleotemperature gradient was 40.7℃/km in the west slope and 45.8℃/km in the Bianminyang tectonic zone. Comparing the different tectonic zones of the Jinhu depression, the paleotemperature gradient was higher than the present-day geothermal gradient: in the west slope it was 10.4 to 15.2℃/km higher than the current value, and in the Bianminyang tectonic zone 12.4 to 15.3℃/km higher. Thermal-history modeling of typical wells in the west slope and the Bianminyang tectonic zone shows that the paleogeothermal gradient decreased in progressively younger strata: the geothermal gradient of K2t-E1f was higher than that of E2d-Ny. Before the uplift and erosion caused by the Sanduo tectonic events, the paleotemperature of the depression had reached its maximum. The maturity history of the depression shows that R0 was 0.4% at a depth of 1,000 m in the Jinhu depression, where the source rock was at the low-maturity stage; R0 was 0.65% at a depth of 1,900 m, where the temperature reached 90℃ and the hydrocarbon source rocks entered the peak phase. The homogenization temperature of fluid-inclusion samples was between 62 and 93℃ for Well

  12. Radio Reconstructions

    OpenAIRE

    Bulley, James; Jones, Daniel

    2013-01-01

    Radio Reconstructions is a sound installation which uses indeterminate radio broadcasts as its raw material. Each piece is structured by a notated score, which controls its rhythm, dynamics and melodic contour over time. The audio elements used to enact this score are selected in real-time from unknown radio transmissions, by an autonomous software system which continuously scans the radio waves in search of similar fragments of audio. Using a technique known as audio mosaicing, hund...

  13. Ultra-wideband microwave imaging of breast cancer tumors via Bayesian inverse scattering

    Science.gov (United States)

    Fouda, A. E.; Teixeira, F. L.

    2014-02-01

    We develop a new algorithm for ultra-wideband (UWB) microwave imaging of breast cancer tumors using Bayesian inverse scattering. A key feature of the proposed algorithm is that the constitutive properties of breast tissues are reconstructed from scattered UWB microwave signals together with the confidence level of the reconstruction. Having such a confidence level enables minimization of both false alarms and missed detections. Results from the application of the proposed algorithm demonstrate its accuracy in estimating both the location and the permittivity of breast tumors without the need for a priori knowledge of the pointwise properties of the background breast tissue.

  14. Bayesian Methods and Universal Darwinism

    Science.gov (United States)

    Campbell, John

    2009-12-01

    Bayesian methods since the time of Laplace have been understood by their practitioners as closely aligned to the scientific method. Indeed a recent champion of Bayesian methods, E. T. Jaynes, titled his textbook on the subject Probability Theory: the Logic of Science. Many philosophers of science including Karl Popper and Donald Campbell have interpreted the evolution of Science as a Darwinian process consisting of a `copy with selective retention' algorithm abstracted from Darwin's theory of Natural Selection. Arguments are presented for an isomorphism between Bayesian methods and Darwinian processes. Universal Darwinism, as the term has been developed by Richard Dawkins, Daniel Dennett and Susan Blackmore, is the collection of scientific theories which explain the creation and evolution of their subject matter as due to the operation of Darwinian processes. These subject matters span the fields of atomic physics, chemistry, biology and the social sciences. The principle of Maximum Entropy states that systems will evolve to states of highest entropy subject to the constraints of scientific law. This principle may be inverted to provide illumination as to the nature of scientific law. Our best cosmological theories suggest the universe contained much less complexity during the period shortly after the Big Bang than it does at present. The scientific subject matter of atomic physics, chemistry, biology and the social sciences has been created since that time. An explanation is proposed for the existence of this subject matter as due to the evolution of constraints, in the form of adaptations, imposed on Maximum Entropy. It is argued these adaptations were discovered and instantiated through the operation of a succession of Darwinian processes.

  15. Bayesian Query-Focused Summarization

    CERN Document Server

    Daumé, Hal

    2009-01-01

    We present BayeSum (for "Bayesian summarization"), a model for sentence extraction in query-focused summarization. BayeSum leverages the common case in which multiple documents are relevant to a single query. Using these documents as reinforcement for query terms, BayeSum is not afflicted by the paucity of information in short queries. We show that approximate inference in BayeSum is possible on large data sets and results in a state-of-the-art summarization system. Furthermore, we show how BayeSum can be understood as a justified query expansion technique in the language modeling for IR framework.

  16. Numeracy, frequency, and Bayesian reasoning

    Directory of Open Access Journals (Sweden)

    Gretchen B. Chapman

    2009-02-01

    Previous research has demonstrated that Bayesian reasoning performance is improved if uncertainty information is presented as natural frequencies rather than single-event probabilities. A questionnaire study of 342 college students replicated this effect but also found that the performance-boosting benefits of the natural frequency presentation occurred primarily for participants who scored high in numeracy. This finding suggests that even comprehension and manipulation of natural frequencies requires a certain threshold of numeracy abilities, and that the beneficial effects of natural frequency presentation may not be as general as previously believed.
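
    The two presentation formats can be made concrete with a worked toy example (assumed numbers, not an item from the questionnaire): the same posterior follows from Bayes' theorem in either format, but the natural-frequency version reduces to counting cases.

        # Probability format: base rate 1%, sensitivity 80%, false-positive rate 9.6%.
        p_d, sens, fpr = 0.01, 0.8, 0.096
        posterior = (p_d * sens) / (p_d * sens + (1 - p_d) * fpr)

        # Natural-frequency format: of 1000 people, 10 are ill; 8 of them test
        # positive, and about 95 of the 990 healthy also test positive.
        posterior_freq = 8 / (8 + 95)

        print(round(posterior, 3), round(posterior_freq, 3))   # both ~0.078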

  17. Bayesian inference for Hawkes processes

    DEFF Research Database (Denmark)

    Rasmussen, Jakob Gulddahl

    2013-01-01

    The Hawkes process is a practically and theoretically important class of point processes, but parameter-estimation for such a process can pose various problems. In this paper we explore and compare two approaches to Bayesian inference. The first approach is based on the so-called conditional...... intensity function, while the second approach is based on an underlying clustering and branching structure in the Hawkes process. For practical use, MCMC (Markov chain Monte Carlo) methods are employed. The two approaches are compared numerically using three examples of the Hawkes process....
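
    Of the two approaches, the conditional-intensity one is the easier to sketch. Below is a minimal log-likelihood for a Hawkes process with an exponential kernel (parameter names are illustrative); in a Bayesian analysis this would sit inside an MCMC loop together with priors on mu, alpha and beta.

        import numpy as np

        def hawkes_loglik(times, T, mu, alpha, beta):
            # Conditional intensity: lambda(t) = mu + sum_{t_i < t} alpha*exp(-beta*(t - t_i)).
            loglik = 0.0
            for j, t in enumerate(times):
                lam = mu + np.sum(alpha * np.exp(-beta * (t - times[:j])))
                loglik += np.log(lam)
            # Compensator: the integral of lambda(t) over the window [0, T].
            compensator = mu * T + np.sum((alpha / beta) * (1 - np.exp(-beta * (T - times))))
            return loglik - compensator

        times = np.array([0.8, 1.1, 1.3, 4.2, 4.25, 7.0])   # toy event times
        print(hawkes_loglik(times, T=10.0, mu=0.3, alpha=0.6, beta=2.0))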

  18. Collaborative Kalman Filtration: Bayesian Perspective

    Czech Academy of Sciences Publication Activity Database

    Dedecius, Kamil

    Lisbon, Portugal: Institute for Systems and Technologies of Information, Control and Communication (INSTICC), 2014, pp. 468-474. ISBN 978-989-758-039-0. [11th International Conference on Informatics in Control, Automation and Robotics - ICINCO 2014. Vienna (AT), 01.09.2014-03.09.2014] R&D Projects: GA ČR(CZ) GP14-06678P Institutional support: RVO:67985556 Keywords : Bayesian analysis * Kalman filter * distributed estimation Subject RIV: BB - Applied Statistics, Operational Research http://library.utia.cas.cz/separaty/2014/AS/dedecius-0431324.pdf

  19. Bohmian Histories and Decoherent Histories

    OpenAIRE

    Hartle, James B.

    2002-01-01

    The predictions of the Bohmian and the decoherent (or consistent) histories formulations of the quantum mechanics of a closed system are compared for histories -- sequences of alternatives at a series of times. For certain kinds of histories, Bohmian mechanics and decoherent histories may both be formulated in the same mathematical framework within which they can be compared. In that framework, Bohmian mechanics and decoherent histories represent a given history by different operators. Their ...

  1. Bayesian credible interval construction for Poisson statistics

    Institute of Scientific and Technical Information of China (English)

    ZHU Yong-Sheng

    2008-01-01

    The construction of the Bayesian credible (confidence) interval for a Poisson observable including both signal and background, with and without systematic uncertainties, is presented. Introducing a conditional probability satisfying the requirement that the background be no larger than the observed events to construct the Bayesian credible interval is also discussed. A Fortran routine, BPOCI, has been developed to implement the calculation.
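
    The construction can be sketched numerically with a flat prior on the signal and a known expected background (the Fortran routine BPOCI itself is not reproduced here; the grid version below is an illustrative stand-in):

        import numpy as np
        from scipy import stats

        def poisson_credible_interval(n_obs, b, cl=0.90, s_max=50.0, n_grid=5001):
            # Central Bayesian credible interval for signal s >= 0, flat prior,
            # given n_obs counts and known expected background b.
            s = np.linspace(0.0, s_max, n_grid)
            post = stats.poisson.pmf(n_obs, s + b)       # likelihood x flat prior
            post /= np.trapz(post, s)                    # normalize on the grid
            cdf = np.concatenate(([0.0],
                np.cumsum(0.5 * np.diff(s) * (post[1:] + post[:-1]))))
            lo = np.interp((1 - cl) / 2, cdf, s)
            hi = np.interp(1 - (1 - cl) / 2, cdf, s)
            return lo, hi

        print(poisson_credible_interval(n_obs=5, b=1.2))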

  2. Bayesian Decision Theoretical Framework for Clustering

    Science.gov (United States)

    Chen, Mo

    2011-01-01

    In this thesis, we establish a novel probabilistic framework for the data clustering problem from the perspective of Bayesian decision theory. The Bayesian decision theory view makes precise the important questions: what is a cluster, and what should a clustering algorithm optimize? We prove that the spectral clustering (to be specific, the…

  3. Bayesian Statistics for Biological Data: Pedigree Analysis

    Science.gov (United States)

    Stanfield, William D.; Carlton, Matthew A.

    2004-01-01

    The use of Bayes' formula is applied to the biological problem of pedigree analysis to show that Bayes' formula and non-Bayesian or "classical" methods of probability calculation give different answers. First-year college students of biology can be introduced to Bayesian statistics.

  4. Using Bayesian Networks to Improve Knowledge Assessment

    Science.gov (United States)

    Millan, Eva; Descalco, Luis; Castillo, Gladys; Oliveira, Paula; Diogo, Sandra

    2013-01-01

    In this paper, we describe the integration and evaluation of an existing generic Bayesian student model (GBSM) into an existing computerized testing system within the Mathematics Education Project (PmatE--Projecto Matematica Ensino) of the University of Aveiro. This generic Bayesian student model had been previously evaluated with simulated…

  5. Nonparametric Bayesian Modeling of Complex Networks

    DEFF Research Database (Denmark)

    Schmidt, Mikkel Nørgaard; Mørup, Morten

    2013-01-01

    Modeling structure in complex networks using Bayesian nonparametrics makes it possible to specify flexible model structures and infer the adequate model complexity from the observed data. This article provides a gentle introduction to nonparametric Bayesian modeling of complex networks: Using...... for complex networks can be derived and point out relevant literature....

  6. Compiling Relational Bayesian Networks for Exact Inference

    DEFF Research Database (Denmark)

    Jaeger, Manfred; Chavira, Mark; Darwiche, Adnan

    2004-01-01

    We describe a system for exact inference with relational Bayesian networks as defined in the publicly available \\primula\\ tool. The system is based on compiling propositional instances of relational Bayesian networks into arithmetic circuits and then performing online inference by evaluating and ...

  7. Compiling Relational Bayesian Networks for Exact Inference

    DEFF Research Database (Denmark)

    Jaeger, Manfred; Darwiche, Adnan; Chavira, Mark

    We describe in this paper a system for exact inference with relational Bayesian networks as defined in the publicly available PRIMULA tool. The system is based on compiling propositional instances of relational Bayesian networks into arithmetic circuits and then performing online inference by eva...

  8. Bayesian analysis of exoplanet and binary orbits

    CERN Document Server

    Schulze-Hartung, Tim; Henning, Thomas

    2012-01-01

    We introduce BASE (Bayesian astrometric and spectroscopic exoplanet detection and characterisation tool), a novel program for the combined or separate Bayesian analysis of astrometric and radial-velocity measurements of potential exoplanet hosts and binary stars. The capabilities of BASE are demonstrated using all publicly available data of the binary Mizar A.

  9. Computational methods for Bayesian model choice

    OpenAIRE

    Robert, Christian P.; Wraith, Darren

    2009-01-01

    In this note, we briefly survey some recent approaches to the approximation of the Bayes factor used in Bayesian hypothesis testing and in Bayesian model choice. In particular, we reassess importance sampling, harmonic mean sampling, and nested sampling from a unified perspective.

  10. Internal dosimetry of uranium isotopes using bayesian inference methods

    International Nuclear Information System (INIS)

    A group of personnel at Los Alamos National Laboratory is routinely monitored for the presence of uranium isotopes by urine bioassay. Samples are analysed by alpha spectroscopy, and the results are examined for evidence of an intake of uranium. Because the measurement uncertainties are often comparable to the quantities of material we wish to detect, statistical considerations are crucial for the proper interpretation of the data. The problem is further complicated by the significant, but highly non-uniform, presence of uranium in local drinking water and, in some cases, food supply. Software originally developed for internal dosimetry of plutonium has been adapted to the problem of uranium dosimetry. The software uses an unfolding algorithm to calculate an approximate Bayesian solution to the problem of characterising any intakes which may have occurred, given the history of urine bioassay results for each individual in the monitored population. The program uses biokinetic models from ICRP Publications 68 and later, and a prior probability distribution derived empirically from the body of uranium bioassay data collected at Los Alamos over the operating history of the Laboratory. For each individual, the software creates a posterior probability distribution of intake quantity and solubility type as a function of time. From this distribution, estimates are made of the committed effective dose equivalent (CEDE) to each individual. Results of the method are compared with those obtained using an earlier classical (non-Bayesian) algorithm for uranium dosimetry. We also discuss the problem of distinguishing occupational intakes from intake of environmental uranium, within a Bayesian framework. (author)
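
    The unfolding step can be caricatured on a grid. The sketch below uses a single bioassay result, a made-up excretion fraction in place of the ICRP biokinetic models, and an assumed lognormal prior standing in for the empirical Los Alamos prior; it shows only the Bayes-rule structure of the calculation.

        import numpy as np
        from scipy import stats

        intakes = np.logspace(-1, 3, 400)                      # candidate intakes (Bq)
        prior = stats.lognorm(s=2.0, scale=5.0).pdf(intakes)   # assumed prior

        excretion_frac = 1e-3          # made-up urine activity per unit intake
        measured, gsd = 0.02, 1.8      # one bioassay result, lognormal error

        like = stats.lognorm(s=np.log(gsd), scale=intakes * excretion_frac).pdf(measured)
        post = prior * like
        post /= np.trapz(post, intakes)
        mean_intake = np.trapz(intakes * post, intakes)        # posterior mean intake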

  11. 2nd Bayesian Young Statisticians Meeting

    CERN Document Server

    Bitto, Angela; Kastner, Gregor; Posekany, Alexandra

    2015-01-01

    The Second Bayesian Young Statisticians Meeting (BAYSM 2014) and the research presented here facilitate connections among researchers using Bayesian Statistics by providing a forum for the development and exchange of ideas. WU Vienna University of Business and Economics hosted BAYSM 2014 from September 18th to 19th. The guidance of renowned plenary lecturers and senior discussants is a critical part of the meeting and this volume, which follows publication of contributions from BAYSM 2013. The meeting's scientific program reflected the variety of fields in which Bayesian methods are currently employed or could be introduced in the future. Three brilliant keynote lectures by Chris Holmes (University of Oxford), Christian Robert (Université Paris-Dauphine), and Mike West (Duke University), were complemented by 24 plenary talks covering the major topics Dynamic Models, Applications, Bayesian Nonparametrics, Biostatistics, Bayesian Methods in Economics, and Models and Methods, as well as a lively poster session ...

  12. Online query answering with differential privacy: a utility-driven approach using Bayesian inference

    CERN Document Server

    Xiao, Yonghui

    2012-01-01

    Data privacy issues frequently and increasingly arise for data sharing and data analysis tasks. In this paper, we study the problem of online query answering under the rigorous differential privacy model. The existing interactive mechanisms for differential privacy can only support a limited number of queries before the accumulated cost of privacy reaches a certain bound. This limitation has greatly hindered their applicability, especially in the scenario where multiple users legitimately need to pose a large number of queries. To minimize the privacy cost and extend the life span of a system, we propose a utility-driven mechanism for online query answering using Bayesian statistical inference. The key idea is to keep track of the query history and use Bayesian inference to answer a new query using previous query answers. The Bayesian inference algorithm provides both optimal point estimation and optimal interval estimation. We formally quantify the error of the inference result to determine if it satisfies t...

  13. BAYESIAN BICLUSTERING FOR PATIENT STRATIFICATION.

    Science.gov (United States)

    Khakabimamaghani, Sahand; Ester, Martin

    2016-01-01

    The move from Empirical Medicine towards Personalized Medicine has attracted attention to Stratified Medicine (SM). Some methods are provided in the literature for patient stratification, which is the central task of SM; however, there are still significant open issues. First, it is still unclear if integrating different datatypes will help in detecting disease subtypes more accurately, and, if not, which datatype(s) are most useful for this task. Second, it is not clear how we can compare different methods of patient stratification. Third, as most of the proposed stratification methods are deterministic, there is a need for investigating the potential benefits of applying probabilistic methods. To address these issues, we introduce a novel integrative Bayesian biclustering method, called B2PS, for patient stratification and propose methods for evaluating the results. Our experimental results demonstrate the superiority of B2PS over a popular state-of-the-art method and the benefits of Bayesian approaches. Our results agree with the intuition that transcriptomic data forms a better basis for patient stratification than genomic data. PMID:26776199

  14. Modelling of JET diagnostics using Bayesian Graphical Models

    Energy Technology Data Exchange (ETDEWEB)

    Svensson, J. [IPP Greifswald, Greifswald (Germany); Ford, O. [Imperial College, London (United Kingdom); McDonald, D.; Hole, M.; Nessi, G. von; Meakins, A.; Brix, M.; Thomsen, H.; Werner, A.; Sirinelli, A.

    2011-07-01

    The mapping between physics parameters (such as densities, currents, flows, temperatures etc) defining the plasma 'state' under a given model and the raw observations of each plasma diagnostic will 1) depend on the particular physics model used, and 2) be inherently probabilistic, owing to uncertainties in both the observations and instrumental aspects of the mapping, such as calibrations, instrument functions etc. A flexible and principled way of modelling such interconnected probabilistic systems is through so-called Bayesian graphical models. Being an amalgam between graph theory and probability theory, Bayesian graphical models can simulate the complex interconnections between physics models and diagnostic observations from multiple heterogeneous diagnostic systems, making it relatively easy to optimally combine the observations from multiple diagnostics for joint inference on parameters of the underlying physics model, which in itself can be represented as part of the graph. At JET about 10 diagnostic systems have to date been modelled in this way, and this has led to a number of new results, including: the reconstruction of the flux surface topology and q-profiles without any specific equilibrium assumption, using information from a number of different diagnostic systems; profile inversions taking into account the uncertainties in the flux surface positions and a substantial increase in accuracy of JET electron density and temperature profiles, including improved pedestal resolution, through the joint analysis of three diagnostic systems. It is believed that the Bayesian graph approach could potentially be utilised for very large sets of diagnostics, providing a generic data analysis framework for nuclear fusion experiments, which would be able to optimally utilize the information from multiple diagnostics simultaneously, and where the explicit graph representation of the connections to underlying physics models could be used for sophisticated model testing. This

  15. Divergence history of the Carpathian and smooth newts modelled in space and time.

    Science.gov (United States)

    Zieliński, P; Nadachowska-Brzyska, K; Dudek, K; Babik, W

    2016-08-01

    Information about demographic history is essential for the understanding of the processes of divergence and speciation. Patterns of genetic variation within and between closely related species provide insights into the history of their interactions. Here, we investigated historical demography and genetic exchange between the Carpathian (Lissotriton montandoni, Lm) and smooth (L. vulgaris, Lv) newts. We combine an extensive geographical sampling and multilocus nuclear sequence data with the approximate Bayesian computation framework to test alternative scenarios of divergence and reconstruct the temporal and spatial pattern of gene flow between species. A model of recent (last glacial period) interspecific gene flow was favoured over alternative models. Thus, despite the relatively old divergence (4-6 mya) and presumably long periods of isolation, the species have retained the ability to exchange genes. Nevertheless, the low migration rates (ca. 10^-6 per gene copy per generation) are consistent with strong reproductive isolation between the species. Models allowing demographic changes were favoured, suggesting that the effective population sizes of both species have at least doubled since divergence, reaching the current ca. 0.2 million in Lm and 1 million in Lv. We found asymmetry in rates of interspecific gene flow between Lm and one evolutionary lineage of Lv. We suggest that intraspecific polymorphism for hybrid incompatibilities segregating within Lv could explain this pattern and propose further tests to distinguish between alternative explanations. Our study highlights the importance of incorporating intraspecific genetic structure into the models investigating the history of divergence. PMID:27288862

  16. Demographic history and gene flow during silkworm domestication

    OpenAIRE

    Yang, Shao-Yu; Han, Min-Jin; Kang, Li-Fang; Li, Zi-Wen; Shen, Yi-Hong; Zhang, Ze

    2014-01-01

    Background Gene flow plays an important role in the domestication history of domesticated species. However, little is known about the demographic history of the domesticated silkworm involving gene flow with its wild relative. Results In this study, four model-based evolutionary scenarios describing the demographic history of B. mori were hypothesized. Using an Approximate Bayesian Computation method and DNA sequence data from 29 nuclear loci, we found that the gene-flow-at-bottleneck model is the most...

  17. Tracheal reconstructions.

    Science.gov (United States)

    Srikrishna, S V; Shekar, P S; Shetty, N

    1998-12-01

    Surgical reconstruction of the trachea is a relatively complex procedure. Of our 20 cases of tracheal stenosis, 16 underwent tracheal reconstruction for acquired tracheal stenosis, two underwent laser treatment, and another two died before any intervention. The majority of these cases resulted from prolonged ventilation (14 cases), following organophosphorous poisoning (11 cases), Guillain-Barré syndrome, bullet injury and fat embolism; surprisingly there was only one tumor, a case of mucoepidermoid carcinoma with a very unusual presentation. There were 12 males and 4 females in this series, age ranging from 12-35 years. The duration of ventilation ranged from 1-21 days and the interval from decannulation to development of stridor was between 5-34 days. Six of them were approached by the cervical route, 5 by thoracotomy and cervical approach, 2 via median sternotomy and 3 by thoracotomy alone. Five of them required an additional laryngeal drop and 1 required pericardiotomy and release of pulmonary veins to gain additional length. The excised segments of trachea measured 3 to 5 cms in length. All were end-to-end anastomoses with interrupted Vicryl sutures. We have had no experience with stents or prosthetic tubes. Three patients developed anastomotic leaks which were controlled conservatively. Almost all of them required postoperative tracheo-bronchial suctioning with a fibreoptic bronchoscope. We had one death in this series due to sepsis. PMID:9914459

  18. Unsupervised Bayesian decomposition of multiunit EMG recordings using Tabu search.

    Science.gov (United States)

    Ge, Di; Le Carpentier, Eric; Farina, Dario

    2010-03-01

    Intramuscular electromyography (EMG) signals are usually decomposed with semiautomatic procedures that involve interaction with an expert operator. In this paper, a Bayesian statistical model and a maximum a posteriori (MAP) estimator are used to solve the problem of multiunit EMG decomposition in a fully automatic way. The MAP estimation exploits both the likelihood of the reconstructed EMG signal and some physiological constraints, such as the discharge pattern regularity and the refractory period of muscle fibers, as prior information integrated in a Bayesian framework. A Tabu search is proposed to efficiently tackle the nondeterministic polynomial-time-hard (NP-hard) problem of optimization with respect to the motor unit discharge patterns. The method is fully automatic and was tested on simulated and experimental EMG signals. Compared with the semiautomatic decomposition performed by an expert operator, the proposed method resulted in an accuracy of 90.0% +/- 3.8% when decomposing single-channel intramuscular EMG signals recorded from the abductor digiti minimi muscle at contraction forces of 5% and 10% of the maximal force. The method can also be applied to the automatic identification and classification of spikes from other neural recordings. PMID:19457743

  19. Group Tracking of Space Objects within Bayesian Framework

    Directory of Open Access Journals (Sweden)

    Huang Jian

    2013-03-01

    It is imperative to efficiently track and catalogue the extensive population of dense group space objects for space surveillance. As the main instrument for Low Earth Orbit (LEO) space surveillance, a ground-based radar system is usually limited by its resolving power when tracking small space debris in dense populations. Thus, much of the target detection and observation information is missed, which makes traditional tracking methods inefficient. Therefore, we conceived the concept of group tracking. The overall motional tendency of the group objects is the particular focus, while each individual object is simultaneously tracked in effect. The tracking procedure is based on the Bayesian framework. According to the restriction among the group center and the observations of multiple targets, the reconstruction of the number of targets and the estimation of individual trajectories can be greatly improved in accuracy and robustness in the case of high missed-detection rates. The Markov Chain Monte Carlo Particle (MCMC-Particle) algorithm is utilized to solve the Bayesian integral problem. Finally, a simulation of group space object tracking is carried out to validate the efficiency of the proposed method.

  20. Bayesian noise estimation for non-ideal CMB experiments

    CERN Document Server

    Wehus, I K; Eriksen, H K

    2011-01-01

    We describe a Bayesian framework for estimating the time-domain noise covariance of CMB observations, typically parametrized in terms of a 1/f frequency profile. This framework is based on the Gibbs sampling algorithm, which allows for exact marginalization over nuisance parameters through conditional probability distributions. In this paper we implement support for gaps in the data streams and marginalization over fixed time-domain templates, and also outline how to marginalize over confusion from CMB fluctuations, which may be important for high signal-to-noise experiments. As a by-product of the method, we obtain proper constrained realizations, which themselves can be useful for map making. To validate the algorithm, we demonstrate that the reconstructed noise parameters and corresponding uncertainties are unbiased using simulated data. The CPU time required to process a single data stream of 100 000 samples with 1000 samples removed by gaps is 3 seconds if only the maximum posterior parameters are requir...

  1. Bayesian redshift-space distortions correction from galaxy redshift surveys

    CERN Document Server

    Kitaura, Francisco-Shu; Angulo, Raul E; Chuang, Chia-Hsun; Rodriguez-Torres, Sergio; Monteagudo, Carlos Hernandez; Prada, Francisco; Yepes, Gustavo

    2015-01-01

    We present a Bayesian reconstruction method which maps a galaxy distribution from redshift-space to real-space inferring the distances of the individual galaxies. The method is based on sampling density fields assuming a lognormal prior with a likelihood given by the negative binomial distribution function modelling stochastic bias. We assume a deterministic bias given by a power law relating the dark matter density field to the expected halo or galaxy field. Coherent redshift-space distortions are corrected in a Gibbs-sampling procedure by moving the galaxies from redshift-space to real-space according to the peculiar motions derived from the recovered density field using linear theory with the option to include tidal field corrections from second order Lagrangian perturbation theory. The virialised distortions are corrected by sampling candidate real-space positions (being in the neighbourhood of the observations along the line of sight), which are compatible with the bulk flow corrected redshift-space posi...

  2. A Bayesian analysis of regularised source inversions in gravitational lensing

    CERN Document Server

    Suyu, S H; Hobson, M P; Marshall, P J

    2006-01-01

    Strong gravitational lens systems with extended sources are of special interest because they provide additional constraints on the models of the lens systems. To use a gravitational lens system for measuring the Hubble constant, one would need to determine the lens potential and the source intensity distribution simultaneously. A linear inversion method to reconstruct a pixellated source distribution of a given lens potential model was introduced by Warren and Dye. In the inversion process, a regularisation on the source intensity is often needed to ensure a successful inversion with a faithful resulting source. In this paper, we use Bayesian analysis to determine the optimal regularisation constant (strength of regularisation) of a given form of regularisation and to objectively choose the optimal form of regularisation given a selection of regularisations. We consider and compare quantitatively three different forms of regularisation previously described in the literature for source inversions in gravitatio...
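
    For a linear inversion with Gaussian noise and a quadratic regularising prior, the Bayesian choice of the regularisation constant reduces to maximizing the marginal likelihood (evidence), which is available in closed form. The numpy sketch below is generic and illustrative, not the authors' lensing code; the toy operator, covariances and regularisation matrix are all assumed.

        import numpy as np
        from scipy import stats, optimize

        rng = np.random.default_rng(0)
        n_pix, n_data = 20, 40
        A = rng.normal(size=(n_data, n_pix))     # toy lensing/blurring operator
        R = np.eye(n_pix)                        # zeroth-order regularisation
        C = 0.1 * np.eye(n_data)                 # data noise covariance
        y = A @ rng.normal(scale=0.5, size=n_pix) + \
            rng.multivariate_normal(np.zeros(n_data), C)

        def log_evidence(lam):
            # Marginal likelihood of y for y = A s + n, n ~ N(0, C),
            # with prior s ~ N(0, (lam * R)^-1).
            M = C + A @ np.linalg.inv(lam * R) @ A.T
            return stats.multivariate_normal(np.zeros(n_data), M).logpdf(y)

        res = optimize.minimize_scalar(lambda lg: -log_evidence(10.0 ** lg),
                                       bounds=(-6, 6), method="bounded")
        lam_opt = 10.0 ** res.x                  # evidence-maximizing constant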

  3. Pediatric anterior cruciate ligament reconstruction

    OpenAIRE

    McConkey, Mark O.; Bonasia, Davide Edoardo; Amendola, Annunziato

    2011-01-01

    More anterior cruciate ligament (ACL) injuries are seen in children now than in the past due to increased sports participation. The natural history of ACL-deficient knees in active individuals, particularly in children, is poor. Surgical management of ACL deficiency in children is complex due to the potential risk of injury to the physis and growth disturbance. Delaying ACL reconstruction until maturity is possible but risks instability episodes and intra-articular damage. S...

  4. Bayesian networks in educational assessment

    CERN Document Server

    Almond, Russell G; Steinberg, Linda S; Yan, Duanli; Williamson, David M

    2015-01-01

    Bayesian inference networks, a synthesis of statistics and expert systems, have advanced reasoning under uncertainty in medicine, business, and social sciences. This innovative volume is the first comprehensive treatment exploring how they can be applied to design and analyze innovative educational assessments. Part I develops Bayes nets’ foundations in assessment, statistics, and graph theory, and works through the real-time updating algorithm. Part II addresses parametric forms for use with assessment, model-checking techniques, and estimation with the EM algorithm and Markov chain Monte Carlo (MCMC). A unique feature is the volume’s grounding in Evidence-Centered Design (ECD) framework for assessment design. This “design forward” approach enables designers to take full advantage of Bayes nets’ modularity and ability to model complex evidentiary relationships that arise from performance in interactive, technology-rich assessments such as simulations. Part III describes ECD, situates Bayes nets as ...

  5. Quantum Bayesianism at the Perimeter

    CERN Document Server

    Fuchs, Christopher A

    2010-01-01

    The author summarizes the Quantum Bayesian viewpoint of quantum mechanics, developed originally by C. M. Caves, R. Schack, and himself. It is a view crucially dependent upon the tools of quantum information theory. Work at the Perimeter Institute for Theoretical Physics continues the development and is focused on the hard technical problem of finding a good representation of quantum mechanics purely in terms of probabilities, without amplitudes or Hilbert-space operators. The best candidate representation involves a mysterious entity called a symmetric informationally complete quantum measurement. Contemplation of it gives a way of thinking of the Born Rule as an addition to the rules of probability theory, applicable when one gambles on the consequences of interactions with physical systems. The article ends by outlining some directions for future work.

  6. Bayesian Kernel Mixtures for Counts.

    Science.gov (United States)

    Canale, Antonio; Dunson, David B

    2011-12-01

    Although Bayesian nonparametric mixture models for continuous data are well developed, there is a limited literature on related approaches for count data. A common strategy is to use a mixture of Poissons, which unfortunately is quite restrictive in not accounting for distributions having variance less than the mean. Other approaches include mixing multinomials, which requires finite support, and using a Dirichlet process prior with a Poisson base measure, which does not allow smooth deviations from the Poisson. As a broad class of alternative models, we propose to use nonparametric mixtures of rounded continuous kernels. An efficient Gibbs sampler is developed for posterior computation, and a simulation study is performed to assess performance. Focusing on the rounded Gaussian case, we generalize the modeling framework to account for multivariate count data, joint modeling with continuous and categorical variables, and other complications. The methods are illustrated through applications to a developmental toxicity study and marketing data. This article has supplementary material online. PMID:22523437
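
    The rounding construction is easy to sketch for the Gaussian case (thresholds at the integers are used here for illustration): a latent normal variable is mapped onto the counts, which allows variance below the mean, unlike a Poisson mixture.

        import numpy as np
        from scipy import stats

        def rounded_gaussian_pmf(k, mu, sigma):
            # P(Y = k) when Y rounds a latent N(mu, sigma^2) variable onto the
            # counts; Y = 0 collects all latent mass below 1.
            k = np.asarray(k)
            upper = stats.norm.cdf(k + 1, mu, sigma)
            lower = np.where(k == 0, 0.0, stats.norm.cdf(k, mu, sigma))
            return upper - lower

        ks = np.arange(0, 12)
        pmf = rounded_gaussian_pmf(ks, mu=4.2, sigma=0.6)
        mean = (ks * pmf).sum()
        var = ((ks - mean) ** 2 * pmf).sum()     # underdispersed: var < mean

    In the nonparametric mixture, a Dirichlet process prior would then be placed over the kernel parameters (mu, sigma).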

  7. Hedging Strategies for Bayesian Optimization

    CERN Document Server

    Brochu, Eric; de Freitas, Nando

    2010-01-01

    Bayesian optimization with Gaussian processes has become an increasingly popular tool in the machine learning community. It is efficient and can be used when very little is known about the objective function, making it popular in expensive black-box optimization scenarios. It is able to do this by sampling the objective using an acquisition function which incorporates the model's estimate of the objective and the uncertainty at any given point. However, there are several different parameterized acquisition functions in the literature, and it is often unclear which one to use. Instead of using a single acquisition function, we adopt a portfolio of acquisition functions governed by an online multi-armed bandit strategy. We describe the method, which we call GP-Hedge, and show that this method almost always outperforms the best individual acquisition function.
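
    The portfolio mechanism can be sketched independently of most GP details. The toy below uses scikit-learn's GaussianProcessRegressor and three stand-in acquisition functions (the EI proxy in particular is illustrative, not the paper's exact portfolio): each arm nominates a point, Hedge picks one nominee with probability exponential in accumulated gains, and every arm is rewarded by the posterior mean at its own nominee.

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor

        rng = np.random.default_rng(0)
        f = lambda x: -np.sin(3 * x) - x ** 2 + 0.7 * x    # black-box objective
        grid = np.linspace(-2, 2, 401)[:, None]

        acqs = [lambda m, s: m + 2.0 * s,                  # UCB
                lambda m, s: m + 0.5 * s,                  # crude EI stand-in
                lambda m, s: m]                            # pure exploitation
        gains, eta = np.zeros(3), 1.0

        X = rng.uniform(-2, 2, (2, 1)); y = f(X).ravel()
        for _ in range(20):
            gp = GaussianProcessRegressor().fit(X, y)
            mu, sd = gp.predict(grid, return_std=True)
            nominees = [grid[np.argmax(a(mu, sd))] for a in acqs]
            p = np.exp(eta * (gains - gains.max())); p /= p.sum()
            x_new = nominees[rng.choice(3, p=p)]           # Hedge's choice
            X = np.vstack([X, [x_new]]); y = np.append(y, f(x_new[0]))
            mu_nom, _ = gp.predict(np.array(nominees), return_std=True)
            gains += mu_nom                                # reward all arms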

  8. Nonparametric Bayesian inference in biostatistics

    CERN Document Server

    Müller, Peter

    2015-01-01

    As chapters in this book demonstrate, BNP has important uses in clinical sciences and inference for issues like unknown partitions in genomics. Nonparametric Bayesian approaches (BNP) play an ever expanding role in biostatistical inference from use in proteomics to clinical trials. Many research problems involve an abundance of data and require flexible and complex probability models beyond the traditional parametric approaches. As this book's expert contributors show, BNP approaches can be the answer. Survival Analysis, in particular survival regression, has traditionally used BNP, but BNP's potential is now very broad. This applies to important tasks like arrangement of patients into clinically meaningful subpopulations and segmenting the genome into functionally distinct regions. This book is designed to both review and introduce application areas for BNP. While existing books provide theoretical foundations, this book connects theory to practice through engaging examples and research questions. Chapters c...

  9. On Bayesian System Reliability Analysis

    International Nuclear Information System (INIS)

    The view taken in this thesis is that reliability, the probability that a system will perform a required function for a stated period of time, depends on a person's state of knowledge. Reliability changes as this state of knowledge changes, i.e. when new relevant information becomes available. Most existing models for system reliability prediction are developed in a classical framework of probability theory and they overlook some information that is always present. Probability is just an analytical tool to handle uncertainty, based on judgement and subjective opinions. It is argued that the Bayesian approach gives a much more comprehensive understanding of the foundations of probability than the so-called frequentist school. A new model for system reliability prediction is given in two papers. The model encloses the fact that component failures are dependent because of a shared operational environment. The suggested model also naturally permits learning from failure data of similar components in non-identical environments. 85 refs

  10. Bayesian anti-sparse coding

    CERN Document Server

    Elvira, Clément; Dobigeon, Nicolas

    2015-01-01

    Sparse representations have proven their efficiency in solving a wide class of inverse problems encountered in signal and image processing. Conversely, enforcing the information to be spread uniformly over representation coefficients exhibits relevant properties in various applications such as digital communications. Anti-sparse regularization can be naturally expressed through an $\\ell_{\\infty}$-norm penalty. This paper derives a probabilistic formulation of such representations. A new probability distribution, referred to as the democratic prior, is first introduced. Its main properties as well as three random variate generators for this distribution are derived. Then this probability distribution is used as a prior to promote anti-sparsity in a Gaussian linear inverse problem, yielding a fully Bayesian formulation of anti-sparse coding. Two Markov chain Monte Carlo (MCMC) algorithms are proposed to generate samples according to the posterior distribution. The first one is a standard Gibbs sampler. The seco...

  11. State Information in Bayesian Games

    CERN Document Server

    Cuff, Paul

    2009-01-01

    Two-player zero-sum repeated games are well understood. Computing the value of such a game is straightforward. Additionally, if the payoffs are dependent on a random state of the game known to one, both, or neither of the players, the resulting value of the game has been analyzed under the framework of Bayesian games. This investigation considers the optimal performance in a game when a helper is transmitting state information to one of the players. Encoding information for an adversarial setting (game) requires a different result than rate-distortion theory provides. Game theory has accentuated the importance of randomization (mixed strategy), which does not find a significant role in most communication modems and source coding codecs. Higher rates of communication, used in the right way, allow the message to include the necessary random component useful in games.

  12. An intake prior for the bayesian analysis of plutonium and uranium exposures in an epidemiology study

    International Nuclear Information System (INIS)

    In Bayesian inference, the initial knowledge regarding the value of a parameter, before additional data are considered, is represented as a prior probability distribution. This paper describes the derivation of a prior distribution of intake that was used for the Bayesian analysis of plutonium and uranium worker doses in a recent epidemiology study. The chosen distribution is log-normal with a geometric standard deviation of 6 and a median value that is derived for each worker based on the duration of the work history and the number of reported acute intakes. The median value is a function of the work history and a constant related to activity in air concentration, M, which is derived separately for uranium and plutonium. The value of M is based primarily on measurements of plutonium and uranium in air derived from historical personal air sampler (PAS) data. However, there is significant uncertainty on the value of M that results from the paucity of PAS data and from extrapolating these measurements to actual intakes. This paper compares posterior and prior distributions of intake and investigates the sensitivity of the Bayesian analyses to the assumed value of M. It is found that varying M by a factor of 10 results in a much smaller factor of 2 variation in mean intake and lung dose for both plutonium and uranium. It is concluded that if a log-normal distribution is considered to adequately represent worker intakes, then the Bayesian posterior distribution of dose is relatively insensitive to the assumed value of M. (authors)
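
    The prior itself is compact to write down. In the sketch below the lognormal shape comes from the stated geometric standard deviation; the median, and the way M enters it, are purely illustrative placeholders for the paper's work-history formula.

        import numpy as np

        gsd = 6.0                      # geometric standard deviation of the prior
        sigma = np.log(gsd)            # lognormal shape parameter

        def prior_interval(median):
            # Median and central 95% interval of the lognormal intake prior.
            q = np.exp(1.96 * sigma)
            return median, median / q, median * q

        # Hypothetical sensitivity check: scaling M (and hence the median) by 10.
        for M_scale in (1.0, 10.0):
            print(prior_interval(median=0.5 * M_scale))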

  13. Tracking composite material damage evolution using Bayesian filtering and flash thermography data

    Science.gov (United States)

    Gregory, Elizabeth D.; Holland, Steve D.

    2016-05-01

    We propose a method for tracking the condition of a composite part using Bayesian filtering of flash thermography data over the lifetime of the part. In this demonstration, composite panels were fabricated; impacted to induce subsurface delaminations; and loaded in compression over multiple time steps, causing the delaminations to grow in size. Flash thermography data was collected between each damage event to serve as a time history of the part. The flash thermography indicated some areas of damage but provided little additional information as to the exact nature or depth of the damage. Computed tomography (CT) data was also collected after each damage event and provided a high-resolution volume model of damage that served as ground truth. After each cycle, the condition estimate, from the flash thermography data and the Bayesian filter, was compared to 'ground truth'. The Bayesian process builds on the lifetime history of flash thermography scans and can give better estimates of material condition as compared to the most recent scan alone, which is common practice in the aerospace industry. Bayesian inference provides probabilistic estimates of damage condition that are updated as each new set of data becomes available. The method was tested on simulated data and then on an experimental data set.
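
    The filtering recursion behind such lifetime tracking can be sketched with a discrete damage-state model. The states, growth probabilities and measurement likelihoods below are invented for illustration; the point is the predict/update structure that accumulates evidence across the part's history.

        import numpy as np

        # Delamination size classes; damage never heals.
        T = np.array([[0.7, 0.3, 0.0, 0.0],        # transition matrix for one
                      [0.0, 0.7, 0.3, 0.0],        # loading cycle (assumed)
                      [0.0, 0.0, 0.8, 0.2],
                      [0.0, 0.0, 0.0, 1.0]])

        # P(thermography indication | state); columns: absent, present (assumed).
        L = np.array([[0.9, 0.1],
                      [0.6, 0.4],
                      [0.3, 0.7],
                      [0.1, 0.9]])

        belief = np.array([1.0, 0.0, 0.0, 0.0])    # pristine part
        for z in [0, 1, 1, 1]:                     # indications after each cycle
            belief = belief @ T                    # predict: damage growth
            belief = belief * L[:, z]              # update: thermography data
            belief /= belief.sum()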

  14. Cooperative extensions of the Bayesian game

    CERN Document Server

    Ichiishi, Tatsuro

    2006-01-01

    This is the very first comprehensive monograph in a burgeoning new research area - the theory of cooperative games with incomplete information, with emphasis on the solution concept of Bayesian incentive compatible strong equilibrium, which encompasses the concept of the Bayesian incentive compatible core. Built upon the concepts and techniques of classical static cooperative game theory and of non-cooperative Bayesian game theory, the theory constructs and analyzes in part the powerful n-person game-theoretical model characterized by coordinated strategy-choice with individualistic ince

  15. Bayesian models a statistical primer for ecologists

    CERN Document Server

    Hobbs, N Thompson

    2015-01-01

    Bayesian modeling has become an indispensable tool for ecological research because it is uniquely suited to deal with complexity in a statistically coherent way. This textbook provides a comprehensive and accessible introduction to the latest Bayesian methods-in language ecologists can understand. Unlike other books on the subject, this one emphasizes the principles behind the computations, giving ecologists a big-picture understanding of how to implement this powerful statistical approach. Bayesian Models is an essential primer for non-statisticians. It begins with a definition of probabili

  16. Supra-Bayesian Combination of Probability Distributions

    Czech Academy of Sciences Publication Activity Database

    Sečkárová, Vladimíra

    Veszprém: University of Pannonia, 2010, pp. 112-117. ISBN 978-615-5044-00-7. [11th International PhD Workshop on Systems and Control. Veszprém (HU), 01.09.2010-03.09.2010] R&D Projects: GA ČR GA102/08/0567 Institutional research plan: CEZ:AV0Z10750506 Keywords : Supra-Bayesian approach * sharing of probabilistic information * Bayesian decision making Subject RIV: BC - Control Systems Theory http://library.utia.cas.cz/separaty/2010/AS/seckarova-supra-bayesian combination of probability distributions.pdf

  17. Bayesian Soft Sensing in Cold Sheet Rolling

    Czech Academy of Sciences Publication Activity Database

    Dedecius, Kamil; Jirsa, Ladislav

    Prague: ÚTIA AV ČR, v.v.i., 2010, p. 45. [6th International Workshop on Data–Algorithms–Decision Making. 2.12.2010-4.12.2010, Jindřichův Hradec] R&D Projects: GA MŠk(CZ) 7D09008 Institutional research plan: CEZ:AV0Z10750506 Keywords : soft sensor * bayesian statistics * bayesian model averaging Subject RIV: BB - Applied Statistics, Operational Research http://library.utia.cas.cz/separaty/2010/AS/dedecius-bayesian soft sensing in cold sheet rolling.pdf

  18. Family History

    Science.gov (United States)

  19. The Diagnosis of Reciprocating Machinery by Bayesian Networks

    Institute of Scientific and Technical Information of China (English)

    2003-01-01

    A Bayesian network is a reasoning tool based on probability theory that has many advantages other reasoning tools do not have. This paper discusses the basic theory of Bayesian networks and studies the problems that arise in constructing them. The paper also constructs a Bayesian diagnosis network for a reciprocating compressor. The example supports the conclusion that Bayesian diagnosis networks can diagnose reciprocating machinery effectively.

  20. New insights into the hepatitis E virus genotype 3 phylodynamics and evolutionary history.

    Science.gov (United States)

    Mirazo, Santiago; Mir, Daiana; Bello, Gonzalo; Ramos, Natalia; Musto, Héctor; Arbiza, Juan

    2016-09-01

    Hepatitis E virus (HEV) is an emergent hepatotropic virus endemic mainly in Asia and other developing areas. However, in the last decade it has been increasingly reported in high-income countries. Human-infecting HEV strains are currently classified into four genotypes (1-4). Genotype 3 (HEV-3) is the most prevalent virus genotype and the one most often associated with autochthonous and sporadic cases of HEV in developed areas. The evolutionary history of HEV worldwide remains largely unknown. In this study we reconstructed the spatiotemporal and population dynamics of HEV-3 at global scale, but with particular emphasis on South America, where case reports have increased dramatically in recent years. To achieve this, we applied a Bayesian coalescent-based approach to a comprehensive data set comprising 97 GenBank HEV-3 sequences for which the location and sampling date were documented. Our phylogenetic analyses suggest that the worldwide genetic diversity of HEV-3 can be grouped into two main Clades (I and II) with a TMRCA dated approximately 320 years ago (95% HPD: 236-420 years) and that a unique independent introduction of HEV-3 seems to have occurred in Uruguay, where most of the human HEV cases in South America have been described. The phylodynamic inference indicates that the population size of this virus underwent substantial temporal variations after the second half of the 20th century. Conversely to what has been postulated to date, we suggest that the worldwide effective population size of HEV-3 is not decreasing, and that sources of error in its estimates frequently stem from the assumption that the analyzed sequences are derived from a single panmictic population. Novel insights on the global population dynamics of HEV are given. Additionally, this work constitutes an attempt to further describe, in a Bayesian coalescent framework, the phylodynamics and evolutionary history of HEV-3 in the South American region. PMID:27264728

  1. Quantum State Reconstruction From Incomplete Data

    CERN Document Server

    Buzek, V; Derka, R; Adam, G; Wiedemann, H

    1998-01-01

    Knowing and guessing: these are two essential epistemological pillars in the theory of quantum-mechanical measurement. As formulated, quantum mechanics is a statistical theory. In general, a priori unknown states can be completely determined only when measurements on infinite ensembles of identically prepared quantum systems are performed. But how can one estimate (guess) a quantum state when only incomplete data are available (known)? What is the most reliable estimate based on given measured data? What is the optimal measurement when only a finite number of identically prepared quantum objects are available? These are some of the questions we address. We present several schemes for the reconstruction of states of quantum systems from measured data: (1) We show how the maximum entropy (MaxEnt) principle can be efficiently used for the estimation of quantum states on incomplete observation levels. (2) We show how Bayesian inference can be used for reconstruction of quantum states when only a finite number ...

  2. Computational Imaging for VLBI Image Reconstruction

    CERN Document Server

    Bouman, Katherine L; Zoran, Daniel; Fish, Vincent L; Doeleman, Sheperd S; Freeman, William T

    2015-01-01

    Very long baseline interferometry (VLBI) is a technique for imaging celestial radio emissions by simultaneously observing a source from telescopes distributed across Earth. The challenges in reconstructing images from fine angular resolution VLBI data are immense. The data is extremely sparse and noisy, thus requiring statistical image models such as those designed in the computer vision community. In this paper we present a novel Bayesian approach for VLBI image reconstruction. While other methods require careful tuning and parameter selection for different types of images, our method is robust and produces good results under different settings such as low SNR or extended emissions. The success of our method is demonstrated on realistic synthetic experiments as well as publicly available real data. We present this problem in a way that is accessible to members of the computer vision community, and provide a dataset website (vlbiimaging.csail.mit.edu) to allow for controlled comparisons across algorithms. Thi...

  3. Bayesian approach to inverse problems for functions with a variable-index Besov prior

    Science.gov (United States)

    Jia, Junxiong; Peng, Jigen; Gao, Jinghuai

    2016-08-01

    The Bayesian approach has been adopted to solve inverse problems that reconstruct a function from noisy observations. Prior measures play a key role in the Bayesian method. Hence, many probability measures have been proposed, among which total variation (TV) is a well-known prior measure that can preserve sharp edges. However, it has two drawbacks, the staircasing effect and a lack of the discretization-invariant property. The variable-index TV prior has been proposed and analyzed in the area of image analysis for the former, and the Besov prior has been employed recently for the latter. To overcome both issues together, in this paper, we present a variable-index Besov prior measure, which is a non-Gaussian measure. Some useful properties of this new prior measure have been proven for functions defined on a torus. We have also generalized Bayesian inverse theory in infinite dimensions for our new setting. Finally, this theory has been applied to integer- and fractional-order backward diffusion problems. To the best of our knowledge, this is the first time that the Bayesian approach has been used for the fractional-order backward diffusion problem, which provides an opportunity to quantify its uncertainties.

  4. CURRENT CONCEPTS IN ACL RECONSTRUCTION

    Directory of Open Access Journals (Sweden)

    Freddie H. Fu

    2008-09-01

    Full Text Available Current Concepts in ACL Reconstruction is a complete reference text composed of the most thorough collection of topics on the ACL and its surgical reconstruction compiled, with contributions from some of the world's experts and most experienced ACL surgeons. Various procedures mentioned throughout the text are also demonstrated in an accompanying video CD-ROM. PURPOSE Composing a single, comprehensive and complete information source on the ACL, including basic sciences, clinical issues, latest concepts and surgical techniques, from evaluation to outcome, from history to future, the editors and contributors have aimed to keep the audience apace with the latest concepts and techniques for the evaluation and the treatment of ACL injuries. FEATURES The text is composed of 27 chapters in 6 sections. The first section is mostly about basic sciences; the history of the ACL, imaging, and the clinical approach to adolescent and pediatric patients are also covered. In the second section, Graft Choices and Arthroscopy Portals for ACL Reconstruction are discussed. The third section is about the technique and the outcome of single-bundle ACL reconstruction. The fourth section covers the techniques and outcome of double-bundle ACL reconstruction. The fifth section covers revision, navigation technology, rehabilitation, and the evaluation of the outcome of ACL reconstruction. The sixth and last section looks toward future advances: What We Have Learned and the Future of ACL Reconstruction. AUDIENCE Orthopedic residents, sports traumatology and knee surgery fellows, and orthopedic surgeons, as well as scientists in basic sciences or clinicians who are studying or planning research on the ACL, make up the audience of this book. ASSESSMENT This is the latest, the most complete and comprehensive textbook of ACL reconstruction produced by the editorial work of two pioneers and masters, "Freddie H. Fu MD and Steven B. Cohen MD", with the contribution of world

  5. Breast Reconstruction and Prosthesis

    Science.gov (United States)

    ... have breast reconstruction If you choose to have reconstructive surgery, follow these steps: STEP 1 — Ask your doctor to refer you to a plastic surgeon who is an expert in breast reconstruction. ...

  6. Iterative PET Image Reconstruction Using Translation Invariant Wavelet Transform.

    OpenAIRE

    Zhou, Jian; Senhadji, Lotfi; Coatrieux, Jean-Louis; Luo, Limin

    2009-01-01

    The present work describes a Bayesian maximum a posteriori (MAP) method using a statistical multiscale wavelet prior model. Rather than using the orthogonal discrete wavelet transform (DWT), this prior is built on the translation invariant wavelet transform (TIWT). The statistical modeling of wavelet coefficients relies on the generalized Gaussian distribution. Image reconstruction is performed in spatial domain with a fast block sequential iteration algorithm. We study theoretically the TIWT...
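
    For context, a minimal ML-EM iteration for Poisson emission data, the unregularized baseline onto which MAP priors such as the TIWT model above are typically grafted (the system matrix here is random and purely illustrative):

    import numpy as np

    rng = np.random.default_rng(1)

    # y ~ Poisson(A x): A maps image pixels to detector bins.
    n_pix, n_det = 64, 96
    A = rng.random((n_det, n_pix))
    A /= A.sum(axis=0, keepdims=True)         # normalize detection columns
    x_true = rng.gamma(2.0, 5.0, size=n_pix)  # synthetic emission image
    y = rng.poisson(A @ x_true)               # Poisson projection data

    x = np.ones(n_pix)                        # positive initial image
    sens = A.sum(axis=0)                      # sensitivity image A^T 1
    for _ in range(100):
        ratio = y / np.maximum(A @ x, 1e-12)
        x *= (A.T @ ratio) / sens             # multiplicative EM update

    err = np.linalg.norm(x - x_true) / np.linalg.norm(x_true)
    print("relative reconstruction error:", round(float(err), 3))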

  7. An Intuitive Dashboard for Bayesian Network Inference

    Science.gov (United States)

    Reddy, Vikas; Charisse Farr, Anna; Wu, Paul; Mengersen, Kerrie; Yarlagadda, Prasad K. D. V.

    2014-03-01

    Current Bayesian network software packages provide a good graphical interface for users who design and develop Bayesian networks for various applications. However, the intended end-users of these networks may not necessarily find such an interface appealing, and at times it can be overwhelming, particularly when the number of nodes in the network is large. To circumvent this problem, this paper presents an intuitive dashboard, which provides an additional layer of abstraction, enabling the end-users to easily perform inferences over the Bayesian networks. Unlike most software packages, which display the nodes and arcs of the network, the developed tool organises the nodes based on the cause-and-effect relationship, making the user interaction more intuitive and friendly. In addition to performing various types of inferences, the users can conveniently use the tool to verify the behaviour of the developed Bayesian network. The tool has been developed using the Qt and SMILE libraries in C++.
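
    A toy example of the kind of inference such end-users perform, reduced to a two-node cause-and-effect network with made-up probabilities (no dashboard or SMILE API is used here):

    # P(Cause) and P(Effect | Cause) for a hypothetical two-node network.
    p_cause = 0.01
    p_eff_given = {True: 0.9, False: 0.05}

    # Posterior P(Cause = true | Effect = true) via Bayes' rule.
    num = p_cause * p_eff_given[True]
    den = num + (1 - p_cause) * p_eff_given[False]
    print("P(Cause | Effect) =", round(num / den, 3))   # ~0.154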

  8. A Bayesian approach to model uncertainty

    International Nuclear Information System (INIS)

    A Bayesian approach to model uncertainty is taken. For the case of a finite number of alternative models, the model uncertainty is equivalent to parameter uncertainty. A derivation based on Savage's partition problem is given
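
    A sketch of the finite-model case in code, with invented prior weights and marginal likelihoods: the model uncertainty reduces to a discrete posterior over the alternatives.

    import numpy as np

    prior = np.array([0.5, 0.3, 0.2])              # P(M_i), assumed
    marg_lik = np.array([1.2e-4, 8.0e-4, 2.0e-5])  # p(data | M_i), assumed
    post = prior * marg_lik
    post /= post.sum()                             # P(M_i | data)
    print("posterior model probabilities:", post.round(3))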

  9. An Intuitive Dashboard for Bayesian Network Inference

    International Nuclear Information System (INIS)

    Current Bayesian network software packages provide a good graphical interface for users who design and develop Bayesian networks for various applications. However, the intended end-users of these networks may not necessarily find such an interface appealing, and at times it can be overwhelming, particularly when the number of nodes in the network is large. To circumvent this problem, this paper presents an intuitive dashboard, which provides an additional layer of abstraction, enabling the end-users to easily perform inferences over the Bayesian networks. Unlike most software packages, which display the nodes and arcs of the network, the developed tool organises the nodes based on the cause-and-effect relationship, making the user interaction more intuitive and friendly. In addition to performing various types of inferences, the users can conveniently use the tool to verify the behaviour of the developed Bayesian network. The tool has been developed using the Qt and SMILE libraries in C++.

  10. Bayesian Control for Concentrating Mixed Nuclear Waste

    OpenAIRE

    Welch, Robert L.; Smith, Clayton

    2013-01-01

    A control algorithm for batch processing of mixed waste is proposed based on conditional Gaussian Bayesian networks. The network is compiled during batch staging for real-time response to sensor input.

  11. Learning Bayesian networks for discrete data

    KAUST Repository

    Liang, Faming

    2009-02-01

    Bayesian networks have received much attention in the recent literature. In this article, we propose an approach to learn Bayesian networks using the stochastic approximation Monte Carlo (SAMC) algorithm. Our approach has two nice features. Firstly, it possesses a self-adjusting mechanism and thus essentially avoids the local-trap problem suffered by conventional MCMC simulation-based approaches to learning Bayesian networks. Secondly, it falls into the class of dynamic importance sampling algorithms; the network features can be inferred by dynamically weighted averaging of the samples generated in the learning process, and the resulting estimates can have much lower variation than single-model-based estimates. The numerical results indicate that our approach can mix much faster over the space of Bayesian networks than the conventional MCMC simulation-based approaches. © 2008 Elsevier B.V. All rights reserved.

  12. ABCtoolbox: a versatile toolkit for approximate Bayesian computations

    Directory of Open Access Journals (Sweden)

    Neuenschwander Samuel

    2010-03-01

    Full Text Available Abstract Background The estimation of demographic parameters from genetic data often requires the computation of likelihoods. However, the likelihood function is computationally intractable for many realistic evolutionary models, and the use of Bayesian inference has therefore been limited to very simple models. The situation changed recently with the advent of Approximate Bayesian Computation (ABC) algorithms allowing one to obtain parameter posterior distributions based on simulations not requiring likelihood computations. Results Here we present ABCtoolbox, a series of open source programs to perform Approximate Bayesian Computations (ABC). It implements various ABC algorithms including rejection sampling, MCMC without likelihood, a particle-based sampler and ABC-GLM. ABCtoolbox is bundled with, but not limited to, a program that allows parameter inference in a population genetics context and the simultaneous use of different types of markers with different ploidy levels. In addition, ABCtoolbox can also interact with most simulation and summary statistics computation programs. The usability of ABCtoolbox is demonstrated by inferring the evolutionary history of two evolutionary lineages of Microtus arvalis. Using nuclear microsatellites and mitochondrial sequence data in the same estimation procedure enabled us to infer sex-specific population sizes and migration rates and to find that males show smaller population sizes but much higher levels of migration than females. Conclusion ABCtoolbox allows a user to perform all the necessary steps of a full ABC analysis, from sampling parameters from prior distributions, through data simulation, computation of summary statistics, estimation of posterior distributions, model choice and validation of the estimation procedure, to visualization of the results.
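
    A minimal ABC rejection sampler in the spirit of the toolkit (this is not ABCtoolbox code; the prior, summary statistic and tolerance are arbitrary choices for illustration):

    import numpy as np

    rng = np.random.default_rng(2)

    # Infer a Poisson rate lam from an observed mean without ever
    # evaluating the likelihood.
    obs = rng.poisson(4.0, size=100)              # stand-in for field data
    s_obs = obs.mean()                            # summary statistic

    accepted = []
    for _ in range(100_000):
        lam = rng.uniform(0.0, 20.0)              # draw from the prior
        sim = rng.poisson(lam, size=obs.size)     # simulate a data set
        if abs(sim.mean() - s_obs) < 0.1:         # keep close matches
            accepted.append(lam)

    print("approximate posterior mean:", round(float(np.mean(accepted)), 3))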

  13. Bayesian Variable Selection in Spatial Autoregressive Models

    OpenAIRE

    Jesus Crespo Cuaresma; Philipp Piribauer

    2015-01-01

    This paper compares the performance of Bayesian variable selection approaches for spatial autoregressive models. We present two alternative approaches which can be implemented using Gibbs sampling methods in a straightforward way and allow us to deal with the problem of model uncertainty in spatial autoregressive models in a flexible and computationally efficient way. In a simulation study we show that the variable selection approaches tend to outperform existing Bayesian model averaging tech...

  14. Bayesian Analysis of Multivariate Probit Models

    OpenAIRE

    Siddhartha Chib; Edward Greenberg

    1996-01-01

    This paper provides a unified simulation-based Bayesian and non-Bayesian analysis of correlated binary data using the multivariate probit model. The posterior distribution is simulated by Markov chain Monte Carlo methods, and maximum likelihood estimates are obtained by a Markov chain Monte Carlo version of the E-M algorithm. Computation of Bayes factors from the simulation output is also considered. The methods are applied to a bivariate data set, to a 534-subject, four-year longitudinal dat...

  15. Kernel Bayesian Inference with Posterior Regularization

    OpenAIRE

    Song, Yang; Jun ZHU; Ren, Yong

    2016-01-01

    We propose a vector-valued regression problem whose solution is equivalent to the reproducing kernel Hilbert space (RKHS) embedding of the Bayesian posterior distribution. This equivalence provides a new understanding of kernel Bayesian inference. Moreover, the optimization problem induces a new regularization for the posterior embedding estimator, which is faster and has comparable performance to the squared regularization in kernel Bayes' rule. This regularization coincides with a former th...

  16. Fitness inheritance in the Bayesian optimization algorithm

    OpenAIRE

    Pelikan, Martin; Sastry, Kumara

    2004-01-01

    This paper describes how fitness inheritance can be used to estimate fitness for a proportion of newly sampled candidate solutions in the Bayesian optimization algorithm (BOA). The goal of estimating fitness for some candidate solutions is to reduce the number of fitness evaluations for problems where fitness evaluation is expensive. Bayesian networks used in BOA to model promising solutions and generate the new ones are extended to allow not only for modeling and sampling candidate solutions...

  17. Bayesian Network Models for Adaptive Testing

    Czech Academy of Sciences Publication Activity Database

    Plajner, Martin; Vomlel, Jiří

    Aachen: Sun SITE Central Europe, 2016 - (Agosta, J.; Carvalho, R.), s. 24-33. (CEUR Workshop Proceedings. Vol 1565). ISSN 1613-0073. [The Twelfth UAI Bayesian Modeling Applications Workshop (BMAW 2015). Amsterdam (NL), 16.07.2015] R&D Projects: GA ČR GA13-20012S Institutional support: RVO:67985556 Keywords: Bayesian networks * Computerized adaptive testing Subject RIV: JD - Computer Applications, Robotics http://library.utia.cas.cz/separaty/2016/MTR/plajner-0458062.pdf

  18. Nomograms for Visualization of Naive Bayesian Classifier

    OpenAIRE

    Možina, Martin; Demšar, Janez; Michael W Kattan; Zupan, Blaz

    2004-01-01

    Besides good predictive performance, the naive Bayesian classifier can also offer a valuable insight into the structure of the training data and effects of the attributes on the class probabilities. This structure may be effectively revealed through visualization of the classifier. We propose a new way to visualize the naive Bayesian model in the form of a nomogram. The advantages of the proposed method are simplicity of presentation, clear display of the effects of individual attribute value...

  19. Subjective Bayesian Analysis: Principles and Practice

    OpenAIRE

    Goldstein, Michael

    2006-01-01

    We address the position of subjectivism within Bayesian statistics. We argue, first, that the subjectivist Bayes approach is the only feasible method for tackling many important practical problems. Second, we describe the essential role of the subjectivist approach in scientific analysis. Third, we consider possible modifications to the Bayesian approach from a subjectivist viewpoint. Finally, we address the issue of pragmatism in implementing the subjectivist approach.

  20. An Entropy Search Portfolio for Bayesian Optimization

    OpenAIRE

    Shahriari, Bobak; Wang, Ziyu; Hoffman, Matthew W.; Bouchard-Côté, Alexandre; De Freitas, Nando

    2014-01-01

    Bayesian optimization is a sample-efficient method for black-box global optimization. However, the performance of a Bayesian optimization method very much depends on its exploration strategy, i.e. the choice of acquisition function, and it is not clear a priori which choice will result in superior performance. While portfolio methods provide an effective, principled way of combining a collection of acquisition functions, they are often based on measures of past performance which can be misl...

  1. A Bayesian Framework for Active Artificial Perception

    OpenAIRE

    Ferreira, Joao; Lobo, Jorge; Bessiere, Pierre; Castelo-Branco, M; Dias, Jorge

    2012-01-01

    In this text, we present a Bayesian framework for active multimodal perception of 3D structure and motion. The design of this framework finds its inspiration in the role of the dorsal perceptual pathway of the human brain. Its composing models build upon a common egocentric spatial configuration that is naturally fitting for the integration of readings from multiple sensors using a Bayesian approach. In the process, we will contribute with efficient and robust probabilistic solutions for cycl...

  2. Bayesian Classification in Medicine: The Transferability Question *

    OpenAIRE

    Zagoria, Ronald J.; Reggia, James A.; Price, Thomas R.; Banko, Maryann

    1981-01-01

    Using probabilities derived from a geographically distant patient population, we applied Bayesian classification to categorize stroke patients by etiology. Performance was assessed both by error rate and with a new linear accuracy coefficient. This approach to patient classification was found to be surprisingly accurate when compared to classification by two neurologists and to classification by the Bayesian method using “low cost” local and subjective probabilities. We conclude that for some...

  3. Fuzzy Functional Dependencies and Bayesian Networks

    Institute of Scientific and Technical Information of China (English)

    LIU WeiYi(刘惟一); SONG Ning(宋宁)

    2003-01-01

    Bayesian networks have become a popular technique for representing and reasoning with probabilistic information. The fuzzy functional dependency is an important kind of data dependency in relational databases with fuzzy values. The purpose of this paper is to set up a connection between these data dependencies and Bayesian networks. The connection is made through a set of methods that enable people to obtain the most information about independence conditions from fuzzy functional dependencies.

  4. Evaluation System for a Bayesian Optimization Service

    OpenAIRE

    Dewancker, Ian; McCourt, Michael; Clark, Scott; Hayes, Patrick; Johnson, Alexandra; Ke, George

    2016-01-01

    Bayesian optimization is an elegant solution to the hyperparameter optimization problem in machine learning. Building a reliable and robust Bayesian optimization service requires careful testing methodology and sound statistical analysis. In this talk we will outline our development of an evaluation framework to rigorously test and measure the impact of changes to the SigOpt optimization service. We present an overview of our evaluation system and discuss how this framework empowers our resea...

  5. Bayesian target tracking based on particle filter

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    To be able to deal with nonlinear or non-Gaussian problems, particle filters have been studied by many researchers. Based on the particle filter, the extended Kalman filter (EKF) proposal function is applied to Bayesian target tracking. Novel techniques such as the Markov chain Monte Carlo (MCMC) method and the resampling step are also introduced into Bayesian target tracking. The simulation results confirm that the improved particle filter with these techniques outperforms the basic one.
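
    A minimal bootstrap particle filter for a scalar tracking problem (the transition prior, not the EKF, serves as the proposal here; all parameters are invented):

    import numpy as np

    rng = np.random.default_rng(3)

    T, N = 50, 1000                              # time steps, particles
    q, r = 0.5, 1.0                              # process / measurement std
    x_true = np.cumsum(rng.normal(0, q, T))      # hidden random-walk target
    y = x_true + rng.normal(0, r, T)             # noisy measurements

    particles = rng.normal(0, 1, N)
    estimates = []
    for t in range(T):
        particles = particles + rng.normal(0, q, N)        # propagate
        w = np.exp(-0.5 * ((y[t] - particles) / r) ** 2)
        w /= w.sum()                                        # normalize weights
        estimates.append(np.sum(w * particles))             # posterior mean
        particles = particles[rng.choice(N, size=N, p=w)]   # resample

    rmse = np.sqrt(np.mean((np.array(estimates) - x_true) ** 2))
    print("tracking RMSE:", round(float(rmse), 3))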

  6. Bayesian Models of Brain and Behaviour

    OpenAIRE

    Penny, William

    2012-01-01

    This paper presents a review of Bayesian models of brain and behaviour. We first review the basic principles of Bayesian inference. This is followed by descriptions of sampling and variational methods for approximate inference, and forward and backward recursions in time for inference in dynamical models. The review of behavioural models covers work in visual processing, sensory integration, sensorimotor integration, and collective decision making. The review of brain models covers a range of...

  7. Bayesian Approach to Handling Informative Sampling

    OpenAIRE

    Sikov, Anna

    2015-01-01

    In the case of informative sampling the sampling scheme explicitly or implicitly depends on the response variable. As a result, the sample distribution of the response variable cannot be used for making inference about the population. In this research I investigate the problem of informative sampling from the Bayesian perspective. Application of the Bayesian approach permits solving the problems which arise due to the complexity of the models being used for handling informative sampling. The main...

  8. Climate history of the Southern Hemisphere Westerlies belt during the last glacial-interglacial transition revealed from lake water oxygen isotope reconstruction of Laguna Potrok Aike (52° S, Argentina)

    OpenAIRE

    J. Zhu; A. Lücke; H. Wissel; Mayr, C.; D. Enters; Kim, K. J.; C. Ohlendorf; F. Schäbitz; B. Zolitschka

    2014-01-01

    The Southern Hemisphere westerly winds (SHW) play a crucial role in the large-scale ocean circulation and global carbon cycling. Accordingly, the reconstruction of its latitudinal position and intensity is essential for understanding global climatic fluctuations during the last glacial cycle. The southernmost part of the South American continent is of great importance for paleoclimate studies as the only continental mass intersecting a large part of the SHW ...

  9. Dengue on islands: a Bayesian approach to understanding the global ecology of dengue viruses

    OpenAIRE

    Feldstein, Leora R.; John S Brownstein; Brady, Oliver J.; Simon I Hay; Johansson, Michael A.

    2015-01-01

    Background: Transmission of dengue viruses (DENV), the most common arboviral pathogens globally, is influenced by many climatic and socioeconomic factors. However, the relative contributions of these factors on a global scale are unclear. Methods: We randomly selected 94 islands stratified by socioeconomic and geographic characteristics. With a Bayesian model, we assessed factors contributing to the probability of islands having a history of any dengue outbreaks and of having frequent outbrea...

  10. Bayesian uncertainty analysis for complex physical systems modelled by computer simulators with applications to tipping points

    Science.gov (United States)

    Caiado, C. C. S.; Goldstein, M.

    2015-09-01

    In this paper we present and illustrate basic Bayesian techniques for the uncertainty analysis of complex physical systems modelled by computer simulators. We focus on emulation and history matching and also discuss the treatment of observational errors and structural discrepancies in time series. We exemplify such methods using a four-box model for the thermohaline circulation. We show how these methods may be applied to systems containing tipping points and how to treat possible discontinuities using multiple emulators.
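
    The core of history matching can be stated in a few lines: a candidate input x is ruled out when the standard implausibility measure exceeds a three-sigma-style cutoff (all values below are invented).

    import numpy as np

    z = 3.2                 # observed system value
    e_f, v_f = 2.1, 0.25    # emulator mean and variance at input x
    v_obs = 0.10            # observational error variance
    v_disc = 0.15           # structural (model) discrepancy variance

    impl = abs(z - e_f) / np.sqrt(v_f + v_obs + v_disc)
    print("implausibility:", round(float(impl), 2),
          "-> reject x" if impl > 3.0 else "-> keep x")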

  11. Predicting the prognosis of breast cancer by integrating clinical and microarray data with Bayesian networks

    OpenAIRE

    Gevaert, Olivier; De Smet, Frank; Timmerman, Dirk; Moreau, Yves; De Moor, Bart

    2006-01-01

    MOTIVATION: Clinical data, such as patient history, laboratory analysis, ultrasound parameters--which are the basis of day-to-day clinical decision support--are often underused to guide the clinical management of cancer in the presence of microarray data. We propose a strategy based on Bayesian networks to treat clinical and microarray data on an equal footing. The main advantage of this probabilistic model is that it allows one to integrate these data sources in several ways and that it allows t...

  12. Bayesian Inference of Reticulate Phylogenies under the Multispecies Network Coalescent.

    Science.gov (United States)

    Wen, Dingqiao; Yu, Yun; Nakhleh, Luay

    2016-05-01

    The multispecies coalescent (MSC) is a statistical framework that models how gene genealogies grow within the branches of a species tree. The field of computational phylogenetics has witnessed an explosion in the development of methods for species tree inference under MSC, owing mainly to the accumulating evidence of incomplete lineage sorting in phylogenomic analyses. However, the evolutionary history of a set of genomes, or species, could be reticulate due to the occurrence of evolutionary processes such as hybridization or horizontal gene transfer. We report on a novel method for Bayesian inference of genome and species phylogenies under the multispecies network coalescent (MSNC). This framework models gene evolution within the branches of a phylogenetic network, thus incorporating reticulate evolutionary processes, such as hybridization, in addition to incomplete lineage sorting. As phylogenetic networks with different numbers of reticulation events correspond to points of different dimensions in the space of models, we devise a reversible-jump Markov chain Monte Carlo (RJMCMC) technique for sampling the posterior distribution of phylogenetic networks under MSNC. We implemented the methods in the publicly available, open-source software package PhyloNet and studied their performance on simulated and biological data. The work extends the reach of Bayesian inference to phylogenetic networks and enables new evolutionary analyses that account for reticulation. PMID:27144273

  13. Bayesian predictive modeling for genomic based personalized treatment selection.

    Science.gov (United States)

    Ma, Junsheng; Stingo, Francesco C; Hobbs, Brian P

    2016-06-01

    Efforts to personalize medicine in oncology have been limited by reductive characterizations of the intrinsically complex underlying biological phenomena. Future advances in personalized medicine will rely on molecular signatures that derive from synthesis of multifarious interdependent molecular quantities requiring robust quantitative methods. However, highly parameterized statistical models when applied in these settings often require a prohibitively large database and are sensitive to proper characterizations of the treatment-by-covariate interactions, which in practice are difficult to specify and may be limited by generalized linear models. In this article, we present a Bayesian predictive framework that enables the integration of a high-dimensional set of genomic features with clinical responses and treatment histories of historical patients, providing a probabilistic basis for using the clinical and molecular information to personalize therapy for future patients. Our work represents one of the first attempts to define personalized treatment assignment rules based on large-scale genomic data. We use actual gene expression data acquired from The Cancer Genome Atlas in the settings of leukemia and glioma to explore the statistical properties of our proposed Bayesian approach for personalizing treatment selection. The method is shown to yield considerable improvements in predictive accuracy when compared to penalized regression approaches. PMID:26575856

  14. Bayesian demography 250 years after Bayes.

    Science.gov (United States)

    Bijak, Jakub; Bryant, John

    2016-01-01

    Bayesian statistics offers an alternative to classical (frequentist) statistics. It is distinguished by its use of probability distributions to describe uncertain quantities, which leads to elegant solutions to many difficult statistical problems. Although Bayesian demography, like Bayesian statistics more generally, is around 250 years old, only recently has it begun to flourish. The aim of this paper is to review the achievements of Bayesian demography, address some misconceptions, and make the case for wider use of Bayesian methods in population studies. We focus on three applications: demographic forecasts, limited data, and highly structured or complex models. The key advantages of Bayesian methods are the ability to integrate information from multiple sources and to describe uncertainty coherently. Bayesian methods also allow for including additional (prior) information next to the data sample. As such, Bayesian approaches are complementary to many traditional methods, which can be productively re-expressed in Bayesian terms. PMID:26902889

  15. Bayesian statistical ionospheric tomography improved by incorporating ionosonde measurements

    Science.gov (United States)

    Norberg, Johannes; Virtanen, Ilkka I.; Roininen, Lassi; Vierinen, Juha; Orispää, Mikko; Kauristie, Kirsti; Lehtinen, Markku S.

    2016-04-01

    We validate two-dimensional ionospheric tomography reconstructions against EISCAT incoherent scatter radar measurements. Our tomography method is based on Bayesian statistical inversion with prior distribution given by its mean and covariance. We employ ionosonde measurements for the choice of the prior mean and covariance parameters and use the Gaussian Markov random fields as a sparse matrix approximation for the numerical computations. This results in a computationally efficient tomographic inversion algorithm with clear probabilistic interpretation. We demonstrate how this method works with simultaneous beacon satellite and ionosonde measurements obtained in northern Scandinavia. The performance is compared with results obtained with a zero-mean prior and with the prior mean taken from the International Reference Ionosphere 2007 model. In validating the results, we use EISCAT ultra-high-frequency incoherent scatter radar measurements as the ground truth for the ionization profile shape. We find that in comparison to the alternative prior information sources, ionosonde measurements improve the reconstruction by adding accurate information about the absolute value and the altitude distribution of electron density. With an ionosonde at continuous disposal, the presented method enhances stand-alone near-real-time ionospheric tomography for the given conditions significantly.
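
    A schematic of the underlying computation: a linear Gaussian inversion in which an ionosonde-like prior mean and a smoothness-encoding prior covariance regularize sparse ray measurements (the geometry and numbers are synthetic, and the sparse GMRF machinery is not reproduced):

    import numpy as np

    rng = np.random.default_rng(4)

    n, m = 40, 25                                  # pixels, ray integrals
    A = rng.random((m, n))                         # ray-geometry matrix
    x_true = np.exp(-np.linspace(-2, 2, n) ** 2)   # smooth density profile
    y = A @ x_true + rng.normal(0, 0.05, m)

    mu = np.full(n, x_true.mean())                 # prior mean (ionosonde-like)
    idx = np.arange(n)
    C = np.exp(-np.abs(np.subtract.outer(idx, idx)) / 5.0)  # prior covariance
    R = 0.05 ** 2 * np.eye(m)                      # measurement covariance

    # Posterior mean of the Gaussian linear inverse problem.
    K = C @ A.T @ np.linalg.inv(A @ C @ A.T + R)
    x_post = mu + K @ (y - A @ mu)
    print("reconstruction error:", round(float(np.linalg.norm(x_post - x_true)), 3))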

  16. Bayesian redshift-space distortions correction from galaxy redshift surveys

    Science.gov (United States)

    Kitaura, Francisco-Shu; Ata, Metin; Angulo, Raul E.; Chuang, Chia-Hsun; Rodríguez-Torres, Sergio; Monteagudo, Carlos Hernández; Prada, Francisco; Yepes, Gustavo

    2016-03-01

    We present a Bayesian reconstruction method which maps a galaxy distribution from redshift- to real-space inferring the distances of the individual galaxies. The method is based on sampling density fields assuming a lognormal prior with a likelihood modelling non-linear stochastic bias. Coherent redshift-space distortions are corrected in a Gibbs-sampling procedure by moving the galaxies from redshift- to real-space according to the peculiar motions derived from the recovered density field using linear theory. The virialized distortions are corrected by sampling candidate real-space positions along the line of sight, which are compatible with the bulk flow corrected redshift-space position adding a random dispersion term in high-density collapsed regions (defined by the eigenvalues of the Hessian). This approach presents an alternative method to estimate the distances to galaxies using the three-dimensional spatial information, and assuming isotropy. Hence the number of applications is very broad. In this work, we show the potential of this method to constrain the growth rate up to k ˜ 0.3 h Mpc-1. Furthermore it could be useful to correct for photometric redshift errors, and to obtain improved baryon acoustic oscillations (BAO) reconstructions.

  17. The life history of Pseudometagea schwarzii, with a discussion of the evolution of endoparasitism and koinobiosis in Eucharitidae and Perilampidae (Chalcidoidea)

    Directory of Open Access Journals (Sweden)

    John Heraty

    2013-10-01

    Full Text Available The immature stages and behavior of Pseudometagea schwarzii (Ashmead) (Hymenoptera: Eucharitidae: Eucharitini) are described, and the presence of an endoparasitic planidium that undergoes growth-feeding in the larva of the host ant (Lasius neoniger Emery) is confirmed. Bayesian inference and parsimony ancestral state reconstruction are used to map the evolution of endoparasitism across the eucharitid-perilampid clade. Endoparasitism is proposed to have evolved independently three times within Eucharitidae, including once in Pseudometagea Ashmead, and at least twice in Perilampus Latreille. Endoparasitism is independent as an evolutionary trait from other life history traits such as differences in growth and development of the first-instar larva, hypermetamorphic larval morphology, and other biological traits, including koinobiosis.

  18. Bayesian-based Wavelet Shrinkage for SAR Image Despeckling Using Cycle Spinning

    Institute of Scientific and Technical Information of China (English)

    ZHANG De-xiang; GAO Qing-wei; CHEN Jun-ning

    2006-01-01

    A novel and efficient speckle noise reduction algorithm based on Bayesian wavelet shrinkage using cycle spinning is proposed. First, the sub-band decompositions of non-logarithmically transformed SAR images are shown. Then, a Bayesian wavelet shrinkage factor is applied to the decomposed data to estimate noise-free wavelet coefficients. The method is based on Mixture Gaussian Distribution (MGD) modeling of sub-band coefficients. Finally, multi-resolution wavelet coefficients are reconstructed by wavelet thresholding using cycle spinning. Experimental results show that the proposed despeckling algorithm achieves an excellent balance between suppressing speckle effectively and preserving as many image details and as much sharpness as possible. The new method showed higher performance than other speckle noise reduction techniques while minimizing the effect of pseudo-Gibbs phenomena.
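
    A much-simplified sketch of shrinkage with cycle spinning on a 1-D signal, using a single-level Haar transform and soft thresholding in place of the Bayesian MGD shrinkage factor (the signal, noise level and threshold are arbitrary):

    import numpy as np

    rng = np.random.default_rng(8)

    n = 256
    clean = np.sin(np.linspace(0, 4 * np.pi, n))
    noisy = clean + 0.3 * rng.normal(size=n)

    def haar_denoise(x, thr):
        a = (x[0::2] + x[1::2]) / np.sqrt(2)      # approximation coeffs
        d = (x[0::2] - x[1::2]) / np.sqrt(2)      # detail coeffs
        d = np.sign(d) * np.maximum(np.abs(d) - thr, 0.0)  # soft threshold
        y = np.empty_like(x)
        y[0::2] = (a + d) / np.sqrt(2)            # inverse Haar transform
        y[1::2] = (a - d) / np.sqrt(2)
        return y

    thr = 0.3 * np.sqrt(2 * np.log(n))            # universal threshold
    est = np.mean([np.roll(haar_denoise(np.roll(noisy, -s), thr), s)
                   for s in range(8)], axis=0)    # cycle spinning: average

    print("residual std without/with denoising:",
          round(float(np.std(noisy - clean)), 3),
          round(float(np.std(est - clean)), 3))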

  19. Physalis and physaloids: A recent and complex evolutionary history.

    Science.gov (United States)

    Zamora-Tavares, María Del Pilar; Martínez, Mahinda; Magallón, Susana; Guzmán-Dávalos, Laura; Vargas-Ponce, Ofelia

    2016-07-01

    The complex evolutionary history of the subtribe Physalinae is reflected in the poor resolution of the relationships of Physalis and the physaloid genera. We hypothesize that this low resolution is caused by a recent evolutionary history in a complex geographic setting. The aims of this study were twofold: (1) to determine the phylogenetic relationships of the current genera recognized in Physalinae in order to identify monophyletic groups and resolve the physaloid grade; and (2) to determine the probable causes of the recent divergence in Physalinae. We conducted phylogenetic analyses with maximum likelihood (ML) and Bayesian inference with 50 Physalinae species and 19 others as outgroups, using morphological and molecular data from five plastid and two nuclear regions. A relaxed molecular clock was obtained from the ML topology and ancestral area reconstruction was conducted using the DEC model. The genera Chamaesaracha, Leucophysalis, and Physalis subgenus Rydbergis were recovered as monophyletic. Three clades, Alkekengi-Calliphysalis, Schraderanthus-Tzeltalia, and Witheringia-Brachistus, also received good support. However, even with morphological data and DNA from seven regions, the tree was not completely resolved and many clades remained unsupported. Physalinae diverged at the end of the Miocene (∼9.22 Mya), with one trend indicating that the greatest diversification within the subtribe occurred during the last 5 My. The Neotropical region presented the highest probability (45%) of being the ancestral area of Physalinae, followed by the Mexican Transition Zone (35%). During the Pliocene and Pleistocene, the geographical areas where species were found experienced significant geological and climatic changes, giving rise to rapid and relatively recent diversification events in Physalinae. Thus, recent origin, high diversification, and morphological complexity have contributed, at least with the currently available methods, to the inability to completely

  20. Essential elements of the preoperative breast reconstruction evaluation.

    Science.gov (United States)

    Cheng, Angela; Losken, Albert

    2015-04-01

    A plethora of options exists for breast reconstruction, and the preoperative evaluation must be thorough to lead to a successful outcome. We review multiple components of the preoperative assessment, including the patient's history, goals, imaging, and key elements of the physical exam. Consideration of tumor biology, staging, and the need for or response to chemotherapy or radiation therapy is important in deciding on immediate versus delayed reconstruction. It is also important to consider the patient's anatomy, breast size, and whether the reconstruction will be unilateral or bilateral. The reconstructive surgeon must accommodate all these factors to consider partial or complete mastectomy defects and guide the patient to the most appropriate reconstructive technique, whether it be an oncoplastic reduction mammoplasty, expander-based reconstruction, immediate implant reconstruction, or immediate versus delayed autologous tissue reconstruction such as the deep inferior epigastric artery perforator (DIEP)/transverse rectus abdominis muscle (TRAM), latissimus, transverse upper gracilis (TUG)/profunda femoris artery perforator (PAP), or gluteal artery perforator (GAP) flaps. PMID:26005641

  1. BAYESIAN APPROACH OF DECISION PROBLEMS

    Directory of Open Access Journals (Sweden)

    DRAGOŞ STUPARU

    2010-01-01

    Full Text Available Management is nowadays a basic vector of economic development, a concept frequently used in our country as well as all over the world. Regardless of the hierarchical level at which the managerial process takes place, decision represents its essential moment, the supreme act of managerial activity. Decisions are met in all fields of activity, with a practically unlimited degree of coverage, and in all the functions of management. It is common knowledge that the activity of any type of manager, no matter the hierarchical level he occupies, represents a chain of interdependent decisions, their aim being the elimination or limitation of the influence of disturbing factors that may endanger the achievement of predetermined objectives; the quality of managerial decisions conditions the progress and viability of any enterprise. Therefore, one of the principal characteristics of a successful manager is the ability to adopt optimal decisions of high quality. The quality of managerial decisions is conditioned by the manager's general level of education and specialization, the degree to which he assimilates the latest information and innovations in the theory and practice of management, and the application of modern managerial methods and techniques in the activity of management. We present below the analysis of decision problems under hazardous conditions in terms of Bayesian theory - a theory that uses probabilistic calculus.
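
    A toy Bayesian decision problem of the kind analyzed here, with invented payoffs and probabilities: the optimal action maximizes expected payoff, and new evidence can reverse the choice through Bayes' rule.

    import numpy as np

    payoff = np.array([[120, -40],    # action A under states s1, s2
                       [ 30,  20]])   # action B under states s1, s2
    prior = np.array([0.3, 0.7])      # P(s1), P(s2)
    print("prior choice:", "AB"[int(np.argmax(payoff @ prior))])      # B

    # A survey reports "favorable"; its likelihood depends on the state.
    lik = np.array([0.8, 0.2])        # P(favorable | s1), P(favorable | s2)
    post = prior * lik
    post /= post.sum()
    print("posterior choice:", "AB"[int(np.argmax(payoff @ post))])   # A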

  2. Bayesian analysis of volcanic eruptions

    Science.gov (United States)

    Ho, Chih-Hsiang

    1990-10-01

    The simple Poisson model generally gives a good fit to many volcanoes for volcanic eruption forecasting. Nonetheless, empirical evidence suggests that volcanic activity in successive equal time periods tends to be more variable than a simple Poisson process with constant eruptive rate. An alternative model is therefore examined in which the eruptive rate (λ) for a given volcano or cluster(s) of volcanoes is described by a gamma distribution (prior) rather than treated as a constant value, as in the assumptions of a simple Poisson model. Bayesian analysis is performed to link the two distributions together to give the aggregate behavior of the volcanic activity. When the Poisson process is expanded to accommodate a gamma mixing distribution on λ, a consequence of this mixed (or compound) Poisson model is that the frequency distribution of eruptions in any given time period of equal length follows the negative binomial distribution (NBD). Applications of the proposed model and comparisons between the generalized model and the simple Poisson model are discussed based on the historical eruptive count data of volcanoes Mauna Loa (Hawaii) and Etna (Italy). Several relevant facts lead to the conclusion that the generalized model is preferable for practical use both in space and time.
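
    The gamma-Poisson mixture described above gives the negative binomial distribution in closed form; in our notation, with gamma shape α, gamma rate β and observation window t:

    P(N = k) = \int_0^\infty \frac{(\lambda t)^k e^{-\lambda t}}{k!}
               \cdot \frac{\beta^\alpha \lambda^{\alpha-1} e^{-\beta\lambda}}{\Gamma(\alpha)}\, d\lambda
             = \binom{k+\alpha-1}{k} \left(\frac{\beta}{\beta+t}\right)^{\alpha}
               \left(\frac{t}{\beta+t}\right)^{k}, \qquad k = 0, 1, 2, \dots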

  3. Climate history of the Southern Hemisphere Westerlies belt during the last glacial-interglacial transition revealed from lake water oxygen isotope reconstruction of Laguna Potrok Aike (52° S, Argentina)

    Directory of Open Access Journals (Sweden)

    J. Zhu

    2014-05-01

    Full Text Available The Southern Hemisphere westerly winds (SHW) play a crucial role in the large-scale ocean circulation and global carbon cycling. Accordingly, the reconstruction of their latitudinal position and intensity is essential for understanding global climatic fluctuations during the last glacial cycle. The southernmost part of the South American continent is of great importance for paleoclimate studies as the only continental mass intersecting a large part of the SHW belt. However, continuous proxy records back to the last Glacial are rare in southern Patagonia, owing to the Patagonian Ice Sheets expanding from the Andean area and the scarcity of continuous paleoclimate archives in extra-Andean Patagonia. Here, we present an oxygen isotope record from cellulose and purified bulk organic matter of aquatic moss shoots from the last glacial-interglacial transition preserved in the sediments of Laguna Potrok Aike (52° S, 70° W), a deep maar lake located in semi-arid, extra-Andean Patagonia. The highly significant correlation between oxygen isotope values of aquatic mosses and their host waters and the abundant well-preserved moss remains allow a high-resolution oxygen isotope reconstruction of lake water (δ18Olw) for this lake. Long-term δ18Olw variations are mainly determined by δ18O changes of the source water of the lake, surface air temperature and evaporative 18O enrichment. Under permafrost conditions during the Glacial, the groundwater may not have been recharged by regional precipitation. The isolated groundwater could have had much less negative δ18O values than glacial precipitation. The less 18O-depleted source water and prolonged lake water residence time caused by reduced interchange between in- and outflows could have resulted in the reconstructed glacial δ18Olw that was only ca. 3‰ lower than modern values. The significant two-step rise in reconstructed δ18Olw during the last deglaciation demonstrated the response of isotope composition of lake

  4. Morphological homoplasy, life history evolution, and historical biogeography of plethodontid salamanders inferred from complete mitochondrial genomes

    Energy Technology Data Exchange (ETDEWEB)

    Mueller, Rachel Lockridge; Macey, J. Robert; Jaekel, Martin; Wake, David B.; Boore, Jeffrey L.

    2004-08-01

    The evolutionary history of the largest salamander family (Plethodontidae) is characterized by extreme morphological homoplasy. Analysis of the mechanisms generating such homoplasy requires an independent, molecular phylogeny. To this end, we sequenced 24 complete mitochondrial genomes (22 plethodontids and two outgroup taxa), added data for three species from GenBank, and performed partitioned and unpartitioned Bayesian, ML, and MP phylogenetic analyses. We explored four dataset partitioning strategies to account for evolutionary process heterogeneity among genes and codon positions, all of which yielded increased model likelihoods and decreased numbers of supported nodes in the topologies (PP > 0.95) relative to the unpartitioned analysis. Our phylogenetic analyses yielded congruent trees that contrast with the traditional morphology-based taxonomy; the monophyly of three out of four major groups is rejected. Reanalysis of current hypotheses in light of these new evolutionary relationships suggests that (1) a larval life history stage re-evolved from a direct-developing ancestor multiple times, (2) there is no phylogenetic support for the "Out of Appalachia" hypothesis of plethodontid origins, and (3) novel scenarios must be reconstructed for the convergent evolution of projectile tongues, reduction in toe number, and specialization for defensive tail loss. Some of these novel scenarios imply morphological transformation series that proceed in the opposite direction than was previously thought. In addition, they suggest surprising evolutionary lability in traits previously interpreted to be conservative.

  5. Cosmic expansion history from SN Ia data via information field theory

    CERN Document Server

    Porqueres, Natàlia; Greiner, Maksim; Böhm, Vanessa; Dorn, Sebastian; Ruiz-Lapuente, Pilar; Manrique, Alberto

    2016-01-01

    We present a novel inference algorithm that reconstructs the cosmic expansion history as encoded in the Hubble parameter $H(z)$ from SNe Ia data. The novelty of the approach lies in the usage of information field theory, a statistical field theory that is very well suited for the construction of optimal signal recovery algorithms. The algorithm infers non-parametrically $s(a)=\ln(\rho(a)/\rho_{\mathrm{crit}0})$, the density evolution which determines $H(z)$, without assuming an analytical form of $\rho(a)$ but only its smoothness with the scale factor $a=(1+z)^{-1}$. The inference problem of recovering the signal $s(a)$ from the data is formulated in a fully Bayesian way. In detail, we rewrite the signal as the sum of a background cosmology and a perturbation. This allows us to determine the maximum a posteriori estimate of the signal by an iterative Wiener filter method. Applying this method to the Union2.1 supernova compilation, we recover a cosmic expansion history that is fully compatible with the standard $...
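
    A sketch of the non-iterative core of such a scheme: the Wiener filter (posterior mean) for a linear Gaussian model d = R s + n, here with synthetic covariances rather than the paper's cosmological parametrization:

    import numpy as np

    rng = np.random.default_rng(5)

    n_s, n_d = 50, 30
    R = rng.normal(size=(n_d, n_s))                  # linear response
    idx = np.arange(n_s)
    S = np.exp(-np.abs(np.subtract.outer(idx, idx)) / 8.0)  # smooth signal prior
    N = 0.1 * np.eye(n_d)                            # noise covariance

    s_true = np.linalg.cholesky(S + 1e-9 * np.eye(n_s)) @ rng.normal(size=n_s)
    d = R @ s_true + rng.normal(0, np.sqrt(0.1), size=n_d)

    # Wiener filter: m = (S^-1 + R^T N^-1 R)^-1 R^T N^-1 d
    Ninv = np.linalg.inv(N)
    D = np.linalg.inv(np.linalg.inv(S) + R.T @ Ninv @ R)
    m = D @ (R.T @ Ninv @ d)
    rel = np.linalg.norm(m - s_true) / np.linalg.norm(s_true)
    print("relative reconstruction error:", round(float(rel), 3))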

  6. Bayesians versus frequentists a philosophical debate on statistical reasoning

    CERN Document Server

    Vallverdú, Jordi

    2016-01-01

    This book analyzes the origins of statistical thinking as well as its related philosophical questions, such as causality, determinism or chance. Bayesian and frequentist approaches are subjected to a historical, cognitive and epistemological analysis, making it possible to not only compare the two competing theories, but to also find a potential solution. The work pursues a naturalistic approach, proceeding from the existence of numerosity in natural environments to the existence of contemporary formulas and methodologies to heuristic pragmatism, a concept introduced in the book’s final section. This monograph will be of interest to philosophers and historians of science and students in related fields. Despite the mathematical nature of the topic, no statistical background is required, making the book a valuable read for anyone interested in the history of statistics and human cognition.

  7. Hierarchical Bayesian analysis of high complexity data for the inversion of metric InSAR in urban environments

    OpenAIRE

    Quartulli, Marco Francesco

    2006-01-01

    In this thesis, structured hierarchical Bayesian models and estimators are considered for the analysis of multidimensional datasets representing high complexity phenomena. The analysis is motivated by the problem of urban scene reconstruction and understanding from meter resolution InSAR data, observations of highly diverse, structured settlements through sophisticated, coherent radar based instruments from airborne or spaceborne platforms at distances of up to hundreds of kilometers from ...

  8. Bayesian soft x-ray tomography and MHD mode analysis on HL-2A

    Science.gov (United States)

    Li, Dong; Liu, Yi; Svensson, J.; Liu, Y. Q.; Song, X. M.; Yu, L. M.; Mao, Rui; Fu, B. Z.; Deng, Wei; Yuan, B. S.; Ji, X. Q.; Xu, Yuan; Chen, Wei; Zhou, Yan; Yang, Q. W.; Duan, X. R.; Liu, Yong; HL-2A Team

    2016-03-01

    A Bayesian tomography method using so-called Gaussian processes (GPs) for the emission model has been applied to the soft x-ray (SXR) diagnostics on the HL-2A tokamak. To improve the accuracy of reconstructions, the standard GP is extended to a non-stationary version so that different smoothness between the plasma center and the edge can be taken into account in the algorithm. The uncertainty in the reconstruction arising from measurement errors and model limitations can be fully analyzed through the use of Bayesian probability theory. In this work, the SXR reconstructions obtained by this non-stationary Gaussian process tomography (NSGPT) method have been compared with the equilibrium magnetic flux surfaces, generally achieving a satisfactory agreement in terms of both shape and position. In addition, singular-value-decomposition (SVD) and Fast Fourier Transform (FFT) techniques have been applied for the analysis of SXR and magnetic diagnostics, in order to explore the spatial and temporal features of the saturated long-lived magnetohydrodynamic (MHD) instability induced by energetic particles during neutral beam injection (NBI) on HL-2A. The result shows that this ideal internal kink instability has a dominant m/n = 1/1 mode structure along with a harmonic m/n = 2/2, which are coupled near the q = 1 surface with a rotation frequency of 12 kHz.

  9. Romerrigets historie

    DEFF Research Database (Denmark)

    Christiansen, Erik

    The history of the Roman Empire from the legendary founding of Rome in 753 BCE to the accession of Heraclius in 610 CE.

  10. Intellectual History

    DEFF Research Database (Denmark)

    In the 5 Questions book series, this volume presents a range of leading scholars in Intellectual History and the History of Ideas through their answers to a brief questionnaire. Respondents include Michael Friedman, Jacques le Goff, Hans Ulrich Gumbrecht, Jonathan Israel, Philip Pettit, John Pocock...

  11. Re-telling, Re-evaluating and Re-constructing

    Directory of Open Access Journals (Sweden)

    Gorana Tolja

    2013-11-01

    Full Text Available 'Graphic History: Essays on Graphic Novels and/as History' (2012) is a collection of 14 unique essays, edited by scholar Richard Iadonisi, that explores a variety of complex issues within the graphic novel medium as a means of historical narration. The essays address the issues of accuracy of re-counting history, history as re-constructed, and the ethics surrounding historical narration.

  12. Computationally efficient Bayesian inference for inverse problems.

    Energy Technology Data Exchange (ETDEWEB)

    Marzouk, Youssef M.; Najm, Habib N.; Rahn, Larry A.

    2007-10-01

    Bayesian statistics provides a foundation for inference from noisy and incomplete data, a natural mechanism for regularization in the form of prior information, and a quantitative assessment of uncertainty in the inferred results. Inverse problems - representing indirect estimation of model parameters, inputs, or structural components - can be fruitfully cast in this framework. Complex and computationally intensive forward models arising in physical applications, however, can render a Bayesian approach prohibitive. This difficulty is compounded by high-dimensional model spaces, as when the unknown is a spatiotemporal field. We present new algorithmic developments for Bayesian inference in this context, showing strong connections with the forward propagation of uncertainty. In particular, we introduce a stochastic spectral formulation that dramatically accelerates the Bayesian solution of inverse problems via rapid evaluation of a surrogate posterior. We also explore dimensionality reduction for the inference of spatiotemporal fields, using truncated spectral representations of Gaussian process priors. These new approaches are demonstrated on scalar transport problems arising in contaminant source inversion and in the inference of inhomogeneous material or transport properties. We also present a Bayesian framework for parameter estimation in stochastic models, where intrinsic stochasticity may be intermingled with observational noise. Evaluation of a likelihood function may not be analytically tractable in these cases, and thus several alternative Markov chain Monte Carlo (MCMC) schemes, operating on the product space of the observations and the parameters, are introduced.

  13. Dimensionality reduction in Bayesian estimation algorithms

    Directory of Open Access Journals (Sweden)

    G. W. Petty

    2013-03-01

    Full Text Available An idealized synthetic database loosely resembling 3-channel passive microwave observations of precipitation against a variable background is employed to examine the performance of a conventional Bayesian retrieval algorithm. For this dataset, algorithm performance is found to be poor owing to an irreconcilable conflict between the need to find matches in the dependent database versus the need to exclude inappropriate matches. It is argued that the likelihood of such conflicts increases sharply with the dimensionality of the observation space of real satellite sensors, which may utilize 9 to 13 channels to retrieve precipitation, for example. An objective method is described for distilling the relevant information content from N real channels into a much smaller number (M) of pseudochannels while also regularizing the background (geophysical plus instrument noise) component. The pseudochannels are linear combinations of the original N channels obtained via a two-stage principal component analysis of the dependent dataset. Bayesian retrievals based on a single pseudochannel applied to the independent dataset yield striking improvements in overall performance. The differences between the conventional Bayesian retrieval and the reduced-dimensional Bayesian retrieval suggest that a major potential problem with conventional multichannel retrievals – whether Bayesian or not – lies in the common but often inappropriate assumption of diagonal error covariance. The dimensional reduction technique described herein avoids this problem by, in effect, recasting the retrieval problem in a coordinate system in which the desired covariance is lower-dimensional, diagonal, and of unit magnitude.
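
    A sketch of distilling N correlated channels into M pseudochannels by principal component analysis (a simplified, single-stage stand-in for the two-stage procedure described above; the data are synthetic):

    import numpy as np

    rng = np.random.default_rng(6)

    n_obs, N, M = 5000, 9, 2
    mixing = rng.normal(size=(N, 3))                 # 3 latent signals
    X = (rng.normal(size=(n_obs, 3)) @ mixing.T
         + 0.1 * rng.normal(size=(n_obs, N)))        # N noisy channels

    Xc = X - X.mean(axis=0)                          # center the channels
    cov = Xc.T @ Xc / (n_obs - 1)
    evals, evecs = np.linalg.eigh(cov)               # ascending eigenvalues
    W = evecs[:, ::-1][:, :M]                        # top-M directions
    pseudo = Xc @ W                                  # M pseudochannels

    explained = evals[::-1][:M].sum() / evals.sum()
    print(f"{M} pseudochannels keep {explained:.1%} of the channel variance")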

  14. Tactile length contraction as Bayesian inference.

    Science.gov (United States)

    Tong, Jonathan; Ngo, Vy; Goldreich, Daniel

    2016-08-01

    To perceive, the brain must interpret stimulus-evoked neural activity. This is challenging: The stochastic nature of the neural response renders its interpretation inherently uncertain. Perception would be optimized if the brain used Bayesian inference to interpret inputs in light of expectations derived from experience. Bayesian inference would improve perception on average but cause illusions when stimuli violate expectation. Intriguingly, tactile, auditory, and visual perception are all prone to length contraction illusions, characterized by the dramatic underestimation of the distance between punctate stimuli delivered in rapid succession; the origin of these illusions has been mysterious. We previously proposed that length contraction illusions occur because the brain interprets punctate stimulus sequences using Bayesian inference with a low-velocity expectation. A novel prediction of our Bayesian observer model is that length contraction should intensify if stimuli are made more difficult to localize. Here we report a tactile psychophysical study that tested this prediction. Twenty humans compared two distances on the forearm: a fixed reference distance defined by two taps with 1-s temporal separation and an adjustable comparison distance defined by two taps with temporal separation t ≤ 1 s. We observed significant length contraction: As t was decreased, participants perceived the two distances as equal only when the comparison distance was made progressively greater than the reference distance. Furthermore, the use of weaker taps significantly enhanced participants' length contraction. These findings confirm the model's predictions, supporting the view that the spatiotemporal percept is a best estimate resulting from a Bayesian inference process. PMID:27121574
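
    A toy numerical version of the model's central prediction (parameters are ours, not the authors'): combining a Gaussian likelihood around the sensed length with a low-velocity prior shrinks the perceived length more strongly as the temporal separation t decreases.

    import numpy as np

    def perceived_length(l_obs, t, sigma_x=2.0, sigma_v=10.0):
        l = np.linspace(0.01, 3 * l_obs, 4000)           # candidate lengths (cm)
        likelihood = np.exp(-0.5 * ((l - l_obs) / sigma_x) ** 2)
        prior = np.exp(-0.5 * (l / (sigma_v * t)) ** 2)  # favors slow motion
        post = likelihood * prior
        return np.sum(l * post) / np.sum(post)           # posterior mean

    for t in (1.0, 0.4, 0.1):
        print(f"t = {t:.1f} s -> perceived length "
              f"{perceived_length(10.0, t):.1f} cm")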

  15. Bayesian Methods for Medical Test Accuracy

    Directory of Open Access Journals (Sweden)

    Lyle D. Broemeling

    2011-05-01

    Full Text Available Bayesian methods for medical test accuracy are presented, beginning with the basic measures for tests with binary scores: true positive fraction, false positive fraction, positive predictive value, and negative predictive value. The Bayesian approach is taken because of its efficient use of prior information, and the analysis is executed with the Bayesian software package WinBUGS®. The ROC (receiver operating characteristic) curve gives the intrinsic accuracy of medical tests that have ordinal or continuous scores, and the Bayesian approach is illustrated with many examples from cancer and other diseases. Medical tests include X-ray, mammography, ultrasound, computed tomography, magnetic resonance imaging, nuclear medicine and tests based on biomarkers, such as blood glucose values for diabetes. The presentation continues with more specialized methods suitable for measuring the accuracies of clinical studies that have verification bias, and of medical tests without a gold standard. Lastly, the review is concluded with Bayesian methods for measuring the accuracy of the combination of two or more tests.
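
    The binary-score quantities above follow from Bayes' rule in a few lines (the sensitivity, specificity and prevalence are illustrative):

    # Positive/negative predictive value of a binary diagnostic test.
    sens, spec, prev = 0.90, 0.95, 0.02
    p_pos = sens * prev + (1 - spec) * (1 - prev)   # P(test positive)
    ppv = sens * prev / p_pos                       # P(disease | positive)
    npv = spec * (1 - prev) / (1 - p_pos)           # P(healthy | negative)
    print(f"PPV = {ppv:.1%}, NPV = {npv:.1%}")      # PPV ~26.9%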

  16. A Sparse Bayesian Imaging Technique for Efficient Recovery of Reservoir Channels With Time-Lapse Seismic Measurements

    KAUST Repository

    Sana, Furrukh

    2016-06-01

    Subsurface reservoir flow channels are characterized by high permeability values and serve as preferred pathways for fluid propagation. Accurate estimation of their geophysical structures is thus of great importance for the oil industry. The ensemble Kalman filter (EnKF) is a widely used statistical technique for estimating subsurface reservoir model parameters. However, accurate reconstruction of the subsurface geological features with the EnKF is challenging because of the limited measurements available from the wells and the smoothing effects imposed by the ℓ2-norm nature of its update step. A new EnKF scheme based on sparse domain representation was introduced by Sana et al. (2015) to incorporate useful prior structural information in the estimation process for efficient recovery of subsurface channels. In this paper, we extend this work in two ways: 1) we investigate the effects of incorporating time-lapse seismic data on the channel reconstruction; and 2) we explore a Bayesian sparse reconstruction algorithm with the potential to reduce computational requirements. Numerical results suggest that the performance of the new sparse Bayesian EnKF scheme is enhanced with the availability of seismic measurements, leading to further improvement in the recovery of flow channel structures. The sparse Bayesian approach further provides a computationally efficient framework for enforcing a sparse solution, especially with the possibility of using high sparsity rates through the inclusion of seismic data.
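
    For orientation, a minimal perturbed-observation EnKF analysis step, i.e. the standard ℓ2 update whose smoothing the sparse scheme is designed to counteract; this is a generic sketch, not the authors' sparse variant:

        import numpy as np

        def enkf_update(X, y, H, R, rng=np.random.default_rng(0)):
            """Stochastic EnKF analysis step.

            X : (n_state, n_ens) forecast ensemble
            y : (n_obs,) observation vector
            H : (n_obs, n_state) linear observation operator
            R : (n_obs, n_obs) observation error covariance
            """
            n_ens = X.shape[1]
            A = X - X.mean(axis=1, keepdims=True)    # state anomalies
            HA = H @ A                               # observed anomalies
            Pyy = HA @ HA.T / (n_ens - 1) + R
            Pxy = A @ HA.T / (n_ens - 1)
            K = Pxy @ np.linalg.inv(Pyy)             # Kalman gain
            Y = y[:, None] + rng.multivariate_normal(
                np.zeros(len(y)), R, size=n_ens).T   # perturbed observations
            return X + K @ (Y - H @ X)               # analysis ensemble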

  17. PROPRIOCEPTION, BODY BALANCE AND FUNCTIONALITY IN INDIVIDUALS WITH ACL RECONSTRUCTION

    OpenAIRE

    Furlanetto, Tássia Silveira; Peyré-Tartaruga, Leonardo Alexandre; do Pinho, Alexandre Severo; Bernardes, Emanuele da Silva; Zaro, Milton Antonio

    2016-01-01

    Objective: To evaluate and compare proprioception, body balance and knee functionality of individuals with or without unilateral anterior cruciate ligament (ACL) reconstruction. Methods: Forty individuals were divided into two groups: an experimental group of 20 individuals with ACL reconstruction at six months postoperative, and a control group of 20 individuals with no history of lower limb pathologies. In the experimental group, we assessed lower limbs with reconstructed ACL and the contralateral limb;...

  18. A Large Sample Study of the Bayesian Bootstrap

    OpenAIRE

    Lo, Albert Y.

    1987-01-01

    An asymptotic justification of the Bayesian bootstrap is given. Large-sample Bayesian bootstrap probability intervals for the mean, the variance and bands for the distribution, the smoothed density and smoothed rate function are also provided.
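
    The procedure itself is short enough to sketch: each replicate reweights the observed data with flat-Dirichlet weights (the Rubin-style Bayesian bootstrap; the mean and variance are used here only as example functionals):

        import numpy as np

        rng = np.random.default_rng(0)
        x = rng.normal(loc=2.0, size=100)               # illustrative sample
        W = rng.dirichlet(np.ones(len(x)), size=5000)   # one weight vector per draw
        mean_draws = W @ x                              # posterior draws of the mean
        var_draws = W @ x**2 - mean_draws**2            # ... and of the variance
        lo, hi = np.percentile(mean_draws, [2.5, 97.5])
        print(f"95% Bayesian bootstrap interval for the mean: ({lo:.2f}, {hi:.2f})")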

  19. Reconstructing the Alcatraz escape

    Science.gov (United States)

    Baart, F.; Hoes, O.; Hut, R.; Donchyts, G.; van Leeuwen, E.

    2014-12-01

    In the night of June 12, 1962, three inmates used a raft made of raincoats to escape the maximum-security prison island of Alcatraz in San Francisco, United States. History is unclear about what happened to the escapees. At what time did they step into the water, did they survive, and if so, where did they reach land? The fate of the escapees has been the subject of much debate: did they make landfall on Angel Island, or did the current sweep them out of the bay and into the cold Pacific Ocean? In this presentation, we try to shed light on this historic case using a visualization of a high-resolution hydrodynamic simulation of the San Francisco Bay, combined with historical tidal records. By reconstructing the hydrodynamic conditions and using a particle-based simulation of the escapees, we show possible scenarios. The interactive model is visualized using both a 3D photorealistic and a web-based visualization. The "Escape from Alcatraz" scenario demonstrates the capabilities of the 3Di platform. This platform is normally used for overland flooding (1D/2D). The model engine uses a quad-tree structure, resulting in an order-of-magnitude speedup. The subgrid approach takes detailed bathymetry information into account. The inter-model variability is tested by comparing the results with the DFlow Flexible Mesh (DFlowFM) San Francisco Bay model. Interactivity is implemented by converting the models from static programs to interactive libraries, adhering to the Basic Model Interface (BMI). Interactive models are more suitable for answering exploratory research questions such as this reconstruction effort. Although these hydrodynamic simulations only provide circumstantial evidence for solving the mystery of what happened during the foggy dark night of June 12, 1962, they can be used as guidance and provide an interesting test case for applying interactive modelling.
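
    The particle step of such a reconstruction is conceptually simple; below is a forward-Euler Lagrangian tracker under an arbitrary velocity field (purely illustrative: the study drove its particles with 3Di/DFlowFM hydrodynamics, not this toy tidal field):

        import numpy as np

        def track_particles(p0, velocity, t0, dt, n_steps):
            """Forward-Euler advection of particles p0 (n, 2) through velocity(p, t)."""
            p, t, path = p0.astype(float).copy(), t0, [p0.copy()]
            for _ in range(n_steps):
                p = p + velocity(p, t) * dt    # step each particle with local current
                t += dt
                path.append(p.copy())
            return np.stack(path)              # (n_steps + 1, n, 2) trajectories

        # Toy field: oscillating ebb/flood current (m/s) with a 12.4 h tidal period.
        tide = lambda p, t: np.column_stack(
            [0.5 * np.sin(2 * np.pi * t / 44700) * np.ones(len(p)),
             0.1 * np.ones(len(p))])
        paths = track_particles(np.zeros((3, 2)), tide, t0=0.0, dt=60.0, n_steps=360)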

  20. Bayesian Methods for Radiation Detection and Dosimetry

    CERN Document Server

    Groer, Peter G

    2002-01-01

    We performed work in three areas: radiation detection, external radiation dosimetry, and internal radiation dosimetry. In radiation detection we developed Bayesian techniques to estimate the net activity of high and low activity radioactive samples. These techniques have the advantage that the remaining uncertainty about the net activity is described by probability densities. Graphs of the densities show the uncertainty in pictorial form. Figure 1 below demonstrates this point. We applied stochastic processes to develop a method for obtaining Bayesian estimates of 222Rn-daughter products from observed counting rates. In external radiation dosimetry we studied and developed Bayesian methods to estimate radiation doses to an individual with radiation-induced chromosome aberrations. We analyzed chromosome aberrations after exposure to gamma rays and neutrons and developed a method for dose estimation after criticality accidents. The research in internal radiation dosimetry focused on parameter estimation for compartmental models from observed comp...
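
    For the detection piece, the posterior density of a net count rate can be evaluated directly on a grid (flat priors on the nonnegative net and background rates are assumed here; the authors' exact priors may differ):

        import numpy as np
        from scipy import stats

        def net_rate_posterior(n_gross, t_gross, n_bkg, t_bkg, s_grid, b_grid):
            """Marginal posterior of the net rate s from gross and background counts."""
            S, B = np.meshgrid(s_grid, b_grid, indexing="ij")
            # Poisson likelihoods: gross counts at rate s + b, background at rate b.
            logp = (stats.poisson.logpmf(n_gross, (S + B) * t_gross)
                    + stats.poisson.logpmf(n_bkg, B * t_bkg))
            p = np.exp(logp - logp.max())
            post = p.sum(axis=1)                  # marginalize the background rate
            return post / np.trapz(post, s_grid)  # normalized density over s_grid

        s = np.linspace(0.0, 5.0, 400)            # net rate grid (counts/s)
        b = np.linspace(0.0, 5.0, 400)            # background rate grid (counts/s)
        density = net_rate_posterior(n_gross=120, t_gross=60.0,
                                     n_bkg=45, t_bkg=60.0, s_grid=s, b_grid=b)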