WorldWideScience

Sample records for Bayesian history reconstruction

  1. Structure-based bayesian sparse reconstruction

    KAUST Repository

    Quadeer, Ahmed Abdul; Al-Naffouri, Tareq Y.

    2012-01-01

    Sparse signal reconstruction algorithms have attracted research attention due to their wide applications in various fields. In this paper, we present a simple Bayesian approach that utilizes the sparsity constraint and a priori statistical …

  2. Bayesian tomographic reconstruction of microsystems

    International Nuclear Information System (INIS)

    Salem, Sofia Fekih; Vabre, Alexandre; Mohammad-Djafari, Ali

    2007-01-01

    Microtomography by X-ray transmission plays an increasingly dominant role in the study and understanding of microsystems. Within this framework, an experimental setup for high-resolution X-ray microtomography was developed at CEA-List to quantify the physical parameters related to fluid flow in microsystems. Several difficulties arise from the nature of the experimental data collected on this setup: increased measurement errors due to various physical phenomena occurring during image formation (diffusion, beam hardening), and specificities of the setup (limited angle, partial view of the object, weak contrast). To reconstruct the object we must solve an inverse problem. This inverse problem is known to be ill-posed and therefore needs to be regularized by introducing prior information. The main prior information we account for is that the object is composed of a finite, known number of different materials distributed in compact regions. This a priori information is introduced via a Gauss-Markov field for the contrast distributions, with a hidden Potts-Markov field for the material classes, in the Bayesian estimation framework. The computations are done using an appropriate Markov chain Monte Carlo (MCMC) technique. In this paper, we first present the basic steps of the proposed algorithms. We then focus on one of the main steps in any iterative reconstruction method: the computation of the forward and adjoint operators (projection and backprojection). A fast implementation of these two operators is crucial for real application of the method. We give some details on the fast computation of these steps and show some preliminary results of simulations.

  3. Cophylogeny reconstruction via an approximate Bayesian computation.

    Science.gov (United States)

    Baudet, C; Donati, B; Sinaimeri, B; Crescenzi, P; Gautier, C; Matias, C; Sagot, M-F

    2015-05-01

    Despite an increasingly vast literature on cophylogenetic reconstructions for studying host-parasite associations, understanding the common evolutionary history of such systems remains a problem that is far from being solved. Most algorithms for host-parasite reconciliation use an event-based model, where the events generally include (a subset of) cospeciation, duplication, loss, and host switch. All known parsimonious event-based methods then assign a cost to each type of event in order to find a reconstruction of minimum cost. The main problem with this approach is that the cost of the events strongly influences the reconciliation obtained. Some earlier approaches attempt to avoid this problem by finding a Pareto set of solutions, thereby considering event costs under some minimization constraints. To deal with this problem, we developed an algorithm, called Coala, for estimating the frequency of the events based on an approximate Bayesian computation approach. The benefits of this method are 2-fold: (i) it provides more confidence in the set of costs to be used in a reconciliation, and (ii) it allows estimation of the frequency of the events in cases where the data set consists of trees with a large number of taxa. We evaluate our method on simulated and on biological data sets. We show that in both cases, for the same pair of host and parasite trees, different sets of frequencies for the events lead to equally probable solutions. Moreover, these solutions often differ greatly in terms of the number of inferred events. It appears crucial to take this into account before attempting any further biological interpretation of such reconciliations. More generally, we also show that the set of frequencies can vary widely depending on the input host and parasite trees. Indiscriminately applying a standard vector of costs may thus not be a good strategy. © The Author(s) 2014. Published by Oxford University Press, on behalf of the Society of Systematic Biologists.
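
    The rejection flavour of approximate Bayesian computation that Coala builds on can be sketched in a few lines. This is a generic toy illustration, not the paper's algorithm: the Bernoulli model, the tolerance, and all names here are ours.

```python
import random

def abc_rejection(observed_stat, simulate, prior_sample, tol, n_draws=2000, seed=0):
    """Toy ABC rejection sampler: keep parameter draws whose simulated
    summary statistic falls within `tol` of the observed one."""
    rng = random.Random(seed)
    accepted = []
    for _ in range(n_draws):
        theta = prior_sample(rng)      # draw a parameter from the prior
        stat = simulate(theta, rng)    # simulate data, reduce to a summary
        if abs(stat - observed_stat) <= tol:
            accepted.append(theta)
    return accepted

# Toy example: infer a Bernoulli "event frequency" from an observed count.
obs = 30  # observed number of events out of 100 trials
post = abc_rejection(
    observed_stat=obs,
    simulate=lambda p, rng: sum(rng.random() < p for _ in range(100)),
    prior_sample=lambda rng: rng.random(),  # Uniform(0, 1) prior
    tol=2,
)
mean_p = sum(post) / len(post)  # posterior mean, close to 0.3
```

    The accepted draws approximate the posterior over the event frequency without ever evaluating a likelihood, which is the point of ABC when the likelihood of a cophylogeny model is intractable.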

  4. Structure-based bayesian sparse reconstruction

    KAUST Repository

    Quadeer, Ahmed Abdul

    2012-12-01

    Sparse signal reconstruction algorithms have attracted research attention due to their wide applications in various fields. In this paper, we present a simple Bayesian approach that utilizes the sparsity constraint and a priori statistical information (Gaussian or otherwise) to obtain near optimal estimates. In addition, we make use of the rich structure of the sensing matrix encountered in many signal processing applications to develop a fast sparse recovery algorithm. The computational complexity of the proposed algorithm is very low compared with the widely used convex relaxation methods as well as greedy matching pursuit techniques, especially at high sparsity. © 1991-2012 IEEE.
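
    The support-enumeration idea behind this kind of structure-based Bayesian recovery can be illustrated on a toy problem small enough to enumerate supports exhaustively. A minimal sketch assuming i.i.d. Gaussian active entries; the paper's fast algorithm instead exploits sensing-matrix structure, and all names and sizes here are ours.

```python
import itertools
import numpy as np

def bayesian_sparse_mmse(y, A, p, var_x, var_n, kmax=2):
    """Enumerate small supports, weight each by its posterior probability,
    and return the support-averaged MMSE estimate of the sparse signal."""
    m, n = A.shape
    log_w, est = [], []
    for k in range(kmax + 1):
        for S in itertools.combinations(range(n), k):
            AS = A[:, S]                                   # (m, k) submatrix
            Sigma = var_x * AS @ AS.T + var_n * np.eye(m)  # marginal cov of y
            _, logdet = np.linalg.slogdet(Sigma)
            ll = -0.5 * (logdet + y @ np.linalg.solve(Sigma, y))
            log_prior = k * np.log(p) + (n - k) * np.log(1 - p)
            x_hat = np.zeros(n)
            if k:
                # posterior mean of the active entries given this support
                x_hat[list(S)] = var_x * AS.T @ np.linalg.solve(Sigma, y)
            log_w.append(ll + log_prior)
            est.append(x_hat)
    w = np.exp(np.array(log_w) - max(log_w))
    w /= w.sum()
    return (w[:, None] * np.array(est)).sum(axis=0)

rng = np.random.default_rng(1)
A = rng.standard_normal((8, 5)) / np.sqrt(8)
x_true = np.zeros(5); x_true[2] = 3.0
y = A @ x_true + 0.05 * rng.standard_normal(8)
x_mmse = bayesian_sparse_mmse(y, A, p=0.2, var_x=4.0, var_n=0.0025)
```

    Exhaustive enumeration is exponential in the dictionary size; the structured sensing matrix is what makes the paper's recovery fast in practice.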

  5. A Bayesian account of quantum histories

    International Nuclear Information System (INIS)

    Marlow, Thomas

    2006-01-01

    We investigate whether quantum history theories can be consistent with Bayesian reasoning and whether such an analysis helps clarify the interpretation of such theories. First, we summarise and extend recent work categorising two different approaches to formalising multi-time measurements in quantum theory. The standard approach consists of describing an ordered series of measurements in terms of history propositions with non-additive 'probabilities.' The non-standard approach consists of defining multi-time measurements to consist of sets of exclusive and exhaustive history propositions and recovering the single-time exclusivity of results when discussing single-time history propositions. We analyse whether such history propositions can be consistent with Bayes' rule. We show that a certain class of histories is given a natural Bayesian interpretation, namely, the linearly positive histories originally introduced by Goldstein and Page. Thus, we argue that this gives a certain amount of interpretational clarity to the non-standard approach. We also attempt a justification of our analysis using Cox's axioms of probability theory.

  6. Bayesian image reconstruction for improving detection performance of muon tomography.

    Science.gov (United States)

    Wang, Guobao; Schultz, Larry J; Qi, Jinyi

    2009-05-01

    Muon tomography is a novel technology that is being developed for detecting high-Z materials in vehicles or cargo containers. Maximum likelihood methods have been developed for reconstructing the scattering density image from muon measurements. However, the instability of maximum likelihood estimation often results in noisy images and low detectability of high-Z targets. In this paper, we propose using regularization to improve the image quality of muon tomography. We formulate the muon reconstruction problem in a Bayesian framework by introducing a prior distribution on scattering density images. An iterative shrinkage algorithm is derived to maximize the log posterior distribution. At each iteration, the algorithm obtains the maximum a posteriori update by shrinking an unregularized maximum likelihood update. Inverse quadratic shrinkage functions are derived for generalized Laplacian priors and inverse cubic shrinkage functions are derived for generalized Gaussian priors. Receiver operating characteristic studies using simulated data demonstrate that the Bayesian reconstruction can greatly improve the detection performance of muon tomography.
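
    The shrinkage idea, a MAP update obtained by shrinking the unregularized ML update, is easiest to see in the classical Laplacian-prior case, where the shrinkage function is the soft threshold. The inverse quadratic and inverse cubic functions derived in the paper are analogous but not reproduced here.

```python
def soft_shrink(x_ml, lam):
    """Classical shrinkage for a Laplacian prior: the MAP update is the
    unregularized (ML) update pulled toward zero by the threshold `lam`."""
    if x_ml > lam:
        return x_ml - lam
    if x_ml < -lam:
        return x_ml + lam
    return 0.0

# Small ML updates are set to zero; large ones are shrunk by lam = 0.5.
updates = [soft_shrink(v, 0.5) for v in [-2.0, -0.3, 0.0, 0.4, 1.5]]
```

    In the iterative algorithm of the paper, a function of this shape is applied coordinate-wise to each unregularized maximum likelihood update, which is what regularizes the noisy scattering-density images.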

  7. Hierarchical Bayesian sparse image reconstruction with application to MRFM.

    Science.gov (United States)

    Dobigeon, Nicolas; Hero, Alfred O; Tourneret, Jean-Yves

    2009-09-01

    This paper presents a hierarchical Bayesian model to reconstruct sparse images when the observations are obtained from linear transformations and corrupted by an additive white Gaussian noise. Our hierarchical Bayes model is well suited to such naturally sparse image applications as it seamlessly accounts for properties such as sparsity and positivity of the image via appropriate Bayes priors. We propose a prior that is based on a weighted mixture of a positive exponential distribution and a mass at zero. The prior has hyperparameters that are tuned automatically by marginalization over the hierarchical Bayesian model. To overcome the complexity of the posterior distribution, a Gibbs sampling strategy is proposed. The Gibbs samples can be used to estimate the image to be recovered, e.g., by maximizing the estimated posterior distribution. In our fully Bayesian approach, the posteriors of all the parameters are available. Thus, our algorithm provides more information than other previously proposed sparse reconstruction methods that only give a point estimate. The performance of the proposed hierarchical Bayesian sparse reconstruction method is illustrated on synthetic data and real data collected from a tobacco virus sample using a prototype MRFM instrument.

  8. On a full Bayesian inference for force reconstruction problems

    Science.gov (United States)

    Aucejo, M.; De Smet, O.

    2018-05-01

    In a previous paper, the authors introduced a flexible methodology for reconstructing mechanical sources in the frequency domain from prior local information on both their nature and location over a linear and time-invariant structure. The proposed approach was derived from Bayesian statistics, because of its ability to mathematically account for the experimenter's prior knowledge. However, since only the maximum a posteriori estimate was computed, the posterior uncertainty about the regularized solution given the measured vibration field, the mechanical model and the regularization parameter was not assessed. To address this open question, this paper fully exploits the Bayesian framework to provide, from a Markov chain Monte Carlo algorithm, credible intervals and other statistical measures (mean, median, mode) for all the parameters of the force reconstruction problem.
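
    Once MCMC draws of a parameter are available, the credible intervals reported by this kind of analysis reduce to empirical quantiles of the chain. A minimal sketch of an equal-tailed interval; the integer stand-in "draws" are ours, purely for illustration.

```python
def credible_interval(samples, level=0.95):
    """Equal-tailed credible interval from MCMC draws: cut level/2 of the
    probability mass off each tail of the empirical distribution."""
    s = sorted(samples)
    lo = int(len(s) * (1 - level) / 2)
    hi = int(len(s) * (1 + level) / 2) - 1
    return s[lo], s[hi]

# Stand-in draws 0..999: the 95% interval trims 25 values off each end.
lo, hi = credible_interval(list(range(1000)), level=0.95)
```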

  9. Reconstruction of elongated bubbles fusing the information from multiple optical probes through a Bayesian inference technique

    Energy Technology Data Exchange (ETDEWEB)

    Chakraborty, Shubhankar; Das, Prasanta Kr., E-mail: pkd@mech.iitkgp.ernet.in [Department of Mechanical Engineering, Indian Institute of Technology Kharagpur, Kharagpur 721302 (India); Roy Chaudhuri, Partha [Department of Physics, Indian Institute of Technology Kharagpur, Kharagpur 721302 (India)

    2016-07-15

    In this communication, a novel optical technique has been proposed for the reconstruction of the shape of a Taylor bubble using measurements from multiple arrays of optical sensors. The deviation of an optical beam passing through the bubble depends on the contour of the bubble surface. A theoretical model of the deviation of a beam as a Taylor bubble traverses it has been developed. Using this model and the time history of the deviation captured by the sensor array, the bubble shape has been reconstructed. The reconstruction has been performed using an inverse algorithm based on a Bayesian inference technique and a Markov chain Monte Carlo sampling algorithm. The reconstructed nose shape has been compared with the true shape, extracted through image processing of high-speed images. Finally, an error analysis has been performed to pinpoint the sources of the errors.

  10. Food Reconstruction Using Isotopic Transferred Signals (FRUITS): A Bayesian Model for Diet Reconstruction

    Czech Academy of Sciences Publication Activity Database

    Fernandes, R.; Millard, A.R.; Brabec, Marek; Nadeau, M.J.; Grootes, P.

    2014-01-01

    Vol. 9, No. 2 (2014), Art. no. e87436. E-ISSN 1932-6203. Institutional support: RVO:67985807. Keywords: ancient diet reconstruction * stable isotope measurements * mixture model * Bayesian estimation * Dirichlet prior. Subject RIV: BB - Applied Statistics, Operational Research. Impact factor: 3.234, year: 2014

  11. Bayesian signal reconstruction for 1-bit compressed sensing

    International Nuclear Information System (INIS)

    Xu, Yingying; Kabashima, Yoshiyuki; Zdeborová, Lenka

    2014-01-01

    The 1-bit compressed sensing framework enables the recovery of a sparse vector x from the sign information of each entry of its linear transformation. Discarding the amplitude information can significantly reduce the amount of data, which is highly beneficial in practical applications. In this paper, we present a Bayesian approach to signal reconstruction for 1-bit compressed sensing and analyze its typical performance using statistical mechanics. As a basic setup, we consider the case that the measuring matrix Φ has i.i.d. entries and the measurements y are noiseless. Utilizing the replica method, we show that the Bayesian approach enables better reconstruction than the l1-norm minimization approach, asymptotically saturating the performance obtained when the non-zero entry positions of the signal are known, for signals whose non-zero entries follow zero mean Gaussian distributions. We also test a message passing algorithm for signal reconstruction on the basis of belief propagation. The results of numerical experiments are consistent with those of the theoretical analysis.
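
    The measurement model is easy to reproduce: only the sign of each linear measurement is kept, so the signal's amplitude is unrecoverable and only its direction can be estimated. The sketch below uses a crude back-projection baseline rather than the paper's Bayesian message-passing reconstruction; the sizes and names are ours.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, k = 100, 400, 5                # signal length, measurements, sparsity
x = np.zeros(n)
x[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
x /= np.linalg.norm(x)               # amplitude is lost, so fix the scale

Phi = rng.standard_normal((m, n))
y = np.sign(Phi @ x)                 # 1-bit measurements: signs only

# Crude (non-Bayesian) baseline: back-project the signs and renormalize.
x_hat = Phi.T @ y
x_hat /= np.linalg.norm(x_hat)
corr = float(x @ x_hat)              # direction recovery quality in [-1, 1]
```

    Even this naive estimator correlates strongly with the true direction at m = 4n; the Bayesian reconstruction in the paper does substantially better, approaching the known-support limit.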

  12. A Nonparametric Bayesian Approach For Emission Tomography Reconstruction

    International Nuclear Information System (INIS)

    Barat, Eric; Dautremer, Thomas

    2007-01-01

    We introduce a PET reconstruction algorithm following a nonparametric Bayesian (NPB) approach. In contrast with expectation maximization (EM), the proposed technique does not rely on any space discretization. Namely, the activity distribution--the normalized emission intensity of the spatial Poisson process--is considered as a spatial probability density, and observations are the projections of random emissions whose distribution has to be estimated. This approach is nonparametric in the sense that the quantity of interest belongs to the set of probability measures on R^k (for reconstruction in k dimensions) and it is Bayesian in the sense that we define a prior directly on this spatial measure. In this context, we propose to model the nonparametric probability density as an infinite mixture of multivariate normal distributions. As a prior for this mixture we consider a Dirichlet process mixture (DPM) with a Normal-Inverse Wishart (NIW) model as the base distribution of the Dirichlet process. As in EM-family reconstruction, we use a data augmentation scheme where the set of hidden variables are the emission locations for each observed line of response in the continuous object space. Thanks to the data augmentation, we propose a Markov chain Monte Carlo (MCMC) algorithm (Gibbs sampler) which is able to generate draws from the posterior distribution of the spatial intensity. A difference with EM is that one step of the Gibbs sampler corresponds to the generation of emission locations, while only the expected number of emissions per pixel/voxel is used in EM. Another key difference is that the estimated spatial intensity is a continuous function, such that there is no need to compute a projection matrix. Finally, draws from the intensity posterior distribution allow the estimation of posterior functionals such as the variance or confidence intervals. Results are presented for simulated data based on a 2D brain phantom and compared to Bayesian MAP-EM.

  13. Reconstructing Fire History: An Exercise in Dendrochronology

    Science.gov (United States)

    Lafon, Charles W.

    2005-01-01

    Dendrochronology is used widely to reconstruct the history of forest disturbances. I created an exercise that introduces the use of dendrochronology to investigate fire history and forest dynamics. The exercise also demonstrates how the dendrochronological technique of crossdating is employed to age dead trees and identify missing rings. I…

  14. Comparing nonparametric Bayesian tree priors for clonal reconstruction of tumors.

    Science.gov (United States)

    Deshwar, Amit G; Vembu, Shankar; Morris, Quaid

    2015-01-01

    Statistical machine learning methods, especially nonparametric Bayesian methods, have become increasingly popular to infer clonal population structure of tumors. Here we describe the treeCRP, an extension of the Chinese restaurant process (CRP), a popular construction used in nonparametric mixture models, to infer the phylogeny and genotype of major subclonal lineages represented in the population of cancer cells. We also propose new split-merge updates tailored to the subclonal reconstruction problem that improve the mixing time of Markov chains. In comparisons with the tree-structured stick breaking prior used in PhyloSub, we demonstrate superior mixing and running time using the treeCRP with our new split-merge procedures. We also show that given the same number of samples, TSSB and treeCRP have similar ability to recover the subclonal structure of a tumor…
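
    The Chinese restaurant process underlying both priors can be simulated in a few lines: each point joins an existing cluster with probability proportional to the cluster's size, or opens a new one with weight alpha. This is the generic CRP, not the treeCRP extension; all names are ours.

```python
import random

def crp_assignments(n, alpha, seed=0):
    """Chinese restaurant process: customer i joins an existing cluster with
    probability proportional to its size, or a new one with weight alpha."""
    rng = random.Random(seed)
    counts = []     # counts[c] = number of points currently in cluster c
    labels = []
    for i in range(n):
        weights = counts + [alpha]       # existing clusters, then "new table"
        r = rng.random() * (i + alpha)   # total weight so far is i + alpha
        acc = 0.0
        for c, w in enumerate(weights):
            acc += w
            if r < acc:
                break
        if c == len(counts):
            counts.append(1)             # open a new cluster
        else:
            counts[c] += 1
        labels.append(c)
    return labels, counts

labels, counts = crp_assignments(200, alpha=2.0)
```

    The number of clusters grows only logarithmically with n, which is why CRP-based priors let the number of subclonal lineages be inferred rather than fixed in advance.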

  15. Sparse reconstruction using distribution agnostic bayesian matching pursuit

    KAUST Repository

    Masood, Mudassir; Al-Naffouri, Tareq Y.

    2013-01-01

    A fast matching pursuit method using a Bayesian approach is introduced for sparse signal recovery. This method performs Bayesian estimates of sparse signals even when the signal prior is non-Gaussian or unknown. It is agnostic on signal statistics …

  16. Awakening the BALROG: BAyesian Location Reconstruction Of GRBs

    Science.gov (United States)

    Burgess, J. Michael; Yu, Hoi-Fung; Greiner, Jochen; Mortlock, Daniel J.

    2018-05-01

    The accurate spatial location of gamma-ray bursts (GRBs) is crucial for both accurately characterizing their spectra and follow-up observations by other instruments. The Fermi Gamma-ray Burst Monitor (GBM) has the largest field of view for detecting GRBs as it views the entire unocculted sky, but as a non-imaging instrument it relies on the relative count rates observed in each of its 14 detectors to localize transients. Improving its ability to accurately locate GRBs and other transients is vital to the paradigm of multimessenger astronomy, including the electromagnetic follow-up of gravitational wave signals. Here we present the BAyesian Location Reconstruction Of GRBs (BALROG) method for localizing and characterizing GBM transients. Our approach eliminates the systematics of previous approaches by simultaneously fitting for the location and spectrum of a source. It also correctly incorporates the uncertainties in the location of a transient into the spectral parameters and produces reliable positional uncertainties for both well-localized sources and those for which the GBM data cannot effectively constrain the position. While computationally expensive, BALROG can be implemented to enable quick follow-up of all GBM transient signals. Also, we identify possible response problems that require attention and caution when using standard, public GBM detector response matrices. Finally, we examine the effects of including the uncertainty in location on the spectral parameters of GRB 080916C. We find that spectral parameters change and no extra components are required when these effects are included in contrast to when we use a fixed location. This finding has the potential to alter both the GRB spectral catalogues and the reported spectral composition of some well-known GRBs.

  17. Diffusion archeology for diffusion progression history reconstruction.

    Science.gov (United States)

    Sefer, Emre; Kingsford, Carl

    2016-11-01

    Diffusion through graphs can be used to model many real-world processes, such as the spread of diseases, social network memes, computer viruses, or water contaminants. Often, a real-world diffusion cannot be directly observed while it is occurring - perhaps it is not noticed until some time has passed, continuous monitoring is too costly, or privacy concerns limit data access. This leads to the need to reconstruct how the present state of the diffusion came to be from partial diffusion data. Here, we tackle the problem of reconstructing a diffusion history from one or more snapshots of the diffusion state. This ability can be invaluable to learn when certain computer nodes are infected or which people are the initial disease spreaders to control future diffusions. We formulate this problem over discrete-time SEIRS-type diffusion models in terms of maximum likelihood. We design methods that are based on submodularity and a novel prize-collecting dominating-set vertex cover (PCDSVC) relaxation that can identify likely diffusion steps with some provable performance guarantees. Our methods are the first to be able to reconstruct complete diffusion histories accurately in real and simulated situations. As a special case, they can also identify the initial spreaders better than the existing methods for that problem. Our results for both meme and contaminant diffusion show that the partial diffusion data problem can be overcome with proper modeling and methods, and that hidden temporal characteristics of diffusion can be predicted from limited data.
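
    A discrete-time compartmental model of the kind the reconstruction is formulated over can be stated compactly. This is a plain deterministic SEIR step (no R-to-S return and no graph structure, unlike the SEIRS-on-graphs models of the paper), purely to fix ideas; the parameter values are arbitrary.

```python
def seir_step(s, e, i, r, beta, sigma, gamma):
    """One discrete-time step of a deterministic SEIR model on fractions."""
    new_e = beta * s * i   # S -> E: new exposures
    new_i = sigma * e      # E -> I: incubation ends
    new_r = gamma * i      # I -> R: recovery
    return s - new_e, e + new_e - new_i, i + new_i - new_r, r + new_r

# Start with 1% infected and iterate; the population fractions sum to 1.
state = (0.99, 0.0, 0.01, 0.0)
for _ in range(200):
    state = seir_step(*state, beta=0.4, sigma=0.2, gamma=0.1)
final = state  # after the epidemic has largely run its course
```

    History reconstruction runs this forward model inside a maximum-likelihood search: given snapshots of (s, e, i, r)-like states, it asks which earlier diffusion steps most plausibly produced them.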

  18. Efficient reconstruction of contaminant release history

    Energy Technology Data Exchange (ETDEWEB)

    Alexander, Francis [Los Alamos National Laboratory]; Anghel, Marian [Los Alamos National Laboratory]; Gulbahce, Natali [NON LANL]; Tartakovsky, Daniel [NON LANL]

    2009-01-01

    We present a generalized hybrid Monte Carlo (GHMC) method for fast, statistically optimal reconstruction of release histories of reactive contaminants. The approach is applicable to large-scale, strongly nonlinear systems with parametric uncertainties and data corrupted by measurement errors. The use of discrete adjoint equations facilitates numerical implementation of GHMC, without putting any restrictions on the degree of nonlinearity of the advection-dispersion-reaction equations that are used to describe contaminant transport in the subsurface. To demonstrate the salient features of the proposed algorithm, we identify the spatial extent of a distributed source of contamination from concentration measurements of a reactive solute.
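
    The hybrid (Hamiltonian) Monte Carlo machinery that GHMC generalizes can be sketched for a one-dimensional target. This is plain HMC on a standard normal, not the generalized variant or the adjoint-equation setup of the paper; step size and trajectory length are ours.

```python
import math
import random

def hmc_sample(logp_grad, logp, x0, eps, n_leap, n_samples, seed=0):
    """Minimal hybrid Monte Carlo: leapfrog integration of Hamiltonian
    dynamics followed by a Metropolis accept/reject step."""
    rng = random.Random(seed)
    x, samples = x0, []
    for _ in range(n_samples):
        p = rng.gauss(0.0, 1.0)                   # resample momentum
        x_new, p_new = x, p
        p_new += 0.5 * eps * logp_grad(x_new)     # initial half momentum step
        for _ in range(n_leap):
            x_new += eps * p_new                  # full position step
            p_new += eps * logp_grad(x_new)       # full momentum step
        p_new -= 0.5 * eps * logp_grad(x_new)     # trim back to a half step
        h_old = -logp(x) + 0.5 * p * p            # Hamiltonian before
        h_new = -logp(x_new) + 0.5 * p_new * p_new
        if math.log(rng.random() + 1e-300) < h_old - h_new:
            x = x_new                             # Metropolis accept
        samples.append(x)
    return samples

# Target: standard normal, logp(x) = -x^2 / 2.
draws = hmc_sample(lambda x: -x, lambda x: -0.5 * x * x,
                   x0=3.0, eps=0.3, n_leap=10, n_samples=2000)
mean = sum(draws) / len(draws)
var = sum(d * d for d in draws) / len(draws) - mean * mean
```

    The gradient here is trivial; in the contaminant problem it is exactly what the discrete adjoint equations supply, which is why GHMC scales to large nonlinear transport models.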

  19. Theoretical evaluation of the detectability of random lesions in bayesian emission reconstruction

    International Nuclear Information System (INIS)

    Qi, Jinyi

    2003-01-01

    Detecting cancerous lesions is an important task in positron emission tomography (PET). Bayesian methods based on the maximum a posteriori principle (also called penalized maximum likelihood methods) have been developed to deal with the low signal-to-noise ratio in the emission data. Similar to the filter cut-off frequency in the filtered backprojection method, the prior parameters in Bayesian reconstruction control the resolution and noise trade-off and hence affect the detectability of lesions in reconstructed images. Bayesian reconstructions are difficult to analyze because the resolution and noise properties are nonlinear and object-dependent. Most research has been based on Monte Carlo simulations, which are very time consuming. Building on recent progress in the theoretical analysis of image properties of statistical reconstructions and the development of numerical observers, here we develop a theoretical approach for fast computation of lesion detectability in Bayesian reconstruction. The results can be used to choose the optimum hyperparameter for maximum lesion detectability. New in this work is the use of theoretical expressions that explicitly model the statistical variation of the lesion and background without assuming that the object variation is (locally) stationary. The theoretical results are validated using Monte Carlo simulations. The comparisons show good agreement between the theoretical predictions and the Monte Carlo results.

  20. Sparse reconstruction using distribution agnostic bayesian matching pursuit

    KAUST Repository

    Masood, Mudassir

    2013-11-01

    A fast matching pursuit method using a Bayesian approach is introduced for sparse signal recovery. This method performs Bayesian estimates of sparse signals even when the signal prior is non-Gaussian or unknown. It is agnostic on signal statistics and utilizes a priori statistics of additive noise and the sparsity rate of the signal, which are shown to be easily estimated from data if not available. The method utilizes a greedy approach and order-recursive updates of its metrics to find the most dominant sparse supports to determine the approximate minimum mean-square error (MMSE) estimate of the sparse signal. Simulation results demonstrate the power and robustness of our proposed estimator. © 2013 IEEE.
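
    The greedy, order-recursive support search can be illustrated with a basic matching pursuit that refits by least squares at each step. This is the textbook greedy scheme, not the paper's MMSE metric updates; the toy problem and all names are ours.

```python
import numpy as np

def greedy_pursuit(y, A, k):
    """Greedy matching pursuit: repeatedly pick the column most correlated
    with the residual, then re-fit by least squares on the chosen support."""
    residual, support = y.copy(), []
    for _ in range(k):
        j = int(np.argmax(np.abs(A.T @ residual)))
        if j not in support:
            support.append(j)
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x_hat = np.zeros(A.shape[1])
    x_hat[support] = coef
    return x_hat

rng = np.random.default_rng(3)
A = rng.standard_normal((40, 60))
A /= np.linalg.norm(A, axis=0)          # unit-norm dictionary columns
x = np.zeros(60); x[[5, 17]] = [3.0, -2.0]
x_hat = greedy_pursuit(A @ x, A, k=2)   # noiseless 2-sparse recovery
```

    The paper replaces the least-squares refit with order-recursive updates of Bayesian metrics, so each candidate support is scored by its posterior weight rather than its residual alone.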

  21. BUMPER: the Bayesian User-friendly Model for Palaeo-Environmental Reconstruction

    Science.gov (United States)

    Holden, Phil; Birks, John; Brooks, Steve; Bush, Mark; Hwang, Grace; Matthews-Bird, Frazer; Valencia, Bryan; van Woesik, Robert

    2017-04-01

    We describe the Bayesian User-friendly Model for Palaeo-Environmental Reconstruction (BUMPER), a Bayesian transfer function for inferring past climate and other environmental variables from microfossil assemblages. The principal motivation for a Bayesian approach is that the palaeoenvironment is treated probabilistically, and can be updated as additional data become available. Bayesian approaches therefore provide a reconstruction-specific quantification of the uncertainty in the data and in the model parameters. BUMPER is fully self-calibrating, straightforward to apply, and computationally fast, requiring 2 seconds to build a 100-taxon model from a 100-site training-set on a standard personal computer. We apply the model's probabilistic framework to generate thousands of artificial training-sets under ideal assumptions. We then use these to demonstrate both the general applicability of the model and the sensitivity of reconstructions to the characteristics of the training-set, considering assemblage richness, taxon tolerances, and the number of training sites. We demonstrate general applicability to real data, considering three different organism types (chironomids, diatoms, pollen) and different reconstructed variables. In all of these applications an identically configured model is used, the only change being the input files that provide the training-set environment and taxon-count data.

  22. Automated comparison of Bayesian reconstructions of experimental profiles with physical models

    International Nuclear Information System (INIS)

    Irishkin, Maxim

    2014-01-01

    In this work we developed an expert system that carries out, in an integrated and fully automated way: i) a reconstruction of plasma profiles from the measurements, using Bayesian analysis; ii) a prediction of the reconstructed quantities, according to some models; and iii) an intelligent comparison of the first two steps. This system includes systematic checking of the internal consistency of the reconstructed quantities, enables automated model validation and, if a well-validated model is used, can be applied to help detect interesting new physics in an experiment. The work shows three applications of this quite general system. The expert system can successfully detect failures in the automated plasma reconstruction and provide (on successful reconstruction cases) statistics of agreement of the models with the experimental data, i.e. information on the model validity. (author)

  23. Bayesian reconstruction of gravitational wave bursts using chirplets

    Science.gov (United States)

    Millhouse, Margaret; Cornish, Neil J.; Littenberg, Tyson

    2018-05-01

    The LIGO-Virgo Collaboration uses a variety of techniques to detect and characterize gravitational waves. One approach is to use templates—models for the signals derived from Einstein's equations. Another approach is to extract the signals directly from the coherent response of the detectors in the LIGO-Virgo network. Both approaches played an important role in the first gravitational wave detections. Here we extend the BayesWave analysis algorithm, which reconstructs gravitational wave signals using a collection of continuous wavelets, to use a generalized wavelet family, known as chirplets, that have time-evolving frequency content. Since generic gravitational wave signals have frequency content that evolves in time, a collection of chirplets provides a more compact representation of the signal, resulting in more accurate waveform reconstructions, especially for low signal-to-noise events, and events that occupy a large time-frequency volume.

  24. Intensity-based bayesian framework for image reconstruction from sparse projection data

    International Nuclear Information System (INIS)

    Rashed, E.A.; Kudo, Hiroyuki

    2009-01-01

    This paper presents a Bayesian framework for iterative image reconstruction from projection data measured over a limited number of views. The classical Nyquist sampling rule yields the minimum number of projection views required for accurate reconstruction. However, challenges exist in many medical and industrial imaging applications in which the projection data is undersampled. Classical analytical reconstruction methods such as filtered backprojection (FBP) are not a good choice for use in such cases because the data undersampling in the angular range introduces aliasing and streak artifacts that degrade lesion detectability. In this paper, we propose a Bayesian framework for maximum likelihood-expectation maximization (ML-EM)-based iterative reconstruction methods that incorporates a priori knowledge obtained from expected intensity information. The proposed framework is based on the fact that, in tomographic imaging, it is often possible to expect a set of intensity values of the reconstructed object with relatively high accuracy. The image reconstruction cost function is modified to include the l1-norm distance to the a priori known information. The proposed method has the potential to regularize the solution to reduce artifacts without missing lesions that cannot be expected from the a priori information. Numerical studies showed a significant improvement in image quality and lesion detectability under the condition of highly undersampled projection data. (author)
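
    The ML-EM iteration that the proposed framework modifies has a compact multiplicative form. A minimal sketch on a tiny noiseless system, without the paper's intensity-based l1 prior; the 2-pixel, 3-ray geometry is ours.

```python
import numpy as np

def mlem(y, A, n_iter=50):
    """Plain ML-EM for y ~ Poisson(A x): multiplicative updates that keep
    the image nonnegative. (The paper adds an intensity-based l1 prior.)"""
    x = np.ones(A.shape[1])
    sens = A.sum(axis=0)                     # sensitivity image (column sums)
    for _ in range(n_iter):
        proj = A @ x                         # forward projection
        ratio = y / np.maximum(proj, 1e-12)  # measured / predicted counts
        x *= (A.T @ ratio) / np.maximum(sens, 1e-12)  # backproject and scale
    return x

# Tiny 2-pixel, 3-ray system with noiseless counts.
A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
x_true = np.array([4.0, 2.0])
x_rec = mlem(A @ x_true, A, n_iter=200)
```

    With undersampled, noisy projections this iteration alone produces artifacts; the paper's modified cost function pulls the solution toward the expected intensity values to suppress them.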

  25. Bayesian PET image reconstruction incorporating anato-functional joint entropy

    International Nuclear Information System (INIS)

    Tang Jing; Rahmim, Arman

    2009-01-01

    We developed a maximum a posteriori (MAP) reconstruction method for positron emission tomography (PET) image reconstruction incorporating magnetic resonance (MR) image information, with the joint entropy between the PET and MR image features serving as the regularization constraint. A non-parametric method was used to estimate the joint probability density of the PET and MR images. Using realistically simulated PET and MR human brain phantoms, the quantitative performance of the proposed algorithm was investigated. Incorporation of the anatomic information via this technique, after parameter optimization, was seen to dramatically improve the noise versus bias tradeoff in every region of interest, compared to the result from using conventional MAP reconstruction. In particular, hot lesions in the FDG PET image, which had no anatomical correspondence in the MR image, also had an improved contrast versus noise tradeoff. Corrections were made to figures 3, 4 and 6, and to the second paragraph of section 3.1 on 13 November 2009. The corrected electronic version is identical to the print version.
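
    The joint entropy used as the regularization constraint can be estimated from a joint intensity histogram of the two images. A minimal sketch (the histogram-based estimator and the bin count are our simplification of the paper's non-parametric density estimate):

```python
import numpy as np

def joint_entropy(img_a, img_b, bins=16):
    """Shannon joint entropy (in bits) of two images, estimated from their
    joint intensity histogram."""
    h, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    p = h / h.sum()                      # joint probability per bin pair
    p = p[p > 0]                         # drop empty bins before the log
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(0)
a = rng.random((64, 64))
h_aa = joint_entropy(a, a)                      # aligned: low joint entropy
h_ab = joint_entropy(a, rng.random((64, 64)))   # independent: high joint entropy
```

    Penalizing joint entropy rewards PET images whose intensities co-vary predictably with the MR anatomy, which is how the anatomical prior enters the MAP objective.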

  6. Bayesian reconstruction of seafloor shape from side-scan sonar measurements using a Markov Random Field

    OpenAIRE

    Woock, P.; Pak, Alexey

    2014-01-01

    To explore the seafloor, a side-scan sonar emits a directed acoustic signal and then records the returning (reflected) signal intensity as a function of time. The inversion of that process is not unique: multiple shapes may lead to identical measured responses. In this work, we suggest a Bayesian approach to reconstructing the 3D shape of the seafloor from multiple sonar measurements, inspired by the state-of-the-art methods of inverse raytracing that originated in computer vision. The space ...

  7. Optimization of Bayesian Emission tomographic reconstruction for region of interest quantitation

    International Nuclear Information System (INIS)

    Qi, Jinyi

    2003-01-01

    Region of interest (ROI) quantitation is an important task in emission tomography (e.g., positron emission tomography and single photon emission computed tomography). It is essential for exploring clinical factors such as tumor activity, growth rate, and the efficacy of therapeutic interventions. Bayesian methods based on the maximum a posteriori principle (also called penalized maximum likelihood methods) have been developed for emission image reconstruction to deal with the low signal to noise ratio of the emission data. Similar to the filter cut-off frequency in the filtered backprojection method, the smoothing parameter of the image prior in Bayesian reconstruction controls the resolution and noise trade-off and hence affects ROI quantitation. In this paper we present an approach for choosing the optimum smoothing parameter in Bayesian reconstruction for ROI quantitation. Bayesian reconstructions are difficult to analyze because the resolution and noise properties are nonlinear and object-dependent. Building on the recent progress on deriving approximate expressions for the local impulse response function and the covariance matrix, we derived simplified theoretical expressions for the bias, the variance, and the ensemble mean squared error (EMSE) of the ROI quantitation. One problem in evaluating ROI quantitation is that the truth is often required for calculating the bias. This is overcome by using the ensemble distribution of the activity inside the ROI and computing the average EMSE. The resulting expressions allow fast evaluation of the image quality for different smoothing parameters. The optimum smoothing parameter of the image prior can then be selected to minimize the EMSE
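
Choosing the smoothing parameter that minimizes the EMSE can be illustrated with a Monte Carlo stand-in for the paper's closed-form expressions; the quadratic smoother, the assumed ensemble of smooth signals, and the beta grid are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)
n, sigma, n_ensemble = 64, 0.5, 200
t = np.linspace(0.0, 1.0, n)

# quadratic smoother x_hat = (I + beta * L)^{-1} y with a second-difference penalty
D = np.diff(np.eye(n), n=2, axis=0)
L = D.T @ D

betas = np.array([0.0, 0.1, 1.0, 10.0, 100.0])
emse = np.zeros(len(betas))
for _ in range(n_ensemble):
    # draw a smooth "truth" from an assumed ensemble of low-frequency signals
    truth = np.sin(2.0 * np.pi * rng.uniform(0.5, 2.0) * t)
    y = truth + sigma * rng.normal(size=n)
    for k, b in enumerate(betas):
        x_hat = np.linalg.solve(np.eye(n) + b * L, y)
        emse[k] += np.mean((x_hat - truth) ** 2) / n_ensemble

beta_opt = betas[emse.argmin()]      # smoothing strength minimizing the EMSE
```

Averaging the squared error over the truth ensemble removes the need to know any single truth, mirroring the paper's use of the ensemble distribution inside the ROI.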

  8. Applying Bayesian neural networks to event reconstruction in reactor neutrino experiments

    International Nuclear Information System (INIS)

    Xu Ye; Xu Weiwei; Meng Yixiong; Zhu Kaien; Xu Wei

    2008-01-01

    A toy detector has been designed in this paper to simulate central detectors in reactor neutrino experiments. Electron samples from a Monte-Carlo simulation of the toy detector have been reconstructed with Bayesian neural networks (BNNs) and with the standard algorithm, a maximum likelihood method (MLD), and the results of the two event reconstructions have been compared. Compared to MLD, the uncertainties of the electron vertex are not improved, but the energy resolutions are significantly improved using BNN, and the improvement is more pronounced for high-energy electrons than for low-energy ones

  9. Technical Note: Probabilistically constraining proxy age–depth models within a Bayesian hierarchical reconstruction model

    Directory of Open Access Journals (Sweden)

    J. P. Werner

    2015-03-01

    Reconstructions of the late-Holocene climate rely heavily upon proxies that are assumed to be accurately dated by layer counting, such as measurements of tree rings, ice cores, and varved lake sediments. Considerable advances could be achieved if time-uncertain proxies could be included within these multiproxy reconstructions, and if time uncertainties were recognized and correctly modeled for proxies commonly treated as free of age model errors. Current approaches for accounting for time uncertainty are generally limited to repeating the reconstruction using each one of an ensemble of age models, thereby inflating the final estimated uncertainty – in effect, each possible age model is given equal weighting. Uncertainties can be reduced by exploiting the inferred space–time covariance structure of the climate to re-weight the possible age models. Here, we demonstrate how Bayesian hierarchical climate reconstruction models can be augmented to account for time-uncertain proxies. Critically, although a priori all age models are given equal probability of being correct, the probabilities associated with the age models are formally updated within the Bayesian framework, thereby reducing uncertainties. Numerical experiments show that updating the age model probabilities decreases uncertainty in the resulting reconstructions, as compared with the current de facto standard of sampling over all age models, provided there is sufficient information from other data sources in the spatial region of the time-uncertain proxy. This approach can readily be generalized to non-layer-counted proxies, such as those derived from marine sediments.
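
The core idea of formally updating age-model probabilities can be sketched as likelihood-based re-weighting of an ensemble; the regional signal, noise levels, and candidate age models below are illustrative assumptions, not the paper's full hierarchical model:

```python
import numpy as np

rng = np.random.default_rng(1)
true_ages = np.linspace(0.0, 1000.0, 50)
climate = np.sin(2 * np.pi * true_ages / 500)     # assumed regional climate signal
proxy = climate + 0.1 * rng.normal(size=50)       # proxy recorded on the TRUE age scale

# ensemble of candidate age models: the true one plus two shifted alternatives
age_models = [true_ages, true_ages + 40.0, true_ages - 40.0]

def log_likelihood(ages, proxy, sigma=0.2):
    """Gaussian log-likelihood of the proxy under the regional signal at these ages."""
    expected = np.sin(2 * np.pi * ages / 500)
    return -0.5 * np.sum(((proxy - expected) / sigma) ** 2)

logL = np.array([log_likelihood(a, proxy) for a in age_models])
w = np.exp(logL - logL.max())      # subtract max for numerical stability
weights = w / w.sum()              # posterior weights under a uniform prior
```

Information from the surrounding spatial region (here, the known regional signal) concentrates the posterior weight on the correct age model instead of weighting all members equally.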

  10. Bayesian image reconstruction for emission tomography based on median root prior

    International Nuclear Information System (INIS)

    Alenius, S.

    1997-01-01

    The aim of the present study was to investigate a new type of Bayesian one-step late reconstruction method which utilizes a median root prior (MRP). The method favours images which have locally monotonous radioactivity concentrations. The new reconstruction algorithm was applied to ideal simulated data, phantom data and some patient examinations with PET. The same projection data were reconstructed with filtered back-projection (FBP) and maximum likelihood-expectation maximization (ML-EM) methods for comparison. The MRP method provided good-quality images with a similar resolution to the FBP method with a ramp filter, and at the same time the noise properties were as good as with Hann-filtered FBP images. The typical artefacts seen in FBP reconstructed images outside of the object were completely removed, as was the grainy noise inside the object. Quantitatively, the resulting average regional radioactivity concentrations in a large region of interest in images produced by the MRP method corresponded to the FBP and ML-EM results, but at the pixel-by-pixel level the MRP method proved to be the most accurate of the tested methods. In contrast to other iterative reconstruction methods, e.g. ML-EM, the MRP method was not sensitive to the number of iterations nor to the adjustment of reconstruction parameters. Only the Bayesian parameter β had to be set. The proposed MRP method is much simpler to calculate than the methods described previously, both with regard to the parameter settings and in terms of general use. The new MRP reconstruction method was shown to produce high-quality quantitative emission images with only one parameter setting in addition to the number of iterations. (orig.)
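
The one-step-late MRP update can be sketched in a few lines; the 1D identity system matrix below reduces the problem to a denoising-style illustration and is an assumption, not the paper's PET geometry:

```python
import numpy as np

def median3(x):
    """Median over each pixel's 3-neighborhood (edges clamped)."""
    padded = np.pad(x, 1, mode='edge')
    return np.median(np.stack([padded[:-2], padded[1:-1], padded[2:]]), axis=0)

def mrp_osl(A, y, beta=0.3, n_iter=50):
    """One-step-late ML-EM with a median root prior (1D sketch)."""
    x = np.ones(A.shape[1])
    sens = A.sum(axis=0)                         # sensitivity image, A^T 1
    for _ in range(n_iter):
        med = np.maximum(median3(x), 1e-12)
        prior_grad = beta * (x - med) / med      # MRP term: zero where x is
                                                 # locally monotonous
        ratio = y / np.maximum(A @ x, 1e-12)
        x = x / (sens + prior_grad) * (A.T @ ratio)
    return x

# identity system matrix: a denoising-style special case for illustration
A = np.eye(16)
x_true = np.linspace(1.0, 5.0, 16)               # locally monotonous ramp
x_hat = mrp_osl(A, x_true.copy())                # noiseless "projections" y = x_true
```

Because the median of a locally monotonous neighborhood equals the center pixel, the prior term vanishes on such images and the update leaves them unchanged, which is exactly the stated preference of the MRP.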

  11. A Bayesian nonparametric approach to reconstruction and prediction of random dynamical systems

    Science.gov (United States)

    Merkatas, Christos; Kaloudis, Konstantinos; Hatjispyros, Spyridon J.

    2017-06-01

    We propose a Bayesian nonparametric mixture model, based on Markov Chain Monte Carlo methods, for the reconstruction and prediction of discretized stochastic dynamical systems from observed time series data. Our results can be used by researchers in physical modeling interested in fast and accurate estimation of low-dimensional stochastic models when the observed time series is short and the noise process is (perhaps) non-Gaussian. The inference procedure is demonstrated specifically for polynomial maps of arbitrary degree, with a Geometric Stick Breaking mixture process prior over the space of densities applied to the additive errors. Our method is parsimonious compared to Bayesian nonparametric techniques based on Dirichlet process mixtures, as well as flexible and general. Simulations based on synthetic time series are presented.

  13. Fast gradient-based methods for Bayesian reconstruction of transmission and emission PET images

    International Nuclear Information System (INIS)

    Mumcuglu, E.U.; Leahy, R.; Zhou, Z.; Cherry, S.R.

    1994-01-01

    The authors describe conjugate gradient algorithms for reconstruction of transmission and emission PET images. The reconstructions are based on a Bayesian formulation, where the data are modeled as a collection of independent Poisson random variables and the image is modeled using a Markov random field. A conjugate gradient algorithm is used to compute a maximum a posteriori (MAP) estimate of the image by maximizing over the posterior density. To ensure nonnegativity of the solution, a penalty function is used to convert the problem to one of unconstrained optimization. Preconditioners are used to enhance convergence rates. These methods generally achieve effective convergence in 15-25 iterations. Reconstructions are presented of an ¹⁸F-FDG whole-body scan from data collected using a Siemens/CTI ECAT931 whole-body system. These results indicate significant improvements in emission image quality using the Bayesian approach, in comparison to filtered backprojection, particularly when reprojections of the MAP transmission image are used in place of the standard attenuation correction factors
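
A minimal sketch of gradient-based MAP estimation for Poisson data with a smoothness prior follows. A log-parameterization enforces nonnegativity (a stand-in for the authors' penalty-function approach), a general-purpose CG routine replaces their preconditioned implementation, and the tiny system matrix is an illustrative assumption:

```python
import numpy as np
from scipy.optimize import minimize

# toy 2-pixel, 3-detector system (illustrative, not a real PET geometry)
A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
x_true = np.array([2.0, 5.0])
y = A @ x_true                      # noiseless Poisson means for illustration
beta = 1e-3                         # weight of the quadratic (MRF-like) prior

def neg_log_posterior(z):
    x = np.exp(z)                   # log-parameterization keeps the image positive
    ax = A @ x
    nll = np.sum(ax - y * np.log(ax))          # Poisson negative log-likelihood
    return nll + 0.5 * beta * np.sum(np.diff(x) ** 2)

def gradient(z):
    x = np.exp(z)
    g = A.T @ (1.0 - y / (A @ x))   # likelihood gradient w.r.t. x
    d = np.diff(x)
    g[:-1] -= beta * d              # adjoint of the finite-difference penalty
    g[1:] += beta * d
    return g * x                    # chain rule for x = exp(z)

res = minimize(neg_log_posterior, np.zeros(2), jac=gradient, method='CG')
x_map = np.exp(res.x)               # MAP image, close to x_true for small beta
```

Supplying the analytic gradient is what makes conjugate-gradient MAP estimation practical; preconditioning, as in the paper, would further accelerate convergence on realistic image sizes.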

  14. Diffusion archeology for diffusion progression history reconstruction

    OpenAIRE

    Sefer, Emre; Kingsford, Carl

    2015-01-01

    Diffusion through graphs can be used to model many real-world processes, such as the spread of diseases, social network memes, computer viruses, or water contaminants. Often, a real-world diffusion cannot be directly observed while it is occurring — perhaps it is not noticed until some time has passed, continuous monitoring is too costly, or privacy concerns limit data access. This leads to the need to reconstruct how the present state of the diffusion came to be from partial d...

  15. Bayesian Multi-Energy Computed Tomography reconstruction approaches based on decomposition models

    International Nuclear Information System (INIS)

    Cai, Caifang

    2013-01-01

    Multi-Energy Computed Tomography (MECT) makes it possible to obtain multiple fractions of basis materials without segmentation. In medical applications, one is the soft-tissue-equivalent water fraction and the other is the hard-matter-equivalent bone fraction. Practical MECT measurements are usually obtained with polychromatic X-ray beams. Existing reconstruction approaches based on linear forward models that do not account for beam polychromaticity fail to estimate the correct decomposition fractions and result in Beam-Hardening Artifacts (BHA). Existing BHA correction approaches either require calibration measurements or suffer from the noise amplification caused by the negative-log pre-processing and from the water/bone separation problem. To overcome these problems, statistical DECT reconstruction approaches based on non-linear forward models that account for beam polychromaticity show great potential for producing accurate fraction images. This work proposes a full-spectral Bayesian reconstruction approach that allows the reconstruction of high-quality fraction images from ordinary polychromatic measurements. The approach is based on a Gaussian noise model with unknown variance assigned directly to the projections, without taking the negative log. Following Bayesian inference, the decomposition fractions and observation variance are estimated using the joint Maximum A Posteriori (MAP) estimation method. Subject to an adaptive prior model assigned to the variance, the joint estimation problem is simplified into a single estimation problem: the joint MAP estimation becomes a minimization problem with a non-quadratic cost function. To solve it, the use of a monotone Conjugate Gradient (CG) algorithm with suboptimal descent steps is proposed. The performance of the proposed approach is analyzed with both simulated and experimental data. The results show that the proposed Bayesian approach is robust to noise and materials. It is also

  16. Bayesian hierarchical models for regional climate reconstructions of the last glacial maximum

    Science.gov (United States)

    Weitzel, Nils; Hense, Andreas; Ohlwein, Christian

    2017-04-01

    Spatio-temporal reconstructions of past climate are important for understanding the long-term behavior of the climate system and its sensitivity to forcing changes. Unfortunately, such reconstructions are subject to large uncertainties, have to deal with a complex proxy-climate relationship, and require a physically reasonable interpolation between the sparse proxy observations. Bayesian Hierarchical Models (BHMs) are a class of statistical models well suited for spatio-temporal reconstructions of past climate because they permit the inclusion of multiple sources of information (e.g. records from different proxy types, uncertain age information, output from climate simulations) and quantify uncertainties in a statistically rigorous way. BHMs in paleoclimatology typically consist of three stages which are modeled individually and combined using Bayesian inference techniques. The data stage models the proxy-climate relation (often called a transfer function), the process stage models the spatio-temporal distribution of the climate variables of interest, and the prior stage consists of prior distributions of the model parameters. For our BHMs, we translate well-known proxy-climate transfer functions for pollen into a Bayesian framework. In addition, we can include Gaussian-distributed local climate information from preprocessed proxy records. The process stage combines physically reasonable spatial structures from prior distributions with proxy records, which leads to a multivariate posterior probability distribution for the reconstructed climate variables. The prior distributions that constrain the possible spatial structure of the climate variables are calculated from climate simulation output. We present results from pseudoproxy tests as well as new regional reconstructions of temperatures for the last glacial maximum (LGM, ˜ 21,000 years BP). These reconstructions combine proxy data syntheses with information from climate simulations for the LGM that were

  17. Bayesian model selection of template forward models for EEG source reconstruction.

    Science.gov (United States)

    Strobbe, Gregor; van Mierlo, Pieter; De Vos, Maarten; Mijović, Bogdan; Hallez, Hans; Van Huffel, Sabine; López, José David; Vandenberghe, Stefaan

    2014-06-01

    Several EEG source reconstruction techniques have been proposed to identify the generating neuronal sources of electrical activity measured on the scalp. The solution of these techniques depends directly on the accuracy of the forward model that is inverted. Recently, a parametric empirical Bayesian (PEB) framework for distributed source reconstruction in EEG/MEG was introduced and implemented in the Statistical Parametric Mapping (SPM) software. The framework allows us to compare different forward modeling approaches, using real data, instead of using more traditional simulated data from an assumed true forward model. In the absence of a subject specific MR image, a 3-layered boundary element method (BEM) template head model is currently used including a scalp, skull and brain compartment. In this study, we introduced volumetric template head models based on the finite difference method (FDM). We constructed a FDM head model equivalent to the BEM model and an extended FDM model including CSF. These models were compared within the context of three different types of source priors related to the type of inversion used in the PEB framework: independent and identically distributed (IID) sources, equivalent to classical minimum norm approaches, coherence (COH) priors similar to methods such as LORETA, and multiple sparse priors (MSP). The resulting models were compared based on ERP data of 20 subjects using Bayesian model selection for group studies. The reconstructed activity was also compared with the findings of previous studies using functional magnetic resonance imaging. We found very strong evidence in favor of the extended FDM head model with CSF and assuming MSP. These results suggest that the use of realistic volumetric forward models can improve PEB EEG source reconstruction. Copyright © 2014 Elsevier Inc. All rights reserved.

  18. Accurate reconstruction of insertion-deletion histories by statistical phylogenetics.

    Directory of Open Access Journals (Sweden)

    Oscar Westesson

    The Multiple Sequence Alignment (MSA) is a computational abstraction that represents a partial summary either of indel history or of structural similarity. Taking the former view (indel history), it is possible to use formal automata theory to generalize the phylogenetic likelihood framework for finite substitution models (Dayhoff's probability matrices and Felsenstein's pruning algorithm) to arbitrary-length sequences. In this paper, we report results of a simulation-based benchmark of several methods for reconstruction of indel history. The methods tested include a relatively new algorithm for statistical marginalization of MSAs that sums over a stochastically sampled ensemble of the most probable evolutionary histories. For mammalian evolutionary parameters on several different trees, the single most likely history sampled by our algorithm appears less biased than histories reconstructed by other MSA methods. The algorithm can also be used for alignment-free inference, where the MSA is explicitly summed out of the analysis. As an illustration of our method, we discuss reconstruction of the evolutionary histories of human protein-coding genes.

  19. Reconstructing galaxy histories from globular clusters.

    Science.gov (United States)

    West, Michael J; Côté, Patrick; Marzke, Ronald O; Jordán, Andrés

    2004-01-01

    Nearly a century after the true nature of galaxies as distant 'island universes' was established, their origin and evolution remain great unsolved problems of modern astrophysics. One of the most promising ways to investigate galaxy formation is to study the ubiquitous globular star clusters that surround most galaxies. Globular clusters are compact groups of up to a few million stars. They generally formed early in the history of the Universe, but have survived the interactions and mergers that alter substantially their parent galaxies. Recent advances in our understanding of the globular cluster systems of the Milky Way and other galaxies point to a complex picture of galaxy genesis driven by cannibalism, collisions, bursts of star formation and other tumultuous events.

  20. Reconstructing the genetic history of late Neanderthals.

    Science.gov (United States)

    Hajdinjak, Mateja; Fu, Qiaomei; Hübner, Alexander; Petr, Martin; Mafessoni, Fabrizio; Grote, Steffi; Skoglund, Pontus; Narasimham, Vagheesh; Rougier, Hélène; Crevecoeur, Isabelle; Semal, Patrick; Soressi, Marie; Talamo, Sahra; Hublin, Jean-Jacques; Gušić, Ivan; Kućan, Željko; Rudan, Pavao; Golovanova, Liubov V; Doronichev, Vladimir B; Posth, Cosimo; Krause, Johannes; Korlević, Petra; Nagel, Sarah; Nickel, Birgit; Slatkin, Montgomery; Patterson, Nick; Reich, David; Prüfer, Kay; Meyer, Matthias; Pääbo, Svante; Kelso, Janet

    2018-03-29

    Although it has previously been shown that Neanderthals contributed DNA to modern humans, not much is known about the genetic diversity of Neanderthals or the relationship between late Neanderthal populations at the time at which their last interactions with early modern humans occurred and before they eventually disappeared. Our ability to retrieve DNA from a larger number of Neanderthal individuals has been limited by poor preservation of endogenous DNA and contamination of Neanderthal skeletal remains by large amounts of microbial and present-day human DNA. Here we use hypochlorite treatment of as little as 9 mg of bone or tooth powder to generate between 1- and 2.7-fold genomic coverage of five Neanderthals who lived around 39,000 to 47,000 years ago (that is, late Neanderthals), thereby doubling the number of Neanderthals for which genome sequences are available. Genetic similarity among late Neanderthals is well predicted by their geographical location, and comparison to the genome of an older Neanderthal from the Caucasus indicates that a population turnover is likely to have occurred, either in the Caucasus or throughout Europe, towards the end of Neanderthal history. We find that the bulk of Neanderthal gene flow into early modern humans originated from one or more source populations that diverged from the Neanderthals that were studied here at least 70,000 years ago, but after they split from a previously sequenced Neanderthal from Siberia around 150,000 years ago. Although four of the Neanderthals studied here post-date the putative arrival of early modern humans into Europe, we do not detect any recent gene flow from early modern humans in their ancestry.

  1. Bayesian Estimator for Angle Recovery: Event Classification and Reconstruction in Positron Emission Tomography

    International Nuclear Information System (INIS)

    Foudray, Angela M K; Levin, Craig S

    2007-01-01

    PET at the highest level is an inverse problem: reconstruct the location of the emissions (which localize biological function) from detected photons. Ideally, one would like to directly measure an annihilation photon's incident direction on the detector. In the developed algorithm, Bayesian Estimation for Angle Recovery (BEAR), we utilized the increased information gathered from localizing photon interactions in the detector and developed a Bayesian estimator for a photon's incident direction. Probability distribution functions (PDFs) were filled using an interaction-energy-weighted mean, or center of mass (COM), reference space, which had the following computational advantages: (1) a significant reduction in the size of the data in measurement space, making further manipulation and searches faster; (2) the construction of COM space does not depend on measurement location, it takes advantage of measurement symmetries, and data can be added to the training set without knowledge and recalculation of prior training data; (3) calculation of the posterior probability map is fully parallelizable and can scale to any number of processors. These PDFs were used to estimate the point spread function (PSF) in incident-angle space (i) for algorithm assessment and (ii) to provide probability selection criteria for classification. The algorithm estimates both the incident θ and φ angles, with ∼16 degrees RMS in both angles, limiting the incoming direction to a narrow cone. Feature size did not improve using the BEAR algorithm as an angle filter, but the contrast ratio improved 40% on average

  2. Bayesian image reconstruction in SPECT using higher order mechanical models as priors

    International Nuclear Information System (INIS)

    Lee, S.J.; Gindi, G.; Rangarajan, A.

    1995-01-01

    While the ML-EM (maximum-likelihood-expectation maximization) algorithm for emission tomography reconstruction is unstable due to the ill-posed nature of the problem, Bayesian reconstruction methods overcome this instability by introducing prior information, often in the form of a spatial smoothness regularizer. More elaborate forms of smoothness constraints may be used to extend the role of the prior beyond that of a stabilizer in order to capture actual spatial information about the object. Previously proposed forms of such prior distributions were based on the assumption of a piecewise constant source distribution. Here, the authors propose an extension to a piecewise linear model--the weak plate--which is more expressive than the piecewise constant model. The weak plate prior not only preserves edges but also allows for piecewise ramplike regions in the reconstruction. Indeed, for the application in SPECT, such ramplike regions are observed in ground-truth source distributions in the form of primate autoradiographs of rCBF radionuclides. To incorporate the weak plate prior in a MAP approach, the authors model the prior as a Gibbs distribution and use a GEM formulation for the optimization. They compare the quantitative performance of the ML-EM algorithm, a GEM algorithm with a prior favoring piecewise constant regions, and a GEM algorithm with the weak plate prior. Pointwise and regional bias and variance of ensemble image reconstructions are used as indications of image quality. The results show that the weak plate and membrane priors exhibit improved bias and variance relative to ML-EM techniques

  3. Bayesian Models for Streamflow and River Network Reconstruction using Tree Rings

    Science.gov (United States)

    Ravindranath, A.; Devineni, N.

    2016-12-01

    Water systems face non-stationary, dynamically shifting risks due to shifting societal conditions and systematic long-term variations in climate manifesting as quasi-periodic behavior on multi-decadal time scales. Water systems are thus vulnerable to long periods of wet or dry hydroclimatic conditions. Streamflow is a major component of water systems and a primary means by which water is transported to serve ecosystems' and human needs. Thus, our concern is in understanding streamflow variability. Climate variability and impacts on water resources are crucial factors affecting streamflow, and multi-scale variability increases risk to water sustainability and systems. Dam operations are necessary for collecting water brought by streamflow while maintaining downstream ecological health. Rules governing dam operations are based on streamflow records that are woefully short compared to periods of systematic variation present in the climatic factors driving streamflow variability and non-stationarity. We use hierarchical Bayesian regression methods in order to reconstruct paleo-streamflow records for dams within a basin using paleoclimate proxies (e.g. tree rings) to guide the reconstructions. The riverine flow network for the entire basin is subsequently modeled hierarchically using feeder stream and tributary flows. This is a starting point in analyzing streamflow variability and risks to water systems, and developing a scientifically-informed dynamic risk management framework for formulating dam operations and water policies to best hedge such risks. We will apply this work to the Missouri and Delaware River Basins (DRB). Preliminary results of streamflow reconstructions for eight dams in the upper DRB using standard Gaussian regression with regional tree ring chronologies give streamflow records that now span two to two and a half centuries, and modestly smoothed versions of these reconstructed flows indicate physically-justifiable trends in the time series.
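
The calibrate-then-hindcast step behind such reconstructions can be sketched with plain least squares (the standard Gaussian regression mentioned for the preliminary results, not the full hierarchical Bayesian model); the synthetic chronologies and coefficients below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)
n_proxy_years, n_obs_years = 250, 60   # long chronologies, short gauge record

rings = rng.normal(size=(n_proxy_years, 3))          # 3 tree-ring chronologies
coef_true = np.array([0.8, -0.3, 0.5])               # assumed proxy-flow relation
flow = rings @ coef_true + 0.2 * rng.normal(size=n_proxy_years)

# calibrate on the recent overlap period only (the instrumental era)
X = np.column_stack([np.ones(n_obs_years), rings[-n_obs_years:]])
coef, *_ = np.linalg.lstsq(X, flow[-n_obs_years:], rcond=None)

# hindcast streamflow over the full pre-instrumental period
X_full = np.column_stack([np.ones(n_proxy_years), rings])
flow_recon = X_full @ coef
```

A hierarchical Bayesian version would additionally share information across gauges in the river network and propagate coefficient uncertainty into the reconstructed flows.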

  4. Bayesian 3D X-ray computed tomography image reconstruction with a scaled Gaussian mixture prior model

    International Nuclear Information System (INIS)

    Wang, Li; Gac, Nicolas; Mohammad-Djafari, Ali

    2015-01-01

    In order to improve the quality of 3D X-ray tomography reconstruction for Non-Destructive Testing (NDT), we investigate in this paper hierarchical Bayesian methods. In NDT, useful prior information on the volume, such as the limited number of materials or the presence of homogeneous areas, can be included in the iterative reconstruction algorithms. In hierarchical Bayesian methods, not only is the volume estimated, thanks to the prior model of the volume, but also the hyperparameters of this prior. This additional complexity of the reconstruction methods, when applied to large volumes (from 512³ to 8192³ voxels), results in an increasing computational cost. To reduce it, the hierarchical Bayesian methods investigated in this paper rely on algorithm acceleration by Variational Bayesian Approximation (VBA) [1] and on hardware acceleration with projection and back-projection operators parallelized on many-core processors such as GPUs [2]. In this paper, we consider a Student-t prior on the gradient of the image implemented in a hierarchical way [3, 4, 1]. The operators H (forward, or projection) and H^t (adjoint, or back-projection) implemented on multi-GPU [2] have been used in this study. Different methods are evaluated on the synthetic 'Shepp-Logan' volume in terms of reconstruction quality and time. We used several simple regularizations of order 1 and order 2. Other prior models also exist [5]. Sometimes, for a discrete image, segmentation and reconstruction can be done at the same time; then the reconstruction can be done with fewer projections

  5. Algorithmic procedures for Bayesian MEG/EEG source reconstruction in SPM.

    Science.gov (United States)

    López, J D; Litvak, V; Espinosa, J J; Friston, K; Barnes, G R

    2014-01-01

    The MEG/EEG inverse problem is ill-posed, giving different source reconstructions depending on the initial assumption sets. Parametric Empirical Bayes allows one to implement most popular MEG/EEG inversion schemes (Minimum Norm, LORETA, etc.) within the same generic Bayesian framework. It also provides a cost-function in terms of the variational Free energy-an approximation to the marginal likelihood or evidence of the solution. In this manuscript, we revisit the algorithm for MEG/EEG source reconstruction with a view to providing a didactic and practical guide. The aim is to promote and help standardise the development and consolidation of other schemes within the same framework. We describe the implementation in the Statistical Parametric Mapping (SPM) software package, carefully explaining each of its stages with the help of a simple simulated data example. We focus on the Multiple Sparse Priors (MSP) model, which we compare with the well-known Minimum Norm and LORETA models, using the negative variational Free energy for model comparison. The manuscript is accompanied by Matlab scripts to allow the reader to test and explore the underlying algorithm. © 2013. Published by Elsevier Inc. All rights reserved.
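
Under uniform model priors, comparing inversion schemes by their variational free energies reduces to a softmax over the per-model free energies; the numerical values below are hypothetical:

```python
import numpy as np

def model_posteriors(free_energies):
    """Posterior model probabilities from variational free energies,
    assuming uniform model priors: p_i ∝ exp(F_i)."""
    f = np.asarray(free_energies, dtype=float)
    w = np.exp(f - f.max())          # subtract the max for numerical stability
    return w / w.sum()

# hypothetical free energies for, say, MSP, Minimum Norm and LORETA inversions
p = model_posteriors([-3210.4, -3225.9, -3224.1])
```

Because the free energy approximates the log model evidence, a difference of a few nats already concentrates essentially all posterior probability on the winning model.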

  7. Crossing statistic: reconstructing the expansion history of the universe

    International Nuclear Information System (INIS)

    Shafieloo, Arman

    2012-01-01

    We show that by combining the Crossing Statistic [1,2] with the Smoothing method [3-5], one can reconstruct the expansion history of the universe with very high precision without assuming any prior on cosmological quantities such as the equation of state of dark energy. The method performs very well independent of the underlying model, even for non-trivial dark energy models with fast or slow changes in the equation of state. The accuracy of the reconstructed quantities, together with the method's independence from any prior or assumption, gives it advantages over the non-parametric methods previously proposed in the literature. Applying the method to the Union 2.1 supernovae combined with WiggleZ BAO data, we present the reconstructed results and test the consistency of the two data sets in a model-independent manner. The results show that the latest available supernovae and BAO data are in good agreement with each other, and that the spatially flat ΛCDM model is in concordance with the current data.

  8. Applications of Bayesian temperature profile reconstruction to automated comparison with heat transport models and uncertainty quantification of current diffusion

    International Nuclear Information System (INIS)

    Irishkin, M.; Imbeaux, F.; Aniel, T.; Artaud, J.F.

    2015-01-01

    Highlights: • We developed a method for automated comparison of experimental data with models. • A unique platform implements Bayesian analysis and integrated modelling tools. • The method is tokamak-generic and is applied to Tore Supra and JET pulses. • Validation of a heat transport model is carried out. • We quantified the uncertainties due to Te profiles in current diffusion simulations. - Abstract: In the context of present and future long pulse tokamak experiments yielding a growing size of measured data per pulse, automating data consistency analysis and comparisons of measurements with models is a critical matter. To address these issues, the present work describes an expert system that carries out, in an integrated and fully automated way, (i) a reconstruction of plasma profiles from the measurements using Bayesian analysis, (ii) a prediction of the reconstructed quantities according to some models, and (iii) a comparison of the first two steps. The first application shown is devoted to the development of an automated comparison method between the experimental plasma profiles reconstructed using Bayesian methods and time dependent solutions of the transport equations. The method was applied to model validation of a simple heat transport model with three radial shape options. It has been tested on a database of 21 Tore Supra and 14 JET shots. The second application aims at quantifying uncertainties due to the electron temperature profile in current diffusion simulations. A systematic reconstruction of the Ne, Te, Ti profiles was first carried out for all time slices of the pulse. The Bayesian 95% highest probability intervals on the Te profile reconstruction were then used for (i) data consistency check of the flux consumption and (ii) defining a confidence interval for the current profile simulation. The method has been applied to one Tore Supra pulse and one JET pulse.

  9. A Bayesian approach to PET reconstruction using image-modeling Gibbs priors: Implementation and comparison

    International Nuclear Information System (INIS)

    Chan, M.T.; Herman, G.T.; Levitan, E.

    1996-01-01

    We demonstrate that (i) classical methods of image reconstruction from projections can be improved upon by considering the output of such a method as a distorted version of the original image and applying a Bayesian approach to estimate from it the original image (based on a model of distortion and on a Gibbs distribution as the prior) and (ii) by selecting an "image-modeling" prior distribution (i.e., one which is such that it is likely that a random sample from it shares important characteristics of the images of the application area) one can improve over another Gibbs prior formulated using only pairwise interactions. We illustrate our approach using simulated Positron Emission Tomography (PET) data from realistic brain phantoms. Since algorithm performance ultimately depends on the diagnostic task being performed, we examine a number of different medically relevant figures of merit to give a fair comparison. Based on a training-and-testing evaluation strategy, we demonstrate that statistically significant improvements can be obtained using the proposed approach.
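
    The MAP estimation with a Gibbs prior described in this record can be sketched generically. The code below is an illustrative toy (a 1-D image, a pairwise smoothing prior, projected gradient ascent), not the authors' implementation; all names and parameter values are assumptions.

```python
import numpy as np

def map_reconstruct(y, A, beta=0.01, iters=500, lr=1e-3):
    """Toy MAP reconstruction for y ~ Poisson(A @ x) with a pairwise
    (Gibbs-style) smoothing log-prior  -beta * sum_i (x[i] - x[i+1])**2
    on a 1-D image x. Illustrative sketch, not the paper's algorithm."""
    # flat nonnegative starting image matched to the data scale
    x = np.full(A.shape[1], max(y.mean(), 1e-6) / max(A.sum(axis=1).mean(), 1e-9))
    for _ in range(iters):
        Ax = A @ x + 1e-9
        grad_ll = A.T @ (y / Ax - 1.0)                        # Poisson log-likelihood gradient
        grad_prior = -2.0 * beta * np.convolve(x, [-1, 2, -1], mode="same")
        x = np.maximum(x + lr * (grad_ll + grad_prior), 0.0)  # keep intensities nonnegative
    return x
```

    A pairwise prior of this form only penalizes differences between neighboring pixels; the "image-modeling" priors compared in the record are richer, but the MAP machinery is the same.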

  10. Bayesian reconstruction of photon interaction sequences for high-resolution PET detectors

    Energy Technology Data Exchange (ETDEWEB)

    Pratx, Guillem; Levin, Craig S [Molecular Imaging Program at Stanford, Department of Radiology, Stanford, CA (United States)], E-mail: cslevin@stanford.edu

    2009-09-07

    Realizing the full potential of high-resolution positron emission tomography (PET) systems involves accurately positioning events in which the annihilation photon deposits all its energy across multiple detector elements. Reconstructing the complete sequence of interactions of each photon provides a reliable way to select the earliest interaction because it ensures that all the interactions are consistent with one another. Bayesian estimation forms a natural framework to maximize the consistency of the sequence with the measurements while taking into account the physics of γ-ray transport. An inherently statistical method, it accounts for the uncertainty in the measured energy and position of each interaction. An algorithm based on maximum a posteriori (MAP) estimation was evaluated in computer simulations. For a high-resolution PET system based on cadmium zinc telluride detectors, 93.8% of the recorded coincidences involved at least one photon multiple-interactions event (PMIE). The MAP estimate of the first interaction was accurate for 85.2% of the single photons. This represents a two-fold reduction in the number of mispositioned events compared to minimum pair distance, a simpler yet efficient positioning method. The point-spread function of the system presented lower tails and higher peak value when MAP was used. This translated into improved image quality, which we quantified by studying contrast and spatial resolution gains.

  11. Reconstructing the massive black hole cosmic history through gravitational waves

    International Nuclear Information System (INIS)

    Sesana, Alberto; Gair, Jonathan; Berti, Emanuele; Volonteri, Marta

    2011-01-01

    The massive black holes we observe in galaxies today are the natural end-product of a complex evolutionary path, in which black holes seeded in proto-galaxies at high redshift grow through cosmic history via a sequence of mergers and accretion episodes. Electromagnetic observations probe a small subset of the population of massive black holes (namely, those that are active or those that are very close to us), but planned space-based gravitational wave observatories such as the Laser Interferometer Space Antenna (LISA) can measure the parameters of 'electromagnetically invisible' massive black holes out to high redshift. In this paper we introduce a Bayesian framework to analyze the information that can be gathered from a set of such measurements. Our goal is to connect a set of massive black hole binary merger observations to the underlying model of massive black hole formation. In other words, given a set of observed massive black hole coalescences, we assess what information can be extracted about the underlying massive black hole population model. For concreteness we consider ten specific models of massive black hole formation, chosen to probe four important (and largely unconstrained) aspects of the input physics used in structure formation simulations: seed formation, metallicity "feedback", accretion efficiency and accretion geometry. For the first time we allow for the possibility of 'model mixing', by drawing the observed population from some combination of the 'pure' models that have been simulated. A Bayesian analysis allows us to recover a posterior probability distribution for the "mixing parameters" that characterize the fractions of each model represented in the observed distribution. Our work shows that LISA has enormous potential to probe the underlying physics of structure formation.
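
    In its simplest form, the 'model mixing' inference described in this record reduces to estimating the weight of one pure model in a mixture likelihood. The sketch below uses invented toy densities with a flat prior on the fraction, not the authors' black hole population models:

```python
import numpy as np

def mixture_fraction_posterior(events, pdf_a, pdf_b, grid_size=999):
    """Grid posterior over the fraction f of observed events drawn from
    pure model A, given each event's density under two candidate models.
    Flat prior on f; an illustrative stand-in for full model mixing."""
    grid = np.linspace(0.001, 0.999, grid_size)
    la, lb = pdf_a(events), pdf_b(events)
    loglike = np.array([np.log(f * la + (1.0 - f) * lb).sum() for f in grid])
    w = np.exp(loglike - loglike.max())   # subtract max for numerical safety
    return grid, w / w.sum()
```

    With well-separated pure models the posterior concentrates tightly on the true fraction; overlapping models broaden it, which is exactly the uncertainty a Bayesian treatment of a merger catalog must quantify.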

  12. The confounding effect of population structure on bayesian skyline plot inferences of demographic history

    DEFF Research Database (Denmark)

    Heller, Rasmus; Chikhi, Lounes; Siegismund, Hans

    2013-01-01

    Many coalescent-based methods aiming to infer the demographic history of populations assume a single, isolated and panmictic population (i.e. a Wright-Fisher model). While this assumption may be reasonable under many conditions, several recent studies have shown that the results can be misleading...... when it is violated. Among the most widely applied demographic inference methods are Bayesian skyline plots (BSPs), which are used across a range of biological fields. Violations of the panmixia assumption are to be expected in many biological systems, but the consequences for skyline plot inferences...... the best scheme for inferring demographic change over a typical time scale. Analyses of data from a structured African buffalo population demonstrate how BSP results can be strengthened by simulations. We recommend that sample selection should be carefully considered in relation to population structure...

  13. A full-spectral Bayesian reconstruction approach based on the material decomposition model applied in dual-energy computed tomography

    International Nuclear Information System (INIS)

    Cai, C.; Rodet, T.; Mohammad-Djafari, A.; Legoupil, S.

    2013-01-01

    Purpose: Dual-energy computed tomography (DECT) makes it possible to obtain two fractions of basis materials without segmentation. One is the soft-tissue equivalent water fraction and the other is the hard-matter equivalent bone fraction. Practical DECT measurements are usually obtained with polychromatic x-ray beams. Existing reconstruction approaches based on linear forward models that do not account for the beam polychromaticity fail to estimate the correct decomposition fractions and result in beam-hardening artifacts (BHA). The existing BHA correction approaches either need to refer to calibration measurements or suffer from the noise amplification caused by the negative-log preprocessing and the ill-conditioned water and bone separation problem. To overcome these problems, statistical DECT reconstruction approaches based on nonlinear forward models that account for the beam polychromaticity show great potential for giving accurate fraction images. Methods: This work proposes a full-spectral Bayesian reconstruction approach which allows the reconstruction of high quality fraction images from ordinary polychromatic measurements. This approach is based on a Gaussian noise model with unknown variance assigned directly to the projections without taking the negative log. Within the Bayesian inference framework, the decomposition fractions and observation variance are estimated using the joint maximum a posteriori (MAP) estimation method. Subject to an adaptive prior model assigned to the variance, the joint estimation problem is then simplified into a single estimation problem. It transforms the joint MAP estimation problem into a minimization problem with a nonquadratic cost function. To solve it, the use of a monotone conjugate gradient algorithm with suboptimal descent steps is proposed. Results: The performance of the proposed approach is analyzed with both simulated and experimental data. The results show that the proposed Bayesian approach is robust to noise and materials. It is also
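
    The final optimization step mentioned in this record (a conjugate gradient method with suboptimal descent steps for a nonquadratic cost) can be sketched generically. The code below is a Fletcher-Reeves iteration with a fixed step size on a caller-supplied gradient, an assumption for illustration rather than the authors' implementation:

```python
import numpy as np

def nonlinear_cg(cost_grad, x0, steps=200, lr=0.05):
    """Fletcher-Reeves nonlinear conjugate gradient with a fixed,
    deliberately suboptimal descent step. `cost_grad` returns the
    gradient of the (possibly nonquadratic) cost at x."""
    x = np.asarray(x0, dtype=float)
    g = cost_grad(x)
    d = -g                                           # first direction: steepest descent
    for _ in range(steps):
        x = x + lr * d
        g_new = cost_grad(x)
        beta = (g_new @ g_new) / max(g @ g, 1e-30)   # Fletcher-Reeves coefficient
        d = -g_new + beta * d
        g = g_new
    return x
```

    In the paper's setting `cost_grad` would be the gradient of the joint MAP cost over the water and bone fraction images; here it can be the gradient of any smooth cost.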

  14. Systematics and morphological evolution within the moss family Bryaceae: a comparison between parsimony and Bayesian methods for reconstruction of ancestral character states.

    Science.gov (United States)

    Pedersen, Niklas; Holyoak, David T; Newton, Angela E

    2007-06-01

    The Bryaceae are a large cosmopolitan moss family including genera of significant morphological and taxonomic complexity. Phylogenetic relationships within the Bryaceae were reconstructed based on DNA sequence data from all three genomic compartments. In addition, maximum parsimony and Bayesian inference were employed to reconstruct ancestral character states of 38 morphological plus four habitat characters and eight insertion/deletion events. The recovered phylogenetic patterns are generally in accord with previous phylogenies based on chloroplast DNA sequence data and three major clades are identified. The first clade comprises Bryum bornholmense, B. rubens, B. caespiticium, and Plagiobryum. This corroborates the hypothesis suggested by previous studies that several Bryum species are more closely related to Plagiobryum than to the core Bryum species. The second clade includes Acidodontium, Anomobryum, and Haplodontium, while the third clade contains the core Bryum species plus Imbribryum. Within the latter clade, B. subapiculatum and B. tenuisetum form the sister clade to Imbribryum. Reconstructions of ancestral character states under maximum parsimony and Bayesian inference suggest fourteen morphological synapomorphies for the ingroup and synapomorphies are detected for most clades within the ingroup. Maximum parsimony and Bayesian reconstructions of ancestral character states are mostly congruent although Bayesian inference shows that the posterior probability of ancestral character states may decrease dramatically when node support is taken into account. Bayesian inference also indicates that reconstructions may be ambiguous at internal nodes for highly polymorphic characters.

  15. Reconstructing the invasion history of Heracleum persicum (Apiaceae) into Europe

    Czech Academy of Sciences Publication Activity Database

    Rijal, D. P.; Alm, T.; Jahodová, Šárka; Stenoien, H. K.; Alsos, I. G.

    2015-01-01

    Roč. 24, č. 22 (2015), s. 5522-5543 ISSN 0962-1083 Institutional support: RVO:67985939 Keywords : approximate Bayesian computation * genetic variation * population genetics Subject RIV: EH - Ecology, Behaviour Impact factor: 5.947, year: 2015

  16. Reconstructing consensus Bayesian network structures with application to learning molecular interaction networks

    NARCIS (Netherlands)

    Fröhlich, H.; Klau, G.W.

    2013-01-01

    Bayesian Networks are an established computational approach for data driven network inference. However, experimental data is limited in its availability and corrupted by noise. This leads to an unavoidable uncertainty about the correct network structure. Thus, sampling or bootstrap-based strategies

  17. Reconstruction of prehistoric pottery use from fatty acid carbon isotope signatures using Bayesian inference

    Czech Academy of Sciences Publication Activity Database

    Fernandes, R.; Eley, Y.; Brabec, Marek; Lucquin, A.; Millard, A.; Craig, O.E.

    2018-01-01

    Roč. 117, March (2018), s. 31-42 ISSN 0146-6380 Institutional support: RVO:67985807 Keywords : Fatty acids * carbon isotopes * pottery use * Bayesian mixing models * FRUITS Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 3.081, year: 2016

  18. Reconstructing Constructivism: Causal Models, Bayesian Learning Mechanisms, and the Theory Theory

    Science.gov (United States)

    Gopnik, Alison; Wellman, Henry M.

    2012-01-01

    We propose a new version of the "theory theory" grounded in the computational framework of probabilistic causal models and Bayesian learning. Probabilistic models allow a constructivist but rigorous and detailed approach to cognitive development. They also explain the learning of both more specific causal hypotheses and more abstract framework…

  19. The resolved star formation history of M51a through successive Bayesian marginalization

    Science.gov (United States)

    Martínez-García, Eric E.; Bruzual, Gustavo; Magris C., Gladis; González-Lópezlira, Rosa A.

    2018-02-01

    We have obtained the time- and space-resolved star formation history (SFH) of M51a (NGC 5194) by fitting Galaxy Evolution Explorer (GALEX), Sloan Digital Sky Survey and near-infrared pixel-by-pixel photometry to a comprehensive library of stellar population synthesis models drawn from the Synthetic Spectral Atlas of Galaxies (SSAG). We fit for each space-resolved element (pixel) an independent model where the SFH is averaged in 137 age bins, each one 100 Myr wide. We used the Bayesian Successive Priors (BSP) algorithm to mitigate the bias in the present-day spatial mass distribution. We test BSP with different prior probability distribution functions (PDFs); this exercise suggests that the best prior PDF is the one concordant with the spatial distribution of the stellar mass as inferred from the near-infrared images. We also demonstrate that varying the implicit prior PDF of the SFH in SSAG does not affect the results. By summing the contributions to the global star formation rate of each pixel, at each age bin, we have assembled the resolved SFH of the whole galaxy. According to these results, the star formation rate of M51a was exponentially increasing for the first 10 Gyr after the big bang, and then turned into an exponentially decreasing function until the present day. Superimposed, we find a main burst of star formation at t ≈ 11.9 Gyr after the big bang.

  20. The phylogeographic history of the new world screwworm fly, inferred by approximate bayesian computation analysis.

    Directory of Open Access Journals (Sweden)

    Pablo Fresia

    Insect pest phylogeography might be shaped both by biogeographic events and by human influence. Here, we conducted an approximate Bayesian computation (ABC) analysis to investigate the phylogeography of the New World screwworm fly, Cochliomyia hominivorax, with the aim of understanding its population history and the order and timing of its divergences. Our ABC analysis supports a scenario in which populations spread from North to South in the Americas in at least two distinct episodes. The first split occurred between the North/Central American and South American populations at the end of the Last Glacial Maximum (15,300-19,000 YBP). The second split occurred between the North and South Amazonian populations at the transition between the Pleistocene and the Holocene (9,100-11,000 YBP). The species also experienced population expansion. Phylogenetic analysis likewise suggests this north-to-south colonization, and Maxent models suggest an increase in the number of suitable areas in South America from the past to the present. We found that the phylogeographic patterns observed in C. hominivorax cannot be explained by climatic oscillations alone and can be connected to host population histories. Interestingly, these patterns coincide closely with general patterns of ancient human movements in the Americas, suggesting that humans might have played a crucial role in shaping the distribution and population structure of this insect pest. This work presents the first hypothesis test regarding the processes that shaped the current phylogeographic structure of C. hominivorax and represents an alternative perspective on investigating the problem of insect pests.
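
    The ABC machinery used in studies like this one can be sketched generically as a rejection sampler: draw parameters from the prior, simulate data, and keep draws whose summary statistic falls within a tolerance of the observed one. The toy model below (inferring a normal mean from a sample mean) is an assumption for illustration, not the paper's demographic model:

```python
import numpy as np

def abc_rejection(observed_stat, simulate, prior_sample, eps, n_draws=20000, seed=0):
    """Generic ABC rejection sampler: returns the accepted parameter draws,
    which approximate the posterior as eps -> 0."""
    rng = np.random.default_rng(seed)
    accepted = []
    for _ in range(n_draws):
        theta = prior_sample(rng)
        if abs(simulate(theta, rng) - observed_stat) < eps:
            accepted.append(theta)
    return np.asarray(accepted)

# Toy usage: posterior for the mean of a Normal(theta, 1) from its sample mean.
posterior = abc_rejection(
    observed_stat=2.0,
    simulate=lambda th, rng: rng.normal(th, 1.0, size=50).mean(),
    prior_sample=lambda rng: rng.uniform(-5.0, 5.0),
    eps=0.1,
)
```

    Real ABC analyses replace the toy simulator with a coalescent or demographic simulator and compare vectors of summary statistics, but the accept/reject logic is the same.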

  1. Reconstructing constructivism: Causal models, Bayesian learning mechanisms and the theory theory

    OpenAIRE

    Gopnik, Alison; Wellman, Henry M.

    2012-01-01

    We propose a new version of the “theory theory” grounded in the computational framework of probabilistic causal models and Bayesian learning. Probabilistic models allow a constructivist but rigorous and detailed approach to cognitive development. They also explain the learning of both more specific causal hypotheses and more abstract framework theories. We outline the new theoretical ideas, explain the computational framework in an intuitive and non-technical way, and review an extensive but ...

  2. Paleomagnetic reconstruction of the global geomagnetic field evolution during the Matuyama/Brunhes transition: Iterative Bayesian inversion and independent verification

    Science.gov (United States)

    Leonhardt, Roman; Fabian, Karl

    2007-01-01

    The Earth's magnetic field changed its polarity from the last reversed into today's normal state approximately 780 000 years ago. While before and after this so-called Matuyama/Brunhes reversal the Earth's magnetic field was essentially an axial dipole, the details of its transitional structure are still largely unknown. Here, a Bayesian inversion method is developed to reconstruct the spherical harmonic expansion of this transitional field from paleomagnetic data. This is achieved by minimizing the total variational power at the core-mantle boundary during the transition under paleomagnetic constraints. The validity of the inversion technique is proved in two ways. First, by inverting synthetic data sets from a modeled reversal. Here it is possible to reliably reconstruct the Gauss coefficients even from noisy records. Second, by iteratively combining four geographically distributed high quality paleomagnetic records of the Matuyama/Brunhes reversal into a single geometric reversal scenario without assuming an a priori common age model. The obtained spatio-temporal reversal scenario successfully predicts most independent Matuyama/Brunhes transitional records. Therefore, the obtained global reconstruction based on paleomagnetic data invites comparison of the inferred transitional field structure with results from numerical geodynamo models regarding the morphology of the transitional field. It is found that radial magnetic flux patches form at the equator and move polewards during the transition. Our model indicates an increase of non-dipolar energy prior to the last reversal and a non-dipolar dominance during the transition. Thus, the character and information of surface geomagnetic field records is strongly site dependent. The reconstruction also offers new answers to the question of existence of preferred longitudinal bands during the transition and to the problem of reversal duration. Different types of directional variations of the surface geomagnetic field

  3. Strontium isotopes and the reconstruction of the Chaco regional system: evaluating uncertainty with Bayesian mixing models.

    Directory of Open Access Journals (Sweden)

    Brandon Lee Drake

    Strontium isotope sourcing has become a common and useful method for assigning sources to archaeological artifacts. In Chaco Canyon, an Ancestral Pueblo regional center in New Mexico, previous studies using these methods have suggested that a significant portion of maize and wood originated in the Chuska Mountains region, 75 km to the west [corrected]. In the present manuscript, these results were tested using both frequentist methods (to determine whether geochemical sources can truly be differentiated) and Bayesian methods (to address uncertainty in geochemical source attribution). It was found that Chaco Canyon and the Chuska Mountain region are not easily distinguishable based on radiogenic strontium isotope values. The strontium profiles of many geochemical sources in the region overlap, making it difficult to definitively identify any one particular geochemical source for the canyon's prehistoric maize. Bayesian mixing models support the argument that some spruce and fir wood originated in the San Mateo Mountains, but this cannot explain all 87Sr/86Sr values in Chaco timber. Overall, radiogenic strontium isotope data do not clearly identify a single major geochemical source for maize, ponderosa, and most spruce/fir timber. As such, the degree to which Chaco Canyon relied upon outside support for both food and construction material remains ambiguous.
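
    The Bayesian mixing logic referenced in this record can be illustrated with a minimal two-source model: a flat prior on the fraction of source A and a Gaussian likelihood for the observed isotope ratio. All numeric values below are invented for illustration, not the study's measurements:

```python
import numpy as np

def mixing_fraction_posterior(obs, mu_a, sd_a, mu_b, sd_b, obs_sd=5e-4, n=201):
    """Grid posterior over the fraction f of source A when the observed
    isotope ratio is modelled as f*mu_a + (1-f)*mu_b plus Gaussian noise.
    Source uncertainty is propagated into the predicted mixture."""
    f = np.linspace(0.0, 1.0, n)
    mix_mu = f * mu_a + (1.0 - f) * mu_b
    mix_sd = np.sqrt((f * sd_a) ** 2 + ((1.0 - f) * sd_b) ** 2 + obs_sd ** 2)
    like = np.exp(-0.5 * ((obs - mix_mu) / mix_sd) ** 2) / mix_sd
    return f, like / like.sum()

# Hypothetical sources with overlapping 87Sr/86Sr ranges and one observed sample:
f, post = mixing_fraction_posterior(obs=0.7093, mu_a=0.7095, sd_a=5e-4,
                                    mu_b=0.7085, sd_b=5e-4)
```

    Overlapping source distributions yield a broad posterior on f, which is precisely the kind of source-attribution ambiguity the authors report for Chaco maize and timber.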

  4. Hierarchical Bayesian Model for Simultaneous EEG Source and Forward Model Reconstruction (SOFOMORE)

    DEFF Research Database (Denmark)

    Stahlhut, Carsten; Mørup, Morten; Winther, Ole

    2009-01-01

    In this paper we propose an approach to handle forward model uncertainty for EEG source reconstruction. A stochastic forward model is motivated by the many uncertain contributions that form the forward propagation model, including the tissue conductivity distribution, the cortical surface, and ele...

  5. Simultaneous EEG Source and Forward Model Reconstruction (SOFOMORE) using a Hierarchical Bayesian Approach

    DEFF Research Database (Denmark)

    Stahlhut, Carsten; Mørup, Morten; Winther, Ole

    2011-01-01

    We present an approach to handle forward model uncertainty for EEG source reconstruction. A stochastic forward model representation is motivated by the many random contributions to the path from sources to measurements including the tissue conductivity distribution, the geometry of the cortical s...

  6. Lattice NRQCD study on in-medium bottomonium spectra using a novel Bayesian reconstruction approach

    Science.gov (United States)

    Kim, Seyong; Petreczky, Peter; Rothkopf, Alexander

    2016-01-01

    We present recent results on the in-medium modification of S- and P-wave bottomonium states around the deconfinement transition. Our study uses lattice QCD with Nf = 2 + 1 light quark flavors to describe the non-perturbative thermal QCD medium between 140 MeV < T < 249 MeV, and extracts the bottomonium spectral functions with a novel Bayesian prescription, which provides higher accuracy than the Maximum Entropy Method. Based on a systematic comparison of interacting and free spectral functions we conclude that the ground states of both the S-wave (ϒ) and P-wave (χb1) channel survive up to T = 249 MeV. Stringent upper limits on the size of the in-medium modification of bottomonium masses and widths are provided.

  7. Bayesian Approach to Spectral Function Reconstruction for Euclidean Quantum Field Theories

    Science.gov (United States)

    Burnier, Yannis; Rothkopf, Alexander

    2013-11-01

    We present a novel approach to the inference of spectral functions from Euclidean time correlator data that makes close contact with modern Bayesian concepts. Our method differs significantly from the maximum entropy method (MEM). A new set of axioms is postulated for the prior probability, leading to an improved expression, which is devoid of the asymptotically flat directions present in the Shannon-Jaynes entropy. Hyperparameters are integrated out explicitly, liberating us from the Gaussian approximations underlying the evidence approach of the maximum entropy method. We present a realistic test of our method in the context of the nonperturbative extraction of the heavy quark potential. Based on hard-thermal-loop correlator mock data, we establish firm requirements in the number of data points and their accuracy for a successful extraction of the potential from lattice QCD. Finally we reinvestigate quenched lattice QCD correlators from a previous study and provide an improved potential estimation at T = 2.33 T_C.

  8. A Bayesian Supertree Model for Genome-Wide Species Tree Reconstruction

    Science.gov (United States)

    De Oliveira Martins, Leonardo; Mallo, Diego; Posada, David

    2016-01-01

    Current phylogenomic data sets highlight the need for species tree methods able to deal with several sources of gene tree/species tree incongruence. At the same time, we need to make the most of all available data. Most species tree methods deal with single processes of phylogenetic discordance, namely, gene duplication and loss, incomplete lineage sorting (ILS) or horizontal gene transfer. In this manuscript, we address the problem of species tree inference from multilocus, genome-wide data sets regardless of the presence of gene duplication and loss and ILS, and therefore without the need to identify orthologs or to use a single individual per species. We do this by extending the idea of Maximum Likelihood (ML) supertrees to a hierarchical Bayesian model where several sources of gene tree/species tree disagreement can be accounted for in a modular manner. We implemented this model in a computer program called guenomu whose inputs are posterior distributions of unrooted gene tree topologies for multiple gene families, and whose output is the posterior distribution of rooted species tree topologies. We conducted extensive simulations to evaluate the performance of our approach in comparison with other species tree approaches able to deal with more than one leaf from the same species. Our method ranked best under simulated data sets, in spite of ignoring branch lengths, and performed well on empirical data, as well as being fast enough to analyze relatively large data sets. Our Bayesian supertree method was also very successful in obtaining better estimates of gene trees, by reducing the uncertainty in their distributions. In addition, our results show that under complex simulation scenarios, gene tree parsimony is also a competitive approach once we consider its speed, in contrast to more sophisticated models. PMID:25281847

  9. A Bayesian Supertree Model for Genome-Wide Species Tree Reconstruction.

    Science.gov (United States)

    De Oliveira Martins, Leonardo; Mallo, Diego; Posada, David

    2016-05-01

    Current phylogenomic data sets highlight the need for species tree methods able to deal with several sources of gene tree/species tree incongruence. At the same time, we need to make the most of all available data. Most species tree methods deal with single processes of phylogenetic discordance, namely, gene duplication and loss, incomplete lineage sorting (ILS) or horizontal gene transfer. In this manuscript, we address the problem of species tree inference from multilocus, genome-wide data sets regardless of the presence of gene duplication and loss and ILS, and therefore without the need to identify orthologs or to use a single individual per species. We do this by extending the idea of Maximum Likelihood (ML) supertrees to a hierarchical Bayesian model where several sources of gene tree/species tree disagreement can be accounted for in a modular manner. We implemented this model in a computer program called guenomu whose inputs are posterior distributions of unrooted gene tree topologies for multiple gene families, and whose output is the posterior distribution of rooted species tree topologies. We conducted extensive simulations to evaluate the performance of our approach in comparison with other species tree approaches able to deal with more than one leaf from the same species. Our method ranked best under simulated data sets, in spite of ignoring branch lengths, and performed well on empirical data, as well as being fast enough to analyze relatively large data sets. Our Bayesian supertree method was also very successful in obtaining better estimates of gene trees, by reducing the uncertainty in their distributions. In addition, our results show that under complex simulation scenarios, gene tree parsimony is also a competitive approach once we consider its speed, in contrast to more sophisticated models. © The Author(s) 2014. Published by Oxford University Press on behalf of the Society of Systematic Biologists.

  10. Mandible reconstruction: History, state of the art and persistent problems.

    Science.gov (United States)

    Ferreira, José J; Zagalo, Carlos M; Oliveira, Marta L; Correia, André M; Reis, Ana R

    2015-06-01

    Mandibular reconstruction has undergone a remarkable evolution. Several different approaches are used to reconstruct this bone, which has a fundamental role in the recovery of oral functions. This review aims to highlight the persistent problems associated with the approaches identified, whether bone grafts or prosthetic devices are used. A brief summary of the historical evolution of the surgical procedures is presented, as well as an insight into possible future pathways. A literature review was conducted from September to December 2012 using the PubMed database with the keyword "mandible reconstruction." Articles published in the last three years were included, as well as the relevant references from those articles and the "historical articles" they referred to. This research resulted in a monograph that this article aims to summarize. Titanium plates, bone grafts, pediculate flaps, free osteomyocutaneous flaps, rapid prototyping, and tissue engineering strategies are some of the identified possibilities. The classical approaches present considerable associated morbidity and donor-site-related problems, so research that results in the development of new prosthetic devices is needed. A new prosthetic approach could minimize the identified problems and offer patients more predictable, affordable, and comfortable solutions. This review, while affirming the evolution and the good results achieved with current approaches, emphasizes the negative aspects that still subsist. Thus, it shows that mandible reconstruction is not a closed issue; on the contrary, it remains a research field where new findings could have a direct positive impact on patients' quality of life. The identification of the persistent problems reveals the characteristics to be considered in a new prosthetic device, which could overcome the current difficulties and result in more comfortable solutions.
Medical teams have the responsibility to keep patients informed about the predictable

  11. Mitochondrial phylogeny of the Chrysis ignita (Hymenoptera: Chrysididae) species group based on simultaneous Bayesian alignment and phylogeny reconstruction.

    Science.gov (United States)

    Soon, Villu; Saarma, Urmas

    2011-07-01

    The ignita species group within the genus Chrysis includes over 100 cuckoo wasp species, which all lead a parasitic lifestyle and exhibit very similar morphology. The lack of robust, diagnostic morphological characters has hindered phylogenetic reconstructions and contributed to frequent misidentification and inconsistent interpretations of species in this group. Therefore, molecular phylogenetic analysis is the most suitable approach for resolving the phylogeny and taxonomy of this group. We present a well-resolved phylogeny of the Chrysis ignita species group based on mitochondrial sequence data from 41 ingroup and six outgroup taxa. Although our emphasis was on European taxa, we included samples from most of the distribution range of the C. ignita species group to test for monophyly. We used a continuous mitochondrial DNA sequence consisting of 16S rRNA, tRNA(Val), 12S rRNA and ND4. The location of the ND4 gene at the 3' end of this continuous sequence, following 12S rRNA, represents a novel mitochondrial gene arrangement for insects. Due to difficulties in aligning rRNA genes, two different Bayesian approaches were employed to reconstruct phylogeny: (1) using a reduced data matrix including only those positions that could be aligned with confidence; or (2) using the full sequence dataset while estimating alignment and phylogeny simultaneously. In addition, maximum-parsimony and maximum-likelihood analyses were performed to test the robustness of the Bayesian approaches. Although all approaches yielded trees with similar topology, considerably more nodes were resolved with analyses using the full data matrix. Phylogenetic analysis supported the monophyly of the C. ignita species group and divided its species into well-supported clades. The resultant phylogeny was only partly in accordance with published subgroupings based on morphology. Our results suggest that several taxa currently treated as subspecies or names treated as synonyms may in fact constitute

  12. Lattice NRQCD study on in-medium bottomonium spectra using a novel Bayesian reconstruction approach

    International Nuclear Information System (INIS)

    Kim, Seyong; Petreczky, Peter; Rothkopf, Alexander

    2016-01-01

    We present recent results on the in-medium modification of S- and P-wave bottomonium states around the deconfinement transition. Our study uses lattice QCD with N_f = 2+1 light quark flavors to describe the non-perturbative thermal QCD medium between 140 MeV < T < 249 MeV, and deploys lattice-regularized non-relativistic QCD (NRQCD) effective field theory to capture the physics of heavy quark bound states immersed therein. The spectral functions of the ³S₁ (ϒ) and ³P₁ (χ_b1) bottomonium states are extracted from Euclidean time Monte Carlo simulations using a novel Bayesian prescription, which provides higher accuracy than the Maximum Entropy Method. Based on a systematic comparison of interacting and free spectral functions, we conclude that the ground states of both the S-wave (ϒ) and P-wave (χ_b1) channels survive up to T = 249 MeV. Stringent upper limits on the size of the in-medium modification of bottomonium masses and widths are provided

  13. Reconstructing constructivism: causal models, Bayesian learning mechanisms, and the theory theory.

    Science.gov (United States)

    Gopnik, Alison; Wellman, Henry M

    2012-11-01

    We propose a new version of the "theory theory" grounded in the computational framework of probabilistic causal models and Bayesian learning. Probabilistic models allow a constructivist but rigorous and detailed approach to cognitive development. They also explain the learning of both more specific causal hypotheses and more abstract framework theories. We outline the new theoretical ideas, explain the computational framework in an intuitive and nontechnical way, and review an extensive but relatively recent body of empirical results that supports these ideas. These include new studies of the mechanisms of learning. Children infer causal structure from statistical information, through their own actions on the world and through observations of the actions of others. Studies demonstrate these learning mechanisms in children from 16 months to 4 years old and include research on causal statistical learning, informal experimentation through play, and imitation and informal pedagogy. They also include studies of the variability and progressive character of intuitive theory change, particularly theory of mind. These studies investigate both the physical and the psychological and social domains. We conclude with suggestions for further collaborative projects between developmental and computational cognitive scientists.

  14. Does Smoking History Confer a Higher Risk for Reconstructive Complications in Nipple-Sparing Mastectomy?

    Science.gov (United States)

    Frey, Jordan D; Alperovich, Michael; Levine, Jamie P; Choi, Mihye; Karp, Nolan S

    2017-07-01

    History of smoking has been implicated as a risk factor for reconstructive complications in nipple-sparing mastectomy (NSM); however, there have been no direct analyses of outcomes in smokers and nonsmokers. All patients undergoing NSM at New York University Langone Medical Center from 2006 to 2014 were identified. Outcomes were compared for those with and without a smoking history and stratified by pack-year smoking history and years-to-quitting (YTQ). A total of 543 nipple-sparing mastectomies were performed from 2006 to 2014, 49 of them in patients with a history of smoking. Reconstructive outcomes in NSM between those with and without a smoking history were equivalent. Those with a smoking history were not significantly more likely to have mastectomy flap necrosis (p = 0.6251) or partial (p = 0.8564) or complete (p = 0.3365) nipple-areola complex (NAC) necrosis. Likewise, active smokers alone did not have a higher risk of complications compared to nonsmokers or those with a smoking history. Comparing nonsmokers and those with a less or greater than 10 pack-year smoking history, those with a >10 pack-year history had significantly more complete NAC necrosis (p = 0.0114), whereas those with a lesser smoking history or >5 YTQ prior to NSM were equivalent to those without a smoking history. We demonstrate that NSM may be safely offered to those with a smoking history, although a >10 pack-year smoking history or <5 YTQ prior to NSM may impart a higher risk of reconstructive complications, including complete NAC necrosis. © 2017 Wiley Periodicals, Inc.

  15. Diet reconstruction and resource partitioning of a Caribbean marine mesopredator using stable isotope bayesian modelling.

    Science.gov (United States)

    Tilley, Alexander; López-Angarita, Juliana; Turner, John R

    2013-01-01

    The trophic ecology of epibenthic mesopredators is not well understood in terms of prey partitioning with sympatric elasmobranchs or their effects on prey communities, yet the importance of omnivores in community trophic dynamics is being increasingly realised. This study used stable isotope analysis of (15)N and (13)C to model diet composition of wild southern stingrays Dasyatis americana and compare trophic niche space to nurse sharks Ginglymostoma cirratum and Caribbean reef sharks Carcharhinus perezi on Glovers Reef Atoll, Belize. Bayesian stable isotope mixing models were used to investigate prey choice as well as viable Diet-Tissue Discrimination Factors for use with stingrays. Stingray δ(15)N values showed the greatest variation and a positive relationship with size, with an isotopic niche width approximately twice that of sympatric species. Shark species exhibited comparatively restricted δ(15)N values and greater δ(13)C variation, with very little overlap of stingray niche space. Mixing models suggest bivalves and annelids are proportionally more important prey in the stingray diet than crustaceans and teleosts at Glovers Reef, in contrast to all but one published diet study using stomach contents from other locations. Incorporating gut contents information from the literature, we suggest diet-tissue discrimination factors values of Δ(15)N ≈ 2.7‰ and Δ(13)C ≈ 0.9‰ for stingrays in the absence of validation experiments. The wide trophic niche and lower trophic level exhibited by stingrays compared to sympatric sharks supports their putative role as important base stabilisers in benthic systems, with the potential to absorb trophic perturbations through numerous opportunistic prey interactions.
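As a concrete (and heavily simplified) illustration of the mixing logic behind such models: when the number of tracers plus the mass-balance constraint equals the number of sources, diet proportions can be solved directly from a linear system. The sketch below is not the study's Bayesian model, which adds priors and propagates uncertainty; the source and consumer signatures are invented, and only the discrimination factors (Δ(15)N ≈ 2.7‰, Δ(13)C ≈ 0.9‰) come from the abstract.

```python
import numpy as np

# Illustrative source signatures (rows: bivalves, annelids, crustaceans),
# columns: (d15N, d13C) in permil. NOT the study's data.
sources = np.array([[6.0, -18.0],
                    [8.0, -16.0],
                    [10.0, -12.0]])

# Diet-tissue discrimination factors suggested in the abstract.
ddf = np.array([2.7, 0.9])

# Hypothetical consumer (stingray) tissue signature.
mix = np.array([10.1, -15.3])

# Mass balance: mix = sum_i f_i * (source_i + ddf), with sum_i f_i = 1.
A = np.vstack([(sources + ddf).T, np.ones(3)])
b = np.concatenate([mix, [1.0]])
f, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.round(f, 3))  # estimated diet proportion per source
```

A Bayesian mixing model replaces this point solve with priors on the proportions and likelihoods over tracer variances, which is what allows more sources than tracers and yields credible intervals instead of a single vector.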

  16. Diet reconstruction and resource partitioning of a Caribbean marine mesopredator using stable isotope bayesian modelling.

    Directory of Open Access Journals (Sweden)

    Alexander Tilley

    Full Text Available The trophic ecology of epibenthic mesopredators is not well understood in terms of prey partitioning with sympatric elasmobranchs or their effects on prey communities, yet the importance of omnivores in community trophic dynamics is being increasingly realised. This study used stable isotope analysis of (15)N and (13)C to model diet composition of wild southern stingrays Dasyatis americana and compare trophic niche space to nurse sharks Ginglymostoma cirratum and Caribbean reef sharks Carcharhinus perezi on Glovers Reef Atoll, Belize. Bayesian stable isotope mixing models were used to investigate prey choice as well as viable Diet-Tissue Discrimination Factors for use with stingrays. Stingray δ(15)N values showed the greatest variation and a positive relationship with size, with an isotopic niche width approximately twice that of sympatric species. Shark species exhibited comparatively restricted δ(15)N values and greater δ(13)C variation, with very little overlap of stingray niche space. Mixing models suggest bivalves and annelids are proportionally more important prey in the stingray diet than crustaceans and teleosts at Glovers Reef, in contrast to all but one published diet study using stomach contents from other locations. Incorporating gut contents information from the literature, we suggest diet-tissue discrimination factors values of Δ(15)N ≈ 2.7‰ and Δ(13)C ≈ 0.9‰ for stingrays in the absence of validation experiments. The wide trophic niche and lower trophic level exhibited by stingrays compared to sympatric sharks supports their putative role as important base stabilisers in benthic systems, with the potential to absorb trophic perturbations through numerous opportunistic prey interactions.

  17. An experimental evaluation of fire history reconstruction using dendrochronology in white oak (Quercus alba)

    Science.gov (United States)

    Ryan W. McEwan; Todd F. Hutchinson; Robert D. Ford; Brian C. McCarthy

    2007-01-01

    Dendrochronological analysis of fire scars on tree cross sections has been critically important for understanding historical fire regimes and has influenced forest management practices. Despite its value as a tool for understanding historical ecosystems, tree-ring-based fire history reconstruction has rarely been experimentally evaluated. To examine the efficacy of...

  18. Reconstructing the trophic history of the Black Sea shelf

    Science.gov (United States)

    Yunev, Oleg; Velikova, Violeta; Carstensen, Jacob

    2017-11-01

    In the last 50 years the Black Sea has undergone large changes driven by increasing anthropogenic pressures. We estimated the integrated annual primary production (APP) for different shelf regions during the early eutrophication phase (1963-1976) using chlorophyll a and winter nitrate concentrations as proxy observations of primary production to describe its seasonal variation. For comparison, APP was estimated during the period when eutrophication peaked (1985-1992). In the early eutrophication period APP was estimated at 64-89 g C m⁻² yr⁻¹ for most of the shelf, except the part influenced by the Danube River (the shallow waters off the Romanian and Bulgarian coasts), where APP was ∼126 g C m⁻² yr⁻¹. In these two different shelf parts, APP increased to 138-190 and 266-318 g C m⁻² yr⁻¹, respectively, during the peak eutrophication period. These spatial differences are attributed to the large nutrient inputs from the Danube River. The APP estimates provide new insight into the eutrophication history of the Black Sea shelf, documenting stronger signs of eutrophication than observed in other enclosed seas such as the Baltic Sea. Since the peak eutrophication period, APP is estimated to have decreased by approximately 15-20%.

  19. Reconstructing Roma history from genome-wide data.

    Directory of Open Access Journals (Sweden)

    Priya Moorjani

    Full Text Available The Roma people, living throughout Europe and West Asia, are a diverse population linked by the Romani language and culture. Previous linguistic and genetic studies have suggested that the Roma migrated into Europe from South Asia about 1,000-1,500 years ago. Genetic inferences about Roma history have mostly focused on the Y chromosome and mitochondrial DNA. To explore what additional information can be learned from genome-wide data, we analyzed data from six Roma groups that we genotyped at hundreds of thousands of single nucleotide polymorphisms (SNPs). We estimate that the Roma harbor about 80% West Eurasian ancestry, derived from a combination of European and South Asian sources, and that the date of admixture of South Asian and European ancestry was about 850 years before present. We provide evidence for Eastern Europe being a major source of European ancestry, and North-west India being a major source of the South Asian ancestry in the Roma. By computing allele sharing as a measure of linkage disequilibrium, we estimate that the migration of Roma out of the Indian subcontinent was accompanied by a severe founder event, which appears to have been followed by a major demographic expansion after the arrival in Europe.

  20. Using serological studies to reconstruct the history of bluetongue epidemic in French cattle under successive vaccination campaigns.

    Science.gov (United States)

    Courtejoie, Noémie; Salje, Henrik; Durand, Benoît; Zanella, Gina; Cauchemez, Simon

    2018-05-17

    Bluetongue virus is a vector-borne pathogen affecting ruminants that has caused major epidemics in France. Reconstructing the history of bluetongue in French cattle under control strategies such as vaccination has been hampered by the high level of sub-clinical infection, incomplete case data and poor understanding of vaccine uptake over time and space. To tackle these challenges, we used three age-structured serological surveys carried out in cattle (N = 22,342) from ten administrative subdivisions called departments. We fitted catalytic models within a Bayesian MCMC framework to reconstruct the force of seroconversion from infection or vaccination, and the population-level susceptibility per semester between 2007 and 2016. In the departments of the study area, we estimated that 36% of cattle had been infected prior to vaccine rollout that became compulsory from July 2008. The last outbreak case was notified in December 2009, at which time 83% of the animals were seropositive, under the cumulative effect of vaccination and infection. The probability of seroconversion per semester dropped below 10% after 2010 when vaccination became optional. Vaccine uptake was smaller during the 2012 campaign than during the one in 2011, with strong regional contrasts. Eighty-four percent of cattle were susceptible when bluetongue re-emerged in 2015. Thus, serological surveys can be used to estimate vaccine uptake and the magnitude of infection, the relative effect of which can sometimes be inferred using prior knowledge on reported incidence and vaccination dates. Copyright © 2018 The Authors. Published by Elsevier B.V. All rights reserved.
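The catalytic-model machinery described above can be illustrated with a deliberately stripped-down version: a single, constant per-semester force of seroconversion estimated by maximum likelihood from age-stratified serology. The study's actual model is Bayesian, allows the force to vary per semester, and is fitted by MCMC; all numbers below are invented.

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(2)

# Catalytic model with a constant per-semester force of seroconversion lam:
# P(seropositive | age a semesters) = 1 - exp(-lam * a).
ages = np.arange(1, 11)          # age classes, in semesters
n = np.full_like(ages, 500)      # animals sampled per age class
lam_true = 0.15
k = rng.binomial(n, 1.0 - np.exp(-lam_true * ages))  # observed seropositives

def neg_log_lik(lam):
    # Binomial log-likelihood of the age-stratified counts under the model.
    p = np.clip(1.0 - np.exp(-lam * ages), 1e-12, 1 - 1e-12)
    return -np.sum(k * np.log(p) + (n - k) * np.log1p(-p))

fit = minimize_scalar(neg_log_lik, bounds=(1e-6, 5.0), method="bounded")
print(round(fit.x, 3))  # maximum-likelihood estimate of lam
```

Replacing the single lam with one parameter per semester (and summing the forces experienced over each animal's lifetime) recovers the structure of the paper's model, at which point MCMC becomes the natural fitting tool.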

  1. Equivalent thermal history reconstruction from a partially crystallized glass-ceramic sensor array

    Science.gov (United States)

    Heeg, Bauke

    2015-11-01

    The basic concept of a thermal history sensor is that it records the accumulated exposure to some unknown, typically varying temperature profile for a certain amount of time. Such a sensor is considered to be capable of measuring the duration of several (N) temperature intervals. For this purpose, the sensor deploys multiple (M) sensing elements, each with different temperature sensitivity. At the end of some thermal exposure for a known period of time, the sensor array is read out and an estimate is made of the set of N durations of the different temperature ranges. A potential implementation of such a sensor was pioneered by Fair et al. [Sens. Actuators, A 141, 245 (2008)], based on glass-ceramic materials with different temperature-dependent crystallization dynamics. In their work, it was demonstrated that an array of sensor elements can be made sensitive to slight differences in temperature history. Further, a forward crystallization model was used to simulate the variations in sensor array response to differences in the temperature history. The current paper focuses on the inverse aspect of temperature history reconstruction from a hypothetical sensor array output. The goal of such a reconstruction is to find an equivalent thermal history that is the closest representation of the true thermal history, i.e., the durations of a set of temperature intervals that result in a set of fractional crystallization values which is closest to the one resulting from the true thermal history. One particularly useful simplification, in both the sensor model as well as in its practical implementation, is the omission of nucleation effects. In that case, least squares models can be used to approximate the sensor response and make reconstruction estimates. Even with this simplification, sensor noise can have a destabilizing effect on possible reconstruction solutions, which is evaluated using simulations. Both regularization and non-negativity constrained least squares
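Under the no-nucleation simplification, the reconstruction step reduces to non-negativity constrained least squares, which can be sketched as follows. The sensitivity matrix here is random and purely illustrative, standing in for the linearized crystallization response of each glass-ceramic element; it is not derived from the paper's forward model.

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(0)

# M sensing elements x N temperature intervals: entry (m, n) is the assumed
# linearized fractional-crystallization rate of element m per unit time
# spent in interval n (illustrative values).
M, N = 8, 4
A = rng.uniform(0.01, 0.3, size=(M, N))

# True (unknown) durations spent in each temperature interval, in hours.
t_true = np.array([5.0, 0.0, 2.0, 8.0])

# Sensor array read-out: fractional crystallization per element, plus noise.
y = A @ t_true + rng.normal(0.0, 1e-3, size=M)

# Non-negativity constrained least squares: durations cannot be negative.
t_est, _ = nnls(A, y)
print(np.round(t_est, 2))  # equivalent thermal history estimate
```

Tikhonov-style regularization, one of the stabilizers the abstract alludes to, can be obtained within the same solver by augmenting A and y with a scaled identity block before the solve.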

  2. A STUDY ON DYNAMIC LOAD HISTORY RECONSTRUCTION USING PSEUDO-INVERSE METHODS

    OpenAIRE

    Santos, Ariane Rebelato Silva dos; Marczak, Rogério José

    2017-01-01

    Considering that the vibratory forces generally cannot be measured directly at the interface of two bodies, an inverse method is studied in the present work to recover the load history in such cases. The proposed technique attempts to reconstruct the dynamic loads history by using a frequency domain analysis and Moore-Penrose pseudo-inverses of the frequency response function (FRF) of the system. The methodology consists in applying discrete dynamic loads on a finite element model in the time...
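Per frequency line, the reconstruction described above amounts to applying the Moore-Penrose pseudo-inverse of the FRF matrix to the measured responses. A minimal single-frequency sketch follows; the FRF matrix is random and illustrative, not taken from a finite element model.

```python
import numpy as np

rng = np.random.default_rng(1)

# FRF matrix H at one frequency line: complex responses at 6 measurement
# points due to unit loads at 3 candidate load locations (illustrative).
H = rng.normal(size=(6, 3)) + 1j * rng.normal(size=(6, 3))

# Unknown loads to recover, and the (slightly noisy) measured responses.
f_true = np.array([2.0 + 0.5j, -1.0j, 0.7])
x = H @ f_true + 1e-6 * (rng.normal(size=6) + 1j * rng.normal(size=6))

# Least-squares load estimate via the Moore-Penrose pseudo-inverse.
f_est = np.linalg.pinv(H) @ x
```

Repeating this over all frequency lines and applying an inverse FFT would yield the reconstructed load history in the time domain; frequencies at which H is ill-conditioned are what make the inverse problem delicate in practice.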

  3. Reconstruction of the early invasion history of the quagga mussel (Dreissena rostriformis bugensis) in Western Europe

    OpenAIRE

    Heiler, Katharina; Vaate, Abraham bij de; Ekschmitt, Klemens; Oheimb, Parm von; Albrecht, Christian; Wilke, Thomas

    2013-01-01

    The recent introduction of the quagga mussel into Western European freshwaters marked the beginning of one of the most successful biological invasions during the past years in this region. However, the spatial and temporal origin of the first invasive population(s) in Western Europe as well as subsequent spreading routes still remain under discussion. In this study, we therefore aim at reconstructing the early invasion history of the quagga mussel in Western Europe based on an age-corrected t...

  4. Reconstruction of Exposure to m-Xylene from Human Biomonitoring Data Using PBPK Modelling, Bayesian Inference, and Markov Chain Monte Carlo Simulation

    Science.gov (United States)

    McNally, Kevin; Cotton, Richard; Cocker, John; Jones, Kate; Bartels, Mike; Rick, David; Price, Paul; Loizou, George

    2012-01-01

    There are numerous biomonitoring programs, both recent and ongoing, to evaluate environmental exposure of humans to chemicals. Due to the lack of exposure and kinetic data, the correlation of biomarker levels with exposure concentrations leads to difficulty in utilizing biomonitoring data for biological guidance values. Exposure reconstruction or reverse dosimetry is the retrospective interpretation of external exposure consistent with biomonitoring data. We investigated the integration of physiologically based pharmacokinetic modelling, global sensitivity analysis, Bayesian inference, and Markov chain Monte Carlo simulation to obtain a population estimate of inhalation exposure to m-xylene. We used exhaled breath and venous blood m-xylene and urinary 3-methylhippuric acid measurements from a controlled human volunteer study in order to evaluate the ability of our computational framework to predict known inhalation exposures. We also investigated the importance of model structure and dimensionality with respect to its ability to reconstruct exposure. PMID:22719759
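The inferential loop (forward model, Bayesian inference, MCMC) can be shown in miniature with a toy kinetic model standing in for the PBPK model: blood biomarker proportional to exposure, and a random-walk Metropolis sampler over the unknown exposure. Everything here (the constant k, noise level, prior) is an invented stand-in, not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy "reverse dosimetry": a linear steady-state kinetic model
# blood = k * exposure (a real PBPK simulation would replace this).
k = 0.8
exposure_true = 50.0                                   # unknown in practice
obs = k * exposure_true + rng.normal(0, 2.0, size=20)  # biomarker data

def log_post(e):
    if e <= 0:
        return -np.inf
    log_lik = -0.5 * np.sum((obs - k * e) ** 2) / 2.0**2  # N(k*e, 2^2)
    log_prior = -np.log(e)                                # vague log-uniform
    return log_lik + log_prior

# Random-walk Metropolis over the single exposure parameter.
samples, e = [], 10.0
for _ in range(5000):
    prop = e + rng.normal(0, 2.0)
    if np.log(rng.uniform()) < log_post(prop) - log_post(e):
        e = prop
    samples.append(e)

post = np.array(samples[1000:])  # discard burn-in
print(round(post.mean(), 1))     # posterior mean exposure
```

In the full framework the biomarker-prediction line becomes a PBPK simulation, global sensitivity analysis selects which model parameters to sample, and the posterior is reported over a population rather than a single exposure.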

  5. Reconstruction of Exposure to m-Xylene from Human Biomonitoring Data Using PBPK Modelling, Bayesian Inference, and Markov Chain Monte Carlo Simulation

    Directory of Open Access Journals (Sweden)

    Kevin McNally

    2012-01-01

    Full Text Available There are numerous biomonitoring programs, both recent and ongoing, to evaluate environmental exposure of humans to chemicals. Due to the lack of exposure and kinetic data, the correlation of biomarker levels with exposure concentrations leads to difficulty in utilizing biomonitoring data for biological guidance values. Exposure reconstruction or reverse dosimetry is the retrospective interpretation of external exposure consistent with biomonitoring data. We investigated the integration of physiologically based pharmacokinetic modelling, global sensitivity analysis, Bayesian inference, and Markov chain Monte Carlo simulation to obtain a population estimate of inhalation exposure to m-xylene. We used exhaled breath and venous blood m-xylene and urinary 3-methylhippuric acid measurements from a controlled human volunteer study in order to evaluate the ability of our computational framework to predict known inhalation exposures. We also investigated the importance of model structure and dimensionality with respect to its ability to reconstruct exposure.

  6. Multichannel Signals Reconstruction Based on Tunable Q-Factor Wavelet Transform-Morphological Component Analysis and Sparse Bayesian Iteration for Rotating Machines

    Directory of Open Access Journals (Sweden)

    Qing Li

    2018-04-01

    Full Text Available High-speed remote transmission and large-capacity data storage are difficult issues in signal acquisition for condition monitoring of rotating machines. To address these concerns, a novel multichannel signal reconstruction approach based on tunable Q-factor wavelet transform-morphological component analysis (TQWT-MCA) and a sparse Bayesian iteration algorithm combined with a step-impulse dictionary is proposed under the frame of compressed sensing (CS). To begin with, to prevent the loss of periodical impulses and effectively separate periodical impulses from the external noise and additive interference components, the TQWT-MCA method is introduced to divide the raw vibration signal into a low-resonance component (LRC, i.e., periodical impulses) and a high-resonance component (HRC); thus, the periodical impulses are preserved effectively. Then, according to the amplitude range of the generated LRC, the step-impulse dictionary atom is designed to match the physical structure of the periodical impulses. Furthermore, the periodical impulses and HRC are reconstructed by the sparse Bayesian iteration combined with the step-impulse dictionary, respectively; finally, the final reconstructed raw signals are obtained by adding the LRC and HRC, and the fidelity of the final reconstructed signals is tested by the envelope spectrum and error analysis, respectively. In this work, the proposed algorithm is applied to a simulated signal and to engineering multichannel signals of a gearbox with multiple faults. Experimental results demonstrate that the proposed approach significantly improves the reconstruction accuracy compared with state-of-the-art methods such as non-convex Lq (q = 0.5) regularization, spatiotemporal sparse Bayesian learning (SSBL) and L1-norm, etc. Additionally, the processing speed, i.e., the speed of storage and transmission, has increased dramatically; more importantly, the fault characteristics of the gearbox with multiple faults are detected and saved, i.e., the

  7. Bayesian phylogeography of influenza A/H3N2 for the 2014-15 season in the United States using three frameworks of ancestral state reconstruction.

    Directory of Open Access Journals (Sweden)

    Daniel Magee

    2017-02-01

    Full Text Available Ancestral state reconstructions in Bayesian phylogeography of virus pandemics have been improved by utilizing a Bayesian stochastic search variable selection (BSSVS framework. Recently, this framework has been extended to model the transition rate matrix between discrete states as a generalized linear model (GLM of genetic, geographic, demographic, and environmental predictors of interest to the virus and incorporating BSSVS to estimate the posterior inclusion probabilities of each predictor. Although the latter appears to enhance the biological validity of ancestral state reconstruction, there has yet to be a comparison of phylogenies created by the two methods. In this paper, we compare these two methods, while also using a primitive method without BSSVS, and highlight the differences in phylogenies created by each. We test six coalescent priors and six random sequence samples of H3N2 influenza during the 2014-15 flu season in the U.S. We show that the GLMs yield significantly greater root state posterior probabilities than the two alternative methods under five of the six priors, and significantly greater Kullback-Leibler divergence values than the two alternative methods under all priors. Furthermore, the GLMs strongly implicate temperature and precipitation as driving forces of this flu season and nearly unanimously identified a single root state, which exhibits the most tropical climate during a typical flu season in the U.S. The GLM, however, appears to be highly susceptible to sampling bias compared with the other methods, which casts doubt on whether its reconstructions should be favored over those created by alternate methods. We report that a BSSVS approach with a Poisson prior demonstrates less bias toward sample size under certain conditions than the GLMs or primitive models, and believe that the connection between reconstruction method and sampling bias warrants further investigation.

  8. msBayes: Pipeline for testing comparative phylogeographic histories using hierarchical approximate Bayesian computation

    Directory of Open Access Journals (Sweden)

    Takebayashi Naoki

    2007-07-01

    Full Text Available Abstract Background Although testing for simultaneous divergence (vicariance) across different population-pairs that span the same barrier to gene flow is of central importance to evolutionary biology, researchers often equate the gene tree and population/species tree, thereby ignoring stochastic coalescent variance in their conclusions of temporal incongruence. In contrast to other available phylogeographic software packages, msBayes is the only one that analyses data from multiple species/population pairs under a hierarchical model. Results msBayes employs approximate Bayesian computation (ABC) under a hierarchical coalescent model to test for simultaneous divergence (TSD) in multiple co-distributed population-pairs. Simultaneous isolation is tested by estimating three hyper-parameters that characterize the degree of variability in divergence times across co-distributed population pairs, while allowing for variation in various within-population-pair demographic parameters (sub-parameters) that can affect the coalescent. msBayes is a software package consisting of several C and R programs that are run with a Perl "front-end". Conclusion The method reasonably distinguishes simultaneous isolation from temporal incongruence in the divergence of co-distributed population pairs, even with sparse sampling of individuals. Because the estimation step is decoupled from the simulation step, one can rapidly evaluate different ABC acceptance/rejection conditions and the choice of summary statistics. Given the complex and idiosyncratic nature of testing multi-species biogeographic hypotheses, we envision msBayes as a powerful and flexible tool for tackling a wide array of difficult research questions that use population genetic data from multiple co-distributed species. The msBayes pipeline is available for download at http://msbayes.sourceforge.net/ under an open source license (GNU Public License).
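The hierarchical machinery aside, the core ABC rejection idea that msBayes builds on is compact enough to sketch: draw a hyper-parameter from its prior, simulate data under it, and keep the draw only if a summary statistic lands near the observed one. The example below is a toy with one hyper-parameter and a trivial simulator, not the msBayes pipeline.

```python
import numpy as np

rng = np.random.default_rng(5)

# "Observed" divergence times for three co-distributed population pairs
# (invented), summarized by their dispersion.
obs_times = np.array([1.0, 1.1, 0.9])
obs_stat = obs_times.std()

accepted = []
for _ in range(20000):
    spread = rng.uniform(0.0, 1.0)         # prior on the spread hyper-parameter
    sim = rng.normal(1.0, spread, size=3)  # simulate divergence times
    if abs(sim.std() - obs_stat) < 0.02:   # ABC rejection: keep close matches
        accepted.append(spread)

post = np.array(accepted)  # approximate posterior sample for the spread
```

A small observed dispersion pulls the posterior toward low spread, i.e., toward simultaneous divergence; msBayes does the same with coalescent simulators, multiple summary statistics, and hyper-parameters for both the mean and the variability of divergence times.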

  9. Multilocus phylogeny reconstruction: new insights into the evolutionary history of the genus Petunia.

    Science.gov (United States)

    Reck-Kortmann, Maikel; Silva-Arias, Gustavo Adolfo; Segatto, Ana Lúcia Anversa; Mäder, Geraldo; Bonatto, Sandro Luis; de Freitas, Loreta Brandão

    2014-12-01

    The phylogeny of Petunia species has been difficult to resolve, primarily due to the recent diversification of the genus. Several studies have included molecular data in phylogenetic reconstructions of this genus, but all of them have failed to include all taxa and/or analyzed few genetic markers. In the present study, we employed the most inclusive genetic and taxonomic datasets for the genus, aiming to reconstruct the evolutionary history of Petunia based on molecular phylogeny, biogeographic distribution, and character evolution. We included all 20 Petunia morphological species or subspecies in these analyses. Based on nine nuclear and five plastid DNA markers, our phylogenetic analysis reinforces the monophyly of the genus Petunia and supports the hypothesis that the basal divergence is more related to the differentiation of corolla tube length, whereas the geographic distribution of species is more related to divergences within these main clades. Ancestral area reconstructions suggest the Pampas region as the area of origin and earliest divergence in Petunia. The state reconstructions suggest that the ancestor of Petunia might have had a short corolla tube and a bee pollination floral syndrome. Copyright © 2014 Elsevier Inc. All rights reserved.

  10. Reconstruction of tritium release history from contaminated groundwater using tree ring analysis

    International Nuclear Information System (INIS)

    Kalin, R.M.; Murphy, C.E. Jr.; Hall, G.

    1995-01-01

    The history of tritium releases to the groundwater from buried waste was reconstructed through dendrochronology. Wood from dated tree rings was sectioned from a cross-section of a tree that was thought to tap the groundwater. Cellulose was chemically separated from the wood. The cellulose was combusted and the water of combustion collected for liquid scintillation counting. The tritium concentration in the rings rose rapidly after 1972 which was prior to the first measurements made in this area. Trends in the tritium concentration of water outcropping to the surface are similar to the trends in tritium concentration in tree rings. 14 refs., 3 figs

  11. Bayesian prediction and adaptive sampling algorithms for mobile sensor networks online environmental field reconstruction in space and time

    CERN Document Server

    Xu, Yunfei; Dass, Sarat; Maiti, Tapabrata

    2016-01-01

    This brief introduces a class of problems and models for the prediction of the scalar field of interest from noisy observations collected by mobile sensor networks. It also introduces the problem of optimal coordination of robotic sensors to maximize the prediction quality subject to communication and mobility constraints either in a centralized or distributed manner. To solve such problems, fully Bayesian approaches are adopted, allowing various sources of uncertainties to be integrated into an inferential framework effectively capturing all aspects of variability involved. The fully Bayesian approach also allows the most appropriate values for additional model parameters to be selected automatically by data, and the optimal inference and prediction for the underlying scalar field to be achieved. In particular, spatio-temporal Gaussian process regression is formulated for robotic sensors to fuse multifactorial effects of observations, measurement noise, and prior distributions for obtaining the predictive di...
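The spatio-temporal Gaussian process regression mentioned above reduces, at its simplest, to computing a GP posterior mean from noisy observations. The sketch below is a generic 1-D illustration with a squared-exponential kernel and a small hand-rolled linear solver, not the book's algorithms or any sensor-network code:

```python
import math

def rbf_kernel(x1, x2, length_scale=1.0, variance=1.0):
    """Squared-exponential covariance between two scalar inputs."""
    return variance * math.exp(-0.5 * ((x1 - x2) / length_scale) ** 2)

def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def gp_posterior_mean(train_x, train_y, test_x, noise=0.1):
    """Posterior mean of a zero-mean GP at test_x, given noisy observations:
    mean(t) = k(t, X) @ (K + noise^2 I)^{-1} y."""
    K = [[rbf_kernel(xi, xj) + (noise ** 2 if i == j else 0.0)
          for j, xj in enumerate(train_x)] for i, xi in enumerate(train_x)]
    alpha = solve(K, train_y)  # (K + noise^2 I)^{-1} y
    return [sum(rbf_kernel(t, xi) * a for xi, a in zip(train_x, alpha))
            for t in test_x]

xs = [0.0, 1.0, 2.0, 3.0]
ys = [0.0, 0.8, 0.9, 0.1]  # noisy samples of a smooth field
mean_at_15 = gp_posterior_mean(xs, ys, [1.5])[0]
print(round(mean_at_15, 2))
```

The fully Bayesian treatment in the book additionally places priors on kernel hyperparameters (length scale, variance, noise) rather than fixing them as done here.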

  12. Isotopic reconstruction of the weaning process in the archaeological population of Canímar Abajo, Cuba: A Bayesian probability mixing model approach.

    Directory of Open Access Journals (Sweden)

    Yadira Chinique de Armas

    Full Text Available The general lack of well-preserved juvenile skeletal remains from Caribbean archaeological sites has, in the past, prevented evaluations of juvenile dietary changes. Canímar Abajo (Cuba), with a large number of well-preserved juvenile and adult skeletal remains, provided a unique opportunity to fully assess juvenile paleodiets from an ancient Caribbean population. Ages for the start and the end of weaning and possible food sources used for weaning were inferred by combining the results of two Bayesian probability models that help to reduce some of the uncertainties inherent to bone collagen isotope based paleodiet reconstructions. Bone collagen (31 juveniles, 18 adult females) was used for carbon and nitrogen isotope analyses. The isotope results were assessed using two Bayesian probability models: Weaning Ages Reconstruction with Nitrogen isotopes and Stable Isotope Analyses in R. Breast milk seems to have been the most important protein source until two years of age, with some supplementary food such as tropical fruits and root cultigens likely introduced earlier. After two, juvenile diets were likely continuously supplemented by starch rich foods such as root cultigens and legumes. By the age of three, the model results suggest that the weaning process was completed. Additional indications suggest that animal marine/riverine protein and maize, while part of the Canímar Abajo female diets, were likely not used to supplement juvenile diets. The combined use of both models here provided a more complete assessment of the weaning process for an ancient Caribbean population, indicating not only the start and end ages of weaning but also the relative importance of different food sources for different age juveniles.

  13. Inferring Population Size History from Large Samples of Genome-Wide Molecular Data - An Approximate Bayesian Computation Approach.

    Directory of Open Access Journals (Sweden)

    Simon Boitard

    2016-03-01

    Full Text Available Inferring the ancestral dynamics of effective population size is a long-standing question in population genetics, which can now be tackled much more accurately thanks to the massive genomic data available in many species. Several promising methods that take advantage of whole-genome sequences have been recently developed in this context. However, they can only be applied to rather small samples, which limits their ability to estimate recent population size history. Besides, they can be very sensitive to sequencing or phasing errors. Here we introduce a new approximate Bayesian computation approach named PopSizeABC that allows estimating the evolution of the effective population size through time, using a large sample of complete genomes. This sample is summarized using the folded allele frequency spectrum and the average zygotic linkage disequilibrium at different bins of physical distance, two classes of statistics that are widely used in population genetics and can be easily computed from unphased and unpolarized SNP data. Our approach provides accurate estimations of past population sizes, from the very first generations before present back to the expected time to the most recent common ancestor of the sample, as shown by simulations under a wide range of demographic scenarios. When applied to samples of 15 or 25 complete genomes in four cattle breeds (Angus, Fleckvieh, Holstein and Jersey), PopSizeABC revealed a series of population declines, related to historical events such as domestication or modern breed creation. We further highlight that our approach is robust to sequencing errors, provided summary statistics are computed from SNPs with common alleles.
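The folded allele frequency spectrum used above as a summary statistic is attractive precisely because it needs no ancestral/derived polarization. A minimal sketch of its computation from per-SNP allele counts (the function name and toy data are illustrative, not PopSizeABC's code):

```python
def folded_afs(allele_counts, n_haploid):
    """Folded allele frequency spectrum: a histogram of minor-allele
    counts across SNPs. `allele_counts` gives, per SNP, the count of one
    arbitrary allele out of `n_haploid` sampled chromosomes; folding
    (taking the minor of count and n - count) removes the need to know
    which allele is ancestral."""
    spectrum = [0] * (n_haploid // 2 + 1)
    for count in allele_counts:
        minor = min(count, n_haploid - count)
        spectrum[minor] += 1
    return spectrum

# 6 sampled chromosomes, unpolarized allele counts at 6 SNPs:
print(folded_afs([1, 5, 2, 3, 3, 1], n_haploid=6))  # prints [0, 3, 1, 2]
```

Counts of 1 and 5 fold to the same minor-allele class, which is why the spectrum has only n/2 + 1 bins.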

  14. Reconstructing the Evolutionary History of Paralogous APETALA1/FRUITFULL-Like Genes in Grasses (Poaceae)

    Science.gov (United States)

    Preston, Jill C.; Kellogg, Elizabeth A.

    2006-01-01

    Gene duplication is an important mechanism for the generation of evolutionary novelty. Paralogous genes that are not silenced may evolve new functions (neofunctionalization) that will alter the developmental outcome of preexisting genetic pathways, partition ancestral functions (subfunctionalization) into divergent developmental modules, or function redundantly. Functional divergence can occur by changes in the spatio-temporal patterns of gene expression and/or by changes in the activities of their protein products. We reconstructed the evolutionary history of two paralogous monocot MADS-box transcription factors, FUL1 and FUL2, and determined the evolution of sequence and gene expression in grass AP1/FUL-like genes. Monocot AP1/FUL-like genes duplicated at the base of Poaceae and codon substitutions occurred under relaxed selection mostly along the branch leading to FUL2. Following the duplication, FUL1 was apparently lost from early diverging taxa, a pattern consistent with major changes in grass floral morphology. Overlapping gene expression patterns in leaves and spikelets indicate that FUL1 and FUL2 probably share some redundant functions, but that FUL2 may have become temporally restricted under partial subfunctionalization to particular stages of floret development. These data have allowed us to reconstruct the history of AP1/FUL-like genes in Poaceae and to hypothesize a role for this gene duplication in the evolution of the grass spikelet. PMID:16816429

  15. An Object-Based Approach for Fire History Reconstruction by Using Three Generations of Landsat Sensors

    Directory of Open Access Journals (Sweden)

    Thomas Katagis

    2014-06-01

    Full Text Available In this study, the capability of geographic object-based image analysis (GEOBIA) in the reconstruction of the recent fire history of a typical Mediterranean area was investigated. More specifically, a semi-automated GEOBIA procedure was developed and tested on archived and newly acquired Landsat Multispectral Scanner (MSS), Thematic Mapper (TM), and Operational Land Imager (OLI) images in order to accurately map burned areas in the Mediterranean island of Thasos. The developed GEOBIA ruleset was built with the use of the TM image and then applied to the other two images. This process of transferring the ruleset did not require substantial adjustments or any replacement of the initially selected features used for the classification, thus displaying reduced complexity in processing the images. As a result, burned area maps of very high accuracy (over 94% overall) were produced. In addition to the standard error matrix, the employment of additional measures of agreement between the produced maps and the reference data revealed that “spatial misplacement” was the main source of classification error. It can be concluded that the proposed approach can be potentially used for reconstructing the recent (40-year) fire history in the Mediterranean, based on extended time series of Landsat or similar data.

  16. Bayesian Penalized Likelihood Image Reconstruction (Q.Clear) in 82Rb Cardiac PET: Impact of Count Statistics

    DEFF Research Database (Denmark)

    Christensen, Nana Louise; Tolbod, Lars Poulsen

    Aim: Q.Clear reconstruction is expected to improve detection of perfusion defects in cardiac PET due to the high degree of image convergence and effective noise suppression. However, 82Rb (T½=76s) poses a special problem, since count statistics vary significantly not only between patients… statistics using a cardiac PET phantom as well as a selection of clinical patients referred for 82Rb cardiac PET. Methods: The study consisted of 3 parts: 1) A thorax-cardiac phantom was scanned for 10 minutes after injection of 1110 MBq 82Rb. Frames at 3 different times after infusion were reconstructed… 3) Static and dynamic images from a set of 7 patients (BSA: 1.6-2.2 m2) referred for 82Rb cardiac PET were analyzed using a range of beta factors. Results were compared to the institution’s standard clinical practice reconstruction protocol. All scans were performed on GE DMI Digital… PET scans.

  17. The history of the Society of Urodynamics, Female Pelvic Medicine, and Urogenital Reconstruction.

    Science.gov (United States)

    Weissbart, Steven J; Zimmern, Philippe E; Nitti, Victor W; Lemack, Gary E; Kobashi, Kathleen C; Vasavada, Sandip P; Wein, Alan J

    2018-03-25

    To review the history of the Society of Urodynamics, Female Pelvic Medicine and Urogenital Reconstruction (SUFU). We reviewed Society meeting minutes, contacted all living former Society presidents, searched the William P. Didusch Center for Urology History records, and asked Society members to share their important Society experiences in order to gather important historical information about the Society. The Society initially formed as the Urodynamics Society in 1969 in the backdrop of a growing passion for scientific research in the country after World War II ended. Since then, Society meetings have provided a pivotal forum for the advancement of science in lower urinary tract dysfunction. Meetings occurred annually until 2004, when the meeting schedule increased to biannual. The journal, Neurourology and Urodynamics, became the official journal of the Society in 2005. SUFU has authored important guidelines on urodynamics (2012), non-neurogenic overactive bladder (2012), and stress urinary incontinence (2017) and has shared important collaborations with other societies, including the American Urological Association (AUA), the International Continence Society (ICS), and the International Society of Pelvic Neuromodulation (ISPiN). SUFU has also been instrumental in trainee education and helped to establish formal fellowship training in the field in addition to holding a yearly educational meeting for urology residents. The Society has been led by 21 presidents throughout its history. Throughout the Society's near half-century long existence, the Society has fostered research, published guidelines, and educated trainees in order to improve the care of individuals suffering from lower urinary tract dysfunction. © 2018 Wiley Periodicals, Inc.

  18. Backlash against American psychology: an indigenous reconstruction of the history of German critical psychology.

    Science.gov (United States)

    Teo, Thomas

    2013-02-01

    After suggesting that all psychologies contain indigenous qualities and discussing differences and commonalities between German and North American historiographies of psychology, an indigenous reconstruction of German critical psychology is applied. It is argued that German critical psychology can be understood as a backlash against American psychology, as a response to the Americanization of German psychology after WWII, on the background of the history of German psychology, the academic impact of the Cold War, and the trajectory of personal biographies and institutions. Using an intellectual-historical perspective, it is shown how and which indigenous dimensions played a role in the development of German critical psychology as well as the limitations to such an historical approach. Expanding from German critical psychology, the role of the critique of American psychology in various contexts around the globe is discussed in order to emphasize the relevance of indigenous historical research.

  19. Reconstructions of human history by mapping dental markers in living Eurasian populations

    Science.gov (United States)

    Kashibadze, Vera F.; Nasonova, Olga G.; Nasonov, Dmitry S.

    2013-01-01

    Using advances in gene geography and anthropophenetics, the phenogeographical method for anthropological research was initiated and developed using dental data. Statistical and cartographical analyses are provided for 498 living Eurasian populations. Mapping principal components supplied evidence for the phene pool structure of Eurasian populations, and for reconstructions of Homo sapiens history on the continent. Longitudinal variability seems to be the most important regularity revealed by principal components analysis (PCA) and mapping, indicating the division of the whole area into western and eastern main provinces. Thus, the most ancient scenario in the history of Eurasian populations developed from two distinct groups: a western group related to ancient populations of West Asia, and an eastern one rooted in ancestry in South and/or East Asia. In spite of the enormous territory and the revealed divergence, the populations of the continent have undergone wide-scale and intensive time-space interaction. Many details in the revealed landscapes form the background to different historical events. Migrations and assimilation are two essential phenomena in Eurasian history: the spread of the western combination through the whole continent to the Pacific coastline, and the movement of paradoxical combinations of eastern and western markers from South or Central Asia to the east and west. Taking into account that no additional eastern combinations have been found in the total variation of Asian groups, only mixed or western marker sets, and that eastern dental characteristics have been traced in Asia since Homo erectus, the assumption is made in favour of hetero-level assimilation in the eastern province and of net-like evolution of H. sapiens.

  20. The Bayesian reconstruction of the in-medium heavy quark potential from lattice QCD and its stability

    Science.gov (United States)

    Burnier, Yannis; Kaczmarek, Olaf; Rothkopf, Alexander

    2016-01-01

    We report recent results of a non-perturbative determination of the static heavy-quark potential in quenched and dynamical lattice QCD at finite temperature. The real and imaginary part of this complex quantity are extracted from the spectral function of Wilson line correlators in Coulomb gauge. To obtain spectral information from Euclidean time numerical data, our study relies on a novel Bayesian prescription that differs from the Maximum Entropy Method. We perform simulations on quenched 32³ × Nτ (β = 7.0, ξ = 3.5) lattices with Nτ = 24, …, 96, which cover 839MeV ≥ T ≥ 210MeV. To investigate the potential in a quark-gluon plasma with light u,d and s quarks we utilize Nf = 2 + 1 ASQTAD lattices with ml = ms/20 by the HotQCD collaboration, giving access to temperatures between 286MeV ≥ T ≥ 148MeV. The real part of the potential exhibits a clean transition from a linear, confining behavior in the hadronic phase to a Debye screened form above deconfinement. Interestingly its values lie close to the color singlet free energies in Coulomb gauge at all temperatures. We estimate the imaginary part on quenched lattices and find that it is of the same order of magnitude as in hard-thermal loop perturbation theory. From among all the systematic checks carried out in our study, we discuss explicitly the dependence of the result on the default model and the number of datapoints.

  1. Reconstructing the past outburst history of Eta Carinae from WFPC2 proper motions

    Science.gov (United States)

    Smith, Nathan

    2016-10-01

    The HST archive contains multiple epochs of WFPC2 images of the nebula around Eta Carinae taken over a 15-year timespan, although only the earliest few years of data have been analyzed and published. The fact that all these images were taken with the same instrument, with the same pixel sampling and field distortion, makes them an invaluable resource for accurately measuring the expanding ejecta. The goal of a previously accepted AR proposal was to analyze the full set of appropriate continuum-filter HST images to place precise constraints on the average ejection date of the Homunculus Nebula; this analysis is now complete (Smith et al 2016) and the nebula appears to have been ejected in the second half of 1847. Here we propose to continue this project by constraining the motion of the more extended and much older Outer Ejecta around Eta Carinae. Older material outside the main bipolar nebula traces previous major outbursts of the star with no recorded historical observations. We propose an ambitious reduction and analysis of the complete WFPC2 imaging dataset of Eta Car. These data can reconstruct its violent mass-loss history over the past thousand years. We have already started this by analyzing two epochs of ACS F658N images, and astonishingly, these data suggested two previous eruptions in the 13th and 15th centuries assuming ballistic motion. WFPC2 images will extend the baseline by 10 yr, and critically, more than 2 epochs allow us to measure any deceleration in the ejecta. We will also analyze Doppler shifts in ground-based spectra in order to reconstruct the 3D geometry of past mass ejection. This AR proposal will fund the final year of a PhD thesis.

  2. RECONSTRUCTING THE PHOTOMETRIC LIGHT CURVES OF EARTH AS A PLANET ALONG ITS HISTORY

    International Nuclear Information System (INIS)

    Sanromá, E.; Pallé, E.

    2012-01-01

    By utilizing satellite-based estimations of the distribution of clouds, we have studied Earth's large-scale cloudiness behavior according to latitude and surface types (ice, water, vegetation, and desert). These empirical relationships are used here to reconstruct the possible cloud distribution of past epochs of Earth's history such as the Late Cretaceous (90 Ma ago), the Late Triassic (230 Ma ago), the Mississippian (340 Ma ago), and the Late Cambrian (500 Ma ago), when the landmass distributions were different from today's. With this information, we have been able to simulate the globally integrated photometric variability of the planet at these epochs. We find that our simple model reproduces well the observed cloud distribution and albedo variability of the modern Earth. Moreover, the model suggests that the photometric variability of the Earth was probably much larger in past epochs. This enhanced photometric variability could improve the chances for the difficult determination of the rotational period and the identification of continental landmasses for distant planets.

  3. Reconstructing the complex evolutionary history of mobile plasmids in red algal genomes

    Science.gov (United States)

    Lee, JunMo; Kim, Kyeong Mi; Yang, Eun Chan; Miller, Kathy Ann; Boo, Sung Min; Bhattacharya, Debashish; Yoon, Hwan Su

    2016-01-01

    The integration of foreign DNA into algal and plant plastid genomes is a rare event, with only a few known examples of horizontal gene transfer (HGT). Plasmids, which are well-studied drivers of HGT in prokaryotes, have been reported previously in red algae (Rhodophyta). However, the distribution of these mobile DNA elements and their sites of integration into the plastid (ptDNA), mitochondrial (mtDNA), and nuclear genomes of Rhodophyta remain unknown. Here we reconstructed the complex evolutionary history of plasmid-derived DNAs in red algae. Comparative analysis of 21 rhodophyte ptDNAs, including new genome data for 5 species, turned up 22 plasmid-derived open reading frames (ORFs) that showed syntenic and copy number variation among species, but were conserved within different individuals in three lineages. Several plasmid-derived homologs were found not only in ptDNA but also in mtDNA and in the nuclear genome of green plants, stramenopiles, and rhizarians. Phylogenetic and plasmid-derived ORF analyses showed that the majority of plasmid DNAs originated within red algae, whereas others were derived from cyanobacteria, other bacteria, and viruses. Our results elucidate the evolution of plasmid DNAs in red algae and suggest that they spread as parasitic genetic elements. This hypothesis is consistent with their sporadic distribution within Rhodophyta. PMID:27030297

  4. Bayesian analysis in plant pathology.

    Science.gov (United States)

    Mila, A L; Carriquiry, A L

    2004-09-01

    Bayesian methods are currently much discussed and applied in several disciplines from molecular biology to engineering. Bayesian inference is the process of fitting a probability model to a set of data and summarizing the results via probability distributions on the parameters of the model and unobserved quantities such as predictions for new observations. In this paper, after a short introduction to Bayesian inference, we present the basic features of Bayesian methodology using examples from sequencing genomic fragments and analyzing microarray gene-expression levels, reconstructing disease maps, and designing experiments.
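The "fitting a probability model and summarizing via probability distributions" step described above has a closed form in conjugate cases. A minimal sketch of a Beta-Binomial update, with hypothetical plant-infection counts chosen purely for illustration:

```python
def beta_binomial_posterior(alpha, beta, successes, failures):
    """Conjugate Bayesian update: a Beta(alpha, beta) prior on a proportion,
    combined with a binomial likelihood, yields a
    Beta(alpha + successes, beta + failures) posterior."""
    return alpha + successes, beta + failures

# Hypothetical survey: uniform Beta(1, 1) prior on the fraction of infected
# plants in a field; 12 of 40 sampled plants show symptoms.
a, b = beta_binomial_posterior(1.0, 1.0, successes=12, failures=28)
posterior_mean = a / (a + b)  # Beta mean = a / (a + b)
print(round(posterior_mean, 2))  # prints 0.31
```

The posterior mean (13/42 ≈ 0.31) sits between the prior mean (0.5) and the sample proportion (0.30), with the data dominating as the sample grows; non-conjugate models, like those in the paper's disease-mapping examples, require simulation methods instead.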

  5. Bayesian phylogeography finds its roots.

    Directory of Open Access Journals (Sweden)

    Philippe Lemey

    2009-09-01

    Full Text Available As a key factor in endemic and epidemic dynamics, the geographical distribution of viruses has been frequently interpreted in the light of their genetic histories. Unfortunately, inference of historical dispersal or migration patterns of viruses has mainly been restricted to model-free heuristic approaches that provide little insight into the temporal setting of the spatial dynamics. The introduction of probabilistic models of evolution, however, offers unique opportunities to engage in this statistical endeavor. Here we introduce a Bayesian framework for inference, visualization and hypothesis testing of phylogeographic history. By implementing character mapping in Bayesian software that samples time-scaled phylogenies, we enable the reconstruction of timed viral dispersal patterns while accommodating phylogenetic uncertainty. Standard Markov model inference is extended with a stochastic search variable selection procedure that identifies the parsimonious descriptions of the diffusion process. In addition, we propose priors that can incorporate geographical sampling distributions or characterize alternative hypotheses about the spatial dynamics. To visualize the spatial and temporal information, we summarize inferences using virtual globe software. We describe how Bayesian phylogeography compares with previous parsimony analysis in the investigation of the influenza A H5N1 origin and H5N1 epidemiological linkage among sampling localities. Analysis of rabies in West African dog populations reveals how virus diffusion may enable endemic maintenance through continuous epidemic cycles. From these analyses, we conclude that our phylogeographic framework will be an important asset in molecular epidemiology that can be easily generalized to infer biogeography from genetic data for many organisms.

  6. Source Reconstruction of Brain Potentials Using Bayesian Model Averaging to Analyze Face Intra-Domain vs. Face-Occupation Cross-Domain Processing.

    Science.gov (United States)

    Olivares, Ela I; Lage-Castellanos, Agustín; Bobes, María A; Iglesias, Jaime

    2018-01-01

    We investigated the neural correlates of the access to and retrieval of face structure information in contrast to those concerning the access to and retrieval of person-related verbal information, triggered by faces. We experimentally induced stimulus familiarity via a systematic learning procedure including faces with and without associated verbal information. Then, we recorded event-related potentials (ERPs) in both intra-domain (face-feature) and cross-domain (face-occupation) matching tasks while N400-like responses were elicited by incorrect eyes-eyebrows completions and occupations, respectively. A novel Bayesian source reconstruction approach plus conjunction analysis of group effects revealed that in both cases the generated N170s were of similar amplitude but had different neural origin. Thus, whereas the N170 of faces was associated predominantly to right fusiform and occipital regions (the so-called "Fusiform Face Area", "FFA" and "Occipital Face Area", "OFA", respectively), the N170 of occupations was associated to a bilateral very posterior activity, suggestive of basic perceptual processes. Importantly, the right-sided perceptual P200 and the face-related N250 were evoked exclusively in the intra-domain task, with sources in OFA and extensively in the fusiform region, respectively. Regarding later latencies, the intra-domain N400 seemed to be generated in right posterior brain regions encompassing mainly OFA and, to some extent, the FFA, likely reflecting neural operations triggered by structural incongruities. In turn, the cross-domain N400 was related to more anterior left-sided fusiform and temporal inferior sources, paralleling those described previously for the classic verbal N400. These results support the existence of differentiated neural streams for face structure and person-related verbal processing triggered by faces, which can be activated differentially according to specific task demands.

  7. Source Reconstruction of Brain Potentials Using Bayesian Model Averaging to Analyze Face Intra-Domain vs. Face-Occupation Cross-Domain Processing

    Directory of Open Access Journals (Sweden)

    Ela I. Olivares

    2018-03-01

    Full Text Available We investigated the neural correlates of the access to and retrieval of face structure information in contrast to those concerning the access to and retrieval of person-related verbal information, triggered by faces. We experimentally induced stimulus familiarity via a systematic learning procedure including faces with and without associated verbal information. Then, we recorded event-related potentials (ERPs) in both intra-domain (face-feature) and cross-domain (face-occupation) matching tasks while N400-like responses were elicited by incorrect eyes-eyebrows completions and occupations, respectively. A novel Bayesian source reconstruction approach plus conjunction analysis of group effects revealed that in both cases the generated N170s were of similar amplitude but had different neural origin. Thus, whereas the N170 of faces was associated predominantly to right fusiform and occipital regions (the so-called “Fusiform Face Area”, “FFA” and “Occipital Face Area”, “OFA”, respectively), the N170 of occupations was associated to a bilateral very posterior activity, suggestive of basic perceptual processes. Importantly, the right-sided perceptual P200 and the face-related N250 were evoked exclusively in the intra-domain task, with sources in OFA and extensively in the fusiform region, respectively. Regarding later latencies, the intra-domain N400 seemed to be generated in right posterior brain regions encompassing mainly OFA and, to some extent, the FFA, likely reflecting neural operations triggered by structural incongruities. In turn, the cross-domain N400 was related to more anterior left-sided fusiform and temporal inferior sources, paralleling those described previously for the classic verbal N400. These results support the existence of differentiated neural streams for face structure and person-related verbal processing triggered by faces, which can be activated differentially according to specific task demands.

  8. Reconstruction of the building history of the Demidovs’ estate “Almazovo” situated near Moscow

    Directory of Open Access Journals (Sweden)

    Aksenova Irina Vasil’evna

    2014-03-01

    Full Text Available The relevance of the topic covered in the article lies not only in the necessity of reviving national cultural traditions, but also in the possibility of using restored historical objects in modern life as multifunctional cultural and touristic complexes. At present this is one of the most promising directions in tourism, the entertainment industry and educational programmes. The revival of historical estates and cultural traditions is an insufficiently used but inexhaustible source for the economic and cultural development of Russian regions, and attracting investment makes it possible to preserve ancient buildings for the future. The Demidovs’ estate “Sergievskaya dacha” in Almazovo (it belonged to the Demidovs, the Ural owners of mines and metallurgical works) is an object of historic and cultural interest of Federal significance and is of great scientific, educational and architectural value. To date, the published information about the estate is laconic and sometimes contradictory. The article presents the results of historical and architectural research based upon a detailed study of literary materials and especially archives. All building stages of the estate are considered. The author has discovered unique unpublished drafts of demolished and never-erected (because of a disease of the owner) buildings and elements of landscape architecture, which form an entity with the whole complex. The scientific importance of the research carried out by the author lies in the possibility of reconstructing, with a high degree of trustworthiness, the building history of the whole estate complex. The volume of the obtained information makes it possible to consider restoring the estate and to work out a project for its new contemporary usage as a museum of the noble family way of life. This would encourage the development of tourism in the region and attract investment to preserve the estate.

  9. Reconstruction of fire history of the Yukon-Kuskokwim Delta, Alaska

    Science.gov (United States)

    Sae-lim, J.; Mann, P. J.; Russell, J. M.; Natali, S.; Vachula, R. S.; Schade, J. D.; Holmes, R. M.

    2017-12-01

    Wildfire is an important disturbance in Arctic ecosystems and can cause abrupt perturbations in global carbon cycling and atmospheric chemistry. Over the next few decades, arctic fire frequency, intensity and extent is projected to increase due to anthropogenic climate change, as regional air temperatures are increasing at more than twice the global average. In order to more accurately predict the anthropogenic impacts of climate change on tundra fire regimes, it is critical to have detailed knowledge of the natural frequency and extent of past wildfires. However, reliable historical records only extend back a few hundred years, whereas climatic shifts have affected fire regimes for thousands of years. In this work we analyzed a lake sediment core collected from the Yukon-Kuskokwim (YK) Delta, Alaska, a region that has recently experienced fire and is predicted to see increasing fire frequency in the near future. Our primary lake site is situated adjacent to recent burned areas, providing a `calibration' point and also attesting to the sensitivity of the sites. We used charcoal counts alongside geochemical measurements (C:N, 13C, 15N, 210Pb, X-ray fluorescence analyses of elemental chemistry) to reconstruct past fire history and ecosystem responses during the late Holocene. Average C (%C) and N concentrations (%N) in the core were 8.10 ±0.74% and 0.48 ±0.05%, resulting in C:N ratios of 16.80 ±0.74. The values are generally stable, with no obvious trend in C, N or C:N with depth; however, fluctuations in sediment composition in other nearby lake sediment cores possibly suggests shifts in lake conditions (oxic, anoxic) over time that might result from paleofires. This study provides a baseline for estimated fire return intervals in the YK Delta that will support more accurate projections of tundra fire frequencies under a changing climate.

  10. RECONSTRUCTING THE SOLAR WIND FROM ITS EARLY HISTORY TO CURRENT EPOCH

    Energy Technology Data Exchange (ETDEWEB)

    Airapetian, Vladimir S.; Usmanov, Arcadi V., E-mail: vladimir.airapetian@nasa.gov, E-mail: avusmanov@gmail.com [NASA Goddard Space Flight Center, Greenbelt, MD (United States)

    2016-02-01

Stellar winds from active solar-type stars can play a crucial role in removal of stellar angular momentum and erosion of planetary atmospheres. However, major wind properties except for mass-loss rates cannot be directly derived from observations. We employed a three-dimensional magnetohydrodynamic Alfvén wave driven solar wind model, ALF3D, to reconstruct the solar wind parameters including the mass-loss rate, terminal velocity, and wind temperature at 0.7, 2, and 4.65 Gyr. Our model treats the wind thermal electrons, protons, and pickup protons as separate fluids and incorporates turbulence transport, eddy viscosity, turbulent resistivity, and turbulent heating to properly describe proton and electron temperatures of the solar wind. To study the evolution of the solar wind, we specified three input model parameters, the plasma density, Alfvén wave amplitude, and the strength of the dipole magnetic field at the wind base, for each of three solar wind evolution models that are consistent with observational constraints. Our model results show that the paleo solar wind at 1 AU was twice as fast, ∼50 times denser, and 2 times hotter in the Sun's early history at 0.7 Gyr. The theoretical calculations of mass-loss rate appear to be in agreement with the empirically derived values for stars of various ages. These results can provide realistic constraints for wind dynamic pressures on magnetospheres of (exo)planets around the young Sun and other active stars, which is crucial in realistic assessment of the Joule heating of their ionospheres and corresponding effects of atmospheric erosion.

  11. RECONSTRUCTING THE SOLAR WIND FROM ITS EARLY HISTORY TO CURRENT EPOCH

    International Nuclear Information System (INIS)

    Airapetian, Vladimir S.; Usmanov, Arcadi V.

    2016-01-01

Stellar winds from active solar-type stars can play a crucial role in removal of stellar angular momentum and erosion of planetary atmospheres. However, major wind properties except for mass-loss rates cannot be directly derived from observations. We employed a three-dimensional magnetohydrodynamic Alfvén wave driven solar wind model, ALF3D, to reconstruct the solar wind parameters including the mass-loss rate, terminal velocity, and wind temperature at 0.7, 2, and 4.65 Gyr. Our model treats the wind thermal electrons, protons, and pickup protons as separate fluids and incorporates turbulence transport, eddy viscosity, turbulent resistivity, and turbulent heating to properly describe proton and electron temperatures of the solar wind. To study the evolution of the solar wind, we specified three input model parameters, the plasma density, Alfvén wave amplitude, and the strength of the dipole magnetic field at the wind base, for each of three solar wind evolution models that are consistent with observational constraints. Our model results show that the paleo solar wind at 1 AU was twice as fast, ∼50 times denser, and 2 times hotter in the Sun's early history at 0.7 Gyr. The theoretical calculations of mass-loss rate appear to be in agreement with the empirically derived values for stars of various ages. These results can provide realistic constraints for wind dynamic pressures on magnetospheres of (exo)planets around the young Sun and other active stars, which is crucial in realistic assessment of the Joule heating of their ionospheres and corresponding effects of atmospheric erosion.

  12. Evolution of life history and behavior in Hominidae: towards phylogenetic reconstruction of the chimpanzee-human last common ancestor.

    Science.gov (United States)

    Duda, Pavel; Zrzavý, Jan

    2013-10-01

    The origin of the fundamental behavioral differences between humans and our closest living relatives is one of the central issues of evolutionary anthropology. The prominent, chimpanzee-based referential model of early hominin behavior has recently been challenged on the basis of broad multispecies comparisons and newly discovered fossil evidence. Here, we argue that while behavioral data on extant great apes are extremely relevant for reconstruction of ancestral behaviors, these behaviors should be reconstructed trait by trait using formal phylogenetic methods. Using the widely accepted hominoid phylogenetic tree, we perform a series of character optimization analyses using 65 selected life-history and behavioral characters for all extant hominid species. This analysis allows us to reconstruct the character states of the last common ancestors of Hominoidea, Hominidae, and the chimpanzee-human last common ancestor. Our analyses demonstrate that many fundamental behavioral and life-history attributes of hominids (including humans) are evidently ancient and likely inherited from the common ancestor of all hominids. However, numerous behaviors present in extant great apes represent their own terminal autapomorphies (both uniquely derived and homoplastic). Any evolutionary model that uses a single extant species to explain behavioral evolution of early hominins is therefore of limited use. In contrast, phylogenetic reconstruction of ancestral states is able to provide a detailed suite of behavioral, ecological and life-history characters for each hypothetical ancestor. The living great apes therefore play an important role for the confident identification of the traits found in the chimpanzee-human last common ancestor, some of which are likely to represent behaviors of the fossil hominins. Copyright © 2013 Elsevier Ltd. All rights reserved.

  13. Reconstructing the tectonic history of Fennoscandia from its margins: The past 100 million years

    Energy Technology Data Exchange (ETDEWEB)

    Muir Wood, R [EQE International Ltd (United Kingdom)

    1995-12-01

In the absence of onland late Mesozoic and Cenozoic geological formations, the tectonic history of the Baltic Shield over the past 100 million years can be reconstructed from the thick sedimentary basins that surround Fennoscandia on three sides. Tectonic activity around Fennoscandia through this period has been diverse but can be divided into four main periods: a. pre-North Atlantic spreading ridge (100-60 Ma), when transpressional deformation on the southern margins of Fennoscandia and transtensional activity to the west was associated with a NNE-SSW maximum compressive stress direction; b. the creation of the spreading ridge (60-45 Ma), when there was rifting along the western margin; c. the re-arrangement of spreading axes (45-25 Ma), when there was radial compression around Fennoscandia; and d. the re-emergence of the Iceland hot-spot (25-0 Ma), when the stress-field has come to accord with ridge or plume 'push'. Since 60 Ma the Alpine plate boundary has had little influence on Fennoscandia. The highest levels of deformation on the margins of Fennoscandia were achieved around 85 Ma and 60-55 Ma, with strain-rates around 10⁻⁹/year. Within the Baltic Shield, long-term strain rates have been around 10⁻¹¹/year, with little evidence for significant deformations passing into the shield from the margins. Fennoscandian Border Zone activity, which was prominent from 90-60 Ma, was largely abandoned following the creation of the Norwegian Sea spreading ridge, and with the exception of the Lofoten margin, there is subsequently little evidence for deformation passing into Fennoscandia. Renewal of modest compressional deformation in the Voering Basin suggests that the 'Current Tectonic Regime' is of Quaternary age, although the orientation of the major stress axis has remained consistent since around 10 Ma. The past pattern of changes suggests that in the geological near-future, variations are to be anticipated in the magnitude rather than the orientation of stresses.

  14. Reconstructing the tectonic history of Fennoscandia from its margins: The past 100 million years

    International Nuclear Information System (INIS)

    Muir Wood, R.

    1995-12-01

In the absence of onland late Mesozoic and Cenozoic geological formations, the tectonic history of the Baltic Shield over the past 100 million years can most readily be reconstructed from the thick sedimentary basins that surround Fennoscandia on three sides. Tectonic activity around Fennoscandia through this period has been diverse but can be divided into four main periods: a. pre-North Atlantic spreading ridge (100-60 Ma), when transpressional deformation on the southern margins of Fennoscandia and transtensional activity to the west was associated with a NNE-SSW maximum compressive stress direction; b. the creation of the spreading ridge (60-45 Ma), when there was rifting along the western margin; c. the re-arrangement of spreading axes (45-25 Ma), when there was radial compression around Fennoscandia; and d. the re-emergence of the Iceland hot-spot (25-0 Ma), when the stress-field has come to accord with ridge or plume 'push'. Since 60 Ma the Alpine plate boundary has had little influence on Fennoscandia. The highest levels of deformation on the margins of Fennoscandia were achieved around 85 Ma and 60-55 Ma, with strain-rates around 10⁻⁹/year. Within the Baltic Shield, long-term strain rates have been around 10⁻¹¹/year, with little evidence for significant deformations passing into the shield from the margins. Fennoscandian Border Zone activity, which was prominent from 90-60 Ma, was largely abandoned following the creation of the Norwegian Sea spreading ridge, and with the exception of the Lofoten margin, there is subsequently very little evidence for deformation passing into Fennoscandia. Renewal of modest compressional deformation in the Voering Basin suggests that the 'Current Tectonic Regime' is of Quaternary age, although the orientation of the major stress axis has remained approximately consistent since around 10 Ma. The past pattern of changes suggests that in the geological near-future, variations are to be anticipated in the magnitude rather than the orientation of stresses.

  15. A practical exact maximum compatibility algorithm for reconstruction of recent evolutionary history

    OpenAIRE

    Cherry, Joshua L.

    2017-01-01

    Background Maximum compatibility is a method of phylogenetic reconstruction that is seldom applied to molecular sequences. It may be ideal for certain applications, such as reconstructing phylogenies of closely-related bacteria on the basis of whole-genome sequencing. Results Here I present an algorithm that rapidly computes phylogenies according to a compatibility criterion. Although based on solutions to the maximum clique problem, this algorithm deals properly with ambiguities in the data....

  16. Reconstructing the cosmic expansion history up to redshift z=6.29 with the calibrated gamma-ray bursts

    International Nuclear Information System (INIS)

    Wei, Hao; Nan Zhang, Shuang

    2009-01-01

Recently, Gamma-Ray Bursts (GRBs) were proposed to be a complementary cosmological probe to type Ia supernovae (SNIa). GRBs have been advocated as standard candles since several empirical GRB luminosity relations were proposed as distance indicators. However, there is a so-called circularity problem in the direct use of GRBs. More recently, a new idea to calibrate GRBs in a completely cosmology-independent manner has been proposed, which solves the circularity problem. In the present work, following the method proposed by Liang et al., we calibrate 70 GRBs with the Amati relation using 307 SNIa. Then, following the method proposed by Shafieloo et al., we smoothly reconstruct the cosmic expansion history up to redshift z=6.29 with the calibrated GRBs. We find some new features in the reconstructed results. (orig.)

  17. A Linear Dynamical Systems Approach to Streamflow Reconstruction Reveals History of Regime Shifts in Northern Thailand

    Science.gov (United States)

    Nguyen, Hung T. T.; Galelli, Stefano

    2018-03-01

    Catchment dynamics is not often modeled in streamflow reconstruction studies; yet, the streamflow generation process depends on both catchment state and climatic inputs. To explicitly account for this interaction, we contribute a linear dynamic model, in which streamflow is a function of both catchment state (i.e., wet/dry) and paleoclimatic proxies. The model is learned using a novel variant of the Expectation-Maximization algorithm, and it is used with a paleo drought record—the Monsoon Asia Drought Atlas—to reconstruct 406 years of streamflow for the Ping River (northern Thailand). Results for the instrumental period show that the dynamic model has higher accuracy than conventional linear regression; all performance scores improve by 45-497%. Furthermore, the reconstructed trajectory of the state variable provides valuable insights about the catchment history—e.g., regime-like behavior—thereby complementing the information contained in the reconstructed streamflow time series. The proposed technique can replace linear regression, since it only requires information on streamflow and climatic proxies (e.g., tree-rings, drought indices); furthermore, it is capable of readily generating stochastic streamflow replicates. With a marginal increase in computational requirements, the dynamic model brings more desirable features and value to streamflow reconstructions.
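The state-space idea in this abstract can be illustrated with a minimal sketch: a scalar linear-Gaussian model in which a hidden catchment state evolves under a climatic proxy input and is filtered with a Kalman recursion. All parameters, the synthetic proxy series, and the single-state structure below are illustrative assumptions, not the authors' actual formulation (which is fitted with a variant of Expectation-Maximization):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical scalar linear dynamic model:
#   state:       x[t+1] = a*x[t] + b*u[t] + w[t],  w ~ N(0, q)
#   observation: y[t]   = c*x[t] + v[t],           v ~ N(0, r)
# x[t] stands in for the hidden catchment wetness state, u[t] for a
# climatic proxy (e.g., a drought index), y[t] for observed streamflow.
a, b, c, q, r = 0.8, 0.5, 1.0, 0.1, 0.2

# Simulate a synthetic proxy series and the resulting "streamflow".
T = 200
u = rng.standard_normal(T)
x = np.zeros(T)
for t in range(T - 1):
    x[t + 1] = a * x[t] + b * u[t] + rng.normal(0, np.sqrt(q))
y = c * x + rng.normal(0, np.sqrt(r), T)

def kalman_filter(y, u, a, b, c, q, r):
    """Return filtered state means for the scalar model above."""
    n = len(y)
    m, p = 0.0, 1.0              # prior mean/variance of the state
    means = np.zeros(n)
    for t in range(n):
        # Measurement update: fold in the observed streamflow.
        k = p * c / (c * c * p + r)      # Kalman gain
        m = m + k * (y[t] - c * m)
        p = (1 - k * c) * p
        means[t] = m
        # Time update: propagate the state with the proxy input.
        m = a * m + b * u[t]
        p = a * a * p + q
    return means

xf = kalman_filter(y, u, a, b, c, q, r)
rmse = np.sqrt(np.mean((xf - x) ** 2))
print(round(rmse, 3))
```

Because the filtered state carries information forward in time, it tracks the hidden state more closely than any memoryless regression of y on u could; in a reconstruction setting, that trajectory is what reveals regime-like behavior.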

  18. Investigating Efficiency of Time Domain Curve fitters Versus Filtering for Rectification of Displacement Histories Reconstructed from Acceleration Measurements

    DEFF Research Database (Denmark)

    Sichani, Mahdi Teimouri; Brincker, Rune

    2008-01-01

Computing displacements of a structure from its measured accelerations has been a major concern in some fields of engineering, such as earthquake engineering. In vibration engineering, too, displacements are occasionally preferred to acceleration histories, e.g. in the determination of forces applied...... on a structure. In brief, the major problem that accompanies reconstruction of true displacement from an acceleration record is the unreal drift observed in the double-integrated acceleration. The purpose of the present work is to address the source of the problem, introduce its treatments, show how they work and compare...
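The drift problem described above can be demonstrated in a few lines: a small constant bias in the acceleration record grows quadratically under double integration, and a polynomial baseline fit (a simple stand-in for the filters and time-domain curve fitters the paper compares) rectifies it. The signal, sampling rate, and bias below are made-up illustration values:

```python
import numpy as np

# Synthetic test signal: a 2 Hz, 1 cm sinusoidal displacement whose
# exact acceleration is known analytically, plus a small sensor bias.
fs = 100.0                                   # sampling rate [Hz]
t = np.arange(0, 10, 1 / fs)
w = 2 * np.pi * 2.0                          # angular frequency [rad/s]
disp_true = 0.01 * np.sin(w * t)
acc_true = -0.01 * w ** 2 * np.sin(w * t)
acc_meas = acc_true + 0.005                  # constant bias [m/s^2]

def cumtrapz(y, dt):
    """Cumulative trapezoidal integration starting at zero."""
    out = np.zeros_like(y)
    out[1:] = np.cumsum(0.5 * (y[1:] + y[:-1]) * dt)
    return out

# Double integration: the bias (and unknown initial velocity) produce
# a low-order polynomial drift on top of the true displacement.
vel = cumtrapz(acc_meas, 1 / fs)
disp_raw = cumtrapz(vel, 1 / fs)

# Rectify by fitting and subtracting a 2nd-order polynomial baseline.
coef = np.polyfit(t, disp_raw, 2)
disp_corr = disp_raw - np.polyval(coef, t)

drift_raw = np.abs(disp_raw - disp_true).max()
drift_corr = np.abs(disp_corr - disp_true).max()
print(round(drift_raw, 3), round(drift_corr, 5))
```

The raw double integral drifts to errors two orders of magnitude larger than the 1 cm signal itself, while the baseline-corrected trace stays close to the true displacement; real records with non-constant drift motivate the more sophisticated fitters and filters the paper evaluates.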

  19. The Influence of Sampling Density on Bayesian Age-Depth Models and Paleoclimatic Reconstructions - Lessons Learned from Lake Titicaca - Bolivia/Peru

    Science.gov (United States)

    Salenbien, W.; Baker, P. A.; Fritz, S. C.; Guedron, S.

    2014-12-01

    Lake Titicaca is one of the most important archives of paleoclimate in tropical South America, and prior studies have elucidated patterns of climate variation at varied temporal scales over the past 0.5 Ma. Yet, slow sediment accumulation rates in the main deeper basin of the lake have precluded analysis of the lake's most recent history at high resolution. To obtain a paleoclimate record of the last few millennia at multi-decadal resolution, we obtained five short cores, ranging from 139 to 181 cm in length, from the shallower Wiñaymarka sub-basin of Lake Titicaca, where sedimentation rates are higher than in the lake's main basin. Selected cores have been analyzed for their geochemical signature by scanning XRF, diatom stratigraphy, sedimentology, and for 14C age dating. A total of 72 samples were 14C-dated using a Gas Ion Source automated high-throughput method for carbonate samples (mainly Littoridina sp. and Taphius montanus gastropod shells) at NOSAMS (Woods Hole Oceanographic Institution) with an analytical precision better than 2%. The method has lower analytical precision compared with traditional AMS radiocarbon dating, but the lower cost enables analysis of a larger number of samples, and the error associated with the lower precision is relatively small for younger samples (< ~8,000 years). A 172-cm-long core was divided into centimeter-long sections, and 47 14C dates were obtained from 1-cm intervals, averaging one date every 3-4 cm. The other cores were radiocarbon dated with a sparser sampling density that focused on visual unconformities and shell beds. The high-resolution radiocarbon analysis reveals complex sedimentation patterns in visually continuous sections, with abundant indicators of bioturbated or reworked sediments and periods of very rapid sediment accumulation. These features are not evident in the sparser sampling strategy but have significant implications for reconstructing past lake level and paleoclimatic history.

  20. Reconstructing Iconic Experiments in Electrochemistry: Experiences from a History of Science Course

    Science.gov (United States)

    Eggen, Per-Odd; Kvittingen, Lise; Lykknes, Annette; Wittje, Roland

    2012-01-01

    The decomposition of water by electricity, and the voltaic pile as a means of generating electricity, have both held an iconic status in the history of science as well as in the history of science teaching. These experiments featured in chemistry and physics textbooks, as well as in classroom teaching, throughout the nineteenth and twentieth…

  1. RECONSTRUCTING THE EVOLUTIONARY HISTORY OF THE FOREST FUNGAL PATHOGEN, ARMILLARIA MELLEA, IN TEMPERATE WORLDWIDE POPULATIONS

    Science.gov (United States)

    The forest pathogen Armillaria mellea s.s. (Basidiomycota, Physalacriaceae) is among the most significant forest pathogens causing root rot in northern temperate forest trees worldwide. Phylogenetic reconstructions for A. mellea show distinct European, Asian and North American lineages. The North Am...

  2. ROLE OF ARCHIVAL DOCUMENTS IN RECONSTRUCTION OF HISTORY OF STUDENT'S DEPARTMENT OF LIBRARY OF THE NOVOROSSIYSK UNIVERSITY

    Directory of Open Access Journals (Sweden)

    М. О. Подрезова

    2016-12-01

    Full Text Available This article attempts to unite separate data on the history and activity of the Student's Library of the Odessa I. I. Mechnikov National University. With the help of newly opened archival materials, some moments in the history of the creation of the fund of the reading room for students at Richelieu Lyceum are reconstructed. The role of the trustee of the Odessa educational district, N. I. Pirogov, in creating the Student's Library, and the process of its further transformation into the student's department of the library of the Novorossiysk University, are shown. The building of the library's fund through donation and purchase of books in different years of its activity is considered. Details are given on the receipt of books and money under the will of the university doctor P. A. Ivanov, aimed at the development of educational and auxiliary institutions of the Novorossiysk University.

  3. Contact-induced change in Dolgan : an investigation into the role of linguistic data for the reconstruction of a people's (pre)history

    NARCIS (Netherlands)

    Stapert, Eugénie

    2013-01-01

    This study explores the role of linguistic data in the reconstruction of Dolgan (pre)history. While most ethno-linguistic groups have a longstanding history and a clear ethnic and linguistic affiliation, the formation of the Dolgans has been a relatively recent development, and their ethnic origins

  4. Bayesian deterministic decision making: A normative account of the operant matching law and heavy-tailed reward history dependency of choices

    Directory of Open Access Journals (Sweden)

    Hiroshi eSaito

    2014-03-01

    Full Text Available The decision making behaviors of humans and animals adapt and then satisfy an "operant matching law" in certain types of tasks. This was first pointed out by Herrnstein in his foraging experiments on pigeons. The matching law has been one landmark for elucidating the underlying processes of decision making and its learning in the brain. An interesting question is whether decisions are made deterministically or probabilistically. Conventional learning models of the matching law are based on the latter idea; they assume that subjects learn choice probabilities of respective alternatives and decide stochastically with the probabilities. However, it is unknown whether the matching law can be accounted for by a deterministic strategy or not. To answer this question, we propose several deterministic Bayesian decision making models that have certain incorrect beliefs about an environment. We claim that a simple model produces behavior satisfying the matching law in static settings of a foraging task but not in dynamic settings. We found that the model that has a belief that the environment is volatile works well in the dynamic foraging task and exhibits undermatching, which is a slight deviation from the matching law observed in many experiments. This model also demonstrates the double-exponential reward history dependency of a choice and a heavier-tailed run-length distribution, as has recently been reported in experiments on monkeys.

  5. Reconciling deep calibration and demographic history: Bayesian inference of post glacial colonization patterns in Carcinus aestuarii (Nardo, 1847) and C. maenas (Linnaeus, 1758).

    Directory of Open Access Journals (Sweden)

    Ilaria A M Marino

    Full Text Available A precise inference of past demographic histories, including dating of demographic events using Bayesian methods, can only be achieved with the use of appropriate molecular rates and evolutionary models. Using a set of 596 mitochondrial cytochrome c oxidase I (COI) sequences of two sister species of European green crabs of the genus Carcinus (C. maenas and C. aestuarii), our study shows how chronologies of past evolutionary events change significantly with the application of revised molecular rates that incorporate biogeographic events for calibration and appropriate demographic priors. A clear signal of demographic expansion was found for both species, dated between 10,000 and 20,000 years ago, which places the expansion events in a time frame following the Last Glacial Maximum (LGM). In the case of C. aestuarii, a population expansion was only inferred for the Adriatic-Ionian, suggestive of a colonization event following the flooding of the Adriatic Sea (18,000 years ago). For C. maenas, the demographic expansion inferred for the continental populations of West and North Europe might result from a northward recolonization from a southern refugium when the ice sheet retreated after the LGM. Collectively, our results highlight the importance of using adequate calibrations and demographic priors in order to avoid considerable overestimates of evolutionary time scales.

  6. Potential pitfalls of reconstructing deep time evolutionary history with only extant data, a case study using the canidae (mammalia, carnivora).

    Science.gov (United States)

    Finarelli, John A; Goswami, Anjali

    2013-12-01

    Reconstructing evolutionary patterns and their underlying processes is a central goal in biology. Yet many analyses of deep evolutionary histories assume that data from the fossil record are too incomplete to include, and rely solely on databases of extant taxa. Excluding fossil taxa assumes that character state distributions across living taxa are faithful representations of a clade's entire evolutionary history. Many factors can make this assumption problematic. Fossil taxa do not simply lead up to extant taxa; they represent now-extinct lineages that can substantially impact interpretations of character evolution for extant groups. Here, we analyze body mass data for extant and fossil canids (dogs, foxes, and relatives) for changes in mean and variance through time. AIC-based model selection recovered distinct models for each of eight canid subgroups. We compared model fit of parameter estimates for (1) extant data alone and (2) extant and fossil data, demonstrating that the latter performs significantly better. Moreover, extant-only analyses result in unrealistically low estimates of ancestral mass. Although fossil data are not always available, reconstructions of deep-time organismal evolution in the absence of deep-time data can be highly inaccurate, and we argue that every effort should be made to include fossil data in macroevolutionary studies. © 2013 The Authors. Evolution published by Wiley Periodicals, Inc. on behalf of The Society for the Study of Evolution.

  7. Reconstruction of bomb 14C time history recorded in the stalagmite from Postojna Cave

    International Nuclear Information System (INIS)

    Vokal, B.; Genty, D.; Obelic, B.

    2002-01-01

    The karstic caves provide valuable resources for reconstruction of environmental conditions on the continent in the past. This is possible due to the great stability of climatic conditions within a cave. Secondary minerals deposited in caves, known as speleothems, preserve records of long-term climatic and environmental changes at the site of their deposition and in the vicinity. The purity of speleothems and their chemical and physical stability make them exceptionally well suited for detailed geochemical and isotopic analysis

  8. Substrata Residue, Linguistic Reconstruction, and Linking: Methodological Premises, and the Case History of Palaeo-Sardinian

    Directory of Open Access Journals (Sweden)

    Eduardo Blasco Ferrer

    2015-12-01

    Full Text Available This paper demonstrates that, within substratal research, prior to undertaking any comparative endeavour, it is necessary to conduct a thorough distributional analysis of the morphological regularities displayed by the language under consideration, so as to determine the phonological rules governing diachronic changes, which leads to establishing the typology of the substratal language. The application of this rigorous methodology to Palaeo-Sardinian toponymic material makes it possible to recognize the primitive agglutinative typology, and thereby to specify its relation to Palaeo-Basque. After having highlighted some flaws and weaknesses of prior reconstructions, the author first describes the benefits stemming from a systematic segmentation of nearly 1000 microtoponyms of Central Sardinia, which display clear morphological regularities, and restores the underlying phonological system, as well as some of the most distinctive evolutionary laws (e.g., it is argued that the structure of most reconstructed roots can be boiled down to a single syllable template CVC, as /d-u-r/, /d-o-n/; this helps to establish some phonetic laws, as /d/ > /l/ in dur > lur, don > loh, etc.). Finally, a detailed confrontation of Palaeo-Sardinian with the reconstructed morphological and phonological systems of Palaeo-Basque evinces a vast array of striking correspondences which are due, most probably, to the prehistoric split of Pre-Proto-Basque into Proto-Basque and Palaeo-Sardinian branches in the late Mesolithic / early Neolithic age. The paper provides a new Stammbaum model to account for this split.

  9. Bayesian and likelihood phylogenetic reconstructions of morphological traits are not discordant when taking uncertainty into consideration: a comment on Puttick et al.

    Science.gov (United States)

    Brown, Joseph W; Parins-Fukuchi, Caroline; Stull, Gregory W; Vargas, Oscar M; Smith, Stephen A

    2017-10-11

    Puttick et al. (2017 Proc. R. Soc. B 284, 20162290 (doi:10.1098/rspb.2016.2290)) performed a simulation study to compare accuracy among methods of inferring phylogeny from discrete morphological characters. They report that a Bayesian implementation of the Mk model (Lewis 2001 Syst. Biol. 50, 913-925 (doi:10.1080/106351501753462876)) was most accurate (but with low resolution), while a maximum-likelihood (ML) implementation of the same model was least accurate. They conclude by strongly advocating that Bayesian implementations of the Mk model should be the default method of analysis for such data. While we appreciate the authors' attempt to investigate the accuracy of alternative methods of analysis, their conclusion is based on an inappropriate comparison of the ML point estimate, which does not consider confidence, with the Bayesian consensus, which incorporates estimation credibility into the summary tree. Using simulation, we demonstrate that ML and Bayesian estimates are concordant when confidence and credibility are comparably reflected in summary trees, a result expected from statistical theory. We therefore disagree with the conclusions of Puttick et al. and consider their prescription of any default method to be poorly founded. Instead, we recommend caution and thoughtful consideration of the model or method being applied to a morphological dataset. © 2017 The Author(s).

  10. Bayesian biostatistics

    CERN Document Server

    Lesaffre, Emmanuel

    2012-01-01

    The growth of biostatistics has been phenomenal in recent years and has been marked by considerable technical innovation in both methodology and computational practicality. One area that has experienced significant growth is Bayesian methods. The growing use of Bayesian methodology has taken place partly due to an increasing number of practitioners valuing the Bayesian paradigm as matching that of scientific discovery. In addition, computational advances have allowed for more complex models to be fitted routinely to realistic data sets. Through examples, exercises and a combination of introd

  11. Bayesian Utilitarianism

    OpenAIRE

    ZHOU, Lin

    1996-01-01

    In this paper I consider social choices under uncertainty. I prove that any social choice rule that satisfies independence of irrelevant alternatives, translation invariance, and weak anonymity is consistent with ex post Bayesian utilitarianism

  12. Reconstruction of exposure histories of meteorites from Antarctica and the Sahara

    Energy Technology Data Exchange (ETDEWEB)

    Neupert, U.; Neumann, S.; Leya, I.; Michel, R. [Hannover Univ. (Germany). Zentraleinrichtung fuer Strahlenschutz (ZfS); Kubik, P.W. [Paul Scherrer Inst. (PSI), Villigen (Switzerland); Bonani, G.; Hajdas, I.; Suter, M. [Eidgenoessische Technische Hochschule, Zurich (Switzerland)

    1997-09-01

    10Be, 14C, and 26Al were analyzed in H-, L-, and LL-chondrites from the Acfer region in the Algerian Sahara and from the Allan Hills/Antarctica. Exposure histories and terrestrial ages could be determined. (author) 3 figs., 2 refs.

  13. Reconstruction of exposure histories of meteorites from Antarctica and the Sahara

    International Nuclear Information System (INIS)

    Neupert, U.; Neumann, S.; Leya, I.; Michel, R.; Kubik, P.W.; Bonani, G.; Hajdas, I.; Suter, M.

    1997-01-01

    10Be, 14C, and 26Al were analyzed in H-, L-, and LL-chondrites from the Acfer region in the Algerian Sahara and from the Allan Hills/Antarctica. Exposure histories and terrestrial ages could be determined. (author) 3 figs., 2 refs.

  14. Literacy Models and the Reconstruction of History Education: A Comparative Discourse Analysis of Two Lesson Plans

    Science.gov (United States)

    Collin, Ross; Reich, Gabriel A.

    2015-01-01

    This article presents discourse analyses of two lesson plans designed for secondary school history classes. Although the plans focus on the same topic, they rely on different models of content area literacy: disciplinary literacy, or reading and writing like experts in a given domain, and critical literacy, or reading and writing to address…

  15. Using sedimentary archives to reconstruct pollution history and sediment provenance: The Ohre River, Czech Republic

    Czech Academy of Sciences Publication Activity Database

    Matys Grygar, Tomáš; Elznicová, J.; Kiss, T.; Smith, H. G.

    2016-01-01

    Vol. 144, SEP (2016), pp. 109-129, ISSN 0341-8162 R&D Projects: GA ČR GA15-00340S Institutional support: RVO:61388980 Keywords: Polluted floodplains * Pollution history * Fluvial archives * Sediment provenance * Environmental geochemistry Subject RIV: DD - Geochemistry Impact factor: 3.191, year: 2016

  16. Reconstruction of burial history, temperature, source rock maturity and hydrocarbon generation in the northwestern Dutch offshore

    NARCIS (Netherlands)

    Abdul Fattah, R.; Verweij, J.M.; Witmans, N.; Veen, J.H. ten

    2012-01-01

    3D basin modelling is used to investigate the history of maturation and hydrocarbon generation on the main platforms in the northwestern part of the offshore area of the Netherlands. The study area covers the Cleaverbank and Elbow Spit Platforms. Recently compiled maps and data are used to build the

  17. History of cervical spine surgery: from nihilism to advanced reconstructive surgery.

    Science.gov (United States)

    Dweik, A; Van den Brande, E; Kossmann, T; Maas, A I R

    2013-11-01

    Review of literature. To review and analyze the evolution of cervical spine surgery from ancient times to current practice. The aim is to present an accessible overview, primarily intended for a broad readership. Descriptive literature review and analysis of the development of cervical spine surgery from the prehistoric era until today. The first evidence for surgical treatment of spinal disorders dates back to approximately 1500 BC. Conservative approaches to treatment have been the hallmark for thousands of years, but over the past 50 years progress has been rapid. We illustrate how nations have added elements to this complex subject and how knowledge has surpassed borders and language barriers. Transferral of knowledge occurred from Babylon (Baghdad) to Old Egypt, to the Greek and Roman empires and finally via the Middle East (Baghdad and Damascus) back to Europe. Recent advances in the field of anesthesia, imaging and spinal instrumentation have changed long-standing nihilism in the treatment of cervical spine pathologies to the current practice of advanced reconstructive surgery of the cervical spine. A critical approach to the evaluation of benefits and complications of these advanced surgical techniques for treatment of cervical spine disorders is required. Advances in surgery now permit full mechanical reconstruction of the cervical spine. However, despite substantial experimental progress, spinal cord repair and restoration of lost functions remain a challenge. Modern surgeons are still looking for the best way to manage spine disorders.

  18. A practical exact maximum compatibility algorithm for reconstruction of recent evolutionary history.

    Science.gov (United States)

    Cherry, Joshua L

    2017-02-23

    Maximum compatibility is a method of phylogenetic reconstruction that is seldom applied to molecular sequences. It may be ideal for certain applications, such as reconstructing phylogenies of closely-related bacteria on the basis of whole-genome sequencing. Here I present an algorithm that rapidly computes phylogenies according to a compatibility criterion. Although based on solutions to the maximum clique problem, this algorithm deals properly with ambiguities in the data. The algorithm is applied to bacterial data sets containing up to nearly 2000 genomes with several thousand variable nucleotide sites. Run times are several seconds or less. Computational experiments show that maximum compatibility is less sensitive than maximum parsimony to the inclusion of nucleotide data that, though derived from actual sequence reads, has been identified as likely to be misleading. Maximum compatibility is a useful tool for certain phylogenetic problems, such as inferring the relationships among closely-related bacteria from whole-genome sequence data. The algorithm presented here rapidly solves fairly large problems of this type, and provides robustness against misleading characters that can pollute large-scale sequencing data.
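The compatibility criterion underlying this record can be illustrated with the classic four-gamete test for binary characters; this is a simplified sketch (the paper's actual algorithm, built on maximum-clique solutions with more careful ambiguity handling, is not reproduced here):

```python
def compatible(char_a, char_b):
    """Four-gamete test: two binary characters are compatible
    (both can evolve without homoplasy on a single tree) iff the
    four state pairs (0,0), (0,1), (1,0), (1,1) do not all occur
    across the taxa. Ambiguous states (None) are simply skipped,
    a crude stand-in for the paper's ambiguity handling."""
    pairs = {(a, b) for a, b in zip(char_a, char_b)
             if a is not None and b is not None}
    return len(pairs) < 4

# Characters are columns over five taxa; states 0/1, None = ambiguous.
c1 = [0, 0, 1, 1, 1]
c2 = [0, 1, 1, 1, 0]
c3 = [0, 0, None, 1, 1]
print(compatible(c1, c3))  # True: at most 3 of the 4 pairs occur
print(compatible(c1, c2))  # False: all 4 pairs occur
```

Maximum compatibility then seeks the largest set of pairwise-compatible characters, which is why the maximum clique problem arises.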

  19. Genetic Structuration, Demography and Evolutionary History of Mycobacterium tuberculosis LAM9 Sublineage in the Americas as Two Distinct Subpopulations Revealed by Bayesian Analyses

    Science.gov (United States)

    Reynaud, Yann; Millet, Julie; Rastogi, Nalin

    2015-01-01

    Tuberculosis (TB) remains broadly present in the Americas despite intense global efforts for its control and elimination. Starting from a large dataset comprising spoligotyping (n = 21183 isolates) and 12-loci MIRU-VNTRs data (n = 4022 isolates) from a total of 31 countries of the Americas (data extracted from the SITVIT2 database), this study aimed to get an overview of lineages circulating in the Americas. A total of 17119 (80.8%) strains belonged to the Euro-American lineage 4, among which the most predominant genotypic family belonged to the Latin American and Mediterranean (LAM) lineage (n = 6386, 30.1% of strains). By combining classical phylogenetic analyses and Bayesian approaches, this study revealed for the first time a clear genetic structuration of LAM9 sublineage into two subpopulations named LAM9C1 and LAM9C2, with distinct genetic characteristics. LAM9C1 was predominant in Chile, Colombia and USA, while LAM9C2 was predominant in Brazil, Dominican Republic, Guadeloupe and French Guiana. Globally, LAM9C2 was characterized by higher allelic richness as compared to LAM9C1 isolates. Moreover, LAM9C2 sublineage appeared to expand close to twenty times more than LAM9C1 and showed older traces of expansion. Interestingly, a significant proportion of LAM9C2 isolates presented typical signature of ancestral LAM-RDRio MIRU-VNTR type (224226153321). Further studies based on Whole Genome Sequencing of LAM strains will provide the needed resolution to decipher the biogeographical structure and evolutionary history of this successful family. PMID:26517715

  20. Cosmology with hybrid expansion law: scalar field reconstruction of cosmic history and observational constraints

    International Nuclear Information System (INIS)

    Akarsu, Özgür; Kumar, Suresh; Myrzakulov, R.; Sami, M.; Xu, Lixin

    2014-01-01

    In this paper, we consider a simple form of the expansion history of the Universe, referred to as the hybrid expansion law: a product of power-law and exponential functions. The ansatz by construction mimics the power-law and de Sitter cosmologies as special cases, but also provides an elegant description of the transition from deceleration to cosmic acceleration. We point out the Brans-Dicke realization of the cosmic history under consideration. We construct potentials for quintessence, phantom and tachyon fields, which can give rise to the hybrid expansion law in general relativity. We investigate observational constraints on the model with hybrid expansion law applied to late-time acceleration as well as to the early Universe a la nucleosynthesis.
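For orientation, a product of power-law and exponential scale-factor terms can be written as below; the notation here (constants a₁, t₁, exponents α, β) is illustrative of the general form of such an ansatz and should be checked against the paper's exact parametrization:

```latex
% Hybrid expansion law: power law times exponential (illustrative notation)
a(t) = a_1 \left(\frac{t}{t_1}\right)^{\alpha} e^{\beta\,(t/t_1 - 1)},
\qquad \alpha,\ \beta \ge 0 .
% Limiting cases: \beta = 0 recovers the power-law cosmology a \propto t^{\alpha};
% \alpha = 0 gives exponential (de Sitter-like) expansion.
% Hubble and deceleration parameters follow directly:
H(t) = \frac{\dot a}{a} = \frac{\alpha}{t} + \frac{\beta}{t_1},
\qquad
q(t) = -1 - \frac{\dot H}{H^2}
     = -1 + \frac{\alpha/t^2}{\left(\alpha/t + \beta/t_1\right)^2},
% so q \to -1 + 1/\alpha as t \to 0 (deceleration for \alpha < 1)
% and q \to -1 as t \to \infty: the deceleration-to-acceleration transition.
```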

  1. History of consumer demand theory 1871-1971: A Neo-Kantian rational reconstruction

    OpenAIRE

    Moscati Ivan

    2007-01-01

    This paper examines the history of the neoclassical theory of consumer demand from 1871 to 1971 by bringing into play the knowledge theory of the Marburg School, a Neo-Kantian philosophical movement. The work aims to show the usefulness of a Marburg-inspired epistemology in rationalizing the development of consumer analysis and, more generally, to understand the principles that regulate the process of knowing in neoclassical economics.

  2. History of consumer demand theory 1871-1971: A Neo-Kantian rational reconstruction

    OpenAIRE

    Ivan Moscati

    2005-01-01

    This paper examines the history of the neoclassical theory of consumer demand from 1871 to 1971 by bringing into play the knowledge theory of the Marburg School, a Neo-Kantian philosophical movement. The work aims to show the usefulness of a Marburg-inspired epistemology in rationalizing the development of consumer analysis and, more generally, to understand the principles that regulate the process of knowing in neoclassical economics.

  3. Assessing dorsal scute microchemistry for reconstruction of shortnose sturgeon life histories

    Science.gov (United States)

    Altenritter, Matthew E.; Kinnison, Michael T.; Zydlewski, Gayle B.; Secor, David H.; Zydlewski, Joseph D.

    2015-01-01

    The imperiled status of sturgeons worldwide places priority on the identification and protection of critical habitats. We assessed the micro-structural and micro-chemical scope for a novel calcified structure, dorsal scutes, to be used for reconstruction of past habitat use and group separation in shortnose sturgeon (Acipenser brevirostrum). Dorsal scutes contained a dual-layered structure composed of a thin multi-layered translucent zone lying dorsally above a thicker multi-layered zone. Banding in the thick multi-layered zone correlated strongly with pectoral fin spine annuli supporting the presence of chronological structuring that could contain a chemical record of past environmental exposure. Trace element profiles (Sr:Ca), collected using both wavelength dispersive electron microprobe analysis and laser ablation inductively coupled plasma mass spectrometry, suggest scutes record elemental information useful for tracing transitions between freshwater and marine environments. Moreover, mirror-image-like Sr:Ca profiles were observed across the dual-zone structuring of the scute that may indicate duplication of the microchemical profile in a single structure. Additional element:calcium ratios measured in natal regions of dorsal scutes (Ba:Ca, Mg:Ca) suggest the potential for further refinement of techniques for identification of river systems of natal origin. In combination, our results provide proof of concept that dorsal scutes possess the necessary properties to be used as structures for reconstructions of past habitat use in sturgeons. Importantly, scutes may be collected non-lethally and with less injury than current structures, like otoliths and fin spines, affording an opportunity for broader application of microchemical techniques.

  4. Reconstruction of caribou evolutionary history in Western North America and its implications for conservation.

    Science.gov (United States)

    Weckworth, Byron V; Musiani, Marco; McDevitt, Allan D; Hebblewhite, Mark; Mariani, Stefano

    2012-07-01

    The role of Beringia as a refugium and route for trans-continental exchange of fauna during glacial cycles of the past 2 million years is well documented; less apparent is its contribution as a significant reservoir of genetic diversity. Using mitochondrial DNA sequences and 14 microsatellite loci, we investigate the phylogeographic history of caribou (Rangifer tarandus) in western North America. Patterns of genetic diversity reveal two distinct groups of caribou. Caribou classified as a Northern group, of Beringian origin, exhibited a greater number and variability of mtDNA haplotypes compared to a Southern group originating from refugia south of glacial ice. Results indicate that subspecies R. t. granti of Alaska and R. t. groenlandicus of northern Canada do not constitute distinguishable units at mtDNA or microsatellites, belying their current status as separate subspecies. Additionally, the Northern Mountain ecotype of woodland caribou (presently R. t. caribou) has closer kinship to caribou classified as granti or groenlandicus. Comparisons of mtDNA and microsatellite data suggest that behavioural and ecological specialization is a more recently derived life history characteristic. Notably, microsatellite differentiation among Southern herds is significantly greater, most likely as a result of human-induced landscape fragmentation and genetic drift due to smaller population sizes. These results not only provide important insight into the evolutionary history of northern species such as caribou, but also are important indicators for managers evaluating conservation measures for this threatened species. © 2012 Blackwell Publishing Ltd.

  5. Bayesian methods to restore and rebuild images: application to gammagraphy and to photofission tomography; Methodes bayesiennes pour la restauration et la reconstruction d'images: application a la gammagraphie et a la tomographie par photofissions

    Energy Technology Data Exchange (ETDEWEB)

    Stawinski, G

    1998-10-26

    Bayesian algorithms are developed to solve inverse problems in gamma imaging and photofission tomography. The first part of this work is devoted to the modeling of our measurement systems. Two models have been developed for both applications: the first one is a simple conventional model and the second one is a cascaded point process model. EM and MCMC Bayesian algorithms for image restoration and image reconstruction have been developed for these models and compared. The cascaded point process model does not significantly improve the results previously obtained by the classical model. Two original approaches have been proposed, which improve on the results previously obtained. The first approach uses an inhomogeneous Markov Random Field as a prior law, and allows the regularization parameter to vary spatially. However, the problem of the estimation of hyper-parameters has not been solved. In the case of the deconvolution of point sources, a second approach has been proposed, which introduces a high-level prior model. The picture is modeled as a list of objects, whose parameters and number are unknown. The results obtained with this method are more accurate than those obtained with the conventional Markov Random Field prior model and require lower computational cost. (author)
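The thesis's EM and MCMC machinery is well beyond a short snippet, but the core idea of Bayesian restoration with an MRF prior can be sketched as a MAP estimate: a Gaussian likelihood under a known blur plus a Gaussian (quadratic smoothness) MRF prior, minimized here by plain gradient descent in 1D. The kernel, regularization weight, and step size are illustrative, not the thesis's:

```python
import numpy as np

def map_restore(y, h, lam=0.1, lr=0.1, iters=500):
    """MAP restoration sketch: minimize ||h * x - y||^2 + lam * ||diff(x)||^2
    (Gaussian likelihood with known blur kernel h, Gaussian MRF smoothness
    prior) by gradient descent, starting from the observed signal."""
    x = y.astype(float).copy()
    for _ in range(iters):
        r = np.convolve(x, h, mode="same") - y             # data residual
        grad_data = np.convolve(r, h[::-1], mode="same")   # adjoint of the blur
        d = np.diff(x)
        grad_prior = np.concatenate(([-d[0]], d[:-1] - d[1:], [d[-1]]))
        x -= lr * (grad_data + lam * grad_prior)
    return x

# Demo: restore a noise-free blurred step signal.
h = np.array([0.25, 0.5, 0.25])
x_true = np.repeat([0.0, 1.0], 20)
y = np.convolve(x_true, h, mode="same")
x_hat = map_restore(y, h, lam=0.01)
```

With an inhomogeneous MRF, as in the thesis's first original approach, `lam` would become a spatially varying array rather than a scalar.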

  6. Reconstructing and analyzing China's fifty-nine year (1951–2009) drought history using hydrological model simulation

    Directory of Open Access Journals (Sweden)

    Z. Y. Wu

    2011-09-01

    Full Text Available The 1951–2009 drought history of China is reconstructed using daily soil moisture values generated by the Variable Infiltration Capacity (VIC) land surface macroscale hydrology model. VIC is applied over a grid of 10 458 points with a spatial resolution of 30 km × 30 km, and is driven by observed daily maximum and minimum air temperature and precipitation from 624 long-term meteorological stations. The VIC soil moisture is used to calculate the Soil Moisture Anomaly Percentage Index (SMAPI), which can be used as a measure of the severity of agricultural drought on a global basis. We have developed a SMAPI-based drought identification procedure for practical uses in the identification of both grid point and regional drought events. As a result, a total of 325 regional drought events varying in time and strength are identified from China's nine drought study regions. These drought events can thus be assessed quantitatively at different spatial and temporal scales. The result shows that the severe drought events of 1978, 2000 and 2006 are well reconstructed, which indicates that the SMAPI is capable of identifying the onset of a drought event, its progression, as well as its termination. Spatial and temporal variations of droughts in China's nine drought study regions are studied. Our result shows that on average, up to 30% of the total area of China is prone to drought. Regionally, an upward trend in drought-affected areas has been detected in three regions (Inner Mongolia, Northeast and North) from 1951–2009. However, the decadal variability of droughts has been weak in the rest of the five regions (South, Southwest, East, Northwest, and Tibet). Xinjiang has even grown steadily wetter since the 1950s. Two regional dry centres are discovered in China as the result of a combined analysis of the occurrence of drought events from both grid points and drought study regions. The first centre is located in the area partially covered by the North
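The SMAPI-based identification procedure can be sketched as follows; the index form (percent departure of soil moisture from its climatological mean) is a common definition assumed here, and the threshold and minimum run length are illustrative rather than the paper's calibrated values:

```python
import numpy as np

def smapi(sm, sm_clim):
    """Soil Moisture Anomaly Percentage Index in one common form
    (assumed here): percent departure of simulated soil moisture from
    its long-term climatological mean for the same period."""
    return (sm - sm_clim) / sm_clim * 100.0

def drought_events(series, threshold=-5.0, min_len=3):
    """Identify drought events as runs of at least min_len consecutive
    periods with SMAPI below threshold. Returns (start, end) index pairs."""
    events, start = [], None
    for i, v in enumerate(series):
        if v < threshold and start is None:
            start = i
        elif v >= threshold and start is not None:
            if i - start >= min_len:
                events.append((start, i - 1))
            start = None
    if start is not None and len(series) - start >= min_len:
        events.append((start, len(series) - 1))
    return events

vals = smapi(np.array([20.0, 12.0, 11.0, 10.0, 22.0]), 20.0)  # percent anomalies
print(drought_events(vals))  # [(1, 3)]: one event spanning periods 1-3
```

Applied per grid point, and then aggregated over a region's points, this yields the grid-point and regional events the abstract describes.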

  7. A Blacker and Browner Shade of Pale: Reconstructing Punk Rock History

    OpenAIRE

    Pietschmann, Franziska

    2010-01-01

    Embedded in the transatlantic history of rock ‘n’ roll, punk rock has not only been regarded as a watershed moment in terms of music, aesthetics and music-related cultural practices, it has also been perceived as a subversive white cultural phenomenon. A Blacker and Browner Shade of Pale challenges this widespread and shortsighted assumption. People of color, particularly black Americans and Britons, and Latina/os have pro-actively contributed to punk’s evolution and shaped punk music culture...

  8. Reconstructing depositional processes and history from reservoir stratigraphy: Englebright Lake, Yuba River, northern California

    Science.gov (United States)

    Snyder, N.P.; Wright, S.A.; Alpers, Charles N.; Flint, L.E.; Holmes, C.W.; Rubin, D.M.

    2006-01-01

    Reservoirs provide the opportunity to link watershed history with its stratigraphic record. We analyze sediment cores from a northern California reservoir in the context of hydrologic history, watershed management, and depositional processes. Observations of recent depositional patterns, sediment-transport calculations, and 137Cs geochronology support a conceptual model in which the reservoir delta progrades during floods of short duration (days) and is modified during prolonged (weeks to months) drawdowns that rework topset beds and transport sand from topsets to foresets. Sediment coarser than 0.25-0.5 mm deposits in foresets and topsets, and finer material falls out of suspension as bottomset beds. Simple hydraulic calculations indicate that fine sand (0.063-0.5 mm) is transported into the distal bottomset area only during floods. The overall stratigraphy suggests that two phases of delta building occurred in the reservoir. The first, from dam construction in 1940 to 1970, was heavily influenced by annual, prolonged >20 m drawdowns of the water level. The second, built on top of the first, reflects sedimentation from 1970 to 2002 when the influence of drawdowns was less. Sedimentation rates in the central part of the reservoir have declined ~25% since 1970, likely reflecting a combination of fewer large floods, changes in watershed management, and winnowing of stored hydraulic mining sediment. Copyright 2006 by the American Geophysical Union.

  9. Testing sex and gender in sports; reinventing, reimagining and reconstructing histories.

    Science.gov (United States)

    Heggie, Vanessa

    2010-12-01

    Most international sports organisations work on the premise that human beings come in one of two genders: male or female. Consequently, all athletes, including intersex and transgender individuals, must be assigned to compete in one or other category. Since the 1930s (not, as is popularly suggested, the 1960s) these organisations have relied on scientific and medical professionals to provide an 'objective' judgement of an athlete's eligibility to compete in women's national and international sporting events. The changing nature of these judgements reflects a great deal about our cultural, social and national prejudices, while the matter of testing itself has become a site of conflict for feminists and human rights activists. Because of the sensitive nature of this subject, histories of sex testing are difficult to write and research; this has led to the repetition of inaccurate information and false assertions about gender fraud, particularly in relation to the 'classic' cases of Stella Walsh and Heinrich/Hermann/Dora Ratjen. As historians, we need to be extremely careful to differentiate between mythologies and histories. Copyright © 2010 Elsevier Ltd. All rights reserved.

  10. Organic molecular paleohypsometry: A new approach to reconstructing the paleoelevation history of an orogen

    Science.gov (United States)

    Hren, M. T.; Ouimet, W. B.

    2017-12-01

    Paleoelevation data are critical to understanding the links and feedbacks between rock-uplift and erosion, yet few approaches have proved successful in quantifying paleoelevation change in rapidly eroding, tropical landscapes. In addition, quantitative methods of reconstructing paleoelevation from marine sedimentary archives are lacking. Here we present a new approach to quantifying changes in paleoelevation that is based on the geochemical signature of organic matter exported via the main river networks of an orogen. This new approach builds on fundamentals of stable isotope paleoaltimetry and is akin to the theory behind cosmogenic isotope records of catchment-integrated erosion. Specifically, we utilize predictable patterns of precipitation and organic molecular biomarker stable isotopes to relate the hypsometry of organic matter in a catchment to the geochemical signal in exported organic carbon. We present data from two sites (the cold temperate White Mountains of New Hampshire, USA and the tropical, rapidly eroding landscape of Taiwan) to demonstrate this relationship between exported carbon geochemistry and catchment hypsometry and the validity of this approach.

  11. Reconstructing the colonisation and diversification history of the endemic freshwater crab (Seychellum alluaudi) in the granitic and volcanic Seychelles Archipelago.

    Science.gov (United States)

    Daniels, Savel R

    2011-11-01

    The endemic, monotypic freshwater crab species Seychellum alluaudi was used as a template to examine the initial colonisation and evolutionary history among the major islands in the Seychelles Archipelago. Five of the "inner" islands in the Seychelles Archipelago including Mahé, Praslin, Silhouette, La Digue and Frégate were sampled. Two partial mtDNA fragments, 16S rRNA and cytochrome oxidase subunit I (COI), were sequenced for 83 specimens of S. alluaudi. Evolutionary relationships between populations were inferred from the combined mtDNA dataset using maximum parsimony, maximum likelihood and Bayesian inferences. Analyses of molecular variance (AMOVA) were used to examine genetic variation among and within clades. A haplotype network was constructed using TCS while BEAST was employed to date the colonisation and divergence of lineages on the islands. Phylogenetic analyses of the combined mtDNA data set of 1103 base pairs retrieved a monophyletic S. alluaudi group comprising three statistically well-supported monophyletic clades. Clade one was exclusive to Silhouette; clade two included samples from Praslin sister to La Digue, while clade three comprised samples from Mahé sister to Frégate. The haplotype network corresponded to the three clades. Within Mahé, substantial phylogeographic substructure was evident. AMOVA results revealed limited genetic variation within localities with most variation occurring among localities. Divergence time estimations predated the Holocene sea level regressions and indicated a Pliocene/Pleistocene divergence between the three clades evident within S. alluaudi. The monophyly of each clade suggests that transoceanic dispersal is rare. The absence of shared haplotypes between the three clades, coupled with marked sequence divergence values suggests the presence of three allospecies within S. alluaudi. Copyright © 2011 Elsevier Inc. All rights reserved.

  12. Fiction as Reconstruction of History: Narratives of the Civil War in American Literature

    Directory of Open Access Journals (Sweden)

    Reinhard Isensee

    2009-09-01

    Full Text Available Even after more than 140 years the American Civil War continues to serve as a major source of inspiration for a plethora of literature in various genres. While only amounting to a brief period in American history in terms of years, this war has proved to be one of the central moments for defining the American nation since the second half of the nineteenth century. The facets of the Civil War, its protagonists, places, events, and political, social and cultural underpinnings seem to hold an ongoing fascination for both academic studies and fictional representations. Thus, it has been considered by many the most written-about war in the United States.

  13. RECONSTRUCTING THE ACCRETION HISTORY OF THE GALACTIC STELLAR HALO FROM CHEMICAL ABUNDANCE RATIO DISTRIBUTIONS

    International Nuclear Information System (INIS)

    Lee, Duane M.; Johnston, Kathryn V.; Sen, Bodhisattva; Jessop, Will

    2015-01-01

    Observational studies of halo stars during the past two decades have placed some limits on the quantity and nature of accreted dwarf galaxy contributions to the Milky Way (MW) stellar halo by typically utilizing stellar phase-space information to identify the most recent halo accretion events. In this study we tested the prospects of using 2D chemical abundance ratio distributions (CARDs) found in stars of the stellar halo to determine its formation history. First, we used simulated data from 11 “MW-like” halos to generate satellite template sets (STSs) of 2D CARDs of accreted dwarf satellites, which are composed of accreted dwarfs from various mass regimes and epochs of accretion. Next, we randomly drew samples of ∼10³–10⁴ mock observations of stellar chemical abundance ratios ([α/Fe], [Fe/H]) from those 11 halos to generate samples of the underlying densities for our CARDs to be compared to our templates in our analysis. Finally, we used the expectation-maximization algorithm to derive accretion histories in relation to the STS used and the sample size. For certain STSs used we typically can identify the relative mass contributions of all accreted satellites to within a factor of two. We also find that this method is particularly sensitive to older accretion events involving low-luminosity dwarfs, e.g., ultra-faint dwarfs—precisely those events that are too ancient to be seen by phase-space studies of stars and too faint to be seen by high-z studies of the early universe. Since our results only exploit two chemical dimensions and near-future surveys promise to provide ∼6–9 dimensions, we conclude that these new high-resolution spectroscopic surveys of the stellar halo will allow us to recover its accretion history—and the luminosity function of infalling dwarf galaxies—across cosmic time.
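The expectation-maximization step described above amounts to estimating the mixture weights of a set of fixed template densities from observed samples. A minimal sketch, with simple 1D Gaussian "templates" standing in for the 2D CARD templates (the function and variable names are illustrative, not the paper's):

```python
import numpy as np

def em_mixture_weights(F, iters=200):
    """EM for the relative contributions (mixture weights) of fixed
    template densities. F[i, s] holds the density of template s
    evaluated at observation i; only the weights are estimated, the
    templates stay fixed, mirroring the satellite-template setup."""
    n, k = F.shape
    w = np.full(k, 1.0 / k)
    for _ in range(iters):
        resp = F * w                              # E-step: responsibilities
        resp /= resp.sum(axis=1, keepdims=True)
        w = resp.mean(axis=0)                     # M-step: re-estimate weights
    return w

# Toy example: two 1D Gaussian "templates", data drawn in a 70/30 mix.
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(-2, 1, 700), rng.normal(2, 1, 300)])
def gauss(v, mu):
    return np.exp(-0.5 * (v - mu) ** 2) / np.sqrt(2 * np.pi)
F = np.column_stack([gauss(x, -2.0), gauss(x, 2.0)])
w = em_mixture_weights(F)
print(w)  # close to [0.7, 0.3]
```

In the paper's setting, each column of `F` would be a satellite template CARD evaluated at the observed ([α/Fe], [Fe/H]) pairs, and the recovered weights are the relative mass contributions.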

  14. RECONSTRUCTING THE ACCRETION HISTORY OF THE GALACTIC STELLAR HALO FROM CHEMICAL ABUNDANCE RATIO DISTRIBUTIONS

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Duane M. [Key Laboratory for Research in Galaxies and Cosmology, Shanghai Astronomical Observatory, Chinese Academy of Sciences, 80 Nandan Road, Shanghai 200030 (China); Johnston, Kathryn V. [Department of Astronomy, Columbia University, New York City, NY 10027 (United States); Sen, Bodhisattva; Jessop, Will, E-mail: duane@shao.ac.cn [Department of Statistics, Columbia University, New York City, NY 10027 (United States)

    2015-03-20

    Observational studies of halo stars during the past two decades have placed some limits on the quantity and nature of accreted dwarf galaxy contributions to the Milky Way (MW) stellar halo by typically utilizing stellar phase-space information to identify the most recent halo accretion events. In this study we tested the prospects of using 2D chemical abundance ratio distributions (CARDs) found in stars of the stellar halo to determine its formation history. First, we used simulated data from 11 “MW-like” halos to generate satellite template sets (STSs) of 2D CARDs of accreted dwarf satellites, which are composed of accreted dwarfs from various mass regimes and epochs of accretion. Next, we randomly drew samples of ∼10³–10⁴ mock observations of stellar chemical abundance ratios ([α/Fe], [Fe/H]) from those 11 halos to generate samples of the underlying densities for our CARDs to be compared to our templates in our analysis. Finally, we used the expectation-maximization algorithm to derive accretion histories in relation to the STS used and the sample size. For certain STSs used we typically can identify the relative mass contributions of all accreted satellites to within a factor of two. We also find that this method is particularly sensitive to older accretion events involving low-luminosity dwarfs, e.g., ultra-faint dwarfs—precisely those events that are too ancient to be seen by phase-space studies of stars and too faint to be seen by high-z studies of the early universe. Since our results only exploit two chemical dimensions and near-future surveys promise to provide ∼6–9 dimensions, we conclude that these new high-resolution spectroscopic surveys of the stellar halo will allow us to recover its accretion history—and the luminosity function of infalling dwarf galaxies—across cosmic time.

  15. Knee-Extension Torque Variability and Subjective Knee Function in Patients with a History of Anterior Cruciate Ligament Reconstruction.

    Science.gov (United States)

    Goetschius, John; Hart, Joseph M

    2016-01-01

    When returning to physical activity, patients with a history of anterior cruciate ligament reconstruction (ACL-R) often experience limitations in knee-joint function that may be due to chronic impairments in quadriceps motor control. Assessment of knee-extension torque variability may demonstrate underlying impairments in quadriceps motor control in patients with a history of ACL-R. To identify differences in maximal isometric knee-extension torque variability between knees that have undergone ACL-R and healthy knees and to determine the relationship between knee-extension torque variability and self-reported knee function in patients with a history of ACL-R. Descriptive laboratory study. Laboratory. A total of 53 individuals with primary, unilateral ACL-R (age = 23.4 ± 4.9 years, height = 1.7 ± 0.1 m, mass = 74.6 ± 14.8 kg) and 50 individuals with no history of substantial lower extremity injury or surgery who served as controls (age = 23.3 ± 4.4 years, height = 1.7 ± 0.1 m, mass = 67.4 ± 13.2 kg). Torque variability, strength, and central activation ratio (CAR) were calculated from 3-second maximal knee-extension contraction trials (90° of flexion) with a superimposed electrical stimulus. All participants completed the International Knee Documentation Committee (IKDC) Subjective Knee Evaluation Form, and we determined the number of months after surgery. Group differences were assessed using independent-samples t tests. Correlation coefficients were calculated among torque variability, strength, CAR, months after surgery, and IKDC scores. Torque variability, strength, CAR, and months after surgery were regressed on IKDC scores using stepwise, multiple linear regression. Torque variability was greater, and strength, CAR, and IKDC scores were lower, in the ACL-R group than in the control group. Torque variability and strength were correlated with IKDC scores. Torque variability, strength, and CAR were correlated with each other. Torque variability alone

  16. Reconstructing the Life Histories of Spanish Primary School Teachers: A Novel Approach for the Study of the Teaching Profession and School Culture

    Science.gov (United States)

    Mahamud, Kira; Martínez Ruiz-Funes, María José

    2014-01-01

    This paper describes a study dealing with the reconstruction of the lives of two Spanish primary school teachers during the Franco dictatorship (1939-1975), in order to learn to what extent such a field of research can contribute to the history of education. Two family archives provide extraordinary and unique documentation to track down their…

  17. A Reconstruction of Development of the Periodic Table Based on History and Philosophy of Science and Its Implications for General Chemistry Textbooks

    Science.gov (United States)

    Brito, Angmary; Rodriguez, Maria A.; Niaz, Mansoor

    2005-01-01

    The objectives of this study are: (a) elaboration of a history and philosophy of science (HPS) framework based on a reconstruction of the development of the periodic table; (b) formulation of seven criteria based on the framework; and (c) evaluation of 57 freshman college-level general chemistry textbooks with respect to the presentation of the…

  18. Late Pliocene Depositional History and Paleoclimate Reconstructions of the Southwest Pacific

    Science.gov (United States)

    Royce, B.; Patterson, M. O.; Pietras, J.

    2017-12-01

    Drift deposits off the eastern margin of New Zealand are important archives for the paleoclimate and paleoceanographic history of the southwest Pacific. Ocean Drilling Program (ODP) Site 1123 is located on the North Chatham rise drift just North of the westerly wind driven Subtropical Front (STF) and provides a record of near continuous sediment deposition since the Miocene along the southwest Pacific deep western boundary current (DWBC). While the Miocene and Late Pleistocene portions of this record have been well studied, the Late Pliocene record is less well developed. Southern Ocean geological records demonstrate that Late Pliocene cooling is the transitional interval between the warmer-than-present Early Pliocene and bipolar glaciation at 2.7 Ma. A newly developed, robust, and astronomically tuned long-term record of benthic δ13C from ODP Site 1123 spanning the Early to Late Pliocene implies a reduction in Southern Ocean ventilation and lowering of preformed values from waters sourced along the Antarctic margin during the Late Pliocene. Thus, Late Pliocene Southern Hemisphere cooling and sea ice expansion may have drastically reduced outgassing and increased the burial of heat into the deep ocean. South Atlantic records off the west coast of Africa demonstrate an increase in the flux of iron to the open ocean during this time potentially enhancing surface ocean productivity and providing an additional cooling mechanism. Currently, atmospheric transport of dust to the Southern Ocean is dominated by persistent mid-latitude circumpolar westerly winds; this is particularly relevant for dust sourced from New Zealand. The Late Pliocene to Early Pleistocene uplift of the North Island axial ranges and South Island Southern Alps potentially provided a greater amount of not only sediment to the deep ocean, but also wind-blown dust to the Pacific sector of the Southern Ocean. We will present a detailed high-resolution sedimentological study on the development of the Chatham

  19. Reconstructing disturbance history for an intensively mined region by time-series analysis of Landsat imagery.

    Science.gov (United States)

    Li, Jing; Zipper, Carl E; Donovan, Patricia F; Wynne, Randolph H; Oliphant, Adam J

    2015-09-01

    Surface mining disturbances have attracted attention globally due to extensive influence on topography, land use, ecosystems, and human populations in mineral-rich regions. We analyzed a time series of Landsat satellite imagery to produce a 28-year disturbance history for surface coal mining in a segment of eastern USA's central Appalachian coalfield, southwestern Virginia. The method was developed and applied as a three-step sequence: vegetation index selection, persistent vegetation identification, and mined-land delineation by year of disturbance. The overall classification accuracy and kappa coefficient were 0.9350 and 0.9252, respectively. Most surface coal mines were identified correctly by location and by time of initial disturbance. More than 8 % of southwestern Virginia's >4000-km² coalfield area was disturbed by surface coal mining over the 28-year period. Approximately 19.5 % of the Appalachian coalfield surface within the most intensively mined county (Wise County) has been disturbed by mining. Mining disturbances expanded steadily and progressively over the study period. Information generated can be applied to gain further insight concerning mining influences on ecosystems and other essential environmental features.

  20. Reconstruction of the paleothermal history of the Lower Guajira sub-basin, Colombia

    International Nuclear Information System (INIS)

    Garcia Gonzalez, Mario; Mier Umana, Ricardo; Cruz Guevara, Luis Enrique

    2010-01-01

    The paleothermal history of the Lower Guajira sub-basin was reconstructed using three methods: 1) calculation of the present geothermal gradient and heat flow, 2) vitrinite reflectance, and 3) fission-track analysis of apatite and zircon grains. New analytical data from vitrinite reflectance and fission tracks allowed the identification of four thermal events with the following features: the oldest thermal event took place during the Late Cretaceous, between 95 and 65 Ma. The second thermal event occurred during the late Eocene, between 40 and 35 Ma. The third thermal event occurred in the early to middle Miocene, between 22 and 15 Ma. The fourth and last thermal event took place in the late Miocene, between 10 and 5 Ma. The cooling events match four unconformities previously identified: 1) the Late Cretaceous unconformity at the top of the Guaralamai Formation, 2) the late Eocene unconformity at the base of the Siamana Formation, 3) the early to middle Miocene unconformity at the top of the Siamana Formation, and 4) the late Miocene unconformity at the top of the Castilletes Formation.

  1. Reconstruction of El Niño history from reef corals

    Directory of Open Access Journals (Sweden)

    1993-01-01

    Full Text Available. Of all the mineral phases found in the marine environment, carbonates of various biological origins have proven to be the richest repository of paleoceanographic and paleoclimatic information. Over the last two decades, a process framework has evolved to permit the interpretation of such information in aragonitic reef corals. These corals provide an especially useful archive because of their wide distribution, their temporal banding, and a geochemistry well suited to recording environmental information. This article reviews these attributes from the perspective of the historical influences of the El Niño-Southern Oscillation on the tropical Pacific Ocean.

  2. The reconstructive study in archaeology: case histories in communication issues

    Directory of Open Access Journals (Sweden)

    Francesco Gabellone

    2011-09-01

    Full Text Available The most significant results obtained by the Information Technologies Lab (IBAM CNR - ITLab) in the construction of VR-based knowledge platforms have been achieved in projects such as ByHeriNet, Archeotour, Interadria, Interreg Greece-Italy, Iraq Virtual Museum, etc. These projects were guided by the belief that, in order to be effective, the process of communicating Cultural Heritage to the wider public should be as free as possible from the sterile old VR interfaces of the 1990s. In operational terms, this translates into solutions that are as lifelike as possible and guarantee the maximum emotional involvement of the viewer, adopting the same techniques as are used in modern cinema. Communication thus becomes entertainment and a vehicle for high-quality content, aimed at the widest possible public and produced with the help of interdisciplinary tools and methods. In this context, high-end technologies are no longer the goal of research; rather, they are the invisible engine of an unstoppable process that is making it harder and harder to distinguish between computer images and real objects. An emblematic case in this regard is the reconstructive study of ancient contexts, where three-dimensional graphics compensate for the limited expressive potential of two-dimensional drawings and allow for interpretative and representative solutions that were unimaginable a few years ago. The virtual space thus becomes an important opportunity for reflection and study, as well as constituting a revolutionary way for the wider public to learn.

  3. "Like sugar in milk": reconstructing the genetic history of the Parsi population.

    Science.gov (United States)

    Chaubey, Gyaneshwer; Ayub, Qasim; Rai, Niraj; Prakash, Satya; Mushrif-Tripathy, Veena; Mezzavilla, Massimo; Pathak, Ajai Kumar; Tamang, Rakesh; Firasat, Sadaf; Reidla, Maere; Karmin, Monika; Rani, Deepa Selvi; Reddy, Alla G; Parik, Jüri; Metspalu, Ene; Rootsi, Siiri; Dalal, Kurush; Khaliq, Shagufta; Mehdi, Syed Qasim; Singh, Lalji; Metspalu, Mait; Kivisild, Toomas; Tyler-Smith, Chris; Villems, Richard; Thangaraj, Kumarasamy

    2017-06-14

    The Parsis are one of the smallest religious communities in the world. To understand the population structure and demographic history of this group in detail, we analyzed Indian and Pakistani Parsi populations using high-resolution genetic variation data on autosomal and uniparental loci (Y-chromosomal and mitochondrial DNA). Additionally, we assayed mitochondrial DNA polymorphisms in ancient Parsi DNA samples excavated from Sanjan, in present-day Gujarat, the place of their original settlement in India. Among present-day populations, the Parsis are genetically closest to Iranian and Caucasus populations rather than to their South Asian neighbors. They also share the highest number of haplotypes with present-day Iranians, and we estimate that the admixture of the Parsis with Indian populations occurred ~1,200 years ago. Enriched homozygosity in the Parsis reflects their recent isolation and inbreeding. We also observed 48% South-Asian-specific mitochondrial lineages among the ancient samples, which might have resulted from the assimilation of local females during the initial settlement. Finally, we show that Parsis are genetically closer to Neolithic Iranians than to modern Iranians, who have witnessed a more recent wave of admixture from the Near East. Our results are consistent with the historically recorded migration of the Parsi populations to South Asia in the 7th century and in agreement with their assimilation into the Indian subcontinent's population and cultural milieu "like sugar in milk". Moreover, in a wider context, our results support a major demographic transition in West Asia due to the Islamic conquest.

  4. Bayesian integration of radioisotope dating (210Pb, 137Cs, 241Am, 14C) and an 18-20th century mining history of Brotherswater, English Lake District

    Science.gov (United States)

    Schillereff, Daniel; Chiverrell, Richard; Macdonald, Neil; Hooke, Janet; Welsh, Katharine; Piliposyan, Gayane; Appleby, Peter

    2014-05-01

    Lake sediment records are often a useful tool for investigating landscape evolution, as geomorphic changes in the catchment are reflected in altered sediment properties in the material transported through the watershed and deposited at the lake bed. Recent research at Brotherswater, an upland waterbody in the Lake District, northwest England, has focused on reconstructing historical floods from their sedimentary signatures and calculating long-term sediment and carbon budgets from fourteen sediment cores extracted from across the basin. Developing accurate chronological control is essential for these tasks. One sediment core (BW11-2; 3.5 m length) from the central basin has been dated using artificial radionuclide measurements (210Pb, 137Cs, 241Am) for the uppermost sediments and radiocarbon (14C) for the lower sediments. The core appears to span the past 1500 years; however, a number of problems have arisen. We present our explanations for these errors, the independent chronological techniques used to generate an accurate age-depth model for this core, and methods for its transfer to the other 13 cores extracted from the basin. Two distinct 137Cs markers, corresponding to the 1986 Chernobyl disaster and 1960s weapons testing, confirm the 210Pb profile for sediment deposition since ~1950, but calculations prior to this appear erroneous, possibly due to a hiatus in the sediment record. We used high-resolution geochemical profiles (measured by XRF) to cross-correlate with a second 210Pb-dated chronology from a more distal location, which returned more sensible results. Unfortunately, the longer 14C sequence exhibits two age reversals (radiocarbon dates that are too old). We believe the uppermost two dates are erroneous, due to a shift in inflow location as a flood-prevention measure ~1900 A.D., dated using information from historical maps. The lower age reversal coincides with a greater supply of terrigenous material to the lake (increased Zr, K, Ti concentrations

  5. Role of stable isotope analyses in reconstructing past life-histories and provenancing human skeletal remains: a review

    Directory of Open Access Journals (Sweden)

    Sehrawat Jagmahender Singh

    2017-09-01

    Full Text Available This article reviews the current use of stable isotopes (mainly δ13C, δ15N, δ18O, and 87Sr) to trace past life behaviours such as breastfeeding and weaning practices, geographic origin, migration history, paleodiet and subsistence patterns of past populations from the chemical signatures of isotopes imprinted in human skeletal remains. The approach rests on the premise that food-web isotopic signatures are recorded in human bones and teeth and that such signatures vary in parallel with a variety of biogeochemical processes. By measuring δ13C and δ15N isotopic values in subadult tissues of different ages, the level of breast-milk ingestion at particular ages and the components of complementary foods can be assessed. Strontium and oxygen isotopic analyses have been used to determine geographic origins and reconstruct the way of life of past populations, as these isotopes map the isotopic signature of the area from which a person acquired water and food early in life. Strontium and oxygen isotopic values are considered specific to geographical areas and serve as reliable chemical signatures of the migration history of past human populations (local or non-local to the site). Previous isotopic studies show that the subsistence patterns of past human populations underwent extensive changes, from nomadic strategies to complete agricultural dependence. The carbon and nitrogen isotopic values of local fauna at an archaeological site can be used to elucidate the prominence of freshwater resources in the diet of past human populations near the site. More extensive research covering isotopic descriptions of various prehistoric, historic and modern populations is needed to explore the role of stable isotope analysis in provenancing human skeletal remains and assessing the migration patterns and routes, geographic origins, paleodiet and subsistence practices of past populations.

  6. Cenozoic basin thermal history reconstruction and petroleum systems in the eastern Colombian Andes

    Science.gov (United States)

    Parra, Mauricio; Mora, Andres; Ketcham, Richard A.; Stockli, Daniel F.; Almendral, Ariel

    2017-04-01

    Late Mesozoic-Cenozoic retro-arc foreland basins along the eastern margin of the Andes in South America host the world's best detrital record for the study of subduction orogenesis. The world's most prolific petroleum systems occur in the northernmost of these foreland basin systems, in Ecuador, Colombia and Venezuela, yet over 90% of the hydrocarbons discovered there occur in a single province in northeastern Venezuela. A successful industry-academia collaboration applied a multidisciplinary approach to the study of the northern Andes with the aim of investigating both the driving mechanisms of orogenesis and its impact on hydrocarbon accumulation in eastern Colombia. The Eastern Cordillera is an inversion orogen located at the leading edge of the northern Andes. Syn-rift subsidence favored the accumulation of kilometer-thick, organic-matter-rich shales in a back-arc basin in the Early Cretaceous. Subsequent Late Cretaceous thermal subsidence prompted the accumulation of shallow-marine sandstones and shales, the latter including the Turonian-Cenomanian main hydrocarbon source rock. Early Andean uplift since the Paleocene led to the development of a flexural basin, filled mainly with non-marine strata. We have studied the Meso-Cenozoic thermal evolution of these basins through modeling of a large thermochronometric database including hundreds of apatite and zircon fission-track and (U-Th)/He data, as well as paleothermometric information based on vitrinite reflectance and present-day temperatures measured in boreholes. The detrital record of Andean construction was also investigated through detrital zircon U-Pb geochronometry of outcrop and borehole samples. A comprehensive burial/exhumation history has been constructed through three main modeling strategies. First, one-dimensional subsidence was used to invert the pre-extensional lithospheric thicknesses, the magnitude of stretching, and the resulting heat flow associated with extension. 
The amount of eroded section and

  7. Reconstruction of the pollution history of alkylphenols (4-tert-octylphenol, 4-nonylphenol) in the Baltic Sea.

    Science.gov (United States)

    Graca, Bożena; Staniszewska, Marta; Zakrzewska, Danuta; Zalewska, Tamara

    2016-06-01

    This paper reports the reconstruction of the pollution history of 4-tert-octylphenol (OP) and 4-nonylphenol (NP) in the Baltic Sea. Alkylphenols are endocrine-disrupting compounds and therefore toxic to aquatic organisms. Sediment cores were collected from regions with relatively stable sedimentation conditions. The cores were dated by the 210Pb method. OP and NP were determined using HPLC-FL. The highest inventory of these compounds was observed in the Gotland Deep (610 μg m⁻² of NP and 47 μg m⁻² of OP) and the lowest on the slope of the Gdansk Deep (24 μg m⁻² of NP and 16 μg m⁻² of OP). Such spatial distribution was probably, among other factors, the result of uplift of the sea floor. The pollution trends of OP and NP in sediments coincided with: (1) the beginnings of eutrophication (1960s-1970s of the twentieth century) and (2) a strong increase in the areal extent and volume of hypoxia and anoxia in the Baltic in the present century.

  8. Prior approval: the growth of Bayesian methods in psychology.

    Science.gov (United States)

    Andrews, Mark; Baguley, Thom

    2013-02-01

    Within the last few years, Bayesian methods of data analysis in psychology have proliferated. In this paper, we briefly review the history of the Bayesian approach to statistics, and consider the implications that Bayesian methods have for the theory and practice of data analysis in psychology.

  9. Late-Holocene Environmental Reconstruction and Depositional History from a Taxodium Swamp near Lake Pontchartrain in Southern Louisiana

    Science.gov (United States)

    Ryu, J.; Bianchette, T. A.; Liu, K. B.; Yao, Q.; Maiti, K.

    2017-12-01

    The hydrological and environmental history of estuarine wetlands in Louisiana is not well-documented. To better understand the depositional processes in coastal wetlands, this study aims to reconstruct the environmental changes and document the occurrence of event deposits found in a bald cypress (Taxodium distichum) swamp approximately 800 m west of Lake Pontchartrain, a site susceptible to wind-generated storm surges as well as inundation from other fluvial and lacustrine processes. 210Pb analysis of a 59 cm sediment core (WMA-1) suggests that it has a sedimentation rate of 0.39 cm/year, consistent with the detection of a 137Cs peak at 17 cm from the core top. Results of sedimentological, geochemical, and palynological analyses reveal that the core contains two distinct sediment facies: an organic-rich dark brown peat unit from 0 to 29 cm containing low concentrations of terrestrial elements (e.g., Ti, Fe, and K), and a clay unit from 30 to 59 cm with elevated concentrations of most elements. Two thin clay layers, at 3-5 cm and 14-19 cm, embedded in the upper peat section are probably attributed to two recent storm events, Hurricane Isaac (2012) and Hurricane Gustav (2008), because both hurricanes caused heavy rain and significant storm-surge flooding at the study site. The pollen assemblage in the clay section is dominated by TCT (mainly Taxodium), but it is replaced by Salix and wetland herbaceous taxa in the overlying peat section. The multi-proxy data suggest that a cypress swamp has been present at the site for at least several hundred years but Taxodium was being replaced by willow (Salix) and other bottomland hardwood trees and wetland herbs as the water level dropped. Human activities may have been an important factor causing the hydrological and ecological changes at the site during the past century.
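    The constant sedimentation rate reported above turns depth directly into age. A minimal sketch of that conversion, using the 0.39 cm/year rate and the 17 cm 137Cs peak from the abstract; the coring year is an assumed placeholder, not stated in the abstract:

```python
# Age-depth conversion under a constant sedimentation rate, as implied
# by the 210Pb profile of core WMA-1. Rate and 137Cs peak depth are
# from the abstract; CORING_YEAR is an assumption for illustration.

SED_RATE_CM_PER_YR = 0.39   # from 210Pb analysis
CORING_YEAR = 2016          # assumed; not stated in the abstract

def depth_to_year(depth_cm: float) -> float:
    """Convert depth below the core top to an approximate calendar year."""
    return CORING_YEAR - depth_cm / SED_RATE_CM_PER_YR

# The 137Cs fallout maximum from atmospheric weapons testing (~1963)
# should sit near the depth predicted by the 210Pb-derived rate:
predicted_cs_depth = (CORING_YEAR - 1963) * SED_RATE_CM_PER_YR

print(round(depth_to_year(17.0)))    # approximate year of the observed 137Cs peak
print(round(predicted_cs_depth, 1))  # expected peak depth in cm
```

Agreement between the observed 137Cs peak depth and the 210Pb prediction is the consistency check the abstract refers to.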

  10. Bayesian programming

    CERN Document Server

    Bessiere, Pierre; Ahuactzin, Juan Manuel; Mekhnacha, Kamel

    2013-01-01

    Probability as an Alternative to Boolean Logic. While logic is the mathematical foundation of rational reasoning and the fundamental principle of computing, it is restricted to problems where information is both complete and certain. However, many real-world problems, from financial investments to email filtering, are incomplete or uncertain in nature. Probability theory and Bayesian computing together provide an alternative framework to deal with incomplete and uncertain data. Decision-Making Tools and Methods for Incomplete and Uncertain Data. Emphasizing probability as an alternative to Boolean
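    The email-filtering example mentioned above is classically handled with naive Bayes classification. A minimal sketch of that style of Bayesian reasoning; all word counts and message totals are invented for illustration, not taken from the book:

```python
# Minimal naive Bayes spam-filter sketch. Training counts are invented.
from math import log

# Hypothetical counts: messages containing each word, per class.
spam_counts = {"offer": 8, "meeting": 1, "free": 9}
ham_counts  = {"offer": 2, "meeting": 7, "free": 1}
n_spam, n_ham = 20, 20  # hypothetical numbers of training messages

def log_posterior(words, counts, n_msgs, n_total):
    # log P(class) + sum of log P(word | class), with add-one smoothing
    lp = log(n_msgs / n_total)
    for w in words:
        lp += log((counts.get(w, 0) + 1) / (n_msgs + 2))
    return lp

def classify(words):
    total = n_spam + n_ham
    s = log_posterior(words, spam_counts, n_spam, total)
    h = log_posterior(words, ham_counts, n_ham, total)
    return "spam" if s > h else "ham"

print(classify(["free", "offer"]))   # words frequent in spam -> spam
print(classify(["meeting"]))         # word frequent in ham -> ham
```

Working in log space avoids underflow when many word probabilities are multiplied.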

  11. Bayesian methods outperform parsimony but at the expense of precision in the estimation of phylogeny from discrete morphological data.

    Science.gov (United States)

    O'Reilly, Joseph E; Puttick, Mark N; Parry, Luke; Tanner, Alastair R; Tarver, James E; Fleming, James; Pisani, Davide; Donoghue, Philip C J

    2016-04-01

    Different analytical methods can yield competing interpretations of evolutionary history and, currently, there is no definitive method for phylogenetic reconstruction using morphological data. Parsimony has been the primary method for analysing morphological data, but there has been a resurgence of interest in the likelihood-based Mk-model. Here, we test the performance of the Bayesian implementation of the Mk-model relative to both equal-weights and implied-weights implementations of parsimony. Using simulated morphological data, we demonstrate that the Mk-model outperforms equal-weights parsimony in terms of topological accuracy, and that implied-weights parsimony performs worst. However, the Mk-model produces phylogenies that have less resolution than parsimony methods. This difference in the accuracy and precision of parsimony and Bayesian approaches to topology estimation needs to be considered when selecting a method for phylogenetic reconstruction. © 2016 The Authors.
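    For context, the Mk-model referred to above assumes all changes among a character's k states occur at equal rates. A sketch of its per-branch transition probabilities (the Lewis 2001 form, with branch length in expected substitutions per character), which is the quantity likelihood and Bayesian implementations evaluate:

```python
# Mk-model transition probabilities for a k-state character.
from math import exp

def mk_prob(t: float, k: int, same: bool) -> float:
    """P(end state | start state) after branch length t under Mk."""
    stay = 1.0 / k + (k - 1) / k * exp(-k * t / (k - 1))
    return stay if same else (1.0 - stay) / (k - 1)

# At t = 0 a character cannot change; as t grows, all states approach
# equal probability 1/k, so very long branches carry little signal:
print(mk_prob(0.0, 2, same=True))    # 1.0
print(mk_prob(10.0, 2, same=True))   # ~0.5
```

The loss of signal on long branches is one reason Bayesian Mk analyses return less resolved (but better calibrated) trees than parsimony.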

  12. Reconstruction of the Evolutionary History and Dispersal of Usutu Virus, a Neglected Emerging Arbovirus in Europe and Africa.

    Science.gov (United States)

    Engel, Dimitri; Jöst, Hanna; Wink, Michael; Börstler, Jessica; Bosch, Stefan; Garigliany, Mutien-Marie; Jöst, Artur; Czajka, Christina; Lühken, Renke; Ziegler, Ute; Groschup, Martin H; Pfeffer, Martin; Becker, Norbert; Cadar, Daniel; Schmidt-Chanasit, Jonas

    2016-02-02

    Usutu virus (USUV), one of the most neglected Old World encephalitic flaviviruses, causes epizootics among wild and captive birds and sporadic infections in humans. The dynamics of USUV spread and evolution in its natural hosts are unknown. Here, we present the phylogeny and evolutionary history of all available USUV strains, including 77 newly sequenced complete genomes from a variety of host species, sampled at high temporal and spatial resolution. The results showed that USUV can be classified into six distinct lineages and that the most recent common ancestor of the recent European epizootics emerged in Africa at least 500 years ago. We demonstrated that USUV was introduced regularly from Africa into Europe in the last 50 years, and that the genetic diversity of European lineages is shaped primarily by in situ evolution, while the African lineages have been driven by extensive gene flow. Most of the amino acid changes are deleterious polymorphisms removed by purifying selection, with adaptive evolution restricted to the NS5 gene and several others evolving under episodic directional selection, indicating that ecological or immunological factors were mostly the key determinants of USUV dispersal and outbreaks. Host-specific mutations have been detected, and host-transition analysis identified mosquitoes as the most likely origin of the common ancestor and birds as the source of the recent European USUV lineages. Our results suggest that the major migratory bird flyways could predict the continental and intercontinental dispersal patterns of USUV and that migratory birds might act as potential long-distance dispersal vehicles. Usutu virus (USUV), a mosquito-borne flavivirus of the Japanese encephalitis virus antigenic group, has caused massive bird die-offs, mostly in Europe. There is increasing evidence that USUV can be pathogenic for humans, making it a potential public health problem. The emergence of USUV in Europe allows us to understand how an arbovirus

  13. Bayesian Graphical Models

    DEFF Research Database (Denmark)

    Jensen, Finn Verner; Nielsen, Thomas Dyhre

    2016-01-01

    Mathematically, a Bayesian graphical model is a compact representation of the joint probability distribution for a set of variables. The most frequently used type of Bayesian graphical models are Bayesian networks. The structural part of a Bayesian graphical model is a graph consisting of nodes...
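    A Bayesian network factors the joint distribution into one conditional table per node, and queries can then be answered by summing over the unobserved variables. A minimal sketch with the classic Rain/Sprinkler/WetGrass structure; all probability values are invented for illustration:

```python
# Tiny Bayesian network: Rain -> WetGrass <- Sprinkler.
# Joint factorization: P(r, s, w) = P(r) * P(s) * P(w | r, s).
P_rain = 0.2
P_sprinkler = 0.1
P_wet = {(True, True): 0.99, (True, False): 0.9,
         (False, True): 0.8, (False, False): 0.05}  # P(wet | rain, sprinkler)

def joint(r, s, w):
    pr = P_rain if r else 1 - P_rain
    ps = P_sprinkler if s else 1 - P_sprinkler
    pw = P_wet[(r, s)] if w else 1 - P_wet[(r, s)]
    return pr * ps * pw

# Inference by enumeration: P(Rain = True | WetGrass = True)
num = sum(joint(True, s, True) for s in (True, False))
den = sum(joint(r, s, True) for r in (True, False) for s in (True, False))
print(round(num / den, 3))  # posterior probability of rain given wet grass
```

Enumeration is exponential in the number of variables; practical systems use the graph structure (variable elimination, junction trees, or sampling) to do the same computation efficiently.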

  14. Coring of Karakel’ Lake sediments (Teberda River valley and prospects for reconstruction of glaciation and Holocene climate history in the Caucasus

    Directory of Open Access Journals (Sweden)

    O. N. Solomina

    2013-01-01

    Full Text Available Lacustrine sediments are an important data source for glacial and palaeoclimatic reconstructions. Having a number of distinct advantages, they can be successfully used to refine the glacial history and the age of moraine deposits, as well as serving as a basis for detailed climatic models of the Holocene. The article focuses on the coring of the sediments of Lake Karakel’ (Western Caucasus), undertaken to clarify the Holocene climatic history of the region, and provides the sampling methods, a lithological description of the sediment core, the radiocarbon dates obtained, and the element composition of the sediments. A preliminary overview of the coring results has prompted a reconsideration of the conventional view of glacial fluctuations in the Teberda valley and suggests the future possibility of high-resolution palaeoclimatic reconstruction for the Western Caucasus.

  15. Testing Pixel Translation Digital Elevation Models to Reconstruct Slip Histories: An Example from the Agua Blanca Fault, Baja California, Mexico

    Science.gov (United States)

    Wilson, J.; Wetmore, P. H.; Malservisi, R.; Ferwerda, B. P.; Teran, O.

    2012-12-01

    We use recently collected slip-vector and total-offset data from the Agua Blanca fault (ABF) to constrain a pixel-translation digital elevation model (DEM) that reconstructs the slip history of this fault. The model was built with a Perl script that reads a DEM file (Easting, Northing, Elevation) and a configuration file with coordinates defining the boundary of each fault segment. A pixel-translation vector is defined as a magnitude of lateral offset in an azimuthal direction. The program translates pixels north of the fault and prints their pre-faulting positions to a new DEM file that can be gridded and displayed. This analysis, in which multiple DEMs are created with different translation vectors, allows us to identify areas of transtension or transpression while seeing their topographic expression. The benefit of this technique, in contrast to a simple block model, is that the DEM yields a valuable graphic that can be used to pose new research questions. We have found that many topographic features, i.e. valleys and ridges, correlate across the fault, which likely has implications for the age of the ABF and long-term landscape evolution rates, and potentially provides confirmation of total slip assessments. The ABF of northern Baja California, Mexico is an active, dextral strike-slip fault that transfers Pacific-North American plate-boundary strain out of the Gulf of California and around the "Big Bend" of the San Andreas Fault. Total displacement on the ABF in the central and eastern parts of the fault is 10 +/- 2 km based on offset Early Cretaceous features such as terrane boundaries and intrusive bodies (plutons and dike swarms). Where the fault bifurcates to the west, the northern strand (northern Agua Blanca fault or NABF) is constrained to 7 +/- 1 km. 
We have not yet identified piercing points on the southern strand, the Santo Tomas fault (STF), but displacement is inferred to be ~4 km assuming that the sum of slip on the NABF and STF is
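    The pixel-translation step described above (the authors used a Perl script) can be sketched as follows; the fault trace is simplified to a line of constant northing, and all coordinates, offsets and azimuths are invented for illustration:

```python
# Pixel-translation sketch: points north of a fault trace are moved
# back along the slip azimuth by the total offset to recover their
# pre-faulting positions. All numeric values are hypothetical.
from math import sin, cos, radians

def restore(points, fault_northing, offset_m, azimuth_deg):
    """Back-slip (easting, northing, elev) pixels north of the fault."""
    # Back-slip direction is opposite the slip azimuth.
    de = -offset_m * sin(radians(azimuth_deg))
    dn = -offset_m * cos(radians(azimuth_deg))
    out = []
    for e, n, z in points:
        if n > fault_northing:           # only the northern block moves
            out.append((e + de, n + dn, z))
        else:
            out.append((e, n, z))
    return out

dem = [(500.0, 100.0, 820.0),   # south of the fault: fixed
       (500.0, 300.0, 860.0)]   # north of the fault: restored
print(restore(dem, fault_northing=200.0, offset_m=10000.0, azimuth_deg=90.0))
```

Regridding the restored points and checking whether valleys and ridges line up across the trace is the visual test the abstract describes.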

  16. Introduction to Bayesian statistics

    CERN Document Server

    Bolstad, William M

    2017-01-01

    There is a strong upsurge in the use of Bayesian methods in applied statistical analysis, yet most introductory statistics texts only present frequentist methods. Bayesian statistics has many important advantages that students should learn about if they are going into fields where statistics will be used. In this Third Edition, four newly added chapters address topics that reflect the rapid advances in the field of Bayesian statistics. The author continues to provide a Bayesian treatment of introductory statistical topics, such as scientific data gathering, discrete random variables, robust Bayesian methods, and Bayesian approaches to inference for discrete random variables, binomial proportion, Poisson, normal mean, and simple linear regression. In addition, newly developing topics in the field are presented in four new chapters: Bayesian inference with unknown mean and variance; Bayesian inference for a Multivariate Normal mean vector; Bayesian inference for the Multiple Linear Regression Model; and Computati...

  17. A parametric reconstruction of the deceleration parameter

    Energy Technology Data Exchange (ETDEWEB)

    Al Mamon, Abdulla [Manipal University, Manipal Centre for Natural Sciences, Manipal (India); Visva-Bharati, Department of Physics, Santiniketan (India); Das, Sudipta [Visva-Bharati, Department of Physics, Santiniketan (India)

    2017-07-15

    The present work is based on a parametric reconstruction of the deceleration parameter q(z) in a model for a spatially flat FRW universe filled with dark energy and non-relativistic matter. In cosmology, the parametric reconstruction technique attempts to build up a model by choosing a specific evolution scenario for a cosmological parameter and then estimating the values of the parameters with the help of different observational datasets. In this paper, we have proposed a logarithmic parametrization of q(z) to probe the evolution history of the universe. Using the type Ia supernova, baryon acoustic oscillation and cosmic microwave background datasets, the constraints on the arbitrary model parameters q₀ and q₁ are obtained (within 1σ and 2σ confidence limits) by the χ²-minimization technique. We have then reconstructed the deceleration parameter, the total EoS parameter ω_tot, and the jerk parameter, and have compared the reconstructed results for q(z) with other well-known parametrizations of q(z). We have also shown that two model selection criteria (namely, the Akaike information criterion and the Bayesian information criterion) provide a clear indication that our reconstructed model is well consistent with other popular models. (orig.)
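    The abstract does not give the functional form of the logarithmic parametrization, so the sketch below assumes an illustrative ansatz q(z) = q0 + q1·ln(1 + z) with invented parameter values, together with the standard flat-FRW relation q = (1 + 3·w_tot)/2 between the deceleration parameter and the total equation of state:

```python
# Illustrative parametric reconstruction of q(z). The logarithmic
# ansatz and the values of q0, q1 are assumptions, not the paper's fit.
from math import log

def q(z, q0=-0.5, q1=0.8):
    """Assumed logarithmic parametrization of the deceleration parameter."""
    return q0 + q1 * log(1.0 + z)

def w_tot(z, q0=-0.5, q1=0.8):
    """Total EoS for a flat FRW universe, from q = (1 + 3*w_tot)/2."""
    return (2.0 * q(z, q0, q1) - 1.0) / 3.0

# With these values the model accelerates today (q < 0) and
# decelerated in the past (q > 0), as the data require:
print(q(0.0))           # q0 today
print(q(1.5) > 0.0)     # deceleration at higher redshift
```

In the actual analysis, q0 and q1 would be fitted to the SNe Ia, BAO and CMB datasets by χ² minimization rather than fixed by hand.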

  18. Bayesian artificial intelligence

    CERN Document Server

    Korb, Kevin B

    2010-01-01

    Updated and expanded, Bayesian Artificial Intelligence, Second Edition provides a practical and accessible introduction to the main concepts, foundation, and applications of Bayesian networks. It focuses on both the causal discovery of networks and Bayesian inference procedures. Adopting a causal interpretation of Bayesian networks, the authors discuss the use of Bayesian networks for causal modeling. They also draw on their own applied research to illustrate various applications of the technology.New to the Second EditionNew chapter on Bayesian network classifiersNew section on object-oriente

  19. Bayesian artificial intelligence

    CERN Document Server

    Korb, Kevin B

    2003-01-01

    As the power of Bayesian techniques has become more fully realized, the field of artificial intelligence has embraced Bayesian methodology and integrated it to the point where an introduction to Bayesian techniques is now a core course in many computer science programs. Unlike other books on the subject, Bayesian Artificial Intelligence keeps mathematical detail to a minimum and covers a broad range of topics. The authors integrate Bayesian network technology and Bayesian network learning, and apply both to knowledge engineering. They emphasize understanding and intuition but also provide the algorithms and technical background needed for applications. Software, exercises, and solutions are available on the authors' website.

  20. Dendro-chronological analysis of fossilised wood in order to reconstruct the post-ice-age history of glaciers; Dendrochronologische Auswertung fossiler Hoelzer zur Rekonstruktion der nacheiszeitlichen Gletschergeschichte

    Energy Technology Data Exchange (ETDEWEB)

    Holzhauser, H.

    2008-07-01

    Around the middle of the 19th century, alpine glaciers advanced to their last maximum extension within the Holocene (the last 11'600 years). Some of the glaciers, especially the Great Aletsch and Gorner, penetrated deeply into wooded land and destroyed numerous trees. Not only were trees destroyed, but also valuable arable farmland, alpine farm buildings and dwelling houses. Since the last maximum extension in the 19th century, the retreat of the glaciers has accelerated, revealing, within the glacier forefields, the remainders of trees once buried. Some of this fossil wood is found in the place where it grew (in situ). Often the wood dates back to a time before the last glacier advance; most of it is several thousand years old, because glacial advance and retreat periods occurred repeatedly within the Holocene. This paper shows the characteristics of fossil wood and how it can be analysed to reconstruct glacial history. It will be demonstrated how glacier length variation can be exactly reconstructed with the help of dendrochronology. Thanks to the very exact reconstruction of the glacier length change during the advance periods in the 14th and 16th centuries, the velocities of both the Gorner and Great Aletsch glaciers can be estimated. They range between 7-8 and 20 m per year in the case of the Gorner glacier, and between 7-8 and 36 m per year in the case of the Great Aletsch glacier. (author)

  1. 210Pb-derived ages for the reconstruction of terrestrial contaminant history into the Mexican Pacific coast: Potential and limitations

    International Nuclear Information System (INIS)

    Ruiz-Fernandez, A.C.; Hillaire-Marcel, C.

    2009-01-01

    210Pb is widely used for dating recent sediments in the aquatic environment; however, our experience working in shallow coastal environments on the Pacific coast of Mexico has demonstrated that the potential of 210Pb for reliable historical reconstruction can be limited by low 210Pb atmospheric fallout, sediment mixing, an abundance of coarse sediments and the lack of a 137Cs signal for 210Pb corroboration. This work discusses the difficulties in obtaining adequate sedimentary records for geochronological reconstruction in such active and complex settings. It includes examples of 210Pb geochronologies based on sediment profiles collected in two contrasting coastal areas (mudflats associated with the coastal lagoons of Sinaloa State, and the continental shelf of the Gulf of Tehuantepec), in which geochemical data were used to support the temporal framework established and the changes in sediment supply recorded in the cores, which were related to the development of land-based activities during the last century.

  2. The Development of Bayesian Theory and Its Applications in Business and Bioinformatics

    Science.gov (United States)

    Zhang, Yifei

    2018-03-01

    Bayesian theory originated from an essay by the British mathematician Thomas Bayes published in 1763, and after its development in the 20th century, Bayesian statistics has come to play a significant part in statistical study across all fields. Due to recent breakthroughs in high-dimensional integration, Bayesian statistics has been improved and perfected, and it can now be used to solve problems that classical statistics failed to solve. This paper summarizes the history, concepts and applications of Bayesian statistics in five parts: the history of Bayesian statistics, the weaknesses of classical statistics, Bayesian theory, and its development and applications. The first two parts compare Bayesian and classical statistics from a macroscopic perspective, while the last three focus on Bayesian theory in detail, from introducing particular Bayesian concepts to outlining their development and, finally, their applications.
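    The core identity behind all of this is Bayes' theorem, P(H|D) = P(D|H)P(H)/P(D). A toy diagnostic-test example (all numbers hypothetical) shows the prior-dominated behaviour that classical point estimates miss:

    ```python
    def posterior(prior, sensitivity, false_positive_rate):
        """Posterior probability of a hypothesis H given a positive test result,
        via Bayes' theorem: P(H|+) = P(+|H) P(H) / P(+)."""
        evidence = sensitivity * prior + false_positive_rate * (1 - prior)
        return sensitivity * prior / evidence

    # A rare condition (1% prevalence), a test with 90% sensitivity
    # and a 5% false-positive rate:
    p = posterior(prior=0.01, sensitivity=0.90, false_positive_rate=0.05)
    # The posterior is only about 15%: the low prior dominates the likelihood.
    ```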

  3. Traumatic brain injury and alcohol/substance abuse: A Bayesian meta-analysis comparing the outcomes of people with and without a history of abuse.

    Science.gov (United States)

    Unsworth, David J; Mathias, Jane L

    2017-08-01

    Alcohol and substance (drugs and/or alcohol) abuse are major risk factors for traumatic brain injury (TBI); however, it remains unclear whether outcomes differ for those with and without a history of preinjury abuse. A meta-analysis was performed to examine this issue. The PubMed, Embase, and PsycINFO databases were searched for research that compared the neuroradiological, cognitive, or psychological outcomes of adults with and without a documented history of alcohol and/or substance abuse who sustained nonpenetrating TBIs. Data from 22 studies were analyzed using a random-effects model: Hedges's g effect sizes measured the mean difference in outcomes of individuals with/without a history of preinjury abuse, and Bayes factors assessed the probability that the outcomes differed. Patients with a history of alcohol and/or substance abuse had poorer neuroradiological outcomes, including reduced hippocampal (g = -0.82) and gray matter volumes (g = -0.46 to -0.82), and enlarged cerebral ventricles (g = -0.73 to -0.80). There were limited differences in cognitive outcomes: Executive functioning (g = -0.51) and memory (g = -0.39 to -0.43) were moderately affected, but attention and reasoning were not. The findings for fine motor ability, construction, perception, general cognition, and language were inconclusive. Postinjury substance and alcohol use (g = -0.97 to -1.07) and emotional functioning (g = -0.29 to -0.44) were worse in those with a history of alcohol and/or substance abuse (psychological outcomes). This study highlighted the type and extent of post-TBI differences between persons with and without a history of alcohol or substance abuse, many of which may hamper recovery. However, variation in the criteria for premorbid abuse, limited information regarding the history of abuse, and an absence of preinjury baseline data prevented an assessment of whether the differences predated the TBI, occurred as a result of ongoing alcohol/substance abuse, or
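    The effect sizes reported above follow the standard Hedges' g construction (standardized mean difference with a small-sample bias correction). A minimal sketch, with hypothetical group summaries rather than the study's actual data:

    ```python
    import math

    def hedges_g(mean1, sd1, n1, mean2, sd2, n2):
        """Hedges' g: standardized mean difference with small-sample correction.

        g = J * (mean1 - mean2) / s_pooled, where J = 1 - 3 / (4*df - 1)
        and df = n1 + n2 - 2. Negative g indicates a poorer outcome in group 1.
        """
        df = n1 + n2 - 2
        s_pooled = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / df)
        correction = 1 - 3 / (4 * df - 1)
        return correction * (mean1 - mean2) / s_pooled

    # Hypothetical hippocampal-volume summaries (cm^3) for TBI patients with
    # (group 1) and without (group 2) a history of substance abuse:
    g = hedges_g(mean1=6.8, sd1=0.9, n1=30, mean2=7.5, sd2=0.8, n2=30)
    # g is roughly -0.8, comparable in magnitude to the effects reported above.
    ```

    In the meta-analysis itself, per-study g values would then be pooled under a random-effects model; that pooling step is not shown here.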

  4. Bayesian Mediation Analysis

    OpenAIRE

    Yuan, Ying; MacKinnon, David P.

    2009-01-01

    This article proposes Bayesian analysis of mediation effects. Compared to conventional frequentist mediation analysis, the Bayesian approach has several advantages. First, it allows researchers to incorporate prior information into the mediation analysis, thus potentially improving the efficiency of estimates. Second, under the Bayesian mediation analysis, inference is straightforward and exact, which makes it appealing for studies with small samples. Third, the Bayesian approach is conceptua...

  5. Bayesian benefits with JASP

    NARCIS (Netherlands)

    Marsman, M.; Wagenmakers, E.-J.

    2017-01-01

    We illustrate the Bayesian approach to data analysis using the newly developed statistical software program JASP. With JASP, researchers are able to take advantage of the benefits that the Bayesian framework has to offer in terms of parameter estimation and hypothesis testing. The Bayesian

  6. Reconstructing the Phylogenetic History of Long-Term Effective Population Size and Life-History Traits Using Patterns of Amino Acid Replacement in Mitochondrial Genomes of Mammals and Birds

    Science.gov (United States)

    Nabholz, Benoit; Lartillot, Nicolas

    2013-01-01

    The nearly neutral theory, which proposes that most mutations are deleterious or close to neutral, predicts that the ratio of nonsynonymous over synonymous substitution rates (dN/dS), and potentially also the ratio of radical over conservative amino acid replacement rates (Kr/Kc), are negatively correlated with effective population size. Previous empirical tests, using life-history traits (LHT) such as body-size or generation-time as proxies for population size, have been consistent with these predictions. This suggests that large-scale phylogenetic reconstructions of dN/dS or Kr/Kc might reveal interesting macroevolutionary patterns in the variation in effective population size among lineages. In this work, we further develop an integrative probabilistic framework for phylogenetic covariance analysis introduced previously, so as to estimate the correlation patterns between dN/dS, Kr/Kc, and three LHT, in mitochondrial genomes of birds and mammals. Kr/Kc displays stronger and more stable correlations with LHT than does dN/dS, which we interpret as a greater robustness of Kr/Kc, compared with dN/dS, the latter being confounded by the high saturation of the synonymous substitution rate in mitochondrial genomes. The correlation of Kr/Kc with LHT was robust when controlling for the potentially confounding effects of nucleotide compositional variation between taxa. The positive correlation of the mitochondrial Kr/Kc with LHT is compatible with previous reports, and with a nearly neutral interpretation, although alternative explanations are also possible. The Kr/Kc model was finally used for reconstructing life-history evolution in birds and mammals. This analysis suggests a fairly large-bodied ancestor in both groups. In birds, life-history evolution seems to have occurred mainly through size reduction in Neoavian birds, whereas in placental mammals, body mass evolution shows disparate trends across subclades. 
Altogether, our work represents a further step toward a more

  7. Niche partitioning in sympatric Gorilla and Pan from Cameroon: implications for life history strategies and for reconstructing the evolution of hominin life history.

    Science.gov (United States)

    Macho, Gabriele A; Lee-Thorp, Julia A

    2014-01-01

    Factors influencing the hominoid life histories are poorly understood, and little is known about how ecological conditions modulate the pace of their development. Yet our limited understanding of these interactions underpins life history interpretations in extinct hominins. Here we determined the synchronisation of dental mineralization/eruption with brain size in a 20th century museum collection of sympatric Gorilla gorilla and Pan troglodytes from Central Cameroon. Using δ13C and δ15N of individuals' hair, we assessed whether and how differences in diet and habitat use may have impacted on ape development. The results show that, overall, gorilla hair δ13C and δ15N values are more variable than those of chimpanzees, and that gorillas are consistently lower in δ13C and δ15N compared to chimpanzees. Within a restricted, isotopically-constrained area, gorilla brain development appears delayed relative to dental mineralization/eruption [or dental development is accelerated relative to brains]: only about 87.8% of adult brain size is attained by the time first permanent molars come into occlusion, whereas it is 92.3% in chimpanzees. Even when M1s are already in full functional occlusion, gorilla brains lag behind those of chimpanzee (91% versus 96.4%), relative to tooth development. Both bootstrap analyses and stable isotope results confirm that these results are unlikely due to sampling error. Rather, δ15N values imply that gorillas are not fully weaned (physiologically mature) until well after M1 are in full functional occlusion. In chimpanzees the transition from infant to adult feeding appears (a) more gradual and (b) earlier relative to somatic development. Taken together, the findings are consistent with life history theory that predicts delayed development when non-density dependent mortality is low, i.e. in closed habitats, and with the "risk aversion" hypothesis for frugivorous species as a means to avert starvation. Furthermore, the results highlight

  8. Constraining the Deforestation History of Europe: Evaluation of Historical Land Use Scenarios with Pollen-Based Land Cover Reconstructions

    Directory of Open Access Journals (Sweden)

    Jed O. Kaplan

    2017-12-01

    Full Text Available Anthropogenic land cover change (ALCC) is the most important transformation of the Earth system that occurred in the preindustrial Holocene, with implications for carbon, water and sediment cycles, biodiversity and the provision of ecosystem services, and regional and global climate. For example, anthropogenic deforestation in preindustrial Eurasia may have led to feedbacks to the climate system: both biogeophysical, regionally amplifying winter cold and summer warm temperatures, and biogeochemical, stabilizing atmospheric CO2 concentrations and thus influencing global climate. Quantification of these effects is difficult, however, because scenarios of anthropogenic land cover change over the Holocene vary widely, with increasing disagreement back in time. Because land cover change had such widespread ramifications for the Earth system, it is essential to assess current ALCC scenarios in light of observations and provide guidance on which models are most realistic. Here, we perform a systematic evaluation of two widely-used ALCC scenarios (KK10 and HYDE3.1) in northern and part of central Europe using an independent, pollen-based reconstruction of Holocene land cover (REVEALS). Considering that ALCC in Europe primarily resulted in deforestation, we compare modeled land use with the cover of non-forest vegetation inferred from the pollen data. Though neither land cover change scenario matches the pollen-based reconstructions precisely, KK10 correlates well with REVEALS at the country scale, while HYDE systematically underestimates land use, with increasing magnitude further back in time. Discrepancies between modeled and reconstructed land use are caused by a number of factors, including assumptions of per-capita land use and socio-cultural factors that cannot be predicted on the basis of the characteristics of the physical environment, including dietary preferences, long-distance trade, the location of urban areas and social organization.

  9. Vegetation history reconstructed from anthracology and pollen analysis at the rescue excavation of the MO Motorway, Hungary

    Science.gov (United States)

    Náfrádi, Katalin; Bodor, Elvira; Törőcsik, Tünde; Sümegi, Pál

    2011-12-01

    The significance of geoarchaeological investigations is indisputable in reconstructing the former environment and in studying the relationship between humans and their surroundings. Several disciplines have developed during the last few decades to give insight into earlier time periods and their climatic conditions (e.g. palynology, malacology, archaeobotany, phytology and animal osteology). Charcoal and pollen analytical studies from the rescue excavation of the MO motorway provide information about the vegetation changes of the past. These methods are used to reconstruct the environment of the former settlements and to detect the human impact and natural climatic changes. The sites examined span the periods of the Late-Copper Age, Late-Bronze Age, Middle-Iron Age, Late-Iron Age, Sarmatian period, Late Sarmatian period, Migration period, Late-Migration period and Middle Ages. The vegetation before the Copper Age is based only on pollen analytical data. Anthracological results show the overall dominance of Quercus and a great number of Ulmus, Fraxinus, Acer, Fagus, Alnus and Populus/Salix tree fossils, as well as the residues of fruit trees present in the charred wood assemblage.

  10. Niche partitioning in sympatric Gorilla and Pan from Cameroon: implications for life history strategies and for reconstructing the evolution of hominin life history.

    Directory of Open Access Journals (Sweden)

    Gabriele A Macho

    Full Text Available Factors influencing the hominoid life histories are poorly understood, and little is known about how ecological conditions modulate the pace of their development. Yet our limited understanding of these interactions underpins life history interpretations in extinct hominins. Here we determined the synchronisation of dental mineralization/eruption with brain size in a 20th century museum collection of sympatric Gorilla gorilla and Pan troglodytes from Central Cameroon. Using δ13C and δ15N of individuals' hair, we assessed whether and how differences in diet and habitat use may have impacted on ape development. The results show that, overall, gorilla hair δ13C and δ15N values are more variable than those of chimpanzees, and that gorillas are consistently lower in δ13C and δ15N compared to chimpanzees. Within a restricted, isotopically-constrained area, gorilla brain development appears delayed relative to dental mineralization/eruption [or dental development is accelerated relative to brains]: only about 87.8% of adult brain size is attained by the time first permanent molars come into occlusion, whereas it is 92.3% in chimpanzees. Even when M1s are already in full functional occlusion, gorilla brains lag behind those of chimpanzee (91% versus 96.4%), relative to tooth development. Both bootstrap analyses and stable isotope results confirm that these results are unlikely due to sampling error. Rather, δ15N values imply that gorillas are not fully weaned (physiologically mature) until well after M1 are in full functional occlusion. In chimpanzees the transition from infant to adult feeding appears (a) more gradual and (b) earlier relative to somatic development. Taken together, the findings are consistent with life history theory that predicts delayed development when non-density dependent mortality is low, i.e. in closed habitats, and with the "risk aversion" hypothesis for frugivorous species as a means to avert starvation. Furthermore, the

  11. Toward a comprehensive phylogenetic reconstruction of the evolutionary history of mitogen-activated protein kinases in the plant kingdom.

    Science.gov (United States)

    Janitza, Philipp; Ullrich, Kristian Karsten; Quint, Marcel

    2012-01-01

    The mitogen-activated protein kinase (MAPK) pathway is a three-tier signaling cascade that transmits cellular information from the plasma membrane to the cytoplasm where it triggers downstream responses. The MAPKs represent the last step in this cascade and are activated when both tyrosine and threonine residues in a conserved TxY motif are phosphorylated by MAPK kinases, which in turn are themselves activated by phosphorylation by MAPK kinase kinases. To understand the molecular evolution of MAPKs in the plant kingdom, we systematically conducted a Hidden-Markov-Model based screen to identify MAPKs in 13 completely sequenced plant genomes. In this analysis, we included green algae, bryophytes, lycophytes, and several mono- and eudicotyledonous species covering >800 million years of evolution. The phylogenetic relationships of the 204 identified MAPKs, based on Bayesian inference, made it possible to retrace the sequence of emergence of the four major clades that are characterized by the presence of a TDY or TEY-A/TEY-B/TEY-C type kinase activation loop. We present evidence that after the split of TDY- and TEY-type MAPKs, initially the TEY-C clade emerged. This was followed by the TEY-B clade in early land plants until the TEY-A clade finally emerged in flowering plants. In addition to these well characterized clades, we identified another highly conserved clade of 45 MAPK-likes, members of which were previously described as Mak-homologous kinases. In agreement with their essential functions, molecular population genetic analysis of MAPK genes in Arabidopsis thaliana accessions reveals that purifying selection drove the evolution of the MAPK family, implying strong functional constraints on MAPK genes. Closely related MAPKs most likely subfunctionalized, a process in which differential transcriptional regulation of duplicates may be involved.

  12. Towards a comprehensive phylogenetic reconstruction of the evolutionary history of mitogen-activated protein kinases in the plant kingdom

    Directory of Open Access Journals (Sweden)

    Philipp eJanitza

    2012-12-01

    Full Text Available The mitogen-activated protein kinase (MAPK) pathway is a three-tier signaling cascade that transmits cellular information from the plasma membrane to the cytoplasm where it triggers downstream responses. The MAPKs represent the last step in this cascade and are activated when both tyrosine and threonine residues in a conserved TxY motif are phosphorylated by MAPK kinases, which in turn are themselves activated by phosphorylation by MAPK kinase kinases. To understand the molecular evolution of MAPKs in the plant kingdom, we systematically conducted a Hidden-Markov-Model based screen to identify MAPKs in 13 completely sequenced plant genomes. In this analysis, we included green algae, bryophytes, lycophytes, and several mono- and dicotyledonous species covering >800 million years of evolution. The phylogenetic relationships of the 204 identified MAPKs, based on Bayesian inference, made it possible to retrace the sequence of emergence of the four major clades that are characterized by the presence of a TDY or TEY-A/TEY-B/TEY-C type kinase activation loop. We present evidence that after the split of TDY- and TEY-type MAPKs, initially the TEY-C clade emerged. This was followed by the TEY-B clade in early land plants until the TEY-A clade finally emerged in flowering plants. In addition to these well characterized clades, we identified another highly conserved clade of 45 MAPK-likes, members of which were previously described as MHKs. In agreement with their essential functions, molecular population genetic analysis of MAPK genes in Arabidopsis thaliana accessions reveals that purifying selection drove the evolution of the MAPK family, implying strong functional constraints on MAPK genes. Closely related MAPKs most likely subfunctionalized, a process in which differential transcriptional regulation of duplicates may be involved.

  13. Bayesian image restoration for medical images using radon transform

    International Nuclear Information System (INIS)

    Shouno, Hayaru; Okada, Masato

    2010-01-01

    We propose an image reconstruction algorithm using Bayesian inference for Radon transformed observation data, which often appears in the field of medical image reconstruction known as computed tomography (CT). In order to apply our Bayesian reconstruction method, we introduced several hyper-parameters that control the ratio between prior information and the fidelity of the observation process. Since the quality of the reconstructed image is influenced by the estimation accuracy of these hyper-parameters, we propose an inference method for them based on the marginal likelihood maximization principle as well as the image reconstruction method. We are able to demonstrate a reconstruction result superior to that obtained using the conventional filtered back projection method. (author)
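    The prior/fidelity trade-off the abstract describes can be sketched on a toy linear observation model with a Gaussian (ridge-type) prior. This is a stand-in for the actual Radon-transform setting and for the paper's marginal-likelihood machinery, not a reproduction of its algorithm; all matrices and values are hypothetical:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy linear observation model y = A x + noise (stands in for the Radon transform).
    n, m = 40, 20
    A = rng.normal(size=(n, m))
    x_true = rng.normal(size=m)
    y = A @ x_true + 0.1 * rng.normal(size=n)

    def map_estimate(A, y, lam):
        """MAP estimate under a Gaussian likelihood and zero-mean Gaussian prior:
        x_hat = argmin ||A x - y||^2 + lam * ||x||^2,
        where lam plays the role of a hyper-parameter balancing prior and fidelity."""
        k = A.shape[1]
        return np.linalg.solve(A.T @ A + lam * np.eye(k), A.T @ y)

    # The hyper-parameter lam controls the prior/fidelity ratio; the paper estimates
    # such hyper-parameters by maximizing the marginal likelihood instead of fixing them.
    x_hat = map_estimate(A, y, lam=0.5)
    ```

    Increasing `lam` pulls the solution toward the prior mean (here zero), which suppresses noise at the cost of bias; choosing it by marginal-likelihood maximization automates that trade-off.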

  14. (No) Limits to Anglo-American Accounting? Reconstructing the History of the International Accounting Standards Committee ; A Review Article

    OpenAIRE

    Botzem, S.; Quack, S.

    2009-01-01

    The development of the current International Accounting Standards Board (IASB) from the earlier International Accounting Standards Committee (IASC) provides insight into many issues of international financial reporting, among them the characteristics of international accounting standards themselves. This article reviews Camfferman and Zeff’s [Camfferman, K., & Zeff, S. A. (2007). Financial reporting and global capital markets. A history of the international accounting standards committee 1973...

  15. The Last Hundred Years of Land Use History in the Southern Part of Valdai Hills (European Russia: Reconstruction by Pollen and Historical Data

    Directory of Open Access Journals (Sweden)

    Novenko Elena

    2017-12-01

    Full Text Available The last one hundred years of land use history in the southern part of the Valdai Hills (European Russia) were reconstructed on the basis of high-resolution pollen data from a peat monolith taken from the Central Forest State Reserve, supplemented with historical records derived from maps of the General Land Survey of the 18th and 19th centuries and satellite images. According to the age model, based on dating with the radionuclides 210Pb and 137Cs, the pollen data from the peat monolith allow us to reconstruct vegetation dynamics during the last one hundred years with high time resolution. The data show that, despite the location of the studied peatland in the center of the forest area and rather far away from possible croplands and hayfields, the pollen values of plants that are anthropogenic indicators (Secale cereale, Centaurea cyanus, Plantago, Rumex, etc.) and the micro-charcoal concentration are relatively high in the period from the beginning of the 20th century to the 1970s, especially in the peat horizon formed in the 1950s. In the late 1970s to early 1980s, when the pollen values of cereals gradually diminished in the assemblages, the quantity of pollen of other anthropogenic indicators was also significantly reduced, reflecting the overall decline of agriculture in the forest zone of the former USSR.

  16. On history of medical radiology development in Ukraine: war period and after war reconstruction (1941-1947)

    International Nuclear Information System (INIS)

    Pilipenko, M.Yi.; Artamonova, N.O.; Busigyina, N.O.; Kononenko, O.K.

    1994-01-01

    The paper is devoted to the history of the development of medical radiology in Ukraine, namely the main scientific and practical problems of its development during 1941-1947. The authors describe the work of Ukrainian roentgenologists and radiologists during the war and the post-war restoration of the radiological service. The contribution of Ukrainian scientists to the practical and theoretical achievements of military roentgenology is shown. The rapid mobilization of all forces for the restoration of the roentgenological service destroyed during the war years allowed scientific research to resume within a short time (already in 1945)

  17. Reconstructing lake evaporation history and the isotopic composition of precipitation by a coupled δ18O-δ2H biomarker approach

    Science.gov (United States)

    Hepp, Johannes; Tuthorn, Mario; Zech, Roland; Mügler, Ines; Schlütz, Frank; Zech, Wolfgang; Zech, Michael

    2015-10-01

    Over the past decades, δ18O and δ2H analyses of lacustrine sediments have become an invaluable tool in paleohydrology and paleolimnology for reconstructing the isotopic composition of past lake water and precipitation. However, based on δ18O or δ2H records alone, it can be challenging to distinguish between changes in the precipitation signal and changes caused by evaporation. Here we propose a coupled δ18O-δ2H biomarker approach that makes it possible to disentangle these two factors. The isotopic composition of long-chain n-alkanes (n-C25, n-C27, n-C29, n-C31) was analyzed in order to establish a 16 ka Late Glacial and Holocene δ2H record for the sediment archive of Lake Panch Pokhari in the High Himalaya, Nepal. The δ2H n-alkane record generally corroborates a previously established δ18O sugar record, with high values characterizing the deglaciation and the Older and Younger Dryas, and low values characterizing the Bølling and Allerød periods. Since the investigated n-alkane and sugar biomarkers are considered to be primarily of aquatic origin, they were used to reconstruct the isotopic composition of lake water. The reconstructed deuterium excess of the lake water ranges from +57‰ to -85‰ and is shown to serve as a proxy for the evaporation history of Lake Panch Pokhari. Lake desiccation during the deglaciation, the Older Dryas and the Younger Dryas is affirmed by a multi-proxy approach using the Hydrogen Index (HI) and the carbon to nitrogen ratio (C/N) as additional proxies for the mineralization of lake sediment organic matter. Furthermore, the coupled δ18O and δ2H approach allows the isotopic enrichment of the lake water to be disentangled from variations in the isotopic composition of precipitation. The reconstructed 16 ka δ18O precipitation record of Lake Panch Pokhari is in good agreement with the δ18O records of Chinese speleothems and presumably reflects Indian Summer Monsoon variability.
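    The deuterium excess used above as an evaporation proxy follows the standard definition d = δ2H − 8·δ18O (in ‰); a minimal sketch, with hypothetical isotope values rather than the study's measurements:

    ```python
    def deuterium_excess(delta_2h, delta_18o):
        """Deuterium excess in per mil: d = delta2H - 8 * delta18O.
        Strongly negative values indicate evaporative enrichment of the water."""
        return delta_2h - 8.0 * delta_18o

    # Hypothetical reconstructed lake-water values for a humid and an evaporative phase:
    d_humid = deuterium_excess(delta_2h=-60.0, delta_18o=-9.0)  # +12 per mil
    d_evap = deuterium_excess(delta_2h=-20.0, delta_18o=-1.0)   # -12 per mil
    ```

    Because evaporation enriches 18O faster than 2H relative to the slope-8 meteoric water line, d drops as a lake dries out, which is what makes it usable as an evaporation-history proxy.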

  18. Reconstructing the sedimentation history of the Bengal Delta Plain by means of geochemical and stable isotopic data

    International Nuclear Information System (INIS)

    Neidhardt, Harald; Biswas, Ashis; Freikowski, Dominik; Majumder, Santanu; Chatterjee, Debashis; Berner, Zsolt A.

    2013-01-01

    The purpose of this study is to examine the sedimentation history of the central floodplain area of the Bengal Delta Plain in West Bengal, India. Sediments from two boreholes were analyzed regarding lithology, geochemistry and the stable isotopic composition of embedded organic matter. Different lithofacies were distinguished that reflect frequent changes in the prevailing sedimentary depositional environment of the study area. The lowest facies comprises poorly sorted fluvial sediments composed of fine gravel to clay pointing at high transport energy and intense relocation processes. This facies is considered to belong to an early Holocene lowstand systems tract that followed the last glacial maximum. Fine to medium sands above it mark a gradual change towards a transgressive systems tract. Upwards increasing proportions of silt and the stable isotopic composition of embedded organic matter both indicate a gradual change from fluvial channel infill sediments towards more estuarine and marine influenced deposits. Youngest sediments are composed of clayey and silty overbank deposits of the Hooghly River that have formed a vast low-relief delta-floodplain. Close to the surface, small concretions of secondary Mn-oxides and Fe-(oxyhydr)oxides occur and mark the fluctuation range of the unsaturated zone. These concretions are accompanied by relatively high contents of trace elements such as Zn, Ni, Cu, and As. To sum up, the outcomes of this study provide new insights into the complex sedimentation history of the barely investigated central floodplain area of West Bengal

  19. Progress on Bayesian Inference of the Fast Ion Distribution Function

    DEFF Research Database (Denmark)

    Stagner, L.; Heidbrink, W.W,; Chen, X.

    2013-01-01

    . However, when theory and experiment disagree (for one or more diagnostics), it is unclear how to proceed. Bayesian statistics provides a framework to infer the DF, quantify errors, and reconcile discrepant diagnostic measurements. Diagnostic errors and weight functions that describe the phase space … sensitivity of the measurements are incorporated into Bayesian likelihood probabilities. Prior probabilities describe physical constraints. This poster will show reconstructions of classically described, low-power, MHD-quiescent distribution functions from actual FIDA measurements. A description of the full …

  20. Understanding Computational Bayesian Statistics

    CERN Document Server

    Bolstad, William M

    2011-01-01

    A hands-on introduction to computational statistics from a Bayesian point of view Providing a solid grounding in statistics while uniquely covering the topics from a Bayesian perspective, Understanding Computational Bayesian Statistics successfully guides readers through this new, cutting-edge approach. With its hands-on treatment of the topic, the book shows how samples can be drawn from the posterior distribution when the formula giving its shape is all that is known, and how Bayesian inferences can be based on these samples from the posterior. These ideas are illustrated on common statistic
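    The central idea highlighted above, drawing samples from a posterior when only the shape of its density is known, can be illustrated with a random-walk Metropolis sampler, one standard algorithm of the kind such computational treatments cover (the target density below is chosen arbitrarily for illustration):

    ```python
    import math
    import random

    def metropolis(log_density, x0, n_samples, step=0.5, seed=42):
        """Random-walk Metropolis: accept a proposal x' with probability
        min(1, p(x') / p(x)). Only the unnormalized density shape is needed,
        since the normalizing constant cancels in the ratio."""
        rng = random.Random(seed)
        x, samples = x0, []
        for _ in range(n_samples):
            proposal = x + rng.gauss(0.0, step)
            if math.log(rng.random()) < log_density(proposal) - log_density(x):
                x = proposal
            samples.append(x)
        return samples

    # Unnormalized standard-normal target: log p(x) = -x^2 / 2 + const.
    draws = metropolis(lambda x: -0.5 * x * x, x0=0.0, n_samples=20000)
    mean = sum(draws) / len(draws)                 # close to 0
    var = sum(d * d for d in draws) / len(draws)   # close to 1
    ```

    Posterior summaries (means, intervals, predictive quantities) are then just averages over `draws`, which is precisely the inferential style the book develops.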

  1. Bayesian statistics an introduction

    CERN Document Server

    Lee, Peter M

    2012-01-01

    Bayesian Statistics is the school of thought that combines prior beliefs with the likelihood of a hypothesis to arrive at posterior beliefs. The first edition of Peter Lee’s book appeared in 1989, but the subject has moved ever onwards, with increasing emphasis on Monte Carlo based techniques. This new fourth edition looks at recent techniques such as variational methods, Bayesian importance sampling, approximate Bayesian computation and Reversible Jump Markov Chain Monte Carlo (RJMCMC), providing a concise account of the way in which the Bayesian approach to statistics develops as wel

  2. Contribution of analytical nuclear techniques in the reconstruction of the Brazilian pre-history analysing archaeological ceramics of Tupiguarani tradition

    International Nuclear Information System (INIS)

    Faria, Gleikam Lopes de Oliveira; Menezes, Maria Angela de B.C.; Silva, Maria Aparecida

    2011-01-01

    Due to the high importance of material vestiges for the culture of a nation, the Brazilian Council for the Environment determined that the license to establish new enterprises is subject to a technical report concerning environmental impact, including archaeological sites affected by the enterprise. Therefore, in response to the report related to the Program for Prospection and Rescue of the Archaeological Patrimony of the Areas impacted by the installation of the Second Line of the Samarco Mining Pipeline, archaeological interventions were carried out along the coast of Espirito Santo. Tupi-Guarani Tradition vestiges were found there, the main evidence being an interesting ceramic assemblage. Archaeology can fill the gap between ancient populations and modern society by elucidating the evidence found in archaeological sites. In this context, several ceramic fragments found in the archaeological sites - Hiuton and Bota-Fora - were analyzed by the neutron activation technique, k0-standardization method, at CDTN using the TRIGA MARK I IPR-R1 reactor, in order to characterize their elemental composition. The elements As, Ba, Br, Ce, Co, Cr, Cs, Eu, Fe, Ga, Hf, K, La, Na, Nd, Rb, Sb, Sc, Sm, Ta, Tb, Th, U, Yb, Zn and Zr were determined. Applying robust multivariate statistical analysis in the R software, the results pointed out that the pottery from the sites was made with clay from different sources. X-ray powder diffraction analyses were carried out to determine the mineral composition, and Moessbauer spectroscopy was applied to provide information on both the degree of burning and the firing atmosphere, in order to reconstruct the firing temperatures and strategies used in pottery production. (author)
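
    As a rough illustration of the kind of multivariate grouping used in provenance studies like this one (the authors used robust methods in R; the simulated concentrations and the plain PCA below are stand-ins, not their data or code):

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulated element concentrations for sherds from two hypothetical clay
# sources (the real study measured As, Ba, Br, Ce, ... by NAA; these
# numbers are invented stand-ins).
source_a = rng.lognormal(mean=2.0, sigma=0.1, size=(10, 5))
source_b = rng.lognormal(mean=2.6, sigma=0.1, size=(10, 5))
X = np.log10(np.vstack([source_a, source_b]))  # log-transform, common in provenance work
Xc = X - X.mean(axis=0)                        # center each element

# PCA via SVD: if the sherds come from chemically distinct clays, the
# first principal component should separate the two groups.
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
pc1 = Xc @ Vt[0]
```

    The first ten projected scores (source A) and the last ten (source B) fall on opposite sides of zero, which is the sense in which "clay from different sources" shows up in a multivariate analysis.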

  3. Cryogenic brines as diagenetic fluids: Reconstructing the diagenetic history of the Victoria Land Basin using clumped isotopes

    Science.gov (United States)

    Staudigel, Philip T.; Murray, Sean; Dunham, Daniel P.; Frank, Tracy D.; Fielding, Christopher R.; Swart, Peter K.

    2018-03-01

    The isotopic analyses (δ13C, δ18O, and Δ47) of carbonate phases recovered from a core in McMurdo Sound by ANtarctic geologic DRILLing (ANDRILL-2A) indicate that the majority of secondary carbonate mineral formation occurred at cooler temperatures than the modern burial temperature, and in the presence of fluids with δ18Owater values ranging between -11 and -6‰ VSMOW. These fluids are interpreted as being derived from a cryogenic brine formed during the freezing of seawater. The Δ47 values were converted to temperature using an in-house calibration presented in this paper. Measurements of the Δ47 values in the cements indicate increasingly warmer crystallization temperatures with depth and, while roughly parallel to the observed geothermal gradient, consistently translate to temperatures that are cooler than the current burial temperature. The difference in temperature suggests that cements formed when they were ∼260 ± 100 m shallower than at the present day. This depth range corresponds to a period of minimal sediment accumulation from 3 to 11 Myr; it is therefore interpreted that the majority of cements formed during this time. This behavior is also predicted by time-integrated modeling of cementation at this site. If this cementation had occurred in the presence of these fluids, then the cryogenic brines have been a longstanding feature in the Victoria Land Basin. Brines such as those found at this site have been described in numerous modern high-latitude settings, and analogous fluids could have played a role in the diagenetic history of other ice-proximal sediments and basins during glacial intervals throughout geologic history. The agreement between the calculated δ18Owater value and the measured values in the pore fluids shows how the Δ47 proxy can be used to identify the origin of negative δ18O values in carbonate rocks and that extremely negative values do not necessarily need to be a result of the influence of meteoric fluids or reaction at

  4. Combining Livestock Production Information in a Process-Based Vegetation Model to Reconstruct the History of Grassland Management

    Science.gov (United States)

    Chang, Jinfeng; Ciais, Philippe; Herrero, Mario; Havlik, Petr; Campioli, Matteo; Zhang, Xianzhou; Bai, Yongfei; Viovy, Nicolas; Joiner, Joanna; Wang, Xuhui; hide

    2016-01-01

    Grassland management type (grazed or mown) and intensity (intensive or extensive) play a crucial role in the greenhouse gas balance and surface energy budget of this biome, both at field scale and at large spatial scale. However, global gridded historical information on grassland management intensity is not available. Combining modelled grass-biomass productivity with statistics of the grass-biomass demand by livestock, we reconstruct gridded maps of grassland management intensity from 1901 to 2012. These maps include the minimum area of managed vs. maximum area of unmanaged grasslands and the fraction of mown vs. grazed area at a resolution of 0.5deg by 0.5deg. The grass-biomass demand is derived from a livestock dataset for 2000, extended to cover the period 1901-2012. The grass-biomass supply (i.e. forage grass from mown grassland and biomass grazed) is simulated by the process-based model ORCHIDEE-GM driven by historical climate change, rising CO2 concentration, and changes in nitrogen fertilization. The global area of managed grassland obtained in this study increases from 6.1 x 10(exp 6) km(exp 2) in 1901 to 12.3 x 10(exp 6) km(exp 2) in 2000, although the expansion pathway varies between different regions. ORCHIDEE-GM also simulated an increase in global mean productivity and herbage-use efficiency over managed grassland during the 20th century, indicating a general intensification of grassland management at global scale but with regional differences. The gridded grassland management intensity maps are model dependent because they depend on modelled productivity. Thus, specific attention was given to the evaluation of modelled productivity against a series of observations, from site-level net primary productivity (NPP) measurements to two global satellite products of gross primary productivity (GPP) (MODIS-GPP and SIF data). Generally, ORCHIDEE-GM captures the spatial pattern, seasonal cycle, and inter-annual variability of grassland productivity at global

  5. Space Shuttle RTOS Bayesian Network

    Science.gov (United States)

    Morris, A. Terry; Beling, Peter A.

    2001-01-01

    With shrinking budgets and the requirements to increase reliability and operational life of the existing orbiter fleet, NASA has proposed various upgrades for the Space Shuttle that are consistent with national space policy. The cockpit avionics upgrade (CAU), a high priority item, has been selected as the next major upgrade. The primary functions of cockpit avionics include flight control, guidance and navigation, communication, and orbiter landing support. Secondary functions include the provision of operational services for non-avionics systems such as data handling for the payloads and caution and warning alerts to the crew. Recently, a process to select the optimal commercial-off-the-shelf (COTS) real-time operating system (RTOS) for the CAU was conducted by United Space Alliance (USA) Corporation, which is a joint venture between Boeing and Lockheed Martin, the prime contractor for space shuttle operations. In order to independently assess the RTOS selection, NASA has used the Bayesian network-based scoring methodology described in this paper. Our two-stage methodology addresses the issue of RTOS acceptability by incorporating functional, performance and non-functional software measures related to reliability, interoperability, certifiability, efficiency, correctness, business, legal, product history, cost and life cycle. The first stage of the methodology involves obtaining scores for the various measures using a Bayesian network. The Bayesian network incorporates the causal relationships between the various and often competing measures of interest while also assisting the inherently complex decision analysis process with its ability to reason under uncertainty. The structure and selection of prior probabilities for the network are extracted from experts in the field of real-time operating systems. Scores for the various measures are computed using Bayesian probability.
In the second stage, multi-criteria trade-off analyses are performed between the scores
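
    The scoring step described above rests on ordinary Bayesian-network inference: expert-elicited priors and conditional probabilities combine via Bayes' rule. A minimal two-node sketch (the node names, prior, and conditional probabilities below are invented, not taken from the NASA methodology):

```python
# Minimal two-node Bayesian network: Quality -> Score. The names and
# numbers are illustrative stand-ins for the expert-elicited structure
# and prior probabilities described in the record above.
p_quality = {"high": 0.6, "low": 0.4}                 # elicited prior
p_score = {"high": {"good": 0.9, "poor": 0.1},        # P(Score | Quality)
           "low":  {"good": 0.2, "poor": 0.8}}

# Posterior P(Quality | Score = "good") by enumeration (Bayes' rule):
joint = {q: p_quality[q] * p_score[q]["good"] for q in p_quality}
z = sum(joint.values())
posterior = {q: v / z for q, v in joint.items()}
```

    Observing a good score raises the probability of high quality well above its prior; real networks simply chain many such conditional tables and propagate evidence through them.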

  6. Holocene history and environmental reconstruction of a Hercynian mire and surrounding mountain landscape based on multiple proxies

    Science.gov (United States)

    Dudová, Lydie; Hájková, Petra; Opravilová, Věra; Hájek, Michal

    2014-07-01

    We discovered the first peat section covering the entire Holocene in the Hrubý Jeseník Mountains, representing an island of unique alpine vegetation whose history may display transitional features between the Hercynian and Carpathian regions. We analysed pollen, plant macrofossils (more abundant in bottom layers), testate amoebae (more abundant in upper layers), peat stratigraphy and chemistry. We found that the landscape development indeed differed from other Hercynian mountains located westward. This is represented by Pinus cembra and Larix during the Pleistocene/Holocene transition, the early expansion of spruce around 10,450 cal yr BP, and survival of Larix during the climatic optimum. The early Holocene climatic fluctuations are traced in our profile by species compositions of both the mire and surrounding forests. The mire started to develop as a calcium-rich percolation fen with some species recently considered to be postglacial relicts (Meesia triquetra, Betula nana), shifted into ombrotrophy around 7450 cal yr BP by autogenic succession and changed into a pauperised, nutrient-enriched spruce woodland due to modern forestry activities. We therefore concluded that its recent vegetation is not a product of natural processes. From a methodological viewpoint we demonstrated how using multiple biotic proxies and extensive training sets in transfer functions may overcome taphonomic problems.

  7. New directions in hydro-climatic histories: observational data recovery, proxy records and the atmospheric circulation reconstructions over the earth (ACRE) initiative in Southeast Asia

    Science.gov (United States)

    Williamson, Fiona; Allan, Rob; Switzer, Adam D.; Chan, Johnny C. L.; Wasson, Robert James; D'Arrigo, Rosanne; Gartner, Richard

    2015-12-01

    The value of historic observational weather data for reconstructing long-term climate patterns and the detailed analysis of extreme weather events has long been recognized (Le Roy Ladurie, 1972; Lamb, 1977). In some regions, however, observational data has not been kept regularly over time, or its preservation and archiving has not been considered a priority by governmental agencies. This has been a particular problem in Southeast Asia where there has been no systematic country-by-country method of keeping or preserving such data, the keeping of data only reaches back a few decades, or where instability has threatened the survival of historic records. As a result, past observational data are fragmentary, scattered, or even absent altogether. The further we go back in time, the more obvious the gaps. Observational data can be complemented, however, by historical documentary or proxy records of extreme events such as floods, droughts and other climatic anomalies. This review article highlights recent initiatives in sourcing, recovering, and preserving historical weather data and the potential for integrating the same with proxy (and other) records. In so doing, it focuses on regional initiatives for data research and recovery - particularly the work of the international Atmospheric Circulation Reconstructions over the Earth's (ACRE) Southeast Asian regional arm (ACRE SEA) - and the latter's role in bringing together disparate, but interrelated, projects working within this region.
The overarching goal of the ACRE SEA initiative is to connect regional efforts and to build capacity within Southeast Asian institutions, agencies and National Meteorological and Hydrological Services (NMHS) to improve and extend historical instrumental, documentary and proxy databases of Southeast Asian hydroclimate, in order to contribute to the generation of high-quality, high-resolution historical hydroclimatic reconstructions (reanalyses) and, to build linkages with humanities researchers

  8. HIGH REPETITION JUMP TRAINING COUPLED WITH BODY WEIGHT SUPPORT IN A PATIENT WITH KNEE PAIN AND PRIOR HISTORY OF ANTERIOR CRUCIATE LIGAMENT RECONSTRUCTION: A CASE REPORT.

    Science.gov (United States)

    Elias, Audrey R C; Kinney, Anthony E; Mizner, Ryan L

    2015-12-01

    Patients frequently experience long-term deficits in functional activity following anterior cruciate ligament reconstruction, and commonly present with decreased confidence and poor weight acceptance in the surgical knee. Adaptation of neuromuscular behaviors may be possible through plyometric training. Body weight support decreases the intensity of landing sufficiently to allow increased training repetition. The purpose of this case report is to describe the outcomes of a subject with a previous history of anterior cruciate ligament (ACL) reconstruction treated with high repetition jump training coupled with body weight support (BWS) as a primary intervention strategy. A 23-year-old female, who had right ACL reconstruction seven years prior, presented with anterior knee pain and effusion following initiation of a running program. Following visual assessment of poor mechanics in single leg closed chain activities, landing mechanics were assessed using 3-D motion analysis of single leg landing off a 20 cm box. She then participated in an eight-week plyometric training program using a custom-designed body weight support system. The International Knee Documentation Committee Subjective Knee Form (IKDC) and the ACL-Return to Sport Index (ACL-RSI) were administered at the start and end of treatment as well as at follow-up testing. The subject's IKDC and ACL-RSI scores increased with training from 68% and 43% to 90% and 84%, respectively, and were retained at follow-up testing. Peak knee and hip flexion angles during landing increased from 47° and 53° to 72° and 80°, respectively. Vertical ground reaction forces in landing decreased with training from 3.8 N/kg to 3.2 N/kg. All changes were retained two months following completion of training. The subject experienced meaningful changes in overall function. Retention of mechanical changes suggests that her new landing strategy had become a habitual pattern. Success with high volume plyometric training is

  9. Bayesian Mediation Analysis

    Science.gov (United States)

    Yuan, Ying; MacKinnon, David P.

    2009-01-01

    In this article, we propose Bayesian analysis of mediation effects. Compared with conventional frequentist mediation analysis, the Bayesian approach has several advantages. First, it allows researchers to incorporate prior information into the mediation analysis, thus potentially improving the efficiency of estimates. Second, under the Bayesian…
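
    A minimal sketch of the idea for a single-mediator model, assuming flat priors so that the posterior of each regression coefficient is approximated by a normal centred on its OLS estimate; the indirect effect a*b is then sampled directly, avoiding the delta-method normality assumption of frequentist approaches. The data, coefficients, and normal approximation below are illustrative, not the authors' method:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated single-mediator data: M = a*X + e1, Y = b*M + c*X + e2,
# with true a = 0.5 and b = 0.4 (all values invented for illustration).
N = 500
X = rng.normal(size=N)
M = 0.5 * X + rng.normal(size=N)
Y = 0.4 * M + 0.3 * X + rng.normal(size=N)

def ols(design, y):
    """OLS coefficients and their covariance matrix."""
    beta, *_ = np.linalg.lstsq(design, y, rcond=None)
    resid = y - design @ beta
    s2 = resid @ resid / (len(y) - design.shape[1])
    return beta, s2 * np.linalg.inv(design.T @ design)

a_hat, a_cov = ols(X[:, None], M)
b_hat, b_cov = ols(np.column_stack([M, X]), Y)

# Under flat priors the coefficient posteriors are approximately normal
# around the OLS estimates; sample the indirect effect a*b directly and
# read off a credible interval from its (non-normal) distribution.
draws = 20000
a_draws = rng.normal(a_hat[0], np.sqrt(a_cov[0, 0]), draws)
b_draws = rng.normal(b_hat[0], np.sqrt(b_cov[0, 0]), draws)
indirect = a_draws * b_draws
ci = np.percentile(indirect, [2.5, 97.5])
```

    Informative priors would simply replace the flat-prior normal approximations with the corresponding posterior draws, which is the efficiency gain the abstract refers to.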

  10. The Bayesian Score Statistic

    NARCIS (Netherlands)

    Kleibergen, F.R.; Kleijn, R.; Paap, R.

    2000-01-01

    We propose a novel Bayesian test under a (noninformative) Jeffreys' prior specification. We check whether the fixed scalar value of the so-called Bayesian Score Statistic (BSS) under the null hypothesis is a plausible realization from its known and standardized distribution under the alternative. Unlike

  11. ICDP project DeepCHALLA: reconstructing East African climate change and environmental history over the past 250,000 years

    Science.gov (United States)

    Verschuren, Dirk; Van Daele, Maarten; Wolff, Christian; Waldmann, Nicolas; Meyer, Inka; Ombori, Titus; Peterse, Francien; O'Grady, Ryan; Schnurrenberger, Doug; Olago, Daniel

    2017-04-01

    Sediments on the bottom of Lake Challa, a 92-meter deep crater lake on the border of Kenya and Tanzania near Mt. Kilimanjaro, contain a uniquely long and continuous record of past climate and environmental change. The near-equatorial location and exceptional quality of this natural archive provide great opportunities to study tropical climate variability at both short (inter-annual to decadal) and long (glacial-interglacial) time scales; and the influence of this climate variability on the region's freshwater resources, the functioning of terrestrial ecosystems, and the history of the East African landscape in which modern humans (our species, Homo sapiens) evolved and have lived ever since. Supported in part by the International Continental Scientific Drilling Programme (ICDP), the DeepCHALLA project has now recovered the sediment record of Lake Challa down to 214.8 meter below the lake floor, with almost certainly 100% cover of the uppermost 121.3 meter (ca. 150,000 yr BP to present) and estimated 85% cover over the lower part of the sequence, down to the lowermost distinct reflector in the available seismic stratigraphy. This reflector represents a 2 meter thick layer of volcanic sand and silt deposited ca. 250,000 years ago, and overlies still older silty lacustrine clays deposited during early lake development. Down-hole logging produced continuous profiles of in-situ sediment composition that confer an absolute depth scale to both the recovered cores and their three-dimensional representation in seismic stratigraphy. As readily observed through the transparent core liners, Lake Challa sediments are finely laminated throughout most of the recovered sequence.
Combined with the great time span, the exquisite temporal resolution of these sediments promises to greatly increase our understanding of tropical climate and ecosystem dynamics, and create a long-awaited equatorial counterpart to the high-latitude climate records extracted from the ice sheets of Greenland

  12. Bayesian Probability Theory

    Science.gov (United States)

    von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo

    2014-06-01

    Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramer-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.

  13. Reconstruction of pollution history of organic contaminants in the upper Gulf of Thailand by using sediment cores: first report from Tropical Asia Core (TACO) project.

    Science.gov (United States)

    Boonyatumanond, Ruchaya; Wattayakorn, Gullaya; Amano, Atsuko; Inouchi, Yoshio; Takada, Hideshige

    2007-05-01

    This paper reports the first reconstruction of a pollution history in tropical Asia from sediment cores. Four sediment core samples were collected from an offshore transect in the upper Gulf of Thailand and were analyzed for organic micropollutants. The cores were dated by measurement of (137)Cs and geochronometric molecular markers (linear alkylbenzenes, LABs; and tetrapropylene-type alkylbenzenes, TABs). Polychlorinated biphenyl (PCB) concentrations showed a subsurface maximum in layers corresponding to the 1970s, indicating the effectiveness of regulation of PCBs in Thailand. LAB concentrations increased over time, indicating the increase in input of sewage into the Gulf during the last 30 years. Hopanes, biomarkers of petroleum pollution, also increased over time, indicating that the inputs of automobile-derived hydrocarbons to the coastal zone have been increasing owing to the increased number of cars in Thailand since the 1950s. Polycyclic aromatic hydrocarbons (PAHs) increased in the layers corresponding to the 1950s and 1960s, probably because of the increased inputs of automobile-derived PAHs. PAH concentrations in the upper layers corresponding to the 1970s and later remained constant or increased. The absence of a subsurface maximum of PAHs contrasts with results observed in industrialized countries. This can be explained by the facts that the Thai economy did not depend on coal as an energy source in the 1960s and that economic growth has continued since the 1970s to the present. The deposition flux of PAHs and hopanes showed a dramatic offshore decrease, whereas that of LABs was uniform.

  14. Reconstruction of pollution history of organic contaminants in the upper Gulf of Thailand by using sediment cores: First report from Tropical Asia Core (TACO) project

    International Nuclear Information System (INIS)

    Boonyatumanond, Ruchaya; Wattayakorn, Gullaya; Amano, Atsuko; Inouchi, Yoshio; Takada, Hideshige

    2007-01-01

    This paper reports the first reconstruction of a pollution history in tropical Asia from sediment cores. Four sediment core samples were collected from an offshore transect in the upper Gulf of Thailand and were analyzed for organic micropollutants. The cores were dated by measurement of (137)Cs and geochronometric molecular markers (linear alkylbenzenes, LABs; and tetrapropylene-type alkylbenzenes, TABs). Polychlorinated biphenyl (PCB) concentrations showed a subsurface maximum in layers corresponding to the 1970s, indicating the effectiveness of regulation of PCBs in Thailand. LAB concentrations increased over time, indicating the increase in input of sewage into the Gulf during the last 30 years. Hopanes, biomarkers of petroleum pollution, also increased over time, indicating that the inputs of automobile-derived hydrocarbons to the coastal zone have been increasing owing to the increased number of cars in Thailand since the 1950s. Polycyclic aromatic hydrocarbons (PAHs) increased in the layers corresponding to the 1950s and 1960s, probably because of the increased inputs of automobile-derived PAHs. PAH concentrations in the upper layers corresponding to the 1970s and later remained constant or increased. The absence of a subsurface maximum of PAHs contrasts with results observed in industrialized countries. This can be explained by the facts that the Thai economy did not depend on coal as an energy source in the 1960s and that economic growth has continued since the 1970s to the present. The deposition flux of PAHs and hopanes showed a dramatic offshore decrease, whereas that of LABs was uniform

  15. Thermal-history reconstruction of the Baiyun Sag in the deep-water area of the Pearl River Mouth Basin, northern South China Sea

    Science.gov (United States)

    Tang, Xiaoyin; Yang, Shuchun; Hu, Shengbiao

    2017-11-01

    The Baiyun Sag, located in the deep-water area of the northern South China Sea, is the largest and deepest subbasin in the Pearl River Mouth Basin and one of the most important hydrocarbon-accumulation depression areas in China. Thermal history is widely thought to be of great importance in oil and gas potential assessment of a basin as it controls the timing of hydrocarbon generation and expulsion from the source rock. In order to unravel the paleo-heat flow of the Baiyun Sag, we first analyzed tectonic subsidence of 55 pseudo-wells constructed based on newly interpreted seismic profiles, along with three drilled wells. We then carried out thermal modeling using the multi-stage finite stretching method and calibrated the results using collected present-day vitrinite reflectance data and temperature data. Results indicate that the first and second heating of the Baiyun Sag after 49 Ma ceased at 33.9 Ma and 23 Ma. Reconstructed average basal paleoheat flow values at the end of the rifting periods are 57.7-86.2 mW/m2 and 66.7-97.3 mW/m2, respectively. Following the last heating period at 23 Ma, the study area has undergone a persistent thermal attenuation phase, and basal heat flow has cooled down to 64.0-79.2 mW/m2 at present.

  16. The Geologic History of Lake of the Woods, Minnesota, Reconstructed Using Seismic-Reflection Imaging and Sediment Core Analysis

    Science.gov (United States)

    Hougardy, Devin D.

    The history of glacial Lake Agassiz is complex and has intrigued researchers for over a century. Over the course of its ~5,000-year existence, the size, shape, and location of Lake Agassiz changed dramatically depending on the location of the southern margin of the Laurentide Ice Sheet (LIS), the location and elevation of outflow channels, and differential isostatic rebound. Some of the best-preserved sequences of Lake Agassiz sediments are found in remnant lake basins where erosional processes are less pronounced than in adjacent higher-elevation regions. Lake of the Woods (LOTW), Minnesota, is among the largest of the Lake Agassiz remnant lakes and is an ideal location for Lake Agassiz sediment accumulation. High-resolution seismic-reflection (CHIRP) data collected from the southern basin of LOTW reveal up to 28 m of stratified lacustrine sediment deposited on top of glacial diamicton and bedrock. Five seismic units (SU A-E) were identified and described based on their reflection character, reflection configuration, and external geometries. Three prominent erosional unconformities (UNCF 1-3) underlie the upper three seismic units and indicate that deposition at LOTW was interrupted by a series of relatively large fluctuations in lake level. The lowermost unconformity (UNCF-1) truncates uniformly draped reflections within SU-B at the margins of the basin, where as much as four meters of sediment were eroded. The drop in lake level is interpreted to be contemporaneous with the onset of the low-stand Moorhead phase of Lake Agassiz identified from subaerial deposits in the Red River Valley, Rainy River basin, and Lake Winnipeg. A rise in lake level, indicated by onlapping reflections within SU-C onto UNCF-1, shifted the wave base outwards and as much as 11 m of sediment were deposited (SU-C) in the middle of the basin before a second drop, and subsequent rise, in lake level resulted in the formation of UNCF-2.
Reflections in the lower part of SU-D onlap onto UNCF-2

  17. Inverse problems in the Bayesian framework

    International Nuclear Information System (INIS)

    Calvetti, Daniela; Somersalo, Erkki; Kaipio, Jari P

    2014-01-01

    The history of Bayesian methods dates back to the original works of Reverend Thomas Bayes and Pierre-Simon Laplace: the former laid down some of the basic principles on inverse probability in his classic article ‘An essay towards solving a problem in the doctrine of chances’ that was read posthumously in the Royal Society in 1763. Laplace, on the other hand, in his ‘Memoirs on inverse probability’ of 1774 developed the idea of updating beliefs and wrote down the celebrated Bayes’ formula in the form we know today. Although not identified yet as a framework for investigating inverse problems, Laplace used the formalism very much in the spirit it is used today in the context of inverse problems, e.g., in his study of the distribution of comets. With the evolution of computational tools, Bayesian methods have become increasingly popular in all fields of human knowledge in which conclusions need to be drawn based on incomplete and noisy data. Needless to say, inverse problems, almost by definition, fall into this category. Systematic work for developing a Bayesian inverse problem framework can arguably be traced back to the 1980s (the original first edition being published by Elsevier in 1987), although articles on Bayesian methodology applied to inverse problems, in particular in geophysics, had appeared much earlier. Today, as testified by the articles in this special issue, the Bayesian methodology as a framework for considering inverse problems has gained a lot of popularity, and it has integrated very successfully with many traditional inverse problems ideas and techniques, providing novel ways to interpret and implement traditional procedures in numerical analysis, computational statistics, signal analysis and data assimilation. The range of applications where the Bayesian framework has been fundamental goes from geophysics, engineering and imaging to astronomy, life sciences and economy, and continues to grow. There is no question that Bayesian
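
    Bayes' formula in the form Laplace wrote down, P(H|D) = P(D|H) P(H) / P(D), is easy to exercise numerically; the diagnostic-test numbers below are purely illustrative and not drawn from the record above:

```python
# Bayes' formula, P(H | D) = P(D | H) P(H) / P(D), on an illustrative
# diagnostic-test example (numbers invented): 1% prevalence, 99%
# sensitivity, 95% specificity.
prior = 0.01
sensitivity, specificity = 0.99, 0.95

# P(D) by total probability over the two hypotheses:
evidence = sensitivity * prior + (1 - specificity) * (1 - prior)
posterior = sensitivity * prior / evidence   # P(disease | positive test) = 1/6
```

    Despite the accurate test, the low prior drags the posterior down to one in six, which is exactly the "updating beliefs" mechanism that makes the formula the workhorse of inverse problems with incomplete and noisy data.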

  18. A phylogenetic Kalman filter for ancestral trait reconstruction using molecular data.

    Science.gov (United States)

    Lartillot, Nicolas

    2014-02-15

    Correlation between life history or ecological traits and genomic features such as nucleotide or amino acid composition can be used for reconstructing the evolutionary history of the traits of interest along phylogenies. Thus far, however, such ancestral reconstructions have been done using simple linear regression approaches that do not account for phylogenetic inertia. These reconstructions could instead be seen as a genuine comparative regression problem, such as formalized by classical generalized least-square comparative methods, in which the trait of interest and the molecular predictor are represented as correlated Brownian characters coevolving along the phylogeny. Here, a Bayesian sampler is introduced, representing an alternative and more efficient algorithmic solution to this comparative regression problem, compared with currently existing generalized least-square approaches. Technically, ancestral trait reconstruction based on a molecular predictor is shown to be formally equivalent to a phylogenetic Kalman filter problem, for which backward and forward recursions are developed and implemented in the context of a Markov chain Monte Carlo sampler. The comparative regression method results in more accurate reconstructions and a more faithful representation of uncertainty, compared with simple linear regression. Application to the reconstruction of the evolution of optimal growth temperature in Archaea, using GC composition in ribosomal RNA stems and amino acid composition of a sample of protein-coding genes, confirms previous findings, in particular, pointing to a hyperthermophilic ancestor for the kingdom. The program is freely available at www.phylobayes.org.
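
    The backward and forward recursions mentioned above reduce, at a single internal node, to precision-weighted Gaussian conditioning under the Brownian model. A one-node sketch (the tip values, branch lengths, and flat-prior assumption are hypothetical illustrations; the paper's method handles whole trees jointly with a molecular predictor):

```python
# One-node illustration: under Brownian motion with rate sigma2, a tip at
# branch length t from its ancestor is distributed N(ancestor, sigma2 * t).
# With a flat prior on the ancestral state, conditioning on two tips gives
# a Gaussian with a precision-weighted mean -- the single-node case of the
# forward/backward recursions.
def ancestor_posterior(x1, t1, x2, t2, sigma2=1.0):
    """Posterior mean and variance of the ancestor of two tips."""
    p1, p2 = 1.0 / (sigma2 * t1), 1.0 / (sigma2 * t2)  # precisions
    mean = (p1 * x1 + p2 * x2) / (p1 + p2)
    var = 1.0 / (p1 + p2)
    return mean, var

# The tip on the shorter branch pulls the estimate harder:
mean, var = ancestor_posterior(10.0, 1.0, 14.0, 3.0)
```

    This phylogenetic inertia (closer relatives carry more weight) is exactly what the simple linear-regression reconstructions criticized in the abstract ignore.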

  19. A History of Anterior Cruciate Ligament Reconstruction at the National Football League Combine Results in Inferior Early National Football League Career Participation.

    Science.gov (United States)

    Provencher, Matthew T; Bradley, James P; Chahla, Jorge; Sanchez, Anthony; Beaulieu-Jones, Brendin R; Arner, Justin W; Kennedy, Nicholas I; Sanchez, George; Kennedy, Mitchell I; Moatshe, Gilbert; Cinque, Mark E; LaPrade, Robert F

    2018-05-19

    To evaluate whether players with a history of an anterior cruciate ligament reconstruction (ACLR) before the National Football League (NFL) Combine played or started fewer games and/or participated in fewer eligible snaps compared with NFL Combine participants without a history of knee injury or surgery. We performed a retrospective review of all players who participated in the NFL Combine between 2009 and 2015 and who had a history of an ACLR. NFL Combine participants were included if they had a previous ACLR or combined anterior cruciate ligament (ACL) injury and nonoperatively managed medial collateral ligament injury. The number of games started, number of games played, draft number, overall draft pick, and snap percentage for each position were determined. The mean value of each outcome metric was compared between case and control players. We identified 110 players who had an ACL injury (n = 76) or a combined ACL and medial collateral ligament injury (n = 34). Players in the ACLR group had a significantly worse mean draft pick number (difference of 30.2, P = .002) and mean draft round (difference of 0.8, P = .019) versus controls. Compared with control players, players in the ACLR group started and played significantly fewer games in both season 1 (difference of 2.7 games started, P < .001; difference of 2.7 games played, P < .001) and season 2 (difference of 7.4 games started, P < .001; difference of 3.0 games played, P = .003) and had a significantly lower snap percentage in both season 1 (difference of 23.1%, P < .001) and season 2 (difference of 24.0%, P < .001). Athletes at the NFL Combine who previously underwent an ACLR had significantly lower early-career NFL player metrics, including fewer games started, fewer games played, and a lower snap percentage, than uninjured controls. Defensive linemen, defensive backs, and linebackers were the 3 most affected positions. Players with a prior ACLR and combined meniscal-chondral pathology had

  20. Bayesian computation with R

    CERN Document Server

    Albert, Jim

    2009-01-01

    There has been a dramatic growth in the development and application of Bayesian inferential methods. Some of this growth is due to the availability of powerful simulation-based algorithms to summarize posterior distributions. There has also been a growing interest in the use of the system R for statistical analyses. R's open source nature, free availability, and large number of contributed packages have made R the software of choice for many statisticians in education and industry. Bayesian Computation with R introduces Bayesian modeling by the use of computation using the R language. The earl

  1. Bayesian data analysis for newcomers.

    Science.gov (United States)

    Kruschke, John K; Liddell, Torrin M

    2018-02-01

    This article explains the foundational concepts of Bayesian data analysis using virtually no mathematical notation. Bayesian ideas already match your intuitions from everyday reasoning and from traditional data analysis. Simple examples of Bayesian data analysis are presented that illustrate how the information delivered by a Bayesian analysis can be directly interpreted. Bayesian approaches to null-value assessment are discussed. The article clarifies misconceptions about Bayesian methods that newcomers might have acquired elsewhere. We discuss prior distributions and explain how they are not a liability but an important asset. We discuss the relation of Bayesian data analysis to Bayesian models of mind, and we briefly discuss what methodological problems Bayesian data analysis is not meant to solve. After you have read this article, you should have a clear sense of how Bayesian data analysis works and the sort of information it delivers, and why that information is so intuitive and useful for drawing conclusions from data.
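As a minimal concrete instance of the prior-to-posterior update the article describes, a conjugate beta-binomial model can be worked in a few lines; the prior and data here are invented for illustration.

```python
def posterior_beta(a, b, k, n):
    """Update a Beta(a, b) prior after observing k successes in n trials.
    Conjugacy makes the posterior another Beta distribution."""
    a_post, b_post = a + k, b + n - k
    post_mean = a_post / (a_post + b_post)
    return a_post, b_post, post_mean

# Uniform prior Beta(1, 1); observe 7 successes in 10 trials.
a_post, b_post, post_mean = posterior_beta(1, 1, 7, 10)
# The posterior is Beta(8, 4); its mean 8/12 pulls the raw
# proportion 0.7 slightly toward the prior mean 0.5.
```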

  2. Entangled histories

    International Nuclear Information System (INIS)

    Cotler, Jordan; Wilczek, Frank

    2016-01-01

    We introduce quantum history states and their mathematical framework, thereby reinterpreting and extending the consistent histories approach to quantum theory. Through thought experiments, we demonstrate that our formalism allows us to analyze a quantum version of history in which we reconstruct the past by observations. In particular, we can pass from measurements to inferences about ‘what happened’ in a way that is sensible and free of paradox. Our framework allows for a richer understanding of the temporal structure of quantum theory, and we construct history states that embody peculiar, non-classical correlations in time. (paper)

  3. Bayesian methods for data analysis

    CERN Document Server

    Carlin, Bradley P.

    2009-01-01

    Approaches for statistical inference: Introduction; Motivating Vignettes; Defining the Approaches; The Bayes-Frequentist Controversy; Some Basic Bayesian Models. The Bayes approach: Introduction; Prior Distributions; Bayesian Inference; Hierarchical Modeling; Model Assessment; Nonparametric Methods. Bayesian computation: Introduction; Asymptotic Methods; Noniterative Monte Carlo Methods; Markov Chain Monte Carlo Methods. Model criticism and selection: Bayesian Modeling; Bayesian Robustness; Model Assessment; Bayes Factors via Marginal Density Estimation; Bayes Factors

  4. Noncausal Bayesian Vector Autoregression

    DEFF Research Database (Denmark)

    Lanne, Markku; Luoto, Jani

    We propose a Bayesian inferential procedure for the noncausal vector autoregressive (VAR) model that is capable of capturing nonlinearities and incorporating effects of missing variables. In particular, we devise a fast and reliable posterior simulator that yields the predictive distribution...

  5. Statistics: a Bayesian perspective

    National Research Council Canada - National Science Library

    Berry, Donald A

    1996-01-01

    ...: it is the only introductory textbook based on Bayesian ideas, it combines concepts and methods, it presents statistics as a means of integrating data into the scientific process, it develops ideas...

  6. Bayesian psychometric scaling

    NARCIS (Netherlands)

    Fox, Gerardus J.A.; van den Berg, Stéphanie Martine; Veldkamp, Bernard P.; Irwing, P.; Booth, T.; Hughes, D.

    2015-01-01

    In educational and psychological studies, psychometric methods are involved in the measurement of constructs, and in constructing and validating measurement instruments. Assessment results are typically used to measure student proficiency levels and test characteristics. Recently, Bayesian item

  7. Bayesian Networks An Introduction

    CERN Document Server

    Koski, Timo

    2009-01-01

    Bayesian Networks: An Introduction provides a self-contained introduction to the theory and applications of Bayesian networks, a topic of interest and importance for statisticians, computer scientists and those involved in modelling complex data sets. The material has been extensively tested in classroom teaching and assumes a basic knowledge of probability, statistics and mathematics. All notions are carefully explained and feature exercises throughout. Features include: An introduction to Dirichlet Distribution, Exponential Families and their applications; A detailed description of learni

  8. A Bayesian encourages dropout

    OpenAIRE

    Maeda, Shin-ichi

    2014-01-01

    Dropout is one of the key techniques for preventing learning from overfitting. It is often explained as a kind of modified L2 regularization. Here, we shed light on dropout from a Bayesian standpoint. The Bayesian interpretation enables us to optimize the dropout rate, which is beneficial for learning the weight parameters and for prediction after learning. The experimental results also encourage optimization of the dropout rate.
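The Bayesian reading of dropout leads naturally to Monte Carlo dropout at prediction time: keep the masks active and average many stochastic forward passes. The one-layer network and random weights below are invented purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def mc_dropout_predict(x, W, b, p_drop=0.5, n_samples=200):
    """Monte Carlo dropout: average many stochastic forward passes;
    the spread of the samples gives a rough predictive uncertainty."""
    preds = []
    for _ in range(n_samples):
        mask = rng.random(W.shape[1]) >= p_drop             # drop hidden units
        h = np.maximum(0.0, x @ W) * mask / (1.0 - p_drop)  # inverted dropout
        preds.append(h @ b)
    preds = np.array(preds)
    return preds.mean(), preds.std()

W = rng.normal(size=(3, 8))   # hypothetical trained weights
b = rng.normal(size=8)
mean, std = mc_dropout_predict(np.ones(3), W, b)
```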

  9. Basics of Bayesian methods.

    Science.gov (United States)

    Ghosh, Sujit K

    2010-01-01

    Bayesian methods are rapidly becoming popular tools for making statistical inference in various fields of science including biology, engineering, finance, and genetics. One of the key aspects of Bayesian inferential method is its logical foundation that provides a coherent framework to utilize not only empirical but also scientific information available to a researcher. Prior knowledge arising from scientific background, expert judgment, or previously collected data is used to build a prior distribution which is then combined with current data via the likelihood function to characterize the current state of knowledge using the so-called posterior distribution. Bayesian methods allow the use of models of complex physical phenomena that were previously too difficult to estimate (e.g., using asymptotic approximations). Bayesian methods offer a means of more fully understanding issues that are central to many practical problems by allowing researchers to build integrated models based on hierarchical conditional distributions that can be estimated even with limited amounts of data. Furthermore, advances in numerical integration methods, particularly those based on Monte Carlo methods, have made it possible to compute the optimal Bayes estimators. However, there is a reasonably wide gap between the background of the empirically trained scientists and the full weight of Bayesian statistical inference. Hence, one of the goals of this chapter is to bridge the gap by offering elementary to advanced concepts that emphasize linkages between standard approaches and full probability modeling via Bayesian methods.
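The prior-times-likelihood mechanics described above can be made concrete with a brute-force grid approximation of a posterior; the flat prior and coin-flip data below are illustrative only.

```python
import numpy as np

theta = np.linspace(0.001, 0.999, 999)   # grid over a success probability
dx = theta[1] - theta[0]
prior = np.ones_like(theta)              # flat prior
k, n = 7, 10                             # observed successes / trials
likelihood = theta**k * (1.0 - theta)**(n - k)
posterior = prior * likelihood
posterior /= posterior.sum() * dx        # normalize to a density
post_mean = np.sum(theta * posterior) * dx
# With a flat prior this matches the analytic Beta(8, 4) posterior,
# whose mean is 2/3.
```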

  10. The NIFTY way of Bayesian signal inference

    International Nuclear Information System (INIS)

    Selig, Marco

    2014-01-01

    We introduce NIFTY, 'Numerical Information Field Theory', a software package for the development of Bayesian signal inference algorithms that operate independently from any underlying spatial grid and its resolution. A large number of Bayesian and Maximum Entropy methods for 1D signal reconstruction, 2D imaging, as well as 3D tomography, appear formally similar, but one often finds individualized implementations that are neither flexible nor easily transferable. Signal inference in the framework of NIFTY can be done in an abstract way, such that algorithms, prototyped in 1D, can be applied to real world problems in higher-dimensional settings. NIFTY as a versatile library is applicable and already has been applied in 1D, 2D, 3D and spherical settings. A recent application is the D3PO algorithm targeting the non-trivial task of denoising, deconvolving, and decomposing photon observations in high energy astronomy

  11. The NIFTy way of Bayesian signal inference

    Science.gov (United States)

    Selig, Marco

    2014-12-01

    We introduce NIFTy, "Numerical Information Field Theory", a software package for the development of Bayesian signal inference algorithms that operate independently from any underlying spatial grid and its resolution. A large number of Bayesian and Maximum Entropy methods for 1D signal reconstruction, 2D imaging, as well as 3D tomography, appear formally similar, but one often finds individualized implementations that are neither flexible nor easily transferable. Signal inference in the framework of NIFTy can be done in an abstract way, such that algorithms, prototyped in 1D, can be applied to real world problems in higher-dimensional settings. NIFTy as a versatile library is applicable and already has been applied in 1D, 2D, 3D and spherical settings. A recent application is the D3PO algorithm targeting the non-trivial task of denoising, deconvolving, and decomposing photon observations in high energy astronomy.
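The simplest grid-bound instance of such signal inference is a Wiener filter on a regular 1-D grid; NIFTy's point is to let the same algorithm run without reference to the grid. White signal and noise covariances are assumed here purely for brevity.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 64
S, N = 4.0, 1.0                      # signal / noise variances (white)
signal = rng.normal(scale=np.sqrt(S), size=n)
data = signal + rng.normal(scale=np.sqrt(N), size=n)

# Posterior mean of the signal given the data: m = S (S + N)^{-1} d.
# For white covariances the operators are scalar multiples of the
# identity, so the filter reduces to elementwise shrinkage of the data.
m = S / (S + N) * data
```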

  12. Reconstruction of the Exposure Histories of 20 Allan Hills Ordinary Chondrites on the Basis of Cosmogenic 10Be, 26Al, Noble Gases, and Cosmic Ray Tracks

    Science.gov (United States)

    Neupert, U.; Knauer, M.; Michel, R.; Loeken, Th.; Schultz, L.; Dittrich-Hannen, B.; Suter, M.; Kubik, P. W.; Metzler, K.; Romstedt, J.

    1995-09-01

    Twenty ordinary chondrites from the 1988/89 meteorite search (ALH 88004, 88008, 88010, 88011, 88013, 88016 to 88021, 88026 to 88031, 88033, 88039, 88042) [1,2] were investigated for 10Be and 26Al, and for He, Ne and Ar by accelerator and rare gas mass spectrometry, respectively. Cosmic ray tracks were measured in samples of ALH 88019. Using theoretical production rates calculated by a physical model [3] the experimental data are interpreted with respect to the reconstruction of the preatmospheric exposure conditions and exposure histories of the meteoroids. Ordinary chondrites are particularly well suited to exemplify the capabilities of an interpretation of many cosmogenic nuclides measured in one sample. Model calculations of GCR production rates were performed for 10Be, 26Al, 3He, 21Ne, 22Ne and 38Ar as reported elsewhere [4,5]. For all meteorites, except for ALH 88019, the cosmogenic nuclide data can be explained by simple one-stage exposure histories between 3 Ma and 44 Ma in meteoroids with radii between 5 cm and 85 cm. Exposure ages were derived from cosmogenic 3He, 21Ne and 38Ar on the basis of the theoretical production rates as a function of 3He/21Ne and 22Ne/21Ne, as well as on the empirical ones proposed by Eugster [6]. The average ratios of exposure ages determined from theoretical production rates to those calculated according to Eugster [6] were 1.08+/-0.11, 1.11+/-0.25 and 1.12+/-0.17 in the case of 3He, 21Ne and 38Ar, respectively. Repeated measurements of 10Be and 26Al in ALH 88019 resulted in 10.4+/-1.3 dpm/kg and 5.6+/-0.5 dpm/kg, respectively. However, the cosmogenic rare gas concentrations point to a (single-stage) exposure age of 39 Ma in a meteoroid. This is in accordance with a measured cosmic ray track density in olivine of 2.8 * 10^6 cm^-2. The samples are from depths between 3 cm and 8 cm. Based on the track data we obtain a minimum meteoroid radius of 8 cm. The low 10Be and 26Al cannot be explained by a one-stage exposure history and a long

  13. Genome rearrangements and phylogeny reconstruction in Yersinia pestis.

    Science.gov (United States)

    Bochkareva, Olga O; Dranenko, Natalia O; Ocheredko, Elena S; Kanevsky, German M; Lozinsky, Yaroslav N; Khalaycheva, Vera A; Artamonova, Irena I; Gelfand, Mikhail S

    2018-01-01

    Genome rearrangements have played an important role in the evolution of Yersinia pestis from its progenitor Yersinia pseudotuberculosis. Traditional phylogenetic trees for Y. pestis based on sequence comparison have short internal branches and low bootstrap supports as only a small number of nucleotide substitutions have occurred. On the other hand, even a small number of genome rearrangements may resolve topological ambiguities in a phylogenetic tree. We reconstructed phylogenetic trees based on genome rearrangements using several popular approaches such as Maximum likelihood for Gene Order and the Bayesian model of genome rearrangements by inversions. We also reconciled phylogenetic trees for each of the three CRISPR loci to obtain an integrated scenario of the CRISPR cassette evolution. Analysis of contradictions between the obtained evolutionary trees yielded numerous parallel inversions and gain/loss events. Our data indicate that an integrated analysis of sequence-based and inversion-based trees enhances the resolution of phylogenetic reconstruction. In contrast, reconstructions of strain relationships based on solely CRISPR loci may not be reliable, as the history is obscured by large deletions, obliterating the order of spacer gains. Similarly, numerous parallel gene losses preclude reconstruction of phylogeny based on gene content.

  14. Bayesian networks with examples in R

    CERN Document Server

    Scutari, Marco

    2014-01-01

    Introduction. The Discrete Case: Multinomial Bayesian Networks. The Continuous Case: Gaussian Bayesian Networks. More Complex Cases. Theory and Algorithms for Bayesian Networks. Real-World Applications of Bayesian Networks. Appendices. Bibliography.

  15. Bayesian methods in reliability

    Science.gov (United States)

    Sander, P.; Badoux, R.

    1991-11-01

    The present proceedings from a course on Bayesian methods in reliability encompasses Bayesian statistical methods and their computational implementation, models for analyzing censored data from nonrepairable systems, the traits of repairable systems and growth models, the use of expert judgment, and a review of the problem of forecasting software reliability. Specific issues addressed include the use of Bayesian methods to estimate the leak rate of a gas pipeline, approximate analyses under great prior uncertainty, reliability estimation techniques, and a nonhomogeneous Poisson process. Also addressed are the calibration sets and seed variables of expert judgment systems for risk assessment, experimental illustrations of the use of expert judgment for reliability testing, and analyses of the predictive quality of software-reliability growth models such as the Weibull order statistics.

  16. Bayesian approach to inverse statistical mechanics

    Science.gov (United States)

    Habeck, Michael

    2014-05-01

    Inverse statistical mechanics aims to determine particle interactions from ensemble properties. This article looks at this inverse problem from a Bayesian perspective and discusses several statistical estimators to solve it. In addition, a sequential Monte Carlo algorithm is proposed that draws the interaction parameters from their posterior probability distribution. The posterior probability involves an intractable partition function that is estimated along with the interactions. The method is illustrated for inverse problems of varying complexity, including the estimation of a temperature, the inverse Ising problem, maximum entropy fitting, and the reconstruction of molecular interaction potentials.
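For a system tiny enough to enumerate all states, the partition function is exact and the posterior over, say, an inverse temperature can be evaluated on a grid. The toy below is a stand-in for the paper's sequential Monte Carlo machinery, with all sizes and values chosen arbitrarily.

```python
import numpy as np
from itertools import product

# 1-D Ising chain of 5 spins: only 32 states, so Z(beta) is exact.
states = np.array(list(product([-1, 1], repeat=5)))
energies = -np.sum(states[:, :-1] * states[:, 1:], axis=1)

def log_Z(beta):
    return np.log(np.sum(np.exp(-beta * energies)))

# Simulate data at a known inverse temperature, then recover it.
rng = np.random.default_rng(1)
beta_true = 0.8
p = np.exp(-beta_true * energies)
p /= p.sum()
sample = rng.choice(len(states), size=200, p=p)

betas = np.linspace(0.0, 2.0, 201)   # flat prior on a grid
loglik = np.array([-b * energies[sample].sum() - 200 * log_Z(b)
                   for b in betas])
post = np.exp(loglik - loglik.max())
post /= post.sum()
beta_hat = float(np.sum(betas * post))   # posterior mean
```

For larger systems Z(beta) is intractable, which is exactly the difficulty the paper's sequential Monte Carlo estimator is designed to sidestep.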

  17. Tectonic History and Deep Structure of the Demerara Plateau from Combined Wide-Angle and Reflection Seismic Data and Plate Kinematic Reconstructions

    Science.gov (United States)

    Klingelhoefer, F.; Museur, T.; Roest, W. R.; Graindorge, D.; Chauvet, F.; Loncke, L.; Basile, C.; Poetisi, E.; Deverchere, J.; Lebrun, J. F.; Perrot, J.; Heuret, A.

    2017-12-01

    Many transform margins have associated intermediate depth marginal plateaus, which are commonly located between two oceanic basins. The Demerara plateau is located offshore Surinam and French Guiana. Plate kinematic reconstructions show that the plateau is located between the central and equatorial Atlantic in a position conjugate to the Guinean Plateau. In the fall of 2016, the MARGATS cruise acquired geophysical data along the 400 km wide Demerara plateau. The main objective of the cruise was to image the deep structure of the Demerara plateau and to study its tectonic history. A set of 4 combined wide-angle and reflection seismic profiles was acquired along the plateau, using 80 ocean-bottom seismometers, a 3 km long seismic streamer and an 8000 cu inch tuned airgun array. Forward modelling of the wide-angle seismic data on a profile, located in the eastern part of the plateau and oriented in a NE-SW direction, images the crustal structure of the plateau, the transition zone and the neighbouring crust of oceanic origin, up to a depth of 40 km. The plateau itself is characterised by a crust of 30 km thickness, subdivided into three distinct layers. However, the velocities and velocity gradients do not fit typical continental crust, with a lower crustal layer showing atypically high velocities and an upper layer having a steep velocity gradient. From this model we propose that the lowermost layer is probably formed from volcanic underplated material and that the upper crustal layer likely consists of the corresponding extrusive volcanic material, forming thick seaward-dipping reflector sequences on the plateau. A basement high is imaged at the foot of the slope and forms the ocean-continent transition zone. Further oceanward, a 5-6 km thick crust is imaged with velocities and velocity gradients corresponding to a thin oceanic crust. A compilation of magnetic data from the MARGATS and 3 previous cruises shows a high amplitude magnetic anomaly along the northern

  18. Reconstructing Southern Greenland Ice Sheet History During the Plio-Pleistocene Intensification of Northern Hemisphere Glaciation: Insights from IODP Site U1307

    Science.gov (United States)

    Blake-Mizen, K. R.; Hatfield, R. G.; Carlson, A. E.; Walczak, M. H.; Stoner, J. S.; Xuan, C.; Lawrence, K. T.; Bailey, I.

    2017-12-01

    Should it melt entirely, the Greenland Ice Sheet (GrIS) has the potential to raise global sea-level by 7 metres. With the Arctic continuing to warm at a remarkable rate, to better understand how the GrIS will respond to future anthropogenically-induced climate change we must constrain its natural variability in the geological past. In this regard, much uncertainty exists surrounding its pre-Quaternary history; particularly during the mid-Piacenzian warm period (mPWP; 3.3-3.0 Ma) - widely considered an analogue for near-future equilibrium climate with modern atmospheric CO2 levels and elevated temperatures relative to today - and the late Pliocene/early Pleistocene onset of widespread Northern Hemisphere glaciation (NHG, 2.7 Ma). GrIS reconstructions for these intervals have been largely hampered by a lack of well-dated, high-resolution records from suitable sites. To address this, we present new high-resolution, multi-proxy records from IODP Site U1307, a North Atlantic marine sediment core recovered from the Eirik Drift just south of Greenland. Generation of a new high-resolution relative palaeointensity (RPI)-based age-model - representing the first of its kind for high-latitude sediments deposited during NHG - has enabled strong orbital age control. Our ice-rafted debris (IRD) record confirms a 2.72 Ma initiation of major southern GrIS marine-terminating glaciations, which appear to persist even through interglacial periods up to at least 2.24 Ma. XRF-scanning and IRD evidence suggests, however, that an ephemeral ice-cap of likely considerable size persisted on southern Greenland prior to the mPWP. These data, together with the analysed provenance of individual IRD, indicate marine-based GrIS margins extended southward over the NHG interval and only occurred on Greenland's southern tip from 2.7 Ma. 
Despite a large increase in the deposition of GrIS-derived IRD from this time, bulk sedimentation rates and magnetic grain-size dropped significantly, implying that

  19. Approximate Bayesian evaluations of measurement uncertainty

    Science.gov (United States)

    Possolo, Antonio; Bodnar, Olha

    2018-04-01

    The Guide to the Expression of Uncertainty in Measurement (GUM) includes formulas that produce an estimate of a scalar output quantity that is a function of several input quantities, and an approximate evaluation of the associated standard uncertainty. This contribution presents approximate, Bayesian counterparts of those formulas for the case where the output quantity is a parameter of the joint probability distribution of the input quantities, also taking into account any information about the value of the output quantity available prior to measurement expressed in the form of a probability distribution on the set of possible values for the measurand. The approximate Bayesian estimates and uncertainty evaluations that we present have a long history and illustrious pedigree, and provide sufficiently accurate approximations in many applications, yet are very easy to implement in practice. Differently from exact Bayesian estimates, which involve either (analytical or numerical) integrations, or Markov Chain Monte Carlo sampling, the approximations that we describe involve only numerical optimization and simple algebra. Therefore, they make Bayesian methods widely accessible to metrologists. We illustrate the application of the proposed techniques in several instances of measurement: isotopic ratio of silver in a commercial silver nitrate; odds of cryptosporidiosis in AIDS patients; height of a manometer column; mass fraction of chromium in a reference material; and potential-difference in a Zener voltage standard.
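A Laplace-style approximation of the kind the authors describe (optimize the log-posterior, then use its curvature at the mode as an approximate variance) can be sketched for a Gaussian measurement model; the data, noise level and prior below are invented.

```python
import numpy as np

data = np.array([9.8, 10.1, 10.3, 9.9, 10.2])  # repeated readings
sigma = 0.2                                    # known measurement sd
prior_mean, prior_sd = 10.0, 1.0               # prior on the measurand

def neg_log_post(mu):
    loglik = np.sum((data - mu) ** 2) / (2.0 * sigma**2)
    logprior = (mu - prior_mean) ** 2 / (2.0 * prior_sd**2)
    return loglik + logprior

# A dense grid stands in for a numerical optimizer.
grid = np.linspace(9.0, 11.0, 20001)
mode = grid[np.argmin([neg_log_post(m) for m in grid])]

# Curvature at the mode by finite differences -> approximate sd.
h = 1e-4
curv = (neg_log_post(mode + h) - 2.0 * neg_log_post(mode)
        + neg_log_post(mode - h)) / h**2
post_sd = 1.0 / np.sqrt(curv)
# Conjugacy makes the exact answer available here as a check:
# posterior precision 5/sigma^2 + 1/prior_sd^2 = 126, sd ~ 0.089.
```

Only optimization and simple algebra are used, mirroring the paper's point that such approximations avoid both integration and MCMC sampling.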

  20. Bayesian policy reuse

    CSIR Research Space (South Africa)

    Rosman, Benjamin

    2016-02-01

    Keywords: Policy Reuse · Reinforcement Learning · Online Learning · Online Bandits · Transfer Learning · Bayesian Optimisation · Bayesian Decision Theory. 1 Introduction: As robots and software agents are becoming more ubiquitous in many applications... The agent has access to a library of policies (pi1, pi2 and pi3), and has previously experienced a set of task instances (τ1, τ2, τ3, τ4), as well as samples of the utilities of the library policies on these instances (the black dots indicate the means...

  1. Bayesian methods for hackers probabilistic programming and Bayesian inference

    CERN Document Server

    Davidson-Pilon, Cameron

    2016-01-01

    Bayesian methods of inference are deeply natural and extremely powerful. However, most discussions of Bayesian inference rely on intensely complex mathematical analyses and artificial examples, making it inaccessible to anyone without a strong mathematical background. Now, though, Cameron Davidson-Pilon introduces Bayesian inference from a computational perspective, bridging theory to practice–freeing you to get results using computing power. Bayesian Methods for Hackers illuminates Bayesian inference through probabilistic programming with the powerful PyMC language and the closely related Python tools NumPy, SciPy, and Matplotlib. Using this approach, you can reach effective solutions in small increments, without extensive mathematical intervention. Davidson-Pilon begins by introducing the concepts underlying Bayesian inference, comparing it with other techniques and guiding you through building and training your first Bayesian model. Next, he introduces PyMC through a series of detailed examples a...

  2. Bayesian logistic regression analysis

    NARCIS (Netherlands)

    Van Erp, H.R.N.; Van Gelder, P.H.A.J.M.

    2012-01-01

    In this paper we present a Bayesian logistic regression analysis. It is found that if one wishes to derive the posterior distribution of the probability of some event, then, together with the traditional Bayes Theorem and the integrating out of nuisance parameters, the Jacobian transformation is an

  3. Bayesian Dark Knowledge

    NARCIS (Netherlands)

    Korattikara, A.; Rathod, V.; Murphy, K.; Welling, M.; Cortes, C.; Lawrence, N.D.; Lee, D.D.; Sugiyama, M.; Garnett, R.

    2015-01-01

    We consider the problem of Bayesian parameter estimation for deep neural networks, which is important in problem settings where we may have little data, and/or where we need accurate posterior predictive densities p(y|x, D), e.g., for applications involving bandits or active learning. One simple

  4. Bayesian Geostatistical Design

    DEFF Research Database (Denmark)

    Diggle, Peter; Lophaven, Søren Nymand

    2006-01-01

    locations to, or deletion of locations from, an existing design, and prospective design, which consists of choosing positions for a new set of sampling locations. We propose a Bayesian design criterion which focuses on the goal of efficient spatial prediction whilst allowing for the fact that model...

  5. Bayesian statistical inference

    Directory of Open Access Journals (Sweden)

    Bruno De Finetti

    2017-04-01

    This work was translated into English and published in the volume: Bruno De Finetti, Induction and Probability, Biblioteca di Statistica, eds. P. Monari, D. Cocchi, Clueb, Bologna, 1993. Bayesian statistical inference is one of the last fundamental philosophical papers in which we can find De Finetti's essential approach to statistical inference.

  6. Bayesian grid matching

    DEFF Research Database (Denmark)

    Hartelius, Karsten; Carstensen, Jens Michael

    2003-01-01

    A method for locating distorted grid structures in images is presented. The method is based on the theories of template matching and Bayesian image restoration. The grid is modeled as a deformable template. Prior knowledge of the grid is described through a Markov random field (MRF) model which r...

  7. Bayesian Independent Component Analysis

    DEFF Research Database (Denmark)

    Winther, Ole; Petersen, Kaare Brandt

    2007-01-01

    In this paper we present an empirical Bayesian framework for independent component analysis. The framework provides estimates of the sources, the mixing matrix and the noise parameters, and is flexible with respect to choice of source prior and the number of sources and sensors. Inside the engine...

  8. Bayesian Exponential Smoothing.

    OpenAIRE

    Forbes, C.S.; Snyder, R.D.; Shami, R.S.

    2000-01-01

    In this paper, a Bayesian version of the exponential smoothing method of forecasting is proposed. The approach is based on a state space model containing only a single source of error for each time interval. This model allows us to improve current practices surrounding exponential smoothing by providing both point predictions and measures of the uncertainty surrounding them.
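The single-source-of-error form can be sketched in a few lines: one innovation e_t drives both the observation and the state update. The series and parameter values below are made up for illustration.

```python
# Innovations state-space form of simple exponential smoothing:
#   y_t = l_{t-1} + e_t          (observation)
#   l_t = l_{t-1} + alpha * e_t  (state), same error in both equations
def ses_filter(y, alpha, l0):
    forecasts, errors = [], []
    level = l0
    for obs in y:
        forecasts.append(level)   # one-step-ahead forecast
        e = obs - level           # the single source of error
        errors.append(e)
        level += alpha * e
    return forecasts, errors

f, e = ses_filter([10.0, 12.0, 11.0, 13.0], alpha=0.5, l0=10.0)
```

In a Bayesian treatment, alpha and l0 would receive priors and the innovations would enter the likelihood, yielding predictive distributions rather than bare point forecasts.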

  9. Bayesian optimization for materials science

    CERN Document Server

    Packwood, Daniel

    2017-01-01

    This book provides a short and concise introduction to Bayesian optimization specifically for experimental and computational materials scientists. After explaining the basic idea behind Bayesian optimization and some applications to materials science in Chapter 1, the mathematical theory of Bayesian optimization is outlined in Chapter 2. Finally, Chapter 3 discusses an application of Bayesian optimization to a complicated structure optimization problem in computational surface science. Bayesian optimization is a promising global optimization technique that originates in the field of machine learning and is starting to gain attention in materials science. For the purpose of materials design, Bayesian optimization can be used to predict new materials with novel properties without extensive screening of candidate materials. For the purpose of computational materials science, Bayesian optimization can be incorporated into first-principles calculations to perform efficient, global structure optimizations. While re...

  10. Bayesian network modelling of upper gastrointestinal bleeding

    Science.gov (United States)

    Aisha, Nazziwa; Shohaimi, Shamarina; Adam, Mohd Bakri

    2013-09-01

    Bayesian networks are graphical probabilistic models that represent causal and other relationships between domain variables. In the context of medical decision making, these models have been explored to help in medical diagnosis and prognosis. In this paper, we discuss the Bayesian network formalism in building medical support systems and we learn a tree augmented naive Bayes Network (TAN) from gastrointestinal bleeding data. The accuracy of the TAN in classifying the source of gastrointestinal bleeding into upper or lower source is obtained. The TAN achieves a high classification accuracy of 86% and an area under curve of 92%. A sensitivity analysis of the model shows relatively high levels of entropy reduction for color of the stool, history of gastrointestinal bleeding, consistency and the ratio of blood urea nitrogen to creatinine. The TAN facilitates the identification of the source of GIB and requires further validation.
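A plain naive Bayes classifier, the model that TAN extends with a tree of feature dependencies, can be sketched with hypothetical conditional probabilities; the numbers below are invented, not the paper's data.

```python
import numpy as np

# P(feature = 1 | class) for the classes [upper, lower] GI bleeding.
likelihoods = np.array([[0.9, 0.2],   # melena (dark stool)
                        [0.7, 0.3]])  # elevated BUN/creatinine ratio
prior = np.array([0.5, 0.5])

def classify(features):
    """Posterior over [upper, lower] for binary features, assuming
    conditional independence given the class (the 'naive' step)."""
    post = prior.copy()
    for f, row in zip(features, likelihoods):
        post = post * (row if f else 1.0 - row)
    return post / post.sum()

p = classify([1, 1])   # both findings present -> upper source favoured
```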

  11. Probability and Bayesian statistics

    CERN Document Server

    1987-01-01

    This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics", which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985, the symposium was dedicated to his memory and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers were included especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from the foundations of probability, across psychological aspects of formulating subjective probability statements and abstract measure-theoretical considerations, to contributions to theoretical statistics an...

  12. Bayesian community detection

    DEFF Research Database (Denmark)

    Mørup, Morten; Schmidt, Mikkel N

    2012-01-01

    Many networks of scientific interest naturally decompose into clusters or communities with comparatively fewer external than internal links; however, current Bayesian models of network communities do not exert this intuitive notion of communities. We formulate a nonparametric Bayesian model...... for community detection consistent with an intuitive definition of communities and present a Markov chain Monte Carlo procedure for inferring the community structure. A Matlab toolbox with the proposed inference procedure is available for download. On synthetic and real networks, our model detects communities...... consistent with ground truth, and on real networks, it outperforms existing approaches in predicting missing links. This suggests that community structure is an important structural property of networks that should be explicitly modeled....

  13. Approximate Bayesian recursive estimation

    Czech Academy of Sciences Publication Activity Database

    Kárný, Miroslav

    2014-01-01

    Roč. 285, č. 1 (2014), s. 100-111 ISSN 0020-0255 R&D Projects: GA ČR GA13-13502S Institutional support: RVO:67985556 Keywords : Approximate parameter estimation * Bayesian recursive estimation * Kullback–Leibler divergence * Forgetting Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 4.038, year: 2014 http://library.utia.cas.cz/separaty/2014/AS/karny-0425539.pdf

  14. Subjective Bayesian Beliefs

    DEFF Research Database (Denmark)

    Antoniou, Constantinos; Harrison, Glenn W.; Lau, Morten I.

    2015-01-01

    A large literature suggests that many individuals do not apply Bayes’ Rule when making decisions that depend on them correctly pooling prior information and sample data. We replicate and extend a classic experimental study of Bayesian updating from psychology, employing the methods of experimental economics, with careful controls for the confounding effects of risk aversion. Our results show that risk aversion significantly alters inferences on deviations from Bayes’ Rule....

  15. Bayesian Hypothesis Testing

    Energy Technology Data Exchange (ETDEWEB)

    Andrews, Stephen A. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Sigeti, David E. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-11-15

    These are a set of slides about Bayesian hypothesis testing in the setting where many hypotheses are tested. The conclusions are the following: the value of the Bayes factor obtained when using the median of the posterior marginal is almost the minimum value of the Bayes factor; the value of τ² which minimizes the Bayes factor is a reasonable choice for this parameter. This allows a likelihood ratio to be computed which is the least favorable to H0.
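
    As a hedged illustration of the Bayes-factor idea behind such slides — a textbook single-parameter case, not the authors' multi-hypothesis setup — consider a coin where H0 fixes the bias at 0.5 and H1 places a uniform prior on it; under the uniform prior the binomial marginal likelihood integrates to 1/(n+1):

```python
# Bayes factor B01 for a coin: H0: theta = 0.5 vs H1: theta ~ Uniform(0, 1).
from math import comb

def bayes_factor_01(k, n):
    """P(data | H0) / P(data | H1) for k heads in n flips."""
    like_h0 = comb(n, k) * 0.5 ** n
    # Under a uniform prior, the marginal likelihood of any k is 1 / (n + 1):
    # integral of C(n,k) theta^k (1-theta)^(n-k) over [0,1].
    like_h1 = 1.0 / (n + 1)
    return like_h0 / like_h1

print(round(bayes_factor_01(6, 10), 3))   # → 2.256; B01 > 1 mildly favours H0
```

    Values of B01 above 1 favour the point null; as the observed count moves away from n/2, B01 drops below 1 and the data favour the composite alternative.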

  16. Introduction to Bayesian statistics

    CERN Document Server

    Koch, Karl-Rudolf

    2007-01-01

    This book presents Bayes' theorem, the estimation of unknown parameters, the determination of confidence regions and the derivation of tests of hypotheses for the unknown parameters. It does so in a simple manner that is easy to comprehend. The book compares traditional and Bayesian methods with the rules of probability presented in a logical way allowing an intuitive understanding of random variables and their probability distributions to be formed.

  17. Bayesian ARTMAP for regression.

    Science.gov (United States)

    Sasu, L M; Andonie, R

    2013-10-01

    Bayesian ARTMAP (BA) is a recently introduced neural architecture which uses a combination of Fuzzy ARTMAP competitive learning and Bayesian learning. Training is generally performed online, in a single epoch. During training, BA creates input data clusters as Gaussian categories, and also infers the conditional probabilities between input patterns and categories, and between categories and classes. During prediction, BA uses Bayesian posterior probability estimation. So far, BA was used only for classification. The goal of this paper is to analyze the efficiency of BA for regression problems. Our contributions are: (i) we generalize the BA algorithm using the clustering functionality of both ART modules, and name it BA for Regression (BAR); (ii) we prove that BAR is a universal approximator with the best approximation property. In other words, BAR approximates arbitrarily well any continuous function (universal approximation) and, for every given continuous function, there is one in the set of BAR approximators situated at minimum distance (best approximation); (iii) we experimentally compare the online trained BAR with several neural models, on the following standard regression benchmarks: CPU Computer Hardware, Boston Housing, Wisconsin Breast Cancer, and Communities and Crime. Our results show that BAR is an appropriate tool for regression tasks, both for theoretical and practical reasons. Copyright © 2013 Elsevier Ltd. All rights reserved.

  18. Bayesian theory and applications

    CERN Document Server

    Dellaportas, Petros; Polson, Nicholas G; Stephens, David A

    2013-01-01

    The development of hierarchical models and Markov chain Monte Carlo (MCMC) techniques forms one of the most profound advances in Bayesian analysis since the 1970s and provides the basis for advances in virtually all areas of applied and theoretical Bayesian statistics. This volume guides the reader along a statistical journey that begins with the basic structure of Bayesian theory, and then provides details on most of the past and present advances in this field. The book has a unique format. There is an explanatory chapter devoted to each conceptual advance followed by journal-style chapters that provide applications or further advances on the concept. Thus, the volume is both a textbook and a compendium of papers covering a vast range of topics. It is appropriate for a well-informed novice interested in understanding the basic approach, methods and recent applications. Because of its advanced chapters and recent work, it is also appropriate for a more mature reader interested in recent applications and devel...

  19. River history.

    Science.gov (United States)

    Vita-Finzi, Claudio

    2012-05-13

    During the last half century, advances in geomorphology-abetted by conceptual and technical developments in geophysics, geochemistry, remote sensing, geodesy, computing and ecology-have enhanced the potential value of fluvial history for reconstructing erosional and depositional sequences on the Earth and on Mars and for evaluating climatic and tectonic changes, the impact of fluvial processes on human settlement and health, and the problems faced in managing unstable fluvial systems. This journal is © 2012 The Royal Society

  20. Contrasting population-level responses to Pleistocene climatic oscillations in an alpine bat revealed by complete mitochondrial genomes and evolutionary history inference

    DEFF Research Database (Denmark)

    Alberdi, Antton; Gilbert, M. Thomas P; Razgour, Orly

    2015-01-01

    Aim: We used an integrative approach to reconstruct the evolutionary history of the alpine long-eared bat, Plecotus macrobullaris, to test whether the variable effects of Pleistocene climatic oscillations across geographical regions led to contrasting population-level demographic histories within...... a single species. Location: The Western Palaearctic. Methods: We sequenced the complete mitochondrial genomes of 57 individuals from across the distribution of the species. The analysis integrated ecological niche modelling (ENM), approximate Bayesian computation (ABC), measures of genetic diversity...... and Bayesian phylogenetic methods. Results: We identified two deep lineages: a western lineage, restricted to the Pyrenees and the Alps, and an eastern lineage, which expanded across the mountain ranges east of the Dinarides (Croatia). ENM projections of past conditions predicted that climatic suitability...

  1. Comparing and improving reconstruction methods for proxies based on compositional data

    Science.gov (United States)

    Nolan, C.; Tipton, J.; Booth, R.; Jackson, S. T.; Hooten, M.

    2017-12-01

    Many types of studies in paleoclimatology and paleoecology involve compositional data. Often, these studies aim to use compositional data to reconstruct an environmental variable of interest; the reconstruction is usually done via the development of a transfer function. Transfer functions have been developed using many different methods. Existing methods tend to relate the compositional data and the reconstruction target in very simple ways. Additionally, the results from different methods are rarely compared. Here we seek to address these two issues. First, we introduce a new hierarchical Bayesian multivariate Gaussian process model; this model allows the relationship between each species in the compositional dataset and the environmental variable to be modeled in a way that captures the underlying complexities. Then, we compare this new method to machine learning techniques and commonly used existing methods. The comparisons are based on reconstructing the water table depth history of Caribou Bog (an ombrotrophic Sphagnum peat bog in Old Town, Maine, USA) from a new 7500-year-long record of testate amoebae assemblages. The resulting reconstructions from different methods diverge in both their means and uncertainties. In particular, uncertainty tends to be drastically underestimated by some common methods. These results will help to improve inference of water table depth from testate amoebae. Furthermore, this approach can be applied to test and improve inferences of past environmental conditions from a broad array of paleo-proxies based on compositional data.

  2. Regional variation in otolith Sr:Ca ratios of African longfinned eel Anguilla mossambica and mottled eel Anguilla marmorata: a challenge to the classic tool for reconstructing migratory histories of fishes.

    Science.gov (United States)

    Lin, Y-J; Jessop, B M; Weyl, O L F; Iizuka, Y; Lin, S-H; Tzeng, W-N; Sun, C-L

    2012-07-01

    Otolith Sr:Ca ratios of the African longfinned eel Anguilla mossambica and giant mottled eel Anguilla marmorata from nine freshwater sites in four rivers of South Africa were analysed to reconstruct their migratory life histories between freshwater and saltwater habitats. For A. mossambica, the Sr:Ca ratios in the otolith edge differed significantly among rivers and had large effect sizes, but did not differ among sites within a river. Otolith Sr:Ca ratios did not differ among rivers for A. marmorata. When rivers were pooled, the edge Sr:Ca ratios of A. mossambica were not significantly different from those of A. marmorata. According to the river-specific critical Sr:Ca ratio distinguishing freshwater from saltwater residence, most A. mossambica and A. marmorata had saltwater habitat experience after settlement in fresh water. This was primarily during their elver stage or early in the yellow eel stage. During the middle and late yellow eel stage, freshwater residency was preferred and only sporadic visits were made to saltwater habitats. The data also suggest that regional variations in otolith Sr:Ca ratios affect the critical Sr:Ca value and are a challenge for the reconstruction of migratory life histories that should be explicitly considered to avoid bias and uncertainty. © 2012 The Authors. Journal of Fish Biology © 2012 The Fisheries Society of the British Isles.

  3. Using Bayesian Belief Networks and event trees for volcanic hazard assessment and decision support : reconstruction of past eruptions of La Soufrière volcano, Guadeloupe and retrospective analysis of 1975-77 unrest.

    Science.gov (United States)

    Komorowski, Jean-Christophe; Hincks, Thea; Sparks, Steve; Aspinall, Willy; Legendre, Yoann; Boudon, Georges

    2013-04-01

    Since 1992, mild but persistent seismic and fumarolic unrest at La Soufrière de Guadeloupe volcano has prompted renewed concern about hazards and risks, crisis response planning, and has rejuvenated interest in geological studies. Scientists monitoring active volcanoes frequently have to provide science-based decision support to civil authorities during such periods of unrest. In these circumstances, the Bayesian Belief Network (BBN) offers a formalized evidence analysis tool for making inferences about the state of the volcano from different strands of data, allowing associated uncertainties to be treated in a rational and auditable manner, to the extent warranted by the strength of the evidence. To illustrate the principles of the BBN approach, a retrospective analysis is undertaken of the 1975-77 crisis, providing an inferential assessment of the evolving state of the magmatic system and the probability of subsequent eruption. Conditional dependencies and parameters in the BBN are characterized quantitatively by structured expert elicitation. Revisiting data available in 1976 suggests the probability of magmatic intrusion would have been evaluated high at the time, according with subsequent thinking about the volcanological nature of the episode. The corresponding probability of a magmatic eruption therefore would have been elevated in July and August 1976; however, collective uncertainty about the future course of the crisis was great at the time, even if some individual opinions were certain. From this BBN analysis, while the more likely appraised outcome - based on observational trends at 31 August 1976 - might have been 'no eruption' (mean probability 0.5; 5-95 percentile range 0.8), an imminent magmatic eruption (or blast) could have had a probability of about 0.4, almost as substantial. Thus, there was no real scientific basis to assert one scenario was more likely than the other. This retrospective evaluation adds objective probabilistic expression to

  4. From the history of the recognitions of the remains to the reconstruction of the face of Dante Alighieri by means of techniques of virtual reality and forensic anthropology

    Directory of Open Access Journals (Sweden)

    Stefano Benazzi

    2007-07-01

    The work consists of the reconstruction of the face of the great poet Dante Alighieri through a multidisciplinary approach that combines traditional manual techniques, as used in forensic anthropology, with digital methodologies originally developed in the manufacturing and military fields but increasingly applied to cultural heritage. Since the original skull of Dante could not be obtained, the work started from the data and elements collected by Fabio Frassetto and Giuseppe Sergi, two important anthropologists at the Universities of Bologna and Rome respectively, during an investigation carried out in 1921, the sixth centenary of the poet's death, on his remains in Ravenna. Thanks to this, we have a very accurate description of Dante's bones, including 297 metric data covering the whole skeleton, scaled photographs of the skull in its various norms and of many other bones, as well as a model of the skull subsequently realized by Frassetto. From this information, a geometric reconstruction of Dante Alighieri's skull, including the jaw, was carried out through the use and integration of virtual reality tools and technologies, and the corresponding physical model was produced by rapid prototyping. A particularly important aspect of the work concerns the methodology of 3D modelling proposed for the reconstruction of the jaw (not found during the 1921 examination), starting from a reference model. The prototyped skull model then serves as the basis for the subsequent stage of facial reconstruction using the traditional techniques of forensic art.

  5. Bayesian maximum posterior probability method for interpreting plutonium urinalysis data

    International Nuclear Information System (INIS)

    Miller, G.; Inkret, W.C.

    1996-01-01

    A new internal dosimetry code for interpreting urinalysis data in terms of radionuclide intakes is described for the case of plutonium. The mathematical method is to maximise the Bayesian posterior probability using an entropy function as the prior probability distribution. A software package (MEMSYS) developed for image reconstruction is used. Some advantages of the new code are that it ensures positive calculated dose, it smooths out fluctuating data, and it provides an estimate of the propagated uncertainty in the calculated doses. (author)
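
    The paper maximizes a Bayesian posterior with an entropy prior using the MEMSYS package. As a loose, self-contained sketch of the general MAP idea only — the Gaussian likelihood and exponential prior below are illustrative stand-ins, not the paper's entropy-prior model or MEMSYS code — one can maximize likelihood times prior over a positive grid, which automatically yields a positive estimate:

```python
# Generic MAP sketch: maximize posterior = likelihood x prior over a grid of
# strictly positive intake values. All numbers are fictitious.
from math import exp

measured, sigma = 2.0, 0.8           # fictitious urinalysis result and noise

def posterior(intake):
    likelihood = exp(-0.5 * ((measured - intake) / sigma) ** 2)
    prior = exp(-intake / 5.0)       # mild preference for small positive intakes
    return likelihood * prior

grid = [i / 100 for i in range(1, 1001)]   # intakes in (0, 10]
map_intake = max(grid, key=posterior)
print(round(map_intake, 2))          # → 1.87, pulled slightly below measurement
```

    Restricting the grid to positive values mirrors one advantage the abstract notes: the calculated dose cannot go negative, and the prior shrinks noisy measurements toward plausible values.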

  6. Bayesian Reasoning Using 3D Relations for Lane Marker Detection

    DEFF Research Database (Denmark)

    Boesman, Bart; Jensen, Lars Baunegaard With; Baseski, Emre

    2009-01-01

    We introduce a lane marker detection algorithm that integrates 3D attributes as well as 3D relations between local edges and semi-global contours in a Bayesian framework. The algorithm is parameter free and does not make use of any heuristic assumptions. The reasoning is based on the complete...... to the reconstruction process need to be taken into account to make the reasoning process more stable. The results are shown on a publicly available data set....

  7. Bayesian nonparametric data analysis

    CERN Document Server

    Müller, Peter; Jara, Alejandro; Hanson, Tim

    2015-01-01

    This book reviews nonparametric Bayesian methods and models that have proven useful in the context of data analysis. Rather than providing an encyclopedic review of probability models, the book’s structure follows a data analysis perspective. As such, the chapters are organized by traditional data analysis problems. In selecting specific nonparametric models, simpler and more traditional models are favored over specialized ones. The discussed methods are illustrated with a wealth of examples, including applications ranging from stylized examples to case studies from recent literature. The book also includes an extensive discussion of computational methods and details on their implementation. R code for many examples is included in on-line software pages.

  8. Applied Bayesian modelling

    CERN Document Server

    Congdon, Peter

    2014-01-01

    This book provides an accessible approach to Bayesian computing and data analysis, with an emphasis on the interpretation of real data sets. Following in the tradition of the successful first edition, this book aims to make a wide range of statistical modeling applications accessible using tested code that can be readily adapted to the reader's own applications. The second edition has been thoroughly reworked and updated to take account of advances in the field. A new set of worked examples is included. The novel aspect of the first edition was the coverage of statistical modeling using WinBU

  9. Reconstructing the history of the Atlantic Multidecadal Oscillation using high-resolution Mg/Ca paleothermometry from a Cariaco Basin core

    Science.gov (United States)

    Wurtzel, J. B.; Black, D. E.; Rahman, S.; Thunell, R.; Peterson, L. C.; Tappa, E.

    2010-12-01

    Instrumental and proxy-reconstructions show the existence of an approximately 70-year periodicity in Atlantic sea surface temperature (SST), known as the Atlantic Multidecadal Oscillation (AMO). The AMO is correlated with circum-tropical Atlantic climate phenomena such as Sahel and Nordeste rainfall, and Atlantic hurricane patterns. Though it has been suggested that the AMO is controlled by thermohaline circulation, much debate exists as to whether the SST fluctuations are a result of anthropogenic forcing or a natural climate mode, or even if the AMO is a true oscillation at all. Our ability to address this issue has been limited by instrumental SST records that rarely extend back more than 50-100 years and proxy reconstructions that are mostly terrestrial-based. Additionally, the modern instrumental variability likely contains an anthropogenic component that is not easily distinguished from the natural background of the system. From a marine sediment core taken in the Cariaco Basin, we have developed a high-resolution SST reconstruction for the past ca. 1500 years using Mg/Ca paleothermometry on seasonally-representative foraminifera, with the most recent data calibrated to the instrumental record. Previous studies have shown Cariaco Basin Mg/Ca-SSTs to be well-correlated to the Caribbean Sea and much of the western tropical Atlantic, which allows us to create a record that can be used to determine pre-anthropogenic rates and ranges of SST variability and observe how they change over time. Averaging the seasonal temperatures derived from the two foraminiferal species over the instrumental period yields a strong correlation to the AMO index from A. D. 1880 through 1970 (r = 0.44, p<0.0001). Wavelet analysis of the proxy average annual SST data indicates that modern AMO variability is not a consistent feature through time, and may be a function of warm-period climate.

  10. Mesozoic–Cenozoic Climate and Neotectonic Events as Factors in Reconstructing the Thermal History of the Source-Rock Bazhenov Formation, Arctic Region, West Siberia, by the Example of the Yamal Peninsula

    Science.gov (United States)

    Isaev, V. I.; Iskorkina, A. A.; Lobova, G. A.; Starostenko, V. I.; Tikhotskii, S. A.; Fomin, A. N.

    2018-03-01

    Schemes and criteria are developed for using the measured and modeled geotemperatures for studying the thermal regime of the source rock formations, as well as the tectonic and sedimentary history of sedimentary basins, by the example of the oil fields of the Yamal Peninsula. The method of paleotemperature modeling based on the numerical solution of the heat conduction equation for a horizontally layered solid with a movable upper boundary is used. The mathematical model directly includes the climatic secular trend of the Earth's surface temperature as the boundary condition and the paleotemperatures determined from the vitrinite reflectance as the measurement data. The method does not require a priori information about the nature and intensities of the heat flow from the Earth's interior; the flow is determined by solving the inverse problem of geothermy with a parametric description of the sedimentation history and the history of the thermophysical properties of the sedimentary stratum. The rate of sedimentation is allowed to be zero or negative, which makes it possible to take into account gaps in sedimentation and denudation. The formation, existence, and degradation of the permafrost stratum and ice cover are taken into account as dynamical lithological-stratigraphic complexes with anomalously high thermal conductivity. It is established that disregarding the paleoclimatic factors precludes an adequate reconstruction of the thermal history of the source-rock deposits. Revealing and taking into account the Late Eocene regression provided the computationally optimal and richest thermal history of the source-rock Bazhenov Formation, which led to more correct volumetric-genetic estimates of the reserves. For estimating the hydrocarbon reserves in the land territories of the Arctic region of West Siberia by the volumetric-genetic technique, it is recommended to use the Arctic secular trend of temperatures and take into account the dynamics of the

  11. Impact of Quaternary climatic changes and interspecific competition on the demographic history of a highly mobile generalist carnivore, the coyote.

    Science.gov (United States)

    Koblmüller, Stephan; Wayne, Robert K; Leonard, Jennifer A

    2012-08-23

    Recurrent cycles of climatic change during the Quaternary period have dramatically affected the population genetic structure of many species. We reconstruct the recent demographic history of the coyote (Canis latrans) through the use of Bayesian techniques to examine the effects of Late Quaternary climatic perturbations on the genetic structure of a highly mobile generalist species. Our analysis reveals a lack of phylogeographic structure throughout the range but past population size changes correlated with climatic changes. We conclude that even generalist carnivorous species are very susceptible to environmental changes associated with climatic perturbations. This effect may be enhanced in coyotes by interspecific competition with larger carnivores.

  12. Inference in hybrid Bayesian networks

    DEFF Research Database (Denmark)

    Lanseth, Helge; Nielsen, Thomas Dyhre; Rumí, Rafael

    2009-01-01

    Since the 1980s, Bayesian Networks (BNs) have become increasingly popular for building statistical models of complex systems. This is particularly true for boolean systems, where BNs often prove to be a more efficient modelling framework than traditional reliability-techniques (like fault trees...... decade's research on inference in hybrid Bayesian networks. The discussions are linked to an example model for estimating human reliability....

  13. Searching Algorithm Using Bayesian Updates

    Science.gov (United States)

    Caudle, Kyle

    2010-01-01

    In late October 1967, the USS Scorpion was lost at sea, somewhere between the Azores and Norfolk Virginia. Dr. Craven of the U.S. Navy's Special Projects Division is credited with using Bayesian Search Theory to locate the submarine. Bayesian Search Theory is a straightforward and interesting application of Bayes' theorem which involves searching…
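
    Bayesian Search Theory as summarized above rests on a single application of Bayes' theorem after each unsuccessful search: the searched cell's probability is discounted by the chance of a missed detection, then the distribution is renormalized. A minimal sketch with made-up cell probabilities and detection rate:

```python
# Bayesian search update: after an unsuccessful search of one cell, apply
# Bayes' theorem to the location distribution. All numbers are illustrative.
priors = [0.2, 0.5, 0.3]   # prior probability the wreck lies in each cell
q = 0.8                    # chance of detecting it if we search the right cell

def update_after_miss(p, searched, q):
    """Posterior over cells after a failed search of cell `searched`."""
    post = list(p)
    post[searched] *= (1 - q)      # we searched there and still missed
    z = sum(post)                  # P(miss) -- the normalizing constant
    return [x / z for x in post]

post = update_after_miss(priors, 1, q)
print([round(x, 3) for x in post])   # → [0.333, 0.167, 0.5]
```

    After the miss, the searched cell's probability falls and the others rise; repeating the update while always searching the cell with the highest detection payoff is the strategy credited with locating the Scorpion.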

  14. Bayesian Data Analysis (lecture 2)

    CERN Multimedia

    CERN. Geneva

    2018-01-01

    framework but we will also go into more detail and discuss for example the role of the prior. The second part of the lecture will cover further examples and applications that heavily rely on the bayesian approach, as well as some computational tools needed to perform a bayesian analysis.

  15. Bayesian Data Analysis (lecture 1)

    CERN Multimedia

    CERN. Geneva

    2018-01-01

    framework but we will also go into more detail and discuss for example the role of the prior. The second part of the lecture will cover further examples and applications that heavily rely on the bayesian approach, as well as some computational tools needed to perform a bayesian analysis.

  16. Profile reconstruction from neutron reflectivity data and a priori knowledge

    International Nuclear Information System (INIS)

    Leeb, H.

    2008-01-01

    The problem of incomplete and noisy information in profile reconstruction from neutron reflectometry data is considered. In particular, methods of Bayesian statistics in combination with modelling or inverse scattering techniques are considered in order to properly include the required a priori knowledge and obtain quantitatively reliable estimates of the reconstructed profiles. Applying Bayes' theorem, the results of different experiments on the same sample can be consistently included in the profile reconstruction.

  17. The Bayesian Covariance Lasso.

    Science.gov (United States)

    Khondker, Zakaria S; Zhu, Hongtu; Chu, Haitao; Lin, Weili; Ibrahim, Joseph G

    2013-04-01

    Estimation of sparse covariance matrices and their inverse subject to positive definiteness constraints has drawn a lot of attention in recent years. The abundance of high-dimensional data, where the sample size ( n ) is less than the dimension ( d ), requires shrinkage estimation methods since the maximum likelihood estimator is not positive definite in this case. Furthermore, when n is larger than d but not sufficiently larger, shrinkage estimation is more stable than maximum likelihood as it reduces the condition number of the precision matrix. Frequentist methods have utilized penalized likelihood methods, whereas Bayesian approaches rely on matrix decompositions or Wishart priors for shrinkage. In this paper we propose a new method, called the Bayesian Covariance Lasso (BCLASSO), for the shrinkage estimation of a precision (covariance) matrix. We consider a class of priors for the precision matrix that leads to the popular frequentist penalties as special cases, develop a Bayes estimator for the precision matrix, and propose an efficient sampling scheme that does not precalculate boundaries for positive definiteness. The proposed method is permutation invariant and performs shrinkage and estimation simultaneously for non-full rank data. Simulations show that the proposed BCLASSO performs similarly as frequentist methods for non-full rank data.

  18. Bayesian dynamic mediation analysis.

    Science.gov (United States)

    Huang, Jing; Yuan, Ying

    2017-12-01

    Most existing methods for mediation analysis assume that mediation is a stationary, time-invariant process, which overlooks the inherently dynamic nature of many human psychological processes and behavioral activities. In this article, we consider mediation as a dynamic process that continuously changes over time. We propose Bayesian multilevel time-varying coefficient models to describe and estimate such dynamic mediation effects. By taking the nonparametric penalized spline approach, the proposed method is flexible and able to accommodate any shape of the relationship between time and mediation effects. Simulation studies show that the proposed method works well and faithfully reflects the true nature of the mediation process. By modeling mediation effect nonparametrically as a continuous function of time, our method provides a valuable tool to help researchers obtain a more complete understanding of the dynamic nature of the mediation process underlying psychological and behavioral phenomena. We also briefly discuss an alternative approach of using dynamic autoregressive mediation model to estimate the dynamic mediation effect. The computer code is provided to implement the proposed Bayesian dynamic mediation analysis. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  19. Approximate Bayesian computation.

    Directory of Open Access Journals (Sweden)

    Mikael Sunnåker

    Approximate Bayesian computation (ABC) constitutes a class of computational methods rooted in Bayesian statistics. In all model-based statistical inference, the likelihood function is of central importance, since it expresses the probability of the observed data under a particular statistical model, and thus quantifies the support data lend to particular values of parameters and to choices among different models. For simple models, an analytical formula for the likelihood function can typically be derived. However, for more complex models, an analytical formula might be elusive or the likelihood function might be computationally very costly to evaluate. ABC methods bypass the evaluation of the likelihood function. In this way, ABC methods widen the realm of models for which statistical inference can be considered. ABC methods are mathematically well-founded, but they inevitably make assumptions and approximations whose impact needs to be carefully assessed. Furthermore, the wider application domain of ABC exacerbates the challenges of parameter estimation and model selection. ABC has rapidly gained popularity over recent years, in particular for the analysis of complex problems arising in the biological sciences (e.g., in population genetics, ecology, epidemiology, and systems biology).
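
    The likelihood-free idea described above can be illustrated with a minimal rejection-ABC sketch: draw a parameter from the prior, simulate data, and keep the draw when the simulated summary lands close enough to the observed one. The coin-bias example, tolerance, and sample sizes below are illustrative, not from the article:

```python
# Rejection ABC: approximate a posterior without ever evaluating a likelihood.
import random

random.seed(0)
observed_heads, n = 7, 10

def simulate(theta, n):
    """Simulate the number of heads in n flips of a coin with bias theta."""
    return sum(random.random() < theta for _ in range(n))

accepted = []
while len(accepted) < 2000:
    theta = random.random()                             # Uniform(0, 1) prior
    if abs(simulate(theta, n) - observed_heads) <= 1:   # tolerance: 1 head
        accepted.append(theta)

# The accepted draws approximate the posterior; their mean should sit near
# the exact Beta(8, 4) posterior mean of 8/12 ≈ 0.67, smeared by the tolerance.
print(round(sum(accepted) / len(accepted), 2))
```

    Shrinking the tolerance toward zero recovers the exact posterior at the cost of a lower acceptance rate, which is the basic accuracy/cost trade-off the abstract alludes to.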

  20. Bayesian inference with ecological applications

    CERN Document Server

    Link, William A

    2009-01-01

    This text is written to provide a mathematically sound but accessible and engaging introduction to Bayesian inference specifically for environmental scientists, ecologists and wildlife biologists. It emphasizes the power and usefulness of Bayesian methods in an ecological context. The advent of fast personal computers and easily available software has simplified the use of Bayesian and hierarchical models. One obstacle remains for ecologists and wildlife biologists, namely the near absence of Bayesian texts written specifically for them. The book includes many relevant examples, is supported by software and examples on a companion website and will become an essential grounding in this approach for students and research ecologists. Engagingly written text specifically designed to demystify a complex subject Examples drawn from ecology and wildlife research An essential grounding for graduate and research ecologists in the increasingly prevalent Bayesian approach to inference Companion website with analyt...

  1. Bayesian Inference on Gravitational Waves

    Directory of Open Access Journals (Sweden)

    Asad Ali

    2015-12-01

    Full Text Available The Bayesian approach is increasingly becoming popular among the astrophysics data analysis communities. However, the Pakistan statistics communities are unaware of this fertile interaction between the two disciplines. Bayesian methods have been in use to address astronomical problems since the very birth of Bayes' probability in the eighteenth century. Today the Bayesian methods for the detection and parameter estimation of gravitational waves have solid theoretical grounds with a strong promise for realistic applications. This article aims to introduce the Pakistan statistics communities to the applications of Bayesian Monte Carlo methods in the analysis of gravitational wave data, with an overview of Bayesian signal detection and estimation methods and a demonstration through a couple of simplified examples.

  2. Vertex Reconstruction for AEGIS’ FACT Detector

    CERN Document Server

    Themistokleous, Neofytos

    2017-01-01

    My project dealt with the development of a vertex reconstruction technique to discriminate antihydrogen from background signals in the AEGIS apparatus. It involved the creation of a toy Monte Carlo to simulate particle annihilation events, and a vertex reconstruction utility based on the Bayesian theory of probability. The first results, based on 107 generated events with a single track in the detector, are encouraging. For such events, the algorithm can reconstruct the z-coordinate accurately, while for the r-coordinate the result is less accurate.

  3. How to practise Bayesian statistics outside the Bayesian church: What philosophy for Bayesian statistical modelling?

    NARCIS (Netherlands)

    Borsboom, D.; Haig, B.D.

    2013-01-01

    Unlike most other statistical frameworks, Bayesian statistical inference is wedded to a particular approach in the philosophy of science (see Howson & Urbach, 2006); this approach is called Bayesianism. Rather than being concerned with model fitting, this position in the philosophy of science

  4. Bayesian analysis of ion beam diagnostics

    International Nuclear Information System (INIS)

    Toussaint, U. von; Fischer, R.; Dose, V.

    2001-01-01

    Ion beam diagnostics are routinely used for quantitative analysis of the surface composition of mixture materials up to a depth of a few μm. Unfortunately, advantageous properties of the diagnostics, like high depth resolution in combination with a large penetration depth, non-destructiveness of the surface, and high sensitivity for large as well as for small atomic numbers, are mutually exclusive. Among other things, this is due to the ill-conditioned inverse problem of reconstructing depth distributions of the composition elements. Robust results for depth distributions are obtained with adaptive methods in the framework of Bayesian probability theory. The method of adaptive kernels allows for distributions which contain only the significant information of the data while noise fitting is avoided. This is achieved by adaptively reducing the degrees of freedom supporting the distribution. As applications for ion beam diagnostics, Rutherford backscattering spectroscopy and particle induced X-ray emission are shown

  5. Bayesian Monte Carlo method

    International Nuclear Information System (INIS)

    Rajabalinejad, M.

    2010-01-01

    To reduce the cost of Monte Carlo (MC) simulations for time-consuming processes, Bayesian Monte Carlo (BMC) is introduced in this paper. The BMC method reduces the number of realizations in MC according to the desired accuracy level. BMC also provides the possibility of considering additional priors; in other words, different priors can be integrated into one model by using BMC to further reduce the cost of simulations. This study suggests speeding up the simulation process by considering the logical dependence of neighboring points as prior information. This information is used in the BMC method to produce a predictive tool through the simulation process. The general methodology and algorithm of the BMC method are presented in this paper. The BMC method is applied to a simplified breakwater model as well as the finite element model of the 17th Street Canal in New Orleans, and the results are compared with the MC and Dynamic Bounds methods.

  6. Bayesian nonparametric hierarchical modeling.

    Science.gov (United States)

    Dunson, David B

    2009-04-01

    In biomedical research, hierarchical models are very widely used to accommodate dependence in multivariate and longitudinal data and for borrowing of information across data from different sources. A primary concern in hierarchical modeling is sensitivity to parametric assumptions, such as linearity and normality of the random effects. Parametric assumptions on latent variable distributions can be challenging to check and are typically unwarranted, given available prior knowledge. This article reviews some recent developments in Bayesian nonparametric methods motivated by complex, multivariate and functional data collected in biomedical studies. The author provides a brief review of flexible parametric approaches relying on finite mixtures and latent class modeling. Dirichlet process mixture models are motivated by the need to generalize these approaches to avoid assuming a fixed finite number of classes. Focusing on an epidemiology application, the author illustrates the practical utility and potential of nonparametric Bayes methods.
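The Dirichlet process prior mentioned above induces the Chinese restaurant process (CRP) over partitions, under which the number of classes is random and grows with the data rather than being fixed in advance. A minimal simulation of the CRP (illustrative only, not from the article):

```python
# Chinese restaurant process: customer i joins an existing table with
# probability proportional to its size, or opens a new table with
# probability proportional to the concentration parameter alpha.
import random

random.seed(1)

def crp(n, alpha):
    """Seat n customers; return the table (cluster) assignment of each."""
    assignments = []
    counts = []                       # customers per table
    for i in range(n):
        r = random.uniform(0, i + alpha)
        if r < alpha:                 # open a new table
            counts.append(1)
            assignments.append(len(counts) - 1)
        else:                         # join an existing table, size-biased
            r -= alpha
            for table, c in enumerate(counts):
                if r < c:
                    counts[table] += 1
                    assignments.append(table)
                    break
                r -= c
    return assignments

labels = crp(50, alpha=1.0)
print(len(set(labels)))   # number of occupied clusters (random, data-driven)
```

In a Dirichlet process mixture model each "table" would additionally carry its own parameter drawn from a base measure; the CRP is only the partition part of that model.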

  7. Book review: Bayesian analysis for population ecology

    Science.gov (United States)

    Link, William A.

    2011-01-01

    Brian Dennis described the field of ecology as “fertile, uncolonized ground for Bayesian ideas.” He continued: “The Bayesian propagule has arrived at the shore. Ecologists need to think long and hard about the consequences of a Bayesian ecology. The Bayesian outlook is a successful competitor, but is it a weed? I think so.” (Dennis 2004)

  8. Analysis of trace elements in the shells of short-necked clam Ruditapes philippinarum (Mollusca: Bivalvia) with respect to reconstruction of individual life history

    International Nuclear Information System (INIS)

    Arakawa, Jumpei; Sakamoto, Wataru

    1998-01-01

    Strontium (Sr) concentration in the shells of short-necked clams collected at different locations (Shirahama, warm area and Maizuru, cold area, Japan) was analyzed by two methods, PIXE and EPMA. The Sr concentration of external surface of shell umbo, which was made during short term at early benthic phase, was analyzed by PIXE, and was ranged from 1000 to 3500 ppm for individuals. The Sr concentration of clams collected at Shirahama showed positive correlation with shell length (SL) in individuals with SL < 31 mm, whereas clams collected at Maizuru did not show significant correlation. This result may be caused from the difference of the spawning seasons between two areas. The Sr concentration of cross section of shell umbo, which develops thicker continuously during their life to form faint stratum structure, was analyzed by EPMA along the line across the stratum structure. Some surges and long term waving patterns of the Sr concentration were observed. These results suggest that the life histories of individual clams could be recorded in the shell umbo cross sections as variations of trace elements and analyses of trace elements could clarify the histories of individual clams. (author)

  9. Marine Environmental History

    DEFF Research Database (Denmark)

    Poulsen, Bo

    2012-01-01

    This essay provides an overview of recent trends in the historiography of marine environmental history, a sub-field of environmental history which has grown tremendously in scope and size over the last c. 15 years. The object of marine environmental history is the changing relationship between human society and natural marine resources. Within this broad topic, several trends and objectives are discernible. The essay argues that the so-called material marine environmental history has its main focus on trying to reconstruct the presence, development and environmental impact of past fisheries and whaling operations. This ambition often entails a reconstruction also of how marine life has changed over time. The time frame ranges from the Palaeolithic to the present era. The field of marine environmental history also includes a more culturally oriented environmental history, which mainly has come...

  10. Current trends in Bayesian methodology with applications

    CERN Document Server

    Upadhyay, Satyanshu K; Dey, Dipak K; Loganathan, Appaia

    2015-01-01

    Collecting Bayesian material scattered throughout the literature, Current Trends in Bayesian Methodology with Applications examines the latest methodological and applied aspects of Bayesian statistics. The book covers biostatistics, econometrics, reliability and risk analysis, spatial statistics, image analysis, shape analysis, Bayesian computation, clustering, uncertainty assessment, high-energy astrophysics, neural networking, fuzzy information, objective Bayesian methodologies, empirical Bayes methods, small area estimation, and many more topics. Each chapter is self-contained and focuses on

  11. Gaussian process-based Bayesian nonparametric inference of population size trajectories from gene genealogies.

    Science.gov (United States)

    Palacios, Julia A; Minin, Vladimir N

    2013-03-01

    Changes in population size influence genetic diversity of the population and, as a result, leave a signature of these changes in individual genomes in the population. We are interested in the inverse problem of reconstructing past population dynamics from genomic data. We start with a standard framework based on the coalescent, a stochastic process that generates genealogies connecting randomly sampled individuals from the population of interest. These genealogies serve as a link between the population demographic history and the genomic sequences. It turns out that only the times of genealogical lineage coalescences contain information about population size dynamics. Viewing these coalescent times as a point process, estimating population size trajectories is equivalent to estimating a conditional intensity of this point process. Therefore, our inverse problem is similar to estimating an inhomogeneous Poisson process intensity function. We demonstrate how recent advances in Gaussian process-based nonparametric inference for Poisson processes can be extended to Bayesian nonparametric estimation of population size dynamics under the coalescent. We compare our Gaussian process (GP) approach to one of the state-of-the-art Gaussian Markov random field (GMRF) methods for estimating population trajectories. Using simulated data, we demonstrate that our method has better accuracy and precision. Next, we analyze two genealogies reconstructed from real sequences of hepatitis C and human influenza A viruses. In both cases, we recover more of the accepted features of the viral demographic histories than the GMRF approach does. We also find that our GP method produces more reasonable uncertainty estimates than the GMRF method. Copyright © 2013, The International Biometric Society.
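The reduction to inhomogeneous-Poisson intensity estimation can be illustrated with a crude histogram-style estimator; the Gaussian-process machinery of the paper replaces this piecewise-constant estimate with a smooth prior. The intensity function, thinning bound, and bin count below are invented for the sketch:

```python
# Simulate an inhomogeneous Poisson process with lambda(t) = 100*t on [0, 1]
# by thinning, then form the piecewise-constant MLE: counts / bin width.
import random

random.seed(5)
t_max, lam_max = 1.0, 100.0      # thinning bound >= max of lambda(t)
events = []
t = 0.0
while True:
    t += random.expovariate(lam_max)          # candidate from rate lam_max
    if t > t_max:
        break
    if random.random() < (100.0 * t) / lam_max:   # keep with prob lambda(t)/lam_max
        events.append(t)

bins = 5
width = t_max / bins
counts = [0] * bins
for e in events:
    counts[min(int(e / width), bins - 1)] += 1
lam_hat = [c / width for c in counts]         # per-bin intensity estimate
print(lam_hat)
```

With an increasing intensity, the estimate rises across bins; a GP-based method would instead place a smoothness prior on log-intensity and return a full posterior rather than a point estimate.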

  12. Bayesian image restoration, using configurations

    OpenAIRE

    Thorarinsdottir, Thordis

    2006-01-01

    In this paper, we develop a Bayesian procedure for removing noise from images that can be viewed as noisy realisations of random sets in the plane. The procedure utilises recent advances in configuration theory for noise free random sets, where the probabilities of observing the different boundary configurations are expressed in terms of the mean normal measure of the random set. These probabilities are used as prior probabilities in a Bayesian image restoration approach. Estimation of the re...

  13. Bayesian Networks and Influence Diagrams

    DEFF Research Database (Denmark)

    Kjærulff, Uffe Bro; Madsen, Anders Læsø

    Probabilistic networks, also known as Bayesian networks and influence diagrams, have become one of the most promising technologies in the area of applied artificial intelligence, offering intuitive, efficient, and reliable methods for diagnosis, prediction, decision making, classification, troubleshooting, and data mining under uncertainty. Bayesian Networks and Influence Diagrams: A Guide to Construction and Analysis provides a comprehensive guide for practitioners who wish to understand, construct, and analyze intelligent systems for decision support based on probabilistic networks. Intended...

  14. Reconstruction of the Transmission History of RNA Virus Outbreaks Using Full Genome Sequences: Foot-and-Mouth Disease Virus in Bulgaria in 2011

    DEFF Research Database (Denmark)

    Valdazo-González, Begoña; Polihronova, Lilyana; Alexandrov, Tsviatko

    2012-01-01

    the origin and transmission history of the FMD outbreaks which occurred during 2011 in Burgas Province, Bulgaria, a country that had been previously FMD-free-without-vaccination since 1996. Nineteen full genome sequences (FGS) of FMD virus (FMDV) were generated and analysed, including eight representative... identified in wild boar. The closest relative from outside of Bulgaria was a FMDV collected during 2010 in Bursa (Anatolia, Turkey). Within Bulgaria, two discrete genetic clusters were detected that corresponded to two episodes of outbreaks that occurred during January and March-April 2011. The number of nucleotide substitutions that were present between, and within, these separate clusters provided evidence that undetected FMDV infection had occurred. These conclusions are supported by laboratory data that subsequently identified three additional FMDV-infected livestock premises by serosurveillance, as well...

  15. Delayed breast implant reconstruction

    DEFF Research Database (Denmark)

    Hvilsom, Gitte B.; Hölmich, Lisbet R.; Steding-Jessen, Marianne

    2012-01-01

    We evaluated the association between radiation therapy and severe capsular contracture or reoperation after 717 delayed breast implant reconstruction procedures (288 1-stage and 429 2-stage procedures) identified in the prospective database of the Danish Registry for Plastic Surgery of the Breast during... History of radiation therapy was associated with a non-significantly increased risk of reoperation after both 1-stage (HR = 1.4; 95% CI: 0.7-2.5) and 2-stage (HR = 1.6; 95% CI: 0.9-3.1) procedures. Reconstruction failure was highest (13.2%) in the 2-stage procedures with a history of radiation therapy. Breast reconstruction approaches other than implants should be seriously considered among women who have received radiation therapy.

  16. The Examination of Patient-Reported Outcomes and Postural Control Measures in Patients With and Without a History of ACL Reconstruction: A Case Control Study.

    Science.gov (United States)

    Hoch, Johanna M; Sinnott, Cori W; Robinson, Kendall P; Perkins, William O; Hartman, Jonathan W

    2018-03-01

    There is a lack of literature to support the diagnostic accuracy and cut-off scores of commonly used patient-reported outcome measures (PROMs) and clinician-oriented outcomes such as postural-control assessments (PCAs) when treating post-ACL reconstruction (ACLR) patients. These scores could help tailor treatments, enhance patient-centered care, and may identify individuals in need of additional rehabilitation. To determine if differences in 4 PROMs and 3 PCAs exist between post-ACLR and healthy participants, and to determine the diagnostic accuracy and cut-off scores of these outcomes. Case control. Laboratory. A total of 20 post-ACLR and 40 healthy control participants. The participants completed 4 PROMs (the Disablement in the Physically Active Scale [DPA], the Fear-Avoidance Beliefs Questionnaire [FABQ], the Knee Osteoarthritis Outcomes Score [KOOS] subscales, and the Tampa Scale of Kinesiophobia [TSK-11]) and 3 PCAs (the Balance Error Scoring System [BESS], the modified Star Excursion Balance Test [SEBT], and static balance on an instrumented force plate). Mann-Whitney U tests examined differences between groups. Receiver operating characteristic (ROC) curves were employed to determine sensitivity and specificity. The area under the curve (AUC) was calculated to determine the diagnostic accuracy of each instrument. The Youden Index was used to determine cut-off scores. Alpha was set a priori at P < 0.05. There were significant differences between groups for all PROMs (P < 0.05). There were no differences in PCAs between groups. The cut-off scores should be interpreted with caution for some instruments, as the scores may not be clinically applicable. Post-ACLR participants have decreased self-reported function and health-related quality of life. The PROMs are capable of discriminating between groups. Clinicians should consider using the cut-off scores in clinical practice. Further use of the instruments to examine detriments after completion of standard

  17. History of human activity in last 800 years reconstructed from combined archive data and high-resolution analyses of varved lake sediments from Lake Czechowskie, Northern Poland

    Science.gov (United States)

    Słowiński, Michał; Tyszkowski, Sebastian; Ott, Florian; Obremska, Milena; Kaczmarek, Halina; Theuerkauf, Martin; Wulf, Sabine; Brauer, Achim

    2016-04-01

    The aim of the study was to reconstruct human and landscape development in the Tuchola Pinewoods (Northern Poland) during the last 800 years. We apply an approach that combines historic maps and documents with pollen data. Pollen data were obtained from varved lake sediments at a resolution of 5 years. The chronology of the sediment record is based on varve counting, AMS 14C dating, 137Cs activity concentration measurements and tephrochronology (Askja AD 1875). We applied the REVEALS model to translate pollen percentage data into regional plant abundances. The interpretation of the pollen record is furthermore based on pollen accumulation rate data. The pollen record and historic documents show similar trends in vegetation development. During the first phase (AD 1200-1412), the Lake Czechowskie area was still largely forested with Quercus, Carpinus and Pinus forests. Vegetation was more open during the second phase (AD 1412-1776), and reached maximum openness during the third phase (AD 1776-1905). Furthermore, intensified forest management led to a transformation from mixed to pine-dominated forests during this period. Since the early 20th century, the forest cover increased again with dominance of Scots pine in the stands. While pollen and historic data show similar trends, they differ substantially in the degree of openness during the four phases, with pollen data commonly suggesting more open conditions. We discuss potential causes for this discrepancy, which include unsuitable parameter settings in REVEALS and unknown changes in forest structure. Using pollen accumulation data as a third proxy record, we aim to identify the most probable causes. Finally, we discuss the observed vegetation change in relation to the socio-economic development of the area. This study is a contribution to the Virtual Institute of Integrated Climate and Landscape Evolution Analysis - ICLEA - of the Helmholtz Association and National Science Centre, Poland (grant No. 2011/01/B/ST10

  18. Bayesian error analysis model for reconstructing transcriptional regulatory networks

    OpenAIRE

    Sun, Ning; Carroll, Raymond J.; Zhao, Hongyu

    2006-01-01

    Transcription regulation is a fundamental biological process, and extensive efforts have been made to dissect its mechanisms through direct biological experiments and regulation modeling based on physical–chemical principles and mathematical formulations. Despite these efforts, transcription regulation is yet not well understood because of its complexity and limitations in biological experiments. Recent advances in high throughput technologies have provided substantial amounts and diverse typ...

  19. ICDP Project DeepCHALLA: Reconstructing 250,000 Years of Climate Change and Environmental History on the East African Equator

    Science.gov (United States)

    Wolff, C.; Verschuren, D.; Van Daele, M. E.; Waldmann, N.; Meyer, I.; Lane, C. S.; Van der Meeren, T.; Ombori, T.; Kasanzu, C.; Olago, D.

    2017-12-01

    Sediments on the bottom of Lake Challa, a 92-m deep crater lake on the border of Kenya and Tanzania near Mt. Kilimanjaro, contain a uniquely long and continuous record of past climate and environmental change in easternmost equatorial Africa. Supported in part by the International Continental Scientific Drilling Programme (ICDP), the DeepCHALLA project has now recovered this sediment record down to 214.8 m below the lake floor, with 100% recovery of the uppermost 121.3 m (the last 160 kyr BP) and ca. 85% recovery of the older part of the sequence, down to the lowermost distinct reflector identified in seismic stratigraphy. This acoustic basement represents a ca. 2-m-thick layer of coarsely laminated, diatom-rich organic mud mixed with volcanic sand and silt deposited 250 kyr ago, overlying an estimated 20-30 m of unsampled lacustrine deposits representing the earliest phase of lake development. Down-hole logging produced profiles of in-situ sediment composition that confer an absolute depth scale to both the recovered cores and the seismic stratigraphy. An estimated 74% of the recovered sequence is finely laminated (varved), and continuously so over the upper 72.3 m (the last 90 kyr). All other sections display at least cm-scale lamination, demonstrating persistence of a tranquil, profundal depositional environment throughout lake history. The sequence is interrupted only by 32 visible tephra layers 2 to 9 mm thick, and by several dozen fine-grained turbidites up to 108 cm thick, most of which are clearly bracketed between a non-erosive base and a diatom-laden cap. Tie points between sediment markers and the corresponding seismic reflectors support a preliminary age model inferring a near-constant rate of sediment accumulation over at least the last glacial cycle (140 kyr BP to present). This great time span combined with the exquisite temporal resolution of the Lake Challa sediments provides great opportunities to study past tropical climate dynamics at both short

  20. Bayesian tomography and integrated data analysis in fusion diagnostics

    Science.gov (United States)

    Li, Dong; Dong, Y. B.; Deng, Wei; Shi, Z. B.; Fu, B. Z.; Gao, J. M.; Wang, T. B.; Zhou, Yan; Liu, Yi; Yang, Q. W.; Duan, X. R.

    2016-11-01

    In this article, a Bayesian tomography method using a non-stationary Gaussian process as a prior has been introduced. The Bayesian formalism allows quantities which bear uncertainty to be expressed in probabilistic form, so that the uncertainty of a final solution can be fully resolved from the confidence interval of a posterior probability. Moreover, a consistency check of that solution can be performed by checking whether the misfits between predicted and measured data are reasonably within an assumed data error. In particular, the accuracy of reconstructions is significantly improved by using the non-stationary Gaussian process, which can adapt to the varying smoothness of the emission distribution. The implementation of this method for a soft X-ray diagnostic on HL-2A has been used to explore relevant physics in equilibrium and MHD instability modes. This project is carried out within a large-scale inference framework, aiming at an integrated analysis of heterogeneous diagnostics.

  1. Sparse Bayesian Learning for DOA Estimation with Mutual Coupling

    Directory of Open Access Journals (Sweden)

    Jisheng Dai

    2015-10-01

    Full Text Available Sparse Bayesian learning (SBL) has given renewed interest to the problem of direction-of-arrival (DOA) estimation. It is generally assumed that the measurement matrix in SBL is precisely known. Unfortunately, this assumption may be invalid in practice due to the imperfect manifold caused by unknown or misspecified mutual coupling. This paper describes a modified SBL method for joint estimation of DOAs and mutual coupling coefficients with uniform linear arrays (ULAs). Unlike the existing method that only uses stationary priors, our new approach utilizes a hierarchical form of the Student t prior to enforce the sparsity of the unknown signal more heavily. We also provide a distinct Bayesian inference for the expectation-maximization (EM) algorithm, which can update the mutual coupling coefficients more efficiently. Another difference is that our method uses an additional singular value decomposition (SVD) to reduce the computational complexity of the signal reconstruction process and the sensitivity to the measurement noise.
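Stripped of the array-manifold and mutual-coupling structure specific to the paper, the core SBL iteration alternates a Gaussian posterior update with an EM update of per-coefficient prior variances. A generic sparse-recovery sketch (dimensions, noise level, and iteration count are arbitrary choices):

```python
# Minimal sparse Bayesian learning for y = A x + noise: each coefficient x_i
# gets a zero-mean Gaussian prior with its own variance gamma_i; EM drives
# the gamma_i of irrelevant coefficients toward zero, pruning them.
import numpy as np

rng = np.random.default_rng(0)
n, m, k = 30, 60, 3                      # measurements, dictionary size, sparsity
A = rng.standard_normal((n, m))
x_true = np.zeros(m)
support = rng.choice(m, size=k, replace=False)
x_true[support] = [2.0, -1.5, 1.0]
y = A @ x_true + 0.01 * rng.standard_normal(n)

gamma = np.ones(m)                       # per-coefficient prior variances
sigma2 = 1e-4                            # assumed known noise variance
for _ in range(50):
    # Gaussian posterior over x given the current hyperparameters.
    Sigma = np.linalg.inv(A.T @ A / sigma2 + np.diag(1.0 / gamma))
    mu = Sigma @ (A.T @ y) / sigma2
    # EM update of the prior variances (floored for numerical safety).
    gamma = np.maximum(mu ** 2 + np.diag(Sigma), 1e-12)

x_hat = mu                               # posterior mean = sparse estimate
```

The paper's method additionally parameterizes `A` by unknown mutual-coupling coefficients (updated within EM) and uses an SVD of the measurements to cut the cost of the posterior update; neither refinement is shown here.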

  2. Precipitation history of the central Atacama Desert since the Miocene as reconstructed from clay pan records of the Coastal Cordillera, N Chile

    Science.gov (United States)

    Wennrich, V.; Melles, M.; Diederich, J. L.; Fernández Galego, E.; Ritter, B.; Brill, D.; Niemann, K.; Rolf, C.; Dunai, T. J.

    2017-12-01

    Hyperaridity is a major limitation on Earth-surface processes and biological activity in the Atacama Desert of N Chile, one of the oldest and driest deserts on Earth. But even the hyperarid core of the Atacama Desert has experienced severe precipitation events, e.g., during the flash floods in 2015. On geological timescales, the overall aridity that is postulated to have lasted at least since the early Miocene was punctuated by distinct pluvial events. Such wetter conditions, e.g. during the Miocene, caused widespread lake formation in the Central Depression and Coastal Cordillera, but also amplified surface processes, changed vegetation dynamics, and enabled the dispersal of species. Unfortunately, due to the limited number and heterogeneous appearance of climate archives from the central Atacama, its longer-term precipitation history is still a matter of controversy. This study aims to recover continuous long-term (Miocene-Pleistocene) paleoclimatic and environmental records from the hyperarid core of the Atacama Desert covering the last >10 Ma. Therefore, we investigate clay pan records from endorheic basins in the Coastal Cordillera, mostly formed by blocking of drainage by tectonic movement. The clay pans under study are located along a latitudinal transect across the hyperarid core of the Atacama, and thus are assumed to have recorded local and regional precipitation variations on different timescales. The investigated sequences exhibit significant changes in sedimentological, geochemical, and mineralogical properties due to changes in precipitation, but also in weathering and erosion in the catchments. Diatom and phytolith remains preserved in these records clearly point to significant water bodies and a significant vegetation cover during the wettest periods. The results shed new light on the timing, frequency, and driving mechanisms of the intervening pluvial phases.

  3. Bayesian seismic AVO inversion

    Energy Technology Data Exchange (ETDEWEB)

    Buland, Arild

    2002-07-01

    A new linearized AVO inversion technique is developed in a Bayesian framework. The objective is to obtain posterior distributions for P-wave velocity, S-wave velocity and density. Distributions for other elastic parameters can also be assessed, for example acoustic impedance, shear impedance and P-wave to S-wave velocity ratio. The inversion algorithm is based on the convolutional model and a linearized weak contrast approximation of the Zoeppritz equation. The solution is represented by a Gaussian posterior distribution with explicit expressions for the posterior expectation and covariance, hence exact prediction intervals for the inverted parameters can be computed under the specified model. The explicit analytical form of the posterior distribution provides a computationally fast inversion method. Tests on synthetic data show that all inverted parameters were almost perfectly retrieved when the noise approached zero. With realistic noise levels, acoustic impedance was the best determined parameter, while the inversion provided practically no information about the density. The inversion algorithm has also been tested on a real 3-D dataset from the Sleipner Field. The results show good agreement with well logs but the uncertainty is high. The stochastic model includes uncertainties of both the elastic parameters, the wavelet and the seismic and well log data. The posterior distribution is explored by Markov chain Monte Carlo simulation using the Gibbs sampler algorithm. The inversion algorithm has been tested on a seismic line from the Heidrun Field with two wells located on the line. The uncertainty of the estimated wavelet is low. In the Heidrun examples the effect of including uncertainty of the wavelet and the noise level was marginal with respect to the AVO inversion results. We have developed a 3-D linearized AVO inversion method with spatially coupled model parameters where the objective is to obtain posterior distributions for P-wave velocity, S
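The closed-form Gaussian posterior that makes this inversion fast is standard linear-Gaussian conjugacy: for a linearized forward model d = G m + e with Gaussian prior and noise, the posterior mean and covariance are explicit. A generic sketch with a toy forward operator (not the Zoeppritz-based operator of the paper):

```python
# Gaussian posterior for a linear(ized) inverse problem d = G m + e:
#   Cpost = (G^T Ce^-1 G + Cm^-1)^-1
#   mpost = m0 + Cpost G^T Ce^-1 (d - G m0)
# Exact prediction intervals follow from the diagonal of Cpost.
import numpy as np

rng = np.random.default_rng(42)
G = rng.standard_normal((20, 5))          # toy linearized forward operator
m_true = rng.standard_normal(5)
Ce = 0.05 * np.eye(20)                    # data-error covariance
Cm = np.eye(5)                            # prior model covariance
m0 = np.zeros(5)                          # prior mean
d = G @ m_true + rng.multivariate_normal(np.zeros(20), Ce)

Ce_inv = np.linalg.inv(Ce)
Cpost = np.linalg.inv(G.T @ Ce_inv @ G + np.linalg.inv(Cm))
mpost = m0 + Cpost @ G.T @ Ce_inv @ (d - G @ m0)

# 95% marginal credible half-widths for each inverted parameter.
half_width = 1.96 * np.sqrt(np.diag(Cpost))
```

In the paper, `G` encodes the convolutional model and the weak-contrast Zoeppritz linearization, and the 3-D extension couples model parameters spatially through the prior covariance.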

  4. Bayesian nonparametric dictionary learning for compressed sensing MRI.

    Science.gov (United States)

    Huang, Yue; Paisley, John; Lin, Qin; Ding, Xinghao; Fu, Xueyang; Zhang, Xiao-Ping

    2014-12-01

    We develop a Bayesian nonparametric model for reconstructing magnetic resonance images (MRIs) from highly undersampled k-space data. We perform dictionary learning as part of the image reconstruction process. To this end, we use the beta process as a nonparametric dictionary learning prior for representing an image patch as a sparse combination of dictionary elements. The size of the dictionary and the patch-specific sparsity pattern are inferred from the data, in addition to other dictionary learning variables. Dictionary learning is performed directly on the compressed image, and so is tailored to the MRI being considered. In addition, we investigate a total variation penalty term in combination with the dictionary learning model, and show how the denoising property of dictionary learning removes dependence on regularization parameters in the noisy setting. We derive a stochastic optimization algorithm based on Markov chain Monte Carlo for the Bayesian model, and use the alternating direction method of multipliers for efficiently performing total variation minimization. We present empirical results on several MRIs, which show that the proposed regularization framework can improve reconstruction accuracy over other methods.

  5. Iterative image reconstruction in ECT

    International Nuclear Information System (INIS)

    Chintu Chen; Ordonez, C.E.; Wernick, M.N.; Aarsvold, J.N.; Gunter, D.L.; Wong, W.H.; Kapp, O.H.; Xiaolong Ouyang; Levenson, M.; Metz, C.E.

    1992-01-01

    A series of preliminary studies has been performed in the authors' laboratories to explore the use of a priori information in Bayesian image restoration and reconstruction. One piece of a priori information is the fact that intensities of neighboring pixels tend to be similar if they belong to the same region within which similar tissue characteristics are exhibited. This property of local continuity can be modeled by the use of Gibbs priors, as first suggested by Geman and Geman. In their investigation, they also included line sites between each pair of neighboring pixels in the Gibbs prior and used discrete binary numbers to indicate the absence or presence of boundaries between regions. These two features of the a priori model permit averaging within boundaries of homogeneous regions to alleviate the degradation caused by Poisson noise. With the use of this Gibbs prior in combination with the technique of stochastic relaxation, Geman and Geman demonstrated that noise levels can be reduced significantly in 2-D image restoration. The authors have developed a Bayesian method that utilizes a Gibbs prior to describe the spatial correlation of neighboring regions and takes into account the effect of limited spatial resolution as well. The statistical framework of the proposed approach is based on the data augmentation scheme suggested by Tanner and Wong. Briefly outlined here, this Bayesian method is based on Geman and Geman's approach

  6. Bayesian networks improve causal environmental ...

    Science.gov (United States)

    Rule-based weight of evidence approaches to ecological risk assessment may not account for uncertainties and generally lack probabilistic integration of lines of evidence. Bayesian networks allow causal inferences to be made from evidence by including causal knowledge about the problem, using this knowledge with probabilistic calculus to combine multiple lines of evidence, and minimizing biases in predicting or diagnosing causal relationships. Too often, conventional weight of evidence approaches ignore sources of uncertainty that Bayesian networks can account for. Specifying and propagating uncertainties improves the ability of models to incorporate the strength of the evidence in the risk management phase of an assessment. Probabilistic inference from a Bayesian network allows evaluation of changes in uncertainty for variables from the evidence. The network structure and probabilistic framework of a Bayesian approach provide advantages over qualitative approaches in weight of evidence for capturing the impacts of multiple sources of quantifiable uncertainty on predictions of ecological risk. Bayesian networks can facilitate the development of evidence-based policy under conditions of uncertainty by incorporating analytical inaccuracies or the implications of imperfect information, structuring and communicating causal issues through qualitative directed graph formulations, and quantitatively comparing the causal power of multiple stressors on value

  7. Bayesian Latent Class Analysis Tutorial.

    Science.gov (United States)

    Li, Yuelin; Lord-Bessen, Jennifer; Shiyko, Mariya; Loeb, Rebecca

    2018-01-01

    This article is a how-to guide on Bayesian computation using Gibbs sampling, demonstrated in the context of Latent Class Analysis (LCA). It is written for students in quantitative psychology or related fields who have a working knowledge of Bayes' Theorem and conditional probability and have experience in writing computer programs in the statistical language R. The overall goals are to provide an accessible and self-contained tutorial, along with a practical computation tool. We begin with how Bayesian computation is typically described in academic articles. Technical difficulties are addressed by a hypothetical, worked-out example. We show how Bayesian computation can be broken down into a series of simpler calculations, which can then be assembled together to complete a computationally more complex model. The details are described much more explicitly than what is typically available in elementary introductions to Bayesian modeling so that readers are not overwhelmed by the mathematics. Moreover, the provided computer program shows how Bayesian LCA can be implemented with relative ease. The computer program is then applied to a large, real-world data set and explained line-by-line. We outline the general steps in how to extend these considerations to other methodological applications. We conclude with suggestions for further readings.
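
    The article's core idea, breaking Bayesian LCA into a series of simple conditional draws, can be sketched outside R as well. Below is a minimal Python Gibbs sampler for a two-class LCA with binary items; the simulated data, the Beta(1,1) priors, and all parameter values are illustrative assumptions, not the article's program.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical simulated data: 500 respondents, 3 binary items, two latent classes.
true_theta = np.array([[0.9, 0.8, 0.7],   # item-response probabilities, class 0
                       [0.2, 0.3, 0.1]])  # item-response probabilities, class 1
true_z = (rng.random(500) < 0.4).astype(int)
X = (rng.random((500, 3)) < true_theta[true_z]).astype(int)

def gibbs_lca(X, n_iter=1500, burn=500):
    """Two-class LCA: alternate three simple conditional draws."""
    n, J = X.shape
    z = rng.integers(0, 2, size=n)                  # random initial class labels
    pi_draws = []
    for it in range(n_iter):
        # 1) Item probabilities theta_cj | z ~ Beta(1 + hits, 1 + misses).
        theta = np.empty((2, J))
        for c in (0, 1):
            Xc = X[z == c]
            theta[c] = rng.beta(1 + Xc.sum(axis=0), 1 + len(Xc) - Xc.sum(axis=0))
        # 2) Class prevalence pi | z ~ Beta(1 + n_class1, 1 + n_class0).
        pi = rng.beta(1 + (z == 1).sum(), 1 + (z == 0).sum())
        # 3) Labels z_i | theta, pi: Bernoulli with the posterior class probability.
        like = np.prod(theta[:, None, :] ** X * (1 - theta[:, None, :]) ** (1 - X), axis=2)
        w1 = pi * like[1]
        w0 = (1 - pi) * like[0]
        z = (rng.random(n) < w1 / (w0 + w1)).astype(int)
        if it >= burn:
            pi_draws.append(pi)
    return np.array(pi_draws)

pi_post = gibbs_lca(X)
print(round(float(pi_post.mean()), 2))   # posterior mean prevalence of one class
```

    Each full sweep is just three conjugate updates; note that because the class labels are arbitrary, the sampler may converge to either labeling of the two classes (label switching).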

  8. Kernel Bayesian ART and ARTMAP.

    Science.gov (United States)

    Masuyama, Naoki; Loo, Chu Kiong; Dawood, Farhan

    2018-02-01

    Adaptive Resonance Theory (ART) is one of the successful approaches to resolving "the plasticity-stability dilemma" in neural networks, and its supervised learning model called ARTMAP is a powerful tool for classification. Among several improvements, such as Fuzzy- or Gaussian-based models, the state-of-the-art model is the Bayesian-based one, which resolves the drawbacks of the others. However, it is known that the Bayesian approach incurs a high computational cost for high-dimensional data and large numbers of samples, and the covariance matrix in the likelihood becomes unstable. This paper introduces Kernel Bayesian ART (KBA) and ARTMAP (KBAM) by integrating Kernel Bayes' Rule (KBR) and the Correntropy Induced Metric (CIM) into Bayesian ART (BA) and ARTMAP (BAM), respectively, while maintaining the properties of BA and BAM. The kernel frameworks in KBA and KBAM are able to avoid the curse of dimensionality. In addition, the covariance-free Bayesian computation by KBR provides efficient and stable computational capability to KBA and KBAM. Furthermore, correntropy-based similarity measurement improves the noise reduction ability even in high-dimensional spaces. The simulation experiments show that KBA exhibits superior self-organizing capability compared with BA, and KBAM provides superior classification ability compared with BAM. Copyright © 2017 Elsevier Ltd. All rights reserved.

  9. Coupling erosion and topographic development in the rainiest place on Earth: Reconstructing the Shillong Plateau uplift history with in-situ cosmogenic 10Be

    Science.gov (United States)

    Rosenkranz, Ruben; Schildgen, Taylor; Wittmann, Hella; Spiegel, Cornelia

    2018-02-01

    The uplift of the Shillong Plateau, in northeast India between the Bengal floodplain and the Himalaya Mountains, has had a significant impact on regional precipitation patterns, strain partitioning, and the path of the Brahmaputra River. Today, the plateau receives the highest measured yearly rainfall in the world and is tectonically active, having hosted one of the strongest intra-plate earthquakes ever recorded. Despite the unique tectonic and climatic setting of this prominent landscape feature, its exhumation and surface uplift history are poorly constrained. We collected 14 detrital river sand and 3 bedrock samples from the southern margin of the Shillong Plateau to measure erosion rates using the terrestrial cosmogenic nuclide 10Be. The calculated bedrock erosion rates range from 2.0 to 5.6 m My-1, whereas catchment average erosion rates from detrital river sands range from 48 to 214 m My-1. These rates are surprisingly low in the context of steep, tectonically active slopes and extreme rainfall. Moreover, the highest among these rates, which occur on the low-relief plateau surface, appear to have been affected by anthropogenic land-use change. To determine the onset of surface uplift, we coupled the catchment averaged erosion rates with topographic analyses of the plateau's southern margin. We interpolated an inclined, pre-incision surface from minimally eroded remnants along the valley interfluves and calculated the eroded volume of the valleys carved beneath the surface. The missing volume was then divided by the volume flux derived from the erosion rates to obtain the onset of uplift. The results of this calculation, ranging from 3.0 to 5.0 Ma for individual valleys, are in agreement with several lines of stratigraphic evidence from the Brahmaputra and Bengal basin that constrain the onset of topographic uplift, specifically the onset of flexural loading and the transgression from deltaic to marine deposition. Ultimately, our data corroborate the
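
    The onset-of-uplift calculation described above is a simple volume budget: divide the eroded valley volume by the volume flux, i.e. erosion rate times catchment area. A sketch with hypothetical round numbers (not the study's measured values):

```python
# Hypothetical illustrative values, not the study's measurements.
erosion_rate_m_per_My = 150.0    # catchment-averaged 10Be erosion rate (m/My)
catchment_area_km2 = 20.0        # catchment area (km^2)
eroded_volume_km3 = 12.0         # valley volume carved beneath the interpolated surface

# Volume flux out of the catchment, in km^3 per My.
flux_km3_per_My = (erosion_rate_m_per_My / 1000.0) * catchment_area_km2

# Onset of incision = missing volume / volume flux.
onset_My = eroded_volume_km3 / flux_km3_per_My
print(onset_My)  # 4.0, i.e. ~4 Ma, within the paper's 3.0-5.0 Ma range
```

    The implicit assumption is that the present-day erosion rate is representative of the whole incision period; a faster past rate would shorten the inferred onset age, a slower one would lengthen it.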

  10. Bayesian analyses of Yemeni mitochondrial genomes suggest multiple migration events with Africa and Western Eurasia.

    Science.gov (United States)

    Vyas, Deven N; Kitchen, Andrew; Miró-Herrans, Aida T; Pearson, Laurel N; Al-Meeri, Ali; Mulligan, Connie J

    2016-03-01

    Anatomically modern humans are thought to have migrated out of Africa ∼60,000 years ago in the first successful global dispersal. This initial migration may have passed through Yemen, a region that has experienced multiple migration events with Africa and Eurasia throughout human history. We use Bayesian phylogenetics to determine how ancient and recent migrations have shaped Yemeni mitogenomic variation. We sequenced 113 mitogenomes from multiple Yemeni regions with a focus on haplogroups M, N, and L3(xM,N) as these groups have the oldest evolutionary history outside of Africa. We performed Bayesian evolutionary analyses to generate time-measured phylogenies calibrated by Neanderthal and Denisovan mitogenomes in order to determine the age of Yemeni-specific clades. As defined by Yemeni monophyly, Yemeni in situ evolution is limited to the Holocene or latest Pleistocene (ages of clades in subhaplogroups L3b1a1a, L3h2, L3x1, M1a1f, M1a5, N1a1a3, and N1a3 range from 2 to 14 kya) and is often situated within broader Horn of Africa/southern Arabia in situ evolution (L3h2, L3x1, M1a1f, M1a5, and N1a1a3 ages range from 7 to 29 kya). Five subhaplogroups show no monophyly and are candidates for Holocene migration into Yemen (L0a2a2a, L3d1a1a, L3i2, M1a1b, and N1b1a). Yemeni mitogenomes are largely the product of Holocene migration, and subsequent in situ evolution, from Africa and western Eurasia. However, we hypothesize that recent population movements may obscure the genetic signature of more ancient migrations. Additional research, e.g., analyses of Yemeni nuclear genetic data, is needed to better reconstruct the complex population and migration histories associated with Out of Africa. © 2015 Wiley Periodicals, Inc.

  11. Climate Reconstructions

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The NOAA Paleoclimatology Program archives reconstructions of past climatic conditions derived from paleoclimate proxies, in addition to the Program's large holdings...

  12. Interactive Instruction in Bayesian Inference

    DEFF Research Database (Denmark)

    Khan, Azam; Breslav, Simon; Hornbæk, Kasper

    2018-01-01

    An instructional approach is presented to improve human performance in solving Bayesian inference problems. Starting from the original text of the classic Mammography Problem, the textual expression is modified and visualizations are added according to Mayer's principles of instruction. These principles concern coherence, personalization, signaling, segmenting, multimedia, spatial contiguity, and pretraining. Principles of self-explanation and interactivity are also applied. Four experiments on the Mammography Problem showed that these principles help participants answer the questions, suggesting that an instructional approach to improving human performance in Bayesian inference is a promising direction.
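
    For reference, the Mammography Problem itself is a direct application of Bayes' theorem. A minimal sketch using the standard textbook figures (1% prevalence, 80% sensitivity, 9.6% false-positive rate; this record does not state which figures the study used):

```python
def posterior_positive(prior, sensitivity, false_positive_rate):
    """P(disease | positive test) by Bayes' theorem."""
    # Total probability of a positive test, over diseased and healthy cases.
    p_positive = prior * sensitivity + (1 - prior) * false_positive_rate
    return prior * sensitivity / p_positive

p = posterior_positive(prior=0.01, sensitivity=0.80, false_positive_rate=0.096)
print(round(p, 3))  # 0.078 -- far lower than most untrained respondents estimate
```

    The difficulty the instructional materials target is exactly this base-rate effect: the low prior dominates the high sensitivity, so a positive test still leaves the probability of disease under 8%.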

  13. Probability biases as Bayesian inference

    Directory of Open Access Journals (Sweden)

    André C. R. Martins

    2006-11-01

    Full Text Available In this article, I will show how several observed biases in human probabilistic reasoning can be partially explained as good heuristics for making inferences in an environment where probabilities have uncertainties associated to them. Previous results show that the weight functions and the observed violations of coalescing and stochastic dominance can be understood from a Bayesian point of view. We will review those results and see that Bayesian methods should also be used as part of the explanation behind other known biases. That means that, although the observed errors are still errors, they can be understood as adaptations to the solution of real-life problems. Heuristics that allow fast evaluations and mimic a Bayesian inference would be an evolutionary advantage, since they would give us an efficient way of making decisions. In that sense, it should be no surprise that humans reason with probability as it has been observed.

  14. Bayesian analysis of CCDM models

    Science.gov (United States)

    Jesus, J. F.; Valentim, R.; Andrade-Oliveira, F.

    2017-09-01

    Creation of Cold Dark Matter (CCDM), in the context of the Einstein Field Equations, produces a negative pressure term which can be used to explain the accelerated expansion of the Universe. In this work we tested six different spatially flat models for matter creation using statistical criteria, in light of SNe Ia data: the Akaike Information Criterion (AIC), the Bayesian Information Criterion (BIC) and the Bayesian Evidence (BE). These criteria allow models to be compared on goodness of fit and number of free parameters, penalizing excess complexity. We find that the JO model is slightly favoured over the LJO/ΛCDM model; however, neither of these, nor the Γ = 3αH0 model, can be discarded by the current analysis. Three other scenarios are discarded either because of poor fitting or because of an excess of free parameters. A method of increasing the Bayesian evidence through reparameterization, in order to reduce parameter degeneracy, is also developed.
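
    AIC and BIC trade goodness of fit against parameter count in slightly different ways: BIC's penalty grows with the sample size. A sketch with hypothetical log-likelihoods (the paper's SNe Ia fit values are not reproduced here):

```python
import math

def aic(loglik, k):
    return 2 * k - 2 * loglik            # AIC = 2k - 2 ln L

def bic(loglik, k, n):
    return k * math.log(n) - 2 * loglik  # BIC = k ln n - 2 ln L

# Two hypothetical fits to the same dataset of n = 580 supernovae:
# model B gains a little likelihood at the price of one extra free parameter.
n = 580
loglik_a, k_a = -28.1, 1
loglik_b, k_b = -27.6, 2

d_aic = aic(loglik_b, k_b) - aic(loglik_a, k_a)
d_bic = bic(loglik_b, k_b, n) - bic(loglik_a, k_a, n)
print(round(d_aic, 3), round(d_bic, 3))  # 1.0 5.363: BIC penalizes the extra parameter harder
```

    With ln(580) ≈ 6.36, each extra parameter costs about 6.4 BIC units against only 2 AIC units, which is why criteria of this kind can "discard" scenarios with an excess of free parameters even when they fit slightly better.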

  15. Bayesian analysis of CCDM models

    Energy Technology Data Exchange (ETDEWEB)

    Jesus, J.F. [Universidade Estadual Paulista (Unesp), Câmpus Experimental de Itapeva, Rua Geraldo Alckmin 519, Vila N. Sra. de Fátima, Itapeva, SP, 18409-010 Brazil (Brazil); Valentim, R. [Departamento de Física, Instituto de Ciências Ambientais, Químicas e Farmacêuticas—ICAQF, Universidade Federal de São Paulo (UNIFESP), Unidade José Alencar, Rua São Nicolau No. 210, Diadema, SP, 09913-030 Brazil (Brazil); Andrade-Oliveira, F., E-mail: jfjesus@itapeva.unesp.br, E-mail: valentim.rodolfo@unifesp.br, E-mail: felipe.oliveira@port.ac.uk [Institute of Cosmology and Gravitation—University of Portsmouth, Burnaby Road, Portsmouth, PO1 3FX United Kingdom (United Kingdom)

    2017-09-01

    Creation of Cold Dark Matter (CCDM), in the context of the Einstein Field Equations, produces a negative pressure term which can be used to explain the accelerated expansion of the Universe. In this work we tested six different spatially flat models for matter creation using statistical criteria, in light of SNe Ia data: the Akaike Information Criterion (AIC), the Bayesian Information Criterion (BIC) and the Bayesian Evidence (BE). These criteria allow models to be compared on goodness of fit and number of free parameters, penalizing excess complexity. We find that the JO model is slightly favoured over the LJO/ΛCDM model; however, neither of these, nor the Γ = 3αH0 model, can be discarded by the current analysis. Three other scenarios are discarded either because of poor fitting or because of an excess of free parameters. A method of increasing the Bayesian evidence through reparameterization, in order to reduce parameter degeneracy, is also developed.

  16. Learning Bayesian networks for discrete data

    KAUST Repository

    Liang, Faming; Zhang, Jian

    2009-01-01

    Bayesian networks have received much attention in the recent literature. In this article, we propose an approach to learn Bayesian networks using the stochastic approximation Monte Carlo (SAMC) algorithm. Our approach has two nice features. Firstly

  17. Bayesian Network Induction via Local Neighborhoods

    National Research Council Canada - National Science Library

    Margaritis, Dimitris

    1999-01-01

    We present an efficient algorithm for learning Bayesian networks from data. Our approach constructs Bayesian networks by first identifying each node's Markov blankets, then connecting nodes in a consistent way...

  18. Can a significance test be genuinely Bayesian?

    OpenAIRE

    Pereira, Carlos A. de B.; Stern, Julio Michael; Wechsler, Sergio

    2008-01-01

    The Full Bayesian Significance Test, FBST, is extensively reviewed. Its test statistic, a genuine Bayesian measure of evidence, is discussed in detail. Its behavior in some problems of statistical inference like testing for independence in contingency tables is discussed.

  19. Bayesian modeling using WinBUGS

    CERN Document Server

    Ntzoufras, Ioannis

    2009-01-01

    A hands-on introduction to the principles of Bayesian modeling using WinBUGS. Bayesian Modeling Using WinBUGS provides an easily accessible introduction to the use of WinBUGS programming techniques in a variety of Bayesian modeling settings. The author provides an accessible treatment of the topic, offering readers a smooth introduction to the principles of Bayesian modeling with detailed guidance on the practical implementation of key principles. The book begins with a basic introduction to Bayesian inference and the WinBUGS software and goes on to cover key topics, including: Markov Chain Monte Carlo algorithms in Bayesian inference; generalized linear models; Bayesian hierarchical models; predictive distribution and model checking; Bayesian model and variable evaluation. Computational notes and screen captures illustrate the use of both WinBUGS and R software to apply the discussed techniques. Exercises at the end of each chapter allow readers to test their understanding of the presented concepts and all ...

  20. The subjectivity of scientists and the Bayesian approach

    CERN Document Server

    Press, James S

    2001-01-01

    Comparing and contrasting the reality of subjectivity in the work of history's great scientists and the modern Bayesian approach to statistical analysis. Scientists and researchers are taught to analyze their data from an objective point of view, allowing the data to speak for themselves rather than assigning them meaning based on expectations or opinions. But scientists have never behaved fully objectively. Throughout history, some of our greatest scientific minds have relied on intuition, hunches, and personal beliefs to make sense of empirical data, and these subjective influences have often a

  1. Bayesian methods for chromosome dosimetry following a criticality accident

    International Nuclear Information System (INIS)

    Brame, R.S.; Groer, P.G.

    2003-01-01

    Radiation doses received during a criticality accident will be from a combination of fission spectrum neutrons and gamma rays. It is desirable to estimate the total dose, as well as the neutron and gamma doses. Present methods for dose estimation with chromosome aberrations after a criticality accident use point estimates of the neutron to gamma dose ratio obtained from personnel dosemeters and/or accident reconstruction calculations. In this paper a Bayesian approach to dose estimation with chromosome aberrations is developed that allows the uncertainty of the dose ratio to be considered. Posterior probability densities for the total and the neutron and gamma doses were derived. (author)

  2. 87Sr/86Sr isotope ratio analysis by laser ablation MC-ICP-MS in scales, spines, and fin rays as a nonlethal alternative to otoliths for reconstructing fish life history

    Science.gov (United States)

    Willmes, Malte; Glessner, Justin J. G.; Carleton, Scott A.; Gerrity, Paul C.; Hobbs, James A.

    2016-01-01

    Strontium isotope ratios (87Sr/86Sr) in otoliths are a well-established tool to determine origins and movement patterns of fish. However, otolith extraction requires sacrificing fish, and when working with protected or endangered species, the use of nonlethal samples such as scales, spines, and fin rays is preferred. Unlike otoliths, which are predominantly aragonite, these tissues are composed of biological apatite. Laser ablation multicollector inductively coupled plasma mass spectrometry (LA-MC-ICP-MS) analysis of biological apatite can induce significant interference on mass 87, causing inaccurate 87Sr/86Sr measurements. To quantify this interference, we applied LA-MC-ICP-MS to three marine samples (a white seabass (Atractoscion nobilis) otolith, a green sturgeon (Acipenser medirostris) pectoral fin ray, and a salmon shark (Lamna ditropis) tooth) and to freshwater walleye (Sander vitreus) otoliths, scales, and spines. Instrument conditions that maximize signal intensity resulted in elevated 87Sr/86Sr isotope ratios in the bioapatite samples, related to a polyatomic interference (40Ca31P16O, 40Ar31P16O). Retuning instrument conditions to reduce oxide levels removed this interference, resulting in accurate 87Sr/86Sr ratios across all tissue samples. This method provides a novel, nonlethal alternative to otolith analysis to reconstruct fish life histories.

  3. Inference in hybrid Bayesian networks

    International Nuclear Information System (INIS)

    Langseth, Helge; Nielsen, Thomas D.; Rumi, Rafael; Salmeron, Antonio

    2009-01-01

    Since the 1980s, Bayesian networks (BNs) have become increasingly popular for building statistical models of complex systems. This is particularly true for boolean systems, where BNs often prove to be a more efficient modelling framework than traditional reliability techniques (like fault trees and reliability block diagrams). However, limitations in the BNs' calculation engine have prevented BNs from becoming equally popular for domains containing mixtures of both discrete and continuous variables (the so-called hybrid domains). In this paper we focus on these difficulties, and summarize some of the last decade's research on inference in hybrid Bayesian networks. The discussions are linked to an example model for estimating human reliability.

  4. 3D Bayesian contextual classifiers

    DEFF Research Database (Denmark)

    Larsen, Rasmus

    2000-01-01

    We extend a series of multivariate Bayesian 2-D contextual classifiers to 3-D by specifying a simultaneous Gaussian distribution for the feature vectors as well as a prior distribution of the class variables of a pixel and its 6 nearest 3-D neighbours.

  5. Bayesian methods for proteomic biomarker development

    Directory of Open Access Journals (Sweden)

    Belinda Hernández

    2015-12-01

    In this review we provide an introduction to Bayesian inference and demonstrate some of the advantages of using a Bayesian framework. We summarize how Bayesian methods have been used previously in proteomics and other areas of bioinformatics. Finally, we describe some popular and emerging Bayesian models from the statistical literature and provide a worked tutorial including code snippets to show how these methods may be applied for the evaluation of proteomic biomarkers.

  6. Mastectomy Skin Necrosis After Breast Reconstruction: A Comparative Analysis Between Autologous Reconstruction and Implant-Based Reconstruction.

    Science.gov (United States)

    Sue, Gloria R; Lee, Gordon K

    2018-05-01

    Mastectomy skin necrosis is a significant problem after breast reconstruction. We sought to perform a comparative analysis of this complication between patients undergoing autologous breast reconstruction and patients undergoing 2-stage expander implant breast reconstruction. A retrospective review was performed on consecutive patients undergoing autologous breast reconstruction or 2-stage expander implant breast reconstruction by the senior author from 2006 through 2015. Patient demographic factors including age, body mass index, history of diabetes, history of smoking, and history of radiation to the breast were collected. Our primary outcome measure was mastectomy skin necrosis. The Fisher exact test was used for statistical analysis between the 2 patient cohorts. The treatment patterns of mastectomy skin necrosis were then analyzed. We identified 204 patients who underwent autologous breast reconstruction and 293 patients who underwent 2-stage expander implant breast reconstruction. Patients undergoing autologous breast reconstruction were older, heavier, more likely to have diabetes, and more likely to have had prior radiation to the breast compared with patients undergoing implant-based reconstruction. The incidence of mastectomy skin necrosis was 30.4% in the autologous group compared with only 10.6% in the tissue expander group (P < …). While a larger share of affected patients in the autologous group were managed with local wound care, only 3.2% in the tissue expander group were treated with local wound care (P < …). Mastectomy skin necrosis is significantly more likely to occur after autologous breast reconstruction than after 2-stage expander implant-based breast reconstruction. Patients with autologous reconstructions are more readily treated with local wound care compared with patients with tissue expanders, who tended to require operative treatment of this complication. Patients considering breast reconstruction should be counseled appropriately regarding the differences in incidence and management of mastectomy skin necrosis.

  7. Bayesian networks and food security - An introduction

    NARCIS (Netherlands)

    Stein, A.

    2004-01-01

    This paper gives an introduction to Bayesian networks. Networks are defined and put into a Bayesian context. Directed acyclical graphs play a crucial role here. Two simple examples from food security are addressed. Possible uses of Bayesian networks for implementation and further use in decision

  8. Plug & Play object oriented Bayesian networks

    DEFF Research Database (Denmark)

    Bangsø, Olav; Flores, J.; Jensen, Finn Verner

    2003-01-01

    Object oriented Bayesian networks have been shown to be quite suitable for dynamic domains as well. However, processing object oriented Bayesian networks in practice does not take advantage of their modular structure. Normally the object oriented Bayesian network is transformed into a Bayesian network, and inference is performed on that network... The communication needed between instances is achieved by means of a fill-in propagation scheme.

  9. A Bayesian framework for risk perception

    NARCIS (Netherlands)

    van Erp, H.R.N.

    2017-01-01

    We present here a Bayesian framework of risk perception. This framework encompasses plausibility judgments, decision making, and question asking. Plausibility judgments are modeled by way of Bayesian probability theory, decision making is modeled by way of a Bayesian decision theory, and relevancy

  10. Human migration patterns in Yemen and implications for reconstructing prehistoric population movements.

    Directory of Open Access Journals (Sweden)

    Aida T Miró-Herrans

    Full Text Available Population migration has played an important role in human evolutionary history and in the patterning of human genetic variation. A deeper and empirically-based understanding of human migration dynamics is needed in order to interpret genetic and archaeological evidence and to accurately reconstruct the prehistoric processes that comprise human evolutionary history. Current empirical estimates of migration include either short time frames (i.e., within one generation) or partial knowledge about migration, such as the proportion of migrants or the distance of migration. An analysis of migration that includes proportion of migrants as well as distance and direction over multiple generations would better inform prehistoric reconstructions. To evaluate human migration, we use GPS coordinates from the place of residence of the Yemeni individuals sampled in our study, their birthplaces, and their parents' and grandparents' birthplaces to calculate the proportion of migrants, as well as the distance and direction of migration events between each generation. We test for differences in these values between the generations and identify factors that influence the probability of migration. Our results show that the proportion and distance of migration between females and males is similar within generations. In contrast, the proportion and distance of migration is significantly lower in the grandparents' generation, most likely reflecting the decreasing effect of technology. Based on our results, we calculate the proportion of migration events (0.102) and the mean and median distances of migration (96 km and 26 km) for the grandparents' generation to represent early times in human evolution. These estimates can serve to set parameter values of demographic models in model-based methods of prehistoric reconstruction, such as approximate Bayesian computation. Our study provides the first empirically-based estimates of human migration over multiple generations in a developing

  11. Bayesian NL interpretation and learning

    NARCIS (Netherlands)

    Zeevat, H.

    2011-01-01

    Everyday natural language communication is normally successful, even though contemporary computational linguistics has shown that NL is characterised by a very high degree of ambiguity and that the results of stochastic methods are not good enough to explain the high success rate. Bayesian natural language

  12. Bayesian image restoration, using configurations

    DEFF Research Database (Denmark)

    Thorarinsdottir, Thordis

    configurations are expressed in terms of the mean normal measure of the random set. These probabilities are used as prior probabilities in a Bayesian image restoration approach. Estimation of the remaining parameters in the model is outlined for salt and pepper noise. The inference in the model is discussed...

  13. Bayesian image restoration, using configurations

    DEFF Research Database (Denmark)

    Thorarinsdottir, Thordis Linda

    2006-01-01

    configurations are expressed in terms of the mean normal measure of the random set. These probabilities are used as prior probabilities in a Bayesian image restoration approach. Estimation of the remaining parameters in the model is outlined for the salt and pepper noise. The inference in the model is discussed...

  14. Differentiated Bayesian Conjoint Choice Designs

    NARCIS (Netherlands)

    Z. Sándor (Zsolt); M. Wedel (Michel)

    2003-01-01

    textabstractPrevious conjoint choice design construction procedures have produced a single design that is administered to all subjects. This paper proposes to construct a limited set of different designs. The designs are constructed in a Bayesian fashion, taking into account prior uncertainty about

  15. Bayesian Networks and Influence Diagrams

    DEFF Research Database (Denmark)

    Kjærulff, Uffe Bro; Madsen, Anders Læsø

    Bayesian Networks and Influence Diagrams: A Guide to Construction and Analysis, Second Edition, provides a comprehensive guide for practitioners who wish to understand, construct, and analyze intelligent systems for decision support based on probabilistic networks. This new edition contains six new...

  16. Bayesian Sampling using Condition Indicators

    DEFF Research Database (Denmark)

    Faber, Michael H.; Sørensen, John Dalsgaard

    2002-01-01

    of condition indicators introduced by Benjamin and Cornell (1970) a Bayesian approach to quality control is formulated. The formulation is then extended to the case where the quality control is based on sampling of indirect information about the condition of the components, i.e. condition indicators...

  17. Bayesian Classification of Image Structures

    DEFF Research Database (Denmark)

    Goswami, Dibyendu; Kalkan, Sinan; Krüger, Norbert

    2009-01-01

    In this paper, we describe work on Bayesian classifiers for distinguishing between homogeneous structures, textures, edges and junctions. We build semi-local classifiers from hand-labeled images to distinguish between these four different kinds of structures based on the concept of intrinsic dimensi...

  18. Bayesian estimates of linkage disequilibrium

    Directory of Open Access Journals (Sweden)

    Abad-Grau María M

    2007-06-01

    Full Text Available Abstract Background The maximum likelihood estimator of D' – a standard measure of linkage disequilibrium – is biased toward disequilibrium, and the bias is particularly evident in small samples and rare haplotypes. Results This paper proposes a Bayesian estimation of D' to address this problem. The reduction of the bias is achieved by using a prior distribution on the pair-wise associations between single nucleotide polymorphisms (SNPs) that increases the likelihood of equilibrium with increasing physical distances between pairs of SNPs. We show how to compute the Bayesian estimate using a stochastic estimation based on MCMC methods, and also propose a numerical approximation to the Bayesian estimates that can be used to estimate patterns of LD in large datasets of SNPs. Conclusion Our Bayesian estimator of D' corrects the bias toward disequilibrium that affects the maximum likelihood estimator. A consequence of this feature is a more objective view about the extent of linkage disequilibrium in the human genome, and a more realistic number of tagging SNPs to fully exploit the power of genome-wide association studies.
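As background for the bias discussed in this record, the maximum likelihood estimator of D' (the quantity the proposed Bayesian prior shrinks toward equilibrium) can be sketched in a few lines. The function name and haplotype counts below are illustrative assumptions, not taken from the paper:

```python
# ML estimate of D' for two biallelic SNPs from the four haplotype counts.
def d_prime(n_AB, n_Ab, n_aB, n_ab):
    n = n_AB + n_Ab + n_aB + n_ab
    p_AB = n_AB / n
    p_A = (n_AB + n_Ab) / n          # frequency of allele A at SNP 1
    p_B = (n_AB + n_aB) / n          # frequency of allele B at SNP 2
    D = p_AB - p_A * p_B             # raw disequilibrium coefficient
    # D' normalizes D by its maximum attainable magnitude given the
    # allele frequencies, so it lies in [-1, 1].
    if D >= 0:
        d_max = min(p_A * (1 - p_B), (1 - p_A) * p_B)
    else:
        d_max = min(p_A * p_B, (1 - p_A) * (1 - p_B))
    return D / d_max if d_max > 0 else 0.0
```

In small samples, sampling noise alone pushes |D'| toward 1, which is the bias the record's Bayesian estimator corrects.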

  19. 3-D contextual Bayesian classifiers

    DEFF Research Database (Denmark)

    Larsen, Rasmus

    In this paper we will consider extensions of a series of Bayesian 2-D contextual classification procedures proposed by Owen (1984), Hjort & Mohn (1984), Welch & Salter (1971) and Haslett (1985) to 3 spatial dimensions. It is evident that compared to classical pixelwise classification further...

  20. Bayesian Alternation During Tactile Augmentation

    Directory of Open Access Journals (Sweden)

    Caspar Mathias Goeke

    2016-10-01

    Full Text Available A large number of studies suggest that the integration of multisensory signals by humans is well described by Bayesian principles. However, there are very few reports about cue combination between a native and an augmented sense. In particular, we asked whether adult participants are able to integrate an augmented sensory cue with existing native sensory information. For the purpose of this study we built a tactile augmentation device and compared different hypotheses of how untrained adult participants combine information from a native and an augmented sense. In a two-interval forced choice (2-IFC) task, while subjects were blindfolded and seated on a rotating platform, our sensory augmentation device translated information on whole-body yaw rotation into tactile stimulation. Three conditions were realized: tactile stimulation only (augmented condition), rotation only (native condition), and both augmented and native information (bimodal condition). Participants had to choose which of two consecutive rotations had the larger rotation angle. For the analysis, we fitted the participants' responses with a probit model and calculated the just-noticeable difference (JND). We then compared several models for predicting bimodal from unimodal responses. An objective Bayesian alternation model yielded a better prediction (reduced χ² = 1.67) than the Bayesian integration model (reduced χ² = 4.34). A non-Bayesian winner-takes-all model, which used either only native or only augmented values per subject for prediction, showed slightly higher accuracy (reduced χ² = 1.64). However, the performance of the Bayesian alternation model could be substantially improved (reduced χ² = 1.09) by utilizing subjective weights obtained from a questionnaire. As a result, the subjective Bayesian alternation model predicted bimodal performance most accurately among all tested models. These results suggest that information from augmented and existing sensory modalities in
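The probit fit and JND computation described in this record can be illustrated roughly as follows. This is a sketch with a simulated observer and invented stimulus levels, not the study's analysis code; the 75%-correct criterion for the JND is also an assumption:

```python
# Fit a probit psychometric function to simulated 2-IFC responses by maximum
# likelihood and read off a just-noticeable difference (JND).
import numpy as np
from scipy.stats import norm
from scipy.optimize import minimize

rng = np.random.default_rng(0)
deltas = np.repeat(np.linspace(-20, 20, 9), 40)     # rotation differences (deg)
true_sigma = 6.0
# Simulated observer: P(report "second larger") = Phi(delta / sigma)
resp = rng.random(deltas.size) < norm.cdf(deltas / true_sigma)

def neg_log_lik(params):
    mu, log_sigma = params                           # bias and (log) noise
    p = norm.cdf((deltas - mu) / np.exp(log_sigma))
    p = np.clip(p, 1e-9, 1 - 1e-9)                   # guard log(0)
    return -np.sum(resp * np.log(p) + (~resp) * np.log(1 - p))

fit = minimize(neg_log_lik, x0=[0.0, np.log(5.0)])
mu_hat, sigma_hat = fit.x[0], np.exp(fit.x[1])
jnd = sigma_hat * norm.ppf(0.75)                     # 75%-correct criterion
```

Model comparison as in the record would then proceed by predicting bimodal JNDs from the unimodal fits and scoring each candidate model by reduced χ².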

  1. Topics in Bayesian statistics and maximum entropy

    International Nuclear Information System (INIS)

    Mutihac, R.; Cicuttin, A.; Cerdeira, A.; Stanciulescu, C.

    1998-12-01

    Notions of Bayesian decision theory and maximum entropy methods are reviewed with particular emphasis on probabilistic inference and Bayesian modeling. The axiomatic approach is considered as the best justification of Bayesian analysis and maximum entropy principle applied in natural sciences. Particular emphasis is put on solving the inverse problem in digital image restoration and Bayesian modeling of neural networks. Further topics addressed briefly include language modeling, neutron scattering, multiuser detection and channel equalization in digital communications, genetic information, and Bayesian court decision-making. (author)

  2. Bayesian networks for evaluation of evidence from forensic entomology.

    Science.gov (United States)

    Andersson, M Gunnar; Sundström, Anders; Lindström, Anders

    2013-09-01

    In the aftermath of a CBRN incident, there is an urgent need to reconstruct events in order to bring the perpetrators to court and to take preventive actions for the future. The challenge is to discriminate, based on available information, between alternative scenarios. Forensic interpretation is used to evaluate to what extent results from the forensic investigation favor the prosecutors' or the defendants' arguments, using the framework of Bayesian hypothesis testing. Recently, several new scientific disciplines have been used in a forensic context. In the AniBioThreat project, the framework was applied to veterinary forensic pathology, tracing of pathogenic microorganisms, and forensic entomology. Forensic entomology is an important tool for estimating the postmortem interval in, for example, homicide investigations as a complement to more traditional methods. In this article we demonstrate the applicability of the Bayesian framework for evaluating entomological evidence in a forensic investigation through the analysis of a hypothetical scenario involving suspect movement of carcasses from a clandestine laboratory. Probabilities of different findings under the alternative hypotheses were estimated using a combination of statistical analysis of data, expert knowledge, and simulation, and entomological findings are used to update the beliefs about the prosecutors' and defendants' hypotheses and to calculate the value of evidence. The Bayesian framework proved useful for evaluating complex hypotheses using findings from several insect species, accounting for uncertainty about development rate, temperature, and precolonization. The applicability of the forensic statistic approach to evaluating forensic results from a CBRN incident is discussed.
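The value-of-evidence calculation at the core of this framework is the likelihood ratio V = P(E | Hp) / P(E | Hd), which updates the prior odds between the prosecution and defence hypotheses. A minimal sketch; all probabilities below are invented for illustration, not estimates from the AniBioThreat project:

```python
# Likelihood-ratio ("value of evidence") bookkeeping in odds form.
def value_of_evidence(p_given_hp, p_given_hd):
    # V > 1 favours the prosecution hypothesis, V < 1 the defence hypothesis.
    return p_given_hp / p_given_hd

def posterior_odds(prior_odds, likelihood_ratio):
    # Bayes' rule in odds form: posterior odds = LR * prior odds.
    return likelihood_ratio * prior_odds

# Hypothetical entomological finding, 10x more likely under Hp than Hd:
V = value_of_evidence(0.30, 0.03)
post = posterior_odds(prior_odds=0.5, likelihood_ratio=V)
```

In the record's setting, P(E | Hp) and P(E | Hd) are themselves estimated by combining data analysis, expert knowledge and simulation over insect development rates, temperature and precolonization time.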

  3. Bayesian analysis of rare events

    Energy Technology Data Exchange (ETDEWEB)

    Straub, Daniel, E-mail: straub@tum.de; Papaioannou, Iason; Betz, Wolfgang

    2016-06-01

    In many areas of engineering and science there is an interest in predicting the probability of rare events, in particular in applications related to safety and security. Increasingly, such predictions are made through computer models of physical systems in an uncertainty quantification framework. Additionally, with advances in IT, monitoring and sensor technology, an increasing amount of data on the performance of the systems is collected. This data can be used to reduce uncertainty, improve the probability estimates and consequently enhance the management of rare events and associated risks. Bayesian analysis is the ideal method to include the data into the probabilistic model. It ensures a consistent probabilistic treatment of uncertainty, which is central in the prediction of rare events, where extrapolation from the domain of observation is common. We present a framework for performing Bayesian updating of rare event probabilities, termed BUS. It is based on a reinterpretation of the classical rejection-sampling approach to Bayesian analysis, which enables the use of established methods for estimating probabilities of rare events. By drawing upon these methods, the framework makes use of their computational efficiency. These methods include the First-Order Reliability Method (FORM), tailored importance sampling (IS) methods and Subset Simulation (SuS). In this contribution, we briefly review these methods in the context of the BUS framework and investigate their applicability to Bayesian analysis of rare events in different settings. We find that, for some applications, FORM can be highly efficient and is surprisingly accurate, enabling Bayesian analysis of rare events with just a few model evaluations. In a general setting, BUS implemented through IS and SuS is more robust and flexible.
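The rejection-sampling reinterpretation underlying BUS can be sketched on a toy one-parameter model: draw from the prior, accept each sample with probability proportional to its likelihood, and estimate the rare-event probability from the accepted (posterior) samples. The model, numbers and variable names here are assumptions for illustration, not taken from the paper, and a plain Monte Carlo estimate stands in for the FORM/IS/SuS machinery:

```python
# Rejection-sampling view of Bayesian updating of a rare-event probability.
import numpy as np

rng = np.random.default_rng(1)
obs, noise_std = 1.5, 1.0                      # one noisy measurement of theta

def likelihood(theta):
    return np.exp(-0.5 * ((obs - theta) / noise_std) ** 2)

theta = rng.standard_normal(200_000)           # prior: theta ~ N(0, 1)
L = likelihood(theta)
accept = rng.random(theta.size) * L.max() < L  # accept w.p. L(theta)/c
posterior = theta[accept]                      # samples from the posterior

p_failure_prior = np.mean(theta > 3.0)         # rare event: theta > 3
p_failure_post = np.mean(posterior > 3.0)      # updated with the data
```

For genuinely rare events this naive estimator needs far too many samples, which is exactly why BUS plugs established reliability methods (FORM, importance sampling, Subset Simulation) into this rejection-sampling structure.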

  4. Vaginal reconstruction

    International Nuclear Information System (INIS)

    Lesavoy, M.A.

    1985-01-01

    Vaginal reconstruction can be an uncomplicated and straightforward procedure when attention to detail is maintained. The Abbe-McIndoe procedure of lining the neovaginal canal with split-thickness skin grafts has become standard. The use of the inflatable Heyer-Schulte vaginal stent provides comfort to the patient and ease to the surgeon in maintaining approximation of the skin graft. For large vaginal and perineal defects, myocutaneous flaps such as the gracilis island have been extremely useful for correction of radiation-damaged tissue of the perineum or for the reconstruction of large ablative defects. Minimal morbidity and scarring ensue because the donor site can be closed primarily. With all vaginal reconstruction, a compliant patient is a necessity. The patient must wear a vaginal obturator for a minimum of 3 to 6 months postoperatively and is encouraged to use intercourse as an excellent obturator. In general, vaginal reconstruction can be an extremely gratifying procedure for both the functional and emotional well-being of patients

  5. ACL Reconstruction

    Science.gov (United States)

    ... in moderate exercise and recreational activities, or play sports that put less stress on the knees. ACL reconstruction is generally recommended if: You're an athlete and want to continue in your sport, especially if the sport involves jumping, cutting or ...

  6. Reconstructing the Limfjord’s history

    DEFF Research Database (Denmark)

    Philippsen, Bente

    The Limfjord is a sound in Northern Jutland, Denmark, connecting the North Sea with the Kattegatt. The complex interplay of eustatic sea level changes and isostatic land-rise caused the relative sea level of the region to fluctuate throughout the later part of the Holocene. Consequently, the regi...

  7. Sparse linear models: Variational approximate inference and Bayesian experimental design

    International Nuclear Information System (INIS)

    Seeger, Matthias W

    2009-01-01

    A wide range of problems such as signal reconstruction, denoising, source separation, feature selection, and graphical model search are addressed today by posterior maximization for linear models with sparsity-favouring prior distributions. The Bayesian posterior contains useful information far beyond its mode, which can be used to drive methods for sampling optimization (active learning), feature relevance ranking, or hyperparameter estimation, if only this representation of uncertainty can be approximated in a tractable manner. In this paper, we review recent results for variational sparse inference, and show that they share underlying computational primitives. We discuss how sampling optimization can be implemented as sequential Bayesian experimental design. While there has been tremendous recent activity to develop sparse estimation, little attention has been paid to sparse approximate inference. In this paper, we argue that many problems in practice, such as compressive sensing for real-world image reconstruction, are served much better by proper uncertainty approximations than by ever more aggressive sparse estimation algorithms. Moreover, since some variational inference methods have been given strong convex optimization characterizations recently, theoretical analysis may become possible, promising new insights into nonlinear experimental design.

  8. Sparse linear models: Variational approximate inference and Bayesian experimental design

    Energy Technology Data Exchange (ETDEWEB)

    Seeger, Matthias W [Saarland University and Max Planck Institute for Informatics, Campus E1.4, 66123 Saarbruecken (Germany)

    2009-12-01

    A wide range of problems such as signal reconstruction, denoising, source separation, feature selection, and graphical model search are addressed today by posterior maximization for linear models with sparsity-favouring prior distributions. The Bayesian posterior contains useful information far beyond its mode, which can be used to drive methods for sampling optimization (active learning), feature relevance ranking, or hyperparameter estimation, if only this representation of uncertainty can be approximated in a tractable manner. In this paper, we review recent results for variational sparse inference, and show that they share underlying computational primitives. We discuss how sampling optimization can be implemented as sequential Bayesian experimental design. While there has been tremendous recent activity to develop sparse estimation, little attention has been paid to sparse approximate inference. In this paper, we argue that many problems in practice, such as compressive sensing for real-world image reconstruction, are served much better by proper uncertainty approximations than by ever more aggressive sparse estimation algorithms. Moreover, since some variational inference methods have been given strong convex optimization characterizations recently, theoretical analysis may become possible, promising new insights into nonlinear experimental design.

  9. Parallelized Bayesian inversion for three-dimensional dental X-ray imaging.

    Science.gov (United States)

    Kolehmainen, Ville; Vanne, Antti; Siltanen, Samuli; Järvenpää, Seppo; Kaipio, Jari P; Lassas, Matti; Kalke, Martti

    2006-02-01

    Diagnostic and operational tasks based on dental radiology often require three-dimensional (3-D) information that is not available in a single X-ray projection image. Comprehensive 3-D information about tissues can be obtained by computerized tomography (CT) imaging. However, in dental imaging a conventional CT scan may not be available or practical because of high radiation dose, low resolution, or the cost of the CT scanner equipment. In this paper, we consider a novel type of 3-D imaging modality for dental radiology. We consider situations in which projection images of the teeth are taken from a few sparsely distributed projection directions using the dentist's regular (digital) X-ray equipment and the 3-D X-ray attenuation function is reconstructed. A complication in these experiments is that the reconstruction of the 3-D structure based on a few projection images becomes an ill-posed inverse problem. Bayesian inversion is a well-suited framework for reconstruction from such incomplete data. In Bayesian inversion, the ill-posed reconstruction problem is formulated in a well-posed probabilistic form in which a priori information is used to compensate for the incomplete information of the projection data. In this paper we propose a Bayesian method for 3-D reconstruction in dental radiology. The method is partially based on Kolehmainen et al. 2003. The prior model for dental structures consists of a weighted ℓ1 and total variation (TV) prior together with a positivity prior. The inverse problem is stated as finding the maximum a posteriori (MAP) estimate. To make the 3-D reconstruction computationally feasible, a parallelized version of an optimization algorithm is implemented for a Beowulf cluster computer. The method is tested with projection data from dental specimens and patient data. Tomosynthetic reconstructions are given as reference for the proposed method.

  10. Bayesian estimation methods in metrology

    International Nuclear Information System (INIS)

    Cox, M.G.; Forbes, A.B.; Harris, P.M.

    2004-01-01

    In metrology -- the science of measurement -- a measurement result must be accompanied by a statement of its associated uncertainty. The degree of validity of a measurement result is determined by the validity of the uncertainty statement. In recognition of the importance of uncertainty evaluation, the International Organization for Standardization in 1995 published the Guide to the Expression of Uncertainty in Measurement and the Guide has been widely adopted. The validity of uncertainty statements is tested in interlaboratory comparisons in which an artefact is measured by a number of laboratories and their measurement results compared. Since the introduction of the Mutual Recognition Arrangement, key comparisons are being undertaken to determine the degree of equivalence of laboratories for particular measurement tasks. In this paper, we discuss the possible development of the Guide to reflect Bayesian approaches and the evaluation of key comparison data using Bayesian estimation methods

  11. Deep Learning and Bayesian Methods

    Directory of Open Access Journals (Sweden)

    Prosper Harrison B.

    2017-01-01

    Full Text Available A revolution is underway in which deep neural networks are routinely used to solve difficult problems such as face recognition and natural language understanding. Particle physicists have taken notice and have started to deploy these methods, achieving results that suggest a potentially significant shift in how data might be analyzed in the not too distant future. We discuss a few recent developments in the application of deep neural networks and then indulge in speculation about how such methods might be used to automate certain aspects of data analysis in particle physics. Next, the connection to Bayesian methods is discussed and the paper ends with thoughts on a significant practical issue, namely, how, from a Bayesian perspective, one might optimize the construction of deep neural networks.

  12. Bayesian inference on proportional elections.

    Directory of Open Access Journals (Sweden)

    Gabriel Hideki Vatanabe Brunello

    Full Text Available Polls for majoritarian voting systems usually show estimates of the percentage of votes for each candidate. However, proportional vote systems do not necessarily guarantee that the candidate with the highest percentage of votes will be elected. Thus, traditional methods used in majoritarian elections cannot be applied to proportional elections. In this context, the purpose of this paper was to perform Bayesian inference on proportional elections considering the Brazilian system of seat distribution. More specifically, a methodology was developed to estimate the probability that a given party will have representation in the Chamber of Deputies. Inferences were made in a Bayesian setting using the Monte Carlo simulation technique, and the developed methodology was applied to data from the Brazilian elections for Members of the Legislative Assembly and Federal Chamber of Deputies in 2010. A performance rate was also presented to evaluate the efficiency of the methodology. Calculations and simulations were carried out using the free R statistical software.
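The kind of Monte Carlo computation described here can be sketched as follows. Note the paper targets the Brazilian seat-distribution rules and uses R; this illustration substitutes a simple d'Hondt allocation, a Dirichlet posterior over vote shares, and invented poll numbers:

```python
# Monte Carlo estimate of P(party wins at least one seat) under a
# d'Hondt-style proportional allocation.
import numpy as np

def dhondt_seats(votes, n_seats):
    seats = np.zeros(len(votes), dtype=int)
    for _ in range(n_seats):
        quotients = votes / (seats + 1)     # d'Hondt divisors 1, 2, 3, ...
        seats[np.argmax(quotients)] += 1    # next seat to largest quotient
    return seats

rng = np.random.default_rng(7)
poll_counts = np.array([480, 310, 150, 60])             # hypothetical poll
# Posterior over true vote shares: Dirichlet with a flat prior.
posterior = rng.dirichlet(poll_counts + 1, size=5_000)
# Probability that the smallest party (index 3) wins at least one of 10 seats:
at_least_one = np.mean([dhondt_seats(s, 10)[3] >= 1 for s in posterior])
```

The same loop with the actual electoral-quotient rules would reproduce the paper's quantity of interest; the performance-rate evaluation would compare such probabilities against realized outcomes.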

  13. BAYESIAN IMAGE RESTORATION, USING CONFIGURATIONS

    Directory of Open Access Journals (Sweden)

    Thordis Linda Thorarinsdottir

    2011-05-01

    Full Text Available In this paper, we develop a Bayesian procedure for removing noise from images that can be viewed as noisy realisations of random sets in the plane. The procedure utilises recent advances in configuration theory for noise free random sets, where the probabilities of observing the different boundary configurations are expressed in terms of the mean normal measure of the random set. These probabilities are used as prior probabilities in a Bayesian image restoration approach. Estimation of the remaining parameters in the model is outlined for salt and pepper noise. The inference in the model is discussed in detail for 3 × 3 and 5 × 5 configurations and examples of the performance of the procedure are given.

  14. Robust Learning of High-dimensional Biological Networks with Bayesian Networks

    Science.gov (United States)

    Nägele, Andreas; Dejori, Mathäus; Stetter, Martin

    Structure learning of Bayesian networks applied to gene expression data has become a potentially useful method to estimate interactions between genes. However, the NP-hardness of Bayesian network structure learning renders the reconstruction of the full genetic network with thousands of genes infeasible. Consequently, the maximal network size is usually restricted dramatically to a small set of genes (corresponding to variables in the Bayesian network). Although this feature reduction step makes structure learning computationally tractable, on the downside, the learned structure might be adversely affected due to the introduction of missing genes. Additionally, gene expression data are usually very sparse with respect to the number of samples, i.e., the number of genes is much greater than the number of different observations. Given these problems, learning robust network features from microarray data is a challenging task. This chapter presents several approaches tackling the robustness issue in order to obtain a more reliable estimation of learned network features.

  15. Multiview Bayesian Correlated Component Analysis

    DEFF Research Database (Denmark)

    Kamronn, Simon Due; Poulsen, Andreas Trier; Hansen, Lars Kai

    2015-01-01

    are identical. Here we propose a hierarchical probabilistic model that can infer the level of universality in such multiview data, from completely unrelated representations, corresponding to canonical correlation analysis, to identical representations as in correlated component analysis. This new model, which...... we denote Bayesian correlated component analysis, evaluates favorably against three relevant algorithms in simulated data. A well-established benchmark EEG data set is used to further validate the new model and infer the variability of spatial representations across multiple subjects....

  16. Maxillary reconstruction

    Directory of Open Access Journals (Sweden)

    Brown James

    2007-12-01

    Full Text Available This article aims to discuss the various defects that occur with maxillectomy, with a full review of the literature and discussion of the advantages and disadvantages of the various techniques described. Reconstruction of the maxilla can be relatively simple for the standard low maxillectomy that does not involve the orbital floor (Class 2). In this situation the structure of the face is less damaged and there are multiple reconstructive options for the restoration of the maxilla and dental alveolus. If the maxillectomy includes the orbit (Class 4), then problems involving the eye (enophthalmos, orbital dystopia, ectropion and diplopia) are avoided, which simplifies the reconstruction. Most controversy is associated with the maxillectomy that involves the orbital floor and dental alveolus (Class 3). A case is made for the use of the iliac crest with internal oblique as an ideal option, but there are other methods which may provide a similar result. A multidisciplinary approach to these patients is emphasised, which should include a prosthodontist with special expertise in these defects.

  17. Reconstructing the post-LGM decay of the Eurasian Ice Sheets with Ice Sheet Models; data-model comparison and focus on the Storfjorden (Svalbard) ice stream dynamics history

    Science.gov (United States)

    Petrini, Michele; Kirchner, Nina; Colleoni, Florence; Camerlenghi, Angelo; Rebesco, Michele; Lucchi, Renata G.; Forte, Emanuele; Colucci, Renato R.

    2017-04-01

    Reconstructing the past growth and decay of palaeo-ice sheets is a critical task for better understanding the mechanisms of present and future global climate change. The Last Glacial Maximum (LGM) and the subsequent deglaciation until Pre-Industrial time (PI) represent an excellent testing ground for numerical Ice Sheet Models (ISMs), due to the abundant data available that can be used in an ISM as boundary conditions, forcings or constraints to test the ISMs' results. In our study, we simulate with ISMs the post-LGM decay of the Eurasian Ice Sheets, with a focus on the marine-based Svalbard-Barents Sea-Kara Sea Ice Sheet. In particular, we aim to reconstruct the Storfjorden ice stream dynamics history by comparing the model results with the marine geological data (MSGLs, GZWs, sediment core analysis) available from the area, e.g., Pedrosa et al. 2011, Rebesco et al. 2011, 2013, Lucchi et al. 2013. Two hybrid SIA/SSA ISMs are employed: GRISLI, Ritz et al. 2001, and PSU, Pollard & DeConto 2012. These models differ mainly in the complexity with which grounding-line migration is treated. Climate forcing is interpolated by means of climate indexes between the LGM and PI climates. Regional climate indexes are constructed based on the non-accelerated deglaciation transient experiment carried out with CCSM3, Liu et al. 2009. Indexes representative of the climate evolution over Siberia, Svalbard and Scandinavia are employed. The impact of such a refined representation, as opposed to the common use of the NGRIP δ18O index for transient experiments, is analysed. In this study, the ice-ocean interaction is crucial to reconstruct the Storfjorden ice stream dynamics history. To investigate the sensitivity of the ice shelf/stream retreat to ocean temperature, we allow for a space-time variation of basal melting under the ice shelves by testing two-equation implementations based on Martin et al. 2011 forced with simulated ocean temperature and salinity from the TraCE-21ka coupled

  18. Reconstructing community assembly in time and space reveals enemy escape in a Western Palearctic insect community.

    Science.gov (United States)

    Stone, Graham N; Lohse, Konrad; Nicholls, James A; Fuentes-Utrilla, Pablo; Sinclair, Frazer; Schönrogge, Karsten; Csóka, György; Melika, George; Nieves-Aldrey, Jose-Luis; Pujade-Villar, Juli; Tavakoli, Majide; Askew, Richard R; Hickerson, Michael J

    2012-03-20

    How geographically widespread biological communities assemble remains a major question in ecology. Do parallel population histories allow sustained interactions (such as host-parasite or plant-pollinator) among species, or do discordant histories necessarily interrupt them? Though few empirical data exist, these issues are central to our understanding of multispecies evolutionary dynamics. Here we use hierarchical approximate Bayesian analysis of DNA sequence data for 12 herbivores and 19 parasitoids to reconstruct the assembly of an insect community spanning the Western Palearctic and assess the support for alternative host tracking and ecological sorting hypotheses. We show that assembly occurred primarily by delayed host tracking from a shared eastern origin. Herbivores escaped their enemies for millennia before parasitoid pursuit restored initial associations, with generalist parasitoids no better able to track their hosts than specialists. In contrast, ecological sorting played only a minor role. Substantial turnover in host-parasitoid associations means that coevolution must have been diffuse, probably contributing to the parasitoid generalism seen in this and similar systems. Reintegration of parasitoids after host escape shows these communities to have been unsaturated throughout their history, arguing against major roles for parasitoid niche evolution or competition during community assembly. Copyright © 2012 Elsevier Ltd. All rights reserved.

  19. 12th Brazilian Meeting on Bayesian Statistics

    CERN Document Server

    Louzada, Francisco; Rifo, Laura; Stern, Julio; Lauretto, Marcelo

    2015-01-01

    Through refereed papers, this volume focuses on the foundations of the Bayesian paradigm; their comparison to objectivistic or frequentist Statistics counterparts; and the appropriate application of Bayesian foundations. This research in Bayesian Statistics is applicable to data analysis in biostatistics, clinical trials, law, engineering, and the social sciences. EBEB, the Brazilian Meeting on Bayesian Statistics, is held every two years by the ISBrA, the International Society for Bayesian Analysis, one of the most active chapters of the ISBA. The 12th meeting took place March 10-14, 2014 in Atibaia. Interest in foundations of inductive Statistics has grown recently in accordance with the increasing availability of Bayesian methodological alternatives. Scientists need to deal with the ever more difficult choice of the optimal method to apply to their problem. This volume shows how Bayes can be the answer. The examination and discussion on the foundations work towards the goal of proper application of Bayesia...

  20. Robustness of ancestral sequence reconstruction to phylogenetic uncertainty.

    Science.gov (United States)

    Hanson-Smith, Victor; Kolaczkowski, Bryan; Thornton, Joseph W

    2010-09-01

    Ancestral sequence reconstruction (ASR) is widely used to formulate and test hypotheses about the sequences, functions, and structures of ancient genes. Ancestral sequences are usually inferred from an alignment of extant sequences using a maximum likelihood (ML) phylogenetic algorithm, which calculates the most likely ancestral sequence assuming a probabilistic model of sequence evolution and a specific phylogeny--typically the tree with the ML. The true phylogeny is seldom known with certainty, however. ML methods ignore this uncertainty, whereas Bayesian methods incorporate it by integrating the likelihood of each ancestral state over a distribution of possible trees. It is not known whether Bayesian approaches to phylogenetic uncertainty improve the accuracy of inferred ancestral sequences. Here, we use simulation-based experiments under both simplified and empirically derived conditions to compare the accuracy of ASR carried out using ML and Bayesian approaches. We show that incorporating phylogenetic uncertainty by integrating over topologies very rarely changes the inferred ancestral state and does not improve the accuracy of the reconstructed ancestral sequence. Ancestral state reconstructions are robust to uncertainty about the underlying tree because the conditions that produce phylogenetic uncertainty also make the ancestral state identical across plausible trees; conversely, the conditions under which different phylogenies yield different inferred ancestral states produce little or no ambiguity about the true phylogeny. Our results suggest that ML can produce accurate ASRs, even in the face of phylogenetic uncertainty. Using Bayesian integration to incorporate this uncertainty is neither necessary nor beneficial.

  1. A Bayesian model for binary Markov chains

    Directory of Open Access Journals (Sweden)

    Belkheir Essebbar

    2004-02-01

    This note is concerned with Bayesian estimation of the transition probabilities of a binary Markov chain observed from heterogeneous individuals. The model is based on Jeffreys' prior, which allows the transition probabilities to be correlated. The Bayesian estimator is approximated by means of Markov chain Monte Carlo (MCMC) techniques. The performance of the Bayesian estimates is illustrated by analyzing a small simulated data set.
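
    As a sketch of the estimation problem (a simplification, not the paper's method: the Jeffreys prior here is applied independently to each row of the transition matrix, and Beta-Binomial conjugacy replaces MCMC):

```python
# Simplified sketch: posterior for a 2-state Markov chain's transition
# probabilities under independent Jeffreys Beta(1/2, 1/2) priors on each row.
# The paper's prior additionally allows the rows to be correlated.

def transition_posteriors(chain):
    """Return Beta posterior parameters (a, b) for p01 = P(1|0) and p10 = P(0|1)."""
    counts = {(0, 0): 0, (0, 1): 0, (1, 0): 0, (1, 1): 0}
    for s, t in zip(chain, chain[1:]):
        counts[(s, t)] += 1
    # Jeffreys prior Beta(1/2, 1/2) plus observed transition counts
    post_p01 = (0.5 + counts[(0, 1)], 0.5 + counts[(0, 0)])
    post_p10 = (0.5 + counts[(1, 0)], 0.5 + counts[(1, 1)])
    return post_p01, post_p10

chain = [0, 0, 1, 0, 1, 1, 1, 0, 0, 1]
(p01_a, p01_b), (p10_a, p10_b) = transition_posteriors(chain)
p01_mean = p01_a / (p01_a + p01_b)  # posterior mean of P(0 -> 1)
```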

  2. 3rd Bayesian Young Statisticians Meeting

    CERN Document Server

    Lanzarone, Ettore; Villalobos, Isadora; Mattei, Alessandra

    2017-01-01

    This book is a selection of peer-reviewed contributions presented at the third Bayesian Young Statisticians Meeting, BAYSM 2016, Florence, Italy, June 19-21. The meeting provided a unique opportunity for young researchers, M.S. students, Ph.D. students, and postdocs dealing with Bayesian statistics to connect with the Bayesian community at large, to exchange ideas, and to network with others working in the same field. The contributions develop and apply Bayesian methods in a variety of fields, ranging from the traditional (e.g., biostatistics and reliability) to the most innovative ones (e.g., big data and networks).

  3. Biliary reconstruction options for bile duct stricture in patients with prior Roux-en-Y reconstruction.

    Science.gov (United States)

    Shah, Mihir M; Martin, Benjamin M; Stetler, Jamil L; Patel, Ankit D; Davis, S Scott; Lin, Edward; Sarmiento, Juan M

    2017-09-01

    Comprehensive description with illustrations of the 4 biliary reconstruction options for bile duct injury in patients with history of Roux-en-Y gastric bypass. Copyright © 2017 American Society for Bariatric Surgery. Published by Elsevier Inc. All rights reserved.

  4. Bayesian Methods and Universal Darwinism

    Science.gov (United States)

    Campbell, John

    2009-12-01

    Bayesian methods since the time of Laplace have been understood by their practitioners as closely aligned to the scientific method. Indeed a recent champion of Bayesian methods, E. T. Jaynes, titled his textbook on the subject Probability Theory: the Logic of Science. Many philosophers of science including Karl Popper and Donald Campbell have interpreted the evolution of science as a Darwinian process consisting of a `copy with selective retention' algorithm abstracted from Darwin's theory of Natural Selection. Arguments are presented for an isomorphism between Bayesian methods and Darwinian processes. Universal Darwinism, as the term has been developed by Richard Dawkins, Daniel Dennett and Susan Blackmore, is the collection of scientific theories which explain the creation and evolution of their subject matter as due to the operation of Darwinian processes. These subject matters span the fields of atomic physics, chemistry, biology and the social sciences. The principle of Maximum Entropy states that systems will evolve to states of highest entropy subject to the constraints of scientific law. This principle may be inverted to provide illumination as to the nature of scientific law. Our best cosmological theories suggest the universe contained much less complexity during the period shortly after the Big Bang than it does at present. The scientific subject matter of atomic physics, chemistry, biology and the social sciences has been created since that time. An explanation is proposed for the existence of this subject matter as due to the evolution of constraints in the form of adaptations imposed on Maximum Entropy. It is argued these adaptations were discovered and instantiated through the operations of a succession of Darwinian processes.

  5. Bayesian flood forecasting methods: A review

    Science.gov (United States)

    Han, Shasha; Coulibaly, Paulin

    2017-08-01

    Over the past few decades, floods have been among the most common and widely distributed natural disasters in the world. If floods could be accurately forecasted in advance, their negative impacts could be greatly minimized. It is widely recognized that quantifying and reducing the uncertainty associated with hydrologic forecasts is of great importance for flood estimation and rational decision making. The Bayesian forecasting system (BFS) offers an ideal theoretical framework for uncertainty quantification that can be developed for probabilistic flood forecasting via any deterministic hydrologic model. It provides a suitable theoretical structure, empirically validated models, and reasonable analytic-numerical computation methods, and can be developed into various Bayesian forecasting approaches. This paper presents a comprehensive review of Bayesian forecasting approaches applied to flood forecasting from 1999 to the present. The review starts with an overview of the fundamentals of BFS and recent advances in BFS, followed by BFS applications in river stage forecasting and real-time flood forecasting, then moves to a critical analysis evaluating the advantages and limitations of Bayesian forecasting methods and other predictive uncertainty assessment approaches in flood forecasting, and finally discusses future research directions in Bayesian flood forecasting. Results show that the Bayesian flood forecasting approach is an effective and advanced way to estimate floods: it considers all sources of uncertainty and produces a predictive distribution of the river stage, river discharge, or runoff, thus giving more accurate and reliable flood forecasts. Some emerging Bayesian forecasting methods (e.g. ensemble Bayesian forecasting system, Bayesian multi-model combination) were shown to overcome the limitations of a single model or fixed model weights and effectively reduce predictive uncertainty.
In recent years, various Bayesian flood forecasting approaches have been
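
    The core idea of producing a predictive distribution from a deterministic forecast can be sketched with a minimal Gaussian example (illustrative only; real BFS implementations involve much richer prior and likelihood models, and the variable names and values below are invented):

```python
# Combine a prior on river stage (e.g., climatology) with a deterministic
# model forecast whose error variance was estimated from past performance,
# yielding a full predictive distribution rather than a single number.

def gaussian_update(prior_mean, prior_var, forecast, forecast_err_var):
    """Posterior N(mean, var) for the true stage given the model forecast."""
    w = prior_var / (prior_var + forecast_err_var)
    mean = prior_mean + w * (forecast - prior_mean)
    var = prior_var * forecast_err_var / (prior_var + forecast_err_var)
    return mean, var

mean, var = gaussian_update(prior_mean=3.0, prior_var=1.0,
                            forecast=4.2, forecast_err_var=0.25)
```

    The posterior variance is always smaller than either input variance, which is the sense in which the forecast reduces predictive uncertainty.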

  6. Bayesian inference for Hawkes processes

    DEFF Research Database (Denmark)

    Rasmussen, Jakob Gulddahl

    2013-01-01

    The Hawkes process is a practically and theoretically important class of point processes, but parameter estimation for such a process can pose various problems. In this paper we explore and compare two approaches to Bayesian inference. The first approach is based on the so-called conditional intensity function, while the second approach is based on an underlying clustering and branching structure in the Hawkes process. For practical use, MCMC (Markov chain Monte Carlo) methods are employed. The two approaches are compared numerically using three examples of the Hawkes process.
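
    The conditional intensity function mentioned above has a simple closed form for the common exponential-kernel Hawkes process; the sketch below (conventional parameter names mu, alpha, beta; values invented for illustration) evaluates it:

```python
import math

# Conditional intensity of a Hawkes process with exponential kernel:
#   lambda(t) = mu + sum over past events t_i < t of alpha * exp(-beta * (t - t_i))
def hawkes_intensity(t, events, mu=0.5, alpha=0.8, beta=1.2):
    return mu + sum(alpha * math.exp(-beta * (t - ti)) for ti in events if ti < t)

events = [1.0, 1.5, 3.0]             # past event times
lam = hawkes_intensity(3.2, events)  # intensity shortly after the third event
```

    Each event temporarily raises the intensity, which then decays back toward the baseline mu; this self-excitation is what the clustering/branching representation in the second approach exploits.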

  8. Numeracy, frequency, and Bayesian reasoning

    Directory of Open Access Journals (Sweden)

    Gretchen B. Chapman

    2009-02-01

    Previous research has demonstrated that Bayesian reasoning performance is improved if uncertainty information is presented as natural frequencies rather than single-event probabilities. A questionnaire study of 342 college students replicated this effect but also found that the performance-boosting benefits of the natural frequency presentation occurred primarily for participants who scored high in numeracy. This finding suggests that even comprehension and manipulation of natural frequencies require a certain threshold of numeracy ability, and that the beneficial effects of natural frequency presentation may not be as general as previously believed.
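
    The equivalence of the two formats can be seen in a classic worked example (the mammography-style numbers below are illustrative, not from the article): the natural-frequency version reduces Bayes' rule to counting.

```python
from fractions import Fraction

# Illustrative numbers: P(disease) = 1%, sensitivity 80%, false-positive rate 9.6%.

# Single-event probability format: apply Bayes' rule directly
prior = Fraction(1, 100)
sens = Fraction(80, 100)
false_pos = Fraction(96, 1000)
posterior = (prior * sens) / (prior * sens + (1 - prior) * false_pos)

# Natural-frequency format: imagine 1000 people and count
sick_pos = 1000 * prior * sens                # 8 sick people test positive
healthy_pos = 1000 * (1 - prior) * false_pos  # about 95 healthy people test positive
posterior_freq = sick_pos / (sick_pos + healthy_pos)
```

    Both routes give the same posterior (about 7.8%), but the frequency route only requires comparing two counts.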

  9. Bayesian analysis of magnetic island dynamics

    International Nuclear Information System (INIS)

    Preuss, R.; Maraschek, M.; Zohm, H.; Dose, V.

    2003-01-01

    We examine a first-order differential equation in time used to describe magnetic islands in magnetically confined plasmas. The free parameters of this equation are obtained by employing Bayesian probability theory. Additionally, a typical Bayesian change-point problem is solved in the process of obtaining the data.
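
    A generic Bayesian change-point computation (a simplified stand-in, not the paper's plasma model: Gaussian data with known unit variance, two fixed candidate means, and a uniform prior on the change location) looks like this:

```python
import math

# Posterior over the index k at which the mean of Gaussian data (sigma = 1)
# jumps from m1 to m2, under a uniform prior on k.
def changepoint_posterior(data, m1=0.0, m2=2.0):
    logpost = []
    for k in range(1, len(data)):  # change after position k
        ll = sum(-0.5 * (x - m1) ** 2 for x in data[:k])
        ll += sum(-0.5 * (x - m2) ** 2 for x in data[k:])
        logpost.append(ll)
    mx = max(logpost)                               # stabilize the exponentials
    w = [math.exp(lp - mx) for lp in logpost]
    z = sum(w)
    return [wi / z for wi in w]                     # posterior over k = 1..len(data)-1

data = [0.1, -0.2, 0.0, 1.9, 2.1, 2.2]
post = changepoint_posterior(data)
best_k = post.index(max(post)) + 1  # most probable change location
```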

  10. Learning dynamic Bayesian networks with mixed variables

    DEFF Research Database (Denmark)

    Bøttcher, Susanne Gammelgaard

    This paper considers dynamic Bayesian networks for discrete and continuous variables. We treat only the case where the distribution of the variables is conditional Gaussian. We show how to learn the parameters and structure of a dynamic Bayesian network and also how the Markov order can be learned...

  11. Using Bayesian Networks to Improve Knowledge Assessment

    Science.gov (United States)

    Millan, Eva; Descalco, Luis; Castillo, Gladys; Oliveira, Paula; Diogo, Sandra

    2013-01-01

    In this paper, we describe the integration and evaluation of an existing generic Bayesian student model (GBSM) into an existing computerized testing system within the Mathematics Education Project (PmatE--Projecto Matematica Ensino) of the University of Aveiro. This generic Bayesian student model had been previously evaluated with simulated…

  12. Using Bayesian belief networks in adaptive management.

    Science.gov (United States)

    J.B. Nyberg; B.G. Marcot; R. Sulyma

    2006-01-01

    Bayesian belief and decision networks are relatively new modeling methods that are especially well suited to adaptive-management applications, but they appear not to have been widely used in adaptive management to date. Bayesian belief networks (BBNs) can serve many purposes for practitioners of adaptive management, from illustrating system relations conceptually to...

  13. Bayesian Decision Theoretical Framework for Clustering

    Science.gov (United States)

    Chen, Mo

    2011-01-01

    In this thesis, we establish a novel probabilistic framework for the data clustering problem from the perspective of Bayesian decision theory. The Bayesian decision theory view justifies the important questions: what is a cluster and what a clustering algorithm should optimize. We prove that the spectral clustering (to be specific, the…

  14. Robust Bayesian detection of unmodelled bursts

    International Nuclear Information System (INIS)

    Searle, Antony C; Sutton, Patrick J; Tinto, Massimo; Woan, Graham

    2008-01-01

    We develop a Bayesian treatment of the problem of detecting unmodelled gravitational wave bursts using the new global network of interferometric detectors. We also compare this Bayesian treatment with existing coherent methods, and demonstrate that the existing methods make implicit assumptions on the distribution of signals that make them sub-optimal for realistic signal populations

  15. Bayesian models: A statistical primer for ecologists

    Science.gov (United States)

    Hobbs, N. Thompson; Hooten, Mevin B.

    2015-01-01

    Bayesian modeling has become an indispensable tool for ecological research because it is uniquely suited to deal with complexity in a statistically coherent way. This textbook provides a comprehensive and accessible introduction to the latest Bayesian methods—in language ecologists can understand. Unlike other books on the subject, this one emphasizes the principles behind the computations, giving ecologists a big-picture understanding of how to implement this powerful statistical approach. Bayesian Models is an essential primer for non-statisticians. It begins with a definition of probability and develops a step-by-step sequence of connected ideas, including basic distribution theory, network diagrams, hierarchical models, Markov chain Monte Carlo, and inference from single and multiple models. This unique book places less emphasis on computer coding, favoring instead a concise presentation of the mathematical statistics needed to understand how and why Bayesian analysis works. It also explains how to write out properly formulated hierarchical Bayesian models and use them in computing, research papers, and proposals. This primer enables ecologists to understand the statistical principles behind Bayesian modeling and apply them to research, teaching, policy, and management. Presents the mathematical and statistical foundations of Bayesian modeling in language accessible to non-statisticians. Covers basic distribution theory, network diagrams, hierarchical models, Markov chain Monte Carlo, and more. Deemphasizes computer coding in favor of basic principles. Explains how to write out properly factored statistical expressions representing Bayesian models.

  16. Particle identification in ALICE: a Bayesian approach

    NARCIS (Netherlands)

    Adam, J.; Adamova, D.; Aggarwal, M. M.; Rinella, G. Aglieri; Agnello, M.; Agrawal, N.; Ahammed, Z.; Ahn, S. U.; Aiola, S.; Akindinov, A.; Alam, S. N.; Albuquerque, D. S. D.; Aleksandrov, D.; Alessandro, B.; Alexandre, D.; Alfaro Molina, R.; Alici, A.; Alkin, A.; Almaraz, J. R. M.; Alme, J.; Alt, T.; Altinpinar, S.; Altsybeev, I.; Alves Garcia Prado, C.; Andrei, C.; Andronic, A.; Anguelov, V.; Anticic, T.; Antinori, F.; Antonioli, P.; Aphecetche, L.; Appelshaeuser, H.; Arcelli, S.; Arnaldi, R.; Arnold, O. W.; Arsene, I. C.; Arslandok, M.; Audurier, B.; Augustinus, A.; Averbeck, R.; Azmi, M. D.; Badala, A.; Baek, Y. W.; Bagnasco, S.; Bailhache, R.; Bala, R.; Balasubramanian, S.; Baldisseri, A.; Baral, R. C.; Barbano, A. M.; Barbera, R.; Barile, F.; Barnafoeldi, G. G.; Barnby, L. S.; Barret, V.; Bartalini, P.; Barth, K.; Bartke, J.; Bartsch, E.; Basile, M.; Bastid, N.; Bathen, B.; Batigne, G.; Camejo, A. Batista; Batyunya, B.; Batzing, P. C.; Bearden, I. G.; Beck, H.; Bedda, C.; Behera, N. K.; Belikov, I.; Bellini, F.; Bello Martinez, H.; Bellwied, R.; Belmont, R.; Belmont-Moreno, E.; Belyaev, V.; Benacek, P.; Bencedi, G.; Beole, S.; Berceanu, I.; Bercuci, A.; Berdnikov, Y.; Berenyi, D.; Bertens, R. A.; Berzano, D.; Betev, L.; Bhasin, A.; Bhat, I. R.; Bhati, A. K.; Bhattacharjee, B.; Bhom, J.; Bianchi, L.; Bianchi, N.; Bianchin, C.; Bielcik, J.; Bielcikova, J.; Bilandzic, A.; Biro, G.; Biswas, R.; Biswas, S.; Bjelogrlic, S.; Blair, J. T.; Blau, D.; Blume, C.; Bock, F.; Bogdanov, A.; Boggild, H.; Boldizsar, L.; Bombara, M.; Book, J.; Borel, H.; Borissov, A.; Borri, M.; Bossu, F.; Botta, E.; Bourjau, C.; Braun-Munzinger, P.; Bregant, M.; Breitner, T.; Broker, T. A.; Browning, T. A.; Broz, M.; Brucken, E. J.; Bruna, E.; Bruno, G. E.; Budnikov, D.; Buesching, H.; Bufalino, S.; Buncic, P.; Busch, O.; Buthelezi, Z.; Butt, J. B.; Buxton, J. T.; Cabala, J.; Caffarri, D.; Cai, X.; Caines, H.; Diaz, L. 
Calero; Caliva, A.; Calvo Villar, E.; Camerini, P.; Carena, F.; Carena, W.; Carnesecchi, F.; Castellanos, J. Castillo; Castro, A. J.; Casula, E. A. R.; Sanchez, C. Ceballos; Cepila, J.; Cerello, P.; Cerkala, J.; Chang, B.; Chapeland, S.; Chartier, M.; Charvet, J. L.; Chattopadhyay, S.; Chattopadhyay, S.; Chauvin, A.; Chelnokov, V.; Cherney, M.; Cheshkov, C.; Cheynis, B.; Barroso, V. Chibante; Chinellato, D. D.; Cho, S.; Chochula, P.; Choi, K.; Chojnacki, M.; Choudhury, S.; Christakoglou, P.; Christensen, C. H.; Christiansen, P.; Chujo, T.; Cicalo, C.; Cifarelli, L.; Cindolo, F.; Cleymans, J.; Colamaria, F.; Colella, D.; Collu, A.; Colocci, M.; Balbastre, G. Conesa; del Valle, Z. Conesa; Connors, M. E.; Contreras, J. G.; Cormier, T. M.; Morales, Y. Corrales; Cortes Maldonado, I.; Cortese, P.; Cosentino, M. R.; Costa, F.; Crochet, P.; Cruz Albino, R.; Cuautle, E.; Cunqueiro, L.; Dahms, T.; Dainese, A.; Danisch, M. C.; Danu, A.; Das, I.; Das, S.; Dash, A.; Dash, S.; De, S.; De Caro, A.; de Cataldo, G.; de Conti, C.; de Cuveland, J.; De Falco, A.; De Gruttola, D.; De Marco, N.; De Pasquale, S.; Deisting, A.; Deloff, A.; Denes, E.; Deplano, C.; Dhankher, P.; Di Bari, D.; Di Mauro, A.; Di Nezza, P.; Corchero, M. A. Diaz; Dietel, T.; Dillenseger, P.; Divia, R.; Djuvsland, O.; Dobrin, A.; Gimenez, D. Domenicis; Doenigus, B.; Dordic, O.; Drozhzhova, T.; Dubey, A. K.; Dubla, A.; Ducroux, L.; Dupieux, P.; Ehlers, R. J.; Elia, D.; Endress, E.; Engel, H.; Epple, E.; Erazmus, B.; Erdemir, I.; Erhardt, F.; Espagnon, B.; Estienne, M.; Esumi, S.; Eum, J.; Evans, D.; Evdokimov, S.; Eyyubova, G.; Fabbietti, L.; Fabris, D.; Faivre, J.; Fantoni, A.; Fasel, M.; Feldkamp, L.; Feliciello, A.; Feofilov, G.; Ferencei, J.; Fernandez Tellez, A.; Ferreiro, E. G.; Ferretti, A.; Festanti, A.; Feuillard, V. J. G.; Figiel, J.; Figueredo, M. A. S.; Filchagin, S.; Finogeev, D.; Fionda, F. M.; Fiore, E. M.; Fleck, M. 
G.; Floris, M.; Foertsch, S.; Foka, P.; Fokin, S.; Fragiacomo, E.; Francescon, A.; Frankenfeld, U.; Fronze, G. G.; Fuchs, U.; Furget, C.; Furs, A.; Girard, M. Fusco; Gaardhoje, J. J.; Gagliardi, M.; Gago, A. M.; Gallio, M.; Gangadharan, D. R.; Ganoti, P.; Gao, C.; Garabatos, C.; Garcia-Solis, E.; Gargiulo, C.; Gasik, P.; Gauger, E. F.; Germain, M.; Gheata, A.; Gheata, M.; Gianotti, P.; Giubellino, P.; Giubilato, P.; Gladysz-Dziadus, E.; Glaessel, P.; Gomez Coral, D. M.; Ramirez, A. Gomez; Gonzalez, A. S.; Gonzalez, V.; Gonzalez-Zamora, P.; Gorbunov, S.; Goerlich, L.; Gotovac, S.; Grabski, V.; Grachov, O. A.; Graczykowski, L. K.; Graham, K. L.; Grelli, A.; Grigoras, A.; Grigoras, C.; Grigoriev, V.; Grigoryan, A.; Grigoryan, S.; Grinyov, B.; Grion, N.; Gronefeld, J. M.; Grosse-Oetringhaus, J. F.; Grosso, R.; Guber, F.; Guernane, R.; Guerzoni, B.; Gulbrandsen, K.; Gunji, T.; Gupta, A.; Haake, R.; Haaland, O.; Hadjidakis, C.; Haiduc, M.; Hamagaki, H.; Hamar, G.; Hamon, J. C.; Harris, J. W.; Harton, A.; Hatzifotiadou, D.; Hayashi, S.; Heckel, S. T.; Hellbaer, E.; Helstrup, H.; Herghelegiu, A.; Herrera Corral, G.; Hess, B. A.; Hetland, K. F.; Hillemanns, H.; Hippolyte, B.; Horak, D.; Hosokawa, R.; Hristov, P.; Humanic, T. J.; Hussain, N.; Hussain, T.; Hutter, D.; Hwang, D. S.; Ilkaev, R.; Inaba, M.; Incani, E.; Ippolitov, M.; Irfan, M.; Ivanov, M.; Ivanov, V.; Izucheev, V.; Jacazio, N.; Jadhav, M. B.; Jadlovska, S.; Jadlovsky, J.; Jahnke, C.; Jakubowska, M. J.; Jang, H. J.; Janik, M. A.; Jayarathna, P. H. S. Y.; Jena, C.; Jena, S.; Bustamante, R. T. Jimenez; Jones, P. G.; Jusko, A.; Kalinak, P.; Kalweit, A.; Kamin, J.; Kaplin, V.; Kar, S.; Uysal, A. Karasu; Karavichev, O.; Karavicheva, T.; Karayan, L.; Karpechev, E.; Kebschull, U.; Keidel, R.; Keijdener, D. L. D.; Keil, M.; Khan, M. Mohisin; Khan, P.; Khan, S. A.; Khanzadeev, A.; Kharlov, Y.; Kileng, B.; Kim, D. W.; Kim, D. J.; Kim, D.; Kim, J. 
S.; Kim, M.; Kim, T.; Kirsch, S.; Kisel, I.; Kiselev, S.; Kisiel, A.; Kiss, G.; Klay, J. L.; Klein, C.; Klein-Boesing, C.; Klewin, S.; Kluge, A.; Knichel, M. L.; Knospe, A. G.; Kobdaj, C.; Kofarago, M.; Kollegger, T.; Kolojvari, A.; Kondratiev, V.; Kondratyeva, N.; Kondratyuk, E.; Konevskikh, A.; Kopcik, M.; Kostarakis, P.; Kour, M.; Kouzinopoulos, C.; Kovalenko, O.; Kovalenko, V.; Kowalski, M.; Meethaleveedu, G. Koyithatta; Kralik, I.; Kravcakova, A.; Krivda, M.; Krizek, F.; Kryshen, E.; Krzewicki, M.; Kubera, A. M.; Kucera, V.; Kuijer, P. G.; Kumar, J.; Kumar, L.; Kumar, S.; Kurashvili, P.; Kurepin, A.; Kurepin, A. B.; Kuryakin, A.; Kweon, M. J.; Kwon, Y.; La Pointe, S. L.; La Rocca, P.; Ladron de Guevara, P.; Lagana Fernandes, C.; Lakomov, I.; Langoy, R.; Lara, C.; Lardeux, A.; Lattuca, A.; Laudi, E.; Lea, R.; Leardini, L.; Lee, G. R.; Lee, S.; Lehas, F.; Lemmon, R. C.; Lenti, V.; Leogrande, E.; Monzon, I. Leon; Leon Vargas, H.; Leoncino, M.; Levai, P.; Lien, J.; Lietava, R.; Lindal, S.; Lindenstruth, V.; Lippmann, C.; Lisa, M. A.; Ljunggren, H. M.; Lodato, D. F.; Loenne, P. I.; Loginov, V.; Loizides, C.; Lopez, X.; Torres, E. Lopez; Lowe, A.; Luettig, P.; Lunardon, M.; Luparello, G.; Lutz, T. H.; Maevskaya, A.; Mager, M.; Mahajan, S.; Mahmood, S. M.; Maire, A.; Majka, R. D.; Malaev, M.; Maldonado Cervantes, I.; Malinina, L.; Mal'Kevich, D.; Malzacher, P.; Mamonov, A.; Manko, V.; Manso, F.; Manzari, V.; Marchisone, M.; Mares, J.; Margagliotti, G. V.; Margotti, A.; Margutti, J.; Marin, A.; Markert, C.; Marquard, M.; Martin, N. A.; Blanco, J. Martin; Martinengo, P.; Martinez, M. I.; Garcia, G. Martinez; Pedreira, M. Martinez; Mas, A.; Masciocchi, S.; Masera, M.; Masoni, A.; Mastroserio, A.; Matyja, A.; Mayer, C.; Mazer, J.; Mazzoni, M. A.; Mcdonald, D.; Meddi, F.; Melikyan, Y.; Menchaca-Rocha, A.; Meninno, E.; Perez, J. Mercado; Meres, M.; Miake, Y.; Mieskolainen, M. M.; Mikhaylov, K.; Milano, L.; Milosevic, J.; Mischke, A.; Mishra, A. 
N.; Miskowiec, D.; Mitra, J.; Mitu, C. M.; Mohammadi, N.; Mohanty, B.; Molnar, L.; Montano Zetina, L.; Montes, E.; De Godoy, D. A. Moreira; Moreno, L. A. P.; Moretto, S.; Morreale, A.; Morsch, A.; Muccifora, V.; Mudnic, E.; Muehlheim, D.; Muhuri, S.; Mukherjee, M.; Mulligan, J. D.; Munhoz, M. G.; Munzer, R. H.; Murakami, H.; Murray, S.; Musa, L.; Musinsky, J.; Naik, B.; Nair, R.; Nandi, B. K.; Nania, R.; Nappi, E.; Naru, M. U.; Natal da Luz, H.; Nattrass, C.; Navarro, S. R.; Nayak, K.; Nayak, R.; Nayak, T. K.; Nazarenko, S.; Nedosekin, A.; Nellen, L.; Ng, F.; Nicassio, M.; Niculescu, M.; Niedziela, J.; Nielsen, B. S.; Nikolaev, S.; Nikulin, S.; Nikulin, V.; Noferini, F.; Nomokonov, P.; Nooren, G.; Noris, J. C. C.; Norman, J.; Nyanin, A.; Nystrand, J.; Oeschler, H.; Oh, S.; Oh, S. K.; Ohlson, A.; Okatan, A.; Okubo, T.; Olah, L.; Oleniacz, J.; Oliveira Da Silva, A. C.; Oliver, M. H.; Onderwaater, J.; Oppedisano, C.; Orava, R.; Oravec, M.; Ortiz Velasquez, A.; Oskarsson, A.; Otwinowski, J.; Oyama, K.; Ozdemir, M.; Pachmayer, Y.; Pagano, D.; Pagano, P.; Paic, G.; Pal, S. K.; Pan, J.; Papikyan, V.; Pappalardo, G. S.; Pareek, P.; Park, W. J.; Parmar, S.; Passfeld, A.; Paticchio, V.; Patra, R. N.; Paul, B.; Pei, H.; Peitzmann, T.; Da Costa, H. Pereira; Peresunko, D.; Lara, C. E. Perez; Lezama, E. Perez; Peskov, V.; Pestov, Y.; Petracek, V.; Petrov, V.; Petrovici, M.; Petta, C.; Piano, S.; Pikna, M.; Pillot, P.; Pimentel, L. O. D. L.; Pinazza, O.; Pinsky, L.; Piyarathna, D. B.; Ploskon, M.; Planinic, M.; Pluta, J.; Pochybova, S.; Podesta-Lerma, P. L. M.; Poghosyan, M. G.; Polichtchouk, B.; Poljak, N.; Poonsawat, W.; Pop, A.; Porteboeuf-Houssais, S.; Porter, J.; Pospisil, J.; Prasad, S. K.; Preghenella, R.; Prino, F.; Pruneau, C. A.; Pshenichnov, I.; Puccio, M.; Puddu, G.; Pujahari, P.; Punin, V.; Putschke, J.; Qvigstad, H.; Rachevski, A.; Raha, S.; Rajput, S.; Rak, J.; Rakotozafindrabe, A.; Ramello, L.; Rami, F.; Raniwala, R.; Raniwala, S.; Raesaenen, S. S.; Rascanu, B. 
T.; Rathee, D.; Read, K. F.; Redlich, K.; Reed, R. J.; Reichelt, P.; Reidt, F.; Ren, X.; Renfordt, R.; Reolon, A. R.; Reshetin, A.; Reygers, K.; Riabov, V.; Ricci, R. A.; Richert, T.; Richter, M.; Riedler, P.; Riegler, W.; Riggi, F.; Ristea, C.; Rocco, E.; Rodriguez Cahuantzi, M.; Manso, A. Rodriguez; Roed, K.; Rogochaya, E.; Rohr, D.; Roehrich, D.; Ronchetti, F.; Ronflette, L.; Rosnet, P.; Rossi, A.; Roukoutakis, F.; Roy, A.; Roy, C.; Roy, P.; Montero, A. J. Rubio; Rui, R.; Russo, R.; Ryabinkin, E.; Ryabov, Y.; Rybicki, A.; Saarinen, S.; Sadhu, S.; Sadovsky, S.; Safarik, K.; Sahlmuller, B.; Sahoo, P.; Sahoo, R.; Sahoo, S.; Sahu, P. K.; Saini, J.; Sakai, S.; Saleh, M. A.; Salzwedel, J.; Sambyal, S.; Samsonov, V.; Sandor, L.; Sandoval, A.; Sano, M.; Sarkar, D.; Sarkar, N.; Sarma, P.; Scapparone, E.; Scarlassara, F.; Schiaua, C.; Schicker, R.; Schmidt, C.; Schmidt, H. R.; Schuchmann, S.; Schukraft, J.; Schulc, M.; Schutz, Y.; Schwarz, K.; Schweda, K.; Scioli, G.; Scomparin, E.; Scott, R.; Sefcik, M.; Seger, J. E.; Sekiguchi, Y.; Sekihata, D.; Selyuzhenkov, I.; Senosi, K.; Senyukov, S.; Serradilla, E.; Sevcenco, A.; Shabanov, A.; Shabetai, A.; Shadura, O.; Shahoyan, R.; Shahzad, M. I.; Shangaraev, A.; Sharma, M.; Sharma, M.; Sharma, N.; Sheikh, A. I.; Shigaki, K.; Shou, Q.; Shtejer, K.; Sibiriak, Y.; Siddhanta, S.; Sielewicz, K. M.; Siemiarczuk, T.; Silvermyr, D.; Silvestre, C.; Simatovic, G.; Simonetti, G.; Singaraju, R.; Singh, R.; Singha, S.; Singhal, V.; Sinha, B. C.; Sinha, T.; Sitar, B.; Sitta, M.; Skaali, T. B.; Slupecki, M.; Smirnov, N.; Snellings, R. J. M.; Snellman, T. W.; Song, J.; Song, M.; Song, Z.; Soramel, F.; Sorensen, S.; de Souza, R. D.; Sozzi, F.; Spacek, M.; Spiriti, E.; Sputowska, I.; Spyropoulou-Stassinaki, M.; Stachel, J.; Stan, I.; Stankus, P.; Stenlund, E.; Steyn, G.; Stiller, J. H.; Stocco, D.; Strmen, P.; Suaide, A. A. 
P.; Sugitate, T.; Suire, C.; Suleymanov, M.; Suljic, M.; Sultanov, R.; Sumbera, M.; Sumowidagdo, S.; Szabo, A.; Szanto de Toledo, A.; Szarka, I.; Szczepankiewicz, A.; Szymanski, M.; Tabassam, U.; Takahashi, J.; Tambave, G. J.; Tanaka, N.; Tarhini, M.; Tariq, M.; Tarzila, M. G.; Tauro, A.; Tejeda Munoz, G.; Telesca, A.; Terasaki, K.; Terrevoli, C.; Teyssier, B.; Thaeder, J.; Thakur, D.; Thomas, D.; Tieulent, R.; Timmins, A. R.; Toia, A.; Trogolo, S.; Trombetta, G.; Trubnikov, V.; Trzaska, W. H.; Tsuji, T.; Tumkin, A.; Turrisi, R.; Tveter, T. S.; Ullaland, K.; Uras, A.; Usai, G. L.; Utrobicic, A.; Vala, M.; Palomo, L. Valencia; Vallero, S.; Van Der Maarel, J.; Van Hoorne, J. W.; van Leeuwen, M.; Vanat, T.; Vyvre, P. Vande; Varga, D.; Vargas, A.; Vargyas, M.; Varma, R.; Vasileiou, M.; Vasiliev, A.; Vauthier, A.; Vechernin, V.; Veen, A. M.; Veldhoen, M.; Velure, A.; Vercellin, E.; Vergara Limon, S.; Vernet, R.; Verweij, M.; Vickovic, L.; Viesti, G.; Viinikainen, J.; Vilakazi, Z.; Baillie, O. Villalobos; Villatoro Tello, A.; Vinogradov, A.; Vinogradov, L.; Vinogradov, Y.; Virgili, T.; Vislavicius, V.; Viyogi, Y. P.; Vodopyanov, A.; Voelkl, M. A.; Voloshin, K.; Voloshin, S. A.; Volpe, G.; von Haller, B.; Vorobyev, I.; Vranic, D.; Vrlakova, J.; Vulpescu, B.; Wagner, B.; Wagner, J.; Wang, H.; Watanabe, D.; Watanabe, Y.; Weiser, D. F.; Westerhoff, U.; Whitehead, A. M.; Wiechula, J.; Wikne, J.; Wilk, G.; Wilkinson, J.; Williams, M. C. S.; Windelband, B.; Winn, M.; Yang, H.; Yano, S.; Yasin, Z.; Yokoyama, H.; Yoo, I. -K.; Yoon, J. H.; Yurchenko, V.; Yushmanov, I.; Zaborowska, A.; Zaccolo, V.; Zaman, A.; Zampolli, C.; Zanoli, H. J. C.; Zaporozhets, S.; Zardoshti, N.; Zarochentsev, A.; Zavada, P.; Zaviyalov, N.; Zbroszczyk, H.; Zgura, I. S.; Zhalov, M.; Zhang, C.; Zhao, C.; Zhigareva, N.; Zhou, Y.; Zhou, Z.; Zhu, H.; Zichichi, A.; Zimmermann, A.; Zimmermann, M. B.; Zinovjev, G.; Zyzak, M.

    2016-01-01

    We present a Bayesian approach to particle identification (PID) within the ALICE experiment. The aim is to more effectively combine the particle identification capabilities of its various detectors. After a brief explanation of the adopted methodology and formalism, the performance of the Bayesian
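
    The combinatorial core of such a Bayesian PID approach can be sketched as follows (the detector names, likelihood values, and priors below are invented for illustration; ALICE's actual detector response functions are far more detailed):

```python
# Per-detector likelihoods P(signal | species) for each species hypothesis are
# multiplied together, combined with priors (expected particle abundances),
# and normalized into posterior species probabilities.
species = ["pion", "kaon", "proton"]
priors = {"pion": 0.7, "kaon": 0.2, "proton": 0.1}

# Hypothetical likelihoods from two detectors (stand-ins for, e.g., TPC and TOF)
tpc = {"pion": 0.30, "kaon": 0.05, "proton": 0.01}
tof = {"pion": 0.25, "kaon": 0.10, "proton": 0.02}

unnorm = {s: priors[s] * tpc[s] * tof[s] for s in species}
total = sum(unnorm.values())
posterior = {s: unnorm[s] / total for s in species}
```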

  17. Advances in Bayesian Modeling in Educational Research

    Science.gov (United States)

    Levy, Roy

    2016-01-01

    In this article, I provide a conceptually oriented overview of Bayesian approaches to statistical inference and contrast them with frequentist approaches that currently dominate conventional practice in educational research. The features and advantages of Bayesian approaches are illustrated with examples spanning several statistical modeling…

  18. Compiling Relational Bayesian Networks for Exact Inference

    DEFF Research Database (Denmark)

    Jaeger, Manfred; Darwiche, Adnan; Chavira, Mark

    2006-01-01

    We describe in this paper a system for exact inference with relational Bayesian networks as defined in the publicly available PRIMULA tool. The system is based on compiling propositional instances of relational Bayesian networks into arithmetic circuits and then performing online inference...

  19. Compiling Relational Bayesian Networks for Exact Inference

    DEFF Research Database (Denmark)

    Jaeger, Manfred; Chavira, Mark; Darwiche, Adnan

    2004-01-01

    We describe a system for exact inference with relational Bayesian networks as defined in the publicly available PRIMULA tool. The system is based on compiling propositional instances of relational Bayesian networks into arithmetic circuits and then performing online inference by evaluating...
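
    The idea of reducing inference to arithmetic-circuit evaluation can be illustrated on a toy (non-relational) network A -> B. This is a sketch of the general network-polynomial technique, not PRIMULA's actual compilation; the probability tables are invented.

```python
# Inference in the network A -> B as evaluation of its "network polynomial":
#   sum over a, b of lambda_a * theta_a * lambda_b * theta_{b|a}
# Evidence is asserted by setting indicator variables lambda to 0/1; the
# polynomial's value is then the probability of that evidence.
theta_a = {0: 0.6, 1: 0.4}                    # P(A)
theta_b_given_a = {(0, 0): 0.9, (0, 1): 0.1,  # P(B | A)
                   (1, 0): 0.2, (1, 1): 0.8}

def network_poly(lam_a, lam_b):
    return sum(lam_a[a] * theta_a[a] * lam_b[b] * theta_b_given_a[(a, b)]
               for a in (0, 1) for b in (0, 1))

# P(B = 1): clamp the indicators for B, leave A unobserved
p_b1 = network_poly({0: 1, 1: 1}, {0: 0, 1: 1})
```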

  20. BELM: Bayesian extreme learning machine.

    Science.gov (United States)

    Soria-Olivas, Emilio; Gómez-Sanchis, Juan; Martín, José D; Vila-Francés, Joan; Martínez, Marcelino; Magdalena, José R; Serrano, Antonio J

    2011-03-01

    The theory of the extreme learning machine (ELM) has become very popular over the last few years. ELM is a new approach for learning the parameters of the hidden layers of a multilayer neural network (such as the multilayer perceptron or the radial basis function neural network). Its main advantage is the lower computational cost, which is especially relevant when dealing with many patterns defined in a high-dimensional space. This brief proposes a Bayesian approach to ELM, which presents some advantages over other approaches: it allows the introduction of a priori knowledge; obtains confidence intervals (CIs) without the need to apply computationally intensive methods, e.g., bootstrap; and presents high generalization capabilities. Bayesian ELM is benchmarked against classical ELM on several artificial and real datasets that are widely used for the evaluation of machine learning algorithms. The achieved results show that the proposed approach produces competitive accuracy with some additional advantages, namely, automatic production of CIs, reduced probability of model overfitting, and use of a priori knowledge.
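
    A minimal sketch of the idea (hypothetical hyperparameters and synthetic data; not the paper's exact formulation): fix random hidden-layer weights as in a standard ELM, then fit the output weights by Bayesian linear regression, which yields predictive variances, and hence CIs, in closed form.

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.linspace(-1, 1, 40)[:, None]
y = np.sin(3 * X[:, 0]) + 0.1 * rng.standard_normal(40)

W = rng.standard_normal((1, 20))         # random input-to-hidden weights (fixed)
b = rng.standard_normal(20)
H = np.tanh(X @ W + b)                   # hidden-layer design matrix

alpha, beta = 1.0, 100.0                 # prior precision, noise precision (assumed)
A = alpha * np.eye(20) + beta * H.T @ H  # posterior precision of output weights
m = beta * np.linalg.solve(A, H.T @ y)   # posterior mean of output weights

h_new = np.tanh(np.array([[0.3]]) @ W + b)
pred_mean = (h_new @ m).item()
pred_var = (1 / beta + h_new @ np.linalg.solve(A, h_new.T)).item()
```

    The predictive variance is the noise floor 1/beta plus a term reflecting uncertainty in the output weights, which is exactly what bootstrap-free CIs are built from.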

  1. BAYESIAN BICLUSTERING FOR PATIENT STRATIFICATION.

    Science.gov (United States)

    Khakabimamaghani, Sahand; Ester, Martin

    2016-01-01

    The move from Empirical Medicine towards Personalized Medicine has attracted attention to Stratified Medicine (SM). Some methods for patient stratification, the central task of SM, are provided in the literature; however, there are still significant open issues. First, it is still unclear whether integrating different datatypes will help in detecting disease subtypes more accurately and, if not, which datatype(s) are most useful for this task. Second, it is not clear how we can compare different methods of patient stratification. Third, as most of the proposed stratification methods are deterministic, there is a need to investigate the potential benefits of applying probabilistic methods. To address these issues, we introduce a novel integrative Bayesian biclustering method, called B2PS, for patient stratification and propose methods for evaluating the results. Our experimental results demonstrate the superiority of B2PS over a popular state-of-the-art method and the benefits of Bayesian approaches. Our results agree with the intuition that transcriptomic data forms a better basis for patient stratification than genomic data.

  2. Bayesian Nonparametric Longitudinal Data Analysis.

    Science.gov (United States)

    Quintana, Fernando A; Johnson, Wesley O; Waetjen, Elaine; Gold, Ellen

    2016-01-01

    Practical Bayesian nonparametric methods have been developed across a wide variety of contexts. Here, we develop a novel statistical model that generalizes standard mixed models for longitudinal data to include flexible mean functions as well as combined compound symmetry (CS) and autoregressive (AR) covariance structures. AR structure is often specified through the use of a Gaussian process (GP) with covariance functions that allow longitudinal data to be more correlated if they are observed closer in time than if they are observed farther apart. We allow for AR structure by considering a broader class of models that incorporates a Dirichlet Process Mixture (DPM) over the covariance parameters of the GP. We are able to take advantage of modern Bayesian statistical methods in making full predictive inferences about characteristics of longitudinal profiles and their differences across covariate combinations. We also take advantage of the generality of our model, which provides for estimation of a variety of covariance structures. We observe that models that fail to incorporate CS or AR structure can result in very poor estimation of a covariance or correlation matrix. In our illustration using hormone data observed on women through the menopausal transition, biology dictates the use of a generalized family of sigmoid functions as a model for time trends across subpopulation categories.
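
    The combined CS and AR covariance structure can be sketched directly (the parameter values below are hypothetical; the paper additionally places a Dirichlet process mixture over such covariance parameters rather than fixing them):

```python
import numpy as np

# cov(t, s) = sigma_cs^2 + sigma_ar^2 * exp(-|t - s| / rho), plus measurement
# noise on the diagonal: a constant CS floor shared by all time points, and an
# AR/GP component that decays with the time gap.
def cs_ar_cov(times, sig_cs=0.5, sig_ar=1.0, rho=2.0, sig_noise=0.1):
    t = np.asarray(times, dtype=float)
    d = np.abs(t[:, None] - t[None, :])
    return sig_cs**2 + sig_ar**2 * np.exp(-d / rho) + sig_noise**2 * np.eye(len(t))

C = cs_ar_cov([0.0, 1.0, 2.0, 5.0])
```

    Observations close in time are the most correlated, while even distant observations retain the CS floor of sig_cs^2, which is the qualitative behavior the model is built to capture.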

  3. PET reconstruction

    International Nuclear Information System (INIS)

    O'Sullivan, F.; Pawitan, Y.; Harrison, R.L.; Lewellen, T.K.

    1990-01-01

In statistical terms, filtered backprojection can be viewed as smoothed Least Squares (LS). In this paper, the authors report on improving LS resolution by incorporating locally adaptive smoothers, imposing positivity, and using statistical methods for optimal selection of the resolution parameter. The resulting algorithm has high computational efficiency relative to more elaborate Maximum Likelihood (ML) type techniques (i.e. EM with sieves). Practical aspects of the procedure are discussed in the context of PET, and illustrations with computer-simulated and real tomograph data are presented. The relative recovery coefficients for a 9 mm sphere in a computer-simulated hot-spot phantom range from 0.3 to 0.6 as the number of counts ranges from 10,000 to 640,000. The authors also present results illustrating the relative efficacy of ML and LS reconstruction techniques.

  4. 2nd Bayesian Young Statisticians Meeting

    CERN Document Server

    Bitto, Angela; Kastner, Gregor; Posekany, Alexandra

    2015-01-01

    The Second Bayesian Young Statisticians Meeting (BAYSM 2014) and the research presented here facilitate connections among researchers using Bayesian Statistics by providing a forum for the development and exchange of ideas. WU Vienna University of Business and Economics hosted BAYSM 2014 from September 18th to 19th. The guidance of renowned plenary lecturers and senior discussants is a critical part of the meeting and this volume, which follows publication of contributions from BAYSM 2013. The meeting's scientific program reflected the variety of fields in which Bayesian methods are currently employed or could be introduced in the future. Three brilliant keynote lectures by Chris Holmes (University of Oxford), Christian Robert (Université Paris-Dauphine), and Mike West (Duke University), were complemented by 24 plenary talks covering the major topics Dynamic Models, Applications, Bayesian Nonparametrics, Biostatistics, Bayesian Methods in Economics, and Models and Methods, as well as a lively poster session ...

  5. Bayesian natural language semantics and pragmatics

    CERN Document Server

    Zeevat, Henk

    2015-01-01

The contributions in this volume focus on the Bayesian interpretation of natural languages, which is widely used in areas of artificial intelligence, cognitive science, and computational linguistics. This is the first volume to take up topics in Bayesian Natural Language Interpretation and make proposals based on information theory, probability theory, and related fields. The methodologies offered here extend to the target semantic and pragmatic analyses of computational natural language interpretation. Bayesian approaches to natural language semantics and pragmatics are based on methods from signal processing and the causal Bayesian models pioneered especially by Pearl. In signal processing, the Bayesian method finds the most probable interpretation by finding the one that maximizes the product of the prior probability and the likelihood of the interpretation. It thus stresses the importance of a production model for interpretation as in Grice's contributions to pragmatics or in interpretation by abduction.

  6. Clinical Outcome Prediction in Aneurysmal Subarachnoid Hemorrhage Using Bayesian Neural Networks with Fuzzy Logic Inferences

    Directory of Open Access Journals (Sweden)

    Benjamin W. Y. Lo

    2013-01-01

Objective. The novel clinical prediction approach of Bayesian neural networks with fuzzy logic inferences is created and applied to derive prognostic decision rules in cerebral aneurysmal subarachnoid hemorrhage (aSAH). Methods. The approach of Bayesian neural networks with fuzzy logic inferences was applied to data from five trials of Tirilazad for aneurysmal subarachnoid hemorrhage (3551 patients). Results. Bayesian meta-analyses of observational studies on aSAH prognostic factors gave generalizable posterior distributions of population mean log odds ratios (ORs). Similar trends were noted in Bayesian and linear regression ORs. Significant outcome predictors include normal motor response, cerebral infarction, history of myocardial infarction, cerebral edema, history of diabetes mellitus, fever on day 8, prior subarachnoid hemorrhage, admission angiographic vasospasm, neurological grade, intraventricular hemorrhage, ruptured aneurysm size, history of hypertension, vasospasm day, age and mean arterial pressure. Heteroscedasticity was present in the nontransformed dataset. Artificial neural networks found nonlinear relationships with 11 hidden variables in 1 layer, using the multilayer perceptron model. Fuzzy logic decision rules (centroid defuzzification technique) denoted cut-off points for poor prognosis at greater than 2.5 clusters. Discussion. This aSAH prognostic system makes use of existing knowledge, recognizes unknown areas, incorporates one's clinical reasoning, and compensates for uncertainty in prognostication.
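The centroid defuzzification technique named in the results reduces a fuzzy membership function to a crisp score via a weighted mean, crisp = Σ μ(x)·x / Σ μ(x). A minimal sketch, with a made-up support grid and membership values rather than anything from the trial data:

```python
def centroid_defuzzify(xs, memberships):
    """Centroid defuzzification: the crisp output is the membership-weighted
    mean of the support points (the centroid of the fuzzy set)."""
    num = sum(x * m for x, m in zip(xs, memberships))
    den = sum(memberships)
    return num / den

# Hypothetical fuzzy set over a 0-4 severity scale (illustrative values only).
score = centroid_defuzzify([0, 1, 2, 3, 4], [0.0, 0.2, 0.8, 1.0, 0.4])
```

A decision rule of the kind described would then compare `score` against a cut-off such as 2.5 to flag poor prognosis.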

  7. Hyainailourine and teratodontine cranial material from the late Eocene of Egypt and the application of parsimony and Bayesian methods to the phylogeny and biogeography of Hyaenodonta (Placentalia, Mammalia).

    Science.gov (United States)

    Borths, Matthew R; Holroyd, Patricia A; Seiffert, Erik R

    2016-01-01

    recovered from each phylogenetic method, we reconstructed the biogeographic history of Hyaenodonta using parsimony optimization (PO), likelihood optimization (LO), and Bayesian Binary Markov chain Monte Carlo (MCMC) to examine support for the Afro-Arabian origin of Hyaenodonta. Across all analyses, we found that Hyaenodonta most likely originated in Europe, rather than Afro-Arabia. The clade is estimated by tip-dating analysis to have undergone a rapid radiation in the Late Cretaceous and Paleocene; a radiation currently not documented by fossil evidence. During the Paleocene, lineages are reconstructed as dispersing to Asia, Afro-Arabia, and North America. The place of origin of Hyainailouroidea is likely Afro-Arabia according to the Bayesian topologies but it is ambiguous using parsimony. All topologies support the constituent clades-Hyainailourinae, Apterodontinae, and Teratodontinae-as Afro-Arabian and tip-dating estimates that each clade is established in Afro-Arabia by the middle Eocene.

  8. Modelling of JET diagnostics using Bayesian Graphical Models

    Energy Technology Data Exchange (ETDEWEB)

    Svensson, J. [IPP Greifswald, Greifswald (Germany); Ford, O. [Imperial College, London (United Kingdom); McDonald, D.; Hole, M.; Nessi, G. von; Meakins, A.; Brix, M.; Thomsen, H.; Werner, A.; Sirinelli, A.

    2011-07-01

The mapping between physics parameters (such as densities, currents, flows, temperatures etc) defining the plasma 'state' under a given model and the raw observations of each plasma diagnostic 1) depends on the particular physics model used, and 2) is inherently probabilistic, owing to uncertainties on both the observations and instrumental aspects of the mapping, such as calibrations, instrument functions etc. A flexible and principled way of modelling such interconnected probabilistic systems is through so-called Bayesian graphical models. Being an amalgam of graph theory and probability theory, Bayesian graphical models can simulate the complex interconnections between physics models and diagnostic observations from multiple heterogeneous diagnostic systems, making it relatively easy to optimally combine the observations from multiple diagnostics for joint inference on parameters of the underlying physics model, which can itself be represented as part of the graph. At JET about 10 diagnostic systems have to date been modelled in this way, which has led to a number of new results, including: the reconstruction of the flux surface topology and q-profiles without any specific equilibrium assumption, using information from a number of different diagnostic systems; profile inversions taking into account the uncertainties in the flux surface positions; and a substantial increase in accuracy of JET electron density and temperature profiles, including improved pedestal resolution, through the joint analysis of three diagnostic systems. It is believed that the Bayesian graph approach could potentially be utilised for very large sets of diagnostics, providing a generic data analysis framework for nuclear fusion experiments that would be able to optimally utilize the information from multiple diagnostics simultaneously, and where the explicit graph representation of the connections to underlying physics models could be used for sophisticated model testing.

  9. [History of aesthetic rhinoplasty].

    Science.gov (United States)

    Nguyen, P S; Mazzola, R F

    2014-12-01

One of the first surgical procedures described in the history of medicine is reconstructive surgery of the nose. Over the centuries, surgeons have developed techniques aimed at reconstructing noses amputated or damaged by trauma or disease. The concept of aesthetic rhinoplasty was only introduced at the end of the 19th century. Since then, techniques have evolved through constant refinement. Nowadays, this surgery is one of the most frequently performed aesthetic procedures. Current technical sophistication is the result of over a century of history marked by many surgeons. All of these techniques derive from a detailed understanding of nasal anatomy from both surgical and artistic points of view. Copyright © 2014 Elsevier Masson SAS. All rights reserved.

  10. PET reconstruction via nonlocal means induced prior.

    Science.gov (United States)

    Hou, Qingfeng; Huang, Jing; Bian, Zhaoying; Chen, Wufan; Ma, Jianhua

    2015-01-01

The traditional Bayesian priors for maximum a posteriori (MAP) reconstruction methods usually incorporate local neighborhood interactions that penalize large deviations in parameter estimates for adjacent pixels; therefore, only local pixel differences are utilized. This limits their ability to penalize image roughness. To achieve high-quality PET image reconstruction, this study investigates a MAP reconstruction strategy incorporating a nonlocal means induced (NLMi) prior (NLMi-MAP), which enables the use of global similarity information in the image. The present NLMi prior approximates the derivative of the Gibbs energy function by an NLM filtering process. Specifically, the NLMi prior is obtained by subtracting the current image estimate from its NLM-filtered version and feeding the residual error back to the reconstruction filter to yield the new image estimate. We tested the present NLMi-MAP method with simulated and real PET datasets. Comparison studies with conventional filtered backprojection (FBP) and several iterative reconstruction methods clearly demonstrate that the present NLMi-MAP method performs better in lowering noise, preserving image edges, and achieving a higher signal-to-noise ratio (SNR). Extensive experimental results show that the NLMi-MAP method outperforms the existing methods in terms of cross profile, noise reduction, SNR, root mean square error (RMSE) and correlation coefficient (CORR).
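The NLMi prior step, subtracting the NLM-filtered image from the current estimate and feeding the residual back, can be sketched as follows. The brute-force NLM filter, its patch/search sizes, and the bandwidth `h` are simplified illustrative choices, not the authors' implementation:

```python
import numpy as np

def nlm_filter(img, patch=1, search=3, h=0.1):
    """Brute-force non-local means: each pixel becomes a weighted average of
    pixels in a search window, weighted by patch similarity."""
    pad = patch + search
    padded = np.pad(img, pad, mode="reflect")
    out = np.zeros_like(img)
    rows, cols = img.shape
    for i in range(rows):
        for j in range(cols):
            ci, cj = i + pad, j + pad
            ref = padded[ci - patch:ci + patch + 1, cj - patch:cj + patch + 1]
            weights, values = [], []
            for di in range(-search, search + 1):
                for dj in range(-search, search + 1):
                    ni, nj = ci + di, cj + dj
                    cand = padded[ni - patch:ni + patch + 1,
                                  nj - patch:nj + patch + 1]
                    d2 = np.mean((ref - cand) ** 2)   # patch dissimilarity
                    weights.append(np.exp(-d2 / h ** 2))
                    values.append(padded[ni, nj])
            w = np.array(weights)
            out[i, j] = np.dot(w / w.sum(), values)
    return out

def nlmi_prior_residual(x):
    """NLMi prior term: current estimate minus its NLM-filtered version."""
    return x - nlm_filter(x)

residual = nlmi_prior_residual(np.ones((8, 8)))  # smooth image -> ~0 penalty
```

On a globally smooth image the residual vanishes, which is exactly why this prior penalizes roughness without over-penalizing repeated (nonlocal) structure.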

  11. THz-SAR Vibrating Target Imaging via the Bayesian Method

    Directory of Open Access Journals (Sweden)

    Bin Deng

    2017-01-01

Target vibration bears important information for target recognition, and terahertz, due to significant micro-Doppler effects, has strong advantages for remotely sensing vibrations. In this paper, the imaging characteristics of vibrating targets with THz-SAR are first analyzed. An improved algorithm based on an effective Bayesian approach, the expansion-compression variance-component (ExCoV) method, is proposed for reconstructing the scattering coefficients of vibrating targets; it provides more robust and efficient initialization and overcomes the deficiencies of sidelobes as well as artifacts arising from the traditional correlation method. A real vibration measurement experiment on idle cars was performed to validate the range model. Simulated SAR data of vibrating targets and a tank model in a real background at 220 GHz show good performance at low SNR. Rapidly evolving high-power terahertz devices will make THz-SAR applications viable at distances of several kilometers.

  12. An Active Lattice Model in a Bayesian Framework

    DEFF Research Database (Denmark)

    Carstensen, Jens Michael

    1996-01-01

A Markov Random Field is used as a structural model of a deformable rectangular lattice. When used as a template prior in a Bayesian framework this model is powerful for making inferences about lattice structures in images. The model assigns maximum probability to the perfect regular lattice...... by penalizing deviations in alignment and lattice node distance. The Markov random field represents prior knowledge about the lattice structure, and through an observation model that incorporates the visual appearance of the nodes, we can simulate realizations from the posterior distribution. A maximum...... a posteriori (MAP) estimate, found by simulated annealing, is used as the reconstructed lattice. The model was developed as a central part of an algorithm for automatic analysis of genetic experiments, positioned in a lattice structure by a robot. The algorithm has been successfully applied to many images...

  13. Breast Reconstruction After Mastectomy

    Science.gov (United States)


  14. Breast reconstruction - implants

    Science.gov (United States)

Breast implants surgery; Mastectomy - breast reconstruction with implants; Breast cancer - breast reconstruction with implants

  15. CURRENT CONCEPTS IN ACL RECONSTRUCTION

    Directory of Open Access Journals (Sweden)

    Freddie H. Fu

    2008-09-01

Current Concepts in ACL Reconstruction is a comprehensive reference text offering one of the most thorough collections of topics on the ACL and its surgical reconstruction, with contributions from some of the world's most experienced ACL surgeons. Various procedures mentioned throughout the text are also demonstrated in an accompanying video CD-ROM. PURPOSE By composing a single, comprehensive information source on the ACL, covering basic sciences, clinical issues, the latest concepts and surgical techniques, from evaluation to outcome and from history to future, the editors and contributors aim to keep the audience abreast of the latest concepts and techniques for the evaluation and treatment of ACL injuries. FEATURES The text is composed of 27 chapters in 6 sections. The first section covers mostly basic sciences, along with the history of the ACL, imaging, and the clinical approach to adolescent and pediatric patients. The second section addresses graft choices and arthroscopy portals for ACL reconstruction. The third section covers the technique and outcome of single-bundle ACL reconstruction. The fourth section covers the techniques and outcome of double-bundle ACL reconstruction. The fifth section addresses revision, navigation technology, rehabilitation and the evaluation of ACL reconstruction outcomes. The sixth and last section looks at future advances: What We Have Learned and the Future of ACL Reconstruction. AUDIENCE Orthopedic residents, sports traumatology and knee surgery fellows, orthopedic surgeons, as well as scientists in basic sciences or clinicians studying or planning research on the ACL, form the audience of this book. ASSESSMENT This is the latest and most comprehensive textbook of ACL reconstruction, produced by the editorial work of two pioneers and masters, "Freddie H. Fu MD and Steven B. Cohen MD", with the contribution of world

  16. A Bayesian Reflection on Surfaces

    Directory of Open Access Journals (Sweden)

    David R. Wolf

    1999-10-01

The topic of this paper is a novel Bayesian continuous-basis field representation and inference framework. Within this paper several problems are solved: the maximally informative inference of continuous-basis fields, that is, where the basis for the field is itself a continuous object and not representable in a finite manner; the tradeoff between accuracy of representation in terms of information learned, and memory or storage capacity in bits; the approximation of probability distributions so that a maximal amount of information about the object being inferred is preserved; and an information-theoretic justification for multigrid methodology. The maximally informative field inference framework is described in full generality and denoted the Generalized Kalman Filter. The Generalized Kalman Filter allows the update of field knowledge from previous knowledge at any scale, and new data, to new knowledge at any other scale. An application example instance, the inference of continuous surfaces from measurements (for example, camera image data), is presented.

  17. Attention in a bayesian framework

    DEFF Research Database (Denmark)

    Whiteley, Louise Emma; Sahani, Maneesh

    2012-01-01

    , and include both selective phenomena, where attention is invoked by cues that point to particular stimuli, and integrative phenomena, where attention is invoked dynamically by endogenous processing. However, most previous Bayesian accounts of attention have focused on describing relatively simple experimental...... selective and integrative roles, and thus cannot be easily extended to complex environments. We suggest that the resource bottleneck stems from the computational intractability of exact perceptual inference in complex settings, and that attention reflects an evolved mechanism for approximate inference which...... can be shaped to refine the local accuracy of perception. We show that this approach extends the simple picture of attention as prior, so as to provide a unified and computationally driven account of both selective and integrative attentional phenomena....

  18. On Bayesian System Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Soerensen Ringi, M

    1995-05-01

The view taken in this thesis is that reliability, the probability that a system will perform a required function for a stated period of time, depends on a person's state of knowledge. Reliability changes as this state of knowledge changes, i.e. when new relevant information becomes available. Most existing models for system reliability prediction are developed in a classical framework of probability theory and they overlook some information that is always present. Probability is just an analytical tool to handle uncertainty, based on judgement and subjective opinions. It is argued that the Bayesian approach gives a much more comprehensive understanding of the foundations of probability than the so called frequentistic school. A new model for system reliability prediction is given in two papers. The model encloses the fact that component failures are dependent because of a shared operational environment. The suggested model also naturally permits learning from failure data of similar components in non identical environments. 85 refs.

  19. Nonparametric Bayesian inference in biostatistics

    CERN Document Server

    Müller, Peter

    2015-01-01

Nonparametric Bayesian (BNP) approaches play an ever-expanding role in biostatistical inference, from proteomics to clinical trials. As the chapters in this book demonstrate, BNP has important uses in the clinical sciences and in inference for issues like unknown partitions in genomics. Many research problems involve an abundance of data and require flexible and complex probability models beyond the traditional parametric approaches. As this book's expert contributors show, BNP approaches can be the answer. Survival analysis, in particular survival regression, has traditionally used BNP, but BNP's potential is now very broad. This applies to important tasks like the arrangement of patients into clinically meaningful subpopulations and the segmentation of the genome into functionally distinct regions. This book is designed to both review and introduce application areas for BNP. While existing books provide theoretical foundations, this book connects theory to practice through engaging examples and research questions. Chapters c...

  20. On Bayesian System Reliability Analysis

    International Nuclear Information System (INIS)

    Soerensen Ringi, M.

    1995-01-01

    The view taken in this thesis is that reliability, the probability that a system will perform a required function for a stated period of time, depends on a person's state of knowledge. Reliability changes as this state of knowledge changes, i.e. when new relevant information becomes available. Most existing models for system reliability prediction are developed in a classical framework of probability theory and they overlook some information that is always present. Probability is just an analytical tool to handle uncertainty, based on judgement and subjective opinions. It is argued that the Bayesian approach gives a much more comprehensive understanding of the foundations of probability than the so called frequentistic school. A new model for system reliability prediction is given in two papers. The model encloses the fact that component failures are dependent because of a shared operational environment. The suggested model also naturally permits learning from failure data of similar components in non identical environments. 85 refs

  1. Bayesian estimation in homodyne interferometry

    International Nuclear Information System (INIS)

    Olivares, Stefano; Paris, Matteo G A

    2009-01-01

We address phase-shift estimation by means of a squeezed vacuum probe and homodyne detection. We analyse the Bayesian estimator, which is known to asymptotically saturate the classical Cramer-Rao bound on the variance, and discuss convergence by looking at the a posteriori distribution as the number of measurements increases. We also suggest two feasible adaptive methods, acting on the squeezing parameter and/or the homodyne local oscillator phase, which allow us to optimize homodyne detection and approach the ultimate bound on precision imposed by the quantum Cramer-Rao theorem. The performances of our two-step methods are investigated by means of Monte Carlo simulated experiments with a small number of homodyne data, thus giving a quantitative meaning to the notion of asymptotic optimality.

  2. Bayesian Kernel Mixtures for Counts.

    Science.gov (United States)

    Canale, Antonio; Dunson, David B

    2011-12-01

    Although Bayesian nonparametric mixture models for continuous data are well developed, there is a limited literature on related approaches for count data. A common strategy is to use a mixture of Poissons, which unfortunately is quite restrictive in not accounting for distributions having variance less than the mean. Other approaches include mixing multinomials, which requires finite support, and using a Dirichlet process prior with a Poisson base measure, which does not allow smooth deviations from the Poisson. As a broad class of alternative models, we propose to use nonparametric mixtures of rounded continuous kernels. An efficient Gibbs sampler is developed for posterior computation, and a simulation study is performed to assess performance. Focusing on the rounded Gaussian case, we generalize the modeling framework to account for multivariate count data, joint modeling with continuous and categorical variables, and other complications. The methods are illustrated through applications to a developmental toxicity study and marketing data. This article has supplementary material online.
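The rounded-Gaussian kernel described above obtains a count distribution by partitioning the real line: the count equals j when a latent Gaussian lands in [a_j, a_{j+1}). A minimal sketch of the resulting pmf, assuming one common threshold convention (a_0 = -inf, a_j = j - 1 for j >= 1), which is an illustrative choice rather than the paper's exact specification:

```python
import math

def std_normal_cdf(x):
    """Phi(x), the standard normal CDF, via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def rounded_gaussian_pmf(j, mu, sigma):
    """P(y = j) when y rounds a latent N(mu, sigma^2) variable into bins
    (-inf, 0), [0, 1), [1, 2), ... (an assumed threshold convention)."""
    hi = std_normal_cdf((j - mu) / sigma)
    lo = 0.0 if j == 0 else std_normal_cdf((j - 1 - mu) / sigma)
    return hi - lo

# The bin probabilities telescope, so they sum to one over all counts.
total = sum(rounded_gaussian_pmf(j, mu=2.0, sigma=1.0) for j in range(60))
```

Because `sigma` is free, the kernel can be under- or over-dispersed relative to its mean, which a Poisson kernel cannot.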

  3. Bayesian networks in educational assessment

    CERN Document Server

    Almond, Russell G; Steinberg, Linda S; Yan, Duanli; Williamson, David M

    2015-01-01

    Bayesian inference networks, a synthesis of statistics and expert systems, have advanced reasoning under uncertainty in medicine, business, and social sciences. This innovative volume is the first comprehensive treatment exploring how they can be applied to design and analyze innovative educational assessments. Part I develops Bayes nets’ foundations in assessment, statistics, and graph theory, and works through the real-time updating algorithm. Part II addresses parametric forms for use with assessment, model-checking techniques, and estimation with the EM algorithm and Markov chain Monte Carlo (MCMC). A unique feature is the volume’s grounding in Evidence-Centered Design (ECD) framework for assessment design. This “design forward” approach enables designers to take full advantage of Bayes nets’ modularity and ability to model complex evidentiary relationships that arise from performance in interactive, technology-rich assessments such as simulations. Part III describes ECD, situates Bayes nets as ...

  4. Image reconstruction under non-Gaussian noise

    DEFF Research Database (Denmark)

    Sciacchitano, Federica

    During acquisition and transmission, images are often blurred and corrupted by noise. One of the fundamental tasks of image processing is to reconstruct the clean image from a degraded version. The process of recovering the original image from the data is an example of inverse problem. Due...... to the ill-posedness of the problem, the simple inversion of the degradation model does not give any good reconstructions. Therefore, to deal with the ill-posedness it is necessary to use some prior information on the solution or the model and the Bayesian approach. Additive Gaussian noise has been......D thesis intends to solve some of the many open questions for image restoration under non-Gaussian noise. The two main kinds of noise studied in this PhD project are the impulse noise and the Cauchy noise. Impulse noise is due to for instance the malfunctioning pixel elements in the camera sensors, errors...

  5. Robust bayesian analysis of an autoregressive model with ...

    African Journals Online (AJOL)

    In this work, robust Bayesian analysis of the Bayesian estimation of an autoregressive model with exponential innovations is performed. Using a Bayesian robustness methodology, we show that, using a suitable generalized quadratic loss, we obtain optimal Bayesian estimators of the parameters corresponding to the ...

  6. Bayesian models a statistical primer for ecologists

    CERN Document Server

    Hobbs, N Thompson

    2015-01-01

    Bayesian modeling has become an indispensable tool for ecological research because it is uniquely suited to deal with complexity in a statistically coherent way. This textbook provides a comprehensive and accessible introduction to the latest Bayesian methods-in language ecologists can understand. Unlike other books on the subject, this one emphasizes the principles behind the computations, giving ecologists a big-picture understanding of how to implement this powerful statistical approach. Bayesian Models is an essential primer for non-statisticians. It begins with a definition of probabili

  7. Assessment of phylogenetic sensitivity for reconstructing HIV-1 epidemiological relationships.

    Science.gov (United States)

    Beloukas, Apostolos; Magiorkinis, Emmanouil; Magiorkinis, Gkikas; Zavitsanou, Asimina; Karamitros, Timokratis; Hatzakis, Angelos; Paraskevis, Dimitrios

    2012-06-01

Phylogenetic analysis has been extensively used as a tool for the reconstruction of epidemiological relations for research or forensic purposes. Our objective was to assess the sensitivity of different phylogenetic methods and various phylogenetic programs in reconstructing epidemiological links among HIV-1 infected patients, that is, the probability of revealing a true transmission relationship. Multiple datasets (90) were prepared consisting of HIV-1 sequences in protease (PR) and partial reverse transcriptase (RT) sampled from patients with a documented epidemiological relationship (target population) and from unrelated individuals (control population) belonging to the same HIV-1 subtype as the target population. Each dataset varied regarding the number, the geographic origin and the transmission risk groups of the sequences in the control population. Phylogenetic trees were inferred by neighbor-joining (NJ), maximum likelihood heuristics (hML) and Bayesian methods. All clusters of sequences belonging to the target population were correctly reconstructed by NJ and Bayesian methods, receiving high bootstrap and posterior probability (PP) support, respectively. On the other hand, TreePuzzle failed to reconstruct or provide significant support for several clusters; high puzzling step support was associated with the inclusion of control sequences from the same geographic area as the target population. In contrast, all clusters were correctly reconstructed by hML as implemented in PhyML 3.0, receiving high bootstrap support. We report that under the conditions of our study, hML using PhyML, NJ and Bayesian methods were the most sensitive for the reconstruction of epidemiological links, mostly from sexually infected individuals. Copyright © 2012 Elsevier B.V. All rights reserved.

  8. Bayesian tomography by interacting Markov chains

    Science.gov (United States)

    Romary, T.

    2017-12-01

In seismic tomography, we seek to determine the velocity of the underground from noisy first-arrival travel time observations. In most situations, this is an ill-posed inverse problem that admits several imperfect solutions. Given an a priori distribution over the parameters of the velocity model, the Bayesian formulation allows us to state this problem as a probabilistic one, with a solution in the form of a posterior distribution. The posterior distribution is generally high dimensional and may exhibit multimodality. Moreover, as it is known only up to a constant, the only sensible way to address this problem is to try to generate simulations from the posterior. The natural tools to perform these simulations are Markov chain Monte Carlo (MCMC) methods. Classical implementations of MCMC algorithms generally suffer from slow mixing: the generated states are slow to enter the stationary regime, that is, to fit the observations, and when one mode of the posterior is eventually identified, it may become difficult to visit others. Using a varying temperature parameter that relaxes the constraint on the data may help to enter the stationary regime. Besides, the sequential nature of MCMC makes it ill suited to parallel implementation. Running a large number of chains in parallel may be suboptimal, as the information gathered by each chain is not mutualized. Parallel tempering (PT) can be seen as a first attempt to make parallel chains at different temperatures communicate, but they only exchange information between current states. In this talk, I will show that PT actually belongs to a general class of interacting Markov chain algorithms. I will also show that this class enables the design of interacting schemes that can take advantage of the whole history of the chain, by authorizing exchanges toward already visited states. The algorithms will be illustrated with toy examples and an application to first-arrival traveltime tomography.
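The tempering idea this talk builds on can be sketched with a toy bimodal target. This is a plain parallel-tempering sampler under assumed settings (temperatures, step sizes, a two-Gaussian target), not the interacting-chains scheme with exchanges toward past states that the talk proposes:

```python
import math
import random

def log_target(x):
    """Toy bimodal 'posterior': mixture of Gaussians centered at -3 and +3."""
    return math.log(math.exp(-(x + 3.0) ** 2) + math.exp(-(x - 3.0) ** 2))

def parallel_tempering(n_iter=5000, temps=(1.0, 2.0, 4.0, 8.0),
                       step=0.8, seed=1):
    rng = random.Random(seed)
    x = [0.0] * len(temps)          # one chain per temperature
    cold_samples = []
    for _ in range(n_iter):
        # Random-walk Metropolis within each tempered chain.
        for k, T in enumerate(temps):
            prop = x[k] + rng.gauss(0.0, step * math.sqrt(T))
            if math.log(rng.random() + 1e-300) < \
                    (log_target(prop) - log_target(x[k])) / T:
                x[k] = prop
        # Propose swapping states between a random adjacent pair.
        k = rng.randrange(len(temps) - 1)
        log_alpha = (1.0 / temps[k] - 1.0 / temps[k + 1]) * \
                    (log_target(x[k + 1]) - log_target(x[k]))
        if math.log(rng.random() + 1e-300) < log_alpha:
            x[k], x[k + 1] = x[k + 1], x[k]
        cold_samples.append(x[0])
    return cold_samples

samples = parallel_tempering()
```

The hot chains cross the low-probability valley easily, and accepted swaps hand those states down to the T = 1 chain, so the cold chain visits both modes instead of sticking in one, which is the slow-mixing failure described above.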

  9. Proterozoic Milankovitch cycles and the history of the solar system.

    Science.gov (United States)

    Meyers, Stephen R; Malinverno, Alberto

    2018-06-19

    The geologic record of Milankovitch climate cycles provides a rich conceptual and temporal framework for evaluating Earth system evolution, bestowing a sharp lens through which to view our planet's history. However, the utility of these cycles for constraining the early Earth system is hindered by seemingly insurmountable uncertainties in our knowledge of solar system behavior (including Earth-Moon history), and poor temporal control for validation of cycle periods (e.g., from radioisotopic dates). Here we address these problems using a Bayesian inversion approach to quantitatively link astronomical theory with geologic observation, allowing a reconstruction of Proterozoic astronomical cycles, fundamental frequencies of the solar system, the precession constant, and the underlying geologic timescale, directly from stratigraphic data. Application of the approach to 1.4-billion-year-old rhythmites indicates a precession constant of 85.79 ± 2.72 arcsec/year (2σ), an Earth-Moon distance of 340,900 ± 2,600 km (2σ), and length of day of 18.68 ± 0.25 hours (2σ), with dominant climatic precession cycles of ∼14 ky and eccentricity cycles of ∼131 ky. The results confirm reduced tidal dissipation in the Proterozoic. A complementary analysis of Eocene rhythmites (∼55 Ma) illustrates how the approach offers a means to map out ancient solar system behavior and Earth-Moon history using the geologic archive. The method also provides robust quantitative uncertainties on the eccentricity and climatic precession periods, and derived astronomical timescales. As a consequence, the temporal resolution of ancient Earth system processes is enhanced, and our knowledge of early solar system dynamics is greatly improved.

  10. Assessing the accuracy of ancestral protein reconstruction methods.

    Directory of Open Access Journals (Sweden)

    Paul D Williams

    2006-06-01

    Full Text Available The phylogenetic inference of ancestral protein sequences is a powerful technique for the study of molecular evolution, but any conclusions drawn from such studies are only as good as the accuracy of the reconstruction method. Every inference method leads to errors in the ancestral protein sequence, resulting in potentially misleading estimates of the ancestral protein's properties. To assess the accuracy of ancestral protein reconstruction methods, we performed computational population evolution simulations featuring near-neutral evolution under purifying selection, speciation, and divergence using an off-lattice protein model where fitness depends on the ability to be stable in a specified target structure. We were thus able to compare the thermodynamic properties of the true ancestral sequences with the properties of "ancestral sequences" inferred by maximum parsimony, maximum likelihood, and Bayesian methods. Surprisingly, we found that methods such as maximum parsimony and maximum likelihood that reconstruct a "best guess" amino acid at each position overestimate thermostability, while a Bayesian method that sometimes chooses less-probable residues from the posterior probability distribution does not. Maximum likelihood and maximum parsimony apparently tend to eliminate variants at a position that are slightly detrimental to structural stability simply because such detrimental variants are less frequent. Other properties of ancestral proteins might be similarly overestimated. This suggests that ancestral reconstruction studies require greater care to come to credible conclusions regarding functional evolution. Inferred functional patterns that mimic reconstruction bias should be reevaluated.

  11. Assessing the accuracy of ancestral protein reconstruction methods.

    Science.gov (United States)

    Williams, Paul D; Pollock, David D; Blackburne, Benjamin P; Goldstein, Richard A

    2006-06-23

    The phylogenetic inference of ancestral protein sequences is a powerful technique for the study of molecular evolution, but any conclusions drawn from such studies are only as good as the accuracy of the reconstruction method. Every inference method leads to errors in the ancestral protein sequence, resulting in potentially misleading estimates of the ancestral protein's properties. To assess the accuracy of ancestral protein reconstruction methods, we performed computational population evolution simulations featuring near-neutral evolution under purifying selection, speciation, and divergence using an off-lattice protein model where fitness depends on the ability to be stable in a specified target structure. We were thus able to compare the thermodynamic properties of the true ancestral sequences with the properties of "ancestral sequences" inferred by maximum parsimony, maximum likelihood, and Bayesian methods. Surprisingly, we found that methods such as maximum parsimony and maximum likelihood that reconstruct a "best guess" amino acid at each position overestimate thermostability, while a Bayesian method that sometimes chooses less-probable residues from the posterior probability distribution does not. Maximum likelihood and maximum parsimony apparently tend to eliminate variants at a position that are slightly detrimental to structural stability simply because such detrimental variants are less frequent. Other properties of ancestral proteins might be similarly overestimated. This suggests that ancestral reconstruction studies require greater care to come to credible conclusions regarding functional evolution. Inferred functional patterns that mimic reconstruction bias should be reevaluated.

  12. Bayesian adaptive methods for clinical trials

    National Research Council Canada - National Science Library

    Berry, Scott M

    2011-01-01

    .... One is that Bayesian approaches implemented with the majority of their informative content coming from the current data, and not any external prior information, typically have good frequentist properties (e.g...

  13. A Bayesian approach to model uncertainty

    International Nuclear Information System (INIS)

    Buslik, A.

    1994-01-01

    A Bayesian approach to model uncertainty is taken. For the case of a finite number of alternative models, the model uncertainty is equivalent to parameter uncertainty. A derivation based on Savage's partition problem is given

  14. An Intuitive Dashboard for Bayesian Network Inference

    International Nuclear Information System (INIS)

    Reddy, Vikas; Farr, Anna Charisse; Wu, Paul; Mengersen, Kerrie; Yarlagadda, Prasad K D V

    2014-01-01

    Current Bayesian network software packages provide a good graphical interface for users who design and develop Bayesian networks for various applications. However, the intended end-users of these networks may not necessarily find such an interface appealing, and at times it can be overwhelming, particularly when the number of nodes in the network is large. To circumvent this problem, this paper presents an intuitive dashboard, which provides an additional layer of abstraction, enabling the end-users to easily perform inferences over the Bayesian networks. Unlike most software packages, which display the nodes and arcs of the network, the developed tool organises the nodes based on the cause-and-effect relationship, making the user interaction more intuitive and friendly. In addition to performing various types of inferences, the users can conveniently use the tool to verify the behaviour of the developed Bayesian network. The tool has been developed using QT and SMILE libraries in C++

  15. Learning Bayesian networks for discrete data

    KAUST Repository

    Liang, Faming

    2009-02-01

    Bayesian networks have received much attention in the recent literature. In this article, we propose an approach to learn Bayesian networks using the stochastic approximation Monte Carlo (SAMC) algorithm. Our approach has two nice features. Firstly, it possesses a self-adjusting mechanism and thus essentially avoids the local-trap problem suffered by conventional MCMC simulation-based approaches in learning Bayesian networks. Secondly, it falls into the class of dynamic importance sampling algorithms; the network features can be inferred by dynamically weighted averaging of the samples generated in the learning process, and the resulting estimates can have much lower variation than single model-based estimates. The numerical results indicate that our approach can mix much faster over the space of Bayesian networks than the conventional MCMC simulation-based approaches. © 2008 Elsevier B.V. All rights reserved.

  16. An Intuitive Dashboard for Bayesian Network Inference

    Science.gov (United States)

    Reddy, Vikas; Charisse Farr, Anna; Wu, Paul; Mengersen, Kerrie; Yarlagadda, Prasad K. D. V.

    2014-03-01

    Current Bayesian network software packages provide a good graphical interface for users who design and develop Bayesian networks for various applications. However, the intended end-users of these networks may not necessarily find such an interface appealing, and at times it can be overwhelming, particularly when the number of nodes in the network is large. To circumvent this problem, this paper presents an intuitive dashboard, which provides an additional layer of abstraction, enabling the end-users to easily perform inferences over the Bayesian networks. Unlike most software packages, which display the nodes and arcs of the network, the developed tool organises the nodes based on the cause-and-effect relationship, making the user interaction more intuitive and friendly. In addition to performing various types of inferences, the users can conveniently use the tool to verify the behaviour of the developed Bayesian network. The tool has been developed using QT and SMILE libraries in C++.

  17. Bayesian optimization for computationally extensive probability distributions.

    Science.gov (United States)

    Tamura, Ryo; Hukushima, Koji

    2018-01-01

    An efficient method for finding a better maximizer of computationally extensive probability distributions is proposed on the basis of a Bayesian optimization technique. A key idea of the proposed method is to use extreme values of acquisition functions by Gaussian processes for the next training phase, which should be located near a local maximum or a global maximum of the probability distribution. Our Bayesian optimization technique is applied to the posterior distribution in effective physical model estimation, which is a computationally extensive probability distribution. Even when the number of sampling points on the posterior distribution is fixed to be small, Bayesian optimization provides a better maximizer of the posterior distribution than the random search method, the steepest descent method, or the Monte Carlo method. Furthermore, Bayesian optimization improves the results efficiently when combined with the steepest descent method, and thus it is a powerful tool for searching for a better maximizer of computationally extensive probability distributions.
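    The acquisition-driven search described above can be sketched with a small Gaussian-process surrogate. This is a hedged toy (the target function, RBF kernel length scale, and the upper-confidence-bound acquisition are assumptions for illustration, not the record's setup): fit a GP to the points evaluated so far, then evaluate the expensive "log-posterior" where the acquisition is largest.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_post(x):
    # Stand-in for an expensive log-posterior (global maximum near x = 1.55).
    return -((x - 1.5) ** 2) + 0.3 * np.sin(5.0 * x)

def rbf(a, b, ell=0.5):
    # Squared-exponential kernel between two 1-D point sets.
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell ** 2)

def gp_posterior(X, y, Xs, jitter=1e-6):
    # Standard GP regression equations (zero prior mean).
    K = rbf(X, X) + jitter * np.eye(len(X))
    Ks = rbf(X, Xs)
    mu = Ks.T @ np.linalg.solve(K, y)
    cov = rbf(Xs, Xs) - Ks.T @ np.linalg.solve(K, Ks)
    return mu, np.sqrt(np.clip(np.diag(cov), 0.0, None))

grid = np.linspace(-2.0, 5.0, 400)
X = rng.uniform(-2.0, 5.0, 3)        # small initial design
y = log_post(X)
for _ in range(10):
    mu, sd = gp_posterior(X, y, grid)
    x_next = grid[np.argmax(mu + 2.0 * sd)]   # upper confidence bound
    X = np.append(X, x_next)
    y = np.append(y, log_post(x_next))

best = X[np.argmax(y)]
print(round(float(best), 2))
```

    With only 13 evaluations of the expensive function, the best observed point typically lands near the true maximizer, which is the point of the record's comparison against random search.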

  18. Correct Bayesian and frequentist intervals are similar

    International Nuclear Information System (INIS)

    Atwood, C.L.

    1986-01-01

    This paper argues that Bayesians and frequentists will normally reach numerically similar conclusions when dealing with vague data or sparse data. It is shown that both statistical methodologies can deal reasonably with vague data. With sparse data, in many important practical cases Bayesian interval estimates and frequentist confidence intervals are approximately equal, although with discrete data the frequentist intervals are somewhat longer. This is not to say that the two methodologies are equally easy to use: the construction of a frequentist confidence interval may require new theoretical development. Bayesian methods typically require numerical integration, perhaps over many variables. Also, Bayesians can easily fall into the trap of over-optimism about their amount of prior knowledge. But in cases where both intervals are found correctly, the two intervals are usually not very different. (orig.)
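    The claim that the two kinds of interval nearly coincide can be checked numerically in the simplest setting: a normal mean with known variance and a vague conjugate prior (the data values and prior scale below are invented for illustration).

```python
import math

data = [4.1, 5.3, 4.8, 5.9, 4.4, 5.1, 4.7, 5.5]
sigma, mu0, tau = 1.0, 0.0, 10.0   # known noise sd; weak N(0, 10^2) prior
n = len(data)
xbar = sum(data) / n
z = 1.96                           # ~95% normal quantile

# Frequentist 95% confidence interval for the mean (known sigma).
se = sigma / math.sqrt(n)
freq = (xbar - z * se, xbar + z * se)

# Bayesian 95% credible interval under the vague conjugate normal prior:
# posterior is normal with precision n/sigma^2 + 1/tau^2.
prec = n / sigma ** 2 + 1.0 / tau ** 2
post_mean = (n * xbar / sigma ** 2 + mu0 / tau ** 2) / prec
post_sd = math.sqrt(1.0 / prec)
bayes = (post_mean - z * post_sd, post_mean + z * post_sd)

gap = max(abs(freq[0] - bayes[0]), abs(freq[1] - bayes[1]))
print(gap < 0.01)  # → True: the endpoints agree to within 0.01
```

    As the prior scale tau grows, the posterior precision approaches n/sigma^2 and the two intervals become exactly equal.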

  19. Implementing the Bayesian paradigm in risk analysis

    International Nuclear Information System (INIS)

    Aven, T.; Kvaloey, J.T.

    2002-01-01

    The Bayesian paradigm comprises a unified and consistent framework for analyzing and expressing risk. Yet, we see rather few examples of applications where the full Bayesian setting has been adopted with specifications of priors of unknown parameters. In this paper, we discuss some of the practical challenges of implementing Bayesian thinking and methods in risk analysis, emphasizing the introduction of probability models and parameters and associated uncertainty assessments. We conclude that there is a need for a pragmatic view in order to 'successfully' apply the Bayesian approach, such that we can do the assignments of some of the probabilities without adopting the somewhat sophisticated procedure of specifying prior distributions of parameters. A simple risk analysis example is presented to illustrate ideas

  20. An overview on Approximate Bayesian computation*

    Directory of Open Access Journals (Sweden)

    Baragatti Meïli

    2014-01-01

    Full Text Available Approximate Bayesian computation techniques, also called likelihood-free methods, are one of the most satisfactory approaches to intractable likelihood problems. This overview presents recent results since their introduction about ten years ago in population genetics.
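    The rejection flavour of approximate Bayesian computation fits in a few lines; in this sketch (toy Gaussian model, flat prior, and tolerance all chosen for illustration), prior draws are kept when their simulated summary statistic lands close to the observed one, so no likelihood is ever evaluated.

```python
import random
import statistics

random.seed(1)

# "Observed" data generated with a true mean we pretend not to know.
true_mu = 2.0
observed = [random.gauss(true_mu, 1.0) for _ in range(100)]
s_obs = statistics.mean(observed)      # summary statistic

def abc_rejection(n_draws=5000, eps=0.2):
    accepted = []
    for _ in range(n_draws):
        theta = random.uniform(-5.0, 5.0)              # flat prior
        sim = [random.gauss(theta, 1.0) for _ in range(100)]
        # Keep theta if the simulated summary is within eps of the observed one.
        if abs(statistics.mean(sim) - s_obs) < eps:
            accepted.append(theta)
    return accepted

post = abc_rejection()
print(len(post), round(statistics.mean(post), 2))
```

    The accepted draws approximate the posterior; shrinking eps sharpens the approximation at the cost of a lower acceptance rate, which is the basic trade-off the MCMC and sequential ABC variants try to soften.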

  1. Posterior consistency for Bayesian inverse problems through stability and regression results

    International Nuclear Information System (INIS)

    Vollmer, Sebastian J

    2013-01-01

    In the Bayesian approach, the a priori knowledge about the input of a mathematical model is described via a probability measure. The joint distribution of the unknown input and the data is then conditioned, using Bayes’ formula, giving rise to the posterior distribution on the unknown input. In this setting we prove posterior consistency for nonlinear inverse problems: a sequence of data is considered, with diminishing fluctuations around a single truth and it is then of interest to show that the resulting sequence of posterior measures arising from this sequence of data concentrates around the truth used to generate the data. Posterior consistency justifies the use of the Bayesian approach very much in the same way as error bounds and convergence results for regularization techniques do. As a guiding example, we consider the inverse problem of reconstructing the diffusion coefficient from noisy observations of the solution to an elliptic PDE in divergence form. This problem is approached by splitting the forward operator into the underlying continuum model and a simpler observation operator based on the output of the model. In general, these splittings allow us to conclude posterior consistency provided a deterministic stability result for the underlying inverse problem and a posterior consistency result for the Bayesian regression problem with the push-forward prior. Moreover, we prove posterior consistency for the Bayesian regression problem based on the regularity, the tail behaviour and the small ball probabilities of the prior. (paper)

  2. Bayesian probability theory and inverse problems

    International Nuclear Information System (INIS)

    Kopec, S.

    1994-01-01

    Bayesian probability theory is applied to the approximate solution of inverse problems. In order to solve the moment problem with noisy data, the entropic prior is used. The expressions for the solution and its error bounds are presented. When the noise level tends to zero, the Bayesian solution tends to the classic maximum entropy solution in the L2 norm. The use of a spline prior is also shown. (author)

  3. A Bayesian classifier for symbol recognition

    OpenAIRE

    Barrat , Sabine; Tabbone , Salvatore; Nourrissier , Patrick

    2007-01-01

    URL : http://www.buyans.com/POL/UploadedFile/134_9977.pdf; International audience; We present in this paper an original adaptation of Bayesian networks to the symbol recognition problem. More precisely, a descriptor combination method, which significantly improves the recognition rate compared to the rates obtained by each descriptor alone, is presented. In this perspective, we use a simple Bayesian classifier, called naive Bayes. In fact, probabilistic graphical models, more spec...
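    A naive Bayes classifier of the kind the record describes can be sketched as follows; the two-class "symbol descriptor" data here are synthetic stand-ins, not the paper's descriptors. Each class is modelled by independent per-feature Gaussians (the naive assumption), and classification picks the class with the highest joint log-density.

```python
import math
import random

random.seed(0)

# Toy "symbol descriptors": two classes, two continuous features each.
def make_class(mu, n=200):
    return [(random.gauss(mu[0], 1.0), random.gauss(mu[1], 1.0)) for _ in range(n)]

train = {"circle": make_class((0.0, 0.0)), "arrow": make_class((4.0, 3.0))}

def fit(data):
    # Fit a per-class, per-feature Gaussian (the naive independence assumption).
    params = {}
    for label, pts in data.items():
        per_feat = []
        for f in zip(*pts):
            mu = sum(f) / len(f)
            sd = math.sqrt(sum((x - mu) ** 2 for x in f) / len(f))
            per_feat.append((mu, sd))
        params[label] = per_feat
    return params

def log_gauss(x, mu, sd):
    return -0.5 * ((x - mu) / sd) ** 2 - math.log(sd * math.sqrt(2 * math.pi))

def classify(params, point):
    # Equal class priors assumed, so only the likelihood terms are summed.
    scores = {label: sum(log_gauss(x, mu, sd) for x, (mu, sd) in zip(point, ps))
              for label, ps in params.items()}
    return max(scores, key=scores.get)

params = fit(train)
print(classify(params, (0.3, -0.2)), classify(params, (3.8, 3.1)))  # → circle arrow
```

    Combining several descriptors amounts to concatenating their features; the independence assumption makes the combined score a simple sum of per-descriptor scores.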

  4. Bayesian Modeling of a Human MMORPG Player

    Science.gov (United States)

    Synnaeve, Gabriel; Bessière, Pierre

    2011-03-01

    This paper describes an application of Bayesian programming to the control of an autonomous avatar in a multiplayer role-playing game (the example is based on World of Warcraft). We model a particular task, which consists of choosing what to do and which target to select in a situation where allies and foes are present. We explain the model in Bayesian programming and show how we could learn the conditional probabilities from data gathered during human-played sessions.

  5. Variations on Bayesian Prediction and Inference

    Science.gov (United States)

    2016-05-09

    There are a number of statistical inference problems that are not generally formulated via a full probability model... For the problem of inference about an unknown parameter, the Bayesian approach requires a full probability model/likelihood, which can be an obstacle

  6. Bayesian target tracking based on particle filter

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    For their ability to deal with nonlinear or non-Gaussian problems, particle filters have been studied by many researchers. Based on the particle filter, the extended Kalman filter (EKF) proposal function is applied to Bayesian target tracking. Novel techniques such as the Markov chain Monte Carlo (MCMC) move step and resampling are also introduced into Bayesian target tracking, and the simulation results confirm that the improved particle filter with these techniques outperforms the basic one.
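    A minimal bootstrap particle filter illustrates the propagate-weight-resample cycle mentioned above (this sketch uses the plain transition-prior proposal rather than the EKF proposal of the record, and the toy random-walk model is invented for illustration).

```python
import math
import random

random.seed(0)

# 1-D random-walk state with noisy observations: x_t = x_{t-1} + w, y_t = x_t + v.
Q, R = 0.1, 0.5          # process and observation noise variances
T, N = 50, 500           # time steps and number of particles

# Simulate a ground-truth trajectory and its observations.
truth, obs = [0.0], []
for _ in range(T):
    truth.append(truth[-1] + random.gauss(0.0, math.sqrt(Q)))
    obs.append(truth[-1] + random.gauss(0.0, math.sqrt(R)))

particles = [0.0] * N
estimates = []
for y in obs:
    # Propagate each particle with the transition prior (bootstrap proposal).
    particles = [p + random.gauss(0.0, math.sqrt(Q)) for p in particles]
    # Weight by the observation likelihood.
    weights = [math.exp(-0.5 * (y - p) ** 2 / R) for p in particles]
    total = sum(weights)
    weights = [w / total for w in weights]
    estimates.append(sum(w * p for w, p in zip(weights, particles)))
    # Multinomial resampling to fight weight degeneracy.
    particles = random.choices(particles, weights=weights, k=N)

rmse = math.sqrt(sum((e - t) ** 2 for e, t in zip(estimates, truth[1:])) / T)
print(round(rmse, 2))
```

    The filter's RMSE should come in well below the raw observation noise (sqrt(R) ≈ 0.71 here); better proposals such as the EKF proposal place particles closer to the observation before weighting, improving on this baseline.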

  7. MCMC for parameters estimation by bayesian approach

    International Nuclear Information System (INIS)

    Ait Saadi, H.; Ykhlef, F.; Guessoum, A.

    2011-01-01

    This article discusses parameter estimation for dynamic systems by a Bayesian approach associated with Markov chain Monte Carlo (MCMC) methods. The MCMC methods are powerful for approximating complex integrals, simulating joint distributions, and estimating marginal posterior distributions, or posterior means. The Metropolis-Hastings algorithm has been widely used in Bayesian inference to approximate posterior densities. Calibrating the proposal distribution is one of the main issues of MCMC simulation, in order to accelerate convergence.
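    The Metropolis-Hastings step at the heart of such schemes fits in a few lines; in this sketch the unnormalized target, the proposal scale, and the burn-in length are assumptions chosen for illustration. A symmetric random-walk proposal is accepted with probability min(1, π(θ')/π(θ)), so only ratios of the unnormalized posterior are needed.

```python
import math
import random

random.seed(0)

# Target: posterior of a normal mean, N(3, 0.5^2), known only up to a constant.
def log_post(theta):
    return -0.5 * ((theta - 3.0) / 0.5) ** 2

def metropolis_hastings(n=50000, step=1.0):
    theta, chain = 0.0, []
    for _ in range(n):
        prop = theta + random.gauss(0.0, step)        # symmetric proposal
        # Accept with probability min(1, target ratio).
        if math.log(random.random()) < log_post(prop) - log_post(theta):
            theta = prop
        chain.append(theta)
    return chain

chain = metropolis_hastings()
burned = chain[5000:]                                 # discard burn-in
print(round(sum(burned) / len(burned), 1))            # → 3.0
```

    The proposal scale (here 1.0) is the calibration knob the record refers to: too small and the chain crawls, too large and almost every proposal is rejected.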

  8. Bayesian Networks for Modeling Dredging Decisions

    Science.gov (United States)

    2011-10-01

    years, that algorithms have been developed to solve these problems efficiently. Most modern Bayesian network software uses junction tree (a.k.a. join... software was used to develop the network. This is by no means an exhaustive list of Bayesian network applications, but it is representative of recent...characteristic node (SCN), state-defining node (SDN), effect node (EFN), or value node. The five types of nodes can be described as follows:

  9. Fully probabilistic design of hierarchical Bayesian models

    Czech Academy of Sciences Publication Activity Database

    Quinn, A.; Kárný, Miroslav; Guy, Tatiana Valentine

    2016-01-01

    Roč. 369, č. 1 (2016), s. 532-547 ISSN 0020-0255 R&D Projects: GA ČR GA13-13502S Institutional support: RVO:67985556 Keywords : Fully probabilistic design * Ideal distribution * Minimum cross-entropy principle * Bayesian conditioning * Kullback-Leibler divergence * Bayesian nonparametric modelling Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 4.832, year: 2016 http://library.utia.cas.cz/separaty/2016/AS/karny-0463052.pdf

  10. A Bayesian Method for Weighted Sampling

    OpenAIRE

    Lo, Albert Y.

    1993-01-01

    Bayesian statistical inference for sampling from weighted distribution models is studied. Small-sample Bayesian bootstrap clone (BBC) approximations to the posterior distribution are discussed. A second-order property for the BBC in unweighted i.i.d. sampling is given. A consequence is that BBC approximations to a posterior distribution of the mean and to the sampling distribution of the sample average, can be made asymptotically accurate by a proper choice of the random variables that genera...

  11. Adaptive algebraic reconstruction technique

    International Nuclear Information System (INIS)

    Lu Wenkai; Yin Fangfang

    2004-01-01

    Algebraic reconstruction techniques (ART) are iterative procedures for reconstructing objects from their projections. It is proven that ART can be computationally efficient by carefully arranging the order in which the collected data are accessed during the reconstruction procedure and adaptively adjusting the relaxation parameters. In this paper, an adaptive algebraic reconstruction technique (AART), which adopts the same projection access scheme in multilevel scheme algebraic reconstruction technique (MLS-ART), is proposed. By introducing adaptive adjustment of the relaxation parameters during the reconstruction procedure, one-iteration AART can produce reconstructions with better quality, in comparison with one-iteration MLS-ART. Furthermore, AART outperforms MLS-ART with improved computational efficiency

  12. Philosophy and the practice of Bayesian statistics.

    Science.gov (United States)

    Gelman, Andrew; Shalizi, Cosma Rohilla

    2013-02-01

    A substantial school in the philosophy of science identifies Bayesian inference with inductive inference and even rationality as such, and seems to be strengthened by the rise and practical success of Bayesian statistics. We argue that the most successful forms of Bayesian statistics do not actually support that particular philosophy but rather accord much better with sophisticated forms of hypothetico-deductivism. We examine the actual role played by prior distributions in Bayesian models, and the crucial aspects of model checking and model revision, which fall outside the scope of Bayesian confirmation theory. We draw on the literature on the consistency of Bayesian updating and also on our experience of applied work in social science. Clarity about these matters should benefit not just philosophy of science, but also statistical practice. At best, the inductivist view has encouraged researchers to fit and compare models without checking them; at worst, theorists have actively discouraged practitioners from performing model checking because it does not fit into their framework. © 2012 The British Psychological Society.

  13. ABCtoolbox: a versatile toolkit for approximate Bayesian computations

    Directory of Open Access Journals (Sweden)

    Neuenschwander Samuel

    2010-03-01

    Full Text Available Abstract Background The estimation of demographic parameters from genetic data often requires the computation of likelihoods. However, the likelihood function is computationally intractable for many realistic evolutionary models, and the use of Bayesian inference has therefore been limited to very simple models. The situation changed recently with the advent of Approximate Bayesian Computation (ABC) algorithms allowing one to obtain parameter posterior distributions based on simulations not requiring likelihood computations. Results Here we present ABCtoolbox, a series of open source programs to perform Approximate Bayesian Computations (ABC). It implements various ABC algorithms including rejection sampling, MCMC without likelihood, a Particle-based sampler and ABC-GLM. ABCtoolbox is bundled with, but not limited to, a program that allows parameter inference in a population genetics context and the simultaneous use of different types of markers with different ploidy levels. In addition, ABCtoolbox can also interact with most simulation and summary statistics computation programs. The usability of ABCtoolbox is demonstrated by inferring the evolutionary history of two evolutionary lineages of Microtus arvalis. Using nuclear microsatellites and mitochondrial sequence data in the same estimation procedure enabled us to infer sex-specific population sizes and migration rates and to find that males show smaller population sizes but much higher levels of migration than females. Conclusion ABCtoolbox allows a user to perform all the necessary steps of a full ABC analysis, from sampling parameters from prior distributions, through data simulation, computation of summary statistics, estimation of posterior distributions, model choice, and validation of the estimation procedure, to visualization of the results.

  14. Aggregate Measures of Watershed Health from Reconstructed ...

    Science.gov (United States)

    Risk-based indices such as reliability, resilience, and vulnerability (R-R-V) have the potential to serve as watershed health assessment tools. Recent research has demonstrated the applicability of such indices for water quality (WQ) constituents such as total suspended solids and nutrients on an individual basis. However, the calculations can become tedious when time-series data for several WQ constituents have to be evaluated individually. Also, comparisons between locations with different sets of constituent data can prove difficult. In this study, data reconstruction using a relevance vector machine algorithm was combined with dimensionality reduction via variational Bayesian noisy principal component analysis to reconstruct and condense sparse multidimensional WQ data sets into a single time series. The methodology allows incorporation of uncertainty in both the reconstruction and dimensionality-reduction steps. The R-R-V values were calculated using the aggregate time series at multiple locations within two Indiana watersheds, which serve as motivating examples. Results showed that uncertainty present in the reconstructed WQ data set propagates to the aggregate time series and subsequently to the aggregate R-R-V values as well. Locations with different WQ constituents and different standards for impairment were successfully combined to provide aggregate measures of R-R-V values. Comparisons with individual constituent R-R-V values showed that v
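    The reliability-resilience-vulnerability indices discussed above have standard threshold-based definitions that fit in a few lines (the series and threshold below are invented, and this sketch omits the record's data-reconstruction and dimensionality-reduction steps).

```python
# R-R-V indices for a water-quality time series against an impairment
# threshold; values above the threshold count as failure states.
series = [8, 9, 12, 11, 9, 8, 13, 12, 9, 8, 8, 14, 9, 8]
threshold = 10.0

fail = [v > threshold for v in series]

# Reliability: fraction of time the system is in a satisfactory state.
reliability = 1.0 - sum(fail) / len(series)

# Resilience: probability that a failure step is followed by recovery.
recoveries = sum(1 for i in range(len(series) - 1) if fail[i] and not fail[i + 1])
resilience = recoveries / sum(fail)

# Vulnerability: mean exceedance magnitude during failure steps.
vulnerability = sum(v - threshold for v, f in zip(series, fail) if f) / sum(fail)

print(round(reliability, 2), round(resilience, 2), round(vulnerability, 2))
# → 0.64 0.6 2.4
```

    Applied to the reconstructed aggregate series, the same three numbers summarize watershed health at each location; propagating reconstruction uncertainty amounts to repeating this calculation over an ensemble of plausible series.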

  15. New method for determination of star formation history

    OpenAIRE

    Čeponis, Marius

    2017-01-01

    A New Method for Determination of Star Formation History. Without stars there would not be any life, or us. Almost all elements in our bodies are made in stars. Yet we still don't fully understand all the processes governing the formation and evolution of stellar systems. Their star formation histories greatly help in trying to understand these processes. In this work a new Bayesian method for determination of star formation history is proposed. This method uses photometric data of resolved stars a...

  16. EXONEST: The Bayesian Exoplanetary Explorer

    Directory of Open Access Journals (Sweden)

    Kevin H. Knuth

    2017-10-01

    Full Text Available The fields of astronomy and astrophysics are currently engaged in an unprecedented era of discovery as recent missions have revealed thousands of exoplanets orbiting other stars. While the Kepler Space Telescope mission has enabled most of these exoplanets to be detected by identifying transiting events, exoplanets often exhibit additional photometric effects that can be used to improve the characterization of exoplanets. The EXONEST Exoplanetary Explorer is a Bayesian exoplanet inference engine based on nested sampling and originally designed to analyze archived Kepler Space Telescope and CoRoT (Convection Rotation et Transits planétaires exoplanet mission data. We discuss the EXONEST software package and describe how it accommodates plug-and-play models of exoplanet-associated photometric effects for the purpose of exoplanet detection, characterization and scientific hypothesis testing. The current suite of models allows for both circular and eccentric orbits in conjunction with photometric effects, such as the primary transit and secondary eclipse, reflected light, thermal emissions, ellipsoidal variations, Doppler beaming and superrotation. We discuss our new efforts to expand the capabilities of the software to include more subtle photometric effects involving reflected and refracted light. We discuss the EXONEST inference engine design and introduce our plans to port the current MATLAB-based EXONEST software package over to the next generation Exoplanetary Explorer, which will be a Python-based open source project with the capability to employ third-party plug-and-play models of exoplanet-related photometric effects.

  17. Maximum entropy and Bayesian methods

    International Nuclear Information System (INIS)

    Smith, C.R.; Erickson, G.J.; Neudorfer, P.O.

    1992-01-01

    Bayesian probability theory and Maximum Entropy methods are at the core of a new view of scientific inference. These 'new' ideas, along with the revolution in computational methods afforded by modern computers allow astronomers, electrical engineers, image processors of any type, NMR chemists and physicists, and anyone at all who has to deal with incomplete and noisy data, to take advantage of methods that, in the past, have been applied only in some areas of theoretical physics. The title workshops have been the focus of a group of researchers from many different fields, and this diversity is evident in this book. There are tutorial and theoretical papers, and applications in a very wide variety of fields. Almost any instance of dealing with incomplete and noisy data can be usefully treated by these methods, and many areas of theoretical research are being enhanced by the thoughtful application of Bayes' theorem. Contributions contained in this volume present a state-of-the-art overview that will be influential and useful for many years to come

  18. BREAST RECONSTRUCTIONS AFTER BREAST CANCER TREATING

    Directory of Open Access Journals (Sweden)

    Erik Vrabič

    2018-02-01

    Full Text Available Background. Breasts have been an important symbol of physical beauty, femininity, motherhood and sexual desire throughout the history of mankind. Loss of the whole or part of the breast is a functional and aesthetic disturbance for a woman. It is understandable that a woman's concern over breast loss is as appropriate as another person's concern over the loss of a limb or other body part. Before the 1960s, breast reconstruction was considered a dangerous procedure and was almost prohibited. Considering the psychological importance of the breast in modern society, the possibility of breast reconstruction for a woman about to undergo a mastectomy is a comforting alternative. Breast reconstruction can be performed with autologous tissue (autologous reconstruction), with breast implants, or with a combination of both methods. For autologous reconstruction, local tissue (local flaps) or tissue from distant parts of the body (free vascular tissue transfer) can be used. In many cases of breast reconstruction with implants, tissue expansion must be performed first. Conclusions. Breast reconstruction has made great progress over the last three decades. Today almost every defect of the breast, and the entire breast, can be reconstructed. Breast reconstruction raises the quality of life of breast cancer patients. It is teamwork involving experts from many medical specialties. In Slovenia, breast reconstruction is offered to breast cancer patients in Ljubljana, where plastic surgeons from the Clinical Department for Plastic Surgery and Burns cooperate with oncologic surgeons. Ten years ago a similar cooperation between plastic surgeons and surgeons of the Centre for Breast Diseases was established in Maribor.

  19. Holocene flooding history of the Lower Tagus Valley (Portugal)

    NARCIS (Netherlands)

    Vis, G.-J.; Bohncke, S.J.P.; Schneider, H.; Kasse, C.; Coenraads-Nederveen, S.; Zuurbier, K.; Rozema, J.

    2010-01-01

    The present paper aims to reconstruct the Lower Tagus Valley flooding history for the last ca. 6500 a, to explore the suitability of pollen-based local vegetation development in supporting the reconstruction of flooding history, and to explain fluvial activity changes in terms of allogenic (climate,

  20. Authenticating History With Oral Narratives: The Example of Ekajuk ...

    African Journals Online (AJOL)

    It is generally accepted that oral narratives serve as a veritable means for historical reconstruction. This holds true, particularly in societies where written documents do not subsist. The Ekajuk community, though very warlike, is a relatively small community that lacks a written history. The attempt to reconstruct the history of ...

  1. Bohmian histories and decoherent histories

    International Nuclear Information System (INIS)

    Hartle, James B.

    2004-01-01

    The predictions of the Bohmian and the decoherent (or consistent) histories formulations of the quantum mechanics of a closed system are compared for histories--sequences of alternatives at a series of times. For certain kinds of histories, Bohmian mechanics and decoherent histories may both be formulated in the same mathematical framework within which they can be compared. In that framework, Bohmian mechanics and decoherent histories represent a given history by different operators. Their predictions for the probabilities of histories of a closed system therefore generally differ. However, in an idealized model of measurement, the predictions of Bohmian mechanics and decoherent histories coincide for the probabilities of records of measurement outcomes. The formulations are thus difficult to distinguish experimentally. They may differ in their accounts of the past history of the Universe in quantum cosmology

  2. Bayesian analogy with relational transformations.

    Science.gov (United States)

    Lu, Hongjing; Chen, Dawn; Holyoak, Keith J

    2012-07-01

    How can humans acquire relational representations that enable analogical inference and other forms of high-level reasoning? Using comparative relations as a model domain, we explore the possibility that bottom-up learning mechanisms applied to objects coded as feature vectors can yield representations of relations sufficient to solve analogy problems. We introduce Bayesian analogy with relational transformations (BART) and apply the model to the task of learning first-order comparative relations (e.g., larger, smaller, fiercer, meeker) from a set of animal pairs. Inputs are coded by vectors of continuous-valued features, based either on human magnitude ratings, normed feature ratings (De Deyne et al., 2008), or outputs of the topics model (Griffiths, Steyvers, & Tenenbaum, 2007). Bootstrapping from empirical priors, the model is able to induce first-order relations represented as probabilistic weight distributions, even when given positive examples only. These learned representations allow classification of novel instantiations of the relations and yield a symbolic distance effect of the sort obtained with both humans and other primates. BART then transforms its learned weight distributions by importance-guided mapping, thereby placing distinct dimensions into correspondence. These transformed representations allow BART to reliably solve 4-term analogies (e.g., larger:smaller::fiercer:meeker), a type of reasoning that is arguably specific to humans. Our results provide a proof-of-concept that structured analogies can be solved with representations induced from unstructured feature vectors by mechanisms that operate in a largely bottom-up fashion. We discuss potential implications for algorithmic and neural models of relational thinking, as well as for the evolution of abstract thought. Copyright 2012 APA, all rights reserved.

  3. Bayesian soft x-ray tomography and MHD mode analysis on HL-2A

    Science.gov (United States)

    Li, Dong; Liu, Yi; Svensson, J.; Liu, Y. Q.; Song, X. M.; Yu, L. M.; Mao, Rui; Fu, B. Z.; Deng, Wei; Yuan, B. S.; Ji, X. Q.; Xu, Yuan; Chen, Wei; Zhou, Yan; Yang, Q. W.; Duan, X. R.; Liu, Yong; HL-2A Team

    2016-03-01

    A Bayesian-based tomography method using so-called Gaussian processes (GPs) for the emission model has been applied to the soft x-ray (SXR) diagnostics on the HL-2A tokamak. To improve the accuracy of reconstructions, the standard GP is extended to a non-stationary version so that different smoothness between the plasma center and the edge can be taken into account in the algorithm. The uncertainty in the reconstruction arising from measurement errors and incapability can be fully analyzed by the usage of Bayesian probability theory. In this work, the SXR reconstructions by this non-stationary Gaussian processes tomography (NSGPT) method have been compared with the equilibrium magnetic flux surfaces, generally achieving a satisfactory agreement in terms of both shape and position. In addition, singular-value-decomposition (SVD) and Fast Fourier Transform (FFT) techniques have been applied for the analysis of SXR and magnetic diagnostics, in order to explore the spatial and temporal features of the saturated long-lived magnetohydrodynamic (MHD) instability induced by energetic particles during neutral beam injection (NBI) on HL-2A. The result shows that this ideal internal kink instability has a dominant m/n = 1/1 mode structure along with a harmonic m/n = 2/2, which are coupled near the q = 1 surface with a rotation frequency of 12 kHz.
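
As a rough illustration of the GP idea, the sketch below (Python, with an invented 1-D pixel grid and geometry matrix, not the HL-2A setup) computes the closed-form posterior mean of an emission profile under a non-stationary Gibbs covariance whose length scale shrinks toward the edge; the linear model y = A f + noise stands in for the line-integrated SXR chords.

```python
import numpy as np

def gp_tomography_posterior(A, y, coords, sigma_n=0.1):
    """Posterior mean of an emission profile f under a GP prior, given
    line-integral measurements y = A f + noise (hypothetical geometry).

    A      : (m, n) geometry matrix (chord weights through each pixel)
    y      : (m,)   measured line integrals
    coords : (n, d) pixel coordinates used to build the prior covariance
    """
    # Non-stationary (Gibbs) squared-exponential prior: the length scale
    # shrinks toward the edge, so the edge may vary more sharply than the core.
    r = np.linalg.norm(coords, axis=1)
    ell = 0.5 - 0.3 * r / (r.max() + 1e-12)
    d2 = ((coords[:, None, :] - coords[None, :, :]) ** 2).sum(-1)
    den = ell[:, None] ** 2 + ell[None, :] ** 2
    K = np.sqrt(2 * np.outer(ell, ell) / den) * np.exp(-d2 / den)
    # Gaussian likelihood => posterior mean in closed form.
    S = A @ K @ A.T + sigma_n**2 * np.eye(len(y))
    return K @ A.T @ np.linalg.solve(S, y)
```

The same linear-Gaussian structure also yields the posterior covariance, which is how the uncertainty analysis mentioned in the abstract is obtained.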

  4. History Matters

    Institute of Scientific and Technical Information of China (English)

    2017-01-01

    In 2002, she began working as a lecturer at Minzu University of China. Now, she teaches English, historical literature, ancient Chinese history, historical theory and method, ancient social history of China, ancient palace political history of China, and the history of the Sui and Tang dynasties and the Period of Five Dynasties.

  5. iSEDfit: Bayesian spectral energy distribution modeling of galaxies

    Science.gov (United States)

    Moustakas, John

    2017-08-01

    iSEDfit uses Bayesian inference to extract the physical properties of galaxies from their observed broadband photometric spectral energy distribution (SED). In its default mode, the inputs to iSEDfit are the measured photometry (fluxes and corresponding inverse variances) and a measurement of the galaxy redshift. Alternatively, iSEDfit can be used to estimate photometric redshifts from the input photometry alone. After the priors have been specified, iSEDfit calculates the marginalized posterior probability distributions for the physical parameters of interest, including the stellar mass, star-formation rate, dust content, star formation history, and stellar metallicity. iSEDfit also optionally computes K-corrections and produces multiple "quality assurance" (QA) plots at each stage of the modeling procedure to aid in the interpretation of the prior parameter choices and subsequent fitting results. The software is distributed as part of the impro IDL suite.

  6. Bayesians versus frequentists a philosophical debate on statistical reasoning

    CERN Document Server

    Vallverdú, Jordi

    2016-01-01

    This book analyzes the origins of statistical thinking as well as its related philosophical questions, such as causality, determinism or chance. Bayesian and frequentist approaches are subjected to a historical, cognitive and epistemological analysis, making it possible to not only compare the two competing theories, but to also find a potential solution. The work pursues a naturalistic approach, proceeding from the existence of numerosity in natural environments to the existence of contemporary formulas and methodologies to heuristic pragmatism, a concept introduced in the book’s final section. This monograph will be of interest to philosophers and historians of science and students in related fields. Despite the mathematical nature of the topic, no statistical background is required, making the book a valuable read for anyone interested in the history of statistics and human cognition.

  7. Re-telling, Re-evaluating and Re-constructing

    Directory of Open Access Journals (Sweden)

    Gorana Tolja

    2013-11-01

    Full Text Available 'Graphic History: Essays on Graphic Novels and/as History' (2012) is a collection of 14 unique essays, edited by scholar Richard Iadonisi, that explores a variety of complex issues within the graphic novel medium as a means of historical narration. The essays address the issues of accuracy of re-counting history, history as re-constructed, and the ethics surrounding historical narration.

  8. Objective Bayesianism and the Maximum Entropy Principle

    Directory of Open Access Journals (Sweden)

    Jon Williamson

    2013-09-01

    Full Text Available Objective Bayesian epistemology invokes three norms: the strengths of our beliefs should be probabilities; they should be calibrated to our evidence of physical probabilities; and they should otherwise equivocate sufficiently between the basic propositions that we can express. The three norms are sometimes explicated by appealing to the maximum entropy principle, which says that a belief function should be a probability function, from all those that are calibrated to evidence, that has maximum entropy. However, the three norms of objective Bayesianism are usually justified in different ways. In this paper, we show that the three norms can all be subsumed under a single justification in terms of minimising worst-case expected loss. This, in turn, is equivalent to maximising a generalised notion of entropy. We suggest that requiring language invariance, in addition to minimising worst-case expected loss, motivates maximisation of standard entropy as opposed to maximisation of other instances of generalised entropy. Our argument also provides a qualified justification for updating degrees of belief by Bayesian conditionalisation. However, conditional probabilities play a less central part in the objective Bayesian account than they do under the subjective view of Bayesianism, leading to a reduced role for Bayes’ Theorem.
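
The maximum entropy principle the paper appeals to can be made concrete with Jaynes' die example: among all distributions on {1,…,6} with a prescribed mean, the entropy maximizer is the exponential family p_i ∝ exp(λi), with λ chosen to hit the mean. A minimal sketch (the bisection bracket and tolerance are arbitrary choices):

```python
import math

def maxent_die(mean_target, faces=range(1, 7), tol=1e-10):
    """Maximum-entropy distribution over die faces with a fixed mean.
    The solution is p_i proportional to exp(lam * i); lam is found by
    bisection, since the mean is monotone increasing in lam."""
    faces = list(faces)

    def mean_for(lam):
        w = [math.exp(lam * f) for f in faces]
        z = sum(w)
        return sum(f * wi for f, wi in zip(faces, w)) / z

    lo, hi = -50.0, 50.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mean_for(mid) < mean_target:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    w = [math.exp(lam * f) for f in faces]
    z = sum(w)
    return [wi / z for wi in w]
```

With mean 3.5 the result is the uniform distribution (λ = 0), the equivocation norm in miniature; with mean 4.5 the weights tilt toward the high faces.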

  9. A default Bayesian hypothesis test for mediation.

    Science.gov (United States)

    Nuijten, Michèle B; Wetzels, Ruud; Matzke, Dora; Dolan, Conor V; Wagenmakers, Eric-Jan

    2015-03-01

    In order to quantify the relationship between multiple variables, researchers often carry out a mediation analysis. In such an analysis, a mediator (e.g., knowledge of a healthy diet) transmits the effect from an independent variable (e.g., classroom instruction on a healthy diet) to a dependent variable (e.g., consumption of fruits and vegetables). Almost all mediation analyses in psychology use frequentist estimation and hypothesis-testing techniques. A recent exception is Yuan and MacKinnon (Psychological Methods, 14, 301-322, 2009), who outlined a Bayesian parameter estimation procedure for mediation analysis. Here we complete the Bayesian alternative to frequentist mediation analysis by specifying a default Bayesian hypothesis test based on the Jeffreys-Zellner-Siow approach. We further extend this default Bayesian test by allowing a comparison to directional or one-sided alternatives, using Markov chain Monte Carlo techniques implemented in JAGS. All Bayesian tests are implemented in the R package BayesMed (Nuijten, Wetzels, Matzke, Dolan, & Wagenmakers, 2014).
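
The estimation side of such an analysis can be sketched as follows. This is not the JZS Bayes factor test of the paper, nor the BayesMed package; it is a minimal Monte Carlo approximation in which, under flat priors, each regression coefficient's posterior is approximated by a normal centered at its OLS estimate, and draws of the indirect effect a·b are formed by multiplication:

```python
import numpy as np

def ols(y, X):
    """OLS estimates and their covariance matrix."""
    XtX_inv = np.linalg.inv(X.T @ X)
    beta = XtX_inv @ X.T @ y
    resid = y - X @ beta
    sigma2 = resid @ resid / (len(y) - X.shape[1])
    return beta, sigma2 * XtX_inv

def mediation_posterior_ab(x, m, y, n_draws=20000, seed=0):
    """Approximate posterior draws of the indirect effect a*b.

    a is the effect of x on the mediator m; b is the effect of m on y
    controlling for x. Each coefficient's posterior is approximated by a
    normal around its OLS estimate (a sketch, not the default test itself)."""
    rng = np.random.default_rng(seed)
    ones = np.ones_like(x)
    ba, Va = ols(m, np.column_stack([ones, x]))      # m = i1 + a*x
    bb, Vb = ols(y, np.column_stack([ones, x, m]))   # y = i2 + c'*x + b*m
    a_draws = rng.normal(ba[1], np.sqrt(Va[1, 1]), n_draws)
    b_draws = rng.normal(bb[2], np.sqrt(Vb[2, 2]), n_draws)
    return a_draws * b_draws
```

A credible interval for a·b that excludes zero is then evidence of mediation; the paper's contribution is to replace this informal check with a proper Bayes factor.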

  10. Classifying emotion in Twitter using Bayesian network

    Science.gov (United States)

    Surya Asriadie, Muhammad; Syahrul Mubarok, Mohamad; Adiwijaya

    2018-03-01

    Language is used to express not only facts, but also emotions. Emotions are noticeable in behavior and even in the social media statuses written by a person. Analysis of emotions in text is done in a variety of media such as Twitter. This paper studies classification of emotions on Twitter using Bayesian networks because of their ability to model uncertainty and relationships between features. The result is two models based on Bayesian networks: Full Bayesian Network (FBN) and Bayesian Network with Mood Indicator (BNM). FBN is a massive Bayesian network where each word is treated as a node. The study shows that the method used to train FBN is not very effective at creating the best model and performs worse than Naive Bayes: the F1-score for FBN is 53.71%, while for Naive Bayes it is 54.07%. BNM is proposed as an alternative method which is based on an improvement of Multinomial Naive Bayes and has much lower computational complexity than FBN. Even though it is not better than FBN, the resulting model successfully improves the performance of Multinomial Naive Bayes: the F1-score for the Multinomial Naive Bayes model is 51.49%, while for BNM it is 52.14%.
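
The Multinomial Naive Bayes baseline that BNM improves on can be sketched in a few lines (a toy with Laplace smoothing; the example documents are invented, whereas the real study used tweets):

```python
import math
from collections import Counter

class MultinomialNB:
    """Multinomial Naive Bayes with add-one (Laplace) smoothing.
    Documents are bags of word tokens; unseen words are ignored at
    prediction time."""

    def fit(self, docs, labels):
        self.classes = sorted(set(labels))
        self.vocab = sorted({w for d in docs for w in d})
        self.prior = {c: math.log(labels.count(c) / len(labels))
                      for c in self.classes}
        counts = {c: Counter() for c in self.classes}
        for d, c in zip(docs, labels):
            counts[c].update(d)
        self.loglik = {}
        for c in self.classes:
            total = sum(counts[c].values()) + len(self.vocab)
            self.loglik[c] = {w: math.log((counts[c][w] + 1) / total)
                              for w in self.vocab}
        return self

    def predict(self, doc):
        def score(c):
            return self.prior[c] + sum(self.loglik[c][w]
                                       for w in doc if w in self.loglik[c])
        return max(self.classes, key=score)
```

FBN generalizes this by letting word nodes depend on each other rather than being conditionally independent given the class.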

  11. Reconstruction of CT images by the Bayes back projection method

    CERN Document Server

    Haruyama, M; Takase, M; Tobita, H

    2002-01-01

    In the course of research on quantitative assay for non-destructive measurement of radioactive waste, we have developed a unique program based on Bayesian theory for the reconstruction of transmission computed tomography (TCT) images. Reconstruction of cross-section images in CT usually employs the Filtered Back Projection method. The new image reconstruction program reported here is based on the Bayesian Back Projection method, and it iteratively improves the image at every step of the measurement. Namely, this method can promptly display a cross-section image corresponding to each angled projection obtained from every measurement. Hence, it is possible to observe an improved cross-section view reflecting each projection in almost real time. From the basic theory of the Bayesian Back Projection method, it can be applied not only to CT of the 1st, 2nd, and 3rd generations. This report deals with a reconstruction program of cross-section images in the CT of ...
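
A minimal sketch of the angle-by-angle refinement idea (not the authors' actual program): a multiplicative MLEM-style update applied one projection at a time, so that a usable cross-section estimate exists after every projection. The geometry matrices are supplied by the caller, and the structure is that of ordered-subsets EM with one angle per subset:

```python
import numpy as np

def bayes_back_projection(projections, A_per_angle, n_pix, n_sweeps=50):
    """Iteratively refine an image, one projection angle at a time.

    projections  : list of measured projection vectors, one per angle
    A_per_angle  : list of matching (n_rays, n_pix) geometry matrices
    n_pix        : number of image pixels
    Starts from a flat image (the prior guess) and applies a multiplicative
    correction back-projected from each angle in turn (a sketch)."""
    f = np.ones(n_pix)
    for _ in range(n_sweeps):
        for y, A in zip(projections, A_per_angle):
            est = A @ f                      # forward projection
            ratio = np.divide(y, est, out=np.ones_like(y), where=est > 0)
            norm = A.sum(axis=0)             # sensitivity of each pixel
            # multiplicative correction back-projected onto the image
            f = f * (A.T @ ratio) / np.where(norm > 0, norm, 1.0)
    return f
```

On a consistent toy problem (for example, the row and column sums of a 2×2 image) the loop reproduces both projections after a few sweeps, and intermediate images are available after every inner step, mirroring the prompt-display property described above.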

  12. The life history of Pseudometagea schwarzii, with a discussion of the evolution of endoparasitism and koinobiosis in Eucharitidae and Perilampidae (Chalcidoidea)

    Directory of Open Access Journals (Sweden)

    John Heraty

    2013-10-01

    Full Text Available The immature stages and behavior of Pseudometagea schwarzii (Ashmead (Hymenoptera: Eucharitidae: Eucharitini are described, and the presence of an endoparasitic planidium that undergoes growth-feeding in the larva of the host ant (Lasius neoniger Emery is confirmed. Bayesian inference and parsimony ancestral state reconstruction are used to map the evolution of endoparasitism across the eucharitid-perilampid clade. Endoparasitism is proposed to have evolved independently three times within Eucharitidae, including once in Pseudometagea Ashmead, and at least twice in Perilampus Latreille. Endoparasitism is independent as an evolutionary trait from other life history traits such as differences in growth and development of the first-instar larva, hypermetamorphic larval morphology, and other biological traits, including koinobiosis.

  13. Model Selection in Historical Research Using Approximate Bayesian Computation

    Science.gov (United States)

    Rubio-Campillo, Xavier

    2016-01-01

    Formal Models and History: Computational models are increasingly being used to study historical dynamics. This new trend, which could be named Model-Based History, makes use of recently published datasets and innovative quantitative methods to improve our understanding of past societies based on their written sources. The extensive use of formal models allows historians to re-evaluate hypotheses formulated decades ago and still subject to debate due to the lack of an adequate quantitative framework. The initiative has the potential to transform the discipline if it solves the challenges posed by the study of historical dynamics. These difficulties are based on the complexities of modelling social interaction, and the methodological issues raised by the evaluation of formal models against data with low sample size, high variance and strong fragmentation. Case Study: This work examines an alternate approach to this evaluation based on a Bayesian-inspired model selection method. The validity of the classical Lanchester’s laws of combat is examined against a dataset comprising over a thousand battles spanning 300 years. Four variations of the basic equations are discussed, including the three most common formulations (linear, squared, and logarithmic) and a new variant introducing fatigue. Approximate Bayesian Computation is then used to infer both parameter values and model selection via Bayes Factors. Impact: Results indicate decisive evidence favouring the new fatigue model. The interpretation of both parameter estimations and model selection provides new insights into the factors guiding the evolution of warfare. At a methodological level, the case study shows how model selection methods can be used to guide historical research through the comparison between existing hypotheses and empirical evidence. PMID:26730953
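
The core of ABC rejection can be sketched in a few lines. The toy below is invented (it is not the Lanchester battle dataset): each model's evidence is approximated by the acceptance rate of prior simulations whose summary statistic lands near the observation, and the ratio of two such rates approximates a Bayes factor under equal model priors.

```python
import numpy as np

def abc_acceptance_rate(observed, simulate, prior_sampler,
                        n_sims=5000, eps=0.5, seed=1):
    """ABC rejection: fraction of prior simulations whose summary statistic
    falls within eps of the observed one. As eps shrinks this approximates
    the model evidence up to a constant shared by all models."""
    rng = np.random.default_rng(seed)
    accepted = 0
    for _ in range(n_sims):
        theta = prior_sampler(rng)
        if abs(simulate(theta, rng) - observed) < eps:
            accepted += 1
    return accepted / n_sims

# Invented toy comparison: did an observed sample mean of 5.0 over 50 draws
# come from a Poisson model or a geometric model?
obs_mean = 5.0
ev_pois = abc_acceptance_rate(
    obs_mean,
    simulate=lambda lam, rng: rng.poisson(lam, 50).mean(),
    prior_sampler=lambda rng: rng.uniform(0.1, 10.0))
ev_geom = abc_acceptance_rate(
    obs_mean,
    simulate=lambda p, rng: rng.geometric(p, 50).mean(),
    prior_sampler=lambda rng: rng.uniform(0.05, 1.0))
bayes_factor = ev_pois / ev_geom
```

The paper applies the same logic with richer simulators (the four Lanchester variants) and multivariate summary statistics.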

  14. A Sparse Bayesian Imaging Technique for Efficient Recovery of Reservoir Channels With Time-Lapse Seismic Measurements

    KAUST Repository

    Sana, Furrukh

    2016-06-01

    Subsurface reservoir flow channels are characterized by high-permeability values and serve as preferred pathways for fluid propagation. Accurate estimation of their geophysical structures is thus of great importance for the oil industry. The ensemble Kalman filter (EnKF) is a widely used statistical technique for estimating subsurface reservoir model parameters. However, accurate reconstruction of the subsurface geological features with the EnKF is challenging because of the limited measurements available from the wells and the smoothing effects imposed by the ℓ2-norm nature of its update step. A new EnKF scheme based on sparse domain representation was introduced by Sana et al. (2015) to incorporate useful prior structural information in the estimation process for efficient recovery of subsurface channels. In this paper, we extend this work in two ways: 1) investigate the effects of incorporating time-lapse seismic data on the channel reconstruction; and 2) explore a Bayesian sparse reconstruction algorithm with the potential ability to reduce the computational requirements. Numerical results suggest that the performance of the new sparse Bayesian based EnKF scheme is enhanced with the availability of seismic measurements, leading to further improvement in the recovery of flow channels structures. The sparse Bayesian approach further provides a computationally efficient framework for enforcing a sparse solution, especially with the possibility of using high sparsity rates through the inclusion of seismic data.

  15. A Sparse Bayesian Imaging Technique for Efficient Recovery of Reservoir Channels With Time-Lapse Seismic Measurements

    KAUST Repository

    Sana, Furrukh; Ravanelli, Fabio; Al-Naffouri, Tareq Y.; Hoteit, Ibrahim

    2016-01-01

    Subsurface reservoir flow channels are characterized by high-permeability values and serve as preferred pathways for fluid propagation. Accurate estimation of their geophysical structures is thus of great importance for the oil industry. The ensemble Kalman filter (EnKF) is a widely used statistical technique for estimating subsurface reservoir model parameters. However, accurate reconstruction of the subsurface geological features with the EnKF is challenging because of the limited measurements available from the wells and the smoothing effects imposed by the ℓ2-norm nature of its update step. A new EnKF scheme based on sparse domain representation was introduced by Sana et al. (2015) to incorporate useful prior structural information in the estimation process for efficient recovery of subsurface channels. In this paper, we extend this work in two ways: 1) investigate the effects of incorporating time-lapse seismic data on the channel reconstruction; and 2) explore a Bayesian sparse reconstruction algorithm with the potential ability to reduce the computational requirements. Numerical results suggest that the performance of the new sparse Bayesian based EnKF scheme is enhanced with the availability of seismic measurements, leading to further improvement in the recovery of flow channels structures. The sparse Bayesian approach further provides a computationally efficient framework for enforcing a sparse solution, especially with the possibility of using high sparsity rates through the inclusion of seismic data.
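
The plain EnKF analysis step whose ℓ2 smoothing motivates the sparse variant can be sketched as follows (stochastic EnKF with perturbed observations; the dimensions and observation operator are invented, not a reservoir model):

```python
import numpy as np

def enkf_update(ensemble, H, d, obs_std, seed=0):
    """Stochastic EnKF analysis step with perturbed observations.

    ensemble : (n, N) matrix of N state vectors stored as columns
    H        : (p, n) observation operator
    d        : (p,)   observed data, with noise std obs_std
    Each member is nudged toward the observations through the Kalman gain
    built from the ensemble-estimated covariance (toy sketch)."""
    rng = np.random.default_rng(seed)
    _, N = ensemble.shape
    X = ensemble - ensemble.mean(axis=1, keepdims=True)
    C = X @ X.T / (N - 1)                            # sample covariance
    R = obs_std**2 * np.eye(len(d))
    K = C @ H.T @ np.linalg.inv(H @ C @ H.T + R)     # Kalman gain
    D = d[:, None] + obs_std * rng.standard_normal((len(d), N))
    return ensemble + K @ (D - H @ ensemble)
```

The sparse variant of the paper replaces this least-squares correction with an update carried out in a sparsifying transform domain, which is what preserves the sharp channel boundaries.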

  16. Histories electromagnetism

    International Nuclear Information System (INIS)

    Burch, Aidan

    2004-01-01

    Working within the HPO (History Projection Operator) Consistent Histories formalism, we follow the work of Savvidou on (scalar) field theory [J. Math. Phys. 43, 3053 (2002)] and that of Savvidou and Anastopoulos on (first-class) constrained systems [Class. Quantum Gravt. 17, 2463 (2000)] to write a histories theory (both classical and quantum) of Electromagnetism. We focus particularly on the foliation-dependence of the histories phase space/Hilbert space and the action thereon of the two Poincare groups that arise in histories field theory. We quantize in the spirit of the Dirac scheme for constrained systems

  17. Life histories in occupational therapy clinical practice.

    Science.gov (United States)

    Frank, G

    1996-04-01

    This article defines and compares several narrative methods used to describe and interpret patients' lives. The biographical methods presented are case histories, life-charts, life histories, life stories, assisted autobiography, hermeneutic case reconstruction, therapeutic emplotment, volitional narratives, and occupational storytelling and story making. Emphasis is placed on the clinician as a collaborator in and interpreter of the patient's life through ongoing interactions and dialogue.

  18. Breast reconstruction - natural tissue

    Science.gov (United States)

    ... flap; TRAM; Latissimus muscle flap with a breast implant; DIEP flap; DIEAP flap; Gluteal free flap; Transverse upper gracilis flap; TUG; Mastectomy - breast reconstruction with natural tissue; Breast cancer - breast reconstruction with natural tissue

  19. Breast reconstruction after mastectomy

    Directory of Open Access Journals (Sweden)

    Daniel eSchmauss

    2016-01-01

    Full Text Available Breast cancer is the leading cause of cancer death in women worldwide. Its surgical approach has become less and less mutilating in the last decades; however, the overall number of breast reconstructions has increased significantly lately. Nowadays breast reconstruction should be individualized as much as possible, first of all taking into consideration the oncological aspects of the tumor, neo-/adjuvant treatment and genetic predisposition, but also its timing (immediate versus delayed breast reconstruction), as well as the patient’s condition and wishes. This article gives an overview of the various possibilities of breast reconstruction, including implant- and expander-based reconstruction, flap-based reconstruction (vascularized autologous tissue), the combination of implant and flap, reconstruction using non-vascularized autologous fat, as well as refinement surgery after breast reconstruction.

  20. Empirical Bayesian inference and model uncertainty

    International Nuclear Information System (INIS)

    Poern, K.

    1994-01-01

    This paper presents a hierarchical or multistage empirical Bayesian approach for the estimation of uncertainty concerning the intensity of a homogeneous Poisson process. A class of contaminated gamma distributions is considered to describe the uncertainty concerning the intensity. These distributions in turn are defined through a set of secondary parameters, the knowledge of which is also described and updated via Bayes formula. This two-stage Bayesian approach is an example where the modeling uncertainty is treated in a comprehensive way. Each contaminated gamma distributions, represented by a point in the 3D space of secondary parameters, can be considered as a specific model of the uncertainty about the Poisson intensity. Then, by the empirical Bayesian method each individual model is assigned a posterior probability
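
The bottom stage of such a hierarchy is the conjugate gamma-Poisson update, and the second stage can be mimicked by weighting a small grid of hyperparameter "models" by their negative-binomial marginal likelihood. A sketch under simplifying assumptions (the grid is invented for illustration, and a plain gamma stands in for the paper's contaminated gamma class):

```python
import math

def gamma_poisson_update(alpha, beta, counts, exposure):
    """Stage one: a Gamma(alpha, beta) prior on a Poisson intensity, updated
    by `counts` events over `exposure` time, gives a
    Gamma(alpha + counts, beta + exposure) posterior (rate parameterisation)."""
    return alpha + counts, beta + exposure

def log_marginal(alpha, beta, counts, exposure):
    """log p(counts | alpha, beta): the negative-binomial marginal likelihood
    of the gamma-Poisson model, used to score each hyperparameter model."""
    return (math.lgamma(alpha + counts) - math.lgamma(alpha)
            - math.lgamma(counts + 1)
            + alpha * math.log(beta / (beta + exposure))
            + counts * math.log(exposure / (beta + exposure)))

def empirical_bayes_weights(models, counts, exposure):
    """Stage two: posterior probability of each (alpha, beta) model under
    equal model priors, via Bayes' formula on the marginal likelihoods."""
    logs = [log_marginal(a, b, counts, exposure) for a, b in models]
    mx = max(logs)                       # stabilise the exponentials
    w = [math.exp(l - mx) for l in logs]
    s = sum(w)
    return [wi / s for wi in w]
```

Each point on the hyperparameter grid plays the role of one "specific model of the uncertainty about the Poisson intensity", exactly as in the abstract, and the weights are its posterior model probabilities.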

  1. Bayesian Inference Methods for Sparse Channel Estimation

    DEFF Research Database (Denmark)

    Pedersen, Niels Lovmand

    2013-01-01

    This thesis deals with sparse Bayesian learning (SBL) with application to radio channel estimation. As opposed to the classical approach for sparse signal representation, we focus on the problem of inferring complex signals. Our investigations within SBL constitute the basis for the development...... of Bayesian inference algorithms for sparse channel estimation. Sparse inference methods aim at finding the sparse representation of a signal given in some overcomplete dictionary of basis vectors. Within this context, one of our main contributions to the field of SBL is a hierarchical representation...... analysis of the complex prior representation, where we show that the ability to induce sparse estimates of a given prior heavily depends on the inference method used and, interestingly, whether real or complex variables are inferred. We also show that the Bayesian estimators derived from the proposed...

  2. Bayesian Methods for Radiation Detection and Dosimetry

    CERN Document Server

    Groer, Peter G

    2002-01-01

    We performed work in three areas: radiation detection, external and internal radiation dosimetry. In radiation detection we developed Bayesian techniques to estimate the net activity of high and low activity radioactive samples. These techniques have the advantage that the remaining uncertainty about the net activity is described by probability densities. Graphs of the densities show the uncertainty in pictorial form. Figure 1 below demonstrates this point. We applied stochastic processes for a method to obtain Bayesian estimates of 222Rn-daughter products from observed counting rates. In external radiation dosimetry we studied and developed Bayesian methods to estimate radiation doses to an individual with radiation induced chromosome aberrations. We analyzed chromosome aberrations after exposure to gammas and neutrons and developed a method for dose-estimation after criticality accidents. The research in internal radiation dosimetry focused on parameter estimation for compartmental models from observed comp...

  3. Bayesian estimation of dose rate effectiveness

    International Nuclear Information System (INIS)

    Arnish, J.J.; Groer, P.G.

    2000-01-01

    A Bayesian statistical method was used to quantify the effectiveness of high dose rate 137Cs gamma radiation at inducing fatal mammary tumours and increasing the overall mortality rate in BALB/c female mice. The Bayesian approach considers both the temporal and dose dependence of radiation carcinogenesis and total mortality. This paper provides the first direct estimation of dose rate effectiveness using Bayesian statistics. This statistical approach provides a quantitative description of the uncertainty of the factor characterising the dose rate in terms of a probability density function. The results show that a fixed dose from 137Cs gamma radiation delivered at a high dose rate is more effective at inducing fatal mammary tumours and increasing the overall mortality rate in BALB/c female mice than the same dose delivered at a low dose rate. (author)

  4. A nonparametric Bayesian approach for genetic evaluation in ...

    African Journals Online (AJOL)

    South African Journal of Animal Science ... the Bayesian and Classical models, a Bayesian procedure is provided which allows these random ... data from the Elsenburg Dormer sheep stud and data from a simulation experiment are utilized.

  5. Bayesian disease mapping: hierarchical modeling in spatial epidemiology

    National Research Council Canada - National Science Library

    Lawson, Andrew

    2013-01-01

    .... Exploring these new developments, Bayesian Disease Mapping: Hierarchical Modeling in Spatial Epidemiology, Second Edition provides an up-to-date, cohesive account of the full range of Bayesian disease mapping methods and applications...

  6. The Bayesian Approach to Association

    Science.gov (United States)

    Arora, N. S.

    2017-12-01

    The Bayesian approach to Association focuses mainly on quantifying the physics of the domain. In the case of seismic association, for instance, let X be the set of all significant events (above some threshold) and their attributes, such as location, time, and magnitude; Y1 be the set of detections that are caused by significant events and their attributes, such as seismic phase, arrival time, amplitude etc.; Y2 be the set of detections that are not caused by significant events; and finally Y be the set of observed detections. We would now define the joint distribution P(X, Y1, Y2, Y) = P(X) P(Y1 | X) P(Y2) I(Y = Y1 + Y2), where the last term simply states that Y1 and Y2 are a partitioning of Y. Given the above joint distribution, the inference problem is simply to find the X, Y1, and Y2 that maximize the posterior probability P(X, Y1, Y2 | Y), which reduces to maximizing P(X) P(Y1 | X) P(Y2) I(Y = Y1 + Y2). In this expression P(X) captures our prior belief about event locations. P(Y1 | X) captures notions of travel time and residual error distributions, as well as detection and mis-detection probabilities, while P(Y2) captures the false detection rate of our seismic network. The elegance of this approach is that all of the assumptions are stated clearly in the model for P(X), P(Y1 | X) and P(Y2); the implementation of the inference is merely a by-product of this model. In contrast, some of the other methods such as GA hide a number of assumptions in the implementation details of the inference, such as the so-called "driver cells." The other important aspect of this approach is that all seismic knowledge, including knowledge from other domains such as infrasound and hydroacoustics, can be included in the same model, so we don't need to separately account for misdetections or merge seismic and infrasound events as a separate step.
Finally, it should be noted that the objective of automatic association is to simplify the job of humans who are publishing seismic bulletins based on this
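
The maximization of P(X) P(Y1 | X) P(Y2) can be illustrated on a toy single-event problem: enumerate assignments of detections to {event, noise} and keep the highest-scoring one. The Gaussian arrival-time model, the uniform false-alarm density, and all parameter values below are invented for illustration:

```python
import itertools
import math

def best_association(det_times, event_time, travel_time,
                     arrival_std=1.0, window=100.0):
    """Return the assignment of detections to event (1) or noise (0) that
    maximizes the log score -- a toy version of maximizing
    P(X) P(Y1|X) P(Y2) for a single hypothesised event."""
    mu = event_time + travel_time                    # predicted arrival time
    log_norm = math.log(arrival_std * math.sqrt(2 * math.pi))

    def log_event(t):                                # Gaussian arrival model
        return -0.5 * ((t - mu) / arrival_std) ** 2 - log_norm

    def log_noise(_t):                               # false alarms uniform on window
        return math.log(1.0 / window)

    best, best_score = None, -math.inf
    for assign in itertools.product([0, 1], repeat=len(det_times)):
        score = sum(log_event(t) if a else log_noise(t)
                    for a, t in zip(assign, det_times))
        if score > best_score:
            best, best_score = assign, score
    return best
```

Detections near the predicted arrival time out-score the flat false-alarm density and are attached to the event; the rest fall to Y2. Real systems replace the enumeration with search over many candidate events.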

  7. A population-based Bayesian approach to the minimal model of glucose and insulin homeostasis

    DEFF Research Database (Denmark)

    Andersen, Kim Emil; Højbjerre, Malene

    2005-01-01

    ill-posed estimation problem, where the reconstruction most often has been done by non-linear least squares techniques separately for each entity. The minimal model was originally specified for a single individual and does not combine several individuals with the advantage of estimating the metabolic portrait...... to a population-based model. The estimation of the parameters is efficiently implemented in a Bayesian approach where posterior inference is made through the use of Markov chain Monte Carlo techniques. Hereby we obtain a powerful and flexible modelling framework for regularizing the ill-posed estimation problem...

  8. Bayesian uncertainty analyses of probabilistic risk models

    International Nuclear Information System (INIS)

    Pulkkinen, U.

    1989-01-01

    Applications of Bayesian principles to uncertainty analyses are discussed in the paper. A short review of the most important uncertainties and their causes is provided. An application of the principle of maximum entropy to the determination of Bayesian prior distributions is described. An approach based on so-called probabilistic structures is presented in order to develop a method for the quantitative evaluation of modelling uncertainties. The method is applied to a small example case. Ideas for application areas of the proposed method are discussed

  9. Justifying Objective Bayesianism on Predicate Languages

    Directory of Open Access Journals (Sweden)

    Jürgen Landes

    2015-04-01

    Full Text Available Objective Bayesianism says that the strengths of one’s beliefs ought to be probabilities, calibrated to physical probabilities insofar as one has evidence of them, and otherwise sufficiently equivocal. These norms of belief are often explicated using the maximum entropy principle. In this paper we investigate the extent to which one can provide a unified justification of the objective Bayesian norms in the case in which the background language is a first-order predicate language, with a view to applying the resulting formalism to inductive logic. We show that the maximum entropy principle can be motivated largely in terms of minimising worst-case expected loss.

  10. Motion Learning Based on Bayesian Program Learning

    Directory of Open Access Journals (Sweden)

    Cheng Meng-Zhen

    2017-01-01

    Full Text Available The concept of the virtual human has been highly anticipated since the 1980s. Using computer technology, human motion simulation can generate visual effects authentic enough to fool the human eye. Bayesian Program Learning trains on one or a few motion sequences and generates new motion data by decomposing and recombining them, and the generated motion is more realistic and natural than that of traditional methods. In this paper, motion learning based on Bayesian Program Learning allows us to quickly generate new motion data, reduce workload, improve work efficiency, reduce the cost of motion capture, and improve the reusability of data.

  11. Nonparametric Bayesian Modeling of Complex Networks

    DEFF Research Database (Denmark)

    Schmidt, Mikkel Nørgaard; Mørup, Morten

    2013-01-01

    an infinite mixture model as running example, we go through the steps of deriving the model as an infinite limit of a finite parametric model, inferring the model parameters by Markov chain Monte Carlo, and checking the model's fit and predictive performance. We explain how advanced nonparametric models......Modeling structure in complex networks using Bayesian nonparametrics makes it possible to specify flexible model structures and infer the adequate model complexity from the observed data. This article provides a gentle introduction to nonparametric Bayesian modeling of complex networks: Using...

  12. Length Scales in Bayesian Automatic Adaptive Quadrature

    Directory of Open Access Journals (Sweden)

    Adam Gh.

    2016-01-01

    Full Text Available Two conceptual developments in the Bayesian automatic adaptive quadrature approach to the numerical solution of one-dimensional Riemann integrals [Gh. Adam, S. Adam, Springer LNCS 7125, 1–16 (2012)] are reported. First, it is shown that the numerical quadrature which avoids overcomputing and minimizes the hidden floating-point loss of precision asks for the consideration of three classes of integration domain lengths endowed with specific quadrature sums: microscopic (trapezoidal rule), mesoscopic (Simpson rule), and macroscopic (quadrature sums of high algebraic degrees of precision). Second, sensitive diagnostic tools for the Bayesian inference on macroscopic ranges, coming from the use of Clenshaw-Curtis quadrature, are derived.
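The three length classes can be sketched as a dispatch over the interval length. The thresholds below are invented constants (the paper's classes are defined by floating-point considerations, not fixed cutoffs), and a 5-point Gauss-Legendre rule stands in for the high-degree macroscopic quadrature sums:

```python
import math

def trapezoid(f, a, b):
    # Microscopic ranges: 2-point trapezoidal rule
    return 0.5 * (b - a) * (f(a) + f(b))

def simpson(f, a, b):
    # Mesoscopic ranges: 3-point Simpson rule
    m = 0.5 * (a + b)
    return (b - a) / 6.0 * (f(a) + 4.0 * f(m) + f(b))

def gauss5(f, a, b):
    # Macroscopic ranges: 5-point Gauss-Legendre rule (degree of precision 9)
    nodes = [0.0, 0.5384693101056831, -0.5384693101056831,
             0.9061798459386640, -0.9061798459386640]
    weights = [0.5688888888888889, 0.4786286704993665, 0.4786286704993665,
               0.2369268850561891, 0.2369268850561891]
    c, h = 0.5 * (a + b), 0.5 * (b - a)
    return h * sum(w * f(c + h * x) for x, w in zip(nodes, weights))

def quadrature(f, a, b, micro=1e-3, meso=1e-1):
    # Hypothetical thresholds separating the three length classes.
    length = b - a
    if length <= micro:
        return trapezoid(f, a, b)
    if length <= meso:
        return simpson(f, a, b)
    return gauss5(f, a, b)
```

On a macroscopic range such as [0, π] the Gauss rule integrates sin(x) to about seven digits, while on microscopic ranges the cheap trapezoidal rule already exhausts the attainable floating-point accuracy.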

  13. Bayesian parameter estimation in probabilistic risk assessment

    International Nuclear Information System (INIS)

    Siu, Nathan O.; Kelly, Dana L.

    1998-01-01

    Bayesian statistical methods are widely used in probabilistic risk assessment (PRA) because of their ability to provide useful estimates of model parameters when data are sparse and because the subjective probability framework, from which these methods are derived, is a natural framework to address the decision problems motivating PRA. This paper presents a tutorial on Bayesian parameter estimation especially relevant to PRA. It summarizes the philosophy behind these methods, approaches for constructing likelihood functions and prior distributions, some simple but realistic examples, and a variety of cautions and lessons regarding practical applications. References are also provided for more in-depth coverage of various topics

  14. Bayesian estimation and tracking a practical guide

    CERN Document Server

    Haug, Anton J

    2012-01-01

    A practical approach to estimating and tracking dynamic systems in real-world applications. Much of the literature on performing estimation for non-Gaussian systems is short on practical methodology, while Gaussian methods often lack a cohesive derivation. Bayesian Estimation and Tracking addresses the gap in the field on both accounts, providing readers with a comprehensive overview of methods for estimating both linear and nonlinear dynamic systems driven by Gaussian and non-Gaussian noises. Featuring a unified approach to Bayesian estimation and tracking, the book emphasizes the derivation

  15. Accelerated median root prior reconstruction for pinhole single-photon emission tomography (SPET)

    Energy Technology Data Exchange (ETDEWEB)

    Sohlberg, Antti [Department of Clinical Physiology and Nuclear Medicine, Kuopio University Hospital, PO Box 1777 FIN-70211, Kuopio (Finland); Ruotsalainen, Ulla [Institute of Signal Processing, DMI, Tampere University of Technology, PO Box 553 FIN-33101, Tampere (Finland); Watabe, Hiroshi [National Cardiovascular Center Research Institute, 5-7-1 Fujisihro-dai, Suita City, Osaka 565-8565 (Japan); Iida, Hidehiro [National Cardiovascular Center Research Institute, 5-7-1 Fujisihro-dai, Suita City, Osaka 565-8565 (Japan); Kuikka, Jyrki T [Department of Clinical Physiology and Nuclear Medicine, Kuopio University Hospital, PO Box 1777 FIN-70211, Kuopio (Finland)

    2003-07-07

    Pinhole collimation can be used to improve spatial resolution in SPET. However, the resolution improvement is achieved at the cost of reduced sensitivity, which leads to projection images with poor statistics. Images reconstructed from these projections using the maximum likelihood expectation maximization (ML-EM) algorithm, which has been used to reduce the artefacts generated by the filtered backprojection (FBP) based reconstruction, suffer from a noise/bias trade-off: noise contaminates the images at high iteration numbers, whereas early abortion of the algorithm produces images that are excessively smooth and biased towards the initial estimate of the algorithm. To limit the noise accumulation we propose the use of the pinhole median root prior (PH-MRP) reconstruction algorithm. MRP is a Bayesian reconstruction method that has already been used in PET imaging and shown to possess good noise reduction and edge preservation properties. In this study the PH-MRP algorithm was accelerated with the ordered subsets (OS) procedure and compared to the FBP, OS-EM and conventional Bayesian reconstruction methods in terms of noise reduction, quantitative accuracy, edge preservation and visual quality. The results showed that the accelerated PH-MRP algorithm was very robust. It provided visually pleasing images with lower noise level than the FBP or OS-EM and with smaller bias and sharper edges than the conventional Bayesian methods.
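For orientation, the ML-EM iteration with a one-step-late median root prior can be sketched on a 1-D toy problem. The system matrix, penalty weight β, and median window below are placeholders (real pinhole SPET uses a 2-D/3-D projection geometry and a spatial neighbourhood median), but the two steps per iteration, an EM update followed by division by the MRP penalty term, mirror the structure of the method:

```python
import numpy as np

def median_filter_1d(x, w=3):
    # Local median over a sliding window: a 1-D stand-in for the
    # spatial neighbourhood median used in image reconstruction.
    pad = w // 2
    xp = np.pad(x, pad, mode='edge')
    return np.array([np.median(xp[i:i + w]) for i in range(len(x))])

def ml_em_mrp(y, A, n_iter=50, beta=0.3):
    """ML-EM with a one-step-late median root prior (sketch)."""
    n = A.shape[1]
    lam = np.ones(n)                  # flat initial estimate
    sens = A.sum(axis=0)              # sensitivity image A^T 1
    for _ in range(n_iter):
        proj = A @ lam                # forward projection
        ratio = y / np.maximum(proj, 1e-12)
        lam = lam / sens * (A.T @ ratio)          # ML-EM update
        med = median_filter_1d(lam)               # MRP reference image
        lam = lam / (1.0 + beta * (lam - med) / np.maximum(med, 1e-12))
    return lam
```

Because the penalty vanishes wherever a pixel equals its local median, MRP suppresses isolated noise spikes while leaving monotonic edges (which survive median filtering) essentially untouched; the OS acceleration in the paper simply applies the EM update over subsets of projections.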

  16. Ancestral sequence reconstruction in primate mitochondrial DNA: compositional bias and effect on functional inference.

    Science.gov (United States)

    Krishnan, Neeraja M; Seligmann, Hervé; Stewart, Caro-Beth; De Koning, A P Jason; Pollock, David D

    2004-10-01

    Reconstruction of ancestral DNA and amino acid sequences is an important means of inferring information about past evolutionary events. Such reconstructions suggest changes in molecular function and evolutionary processes over the course of evolution and are used to infer adaptation and convergence. Maximum likelihood (ML) is generally thought to provide relatively accurate reconstructed sequences compared to parsimony, but both methods lead to the inference of multiple directional changes in nucleotide frequencies in primate mitochondrial DNA (mtDNA). To better understand this surprising result, as well as to better understand how parsimony and ML differ, we constructed a series of computationally simple "conditional pathway" methods that differed in the number of substitutions allowed per site along each branch, and we also evaluated the entire Bayesian posterior frequency distribution of reconstructed ancestral states. We analyzed primate mitochondrial cytochrome b (Cyt-b) and cytochrome oxidase subunit I (COI) genes and found that ML reconstructs ancestral frequencies that are often more different from tip sequences than are parsimony reconstructions. In contrast, frequency reconstructions based on the posterior ensemble more closely resemble extant nucleotide frequencies. Simulations indicate that these differences in ancestral sequence inference are probably due to deterministic bias caused by high uncertainty in the optimization-based ancestral reconstruction methods (parsimony, ML, Bayesian maximum a posteriori). In contrast, ancestral nucleotide frequencies based on an average of the Bayesian set of credible ancestral sequences are much less biased. The methods involving simpler conditional pathway calculations have slightly reduced likelihood values compared to full likelihood calculations, but they can provide fairly unbiased nucleotide reconstructions and may be useful in more complex phylogenetic analyses than considered here due to their speed and

  17. Rethinking the history of common walnut (Juglans regia L.) in Europe: Its origins and human interactions.

    Science.gov (United States)

    Pollegioni, Paola; Woeste, Keith; Chiocchini, Francesca; Del Lungo, Stefano; Ciolfi, Marco; Olimpieri, Irene; Tortolano, Virginia; Clark, Jo; Hemery, Gabriel E; Mapelli, Sergio; Malvolti, Maria Emilia

    2017-01-01

    Common walnut (Juglans regia L.) is an economically important species cultivated worldwide for its high-quality wood and nuts. It is generally accepted that after the last glaciation J. regia survived and grew in almost completely isolated stands in Asia, and that ancient humans dispersed walnuts across Asia and into new habitats via trade and cultural expansion. The history of walnut in Europe is a matter of debate, however. In this study, we estimated the genetic diversity and structure of 91 Eurasian walnut populations using 14 neutral microsatellites. By integrating fossil pollen, cultural, and historical data with population genetics and approximate Bayesian analysis, we reconstructed the demographic history of walnut and its routes of dispersal across Europe. The genetic data confirmed the presence of walnut in glacial refugia in the Balkans and western Europe. We conclude that human-mediated admixture between Anatolian and Balkan walnut germplasm started in the Early Bronze Age, and between western Europe and the Balkans in eastern Europe during the Roman Empire. A population size expansion and subsequent decline in northeastern and western Europe was detected in the last five centuries. The present-day distribution of walnut in Europe resulted from the combined effects of expansion/contraction from multiple refugia after the Last Glacial Maximum and its human exploitation over the last 5,000 years.

  18. The genetic diversity and evolutionary history of hepatitis C virus in Vietnam.

    Science.gov (United States)

    Li, Chunhua; Yuan, Manqiong; Lu, Ling; Lu, Teng; Xia, Wenjie; Pham, Van H; Vo, An X D; Nguyen, Mindie H; Abe, Kenji

    2014-11-01

    Vietnam has a unique history of association with foreign countries, which may have resulted in multiple introductions of alien HCV strains to mix with the indigenous ones. In this study, we characterized the HCV sequences in Core-E1 and NS5B regions from 236 Vietnamese individuals. We identified multiple HCV lineages; 6a, 6e, 6h, 6k, 6l, 6o, 6p, and two novel variants may represent the indigenous strains; 1a was probably introduced from the US; 1b and 2a possibly originated in East Asia; while 2i, 2j, and 2m were likely brought by French explorers. We inferred the evolutionary history for four major subtypes: 1a, 1b, 6a, and 6e. The obtained Bayesian Skyline Plots (BSPs) consistently showed rapid HCV population growth from 1955 to 1963 until 1984 or after, corresponding to the era of the Vietnam War. We also estimated HCV growth rates and reconstructed phylogeographic trees for comparing subtypes 1a, 1b, and HCV-2. Copyright © 2014 Elsevier Inc. All rights reserved.

  19. High migration rates shape the postglacial history of amphi-Atlantic bryophytes.

    Science.gov (United States)

    Désamoré, Aurélie; Patiño, Jairo; Mardulyn, Patrick; Mcdaniel, Stuart F; Zanatta, Florian; Laenen, Benjamin; Vanderpoorten, Alain

    2016-11-01

    Paleontological evidence and current patterns of angiosperm species richness suggest that European biota experienced more severe bottlenecks than North American ones during the last glacial maximum. How well this pattern fits other plant species is less clear. Bryophytes offer a unique opportunity to contrast the impact of the last glacial maximum in North America and Europe because about 60% of the European bryoflora is shared with North America. Here, we use population genetic analyses based on approximate Bayesian computation on eight amphi-Atlantic species to test the hypothesis that North American populations were less impacted by the last glacial maximum, exhibiting higher levels of genetic diversity than European ones and ultimately serving as a refugium for the postglacial recolonization of Europe. In contrast with this hypothesis, the best-fit demographic model involved similar patterns of population size contractions, comparable levels of genetic diversity and balanced migration rates between European and North American populations. Our results thus suggest that bryophytes have experienced comparable demographic glacial histories on both sides of the Atlantic. Although a weak but significant genetic structure was systematically recovered between European and North American populations, evidence for migration from and towards both continents suggests that amphi-Atlantic bryophyte populations may function as a metapopulation network. Reconstructing the biogeographic history of either North American or European bryophyte populations therefore requires a large, trans-Atlantic geographic framework. © 2016 John Wiley & Sons Ltd.

  20. Morphological homoplasy, life history evolution, and historical biogeography of plethodontid salamanders inferred from complete mitochondrial genomes

    Energy Technology Data Exchange (ETDEWEB)

    Mueller, Rachel Lockridge; Macey, J. Robert; Jaekel, Martin; Wake, David B.; Boore, Jeffrey L.

    2004-08-01

    The evolutionary history of the largest salamander family (Plethodontidae) is characterized by extreme morphological homoplasy. Analysis of the mechanisms generating such homoplasy requires an independent, molecular phylogeny. To this end, we sequenced 24 complete mitochondrial genomes (22 plethodontids and two outgroup taxa), added data for three species from GenBank, and performed partitioned and unpartitioned Bayesian, ML, and MP phylogenetic analyses. We explored four dataset partitioning strategies to account for evolutionary process heterogeneity among genes and codon positions, all of which yielded increased model likelihoods and decreased numbers of supported nodes in the topologies (PP > 0.95) relative to the unpartitioned analysis. Our phylogenetic analyses yielded congruent trees that contrast with the traditional morphology-based taxonomy; the monophyly of three out of four major groups is rejected. Reanalysis of current hypotheses in light of these new evolutionary relationships suggests that (1) a larval life history stage re-evolved from a direct-developing ancestor multiple times, (2) there is no phylogenetic support for the "Out of Appalachia" hypothesis of plethodontid origins, and (3) novel scenarios must be reconstructed for the convergent evolution of projectile tongues, reduction in toe number, and specialization for defensive tail loss. Some of these novel scenarios imply morphological transformation series that proceed in the opposite direction than was previously thought. In addition, they suggest surprising evolutionary lability in traits previously interpreted to be conservative.

  1. Accurate phylogenetic tree reconstruction from quartets: a heuristic approach.

    Science.gov (United States)

    Reaz, Rezwana; Bayzid, Md Shamsuzzoha; Rahman, M Sohel

    2014-01-01

    Supertree methods construct trees on a set of taxa (species) combining many smaller trees on the overlapping subsets of the entire set of taxa. A 'quartet' is an unrooted tree over 4 taxa, hence the quartet-based supertree methods combine many 4-taxon unrooted trees into a single and coherent tree over the complete set of taxa. Quartet-based phylogeny reconstruction methods have been receiving considerable attention in recent years. An accurate and efficient quartet-based method might be competitive with the current best phylogenetic tree reconstruction methods (such as maximum likelihood or Bayesian MCMC analyses), without being as computationally intensive. In this paper, we present a novel and highly accurate quartet-based phylogenetic tree reconstruction method. We performed an extensive experimental study to evaluate the accuracy and scalability of our approach on both simulated and biological datasets.

  2. A Fast Iterative Bayesian Inference Algorithm for Sparse Channel Estimation

    DEFF Research Database (Denmark)

    Pedersen, Niels Lovmand; Manchón, Carles Navarro; Fleury, Bernard Henri

    2013-01-01

    representation of the Bessel K probability density function; a highly efficient, fast iterative Bayesian inference method is then applied to the proposed model. The resulting estimator outperforms other state-of-the-art Bayesian and non-Bayesian estimators, either by yielding lower mean squared estimation error...

  3. A Gentle Introduction to Bayesian Analysis : Applications to Developmental Research

    NARCIS (Netherlands)

    Van de Schoot, Rens; Kaplan, David; Denissen, Jaap; Asendorpf, Jens B.; Neyer, Franz J.; van Aken, Marcel A G

    2014-01-01

    Bayesian statistical methods are becoming ever more popular in applied and fundamental research. In this study a gentle introduction to Bayesian analysis is provided. It is shown under what circumstances it is attractive to use Bayesian estimation, and how to interpret properly the results. First,

  4. A gentle introduction to Bayesian analysis : Applications to developmental research

    NARCIS (Netherlands)

    van de Schoot, R.; Kaplan, D.; Denissen, J.J.A.; Asendorpf, J.B.; Neyer, F.J.; van Aken, M.A.G.

    2014-01-01

    Bayesian statistical methods are becoming ever more popular in applied and fundamental research. In this study a gentle introduction to Bayesian analysis is provided. It is shown under what circumstances it is attractive to use Bayesian estimation, and how to interpret properly the results. First,

  5. A default Bayesian hypothesis test for ANOVA designs

    NARCIS (Netherlands)

    Wetzels, R.; Grasman, R.P.P.P.; Wagenmakers, E.J.

    2012-01-01

    This article presents a Bayesian hypothesis test for analysis of variance (ANOVA) designs. The test is an application of standard Bayesian methods for variable selection in regression models. We illustrate the effect of various g-priors on the ANOVA hypothesis test. The Bayesian test for ANOVA

  6. RIB FRACTURE AFTER BREAST RECONSTRUCTION WITH TISSUE EXPANDER

    Directory of Open Access Journals (Sweden)

    Uroš Ahčan

    2009-08-01

    Full Text Available Breast reconstruction with tissue expansion and later exchange of the expander for a prosthesis is one of the most common methods of breast reconstruction. Women who are not candidates for reconstruction with autologous tissue, who have small breasts, or who have a positive family history of breast cancer are the most suitable for this type of reconstruction. The surgical technique of tissue expansion is relatively easy, and complications are rarely seen. With this case report we want to show the common, although often occult, skeletal deformities of the thorax after breast tissue expansion that may lead to rib fractures.

  7. Bayesian Meta-Analysis of Coefficient Alpha

    Science.gov (United States)

    Brannick, Michael T.; Zhang, Nanhua

    2013-01-01

    The current paper describes and illustrates a Bayesian approach to the meta-analysis of coefficient alpha. Alpha is the most commonly used estimate of the reliability or consistency (freedom from measurement error) for educational and psychological measures. The conventional approach to meta-analysis uses inverse variance weights to combine…
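The conventional pooling the abstract contrasts with, inverse-variance weighting, is a one-liner: each study's estimate is weighted by the reciprocal of its sampling variance, and the pooled variance is the reciprocal of the summed weights. A minimal sketch (fixed-effect form; the Bayesian approach in the paper replaces these weights with a full posterior):

```python
def inverse_variance_pool(estimates, variances):
    """Fixed-effect inverse-variance pooling of per-study estimates."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, estimates)) / sum(weights)
    pooled_var = 1.0 / sum(weights)      # variance of the pooled estimate
    return pooled, pooled_var
```

For two alpha estimates 0.8 and 0.9 with equal variance 0.01, the pooled value is 0.85 with variance 0.005, i.e. precision adds across studies.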

  8. Bayesian decision theory : A simple toy problem

    NARCIS (Netherlands)

    van Erp, H.R.N.; Linger, R.O.; van Gelder, P.H.A.J.M.

    2016-01-01

    We give here a comparison of expected outcome theory, expected utility theory, and Bayesian decision theory, by way of a simple numerical toy problem in which we look at the willingness to invest in averting a high-impact, low-probability event. It will be found that for this toy problem

  9. Optimal Detection under the Restricted Bayesian Criterion

    Directory of Open Access Journals (Sweden)

    Shujun Liu

    2017-07-01

    Full Text Available This paper aims to find a suitable decision rule for a binary composite hypothesis-testing problem with a partial or coarse prior distribution. To alleviate the negative impact of the information uncertainty, a constraint is considered that the maximum conditional risk cannot be greater than a predefined value. Therefore, the objective of this paper becomes to find the optimal decision rule to minimize the Bayes risk under the constraint. By applying the Lagrange duality, the constrained optimization problem is transformed to an unconstrained optimization problem. In doing so, the restricted Bayesian decision rule is obtained as a classical Bayesian decision rule corresponding to a modified prior distribution. Based on this transformation, the optimal restricted Bayesian decision rule is analyzed and the corresponding algorithm is developed. Furthermore, the relation between the Bayes risk and the predefined value of the constraint is also discussed. The Bayes risk obtained via the restricted Bayesian decision rule is a strictly decreasing and convex function of the constraint on the maximum conditional risk. Finally, the numerical results including a detection example are presented and agree with the theoretical results.
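The key structural result, that the restricted rule is a classical Bayes rule under a modified prior, can be checked numerically on a Gaussian shift test. All specifics below are invented for illustration (H0: N(0,1) vs H1: N(1,1), 0-1 loss, true prior p0 = 0.8, conditional-risk bound 0.35): sweeping the modified prior q0 traces out the classical Bayes thresholds, and the optimum is picked among those meeting the constraint.

```python
import math

def Phi(x):
    # Standard normal CDF via the error function
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def risks(t):
    # 0-1 loss; decide H1 when x > t
    R0 = 1.0 - Phi(t)        # conditional risk under H0 (false alarm)
    R1 = Phi(t - 1.0)        # conditional risk under H1 (miss)
    return R0, R1

def restricted_bayes(p0=0.8, alpha=0.35, n_grid=9999):
    """Grid search over modified priors q0. Each q0 induces the classical
    Bayes (likelihood-ratio) threshold t = 1/2 + ln(q0/(1-q0)) for this
    Gaussian pair; minimise the Bayes risk under the TRUE prior p0 subject
    to max conditional risk <= alpha."""
    best = None
    for i in range(1, n_grid):
        q0 = i / n_grid
        t = 0.5 + math.log(q0 / (1.0 - q0))
        R0, R1 = risks(t)
        if max(R0, R1) > alpha:            # restricted-risk constraint
            continue
        bayes_risk = p0 * R0 + (1.0 - p0) * R1
        if best is None or bayes_risk < best[0]:
            best = (bayes_risk, q0, t)
    return best
```

With these numbers the unrestricted Bayes rule (threshold ≈ 1.89) has a miss risk over 0.8, so the constraint binds: the optimal modified prior drops from 0.8 to roughly 0.53, trading a higher Bayes risk (≈ 0.286 vs ≈ 0.186) for the guaranteed conditional-risk bound, exactly the monotone trade-off the paper analyses.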

  10. Heuristics as Bayesian inference under extreme priors.

    Science.gov (United States)

    Parpart, Paula; Jones, Matt; Love, Bradley C

    2018-05-01

    Simple heuristics are often regarded as tractable decision strategies because they ignore a great deal of information in the input data. One puzzle is why heuristics can outperform full-information models, such as linear regression, which make full use of the available information. These "less-is-more" effects, in which a relatively simpler model outperforms a more complex model, are prevalent throughout cognitive science, and are frequently argued to demonstrate an inherent advantage of simplifying computation or ignoring information. In contrast, we show at the computational level (where algorithmic restrictions are set aside) that it is never optimal to discard information. Through a formal Bayesian analysis, we prove that popular heuristics, such as tallying and take-the-best, are formally equivalent to Bayesian inference under the limit of infinitely strong priors. Varying the strength of the prior yields a continuum of Bayesian models with the heuristics at one end and ordinary regression at the other. Critically, intermediate models perform better across all our simulations, suggesting that down-weighting information with the appropriate prior is preferable to entirely ignoring it. Rather than because of their simplicity, our analyses suggest heuristics perform well because they implement strong priors that approximate the actual structure of the environment. We end by considering how new heuristics could be derived by infinitely strengthening the priors of other Bayesian models. These formal results have implications for work in psychology, machine learning and economics. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
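The prior-strength continuum described here can be illustrated with ridge regression, where the penalty λ plays the role of prior precision. This is a sketch on toy data, not the paper's exact construction (their derivation ties the prior to cue validities): at λ = 0 we recover ordinary least squares, and as λ → ∞ the weights collapse to X'y/λ, so only the individual cue-outcome covariances, not their joint structure, drive decisions, which is the regime the authors link to heuristics such as tallying.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((40, 5))                 # toy standardized cue matrix
w_true = np.array([1.0, 0.8, 0.6, 0.4, 0.2])
y = X @ w_true + 0.1 * rng.standard_normal(40)   # toy outcomes

def ridge_weights(X, y, lam):
    # Posterior mode under a zero-mean Gaussian prior with precision lam
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

w_ols = ridge_weights(X, y, 0.0)    # weak-prior end: ordinary regression
w_inf = ridge_weights(X, y, 1e8)    # strong-prior end: w ~ X'y / lam
```

Intermediate λ values interpolate between the two extremes, which is where the paper finds the best predictive performance.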

  11. A strongly quasiconvex PAC-Bayesian bound

    DEFF Research Database (Denmark)

    Thiemann, Niklas; Igel, Christian; Wintenberger, Olivier

    2017-01-01

    We propose a new PAC-Bayesian bound and a way of constructing a hypothesis space, so that the bound is convex in the posterior distribution and also convex in a trade-off parameter between empirical performance of the posterior distribution and its complexity. The complexity is measured by the Ku...

  12. Multisnapshot Sparse Bayesian Learning for DOA

    DEFF Research Database (Denmark)

    Gerstoft, Peter; Mecklenbrauker, Christoph F.; Xenaki, Angeliki

    2016-01-01

    The directions of arrival (DOA) of plane waves are estimated from multisnapshot sensor array data using sparse Bayesian learning (SBL). The prior for the source amplitudes is assumed independent zero-mean complex Gaussian distributed with hyperparameters, the unknown variances (i.e., the source...

  13. Inverse Problems in a Bayesian Setting

    KAUST Repository

    Matthies, Hermann G.

    2016-02-13

    In a Bayesian setting, inverse problems and uncertainty quantification (UQ)—the propagation of uncertainty through a computational (forward) model—are strongly connected. In the form of conditional expectation the Bayesian update becomes computationally attractive. We give a detailed account of this approach via conditional approximation, various approximations, and the construction of filters. Together with a functional or spectral approach for the forward UQ there is no need for time-consuming and slowly convergent Monte Carlo sampling. The developed sampling-free non-linear Bayesian update in form of a filter is derived from the variational problem associated with conditional expectation. This formulation in general calls for further discretisation to make the computation possible, and we choose a polynomial approximation. After giving details on the actual computation in the framework of functional or spectral approximations, we demonstrate the workings of the algorithm on a number of examples of increasing complexity. At last, we compare the linear and nonlinear Bayesian update in form of a filter on some examples.
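The linear special case of the sampling-free filter described here coincides with the familiar Gauss-Markov/Kalman update, where the conditional-expectation map is restricted to affine functions of the data; the paper's nonlinear, polynomial version generalises this. A minimal sketch:

```python
import numpy as np

def linear_bayes_update(m, C, H, R, y):
    """Linear conditional-expectation (Kalman-type) update for a Gaussian
    prior N(m, C), linear observation y = H x + noise with covariance R.
    The posterior mean minimises E||x - g(y)||^2 over affine maps g."""
    S = H @ C @ H.T + R                 # innovation covariance
    K = C @ H.T @ np.linalg.inv(S)      # gain
    m_post = m + K @ (y - H @ m)        # update toward the observation
    C_post = C - K @ H @ C              # reduced posterior covariance
    return m_post, C_post
```

In the scalar case with prior N(0, 1), identity observation, unit noise variance, and observation y = 2, the update returns posterior mean 1 and variance 1/2, matching the textbook conjugate-Gaussian result.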

  14. Error probabilities in default Bayesian hypothesis testing

    NARCIS (Netherlands)

    Gu, Xin; Hoijtink, Herbert; Mulder, J,

    2016-01-01

    This paper investigates the classical type I and type II error probabilities of default Bayes factors for a Bayesian t test. Default Bayes factors quantify the relative evidence between the null hypothesis and the unrestricted alternative hypothesis without needing to specify prior distributions for

  15. Bayesian Averaging is Well-Temperated

    DEFF Research Database (Denmark)

    Hansen, Lars Kai

    2000-01-01

    Bayesian predictions are stochastic just like predictions of any other inference scheme that generalize from a finite sample. While a simple variational argument shows that Bayes averaging is generalization optimal given that the prior matches the teacher parameter distribution the situation is l...

  16. Robust bayesian inference of generalized Pareto distribution ...

    African Journals Online (AJOL)

    Using an exhaustive Monte Carlo study, we show that, given an adequate generalized loss function, a robust Bayesian estimator of the model can be constructed. Key words: Bayesian estimation; Extreme value; Generalized Fisher information; Generalized Pareto distribution; Monte Carlo; ...

  17. Evidence Estimation for Bayesian Partially Observed MRFs

    NARCIS (Netherlands)

    Chen, Y.; Welling, M.

    2013-01-01

    Bayesian estimation in Markov random fields is very hard due to the intractability of the partition function. The introduction of hidden units makes the situation even worse due to the presence of potentially very many modes in the posterior distribution. For the first time we propose a

  18. Inverse Problems in a Bayesian Setting

    KAUST Repository

    Matthies, Hermann G.; Zander, Elmar; Rosić, Bojana V.; Litvinenko, Alexander; Pajonk, Oliver

    2016-01-01

    In a Bayesian setting, inverse problems and uncertainty quantification (UQ)—the propagation of uncertainty through a computational (forward) model—are strongly connected. In the form of conditional expectation the Bayesian update becomes computationally attractive. We give a detailed account of this approach via conditional approximation, various approximations, and the construction of filters. Together with a functional or spectral approach for the forward UQ there is no need for time-consuming and slowly convergent Monte Carlo sampling. The developed sampling-free non-linear Bayesian update in form of a filter is derived from the variational problem associated with conditional expectation. This formulation in general calls for further discretisation to make the computation possible, and we choose a polynomial approximation. After giving details on the actual computation in the framework of functional or spectral approximations, we demonstrate the workings of the algorithm on a number of examples of increasing complexity. At last, we compare the linear and nonlinear Bayesian update in form of a filter on some examples.

  19. A Bayesian perspective on some replacement strategies

    International Nuclear Information System (INIS)

    Mazzuchi, Thomas A.; Soyer, Refik

    1996-01-01

    In this paper we present a Bayesian decision theoretic approach for determining optimal replacement strategies. This approach enables us to formally incorporate, express, and update our uncertainty when determining optimal replacement strategies. We develop relevant expressions for both the block replacement protocol with minimal repair and the age replacement protocol and illustrate the use of our approach with real data

  20. Comparison between Fisherian and Bayesian approach to ...

    African Journals Online (AJOL)

    ... of its simplicity and optimality properties is normally used for two group cases. However, Bayesian approach is found to be better than Fisher's approach because of its low misclassification error rate. Keywords: variance-covariance matrices, centroids, prior probability, mahalanobis distance, probability of misclassification ...