WorldWideScience

Sample records for bayesian history reconstruction

  1. Bayesian tomographic reconstruction of microsystems

    International Nuclear Information System (INIS)

    Salem, Sofia Fekih; Vabre, Alexandre; Mohammad-Djafari, Ali

    2007-01-01

    X-ray transmission microtomography plays an increasingly important role in the study and understanding of microsystems. Within this framework, an experimental setup for high-resolution X-ray microtomography was developed at CEA-List to quantify the physical parameters related to fluid flow in microsystems. Several difficulties arise from the nature of the experimental data collected on this setup: increased measurement errors due to various physical phenomena occurring during image formation (diffusion, beam hardening), and specificities of the setup (limited angle, partial view of the object, weak contrast). To reconstruct the object we must solve an inverse problem. This inverse problem is known to be ill-posed and therefore needs to be regularized by introducing prior information. The main prior information we account for is that the object is composed of a finite, known number of different materials distributed in compact regions. This a priori information is introduced via a Gauss-Markov field for the contrast distributions, with a hidden Potts-Markov field for the material classes, in the Bayesian estimation framework. The computations are done using an appropriate Markov Chain Monte Carlo (MCMC) technique. In this paper, we first present the basic steps of the proposed algorithms. Then we focus on one of the main steps in any iterative reconstruction method, the computation of the forward and adjoint operators (projection and backprojection). A fast implementation of these two operators is crucial for the real application of the method. We give some details on the fast computation of these steps and show some preliminary results of simulations.
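    The projection/backprojection pair mentioned above must be a true adjoint pair for iterative reconstruction to behave correctly. A minimal sketch (the system matrix here is random and purely illustrative; a real implementation derives it from the scanner geometry):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy system matrix: row i weights the pixels seen by ray i.
# Purely illustrative; a real A comes from the acquisition geometry.
n_rays, n_pixels = 8, 16
A = rng.random((n_rays, n_pixels))

def project(x):
    """Forward operator: image -> projection data."""
    return A @ x

def backproject(y):
    """Adjoint operator: projection data -> image domain."""
    return A.T @ y

# Adjointness check: <A x, y> must equal <x, A^T y> for all x, y.
x = rng.random(n_pixels)
y = rng.random(n_rays)
assert np.isclose(project(x) @ y, x @ backproject(y))
```

    Fast implementations compute both operators on the fly rather than storing A explicitly, but the adjointness identity above is the property any matched pair must preserve.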

  2. Bayesian image reconstruction: Application to emission tomography

    Energy Technology Data Exchange (ETDEWEB)

    Nunez, J.; Llacer, J.

    1989-02-01

    In this paper we propose a Maximum a Posteriori (MAP) method of image reconstruction in the Bayesian framework for the Poisson noise case. We use entropy to define the prior probability and the likelihood to define the conditional probability. The method uses sharpness parameters which can be theoretically computed or adjusted, allowing us to obtain MAP reconstructions without the problem of the "grey" reconstructions associated with pre-Bayesian reconstructions. We have developed several ways to solve the reconstruction problem and propose a new iterative algorithm which is stable, maintains positivity and converges to feasible images faster than the Maximum Likelihood Estimate method. We have successfully applied the new method to the case of Emission Tomography, both with simulated and real data. 41 refs., 4 figs., 1 tab.
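    The Maximum Likelihood baseline that such MAP methods build on is the classical ML-EM multiplicative update for Poisson data. A hedged sketch of that baseline (variable names illustrative; the paper's method adds an entropy prior on top of this likelihood):

```python
import numpy as np

def mlem(A, y, n_iter=50):
    """Classical ML-EM update for Poisson data y ~ Poisson(A x).
    The multiplicative form keeps the image nonnegative."""
    x = np.ones(A.shape[1])           # positive initial image
    sens = A.T @ np.ones(A.shape[0])  # sensitivity image A^T 1
    for _ in range(n_iter):
        ratio = y / np.maximum(A @ x, 1e-12)
        x = x * (A.T @ ratio) / sens
    return x

# Sanity check: with an identity system matrix the fixed point is the data.
y = np.array([4.0, 1.0, 3.0])
assert np.allclose(mlem(np.eye(3), y, n_iter=1), y)
```

    A MAP variant modifies this update so that the prior pulls the iterates away from the noisy ML solution; the abstract's claim is that an entropy prior does so without producing uniformly "grey" images.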

  3. Reconstructing Community History

    Science.gov (United States)

    Shields, Amy

    2004-01-01

    History is alive and well in Lebanon, Missouri. Students in this small town in the southwest region of the state went above and beyond the community's expectations on a special project. This article describes the historical journey, which began when students in a summer mural class reconstructed a mural that was originally created by a…

  4. Structure-based bayesian sparse reconstruction

    KAUST Repository

    Quadeer, Ahmed Abdul

    2012-12-01

    Sparse signal reconstruction algorithms have attracted research attention due to their wide applications in various fields. In this paper, we present a simple Bayesian approach that utilizes the sparsity constraint and a priori statistical information (Gaussian or otherwise) to obtain near optimal estimates. In addition, we make use of the rich structure of the sensing matrix encountered in many signal processing applications to develop a fast sparse recovery algorithm. The computational complexity of the proposed algorithm is very low compared with the widely used convex relaxation methods as well as greedy matching pursuit techniques, especially at high sparsity. © 1991-2012 IEEE.

  5. A Bayesian account of quantum histories

    International Nuclear Information System (INIS)

    Marlow, Thomas

    2006-01-01

    We investigate whether quantum history theories can be consistent with Bayesian reasoning and whether such an analysis helps clarify the interpretation of such theories. First, we summarise and extend recent work categorising two different approaches to formalising multi-time measurements in quantum theory. The standard approach consists of describing an ordered series of measurements in terms of history propositions with non-additive 'probabilities'. The non-standard approach consists of defining multi-time measurements as sets of exclusive and exhaustive history propositions and recovering the single-time exclusivity of results when discussing single-time history propositions. We analyse whether such history propositions can be consistent with Bayes' rule. We show that a certain class of histories is given a natural Bayesian interpretation, namely, the linearly positive histories originally introduced by Goldstein and Page. Thus, we argue that this gives a certain amount of interpretational clarity to the non-standard approach. We also attempt a justification of our analysis using Cox's axioms of probability theory.

  6. History of reconstructive rhinoplasty.

    Science.gov (United States)

    Mazzola, Isabella C; Mazzola, Riccardo F

    2014-06-01

    Amputation of the nose was practiced as a sign of humiliation on adulterers, thieves, and prisoners of war by certain ancient populations. To erase this disfigurement, numerous techniques were invented over the centuries. In India, where this injury was common, advancement cheek flaps were performed (around 600 BC). The forehead flap was introduced much later, probably around the 16th century. The Venetian adventurer Manuzzi, writing a report about the Mughal Empire in the second half of the 17th century, gave a description of the forehead rhinoplasty. Detailed information concerning the Indian forehead flap reached the Western world in 1794, thanks to a letter to the editor that appeared in the Gentleman's Magazine. From this episode one can date the beginning of a widespread interest in rhinoplasty, and in plastic surgery in general. In Europe, nasal reconstruction started in the 15th century in Sicily with the Brancas, initially with cheek flaps and then with arm flaps. At the beginning of the 16th century, rhinoplasty developed in Calabria (Southern Italy) with the Vianeos. In 1597, Gaspare Tagliacozzi, Professor of Surgery at Bologna, improved the arm flap technique and published a book entirely devoted to this art. He is considered the founder of plastic surgery.

  7. Bayesian image reconstruction for improving detection performance of muon tomography.

    Science.gov (United States)

    Wang, Guobao; Schultz, Larry J; Qi, Jinyi

    2009-05-01

    Muon tomography is a novel technology that is being developed for detecting high-Z materials in vehicles or cargo containers. Maximum likelihood methods have been developed for reconstructing the scattering density image from muon measurements. However, the instability of maximum likelihood estimation often results in noisy images and low detectability of high-Z targets. In this paper, we propose using regularization to improve the image quality of muon tomography. We formulate the muon reconstruction problem in a Bayesian framework by introducing a prior distribution on scattering density images. An iterative shrinkage algorithm is derived to maximize the log posterior distribution. At each iteration, the algorithm obtains the maximum a posteriori update by shrinking an unregularized maximum likelihood update. Inverse quadratic shrinkage functions are derived for generalized Laplacian priors and inverse cubic shrinkage functions are derived for generalized Gaussian priors. Receiver operating characteristic studies using simulated data demonstrate that the Bayesian reconstruction can greatly improve the detection performance of muon tomography.
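    The shrink-the-ML-update idea can be illustrated with the most familiar shrinkage function, the soft threshold, which is the MAP shrinkage for a plain Laplacian prior under Gaussian noise; the inverse-quadratic and inverse-cubic functions derived in the paper play the same role for its generalized priors. A hedged sketch:

```python
import numpy as np

def soft_shrink(z, t):
    """Soft threshold: MAP shrinkage for a Laplacian prior under
    Gaussian noise. The paper derives analogous inverse-quadratic /
    inverse-cubic shrinkers for generalized Laplacian/Gaussian priors."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

# One iteration of a shrinkage scheme: take the unregularized ML
# update, then shrink it toward zero to impose the prior.
ml_update = np.array([-3.0, 0.2, 1.5])
map_update = soft_shrink(ml_update, t=0.5)
assert np.allclose(map_update, [-2.5, 0.0, 1.0])
```

    Small (noise-dominated) updates are suppressed entirely while large ones survive slightly reduced, which is how shrinkage stabilizes the otherwise noisy maximum likelihood images.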

  8. More Bayesian Transdimensional Inversion for Thermal History Modelling (Invited)

    Science.gov (United States)

    Gallagher, K.

    2013-12-01

    Since the publication of Dodson (1973), quantifying the relationship between geochronological ages and closure temperatures, an ongoing concern in thermochronology has been the reconstruction of thermal histories consistent with the measured data. Extracting this thermal history information is best treated as an inverse problem, given the complex relationship between the observations and the thermal history. When solving the inverse problem (i.e. finding acceptable thermal histories), stochastic sampling methods have often been used, as these are relatively global when searching the model space. However, the issue remains of how best to estimate those parts of the thermal history unconstrained by independent information, i.e. what is required to fit the data? To solve this general problem, we use a Bayesian transdimensional Markov Chain Monte Carlo method, which has been integrated into user-friendly software, QTQt (Quantitative Thermochronology with Qt), which runs on both Macintosh and PC. The Bayesian approach allows us to consider a wide range of possible thermal histories as general prior information on time and temperature (and temperature offset for multiple samples in a vertical profile). We can also incorporate more focussed geological constraints in terms of more specific priors. In this framework, it is the data themselves (and their errors) that determine the complexity of the thermal history solutions. For example, more precise data will justify a more complex solution, while noisier data will be satisfied by simpler solutions. We can express complexity in terms of the number of time-temperature points defining the total thermal history. Another useful feature of this method is that we can easily deal with imprecise parameter values (e.g. kinetics, data errors) by drawing samples from a user-specified probability distribution, rather than using a single value.
    Finally, the method can be applied to either single samples, or multiple samples (from a borehole or

  9. Food Reconstruction Using Isotopic Transferred Signals (FRUITS): A Bayesian Model for Diet Reconstruction

    Czech Academy of Sciences Publication Activity Database

    Fernandes, R.; Millard, A.R.; Brabec, Marek; Nadeau, M.J.; Grootes, P.

    2014-01-01

    Roč. 9, č. 2 (2014), Art. no. e87436 E-ISSN 1932-6203 Institutional support: RVO:67985807 Keywords: ancient diet reconstruction * stable isotope measurements * mixture model * Bayesian estimation * Dirichlet prior Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 3.234, year: 2014

  10. A Nonparametric Bayesian Approach For Emission Tomography Reconstruction

    International Nuclear Information System (INIS)

    Barat, Eric; Dautremer, Thomas

    2007-01-01

    We introduce a PET reconstruction algorithm following a nonparametric Bayesian (NPB) approach. In contrast with Expectation Maximization (EM), the proposed technique does not rely on any space discretization. Namely, the activity distribution--the normalized emission intensity of the spatial Poisson process--is considered as a spatial probability density, and the observations are the projections of random emissions whose distribution has to be estimated. This approach is nonparametric in the sense that the quantity of interest belongs to the set of probability measures on R^k (for reconstruction in k dimensions), and it is Bayesian in the sense that we define a prior directly on this spatial measure. In this context, we propose to model the nonparametric probability density as an infinite mixture of multivariate normal distributions. As a prior for this mixture we consider a Dirichlet Process Mixture (DPM) with a Normal-Inverse Wishart (NIW) model as base distribution of the Dirichlet Process. As in EM-family reconstruction, we use a data augmentation scheme where the set of hidden variables are the emission locations for each observed line of response in the continuous object space. Thanks to the data augmentation, we propose a Markov Chain Monte Carlo (MCMC) algorithm (Gibbs sampler) which is able to generate draws from the posterior distribution of the spatial intensity. A difference with EM is that one step of the Gibbs sampler corresponds to the generation of emission locations, while only the expected number of emissions per pixel/voxel is used in EM. Another key difference is that the estimated spatial intensity is a continuous function, such that there is no need to compute a projection matrix. Finally, draws from the intensity posterior distribution allow the estimation of posterior functionals such as the variance or confidence intervals. Results are presented for simulated data based on a 2D brain phantom and compared to Bayesian MAP-EM.
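    The Dirichlet Process prior underlying such an infinite mixture can be sketched via its stick-breaking construction. This is a truncated, simplified illustration: the paper uses a Normal-Inverse Wishart base distribution, whereas this sketch draws only scalar component means from an assumed Gaussian base.

```python
import numpy as np

rng = np.random.default_rng(1)

def stick_breaking(alpha, n_atoms):
    """Truncated stick-breaking weights of a Dirichlet process:
    w_k = beta_k * prod_{j<k} (1 - beta_j), beta_k ~ Beta(1, alpha)."""
    betas = rng.beta(1.0, alpha, size=n_atoms)
    remaining = np.concatenate([[1.0], np.cumprod(1.0 - betas[:-1])])
    return betas * remaining

w = stick_breaking(alpha=2.0, n_atoms=500)
# Weights are positive and sum to (almost) one at this truncation level.
assert np.all(w >= 0) and abs(w.sum() - 1.0) < 1e-2

# One draw from a (simplified) DPM prior: each atom gets a component
# mean from the base distribution (scalar Gaussian here, NIW in the paper).
means = rng.normal(0.0, 5.0, size=w.size)
```

    In the sampler described above, the Gibbs steps alternate between assigning emission locations to mixture components and updating the component parameters under this prior.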

  11. Comparing nonparametric Bayesian tree priors for clonal reconstruction of tumors.

    Science.gov (United States)

    Deshwar, Amit G; Vembu, Shankar; Morris, Quaid

    2015-01-01

    Statistical machine learning methods, especially nonparametric Bayesian methods, have become increasingly popular for inferring the clonal population structure of tumors. Here we describe the treeCRP, an extension of the Chinese restaurant process (CRP), a popular construction used in nonparametric mixture models, to infer the phylogeny and genotype of the major subclonal lineages represented in the population of cancer cells. We also propose new split-merge updates tailored to the subclonal reconstruction problem that improve the mixing time of Markov chains. In comparisons with the tree-structured stick breaking (TSSB) prior used in PhyloSub, we demonstrate superior mixing and running time using the treeCRP with our new split-merge procedures. We also show that given the same number of samples, TSSB and treeCRP have similar ability to recover the subclonal structure of a tumor…

  12. Bayesian surface reconstruction: examples of application and developments

    Science.gov (United States)

    Licciardi, Andrea; Piana Agostinetti, Nicola

    2016-04-01

    Interpolation of spatial data in 2D (surface reconstruction) is routinely performed in many fields of geoscience. It allows the construction of new data points within the range of a discrete set of known data points (measurements). Classical interpolation schemes suffer from a number of limitations: 1) the level of smoothness in the solution has to be chosen by the user; 2) it is difficult to assign weights to different data types when multiple datasets are used together; and 3) the final solution is represented by a best-fit model without uncertainties. In this work we explore the capabilities of the probabilistic surface reconstruction tool first developed by Bodin et al. (2012). The Bayesian framework in which it is developed allows the limitations of the conventional methods to be overcome. The surface is parametrized with Voronoi cells. We use a transdimensional and hierarchical scheme in which the total number of cells and the magnitude of the data errors are treated as unknowns in the inversion. Through synthetic tests, we show how the level of complexity (smoothness) in the reconstructed surface is directly inferred from the data, and how the solution, expressed in terms of probability distributions, takes uncertainties into account. We then apply the algorithm to interpolate multiple datasets of Moho depth estimates for the British Isles to build a map of crustal thickness for the region. The retrieved map is compared with previously published results. Lastly, we adapt the algorithm to the case in which the full probability distribution for each data point is known. The assumption of Gaussian-distributed errors is overcome by considering each data point as a random variable with known probability distribution. We test this approach by interpolating 1D probability distributions for Vs along a linear array of seismic stations to create a pseudo-2D section. Bodin, T., M. Salmon, B. L. N. Kennett, and M. Sambridge (2012), Probabilistic surface reconstruction from multiple data sets
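    With a Voronoi parametrization, predicting the surface at any point reduces to a nearest-nucleus lookup; the transdimensional MCMC then samples the number, positions and values of the nuclei. A minimal sketch (coordinates and values purely illustrative):

```python
import numpy as np

def voronoi_predict(nuclei, values, query):
    """Value at a query point = value attached to the nearest Voronoi
    nucleus. In the transdimensional scheme the number of nuclei and
    their positions/values are themselves sampled by the MCMC."""
    d = np.linalg.norm(nuclei - query, axis=1)
    return values[np.argmin(d)]

# Two cells with illustrative Moho depths (km).
nuclei = np.array([[0.0, 0.0], [10.0, 0.0]])
values = np.array([30.0, 35.0])
assert voronoi_predict(nuclei, values, np.array([2.0, 1.0])) == 30.0
assert voronoi_predict(nuclei, values, np.array([9.0, -1.0])) == 35.0
```

    Averaging such piecewise-constant draws over the posterior ensemble is what yields the smooth maps and pointwise uncertainty estimates described above.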

  13. Awakening the BALROG: BAyesian Location Reconstruction Of GRBs

    Science.gov (United States)

    Burgess, J. Michael; Yu, Hoi-Fung; Greiner, Jochen; Mortlock, Daniel J.

    2018-05-01

    The accurate spatial location of gamma-ray bursts (GRBs) is crucial for both accurately characterizing their spectra and follow-up observations by other instruments. The Fermi Gamma-ray Burst Monitor (GBM) has the largest field of view for detecting GRBs as it views the entire unocculted sky, but as a non-imaging instrument it relies on the relative count rates observed in each of its 14 detectors to localize transients. Improving its ability to accurately locate GRBs and other transients is vital to the paradigm of multimessenger astronomy, including the electromagnetic follow-up of gravitational wave signals. Here we present the BAyesian Location Reconstruction Of GRBs (BALROG) method for localizing and characterizing GBM transients. Our approach eliminates the systematics of previous approaches by simultaneously fitting for the location and spectrum of a source. It also correctly incorporates the uncertainties in the location of a transient into the spectral parameters and produces reliable positional uncertainties for both well-localized sources and those for which the GBM data cannot effectively constrain the position. While computationally expensive, BALROG can be implemented to enable quick follow-up of all GBM transient signals. Also, we identify possible response problems that require attention and caution when using standard, public GBM detector response matrices. Finally, we examine the effects of including the uncertainty in location on the spectral parameters of GRB 080916C. We find that spectral parameters change and no extra components are required when these effects are included in contrast to when we use a fixed location. This finding has the potential to alter both the GRB spectral catalogues and the reported spectral composition of some well-known GRBs.

  14. Diffusion archeology for diffusion progression history reconstruction.

    Science.gov (United States)

    Sefer, Emre; Kingsford, Carl

    2016-11-01

    Diffusion through graphs can be used to model many real-world processes, such as the spread of diseases, social network memes, computer viruses, or water contaminants. Often, a real-world diffusion cannot be directly observed while it is occurring - perhaps it is not noticed until some time has passed, continuous monitoring is too costly, or privacy concerns limit data access. This leads to the need to reconstruct how the present state of the diffusion came to be from partial diffusion data. Here, we tackle the problem of reconstructing a diffusion history from one or more snapshots of the diffusion state. This ability can be invaluable to learn when certain computer nodes are infected or which people are the initial disease spreaders to control future diffusions. We formulate this problem over discrete-time SEIRS-type diffusion models in terms of maximum likelihood. We design methods that are based on submodularity and a novel prize-collecting dominating-set vertex cover (PCDSVC) relaxation that can identify likely diffusion steps with some provable performance guarantees. Our methods are the first to be able to reconstruct complete diffusion histories accurately in real and simulated situations. As a special case, they can also identify the initial spreaders better than the existing methods for that problem. Our results for both meme and contaminant diffusion show that the partial diffusion data problem can be overcome with proper modeling and methods, and that hidden temporal characteristics of diffusion can be predicted from limited data.
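    The forward model that history reconstruction inverts can be illustrated with the simplest special case: a deterministic, discrete-time SI spread on a graph (the paper handles probabilistic SEIRS-type models). Reconstruction asks which earlier states could have produced an observed snapshot.

```python
# Deterministic SI spread: every neighbor of an infected node becomes
# infected at the next step (a simplified special case of SEIRS).
def spread_step(adj, infected):
    """One synchronous diffusion step over adjacency lists `adj`."""
    new = set(infected)
    for u in infected:
        new.update(adj.get(u, ()))
    return new

# Path graph 0-1-2-3, seeded at node 0; record the full history.
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
state = {0}
history = [state]
for _ in range(2):
    state = spread_step(adj, state)
    history.append(state)

assert history[1] == {0, 1} and history[2] == {0, 1, 2}
```

    Given only the final snapshot {0, 1, 2}, the reconstruction problem is to infer the intermediate states and the seed, which becomes a maximum-likelihood problem once transmissions are probabilistic.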

  15. Efficient reconstruction of contaminant release history

    Energy Technology Data Exchange (ETDEWEB)

    Alexander, Francis [Los Alamos National Laboratory]; Anghel, Marian [Los Alamos National Laboratory]; Gulbahce, Natali [NON LANL]; Tartakovsky, Daniel [NON LANL]

    2009-01-01

    We present a generalized hybrid Monte Carlo (GHMC) method for fast, statistically optimal reconstruction of release histories of reactive contaminants. The approach is applicable to large-scale, strongly nonlinear systems with parametric uncertainties and data corrupted by measurement errors. The use of discrete adjoint equations facilitates numerical implementation of GHMC without putting any restrictions on the degree of nonlinearity of the advection-dispersion-reaction equations that are used to describe contaminant transport in the subsurface. To demonstrate the salient features of the proposed algorithm, we identify the spatial extent of a distributed source of contamination from concentration measurements of a reactive solute.
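    The core of any hybrid Monte Carlo variant is a volume-preserving leapfrog integrator driven by the gradient of the negative log-posterior (which is where the discrete adjoint equations enter). A minimal sketch for a one-dimensional standard-normal target; the generalized accept/reject machinery of GHMC is omitted:

```python
import numpy as np

def leapfrog(q, p, grad_u, eps, n_steps):
    """Leapfrog (kick-drift-kick) integration of Hamiltonian dynamics,
    the engine of (G)HMC. grad_u is the gradient of the potential
    U(q) = -log posterior(q)."""
    p = p - 0.5 * eps * grad_u(q)
    for _ in range(n_steps - 1):
        q = q + eps * p
        p = p - eps * grad_u(q)
    q = q + eps * p
    p = p - 0.5 * eps * grad_u(q)
    return q, p

# Standard-normal target: U(q) = q^2 / 2, so grad U(q) = q.
H = lambda q, p: 0.5 * q**2 + 0.5 * p**2
q0, p0 = 1.0, 0.5
q1, p1 = leapfrog(q0, p0, lambda q: q, eps=0.05, n_steps=20)
# Leapfrog nearly conserves H, which keeps MH acceptance rates high.
assert abs(H(q1, p1) - H(q0, p0)) < 1e-2
```

    The near-conservation of the Hamiltonian is what lets HMC-type samplers take long, informed jumps through high-dimensional release-history spaces while still accepting most proposals.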

  16. Theoretical evaluation of the detectability of random lesions in bayesian emission reconstruction

    International Nuclear Information System (INIS)

    Qi, Jinyi

    2003-01-01

    Detecting cancerous lesions is an important task in positron emission tomography (PET). Bayesian methods based on the maximum a posteriori principle (also called penalized maximum likelihood methods) have been developed to deal with the low signal-to-noise ratio in the emission data. Similar to the filter cut-off frequency in the filtered backprojection method, the prior parameters in Bayesian reconstruction control the resolution and noise trade-off and hence affect the detectability of lesions in reconstructed images. Bayesian reconstructions are difficult to analyze because the resolution and noise properties are nonlinear and object-dependent. Most research has been based on Monte Carlo simulations, which are very time consuming. Building on recent progress in the theoretical analysis of the image properties of statistical reconstructions and the development of numerical observers, here we develop a theoretical approach for fast computation of lesion detectability in Bayesian reconstruction. The results can be used to choose the optimum hyperparameter for maximum lesion detectability. New in this work is the use of theoretical expressions that explicitly model the statistical variation of the lesion and background without assuming that the object variation is (locally) stationary. The theoretical results are validated using Monte Carlo simulations. The comparisons show good agreement between the theoretical predictions and the Monte Carlo results.

  17. Source reconstruction accuracy of MEG and EEG Bayesian inversion approaches.

    Directory of Open Access Journals (Sweden)

    Paolo Belardinelli

    Electro- and magnetoencephalography allow for non-invasive investigation of human brain activation and corresponding networks with high temporal resolution. Still, no correct network detection is possible without reliable source localization. In this paper, we examine four different source localization schemes under a common Variational Bayesian framework. A Bayesian approach to the Minimum Norm Model (MNM), an Empirical Bayesian Beamformer (EBB) and two iterative Bayesian schemes (Automatic Relevance Determination (ARD) and Greedy Search (GS)) are quantitatively compared. While EBB and MNM each use a single empirical prior, ARD and GS employ a library of anatomical priors that define possible source configurations. The localization performance was investigated as a function of (i) the number of sources (one vs. two vs. three), (ii) the signal-to-noise ratio (SNR; 5 levels) and (iii) the temporal correlation of source time courses (for the cases of two or three sources). We also tested whether the use of additional bilateral priors specifying source covariance for the ARD and GS algorithms improved performance. Our results show that MNM proves effective only with single-source configurations. EBB shows a spatial accuracy of a few millimeters with high SNRs and low correlation between sources. In contrast, ARD and GS are more robust to noise and less affected by temporal correlations between sources. However, the spatial accuracy of ARD and GS is generally limited to the order of one centimeter. We found that the use of correlated covariance priors made no difference to ARD/GS performance.

  18. Sparse reconstruction using distribution agnostic bayesian matching pursuit

    KAUST Repository

    Masood, Mudassir

    2013-11-01

    A fast matching pursuit method using a Bayesian approach is introduced for sparse signal recovery. This method performs Bayesian estimation of sparse signals even when the signal prior is non-Gaussian or unknown. It is agnostic to the signal statistics and utilizes a priori statistics of the additive noise and the sparsity rate of the signal, which are shown to be easily estimated from data if not available. The method utilizes a greedy approach and order-recursive updates of its metrics to find the most dominant sparse supports in order to determine the approximate minimum mean-square error (MMSE) estimate of the sparse signal. Simulation results demonstrate the power and robustness of our proposed estimator. © 2013 IEEE.
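    The greedy support search can be illustrated with the classic orthogonal-matching-pursuit selection rule; note this is a stand-in sketch, since the paper's method instead scores candidate supports by approximate posterior probability with order-recursive metric updates.

```python
import numpy as np

def greedy_support(A, y, sparsity):
    """Greedy (OMP-style) support search: repeatedly pick the column
    most correlated with the residual, then refit by least squares.
    The paper scores supports by (approximate) posterior probability
    instead of plain correlation."""
    support, residual = [], y.copy()
    for _ in range(sparsity):
        scores = np.abs(A.T @ residual)
        scores[support] = -np.inf          # never pick a column twice
        support.append(int(np.argmax(scores)))
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    return sorted(support)

# With an orthonormal sensing matrix, greedy selection is exact.
A = np.eye(6)
x = np.zeros(6); x[[1, 4]] = [3.0, -2.0]
assert greedy_support(A, A @ x, sparsity=2) == [1, 4]
```

    Once a dominant support is found, the MMSE estimate described in the abstract averages the conditional estimates over the most probable supports rather than committing to a single one.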

  19. Reconstruction of the insulin secretion rate by Bayesian deconvolution

    DEFF Research Database (Denmark)

    Andersen, Kim Emil; Højbjerre, Malene

    of the insulin secretion rate (ISR) can be done by solving a highly ill-posed deconvolution problem. We present a Bayesian methodology for the estimation of scaled densities of phase-type distributions via Markov chain Monte Carlo techniques, whereby closed-form evaluation of the ISR is possible. We demonstrate...... the methodology on simulated data, concluding that the method seems to be a promising alternative to existing methods, where the ISR is considered as piecewise constant....

  20. Automated comparison of Bayesian reconstructions of experimental profiles with physical models

    International Nuclear Information System (INIS)

    Irishkin, Maxim

    2014-01-01

    In this work we developed an expert system that carries out, in an integrated and fully automated way, i) a reconstruction of plasma profiles from the measurements, using Bayesian analysis; ii) a prediction of the reconstructed quantities, according to some models; and iii) an intelligent comparison of the first two steps. This system includes systematic checking of the internal consistency of the reconstructed quantities, enables automated model validation and, if a well-validated model is used, can be applied to help detect interesting new physics in an experiment. The work shows three applications of this quite general system. The expert system can successfully detect failures in the automated plasma reconstruction and provide (on successful reconstruction cases) statistics of agreement of the models with the experimental data, i.e. information on the model validity. (author) [fr

  1. Bayesian Inference for Source Reconstruction: A Real-World Application

    Science.gov (United States)

    2014-09-25

    for the reconstruction of the location and emission rate from an actual contaminant source (emission from the Chalk River Laboratories medical isotope... consists of a comprehensive network of seismic, hydroacoustic, infrasound, and radionuclide sensors as part of the verification regime of the Comprehensive Nuclear-Test-Ban Treaty

  2. Intensity-based bayesian framework for image reconstruction from sparse projection data

    International Nuclear Information System (INIS)

    Rashed, E.A.; Kudo, Hiroyuki

    2009-01-01

    This paper presents a Bayesian framework for iterative image reconstruction from projection data measured over a limited number of views. The classical Nyquist sampling rule yields the minimum number of projection views required for accurate reconstruction. However, challenges exist in many medical and industrial imaging applications in which the projection data is undersampled. Classical analytical reconstruction methods such as filtered backprojection (FBP) are not a good choice for use in such cases because the data undersampling in the angular range introduces aliasing and streak artifacts that degrade lesion detectability. In this paper, we propose a Bayesian framework for maximum likelihood-expectation maximization (ML-EM)-based iterative reconstruction methods that incorporates a priori knowledge obtained from expected intensity information. The proposed framework is based on the fact that, in tomographic imaging, it is often possible to expect a set of intensity values of the reconstructed object with relatively high accuracy. The image reconstruction cost function is modified to include the l1-norm distance to the a priori known information. The proposed method has the potential to regularize the solution to reduce artifacts without missing lesions that cannot be expected from the a priori information. Numerical studies showed a significant improvement in image quality and lesion detectability under the condition of highly undersampled projection data. (author)

  3. Technical Note: Probabilistically constraining proxy age–depth models within a Bayesian hierarchical reconstruction model

    Directory of Open Access Journals (Sweden)

    J. P. Werner

    2015-03-01

    Reconstructions of the late-Holocene climate rely heavily upon proxies that are assumed to be accurately dated by layer counting, such as measurements of tree rings, ice cores, and varved lake sediments. Considerable advances could be achieved if time-uncertain proxies were able to be included within these multiproxy reconstructions, and if time uncertainties were recognized and correctly modeled for proxies commonly treated as free of age model errors. Current approaches for accounting for time uncertainty are generally limited to repeating the reconstruction using each one of an ensemble of age models, thereby inflating the final estimated uncertainty – in effect, each possible age model is given equal weighting. Uncertainties can be reduced by exploiting the inferred space–time covariance structure of the climate to re-weight the possible age models. Here, we demonstrate how Bayesian hierarchical climate reconstruction models can be augmented to account for time-uncertain proxies. Critically, although a priori all age models are given equal probability of being correct, the probabilities associated with the age models are formally updated within the Bayesian framework, thereby reducing uncertainties. Numerical experiments show that updating the age model probabilities decreases uncertainty in the resulting reconstructions, as compared with the current de facto standard of sampling over all age models, provided there is sufficient information from other data sources in the spatial region of the time-uncertain proxy. This approach can readily be generalized to non-layer-counted proxies, such as those derived from marine sediments.
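    The formal updating of age-model probabilities is ordinary Bayes' rule applied to the ensemble members. A toy sketch with three candidate age models and purely illustrative likelihood values:

```python
import numpy as np

# Three candidate age models, equally probable a priori. The hierarchical
# model updates their probabilities by how well each explains the rest of
# the (spatially correlated) data; likelihood values here are assumed.
prior = np.array([1/3, 1/3, 1/3])
likelihood = np.array([0.8, 0.1, 0.1])   # p(data | age model), illustrative

posterior = prior * likelihood
posterior /= posterior.sum()             # Bayes' rule, normalized

assert np.isclose(posterior.sum(), 1.0)
assert np.isclose(posterior[0], 0.8)     # (1/3)*0.8 / (1/3)*1.0
```

    The de facto standard criticized in the abstract corresponds to keeping all weights at 1/3 regardless of the likelihoods, which inflates the final reconstruction uncertainty.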

  4. A Bayesian nonparametric approach to reconstruction and prediction of random dynamical systems.

    Science.gov (United States)

    Merkatas, Christos; Kaloudis, Konstantinos; Hatjispyros, Spyridon J

    2017-06-01

    We propose a Bayesian nonparametric mixture model for the reconstruction and prediction from observed time series data, of discretized stochastic dynamical systems, based on Markov Chain Monte Carlo methods. Our results can be used by researchers in physical modeling interested in a fast and accurate estimation of low dimensional stochastic models when the size of the observed time series is small and the noise process (perhaps) is non-Gaussian. The inference procedure is demonstrated specifically in the case of polynomial maps of an arbitrary degree and when a Geometric Stick Breaking mixture process prior over the space of densities, is applied to the additive errors. Our method is parsimonious compared to Bayesian nonparametric techniques based on Dirichlet process mixtures, flexible and general. Simulations based on synthetic time series are presented.

  6. Bayesian Multi-Energy Computed Tomography reconstruction approaches based on decomposition models

    International Nuclear Information System (INIS)

    Cai, Caifang

    2013-01-01

    Multi-Energy Computed Tomography (MECT) makes it possible to obtain multiple fractions of basis materials without segmentation. In medical applications, one is the soft-tissue-equivalent water fraction and the other the hard-matter-equivalent bone fraction. Practical MECT measurements are usually obtained with polychromatic X-ray beams. Existing reconstruction approaches based on linear forward models that do not account for the beam polychromaticity fail to estimate the correct decomposition fractions and produce Beam-Hardening Artifacts (BHA). Existing BHA correction approaches either require calibration measurements or suffer from the noise amplification caused by the negative-log pre-processing and from the water/bone separation problem. To overcome these problems, statistical DECT reconstruction approaches based on non-linear forward models that account for the beam polychromaticity show great potential for producing accurate fraction images. This work proposes a full-spectral Bayesian reconstruction approach that allows the reconstruction of high-quality fraction images from ordinary polychromatic measurements. The approach is based on a Gaussian noise model with unknown variance assigned directly to the projections, without taking the negative log. Following Bayesian inference, the decomposition fractions and the observation variance are estimated jointly by the Maximum A Posteriori (MAP) estimation method. With an adaptive prior model assigned to the variance, the joint estimation problem reduces to a single estimation problem; this transforms the joint MAP estimation into a minimization problem with a non-quadratic cost function. To solve it, the use of a monotone Conjugate Gradient (CG) algorithm with suboptimal descent steps is proposed. The performance of the proposed approach is analyzed with both simulated and experimental data. The results show that the proposed Bayesian approach is robust to noise and materials. It is also

  7. Bayesian hierarchical models for regional climate reconstructions of the last glacial maximum

    Science.gov (United States)

    Weitzel, Nils; Hense, Andreas; Ohlwein, Christian

    2017-04-01

    Spatio-temporal reconstructions of past climate are important for the understanding of the long term behavior of the climate system and the sensitivity to forcing changes. Unfortunately, they are subject to large uncertainties, have to deal with a complex proxy-climate structure, and a physically reasonable interpolation between the sparse proxy observations is difficult. Bayesian Hierarchical Models (BHMs) are a class of statistical models that is well suited for spatio-temporal reconstructions of past climate because they permit the inclusion of multiple sources of information (e.g. records from different proxy types, uncertain age information, output from climate simulations) and quantify uncertainties in a statistically rigorous way. BHMs in paleoclimatology typically consist of three stages which are modeled individually and are combined using Bayesian inference techniques. The data stage models the proxy-climate relation (often named transfer function), the process stage models the spatio-temporal distribution of the climate variables of interest, and the prior stage consists of prior distributions of the model parameters. For our BHMs, we translate well-known proxy-climate transfer functions for pollen to a Bayesian framework. In addition, we can include Gaussian distributed local climate information from preprocessed proxy records. The process stage combines physically reasonable spatial structures from prior distributions with proxy records which leads to a multivariate posterior probability distribution for the reconstructed climate variables. The prior distributions that constrain the possible spatial structure of the climate variables are calculated from climate simulation output. We present results from pseudoproxy tests as well as new regional reconstructions of temperatures for the last glacial maximum (LGM, ˜ 21,000 years BP). These reconstructions combine proxy data syntheses with information from climate simulations for the LGM that were

  8. Diffusion archeology for diffusion progression history reconstruction

    OpenAIRE

    Sefer, Emre; Kingsford, Carl

    2015-01-01

    Diffusion through graphs can be used to model many real-world processes, such as the spread of diseases, social network memes, computer viruses, or water contaminants. Often, a real-world diffusion cannot be directly observed while it is occurring — perhaps it is not noticed until some time has passed, continuous monitoring is too costly, or privacy concerns limit data access. This leads to the need to reconstruct how the present state of the diffusion came to be from partial d...

  9. Neural network based real-time reconstruction of KSTAR magnetic equilibria with Bayesian-based preprocessing

    Science.gov (United States)

    Joung, Semin; Kwak, Sehyun; Ghim, Y.-C.

    2017-10-01

    Obtaining plasma shapes during tokamak discharges requires real-time estimation of the magnetic configuration using a Grad-Shafranov solver such as EFIT. Since off-line EFIT is computationally intensive and real-time reconstructions do not agree with off-line EFIT results to within our desired accuracy, we use a neural network to generate an off-line-quality equilibrium in real time. To train the neural network (two hidden layers with 30 and 20 nodes, respectively), we create a database consisting of magnetic signals and off-line EFIT results from KSTAR as inputs and targets, respectively. To compensate for drifts in the magnetic signals originating in the electronic circuits, we develop a Bayesian-based two-step real-time correction method. Additionally, we infer missing inputs, i.e. when some inputs to the network are unusable, using a Gaussian process coupled with a Bayesian model whose likelihood is determined from Maxwell's equations. We find that our network can withstand input errors of at least 20%. Note that this real-time reconstruction scheme is not yet implemented for KSTAR operation.
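A minimal sketch of the forward pass of such a network (two hidden layers of 30 and 20 nodes, as described above); the input/output dimensions and weights below are random placeholders, not KSTAR values:

```python
import numpy as np

rng = np.random.default_rng(2)

def mlp_forward(x, W1, b1, W2, b2, W3, b3):
    # Forward pass of a network with two hidden layers (30 and 20 nodes),
    # mapping magnetic signals to equilibrium parameters.  Weights here
    # are random placeholders; in practice they would be trained against
    # off-line EFIT results.
    h1 = np.tanh(x @ W1 + b1)
    h2 = np.tanh(h1 @ W2 + b2)
    return h2 @ W3 + b3

n_signals, n_outputs = 84, 4   # illustrative dimensions only
W1 = rng.normal(0, 0.1, (n_signals, 30)); b1 = np.zeros(30)
W2 = rng.normal(0, 0.1, (30, 20));        b2 = np.zeros(20)
W3 = rng.normal(0, 0.1, (20, n_outputs)); b3 = np.zeros(n_outputs)

signals = rng.normal(size=(1, n_signals))  # one time slice of diagnostics
y = mlp_forward(signals, W1, b1, W2, b2, W3, b3)
```

The appeal for real-time use is that this forward pass is a handful of small matrix products, orders of magnitude cheaper than iterating a Grad-Shafranov solver.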

  10. Improving the quality of small animal brain pinhole SPECT imaging by Bayesian reconstruction

    International Nuclear Information System (INIS)

    Sohlberg, Antti; Lensu, Sanna; Jolkkonen, Jukka; Tuomisto, Leena; Ruotsalainen, Ulla; Kuikka, Jyrki T.

    2004-01-01

    The possibility of using existing hardware makes pinhole single-photon emission computed tomography (SPECT) attractive when pursuing the ultra-high resolution required for small animal brain imaging. Unfortunately, the poor sensitivity and the heavy weight of the collimator hamper the use of pinhole SPECT in animal studies by generating noisy and misaligned projections. To improve the image quality we have developed a new Bayesian reconstruction method, pinhole median root prior (PH-MRP), which prevents the excessive noise accumulation from the projections to the reconstructed image. The PH-MRP algorithm was used to reconstruct data acquired with our small animal rotating device, which was designed to reduce the rotation orbit misalignments. Phantom experiments were performed to test the device and compare the PH-MRP with the conventional Feldkamp-Davis-Kress (FDK) and pinhole ordered subsets maximum likelihood expectation maximisation (PH-OSEM) reconstruction algorithms. The feasibility of the system for small animal brain imaging was studied with Han-Wistar rats injected with 123I-epidepride and 99mTc-hydroxymethylene diphosphonate. Considering all the experiments, no shape distortions due to orbit misalignments were encountered and remarkable improvements in noise characteristics and also in overall image quality were observed when the PH-MRP was applied instead of the FDK or PH-OSEM. In addition, the proposed methods utilise existing hardware and require only a certain amount of construction and programming work, making them easy to implement. (orig.)

  11. Compressed-Sensing Reconstruction Based on Block Sparse Bayesian Learning in Bearing-Condition Monitoring.

    Science.gov (United States)

    Sun, Jiedi; Yu, Yang; Wen, Jiangtao

    2017-06-21

    Remote monitoring of bearing conditions using wireless sensor networks (WSNs) is a developing trend in industry. In complicated industrial environments, WSNs face three main constraints: low energy, limited memory, and low computational capability. Conventional data-compression methods, which concentrate on data compression alone, cannot overcome these limitations. To address these problems, this paper proposes a compressed data acquisition and reconstruction scheme based on Compressed Sensing (CS), a novel signal-processing technique, and applies it to bearing-condition monitoring via WSNs. Compressed data acquisition is realized by projection transformation and can greatly reduce the data volume that the nodes must process and transmit. The original signals are then reconstructed in the host computer by more elaborate algorithms. Bearing vibration signals not only exhibit sparsity but also have specific structures. This paper introduces the block sparse Bayesian learning (BSBL) algorithm, which exploits the block property and inherent structure of the signals to reconstruct the sparse CS coefficients in transform domains and then recover the original signals. Using BSBL, CS reconstruction can be improved remarkably. Experiments and analyses show that the BSBL method performs well and is suitable for practical bearing-condition monitoring.
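The acquisition side described above (projection transformation at the sensor node) can be sketched as follows; the signal content, sizes, and Gaussian projection matrix are illustrative assumptions for the sketch:

```python
import numpy as np

rng = np.random.default_rng(3)

N = 512   # samples acquired per frame at the sensor node
M = 128   # compressed measurements actually transmitted (4x reduction)

# Illustrative bearing-vibration surrogate: sparse in frequency (two tones).
t = np.arange(N) / N
x = np.sin(2 * np.pi * 32 * t) + 0.5 * np.sin(2 * np.pi * 75 * t)

# Compressed acquisition by random projection: the node computes and
# transmits only y = Phi @ x, greatly reducing the data volume.
Phi = rng.normal(0.0, 1.0 / np.sqrt(M), size=(M, N))
y = Phi @ x

# Reconstruction at the host (e.g. via BSBL or another CS solver) would
# recover x from (y, Phi); only the acquisition side is sketched here.
```

The node-side cost is a single matrix-vector product, which matches the low-energy, low-capability constraints the abstract lists.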

  12. Bayesian penalized-likelihood reconstruction algorithm suppresses edge artifacts in PET reconstruction based on point-spread-function.

    Science.gov (United States)

    Yamaguchi, Shotaro; Wagatsuma, Kei; Miwa, Kenta; Ishii, Kenji; Inoue, Kazumasa; Fukushi, Masahiro

    2018-03-01

    The Bayesian penalized-likelihood reconstruction algorithm (BPL), Q.Clear, uses a relative difference penalty as a regularization function to control image noise and the degree of edge preservation in PET images. The present study aimed to determine the suppression of edge artifacts due to point-spread-function (PSF) correction when using Q.Clear. The spheres of a cylindrical phantom were imaged against a background of 5.3 kBq/mL of [18F]FDG at sphere-to-background ratios (SBR) of 16, 8, 4 and 2. The background also contained water and spheres containing 21.2 kBq/mL of [18F]FDG as a non-background condition. All data were acquired using a Discovery PET/CT 710 and were reconstructed using three-dimensional ordered-subset expectation maximization with time-of-flight (TOF) and PSF correction (3D-OSEM), and Q.Clear with TOF (BPL). We investigated β-values of 200-800 using BPL. The PET images were analyzed by visual assessment and profile curves, and edge variability and contrast recovery coefficients were measured. The 38- and 27-mm spheres were surrounded by higher radioactivity concentrations when reconstructed with 3D-OSEM as opposed to BPL, which suppressed edge artifacts. Images of 10-mm spheres had sharper overshoot at high SBR and in the non-background condition when reconstructed with BPL. Although the contrast recovery coefficients of 10-mm spheres in BPL decreased as a function of increasing β, a higher penalty parameter decreased the overshoot. BPL is a feasible method for the suppression of edge artifacts of PSF correction, although this depends on SBR and sphere size. Overshoot associated with BPL caused overestimation in small spheres at high SBR. A higher penalty parameter in BPL can suppress overshoot more effectively. Copyright © 2018 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
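The relative difference penalty that BPL uses can be sketched as below; the 1-D neighbourhood and the toy values are illustrative assumptions, whereas in Q.Clear the penalty is applied over 3-D neighbourhoods and scaled by the β parameter:

```python
import numpy as np

def relative_difference_penalty(x, gamma, eps=1e-12):
    # Relative difference penalty over 1-D neighbours.  gamma is the
    # edge-preservation parameter; the overall strength beta multiplies
    # this sum in the full BPL objective.
    d = x[1:] - x[:-1]          # differences between neighbouring voxels
    s = x[1:] + x[:-1]          # local activity scale
    return np.sum(d ** 2 / (s + gamma * np.abs(d) + eps))

x = np.array([1.0, 1.0, 5.0, 5.0])   # a sharp edge in a toy 1-D image
# Larger gamma penalises the 1 -> 5 edge less, i.e. preserves edges more.
low_gamma = relative_difference_penalty(x, gamma=0.0)
high_gamma = relative_difference_penalty(x, gamma=2.0)
```

Dividing by the local activity sum is what makes the penalty "relative": the same absolute difference is penalised less in high-activity regions, which controls noise without flattening genuine edges.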

  13. Accurate reconstruction of insertion-deletion histories by statistical phylogenetics.

    Directory of Open Access Journals (Sweden)

    Oscar Westesson

    Full Text Available The Multiple Sequence Alignment (MSA) is a computational abstraction that represents a partial summary either of indel history, or of structural similarity. Taking the former view (indel history), it is possible to use formal automata theory to generalize the phylogenetic likelihood framework for finite substitution models (Dayhoff's probability matrices and Felsenstein's pruning algorithm to arbitrary-length sequences. In this paper, we report results of a simulation-based benchmark of several methods for reconstruction of indel history. The methods tested include a relatively new algorithm for statistical marginalization of MSAs that sums over a stochastically-sampled ensemble of the most probable evolutionary histories. For mammalian evolutionary parameters on several different trees, the single most likely history sampled by our algorithm appears less biased than histories reconstructed by other MSA methods. The algorithm can also be used for alignment-free inference, where the MSA is explicitly summed out of the analysis. As an illustration of our method, we discuss reconstruction of the evolutionary histories of human protein-coding genes.

  14. Bayesian reconstruction of the velocity distribution of weakly interacting massive particles from direct dark matter detection data

    Energy Technology Data Exchange (ETDEWEB)

    Shan, Chung-Lin, E-mail: clshan@phys.nthu.edu.tw [Physics Division, National Center for Theoretical Sciences No. 101, section 2, Kuang-Fu Road, Hsinchu City, 30013 Taiwan, R.O.C. (China)

    2014-08-01

    In this paper, we extend our earlier work on the reconstruction of the (time-averaged) one-dimensional velocity distribution of Galactic Weakly Interacting Massive Particles (WIMPs) and introduce a Bayesian fitting procedure for the theoretically predicted velocity distribution functions. In this reconstruction process, the (rough) velocity distribution reconstructed directly from raw data of direct Dark Matter detection experiments, i.e. measured recoil energies, with one or more different target materials, is used as 'reconstructed input' information. By assuming a fitting velocity distribution function and scanning the parameter space in a Bayesian analysis, the astronomical characteristic parameters, e.g. the Solar and Earth's Galactic velocities, are pinned down as the output results.

  15. Sparse Bayesian framework applied to 3D super-resolution reconstruction in fetal brain MRI

    Science.gov (United States)

    Becerra, Laura C.; Velasco Toledo, Nelson; Romero Castro, Eduardo

    2015-01-01

    Fetal Magnetic Resonance (FMR) is an imaging technique of growing importance, as it allows assessing brain development and thus making an early diagnosis of congenital abnormalities. Spatial resolution is limited by the short acquisition time and unpredictable fetal movements; in consequence, the resulting images are characterized by non-parallel projection planes composed of anisotropic voxels. Sparse Bayesian representation is a flexible strategy able to model complex relationships. Super-resolution is approached as a regression problem, whose main advantage is the capability to learn data relations from observations. Quantitative performance evaluation was carried out using synthetic images; the proposed method demonstrates better reconstruction quality than a standard interpolation approach. The presented method is a promising approach to improving the quality of information about the 3-D fetal brain structure, and hence to assessing brain development and making an early diagnosis of congenital abnormalities.

  16. Joint reconstruction of divergence times and life-history evolution in placental mammals using a phylogenetic covariance model.

    Science.gov (United States)

    Lartillot, Nicolas; Delsuc, Frédéric

    2012-06-01

    Violation of the molecular clock has been amply documented, and is now routinely taken into account by molecular dating methods. Comparative analyses have revealed a systematic component in rate variation, relating it to the evolution of life-history traits, such as body size or generation time. Life-history evolution can be reconstructed using Brownian models. However, the resulting estimates are typically uncertain, and potentially sensitive to the underlying assumptions. As a way of obtaining more accurate ancestral trait and divergence time reconstructions, correlations between life-history traits and substitution rates could be used as an additional source of information. In this direction, a Bayesian framework for jointly reconstructing rates, traits, and dates was previously introduced. Here, we apply this model to a 17 protein-coding gene alignment for 73 placental taxa. Our analysis indicates that the coupling between molecules and life history can lead to a reevaluation of ancestral life-history profiles, in particular for groups displaying convergent evolution in body size. However, reconstructions are sensitive to fossil calibrations and to the Brownian assumption. Altogether, our analysis suggests that further integrating inference of rates and traits might be particularly useful for neontological macroevolutionary comparative studies. © 2012 The Author(s). Evolution © 2012 The Society for the Study of Evolution.

  17. Bayesian 3D X-ray computed tomography image reconstruction with a scaled Gaussian mixture prior model

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Li; Gac, Nicolas; Mohammad-Djafari, Ali [Laboratoire des Signaux et Systèmes 3, Rue Joliot-Curie 91192 Gif sur Yvette (France)

    2015-01-13

    In order to improve the quality of 3D X-ray tomography reconstruction for Non-Destructive Testing (NDT), we investigate hierarchical Bayesian methods. In NDT, useful prior information on the volume, such as the limited number of materials or the presence of homogeneous areas, can be included in iterative reconstruction algorithms. In hierarchical Bayesian methods, not only is the volume estimated through its prior model, but so are the hyperparameters of this prior. This additional complexity in the reconstruction methods, when applied to large volumes (from 512³ to 8192³ voxels), results in an increasing computational cost. To reduce it, the hierarchical Bayesian methods investigated in this paper rely on algorithmic acceleration by Variational Bayesian Approximation (VBA) [1] and hardware acceleration with projection and back-projection operators parallelized on many-core processors such as GPUs [2]. We consider a Student-t prior on the gradient of the image, implemented in a hierarchical way [3, 4, 1]. The operators H (forward, or projection) and H^t (adjoint, or back-projection) implemented on multi-GPU [2] are used in this study. Different methods are evaluated on the synthetic 'Shepp and Logan' volume in terms of reconstruction quality and time. We use several simple regularizations of order 1 and order 2; other prior models also exist [5]. For a discrete image, segmentation and reconstruction can sometimes be done simultaneously, in which case the reconstruction can be performed with fewer projections.

  18. Bayesian network reconstruction using systems genetics data: comparison of MCMC methods.

    Science.gov (United States)

    Tasaki, Shinya; Sauerwine, Ben; Hoff, Bruce; Toyoshiba, Hiroyoshi; Gaiteri, Chris; Chaibub Neto, Elias

    2015-04-01

    Reconstructing biological networks using high-throughput technologies has the potential to produce condition-specific interactomes. But are these reconstructed networks a reliable source of biological interactions? Do some network inference methods offer dramatically improved performance on certain types of networks? To facilitate the use of network inference methods in systems biology, we report a large-scale simulation study comparing the ability of Markov chain Monte Carlo (MCMC) samplers to reverse engineer Bayesian networks. The MCMC samplers we investigated included foundational and state-of-the-art Metropolis-Hastings and Gibbs sampling approaches, as well as novel samplers we have designed. To enable a comprehensive comparison, we simulated gene expression and genetics data from known network structures under a range of biologically plausible scenarios. We examine the overall quality of network inference via different methods, as well as how their performance is affected by network characteristics. Our simulations reveal that network size, edge density, and strength of gene-to-gene signaling are major parameters that differentiate the performance of various samplers. Specifically, more recent samplers including our novel methods outperform traditional samplers for highly interconnected large networks with strong gene-to-gene signaling. Our newly developed samplers show comparable or superior performance to the top existing methods. Moreover, this performance gain is strongest in networks with biologically oriented topology, which indicates that our novel samplers are suitable for inferring biological networks. The performance of MCMC samplers in this simulation framework can guide the choice of methods for network reconstruction using systems genetics data. Copyright © 2015 by the Genetics Society of America.

  19. Gene regulatory network reconstruction by Bayesian integration of prior knowledge and/or different experimental conditions.

    Science.gov (United States)

    Werhli, Adriano V; Husmeier, Dirk

    2008-06-01

    There have been various attempts to improve the reconstruction of gene regulatory networks from microarray data by the systematic integration of biological prior knowledge. Our approach is based on pioneering work by Imoto et al. where the prior knowledge is expressed in terms of energy functions, from which a prior distribution over network structures is obtained in the form of a Gibbs distribution. The hyperparameters of this distribution represent the weights associated with the prior knowledge relative to the data. We have derived and tested a Markov chain Monte Carlo (MCMC) scheme for sampling networks and hyperparameters simultaneously from the posterior distribution, thereby automatically learning how to trade off information from the prior knowledge and the data. We have extended this approach to a Bayesian coupling scheme for learning gene regulatory networks from a combination of related data sets, which were obtained under different experimental conditions and are therefore potentially associated with different active subpathways. The proposed coupling scheme is a compromise between (1) learning networks from the different subsets separately, whereby no information between the different experiments is shared; and (2) learning networks from a monolithic fusion of the individual data sets, which does not provide any mechanism for uncovering differences between the network structures associated with the different experimental conditions. We have assessed the viability of all proposed methods on data related to the Raf signaling pathway, generated both synthetically and in cytometry experiments.

  20. Reconstructing the massive black hole cosmic history through gravitational waves

    International Nuclear Information System (INIS)

    Sesana, Alberto; Gair, Jonathan; Berti, Emanuele; Volonteri, Marta

    2011-01-01

    The massive black holes we observe in galaxies today are the natural end-product of a complex evolutionary path, in which black holes seeded in proto-galaxies at high redshift grow through cosmic history via a sequence of mergers and accretion episodes. Electromagnetic observations probe a small subset of the population of massive black holes (namely, those that are active or those that are very close to us), but planned space-based gravitational wave observatories such as the Laser Interferometer Space Antenna (LISA) can measure the parameters of 'electromagnetically invisible' massive black holes out to high redshift. In this paper we introduce a Bayesian framework to analyze the information that can be gathered from a set of such measurements. Our goal is to connect a set of massive black hole binary merger observations to the underlying model of massive black hole formation. In other words, given a set of observed massive black hole coalescences, we assess what information can be extracted about the underlying massive black hole population model. For concreteness we consider ten specific models of massive black hole formation, chosen to probe four important (and largely unconstrained) aspects of the input physics used in structure formation simulations: seed formation, metallicity 'feedback', accretion efficiency and accretion geometry. For the first time we allow for the possibility of 'model mixing', by drawing the observed population from some combination of the 'pure' models that have been simulated. A Bayesian analysis allows us to recover a posterior probability distribution for the 'mixing parameters' that characterize the fractions of each model represented in the observed distribution. Our work shows that LISA has enormous potential to probe the underlying physics of structure formation.

  1. Reconstruction of a beech population bottleneck using archival demographic information and Bayesian analysis of genetic data.

    Science.gov (United States)

    Lander, Tonya A; Oddou-Muratorio, Sylvie; Prouillet-Leplat, Helene; Klein, Etienne K

    2011-12-01

    Range expansion and contraction has occurred in the history of most species and can seriously impact patterns of genetic diversity. Historical data about range change are rare and generally appropriate for studies at large scales, whereas the individual pollen and seed dispersal events that form the basis of gene flow and colonization generally occur at a local scale. In this study, we investigated range change in Fagus sylvatica on Mont Ventoux, France, using historical data from 1838 to the present and approximate Bayesian computation (ABC) analyses of genetic data. From the historical data, we identified a population minimum in 1845 and located remnant populations at least 200 years old. The ABC analysis selected a demographic scenario with three populations, corresponding to two remnant populations and one area of recent expansion. It also identified expansion from a smaller ancestral population but did not find that this expansion followed a population bottleneck, as suggested by the historical data. Despite strong support for the selected scenario for our data set, the ABC approach showed a low power to discriminate among scenarios on average and a low ability to accurately estimate effective population sizes and divergence dates, probably due to the temporal scale of the study. This study provides an unusual opportunity to test ABC analysis in a system with a well-documented demographic history and to identify discrepancies between the results of historical, classical population genetic and ABC analyses. The results also provide valuable insights into genetic processes at work at a fine spatial and temporal scale in range change and colonization. © 2011 Blackwell Publishing Ltd.
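The rejection-ABC logic underlying such analyses can be sketched as follows; the simulator, prior bounds, summary statistic, and tolerance below are invented for the sketch, whereas real analyses use coalescent simulators and richer summary statistics:

```python
import numpy as np

rng = np.random.default_rng(4)

def simulate_heterozygosity(ne, mu=1e-5, n_loci=200):
    # Toy simulator: expected heterozygosity under mutation-drift
    # balance, H = theta / (1 + theta) with theta = 4*Ne*mu, plus
    # sampling noise averaged over loci.  Purely illustrative.
    theta = 4.0 * ne * mu
    h = theta / (1.0 + theta)
    return h + rng.normal(0.0, 0.01, size=n_loci).mean()

observed = simulate_heterozygosity(5000)     # pseudo-observed data

# Rejection ABC: draw Ne from the prior, simulate, and keep draws whose
# summary statistic falls within a tolerance of the observed value.
prior_draws = rng.uniform(100, 20000, size=5000)
sims = np.array([simulate_heterozygosity(ne) for ne in prior_draws])
accepted = prior_draws[np.abs(sims - observed) < 0.005]
# 'accepted' approximates the posterior for Ne under this toy model.
```

The abstract's finding that ABC struggled to estimate effective population sizes corresponds, in this sketch, to the accepted set remaining wide when the summary statistic is only weakly informative about Ne.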

  2. Genetic and linguistic histories in Central Asia inferred using approximate Bayesian computations.

    Science.gov (United States)

    Thouzeau, Valentin; Mennecier, Philippe; Verdu, Paul; Austerlitz, Frédéric

    2017-08-30

    Linguistic and genetic data have been widely compared, but the histories underlying these descriptions are rarely jointly inferred. We developed a unique methodological framework for jointly analysing linguistic diversity and genetic polymorphism data, to infer the past history of separation, exchange and admixture events among human populations. The method relies on approximate Bayesian computation, which enables the identification of the most probable historical scenario underlying each type of data and the inference of the parameters of these scenarios. For this purpose, we developed a new computer program, PopLingSim, that simulates the evolution of linguistic diversity, and coupled it with an existing coalescent-based genetic simulation program to simulate both linguistic and genetic data within a set of populations. Applying this framework to a large linguistic and genetic dataset from Central Asia, we found several differences between linguistic and genetic histories. In particular, we showed how genetic and linguistic exchanges differed in the past in this area: some cultural exchanges were maintained without genetic exchanges. The methodological framework and the linguistic simulation tool developed here can be used in future work to disentangle the complex linguistic and genetic evolutions underlying human biological and cultural histories. © 2017 The Author(s).

  3. Application of Bayesian neural networks to energy reconstruction in EAS experiments for ground-based TeV astrophysics

    Science.gov (United States)

    Bai, Y.; Xu, Y.; Pan, J.; Lan, J. Q.; Gao, W. W.

    2016-07-01

    A toy detector array is designed to detect showers generated by the interaction between TeV cosmic rays and the atmosphere. In the present paper, the primary energies of showers detected by the array are reconstructed with a Bayesian neural network (BNN) algorithm and with a standard method, as used in the LHAASO experiment [1], respectively. Compared to the standard method, the energy resolution is significantly improved using the BNN. The improvement is more pronounced for high-energy showers than for low-energy ones.

  4. The confounding effect of population structure on bayesian skyline plot inferences of demographic history

    DEFF Research Database (Denmark)

    Heller, Rasmus; Chikhi, Lounes; Siegismund, Hans

    2013-01-01

    Many coalescent-based methods aiming to infer the demographic history of populations assume a single, isolated and panmictic population (i.e. a Wright-Fisher model). While this assumption may be reasonable under many conditions, several recent studies have shown that the results can be misleading… when it is violated. Among the most widely applied demographic inference methods are Bayesian skyline plots (BSPs), which are used across a range of biological fields. Violations of the panmixia assumption are to be expected in many biological systems, but the consequences for skyline plot inferences… the best scheme for inferring demographic change over a typical time scale. Analyses of data from a structured African buffalo population demonstrate how BSP results can be strengthened by simulations. We recommend that sample selection should be carefully considered in relation to population structure…

  5. Systematics and morphological evolution within the moss family Bryaceae: a comparison between parsimony and Bayesian methods for reconstruction of ancestral character states.

    Science.gov (United States)

    Pedersen, Niklas; Holyoak, David T; Newton, Angela E

    2007-06-01

    The Bryaceae are a large cosmopolitan moss family including genera of significant morphological and taxonomic complexity. Phylogenetic relationships within the Bryaceae were reconstructed based on DNA sequence data from all three genomic compartments. In addition, maximum parsimony and Bayesian inference were employed to reconstruct ancestral character states of 38 morphological plus four habitat characters and eight insertion/deletion events. The recovered phylogenetic patterns are generally in accord with previous phylogenies based on chloroplast DNA sequence data and three major clades are identified. The first clade comprises Bryum bornholmense, B. rubens, B. caespiticium, and Plagiobryum. This corroborates the hypothesis suggested by previous studies that several Bryum species are more closely related to Plagiobryum than to the core Bryum species. The second clade includes Acidodontium, Anomobryum, and Haplodontium, while the third clade contains the core Bryum species plus Imbribryum. Within the latter clade, B. subapiculatum and B. tenuisetum form the sister clade to Imbribryum. Reconstructions of ancestral character states under maximum parsimony and Bayesian inference suggest fourteen morphological synapomorphies for the ingroup and synapomorphies are detected for most clades within the ingroup. Maximum parsimony and Bayesian reconstructions of ancestral character states are mostly congruent although Bayesian inference shows that the posterior probability of ancestral character states may decrease dramatically when node support is taken into account. Bayesian inference also indicates that reconstructions may be ambiguous at internal nodes for highly polymorphic characters.
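
    The maximum-parsimony side of ancestral character-state reconstruction can be sketched with the bottom-up pass of Fitch's small-parsimony algorithm on a toy binary character. The five-taxon tree and the character states below are invented for illustration and are unrelated to the Bryaceae data:

```python
# Tree as nested tuples; leaves are strings carrying observed character states.
def fitch_sets(node, states):
    """Bottom-up pass of the Fitch algorithm: return the candidate
    ancestral state set at `node` and the running change count."""
    if isinstance(node, str):                  # leaf
        return {states[node]}, 0
    left, right = node
    s1, c1 = fitch_sets(left, states)
    s2, c2 = fitch_sets(right, states)
    inter = s1 & s2
    if inter:
        return inter, c1 + c2                  # no state change needed here
    return s1 | s2, c1 + c2 + 1                # one state change on this node

# Toy character: leaf -> state (0 = character absent, 1 = present).
leaf_states = {"A": 1, "B": 1, "C": 0, "D": 0, "E": 1}
tree = ((("A", "B"), "C"), ("D", "E"))
root_set, changes = fitch_sets(tree, leaf_states)
print(root_set, changes)                       # ambiguous root, 2 changes
```

    An ambiguous root set like {0, 1} is exactly the situation the abstract alludes to: a Bayesian reconstruction would instead report a posterior probability for each state, which can drop sharply once node support is taken into account.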

  6. Reconstructing the invasion history of Heracleum persicum (Apiaceae) into Europe

    Czech Academy of Sciences Publication Activity Database

    Rijal, D. P.; Alm, T.; Jahodová, Šárka; Stenoien, H. K.; Alsos, I. G.

    2015-01-01

    Roč. 24, č. 22 (2015), s. 5522-5543 ISSN 0962-1083 Institutional support: RVO:67985939 Keywords : approximate Bayesian computation * genetic variation * population genetics Subject RIV: EH - Ecology, Behaviour Impact factor: 5.947, year: 2015

  7. Reconstructing the population genetic history of the Caribbean.

    Directory of Open Access Journals (Sweden)

    Andrés Moreno-Estrada

    2013-11-01

    Full Text Available The Caribbean basin is home to some of the most complex interactions in recent history among previously diverged human populations. Here, we investigate the population genetic history of this region by characterizing patterns of genome-wide variation among 330 individuals from three of the Greater Antilles (Cuba, Puerto Rico, Hispaniola), two mainland (Honduras, Colombia), and three Native South American (Yukpa, Bari, and Warao) populations. We combine these data with a unique database of genomic variation in over 3,000 individuals from diverse European, African, and Native American populations. We use local ancestry inference and tract length distributions to test different demographic scenarios for the pre- and post-colonial history of the region. We develop a novel ancestry-specific PCA (ASPCA) method to reconstruct the sub-continental origin of Native American, European, and African haplotypes from admixed genomes. We find that the most likely source of the indigenous ancestry in Caribbean islanders is a Native South American component shared among inland Amazonian tribes, Central America, and the Yucatan peninsula, suggesting extensive gene flow across the Caribbean in pre-Columbian times. We find evidence of two pulses of African migration. The first pulse--which today is reflected by shorter, older ancestry tracts--consists of a genetic component more similar to coastal West African regions involved in early stages of the trans-Atlantic slave trade. The second pulse--reflected by longer, younger tracts--is more similar to present-day West-Central African populations, supporting historical records of later transatlantic deportation. Surprisingly, we also identify a Latino-specific European component that has significantly diverged from its parental Iberian source populations, presumably as a result of small European founder population size. We demonstrate that the ancestral components in admixed genomes can be traced back to distinct sub

  8. Reconstructing the Population Genetic History of the Caribbean

    Science.gov (United States)

    Moreno-Estrada, Andrés; Gravel, Simon; Zakharia, Fouad; McCauley, Jacob L.; Byrnes, Jake K.; Gignoux, Christopher R.; Ortiz-Tello, Patricia A.; Martínez, Ricardo J.; Hedges, Dale J.; Morris, Richard W.; Eng, Celeste; Sandoval, Karla; Acevedo-Acevedo, Suehelay; Norman, Paul J.; Layrisse, Zulay; Parham, Peter; Martínez-Cruzado, Juan Carlos; Burchard, Esteban González; Cuccaro, Michael L.; Martin, Eden R.; Bustamante, Carlos D.

    2013-01-01

    The Caribbean basin is home to some of the most complex interactions in recent history among previously diverged human populations. Here, we investigate the population genetic history of this region by characterizing patterns of genome-wide variation among 330 individuals from three of the Greater Antilles (Cuba, Puerto Rico, Hispaniola), two mainland (Honduras, Colombia), and three Native South American (Yukpa, Bari, and Warao) populations. We combine these data with a unique database of genomic variation in over 3,000 individuals from diverse European, African, and Native American populations. We use local ancestry inference and tract length distributions to test different demographic scenarios for the pre- and post-colonial history of the region. We develop a novel ancestry-specific PCA (ASPCA) method to reconstruct the sub-continental origin of Native American, European, and African haplotypes from admixed genomes. We find that the most likely source of the indigenous ancestry in Caribbean islanders is a Native South American component shared among inland Amazonian tribes, Central America, and the Yucatan peninsula, suggesting extensive gene flow across the Caribbean in pre-Columbian times. We find evidence of two pulses of African migration. The first pulse—which today is reflected by shorter, older ancestry tracts—consists of a genetic component more similar to coastal West African regions involved in early stages of the trans-Atlantic slave trade. The second pulse—reflected by longer, younger tracts—is more similar to present-day West-Central African populations, supporting historical records of later transatlantic deportation. Surprisingly, we also identify a Latino-specific European component that has significantly diverged from its parental Iberian source populations, presumably as a result of small European founder population size. We demonstrate that the ancestral components in admixed genomes can be traced back to distinct sub

  9. Reconstructing the population genetic history of the Caribbean.

    Science.gov (United States)

    Moreno-Estrada, Andrés; Gravel, Simon; Zakharia, Fouad; McCauley, Jacob L; Byrnes, Jake K; Gignoux, Christopher R; Ortiz-Tello, Patricia A; Martínez, Ricardo J; Hedges, Dale J; Morris, Richard W; Eng, Celeste; Sandoval, Karla; Acevedo-Acevedo, Suehelay; Norman, Paul J; Layrisse, Zulay; Parham, Peter; Martínez-Cruzado, Juan Carlos; Burchard, Esteban González; Cuccaro, Michael L; Martin, Eden R; Bustamante, Carlos D

    2013-11-01

    The Caribbean basin is home to some of the most complex interactions in recent history among previously diverged human populations. Here, we investigate the population genetic history of this region by characterizing patterns of genome-wide variation among 330 individuals from three of the Greater Antilles (Cuba, Puerto Rico, Hispaniola), two mainland (Honduras, Colombia), and three Native South American (Yukpa, Bari, and Warao) populations. We combine these data with a unique database of genomic variation in over 3,000 individuals from diverse European, African, and Native American populations. We use local ancestry inference and tract length distributions to test different demographic scenarios for the pre- and post-colonial history of the region. We develop a novel ancestry-specific PCA (ASPCA) method to reconstruct the sub-continental origin of Native American, European, and African haplotypes from admixed genomes. We find that the most likely source of the indigenous ancestry in Caribbean islanders is a Native South American component shared among inland Amazonian tribes, Central America, and the Yucatan peninsula, suggesting extensive gene flow across the Caribbean in pre-Columbian times. We find evidence of two pulses of African migration. The first pulse--which today is reflected by shorter, older ancestry tracts--consists of a genetic component more similar to coastal West African regions involved in early stages of the trans-Atlantic slave trade. The second pulse--reflected by longer, younger tracts--is more similar to present-day West-Central African populations, supporting historical records of later transatlantic deportation. Surprisingly, we also identify a Latino-specific European component that has significantly diverged from its parental Iberian source populations, presumably as a result of small European founder population size. We demonstrate that the ancestral components in admixed genomes can be traced back to distinct sub-continental source

  10. The Phylogeographic History of the New World Screwworm Fly, Inferred by Approximate Bayesian Computation Analysis

    Science.gov (United States)

    Azeredo-Espin, Ana Maria L.

    2013-01-01

    Insect pest phylogeography might be shaped both by biogeographic events and by human influence. Here, we conducted an approximate Bayesian computation (ABC) analysis to investigate the phylogeography of the New World screwworm fly, Cochliomyia hominivorax, with the aim of understanding its population history and its order and time of divergence. Our ABC analysis supports that populations spread from North to South in the Americas, in at least two different moments. The first split occurred between the North/Central American and South American populations at the end of the Last Glacial Maximum (15,300-19,000 YBP). The second split occurred between the North and South Amazonian populations in the transition between the Pleistocene and the Holocene eras (9,100-11,000 YBP). The species also experienced population expansion. Phylogenetic analysis likewise suggests this north to south colonization and Maxent models suggest an increase in the number of suitable areas in South America from the past to present. We found that the phylogeographic patterns observed in C. hominivorax cannot be explained only by climatic oscillations and can be connected to host population histories. Interestingly, we found that these patterns are very coincident with general patterns of ancient human movements in the Americas, suggesting that humans might have played a crucial role in shaping the distribution and population structure of this insect pest. This work presents the first hypothesis test regarding the processes that shaped the current phylogeographic structure of C. hominivorax and represents an alternate perspective on investigating the problem of insect pests. PMID:24098436

  11. Reconstruction of prehistoric pottery use from fatty acid carbon isotope signatures using Bayesian inference

    Czech Academy of Sciences Publication Activity Database

    Fernandes, R.; Eley, Y.; Brabec, Marek; Lucquin, A.; Millard, A.; Craig, O.E.

    2018-01-01

    Roč. 117, March (2018), s. 31-42 ISSN 0146-6380 Institutional support: RVO:67985807 Keywords : Fatty acids * carbon isotopes * pottery use * Bayesian mixing models * FRUITS Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 3.081, year: 2016

  12. Simultaneous EEG Source and Forward Model Reconstruction (SOFOMORE) using a Hierarchical Bayesian Approach

    DEFF Research Database (Denmark)

    Stahlhut, Carsten; Mørup, Morten; Winther, Ole

    2011-01-01

    We present an approach to handle forward model uncertainty for EEG source reconstruction. A stochastic forward model representation is motivated by the many random contributions to the path from sources to measurements including the tissue conductivity distribution, the geometry of the cortical...... models. Analysis of simulated and real EEG data provide evidence that reconstruction of the forward model leads to improved source estimates....

  13. Exploring Approximate Bayesian Computation for inferring recent demographic history with genomic markers in nonmodel species.

    Science.gov (United States)

    Elleouet, Joane S; Aitken, Sally N

    2018-01-22

    Approximate Bayesian computation (ABC) is widely used to infer demographic history of populations and species using DNA markers. Genomic markers can now be developed for nonmodel species using reduced representation library (RRL) sequencing methods that select a fraction of the genome using targeted sequence capture or restriction enzymes (genotyping-by-sequencing, GBS). We explored the influence of marker number and length, knowledge of gametic phase, and tradeoffs between sample size and sequencing depth on the quality of demographic inferences performed with ABC. We focused on two-population models of recent spatial expansion with varying numbers of unknown parameters. Performing ABC on simulated data sets with known parameter values, we found that the timing of a recent spatial expansion event could be precisely estimated in a three-parameter model. Taking into account uncertainty in parameters such as initial population size and migration rate collectively decreased the precision of inferences dramatically. Phasing haplotypes did not improve results, regardless of sequence length. Numerous short sequences were as valuable as fewer, longer sequences, and performed best when a large sample size was sequenced at low individual depth, even when sequencing errors were added. ABC results were similar to results obtained with an alternative method based on the site frequency spectrum (SFS) when performed with unphased GBS-type markers. We conclude that unphased GBS-type data sets can be sufficient to precisely infer simple demographic models, and discuss possible improvements for the use of ABC with genomic data. © 2018 John Wiley & Sons Ltd.
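
    Several records above rely on approximate Bayesian computation. Its simplest form, rejection ABC, is easy to sketch on a toy problem: draw a parameter from the prior, simulate data under it, and keep the draw when a summary statistic lands close to the observed one. The model, prior range, summary statistic and tolerance below are arbitrary illustrative choices:

```python
import random
import statistics

def simulate(mean, n=200, rng=random):
    """Toy data-generating model: n Gaussian draws around an unknown mean."""
    return [rng.gauss(mean, 1.0) for _ in range(n)]

def abc_rejection(observed, n_draws=20000, tolerance=0.05, seed=1):
    """Rejection ABC: accept prior draws whose simulated summary statistic
    (here the sample mean) falls within `tolerance` of the observed one."""
    rng = random.Random(seed)
    s_obs = statistics.mean(observed)
    accepted = []
    for _ in range(n_draws):
        theta = rng.uniform(-5.0, 5.0)             # flat prior on the mean
        if abs(statistics.mean(simulate(theta, rng=rng)) - s_obs) < tolerance:
            accepted.append(theta)                 # draw joins the posterior sample
    return accepted

observed = simulate(1.5, rng=random.Random(0))     # "observed" data, true mean 1.5
posterior = abc_rejection(observed)
print(len(posterior), round(statistics.mean(posterior), 2))
```

    Real applications such as those in the records above differ mainly in scale: richer simulators, many summary statistics, model choice among competing scenarios, and regression adjustment of the accepted draws.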

  14. The resolved star formation history of M51a through successive Bayesian marginalization

    Science.gov (United States)

    Martínez-García, Eric E.; Bruzual, Gustavo; Magris C., Gladis; González-Lópezlira, Rosa A.

    2018-02-01

    We have obtained the time- and space-resolved star formation history (SFH) of M51a (NGC 5194) by fitting Galaxy Evolution Explorer (GALEX), Sloan Digital Sky Survey and near-infrared pixel-by-pixel photometry to a comprehensive library of stellar population synthesis models drawn from the Synthetic Spectral Atlas of Galaxies (SSAG). We fit for each space-resolved element (pixel) an independent model where the SFH is averaged in 137 age bins, each one 100 Myr wide. We used the Bayesian Successive Priors (BSP) algorithm to mitigate the bias in the present-day spatial mass distribution. We test BSP with different prior probability distribution functions (PDFs); this exercise suggests that the best prior PDF is the one concordant with the spatial distribution of the stellar mass as inferred from the near-infrared images. We also demonstrate that varying the implicit prior PDF of the SFH in SSAG does not affect the results. By summing the contributions to the global star formation rate of each pixel, at each age bin, we have assembled the resolved SFH of the whole galaxy. According to these results, the star formation rate of M51a was exponentially increasing for the first 10 Gyr after the big bang, and then turned into an exponentially decreasing function until the present day. Superimposed, we find a main burst of star formation at t ≈ 11.9 Gyr after the big bang.

  15. Reconstructing constructivism: Causal models, Bayesian learning mechanisms and the theory theory

    OpenAIRE

    Gopnik, Alison; Wellman, Henry M.

    2012-01-01

    We propose a new version of the “theory theory” grounded in the computational framework of probabilistic causal models and Bayesian learning. Probabilistic models allow a constructivist but rigorous and detailed approach to cognitive development. They also explain the learning of both more specific causal hypotheses and more abstract framework theories. We outline the new theoretical ideas, explain the computational framework in an intuitive and non-technical way, and review an extensive but ...

  16. The phylogeographic history of the new world screwworm fly, inferred by approximate bayesian computation analysis.

    Directory of Open Access Journals (Sweden)

    Pablo Fresia

    Full Text Available Insect pest phylogeography might be shaped both by biogeographic events and by human influence. Here, we conducted an approximate Bayesian computation (ABC) analysis to investigate the phylogeography of the New World screwworm fly, Cochliomyia hominivorax, with the aim of understanding its population history and its order and time of divergence. Our ABC analysis supports that populations spread from North to South in the Americas, in at least two different moments. The first split occurred between the North/Central American and South American populations at the end of the Last Glacial Maximum (15,300-19,000 YBP). The second split occurred between the North and South Amazonian populations in the transition between the Pleistocene and the Holocene eras (9,100-11,000 YBP). The species also experienced population expansion. Phylogenetic analysis likewise suggests this north to south colonization and Maxent models suggest an increase in the number of suitable areas in South America from the past to present. We found that the phylogeographic patterns observed in C. hominivorax cannot be explained only by climatic oscillations and can be connected to host population histories. Interestingly, we found that these patterns are very coincident with general patterns of ancient human movements in the Americas, suggesting that humans might have played a crucial role in shaping the distribution and population structure of this insect pest. This work presents the first hypothesis test regarding the processes that shaped the current phylogeographic structure of C. hominivorax and represents an alternate perspective on investigating the problem of insect pests.

  17. Quantitative diet reconstruction of a Neolithic population using a Bayesian mixing model (FRUITS): The case study of Ostorf (Germany).

    Science.gov (United States)

    Fernandes, Ricardo; Grootes, Pieter; Nadeau, Marie-Josée; Nehlich, Olaf

    2015-07-14

    The island cemetery site of Ostorf (Germany) consists of individual human graves containing Funnel Beaker ceramics dating to the Early or Middle Neolithic. However, previous isotope and radiocarbon analysis demonstrated that the Ostorf individuals had a diet rich in freshwater fish. The present study was undertaken to quantitatively reconstruct the diet of the Ostorf population and establish whether dietary habits are consistent with the traditional characterization of a Neolithic diet. Quantitative diet reconstruction was achieved through a novel approach consisting of the use of the Bayesian mixing model Food Reconstruction Using Isotopic Transferred Signals (FRUITS) to model isotope measurements from multiple dietary proxies (δ13C and δ15N of collagen, δ13C of bioapatite, δ34S of methionine, and 14C of collagen). The accuracy of model estimates was verified by comparing the agreement between observed and estimated human dietary radiocarbon reservoir effects. Quantitative diet reconstruction estimates confirm that the Ostorf individuals had a high protein intake due to the consumption of fish and terrestrial animal products. However, FRUITS estimates also show that plant foods represented a significant source of calories. Observed and estimated human dietary radiocarbon reservoir effects are in good agreement provided that the aquatic reservoir effect at Lake Ostorf is taken as reference. The Ostorf population apparently adopted elements associated with a Neolithic culture but adapted to available local food resources and implemented a subsistence strategy that involved a large proportion of fish and terrestrial meat consumption. This case study exemplifies the diversity of subsistence strategies followed during the Neolithic. Am J Phys Anthropol, 2015. © 2015 Wiley Periodicals, Inc.
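
    FRUITS itself is a dedicated Bayesian mixing package, but the core idea of isotope mixing-model diet reconstruction can be sketched with a small grid posterior over three food sources and two isotope proxies. All source signatures, the consumer values and the uncertainty below are invented for illustration; a real analysis would also include trophic-level offsets, routing and concentration dependence, as FRUITS does:

```python
import math

# Hypothetical source signatures (per mil): (δ13C, δ15N) means.
sources = {
    "fish":   (-22.0, 12.0),
    "meat":   (-21.0,  7.0),
    "plants": (-26.0,  4.0),
}
consumer = (-22.5, 9.0)   # measured consumer values (invented)
sigma = 0.5               # assumed combined uncertainty per proxy

def log_like(weights):
    """Gaussian likelihood of the consumer signal under a linear mix."""
    ll = 0.0
    for k in range(2):
        mix = sum(w * sources[s][k] for w, s in zip(weights, sources))
        ll += -0.5 * ((consumer[k] - mix) / sigma) ** 2
    return ll

# Grid over the 3-source simplex in 2% steps (flat prior on proportions).
step = 0.02
grid, post = [], []
for i in range(int(1 / step) + 1):
    for j in range(int(1 / step) + 1 - i):
        w = (i * step, j * step, 1.0 - (i + j) * step)
        grid.append(w)
        post.append(math.exp(log_like(w)))
total = sum(post)
post_mean = [sum(p * w[k] for p, w in zip(post, grid)) / total
             for k in range(3)]
print({s: round(m, 2) for s, m in zip(sources, post_mean)})
```

    With these made-up numbers the posterior favours a fish-heavy mix, loosely mirroring the freshwater-fish signal reported for Ostorf; the point of the Bayesian treatment is that each proportion comes with a full posterior distribution rather than a single solved value.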

  18. Reconstructing the insulin secretion rate by Bayesian deconvolution of phase-type densities

    DEFF Research Database (Denmark)

    Andersen, Kim Emil; Højbjerre, Malene

    2005-01-01

    of the insulin secretion rate (ISR) can be done by solving a highly ill-posed deconvolution problem. We represent the ISR, the C-peptide concentration and the convolution kernel as scaled phase-type densities and develop a Bayesian methodology for estimating such densities via Markov chain Monte Carlo techniques....... This allows closed-form evaluation of the ISR. We demonstrate the methodology on experimental data from healthy subjects and obtain results which are more realistic than recently reported conclusions based upon methods where the ISR is considered as piecewise constant....
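
    The authors represent the secretion rate and kernel as phase-type densities inside an MCMC scheme; a much simpler stand-in that conveys why deconvolution needs a prior is the MAP estimate under a Gaussian smoothness prior, which reduces to Tikhonov-regularized least squares. The kernel, grid, pulses and noise level below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 60
t = np.arange(n)

# Hypothetical "true" secretion rate: two smooth pulses.
true = (np.exp(-0.5 * ((t - 15) / 4.0) ** 2)
        + 0.6 * np.exp(-0.5 * ((t - 40) / 6.0) ** 2))

# Single-exponential impulse response (a stand-in for C-peptide kinetics).
kernel = np.exp(-t / 8.0)
K = np.array([[kernel[i - j] if i >= j else 0.0 for j in range(n)]
              for i in range(n)])

y = K @ true + rng.normal(0.0, 0.05, n)    # noisy concentration measurements

# MAP estimate under a Gaussian smoothness prior == Tikhonov solution:
# minimize ||K x - y||^2 + lam * ||D2 x||^2, D2 = second-difference operator.
D2 = np.diff(np.eye(n), n=2, axis=0)
lam = 0.5
x_map = np.linalg.solve(K.T @ K + lam * D2.T @ D2, K.T @ y)

print(round(float(np.corrcoef(x_map, true)[0, 1]), 3))
```

    For this benign kernel the naive inverse would still exist, but for realistic kinetics it amplifies measurement noise, which is the ill-posedness the abstract refers to; the full Bayesian treatment additionally yields credible intervals for the recovered rate.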

  19. Bayesian sparse-based reconstruction in bioluminescence tomography improves localization accuracy and reduces computational time.

    Science.gov (United States)

    Feng, Jinchao; Jia, Kebin; Li, Zhe; Pogue, Brian W; Yang, Mingjie; Wang, Yaqi

    2017-11-09

    Bioluminescence tomography (BLT) provides fundamental insight into biological processes in vivo. To fully realize its potential, it is important to develop image reconstruction algorithms that accurately visualize and quantify the bioluminescence signals from limited boundary measurements. In this study, a new 2-step reconstruction method for BLT is developed by taking advantage of the sparse a priori information of the light emission using multispectral measurements. The first step infers a wavelength-dependent prior by using all multi-wavelength measurements. The second step reconstructs the source distribution based on this developed prior. Simulation, phantom, and in vivo studies were performed to assess and compare the accuracy and the computational efficiency of this algorithm with conventional sparsity-promoting BLT reconstruction algorithms, and results indicate that the position errors are reduced from a few millimeters down to submillimeter, and reconstruction time is reduced by 3 orders of magnitude in most cases, to just under a few seconds. The recovery of single objects and multiple (2 and 3) small objects is simulated, and the recovery of images of a mouse phantom and an experimental animal with an existing luminescent source in the abdomen is demonstrated. Matlab code is available at https://github.com/jinchaofeng/code/tree/master. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  20. Strontium isotopes and the reconstruction of the Chaco regional system: evaluating uncertainty with Bayesian mixing models.

    Science.gov (United States)

    Drake, Brandon Lee; Wills, Wirt H; Hamilton, Marian I; Dorshow, Wetherbee

    2014-01-01

    Strontium isotope sourcing has become a common and useful method for assigning sources to archaeological artifacts. In Chaco Canyon, an Ancestral Pueblo regional center in New Mexico, previous studies using these methods have suggested that a significant portion of maize and wood originated in the Chuska Mountains region, 75 km to the West [corrected]. In the present manuscript, these results were tested using both frequentist methods (to determine if geochemical sources can truly be differentiated) and Bayesian methods (to address uncertainty in geochemical source attribution). It was found that Chaco Canyon and the Chuska Mountain region are not easily distinguishable based on radiogenic strontium isotope values. The strontium profiles of many geochemical sources in the region overlap, making it difficult to definitively identify any one particular geochemical source for the canyon's prehistoric maize. Bayesian mixing models support the argument that some spruce and fir wood originated in the San Mateo Mountains, but that this cannot explain all 87Sr/86Sr values in Chaco timber. Overall, radiogenic strontium isotope data do not clearly identify a single major geochemical source for maize, ponderosa, and most spruce/fir timber. As such, the degree to which Chaco Canyon relied upon outside support for both food and construction material is still ambiguous.

  1. Strontium isotopes and the reconstruction of the Chaco regional system: evaluating uncertainty with Bayesian mixing models.

    Directory of Open Access Journals (Sweden)

    Brandon Lee Drake

    Full Text Available Strontium isotope sourcing has become a common and useful method for assigning sources to archaeological artifacts. In Chaco Canyon, an Ancestral Pueblo regional center in New Mexico, previous studies using these methods have suggested that a significant portion of maize and wood originated in the Chuska Mountains region, 75 km to the West [corrected]. In the present manuscript, these results were tested using both frequentist methods (to determine if geochemical sources can truly be differentiated) and Bayesian methods (to address uncertainty in geochemical source attribution). It was found that Chaco Canyon and the Chuska Mountain region are not easily distinguishable based on radiogenic strontium isotope values. The strontium profiles of many geochemical sources in the region overlap, making it difficult to definitively identify any one particular geochemical source for the canyon's prehistoric maize. Bayesian mixing models support the argument that some spruce and fir wood originated in the San Mateo Mountains, but that this cannot explain all 87Sr/86Sr values in Chaco timber. Overall, radiogenic strontium isotope data do not clearly identify a single major geochemical source for maize, ponderosa, and most spruce/fir timber. As such, the degree to which Chaco Canyon relied upon outside support for both food and construction material is still ambiguous.

  2. Hierarchical Bayesian Model for Simultaneous EEG Source and Forward Model Reconstruction (SOFOMORE)

    DEFF Research Database (Denmark)

    Stahlhut, Carsten; Mørup, Morten; Winther, Ole

    2009-01-01

    In this paper we propose an approach to handle forward model uncertainty for EEG source reconstruction. A stochastic forward model is motivated by the many uncertain contributions that form the forward propagation model including the tissue conductivity distribution, the cortical surface, and ele...... and real EEG data demonstrate that invoking a stochastic forward model leads to improved source estimates....

  3. TreeTime: an extensible C++ software package for Bayesian phylogeny reconstruction with time-calibration.

    Science.gov (United States)

    Himmelmann, Lin; Metzler, Dirk

    2009-09-15

    For the estimation of phylogenetic trees from molecular data, it is worthwhile to take prior paleontologic knowledge into account, if available. To calibrate the branch lengths of the tree with times assigned to geo-historical events or fossils, it is necessary to select a relaxed molecular clock model to specify how mutation rates can change along the phylogeny. We present the software TreeTime for Bayesian phylogeny estimation. It can take prior information about the topology of the tree and about branching times into account. Several relaxed molecular clock models are implemented in TreeTime. TreeTime is written in C++ and designed to be efficient and extensible. TreeTime is freely available from http://evol.bio.lmu.de/statgen/software/treetime under the terms of the GNU General Public Licence (GPL, version 3 or later).

  4. A Bayesian Supertree Model for Genome-Wide Species Tree Reconstruction.

    Science.gov (United States)

    De Oliveira Martins, Leonardo; Mallo, Diego; Posada, David

    2016-05-01

    Current phylogenomic data sets highlight the need for species tree methods able to deal with several sources of gene tree/species tree incongruence. At the same time, we need to make the most of all available data. Most species tree methods deal with a single process of phylogenetic discordance, namely gene duplication and loss, incomplete lineage sorting (ILS) or horizontal gene transfer. In this manuscript, we address the problem of species tree inference from multilocus, genome-wide data sets regardless of the presence of gene duplication and loss and ILS, and therefore without the need to identify orthologs or to use a single individual per species. We do this by extending the idea of Maximum Likelihood (ML) supertrees to a hierarchical Bayesian model where several sources of gene tree/species tree disagreement can be accounted for in a modular manner. We implemented this model in a computer program called guenomu, whose inputs are posterior distributions of unrooted gene tree topologies for multiple gene families, and whose output is the posterior distribution of rooted species tree topologies. We conducted extensive simulations to evaluate the performance of our approach in comparison with other species tree approaches able to deal with more than one leaf from the same species. Our method ranked best under simulated data sets, in spite of ignoring branch lengths, and performed well on empirical data, as well as being fast enough to analyze relatively large data sets. Our Bayesian supertree method was also very successful in obtaining better estimates of gene trees, by reducing the uncertainty in their distributions. In addition, our results show that under complex simulation scenarios, gene tree parsimony is also a competitive approach once we consider its speed, in contrast to more sophisticated models. © The Author(s) 2014. Published by Oxford University Press on behalf of the Society of Systematic Biologists.

  5. Mandible reconstruction: History, state of the art and persistent problems.

    Science.gov (United States)

    Ferreira, José J; Zagalo, Carlos M; Oliveira, Marta L; Correia, André M; Reis, Ana R

    2015-06-01

    Mandibular reconstruction has been experiencing an amazing evolution. Several different approaches are used to reconstruct this bone and therefore have a fundamental role in the recovery of oral functions. This review aims to highlight the persistent problems associated with the approaches identified, whether bone grafts or prosthetic devices are used. A brief summary of the historical evolution of the surgical procedures is presented, as well as an insight into possible future pathways. A literature review was conducted from September to December 2012 using the PubMed database. The keyword used was "mandible reconstruction." Articles published in the last three years were included, as well as the relevant references from those articles and key "historical articles." This research resulted in a monograph that this article aims to summarize. Titanium plates, bone grafts, pediculate flaps, free osteomyocutaneous flaps, rapid prototyping, and tissue engineering strategies are some of the identified possibilities. The classical approaches present considerable associated morbidity and donor-site-related problems. Research that results in the development of new prosthetic devices is needed. A new prosthetic approach could minimize the identified problems and offer patients more predictable, affordable, and comfortable solutions. This review, while affirming the evolution and the good results found with the current approaches, emphasizes the negative aspects that still subsist. Thus, it shows that mandible reconstruction is not a closed issue. On the contrary, it remains a research field where new findings could have a direct positive impact on patients' quality of life. The identification of the persistent problems reveals the characteristics to be considered in a new prosthetic device. This could overcome the current difficulties and result in more comfortable solutions.
Medical teams have the responsibility to keep patients informed about the predictable

  6. Mitochondrial phylogeny of the Chrysis ignita (Hymenoptera: Chrysididae) species group based on simultaneous Bayesian alignment and phylogeny reconstruction.

    Science.gov (United States)

    Soon, Villu; Saarma, Urmas

    2011-07-01

    The ignita species group within the genus Chrysis includes over 100 cuckoo wasp species, which all lead a parasitic lifestyle and exhibit very similar morphology. The lack of robust, diagnostic morphological characters has hindered phylogenetic reconstructions and contributed to frequent misidentification and inconsistent interpretations of species in this group. Therefore, molecular phylogenetic analysis is the most suitable approach for resolving the phylogeny and taxonomy of this group. We present a well-resolved phylogeny of the Chrysis ignita species group based on mitochondrial sequence data from 41 ingroup and six outgroup taxa. Although our emphasis was on European taxa, we included samples from most of the distribution range of the C. ignita species group to test for monophyly. We used a continuous mitochondrial DNA sequence consisting of 16S rRNA, tRNA(Val), 12S rRNA and ND4. The location of the ND4 gene at the 3' end of this continuous sequence, following 12S rRNA, represents a novel mitochondrial gene arrangement for insects. Due to difficulties in aligning rRNA genes, two different Bayesian approaches were employed to reconstruct phylogeny: (1) using a reduced data matrix including only those positions that could be aligned with confidence; or (2) using the full sequence dataset while estimating alignment and phylogeny simultaneously. In addition maximum-parsimony and maximum-likelihood analyses were performed to test the robustness of the Bayesian approaches. Although all approaches yielded trees with similar topology, considerably more nodes were resolved with analyses using the full data matrix. Phylogenetic analysis supported the monophyly of the C. ignita species group and divided its species into well-supported clades. The resultant phylogeny was only partly in accordance with published subgroupings based on morphology. Our results suggest that several taxa currently treated as subspecies or names treated as synonyms may in fact constitute

  7. Reconstructing constructivism: Causal models, Bayesian learning mechanisms and the theory theory

    Science.gov (United States)

    Gopnik, Alison; Wellman, Henry M.

    2012-01-01

    We propose a new version of the “theory theory” grounded in the computational framework of probabilistic causal models and Bayesian learning. Probabilistic models allow a constructivist but rigorous and detailed approach to cognitive development. They also explain the learning of both more specific causal hypotheses and more abstract framework theories. We outline the new theoretical ideas, explain the computational framework in an intuitive and non-technical way, and review an extensive but relatively recent body of empirical results that supports these ideas. These include new studies of the mechanisms of learning. Children infer causal structure from statistical information, through their own actions on the world and through observations of the actions of others. Studies demonstrate these learning mechanisms in children from 16 months to 4 years old and include research on causal statistical learning, informal experimentation through play, and imitation and informal pedagogy. They also include studies of the variability and progressive character of intuitive theory change, particularly theory of mind. These studies investigate both the physical and psychological and social domains. We conclude with suggestions for further collaborative projects between developmental and computational cognitive scientists. PMID:22582739

  8. Reconstructing constructivism: causal models, Bayesian learning mechanisms, and the theory theory.

    Science.gov (United States)

    Gopnik, Alison; Wellman, Henry M

    2012-11-01

    We propose a new version of the "theory theory" grounded in the computational framework of probabilistic causal models and Bayesian learning. Probabilistic models allow a constructivist but rigorous and detailed approach to cognitive development. They also explain the learning of both more specific causal hypotheses and more abstract framework theories. We outline the new theoretical ideas, explain the computational framework in an intuitive and nontechnical way, and review an extensive but relatively recent body of empirical results that supports these ideas. These include new studies of the mechanisms of learning. Children infer causal structure from statistical information, through their own actions on the world and through observations of the actions of others. Studies demonstrate these learning mechanisms in children from 16 months to 4 years old and include research on causal statistical learning, informal experimentation through play, and imitation and informal pedagogy. They also include studies of the variability and progressive character of intuitive theory change, particularly theory of mind. These studies investigate both the physical and the psychological and social domains. We conclude with suggestions for further collaborative projects between developmental and computational cognitive scientists.

  9. Bayesian Framework with Non-local and Low-rank Constraint for Image Reconstruction

    Science.gov (United States)

    Tang, Zhonghe; Wang, Shengzhe; Huo, Jianliang; Guo, Hang; Zhao, Haibo; Mei, Yuan

    2017-01-01

    Built upon the methodology of 'grouping and collaborative filtering', the proposed algorithm recovers image patches from an array of similar noisy patches, based on the assumption that their noise-free versions, or close approximations of them, lie in a low-dimensional subspace and form a matrix of low rank. Based on an analysis of the effect of noise and perturbation on the singular values, a weighted nuclear norm is defined to replace the conventional nuclear norm, and the corresponding low-rank decomposition model and singular value shrinkage operator are derived. Taking into account the difference between the distributions of the signal and the noise, the weight depends not only on the standard deviation of the noise, but also on the rank of the noise-free matrix and on the singular value itself. Experimental results on image reconstruction tasks show that, at relatively low computational cost, the performance of the proposed method is very close to that of the state-of-the-art reconstruction methods BM3D and LSSC, and even outperforms them in restoring and preserving structure
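The weighted singular-value shrinkage operator this record describes can be sketched as follows. This is a minimal illustration, not the authors' exact operator: the inverse-singular-value weighting rule, matrix sizes, and noise level are all assumptions.

```python
import numpy as np

def weighted_svt(Y, weights):
    """Weighted singular-value thresholding: shrink each singular value of
    the noisy patch matrix Y by its own weight, then reconstruct."""
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    s_shrunk = np.maximum(s - weights, 0.0)     # per-value soft threshold
    return (U * s_shrunk) @ Vt                  # scale columns of U, recompose

rng = np.random.default_rng(0)
X = np.outer(rng.normal(size=8), rng.normal(size=8))   # rank-1 "clean" patch matrix
Y = X + 0.1 * rng.normal(size=(8, 8))                  # noisy observation
s = np.linalg.svd(Y, compute_uv=False)
w = 0.3 / (s + 1e-8)        # heavier shrinkage on smaller, noise-dominated values
X_hat = weighted_svt(Y, w)
```

With this weighting, large (signal-dominated) singular values are barely touched while small (noise-dominated) ones are driven to zero, so the denoised estimate is closer to the clean matrix than the raw observation.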

  10. Diet reconstruction and resource partitioning of a Caribbean marine mesopredator using stable isotope Bayesian modelling.

    Directory of Open Access Journals (Sweden)

    Alexander Tilley

    Full Text Available The trophic ecology of epibenthic mesopredators is not well understood in terms of prey partitioning with sympatric elasmobranchs or their effects on prey communities, yet the importance of omnivores in community trophic dynamics is being increasingly realised. This study used stable isotope analysis of δ15N and δ13C to model diet composition of wild southern stingrays Dasyatis americana and compare trophic niche space to nurse sharks Ginglymostoma cirratum and Caribbean reef sharks Carcharhinus perezi on Glovers Reef Atoll, Belize. Bayesian stable isotope mixing models were used to investigate prey choice as well as viable Diet-Tissue Discrimination Factors for use with stingrays. Stingray δ15N values showed the greatest variation and a positive relationship with size, with an isotopic niche width approximately twice that of sympatric species. Shark species exhibited comparatively restricted δ15N values and greater δ13C variation, with very little overlap of stingray niche space. Mixing models suggest bivalves and annelids are proportionally more important prey in the stingray diet than crustaceans and teleosts at Glovers Reef, in contrast to all but one published diet study using stomach contents from other locations. Incorporating gut contents information from the literature, we suggest diet-tissue discrimination factor values of Δ15N ≈ 2.7‰ and Δ13C ≈ 0.9‰ for stingrays in the absence of validation experiments. The wide trophic niche and lower trophic level exhibited by stingrays compared to sympatric sharks supports their putative role as important base stabilisers in benthic systems, with the potential to absorb trophic perturbations through numerous opportunistic prey interactions.
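A toy stand-in for the kind of Bayesian mixing model used in this record can be sketched as follows. The source signatures and the consumer value below are hypothetical; only the discrimination factors (Δ15N ≈ 2.7‰, Δ13C ≈ 0.9‰) come from the abstract, and a simple grid-plus-Gaussian-likelihood scheme replaces the full Bayesian machinery.

```python
import numpy as np

# Toy two-isotope, three-source mixing model: grid the diet-proportion
# simplex and score each candidate mixture against the consumer signature
# with a Gaussian log-likelihood under a flat prior.
sources = np.array([[8.0, -18.0],    # hypothetical (d15N, d13C) per source,
                    [12.0, -13.0],   # e.g. bivalves, teleosts, annelids
                    [5.0, -23.0]])
tdf = np.array([2.7, 0.9])           # diet-tissue discrimination factors (abstract)
consumer = np.array([10.5, -17.5])   # hypothetical measured tissue values
sigma = 0.5                          # assumed s.d. per isotope

grid = np.linspace(0.0, 1.0, 101)
candidates = []
for p1 in grid:
    for p2 in grid[grid <= 1.0 - p1 + 1e-9]:
        p = np.array([p1, p2, 1.0 - p1 - p2])
        mix = p @ sources + tdf                           # predicted consumer value
        lp = -0.5 * np.sum(((consumer - mix) / sigma) ** 2)
        candidates.append((lp, tuple(p)))
p_map = np.array(max(candidates)[1])   # MAP diet proportions
```

A real analysis would sample the full posterior (and propagate source variability) rather than keep only the best grid point, but the likelihood structure is the same.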

  11. Does Smoking History Confer a Higher Risk for Reconstructive Complications in Nipple-Sparing Mastectomy?

    Science.gov (United States)

    Frey, Jordan D; Alperovich, Michael; Levine, Jamie P; Choi, Mihye; Karp, Nolan S

    2017-07-01

    A history of smoking has been implicated as a risk factor for reconstructive complications in nipple-sparing mastectomy (NSM); however, there have been no direct analyses of outcomes in smokers and nonsmokers. All patients undergoing NSM at New York University Langone Medical Center from 2006 to 2014 were identified. Outcomes were compared for those with and without a smoking history and stratified by pack-year smoking history and years-to-quitting (YTQ). A total of 543 nipple-sparing mastectomies were performed from 2006 to 2014, 49 of them in patients with a history of smoking. Reconstructive outcomes in NSM between those with and without a smoking history were equivalent. Those with a smoking history were not significantly more likely to have mastectomy flap necrosis (p = 0.6251) or partial (p = 0.8564) or complete (p = 0.3365) nipple-areola complex (NAC) necrosis. Likewise, active smokers alone did not have a higher risk of complications compared to nonsmokers or those with a smoking history. Comparing nonsmokers with those with less than or greater than a 10 pack-year smoking history, those with a >10 pack-year history had significantly more complete NAC necrosis (p = 0.0114), whereas those with a lesser smoking history or >5 YTQ prior to NSM were equivalent to those without a smoking history. We demonstrate that NSM may be safely offered to those with a smoking history, although a >10 pack-year smoking history or <5 YTQ prior to NSM may impart a higher risk of reconstructive complications, including complete NAC necrosis. © 2017 Wiley Periodicals, Inc.

  12. Reconstructing a School's Past Using Oral Histories and GIS Mapping.

    Science.gov (United States)

    Alibrandi, Marsha; Beal, Candy; Thompson, Ann; Wilson, Anna

    2000-01-01

    Describes an interdisciplinary project that incorporated language arts, social studies, instructional technology, and science where middle school students were involved in oral history, Geographic Information System (GIS) mapping, architectural research, the science of dendrochronology, and the creation of an archival school Web site. (CMK)

  13. Bayesian Reconstruction of Disease Outbreaks by Combining Epidemiologic and Genomic Data

    Science.gov (United States)

    Jombart, Thibaut; Cori, Anne; Didelot, Xavier; Cauchemez, Simon; Fraser, Christophe; Ferguson, Neil

    2014-01-01

    Recent years have seen progress in the development of statistically rigorous frameworks to infer outbreak transmission trees (“who infected whom”) from epidemiological and genetic data. Making use of pathogen genome sequences in such analyses remains a challenge, however, with a variety of heuristic approaches having been explored to date. We introduce a statistical method exploiting both pathogen sequences and collection dates to unravel the dynamics of densely sampled outbreaks. Our approach identifies likely transmission events and infers dates of infections, unobserved cases and separate introductions of the disease. It also proves useful for inferring numbers of secondary infections and identifying heterogeneous infectivity and super-spreaders. After testing our approach using simulations, we illustrate the method with the analysis of the beginning of the 2003 Singaporean outbreak of Severe Acute Respiratory Syndrome (SARS), providing new insights into the early stage of this epidemic. Our approach is the first tool for disease outbreak reconstruction from genetic data widely available as free software, the R package outbreaker. It is applicable to various densely sampled epidemics, and improves previous approaches by detecting unobserved and imported cases, as well as allowing multiple introductions of the pathogen. Because of its generality, we believe this method will become a tool of choice for the analysis of densely sampled disease outbreaks, and will form a rigorous framework for subsequent methodological developments. PMID:24465202

  14. Reconstructing the transport history of pebbles on Mars

    Science.gov (United States)

    Szabó, Tímea; Domokos, Gábor; Grotzinger, John P.; Jerolmack, Douglas J.

    2015-01-01

    The discovery of remarkably rounded pebbles by the rover Curiosity, within an exhumed alluvial fan complex in Gale Crater, presents some of the most compelling evidence yet for sustained fluvial activity on Mars. While rounding is known to result from abrasion by inter-particle collisions, geologic interpretations of sediment shape have been qualitative. Here we show how quantitative information on the transport distance of river pebbles can be extracted from their shape alone, using a combination of theory, laboratory experiments and terrestrial field data. We determine that the Martian basalt pebbles have been carried tens of kilometres from their source, by bed-load transport on an alluvial fan. In contrast, angular clasts strewn about the surface of the Curiosity traverse are indicative of later emplacement by rock fragmentation processes. The proposed method for decoding transport history from particle shape provides a new tool for terrestrial and planetary sedimentology. PMID:26460507

  15. An experimental evaluation of fire history reconstruction using dendrochronology in white oak (Quercus alba)

    Science.gov (United States)

    Ryan W. McEwan; Todd F. Hutchinson; Robert D. Ford; Brian C. McCarthy

    2007-01-01

    Dendrochronological analysis of fire scars on tree cross sections has been critically important for understanding historical fire regimes and has influenced forest management practices. Despite its value as a tool for understanding historical ecosystems, tree-ring-based fire history reconstruction has rarely been experimentally evaluated. To examine the efficacy of...

  16. Reconstructing the trophic history of the Black Sea shelf

    Science.gov (United States)

    Yunev, Oleg; Velikova, Violeta; Carstensen, Jacob

    2017-11-01

    In the last 50 years the Black Sea has undergone large changes driven by increasing anthropogenic pressures. We estimated the integrated annual primary production (APP) for different shelf regions during the early eutrophication phase (1963-1976), using chlorophyll a and winter nitrate concentrations as proxy observations of primary production to describe its seasonal variation. For comparison, APP was estimated during the period when eutrophication peaked (1985-1992). In the early eutrophication period APP was estimated at 64-89 g C m-2 yr-1 for most of the shelf, except for the part influenced by the Danube River (the shallow waters off the Romanian and Bulgarian coasts), where APP was ∼126 g C m-2 yr-1. In these two shelf regions, APP increased to 138-190 and 266-318 g C m-2 yr-1, respectively, during the peak eutrophication period. These spatial differences are attributed to the large nutrient inputs from the Danube River. The APP estimates provide new insight into the eutrophication history of the Black Sea shelf, documenting stronger signs of eutrophication than observed in other enclosed seas such as the Baltic Sea. Since the peak eutrophication period, APP is estimated to have decreased by approximately 15-20%.

  17. Improved MRI Reconstruction From Reduced Scans K-Space by Integrating Neural Priors in the Bayesian Restoration

    National Research Council Canada - National Science Library

    Reczko, M

    2001-01-01

    ...) from reduced scans in k-space. The proposed approach considers the combined use of Neural Network models and Bayesian restoration, in the problem of MRI image extraction from sparsely sampled k-space, following several different...

  18. Multichannel Signals Reconstruction Based on Tunable Q-Factor Wavelet Transform-Morphological Component Analysis and Sparse Bayesian Iteration for Rotating Machines

    Directory of Open Access Journals (Sweden)

    Qing Li

    2018-04-01

    Full Text Available High-speed remote transmission and large-capacity data storage are difficult issues in signal acquisition for rotating machine condition monitoring. To address these concerns, a novel multichannel signal reconstruction approach based on tunable Q-factor wavelet transform-morphological component analysis (TQWT-MCA) and a sparse Bayesian iteration algorithm combined with a step-impulse dictionary is proposed under the frame of compressed sensing (CS). To begin with, to prevent loss of the periodical impulses and effectively separate them from the external noise and additive interference components, the TQWT-MCA method is introduced to divide the raw vibration signal into a low-resonance component (LRC, i.e., the periodical impulses) and a high-resonance component (HRC); thus, the periodical impulses are preserved effectively. Then, according to the amplitude range of the generated LRC, the step-impulse dictionary atom is designed to match the physical structure of the periodical impulses. Furthermore, the periodical impulses and the HRC are each reconstructed by sparse Bayesian iteration combined with the step-impulse dictionary; finally, the reconstructed raw signals are obtained by adding the LRC and HRC, and their fidelity is tested by envelope spectrum and error analysis, respectively. In this work, the proposed algorithm is applied to a simulated signal and to engineering multichannel signals of a gearbox with multiple faults. Experimental results demonstrate that the proposed approach significantly improves the reconstruction accuracy compared with state-of-the-art methods such as non-convex Lq (q = 0.5) regularization, spatiotemporal sparse Bayesian learning (SSBL), and the L1-norm. Additionally, the speed of storage and transmission increases dramatically; more importantly, the fault characteristics of the gearbox with multiple faults are detected and saved, i.e., the
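The record's sparse Bayesian iteration and step-impulse dictionary are not reproduced here, but the generic compressed-sensing recovery step they build on can be illustrated with a minimal iterative soft-thresholding (ISTA) solver. The dimensions, sparsity level, and Gaussian sensing matrix below are assumptions for the sake of a runnable sketch.

```python
import numpy as np

def ista(A, y, lam, iters):
    """Minimal iterative soft-thresholding (ISTA) for the LASSO problem
    min_x 0.5*||A x - y||^2 + lam*||x||_1 -- a stand-in for the sparse
    Bayesian iteration used in the record."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2            # 1/L, L = ||A||_2^2
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        x = x - step * (A.T @ (A @ x - y))            # gradient step on data term
        x = np.sign(x) * np.maximum(np.abs(x) - step * lam, 0.0)  # soft threshold
    return x

rng = np.random.default_rng(1)
n, m, k = 64, 256, 5                                  # measurements, length, sparsity
A = rng.normal(size=(n, m)) / np.sqrt(n)              # Gaussian sensing matrix
x_true = np.zeros(m)
x_true[rng.choice(m, size=k, replace=False)] = 2.0 + rng.normal(size=k)
y = A @ x_true                                        # noiseless compressed measurements
x_hat = ista(A, y, lam=0.02, iters=3000)
```

Here a 256-sample sparse signal is recovered from 64 random projections, which is the storage/transmission saving that motivates the CS framing in the abstract.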

  19. Bayesian phylogeography of influenza A/H3N2 for the 2014-15 season in the United States using three frameworks of ancestral state reconstruction.

    Science.gov (United States)

    Magee, Daniel; Suchard, Marc A; Scotch, Matthew

    2017-02-01

    Ancestral state reconstructions in Bayesian phylogeography of virus pandemics have been improved by utilizing a Bayesian stochastic search variable selection (BSSVS) framework. Recently, this framework has been extended to model the transition rate matrix between discrete states as a generalized linear model (GLM) of genetic, geographic, demographic, and environmental predictors of interest to the virus and incorporating BSSVS to estimate the posterior inclusion probabilities of each predictor. Although the latter appears to enhance the biological validity of ancestral state reconstruction, there has yet to be a comparison of phylogenies created by the two methods. In this paper, we compare these two methods, while also using a primitive method without BSSVS, and highlight the differences in phylogenies created by each. We test six coalescent priors and six random sequence samples of H3N2 influenza during the 2014-15 flu season in the U.S. We show that the GLMs yield significantly greater root state posterior probabilities than the two alternative methods under five of the six priors, and significantly greater Kullback-Leibler divergence values than the two alternative methods under all priors. Furthermore, the GLMs strongly implicate temperature and precipitation as driving forces of this flu season and nearly unanimously identified a single root state, which exhibits the most tropical climate during a typical flu season in the U.S. The GLM, however, appears to be highly susceptible to sampling bias compared with the other methods, which casts doubt on whether its reconstructions should be favored over those created by alternate methods. We report that a BSSVS approach with a Poisson prior demonstrates less bias toward sample size under certain conditions than the GLMs or primitive models, and believe that the connection between reconstruction method and sampling bias warrants further investigation.

  20. Bayesian phylogeography of influenza A/H3N2 for the 2014-15 season in the United States using three frameworks of ancestral state reconstruction.

    Directory of Open Access Journals (Sweden)

    Daniel Magee

    2017-02-01

    Full Text Available Ancestral state reconstructions in Bayesian phylogeography of virus pandemics have been improved by utilizing a Bayesian stochastic search variable selection (BSSVS framework. Recently, this framework has been extended to model the transition rate matrix between discrete states as a generalized linear model (GLM of genetic, geographic, demographic, and environmental predictors of interest to the virus and incorporating BSSVS to estimate the posterior inclusion probabilities of each predictor. Although the latter appears to enhance the biological validity of ancestral state reconstruction, there has yet to be a comparison of phylogenies created by the two methods. In this paper, we compare these two methods, while also using a primitive method without BSSVS, and highlight the differences in phylogenies created by each. We test six coalescent priors and six random sequence samples of H3N2 influenza during the 2014-15 flu season in the U.S. We show that the GLMs yield significantly greater root state posterior probabilities than the two alternative methods under five of the six priors, and significantly greater Kullback-Leibler divergence values than the two alternative methods under all priors. Furthermore, the GLMs strongly implicate temperature and precipitation as driving forces of this flu season and nearly unanimously identified a single root state, which exhibits the most tropical climate during a typical flu season in the U.S. The GLM, however, appears to be highly susceptible to sampling bias compared with the other methods, which casts doubt on whether its reconstructions should be favored over those created by alternate methods. We report that a BSSVS approach with a Poisson prior demonstrates less bias toward sample size under certain conditions than the GLMs or primitive models, and believe that the connection between reconstruction method and sampling bias warrants further investigation.

  1. Memory, History and Narrative: Shifts of Meaning when (Re)constructing the Past

    Directory of Open Access Journals (Sweden)

    Ignacio Brescó de Luna

    2012-05-01

    Full Text Available This paper is devoted to the examination of some socio-cultural dimensions of memory, focusing on narratives as a mediational tool (Vygotsky, 1978) for the construction of past events and attribution of meaning. The five elements of Kenneth Burke's Grammar of Motives (1969) are taken as a framework for the examination of reconstructions of the past, and particularly of histories, namely: (1) the interpretative and reconstructive action of (2) a positioned agent operating (3) through narrative means (4) addressed to particular purposes (5) within a concrete social and temporal scenery. The reflexive character of such an approach opens the ground for considering remembering as a kind of act performed within the context of a set of on-going actions, so that remembrances play a directive role for action and thus have an unavoidable moral dimension. This is particularly relevant for some kinds of social memory, such as history teaching, and their effects upon identity.

  2. Equivalent thermal history reconstruction from a partially crystallized glass-ceramic sensor array

    Science.gov (United States)

    Heeg, Bauke

    2015-11-01

    The basic concept of a thermal history sensor is that it records the accumulated exposure to some unknown, typically varying temperature profile for a certain amount of time. Such a sensor is considered to be capable of measuring the duration of several (N) temperature intervals. For this purpose, the sensor deploys multiple (M) sensing elements, each with different temperature sensitivity. At the end of some thermal exposure for a known period of time, the sensor array is read-out and an estimate is made of the set of N durations of the different temperature ranges. A potential implementation of such a sensor was pioneered by Fair et al. [Sens. Actuators, A 141, 245 (2008)], based on glass-ceramic materials with different temperature-dependent crystallization dynamics. In their work, it was demonstrated that an array of sensor elements can be made sensitive to slight differences in temperature history. Further, a forward crystallization model was used to simulate the variations in sensor array response to differences in the temperature history. The current paper focusses on the inverse aspect of temperature history reconstruction from a hypothetical sensor array output. The goal of such a reconstruction is to find an equivalent thermal history that is the closest representation of the true thermal history, i.e., the durations of a set of temperature intervals that result in a set of fractional crystallization values which is closest to the one resulting from the true thermal history. One particular useful simplification in both the sensor model as well as in its practical implementation is the omission of nucleation effects. In that case, least squares models can be used to approximate the sensor response and make reconstruction estimates. Even with this simplification, sensor noise can have a destabilizing effect on possible reconstruction solutions, which is evaluated using simulations. Both regularization and non-negativity constrained least squares
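The least squares reconstruction with a non-negativity constraint mentioned in this record can be sketched with a hypothetical linearized (nucleation-free) sensor model. The response matrix, true durations, and noise level below are all made up for illustration.

```python
import numpy as np
from scipy.optimize import nnls

# Hypothetical linearized sensor array: element i crystallizes at a constant
# rate K[i, j] (fraction per hour) while held in temperature bin j, so the
# recorded fractional crystallization is c = K @ t for bin durations t >= 0
# (nucleation effects omitted, as in the simplified model).
rng = np.random.default_rng(2)
M, N = 8, 4                                   # sensing elements, temperature bins
K = rng.uniform(0.005, 0.05, size=(M, N))     # made-up crystallization rates
t_true = np.array([5.0, 0.0, 2.0, 8.0])       # true hours spent in each bin
c = K @ t_true + 1e-4 * rng.normal(size=M)    # noisy sensor read-out

# Non-negative least squares keeps the reconstructed durations physical.
t_hat, residual = nnls(K, c)
```

With M > N the system is overdetermined, and the non-negativity constraint acts as the regularization the abstract alludes to: unconstrained least squares can return small negative durations when the read-out is noisy, whereas `nnls` clamps them to zero.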

  3. Reconstruction of the early invasion history of the quagga mussel (Dreissena rostriformis bugensis) in Western Europe

    OpenAIRE

    Heiler, Katharina; Vaate, Abraham bij de; Ekschmitt, Klemens; Oheimb, Parm von; Albrecht, Christian; Wilke, Thomas

    2013-01-01

    The recent introduction of the quagga mussel into Western European freshwaters marked the beginning of one of the most successful biological invasions during the past years in this region. However, the spatial and temporal origin of the first invasive population(s) in Western Europe as well as subsequent spreading routes still remain under discussion. In this study, we therefore aim at reconstructing the early invasion history of the quagga mussel in Western Europe based on an age-corrected t...

  4. Comparison of Bayesian penalized likelihood reconstruction versus OS-EM for characterization of small pulmonary nodules in oncologic PET/CT.

    Science.gov (United States)

    Howard, Brandon A; Morgan, Rustain; Thorpe, Matthew P; Turkington, Timothy G; Oldan, Jorge; James, Olga G; Borges-Neto, Salvador

    2017-10-01

    To determine whether the recently introduced Bayesian penalized likelihood PET reconstruction (Q.Clear) increases the visual conspicuity and SUV max of small pulmonary nodules near the PET resolution limit, relative to ordered subset expectation maximization (OS-EM). In this institutional review board-approved and HIPAA-compliant study, 29 FDG PET/CT scans performed on a five-ring GE Discovery IQ were retrospectively selected for pulmonary nodules described in the radiologist's report as "too small to characterize", or small lung nodules in patients at high risk for lung cancer. Thirty-two pulmonary nodules were assessed, with a mean CT diameter of 8 mm (range 2-18). PET images were reconstructed with OS-EM and with Q.Clear at noise penalty strength β values of 150, 250, and 350. Lesion visual conspicuity was scored by three readers on a 3-point scale, and lesion SUV max and background liver and blood pool SUV mean and SUV stdev were recorded. Comparison was made by linear mixed model with modified Bonferroni post hoc testing; the significance cutoff was p < 0.05. Lesion visual conspicuity was significantly greater with Q.Clear than with OS-EM at β = 150 (p < 0.05), and SUV max was significantly greater than with OS-EM at β = 150 and 250 (p < 0.05). Q.Clear thus increased the conspicuity and SUV max of small pulmonary nodules relative to OS-EM reconstruction, but only with low noise penalization. Q.Clear with β = 150 may be advantageous when evaluation of small pulmonary nodules is of primary concern.

  5. Bayesian prediction and adaptive sampling algorithms for mobile sensor networks: online environmental field reconstruction in space and time

    CERN Document Server

    Xu, Yunfei; Dass, Sarat; Maiti, Tapabrata

    2016-01-01

    This brief introduces a class of problems and models for the prediction of the scalar field of interest from noisy observations collected by mobile sensor networks. It also introduces the problem of optimal coordination of robotic sensors to maximize the prediction quality subject to communication and mobility constraints either in a centralized or distributed manner. To solve such problems, fully Bayesian approaches are adopted, allowing various sources of uncertainties to be integrated into an inferential framework effectively capturing all aspects of variability involved. The fully Bayesian approach also allows the most appropriate values for additional model parameters to be selected automatically by data, and the optimal inference and prediction for the underlying scalar field to be achieved. In particular, spatio-temporal Gaussian process regression is formulated for robotic sensors to fuse multifactorial effects of observations, measurement noise, and prior distributions for obtaining the predictive di...

  6. Isotopic reconstruction of the weaning process in the archaeological population of Canímar Abajo, Cuba: A Bayesian probability mixing model approach.

    Directory of Open Access Journals (Sweden)

    Yadira Chinique de Armas

    Full Text Available The general lack of well-preserved juvenile skeletal remains from Caribbean archaeological sites has, in the past, prevented evaluations of juvenile dietary changes. Canímar Abajo (Cuba), with a large number of well-preserved juvenile and adult skeletal remains, provided a unique opportunity to fully assess juvenile paleodiets from an ancient Caribbean population. Ages for the start and the end of weaning and possible food sources used for weaning were inferred by combining the results of two Bayesian probability models that help to reduce some of the uncertainties inherent to bone collagen isotope based paleodiet reconstructions. Bone collagen (31 juveniles, 18 adult females) was used for carbon and nitrogen isotope analyses. The isotope results were assessed using two Bayesian probability models: Weaning Ages Reconstruction with Nitrogen isotopes and Stable Isotope Analysis in R. Breast milk seems to have been the most important protein source until two years of age, with some supplementary foods such as tropical fruits and root cultigens likely introduced earlier. After age two, juvenile diets were likely continuously supplemented by starch-rich foods such as root cultigens and legumes. By the age of three, the model results suggest that the weaning process was completed. Additional indications suggest that animal marine/riverine protein and maize, while part of the Canímar Abajo female diets, were likely not used to supplement juvenile diets. The combined use of both models here provided a more complete assessment of the weaning process for an ancient Caribbean population, indicating not only the start and end ages of weaning but also the relative importance of different food sources for different age juveniles.
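The weaning-age estimation idea can be sketched as a piecewise-linear δ15N decline fitted by grid search. This is a toy stand-in, not the actual Bayesian model: all ages, isotope values, the trophic enrichment, and the noise level below are hypothetical.

```python
import numpy as np

# Toy weaning-age model: juvenile collagen d15N sits one trophic step above
# the adult female mean while fully nursing, then declines linearly to the
# adult value between the start (t0) and end (t1) of weaning.
def model_d15n(age, t0, t1, adult=9.0, enrich=2.7):
    nursing_frac = np.clip((t1 - age) / (t1 - t0), 0.0, 1.0)
    return adult + enrich * nursing_frac

ages = np.array([0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 4.0, 5.0])   # ages at death, years
rng = np.random.default_rng(3)
obs = model_d15n(ages, 2.0, 3.0) + 0.1 * rng.normal(size=ages.size)

# Grid search over (t0, t1): the smallest sum of squared residuals wins.
best = min(((np.sum((obs - model_d15n(ages, t0, t1)) ** 2), t0, t1)
            for t0 in np.arange(0.5, 3.01, 0.25)
            for t1 in np.arange(t0 + 0.5, 4.51, 0.25)))
sse, t0_hat, t1_hat = best
```

The real models place posterior distributions over (t0, t1) rather than point estimates, but the underlying signal, elevated δ15N in nursing juveniles decaying to the adult baseline, is the same.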

  7. Reconstructing Atmospheric Histories of Halogenated Compounds to Preindustrial Times Using Antarctic Firn Air

    Science.gov (United States)

    Shields, J. E.; Mühle, J.; Severinghaus, J. P.; Weiss, R. F.

    2007-12-01

    Atmospheric histories of many halogenated trace gases remain poorly known, hampering understanding of lifetimes and anthropogenic impacts. A profile of air samples dating back to the late 19th century was collected from the firn at the Megadunes site in central Antarctica (80.78° S, 124.5° E) in January 2004. A number of anthropogenic halogenated compounds were measured in these samples using the AGAGE Medusa gas chromatograph-mass spectrometer instrumentation (B. R. Miller et al., in preparation). A firn gas-diffusion forward model based on the work of Schwander et al. (1993) was tuned to CO2 and 15N observations from the same Megadunes site. The age distribution of CO2 in diffusively mixed air samples collected at each depth was approximated by running short pulses through the forward model. The atmospheric histories of a number of halogenated compounds were then reconstructed using the iterative dating technique developed by Trudinger et al. (2002). The modeled age spread at this site is relatively broad, but interstitial air at the close-off zone is comparatively old with a mean age of about 100 years. Reconstructed histories show good agreement with direct measurements, although rapid changes are not well resolved. The mixing ratios of the deepest layer are within the range of preindustrial estimates, most notably for tetrafluoromethane. Schwander, J., J. M. Barnola, C. Andrie, M. Leuenberger, A. Ludin, D. Raynaud, B. Stauffer (1993). The Age of the Air in the Firn and the Ice at Summit, Greenland. J. Geophys. Res. 98(D2): 2831-2838. Trudinger, C. M., D. M. Etheridge, G. A. Sturrock, P. J. Fraser, P. B. Krummel, and A. McCulloch (2004). Atmospheric histories of halocarbons from analysis of Antarctic firn air: Methyl bromide, methyl chloride, chloroform, and dichloromethane. J. Geophys. Res. 109(D22310): doi:10.1029/2004JD004932.

  8. Inferring Population Size History from Large Samples of Genome-Wide Molecular Data - An Approximate Bayesian Computation Approach.

    Directory of Open Access Journals (Sweden)

    Simon Boitard

    2016-03-01

    Full Text Available Inferring the ancestral dynamics of effective population size is a long-standing question in population genetics, which can now be tackled much more accurately thanks to the massive genomic data available in many species. Several promising methods that take advantage of whole-genome sequences have been recently developed in this context. However, they can only be applied to rather small samples, which limits their ability to estimate recent population size history. In addition, they can be very sensitive to sequencing or phasing errors. Here we introduce a new approximate Bayesian computation approach named PopSizeABC that allows estimating the evolution of the effective population size through time, using a large sample of complete genomes. This sample is summarized using the folded allele frequency spectrum and the average zygotic linkage disequilibrium at different bins of physical distance, two classes of statistics that are widely used in population genetics and can be easily computed from unphased and unpolarized SNP data. Our approach provides accurate estimations of past population sizes, from the very first generations before present back to the expected time to the most recent common ancestor of the sample, as shown by simulations under a wide range of demographic scenarios. When applied to samples of 15 or 25 complete genomes in four cattle breeds (Angus, Fleckvieh, Holstein and Jersey), PopSizeABC revealed a series of population declines, related to historical events such as domestication or modern breed creation. We further highlight that our approach is robust to sequencing errors, provided summary statistics are computed from SNPs with common alleles.
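    The record above describes PopSizeABC, an approximate Bayesian computation method. The core rejection-ABC idea it rests on (simulate under parameter values drawn from the prior and keep the draws whose summary statistics land close to the observed ones) can be sketched with a toy example. The heterozygosity model and all names below are invented for illustration and are far simpler than the coalescent simulations and summary statistics PopSizeABC actually uses:

    ```python
    import random

    def simulate_heterozygosity(pop_size, n_loci=500, mu=1e-4, rng=random):
        # Toy stand-in for a real genomic simulator: expected heterozygosity
        # under mutation-drift balance, H = theta / (1 + theta) with
        # theta = 4*N*mu, observed with binomial noise across n_loci loci.
        theta = 4 * pop_size * mu
        h = theta / (1 + theta)
        hits = sum(1 for _ in range(n_loci) if rng.random() < h)
        return hits / n_loci

    def abc_rejection(observed, prior_draw, simulate, n_draws=20000, tol=0.01):
        # Keep parameter draws whose simulated summary statistic falls
        # within tol of the observed summary.
        accepted = []
        for _ in range(n_draws):
            candidate = prior_draw()
            if abs(simulate(candidate) - observed) <= tol:
                accepted.append(candidate)
        return accepted

    rng = random.Random(42)
    true_n = 2000
    observed = simulate_heterozygosity(true_n, rng=rng)
    posterior = abc_rejection(
        observed,
        prior_draw=lambda: rng.uniform(100, 10000),
        simulate=lambda n: simulate_heterozygosity(n, rng=rng),
    )
    estimate = sum(posterior) / len(posterior)  # posterior mean of N
    ```

    With a real simulator in place of the toy model, the same loop yields a posterior sample for the population size; PopSizeABC additionally conditions on allele-frequency-spectrum and linkage-disequilibrium summaries and uses a more efficient acceptance scheme than plain rejection.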

  9. A general reconstruction of the recent expansion history of the universe

    Science.gov (United States)

    Vitenti, S. D. P.; Penna-Lima, M.

    2015-09-01

    Distance measurements are currently the most powerful tool to study the expansion history of the universe without specifying its matter content or any theory of gravitation. Assuming only an isotropic, homogeneous and flat universe, in this work we introduce a model-independent method to reconstruct directly the deceleration function via a piecewise function. Including a penalty factor, we are able to vary continuously the complexity of the deceleration function from a linear case to an arbitrary (n+1)-knots spline interpolation. We carry out a Monte Carlo (MC) analysis to determine the best penalty factor, evaluating the bias-variance trade-off, given the uncertainties of the SDSS-II and SNLS supernova combined sample (JLA), compilations of baryon acoustic oscillation (BAO) and H(z) data. The bias-variance analysis is done for three fiducial models with different features in the deceleration curve. We perform the MC analysis generating mock catalogs and computing their best-fit. For each fiducial model, we test different reconstructions using, in each case, more than 10⁴ catalogs, about 5 × 10⁵ in total. This investigation proved to be essential in determining the best reconstruction to study these data. We show that, evaluating a single fiducial model, the conclusions about the bias-variance ratio are misleading. We determine the reconstruction method in which the bias represents at most 10% of the total uncertainty. In all statistical analyses, we fit the coefficients of the deceleration function along with four nuisance parameters of the supernova astrophysical model. For the full sample, we also fit H0 and the sound horizon rs(zd) at the drag redshift. The bias-variance trade-off analysis shows that, apart from the deceleration function, all other estimators are unbiased. Finally, we apply the Ensemble Sampler Markov Chain Monte Carlo (ESMCMC) method to explore the posterior of the deceleration function up to redshift 1.3 (using only JLA) and 2.3 (JLA

  10. Parametric 3D Atmospheric Reconstruction in Highly Variable Terrain with Recycled Monte Carlo Paths and an Adapted Bayesian Inference Engine

    Science.gov (United States)

    Langmore, Ian; Davis, Anthony B.; Bal, Guillaume; Marzouk, Youssef M.

    2012-01-01

    We describe a method for accelerating a 3D Monte Carlo forward radiative transfer model to the point where it can be used in a new kind of Bayesian retrieval framework. The remote sensing challenge is to detect and quantify a chemical effluent of a known absorbing gas produced by an industrial facility in a deep valley. The available data is a single low-resolution, noisy image of the scene in the near IR at an absorbing wavelength for the gas of interest. The detected sunlight has been multiply reflected by the variable terrain and/or scattered by an aerosol that is assumed partially known and partially unknown. We thus introduce a new class of remote sensing algorithms best described as "multi-pixel" techniques that necessarily call for a 3D radiative transfer model (but demonstrated here in 2D); they can be added to conventional ones that exploit typically multi- or hyper-spectral data, sometimes with multi-angle capability, with or without information about polarization. The novel Bayesian inference methodology uses adaptively, with efficiency in mind, the fact that a Monte Carlo forward model has a known and controllable uncertainty depending on the number of sun-to-detector paths used.

  11. Revised reconstructions of the Late Cretaceous to recent history of the Pacific basin

    Science.gov (United States)

    Wright, N.; Seton, M.; Williams, S.; Müller, D.

    2013-12-01

    The tectonic evolution of the Pacific basin since the Late Cretaceous involves a number of major plate reorganisations, including multiple fragmentations of the Farallon plate and the formation of the Vancouver, Nazca and Cocos plates, and the incorporation of the Bellingshausen plate into the Antarctic plate. However, many regional plate kinematic models of the northeast and southeast Pacific are based on data derived before the availability of high-resolution satellite altimetry and accurate global positioning system (GPS) navigation, causing inconsistencies in these models. Such inconsistencies arise from differences in previous magnetic anomaly interpretations and fracture zone identifications from satellite-derived gravity anomalies. Discrepancies in published models of the Nazca-Pacific, Vancouver-Pacific and Farallon-Pacific spreading histories have implications for modelling the convergence history along the South American and North American margins within global plate models, influencing the variations in age, location and geometry of plates at subduction zones. We refine reconstructions of the seafloor spreading history of the south (Pacific, Antarctic, and Bellingshausen plates), southeast (Nazca, Farallon and Pacific plates) and northeast (Vancouver, Farallon and Pacific plates) Pacific basin from the Late Cretaceous (83.5 Ma) to present-day, based on a synthesis of marine magnetic anomaly picks and fracture zone identifications from satellite-derived gravity anomalies. We calculate rotations and uncertainties for plate pairs based on Hellinger's (1981) best-fitting criteria. We divide the Farallon plate into two segments: the northern segment (Farallon plate) and southern segment ('Pre-Nazca' plate), and reconstruct separate spreading histories for each segment relative to the Pacific plate, and find large differences in our 'Pre-Nazca'-Pacific history compared to published models for this area. We subsequently investigate the difference in the

  12. Evolutionary History of Assassin Bugs (Insecta: Hemiptera: Reduviidae): Insights from Divergence Dating and Ancestral State Reconstruction

    Science.gov (United States)

    Hwang, Wei Song; Weirauch, Christiane

    2012-01-01

    Assassin bugs are one of the most successful clades of predatory animals based on their species numbers (∼6,800 spp.) and wide distribution in terrestrial ecosystems. Various novel prey capture strategies and remarkable prey specializations contribute to their appeal as a model to study evolutionary pathways involved in predation. Here, we reconstruct the most comprehensive reduviid phylogeny (178 taxa, 18 subfamilies) to date based on molecular data (5 markers). This phylogeny tests current hypotheses on reduviid relationships emphasizing the polyphyletic Reduviinae and the blood-feeding, disease-vectoring Triatominae, and allows us, for the first time in assassin bugs, to reconstruct ancestral states of prey associations and microhabitats. Using a fossil-calibrated molecular tree, we estimated divergence times for key events in the evolutionary history of Reduviidae. Our results indicate that the polyphyletic Reduviinae fall into 11–14 separate clades. Triatominae are paraphyletic with respect to the reduviine genus Opisthacidius in the maximum likelihood analyses; this result is in contrast to prior hypotheses that found Triatominae to be monophyletic or polyphyletic and may be due to the more comprehensive taxon and character sampling in this study. The evolution of blood-feeding may thus have occurred once or twice independently among predatory assassin bugs. All prey specialists evolved from generalist ancestors, with multiple evolutionary origins of termite and ant specializations. A bark-associated life style on tree trunks is ancestral for most of the lineages of Higher Reduviidae; living on foliage has evolved at least six times independently. Reduviidae originated in the Middle Jurassic (178 Ma), but significant lineage diversification only began in the Late Cretaceous (97 Ma). The integration of molecular phylogenetics with fossil and life history data as presented in this paper provides insights into the evolutionary history of reduviids and clears

  13. Archaeology of fire: Methodological aspects of reconstructing fire history of prehistoric archaeological sites

    Science.gov (United States)

    Alperson-Afil, Nira

    2012-07-01

    Concepts which are common in the reconstruction of fire histories are employed here for the purpose of interpreting fires identified at archaeological sites. When attempting to evaluate the fire history of ancient occupations we are limited by the amount and quality of the available data. Furthermore, the identification of archaeological burned materials, such as stone, wood, and charcoal, is adequate for the general assumption of a "fire history", but the agent responsible - anthropogenic or natural - cannot be inferred from the mere presence of burned items. The large body of scientific data that has accumulated, primarily through efforts to prevent future fire disasters, enables us to reconstruct scenarios of past natural fires. Adopting this line of thought, this paper attempts to evaluate the circumstances in which a natural fire may have ignited and spread at the 0.79 Ma occupation site of Gesher Benot Ya'aqov (Israel), resulting in burned wood and burned flint within the archaeological layers. At Gesher Benot Ya'aqov, possible remnants of hearths are explored through analyses of the spatial distribution of burned flint-knapping waste products. These occur in dense clusters in each of the archaeological occupations throughout the long stratigraphic sequence. In this study, the combination of the spatial analysis results, paleoenvironmental information, and the various factors involved in the complex process of fire ignition, combustion, and behavior has enabled the firm rejection of recurrent natural fires as the agent responsible for the burned materials. In addition, it suggested that, mainly at early sites where evidence for burning is present yet scarce, data on fire ecology can be particularly useful when considered in relation to paleoenvironmental information.

  14. Bayesian Penalized Likelihood Image Reconstruction (Q.Clear) in 82Rb Cardiac PET: Impact of Count Statistics

    DEFF Research Database (Denmark)

    Christensen, Nana Louise; Tolbod, Lars Poulsen

    PET scans. 3) Static and dynamic images from a set of 7 patients (BSA: 1.6-2.2 m2) referred for 82Rb cardiac PET was analyzed using a range of beta factors. Results were compared to the institution’s standard clinical practice reconstruction protocol. All scans were performed on GE DMI Digital-Ready...

  15. The history of the Society of Urodynamics, Female Pelvic Medicine, and Urogenital Reconstruction.

    Science.gov (United States)

    Weissbart, Steven J; Zimmern, Philippe E; Nitti, Victor W; Lemack, Gary E; Kobashi, Kathleen C; Vasavada, Sandip P; Wein, Alan J

    2018-03-25

    To review the history of the Society of Urodynamics, Female Pelvic Medicine and Urogenital Reconstruction (SUFU). We reviewed Society meeting minutes, contacted all living former Society presidents, searched the William P. Didusch Center for Urology History records, and asked Society members to share their important Society experiences in order to gather important historical information about the Society. The Society initially formed as the Urodynamics Society in 1969 in the backdrop of a growing passion for scientific research in the country after World War II ended. Since then, Society meetings have provided a pivotal forum for the advancement of science in lower urinary tract dysfunction. Meetings occurred annually until 2004, when the meeting schedule increased to biannual. The journal, Neurourology and Urodynamics, became the official journal of the Society in 2005. SUFU has authored important guidelines on urodynamics (2012), non-neurogenic overactive bladder (2012), and stress urinary incontinence (2017) and has shared important collaborations with other societies, including the American Urological Association (AUA), the International Continence Society (ICS), and the International Society of Pelvic Neuromodulation (ISPiN). SUFU has also been instrumental in trainee education and helped to establish formal fellowship training in the field in addition to holding a yearly educational meeting for urology residents. The Society has been led by 21 presidents throughout its history. Throughout the Society's near half-century long existence, the Society has fostered research, published guidelines, and educated trainees in order to improve the care of individuals suffering from lower urinary tract dysfunction. © 2018 Wiley Periodicals, Inc.

  16. The ASTRA (Ancient instruments Sound/Timbre Reconstruction Application) Project brings history to life!

    Science.gov (United States)

    Avanzo, Salvatore; Barbera, Roberto; de Mattia, Francesco; Rocca, Giuseppe La; Sorrentino, Mariapaola; Vicinanza, Domenico

    ASTRA (Ancient instruments Sound/Timbre Reconstruction Application) is a project coordinated at the Conservatory of Music of Parma that aims to bring history to life. Ancient musical instruments can now be heard for the first time in hundreds of years, thanks to the successful synergy between art/humanities and science. The Epigonion, an instrument of the past, has been digitally recreated using gLite, an advanced middleware developed in the context of the EGEE project and research networks such as GÉANT2 in Europe and EUMEDCONNECT2 in the Mediterranean region. GÉANT2 and EUMEDCONNECT2, by connecting enormous and heterogeneous computing resources, provided the needed infrastructures to speed up the overall computation time and enable the computer-intensive modeling of musical sounds. This paper summarizes the most recent outcomes of the project, underlining how the Grid aspect of the computation can support the Cultural Heritage community.

  17. Backlash against American psychology: an indigenous reconstruction of the history of German critical psychology.

    Science.gov (United States)

    Teo, Thomas

    2013-02-01

    After suggesting that all psychologies contain indigenous qualities and discussing differences and commonalities between German and North American historiographies of psychology, an indigenous reconstruction of German critical psychology is applied. It is argued that German critical psychology can be understood as a backlash against American psychology, as a response to the Americanization of German psychology after WWII, on the background of the history of German psychology, the academic impact of the Cold War, and the trajectory of personal biographies and institutions. Using an intellectual-historical perspective, it is shown how and which indigenous dimensions played a role in the development of German critical psychology as well as the limitations to such an historical approach. Expanding from German critical psychology, the role of the critique of American psychology in various contexts around the globe is discussed in order to emphasize the relevance of indigenous historical research.

  18. Reconstructing the history of eutrophication and quantifying total nitrogen reference conditions in Bothnian Sea coastal waters

    Science.gov (United States)

    Andrén, Elinor; Telford, Richard J.; Jonsson, Per

    2017-11-01

    Reference total nitrogen (TN) concentrations for the Gårdsfjärden estuary in the central Bothnian Sea, which receives discharge from an industrial point-source, have been estimated from diatom assemblages using a transfer function. Sedimentological and diatom evidence imply a good ecological status before 1920 with an assemblage dominated by benthic taxa indicating excellent water transparency, high diatom species richness and less organic sedimentation resulting in homogeneous well oxygenated sediments. A change in the diatom assemblage starts between 1920 and 1935 when the species richness declines and the proportion of planktic taxa increases. Increased organic carbon sedimentation after 1920 led to hypoxic bottom waters, and the preservation of laminae in the sediments. The trend in the reconstructed TN-values agrees with the history of the discharge from the mill, reaching maximum impact during the high discharge between 1945 and 1990. The background condition for TN in Gårdsfjärden is 260-300 μg L-1, reconstructed until 1920.

  19. Evolutionary history of versatile-lipases from Agaricales through reconstruction of ancestral structures.

    Science.gov (United States)

    Barriuso, Jorge; Martínez, María Jesús

    2017-01-03

    Fungal "Versatile carboxylic ester hydrolases" are enzymes with great biotechnological interest. Here we carried out a bioinformatic screening to find these proteins in genomes from Agaricales, by means of searching for conserved motifs, sequence and phylogenetic analysis, and three-dimensional modeling. Moreover, we reconstructed the molecular evolution of these enzymes over time by inferring and analyzing the sequences of ancestral intermediate forms. The properties of the ancestral candidates are discussed on the basis of their three-dimensional structural models, the hydrophobicity of the lid, and the substrate binding intramolecular tunnel, revealing that all of them display the characteristic properties of these enzymes. The evolutionary history of the putative lipases revealed an increase in the length and hydrophobicity of the lid region, as well as in the size of the substrate binding pocket, over evolutionary time. These facts suggest the enzymes' specialization towards certain substrates and their subsequent loss of promiscuity. These results bring to light the presence of different pools of lipases in fungi with different habitats and lifestyles. Despite the consistency of the data gathered from reconstruction of ancestral sequences, the heterologous expression of some of these candidates would be essential to corroborate their activities.

  20. Gene regulatory network reconstruction using Bayesian networks, the Dantzig Selector, the Lasso and their meta-analysis.

    Directory of Open Access Journals (Sweden)

    Matthieu Vignes

    Full Text Available Modern technologies and especially next generation sequencing facilities are giving a cheaper access to genotype and genomic data measured on the same sample at once. This creates an ideal situation for multifactorial experiments designed to infer gene regulatory networks. The fifth "Dialogue for Reverse Engineering Assessments and Methods" (DREAM5 challenges are aimed at assessing methods and associated algorithms devoted to the inference of biological networks. Challenge 3 on "Systems Genetics" proposed to infer causal gene regulatory networks from different genetical genomics data sets. We investigated a wide panel of methods ranging from Bayesian networks to penalised linear regressions to analyse such data, and proposed a simple yet very powerful meta-analysis, which combines these inference methods. We present results of the Challenge as well as more in-depth analysis of predicted networks in terms of structure and reliability. The developed meta-analysis was ranked first among the 16 teams participating in Challenge 3A. It paves the way for future extensions of our inference method and more accurate gene network estimates in the context of genetical genomics.
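    The record above combines Bayesian networks with penalised regressions such as the Lasso and the Dantzig Selector. As a generic sketch of the Lasso ingredient (not the DREAM5 meta-analysis itself), the following toy coordinate-descent solver selects a sparse set of predictors for one target gene; the data and all names below are invented for illustration:

    ```python
    import random

    def soft_threshold(x, lam):
        # Shrink x toward zero by lam; this is the source of exact zeros
        # (i.e., absent network edges) in the Lasso.
        if x > lam:
            return x - lam
        if x < -lam:
            return x + lam
        return 0.0

    def lasso_cd(X, y, lam, n_iter=200):
        # Cyclic coordinate descent for 0.5*||y - X*beta||^2 + lam*||beta||_1.
        n, p = len(X), len(X[0])
        beta = [0.0] * p
        for _ in range(n_iter):
            for j in range(p):
                # Partial residual that excludes feature j's own contribution.
                r = [y[i] - sum(X[i][k] * beta[k] for k in range(p) if k != j)
                     for i in range(n)]
                rho = sum(X[i][j] * r[i] for i in range(n))
                z = sum(X[i][j] ** 2 for i in range(n))
                beta[j] = soft_threshold(rho, lam) / z
        return beta

    rng = random.Random(0)
    # Toy "expression" data: the target is driven by genes 0 and 1;
    # gene 2 is irrelevant and should receive a zero coefficient.
    X = [[rng.gauss(0.0, 1.0) for _ in range(3)] for _ in range(50)]
    y = [2.0 * row[0] - 1.0 * row[1] for row in X]
    beta = lasso_cd(X, y, lam=1.0)
    ```

    In a network setting the same fit is repeated with each gene in turn as the target, and an edge is drawn from predictor to target wherever the coefficient survives the penalty; the meta-analysis described in the abstract then combines such edge lists across methods.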

  1. Reconstructions of human history by mapping dental markers in living Eurasian populations

    Science.gov (United States)

    Kashibadze, Vera F.; Nasonova, Olga G.; Nasonov, Dmitry S.

    2013-01-01

    Using advances in gene geography and anthropophenetics, the phenogeographical method for anthropological research was initiated and developed using dental data. Statistical and cartographical analyses are provided for 498 living Eurasian populations. Mapping principal components supplied evidence for the phene pool structure in Eurasian populations, and for reconstructions of Homo sapiens history on the continent. Longitudinal variability seems to be the most important regularity revealed by principal components analysis (PCA) and mapping, indicating the division of the whole area into western and eastern main provinces. Thus, the most ancient scenario in the history of Eurasian populations developed from two distinct groups: a western group related to ancient populations of West Asia and an eastern one rooted in ancestry in South and/or East Asia. In spite of the enormous territory and the revealed divergence, the populations of the continent have undergone wide-scale and intensive time-space interaction. Many details in the revealed landscapes are background to different historical events. Migrations and assimilation are two essential phenomena in Eurasian history: the wide spread of the western combination through the whole continent to the Pacific coastline, and the movement of the paradoxical combinations of eastern and western markers from South or Central Asia to the east and west. Taking into account that no additional eastern combinations have been found in the total variation of Asian groups, only mixed or western marker sets, and that eastern dental characteristics have been traced in Asia since Homo erectus, the assumption is made in favour of hetero-level assimilation in the eastern province and of net-like evolution of H. sapiens.

  2. Contamination History of Lead and Other Trace Metals Reconstructed from an Urban Winter Pond in the Eastern Mediterranean Coast (Israel)

    NARCIS (Netherlands)

    Zohar, I.; Bookman, R.; Levin, N.; de Stigter, H.; Teutsch, N.

    2014-01-01

    Pollution history of Pb and other trace metals was reconstructed for the first time for the Eastern Mediterranean, from a small urban winter pond (Dora, Netanya), located at the densely populated coastal plain of Israel. An integrated approach including geochemical, sedimentological, and historical

  3. Reconstructing the past outburst history of Eta Carinae from WFPC2 proper motions

    Science.gov (United States)

    Smith, Nathan

    2016-10-01

    The HST archive contains multiple epochs of WFPC2 images of the nebula around Eta Carinae taken over a 15-year timespan, although only the earliest few years of data have been analyzed and published. The fact that all these images were taken with the same instrument, with the same pixel sampling and field distortion, makes them an invaluable resource for accurately measuring the expanding ejecta. The goal of a previously accepted AR proposal was to analyze the full set of appropriate continuum-filter HST images to place precise constraints on the average ejection date of the Homunculus Nebula; this analysis is now complete (Smith et al. 2016) and the nebula appears to have been ejected in the second half of 1847. Here we propose to continue this project by constraining the motion of the more extended and much older Outer Ejecta around Eta Carinae. Older material outside the main bipolar nebula traces previous major outbursts of the star with no recorded historical observations. We propose an ambitious reduction and analysis of the complete WFPC2 imaging dataset of Eta Car. These data can reconstruct its violent mass-loss history over the past thousand years. We have already started this by analyzing two epochs of ACS F658N images, and astonishingly, these data suggested two previous eruptions in the 13th and 15th centuries assuming ballistic motion. WFPC2 images will extend the baseline by 10 yr, and critically, more than 2 epochs allow us to measure any deceleration in the ejecta. We will also analyze Doppler shifts in ground-based spectra in order to reconstruct the 3D geometry of past mass ejection. This AR proposal will fund the final year of a PhD thesis.

  4. Reconstructing a multi-centennial drought history for the British Isles

    Science.gov (United States)

    Macdonald, Neil; Chiverrell, Richard; Todd, Beverley; Bowen, James; Lennard, Amy

    2016-04-01

    The last two decades have witnessed some of the most severe droughts experienced within living memory in the UK, but have these droughts really been exceptional? Relatively few instrumental river flow, groundwater or reservoir series extend beyond 50 years in length, with few precipitation series currently available extending over 100 years. These relatively short series present considerable challenges in determining current and future drought risk, with the results affecting society and the economy. This study uses long instrumental precipitation series coupled with the SPI and scPDSI drought indices to reconstruct drought histories from different parts of the British Isles. Existing long precipitation series have been reassessed and several new precipitation series reconstructed (e.g. Carlisle 1757), with eight series now over 200 years in length, and a further thirteen over 150 years, with further sites currently being developed (e.g. Norwich, 1749-; Exeter, 1724-). This study focuses on the eight longest series, with shorter series used to help explore spatial and temporal variability across the British Isles. We show how historical series have improved understanding of severe droughts, by examining past spatial and temporal variability and exploring the climatic drivers responsible. It shows that recent well-documented droughts (e.g. 1976; 1996 and 2010), which have shaped both public and water resource managers' perceptions of risk, have historically been exceeded in both severity (e.g. 1781) and duration (e.g. 1798-1810); with the largest droughts often transcending single catchments and affecting regions. Recent droughts are not exceptional when considered within a multi-centennial timescale, with improved understanding of historical events raising concerns in contemporary water resource management.
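    The SPI mentioned above standardizes accumulated precipitation so that droughts from different eras of a long series are directly comparable. A minimal sketch of the idea follows; note that the real SPI fits a gamma distribution to each accumulation window and maps it through the standard normal quantile function, whereas this dependency-free toy uses plain z-scores, and the precipitation series is invented:

    ```python
    import statistics

    def spi_like_index(monthly_precip, window=12):
        # z-scores of rolling 12-month precipitation totals; negative values
        # indicate drier-than-usual conditions. (The real SPI replaces this
        # plain standardization with a fitted gamma distribution.)
        totals = [sum(monthly_precip[i - window:i])
                  for i in range(window, len(monthly_precip) + 1)]
        mean = statistics.fmean(totals)
        sd = statistics.stdev(totals)
        return [(t - mean) / sd for t in totals]

    # Invented series: a steady 80 mm/month with a 12-month dry spell.
    precip = [80.0] * 60
    for month in range(30, 42):
        precip[month] = 30.0
    index = spi_like_index(precip)
    ```

    Windows covering the dry spell score strongly negative, which is how a multi-centennial reconstruction lets droughts such as 1781 or 1798-1810 be ranked against recent events on a common scale.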

  5. Bayesian analysis in plant pathology.

    Science.gov (United States)

    Mila, A L; Carriquiry, A L

    2004-09-01

    Bayesian methods are currently much discussed and applied in several disciplines from molecular biology to engineering. Bayesian inference is the process of fitting a probability model to a set of data and summarizing the results via probability distributions on the parameters of the model and unobserved quantities such as predictions for new observations. In this paper, after a short introduction of Bayesian inference, we present the basic features of Bayesian methodology using examples from sequencing genomic fragments and analyzing microarray gene-expression levels, reconstructing disease maps, and designing experiments.
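    The paragraph above describes the basic Bayesian workflow: fit a probability model and summarize the posterior. A minimal worked example in that spirit (the numbers are invented, not taken from the paper) uses the conjugate Beta-Binomial pair to estimate a disease incidence rate:

    ```python
    def beta_binomial_posterior(a_prior, b_prior, diseased, total):
        # A Beta(a, b) prior on incidence combined with a Binomial likelihood
        # gives a Beta(a + diseased, b + healthy) posterior in closed form.
        a_post = a_prior + diseased
        b_post = b_prior + (total - diseased)
        posterior_mean = a_post / (a_post + b_post)
        return a_post, b_post, posterior_mean

    # Uniform Beta(1, 1) prior; 12 diseased plants observed out of 40 sampled.
    a_post, b_post, incidence = beta_binomial_posterior(1, 1, 12, 40)
    ```

    The full posterior Beta(13, 29) also yields credible intervals and predictions for new observations, which are the posterior summaries the abstract refers to; non-conjugate models require the sampling methods (e.g. MCMC) used elsewhere in this collection.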

  6. Bayesian phylogeography finds its roots.

    Directory of Open Access Journals (Sweden)

    Philippe Lemey

    2009-09-01

    Full Text Available As a key factor in endemic and epidemic dynamics, the geographical distribution of viruses has been frequently interpreted in the light of their genetic histories. Unfortunately, inference of historical dispersal or migration patterns of viruses has mainly been restricted to model-free heuristic approaches that provide little insight into the temporal setting of the spatial dynamics. The introduction of probabilistic models of evolution, however, offers unique opportunities to engage in this statistical endeavor. Here we introduce a Bayesian framework for inference, visualization and hypothesis testing of phylogeographic history. By implementing character mapping in Bayesian software that samples time-scaled phylogenies, we enable the reconstruction of timed viral dispersal patterns while accommodating phylogenetic uncertainty. Standard Markov model inference is extended with a stochastic search variable selection procedure that identifies the parsimonious descriptions of the diffusion process. In addition, we propose priors that can incorporate geographical sampling distributions or characterize alternative hypotheses about the spatial dynamics. To visualize the spatial and temporal information, we summarize inferences using virtual globe software. We describe how Bayesian phylogeography compares with previous parsimony analysis in the investigation of the influenza A H5N1 origin and H5N1 epidemiological linkage among sampling localities. Analysis of rabies in West African dog populations reveals how virus diffusion may enable endemic maintenance through continuous epidemic cycles. From these analyses, we conclude that our phylogeographic framework will be an important asset in molecular epidemiology that can be easily generalized to infer biogeography from genetic data for many organisms.

  7. RECONSTRUCTING THE PHOTOMETRIC LIGHT CURVES OF EARTH AS A PLANET ALONG ITS HISTORY

    International Nuclear Information System (INIS)

    Sanromá, E.; Pallé, E.

    2012-01-01

    By utilizing satellite-based estimations of the distribution of clouds, we have studied Earth's large-scale cloudiness behavior according to latitude and surface types (ice, water, vegetation, and desert). These empirical relationships are used here to reconstruct the possible cloud distribution of past epochs of Earth's history, such as the Late Cretaceous (90 Ma ago), the Late Triassic (230 Ma ago), the Mississippian (340 Ma ago), and the Late Cambrian (500 Ma ago), when the landmass distributions were different from today's. With this information, we have been able to simulate the globally integrated photometric variability of the planet at these epochs. We find that our simple model reproduces well the observed cloud distribution and albedo variability of the modern Earth. Moreover, the model suggests that the photometric variability of the Earth was probably much larger in past epochs. This enhanced photometric variability could improve the chances for the difficult determination of the rotational period and the identification of continental landmasses for distant planets.

  8. Reconstructing the complex evolutionary history of mobile plasmids in red algal genomes

    Science.gov (United States)

    Lee, JunMo; Kim, Kyeong Mi; Yang, Eun Chan; Miller, Kathy Ann; Boo, Sung Min; Bhattacharya, Debashish; Yoon, Hwan Su

    2016-01-01

    The integration of foreign DNA into algal and plant plastid genomes is a rare event, with only a few known examples of horizontal gene transfer (HGT). Plasmids, which are well-studied drivers of HGT in prokaryotes, have been reported previously in red algae (Rhodophyta). However, the distribution of these mobile DNA elements and their sites of integration into the plastid (ptDNA), mitochondrial (mtDNA), and nuclear genomes of Rhodophyta remain unknown. Here we reconstructed the complex evolutionary history of plasmid-derived DNAs in red algae. Comparative analysis of 21 rhodophyte ptDNAs, including new genome data for 5 species, turned up 22 plasmid-derived open reading frames (ORFs) that showed syntenic and copy number variation among species, but were conserved within different individuals in three lineages. Several plasmid-derived homologs were found not only in ptDNA but also in mtDNA and in the nuclear genome of green plants, stramenopiles, and rhizarians. Phylogenetic and plasmid-derived ORF analyses showed that the majority of plasmid DNAs originated within red algae, whereas others were derived from cyanobacteria, other bacteria, and viruses. Our results elucidate the evolution of plasmid DNAs in red algae and suggest that they spread as parasitic genetic elements. This hypothesis is consistent with their sporadic distribution within Rhodophyta. PMID:27030297

  9. Reconstructing the colonization history of lost wolf lineages by the analysis of the mitochondrial genome.

    Science.gov (United States)

    Matsumura, Shuichi; Inoshima, Yasuo; Ishiguro, Naotaka

    2014-11-01

    The grey wolf (Canis lupus) originally inhabited major parts of the Northern hemisphere, but many local populations became extinct. Two lineages of wolves in Japan, namely, Japanese or Honshu (C. l. hodophilax) and Ezo or Hokkaido (C. l. hattai) wolves, rapidly went extinct between 100 and 120 years ago. Here we analyse the complete mitochondrial genome sequences from ancient specimens and reconstruct the colonization history of the two extinct subspecies. We show that Japanese wolves hold a unique position in wolf phylogeny, suggesting a long separation from other grey wolf populations. Japanese wolves appear to have colonized the Japanese archipelago in the Late Pleistocene (ca. 25,000-125,000 years ago). By contrast, Ezo wolves, which are clearly separated from Japanese wolves in the phylogeny, are likely to have arrived in Japan relatively recently (wolf populations in Europe and America during the last several millennia. Our analyses suggest that at least several thousand wolves once inhabited the Japanese archipelago. Our analyses also show that an enigmatic clade of domestic dogs is likely to have originated from rare admixture events between male dogs and female Japanese wolves. Copyright © 2014 Elsevier Inc. All rights reserved.

  10. Ending the history of silence: reconstructing European Slave trading in the Indian Ocean

    Directory of Open Access Journals (Sweden)

    Richard B. Allen

    Full Text Available Abstract: Thirty-eight years ago, Hubert Gerbeau discussed the problems that contributed to the “history of silence” surrounding slave trading in the Indian Ocean. While the publication of an expanding body of scholarship since the late 1980s demonstrates that this silence is not as deafening as it once was, our knowledge and understanding of this traffic in chattel labor remain far from complete. This article discusses the problems surrounding attempts to reconstruct European slave trading in the Indian Ocean between 1500 and 1850. Recently created inventories of British East India Company slaving voyages during the seventeenth and eighteenth centuries and of French, Portuguese, and other voyages involving the Mascarene Islands of Mauritius and Réunion between 1670 and the 1830s not only shed light on the nature and dynamics of British and French slave trading in the Indian Ocean, but also highlight topics and issues that future research on European slave trading within and beyond this oceanic world will need to address.

  11. A paleoecologic reconstruction of the history of Featherbed Bank, Biscayne National Park, Biscayne Bay, Florida

    Science.gov (United States)

    Stone, Jeffery R.; Cronin, T. M.; Brewster-Wingard, G. L.; Ishman, S.E.; Wardlaw, B.R.; Holmes, C.W.

    2000-01-01

    Using multiple-proxy biological indicators, a paleoecological history of the past 550 years of Featherbed Bank, Biscayne Bay, has been reconstructed from a short (2.26 m) sediment core. Paleoecological changes in ostracode, mollusc, and foraminifer assemblages show that core SEI297-FB-1 can be divided into three distinctly different zones, which together provide evidence for distinct changes in historical environmental conditions at Featherbed Bank. Assemblages from fossil biotic communities within zone 1, representing approximately 1440 to 1550 AD, are characterized by open-marine biota with relatively limited numbers of epiphytic biota. Molluscan faunal indicators suggest the sediment was capable of supporting infaunal organisms and that faunal richness was relatively limited during this time period. A change in the biotic community occurred around 1550 AD and continued until the late 1800s, distinguishing zone 2. Fossil biotic indicators from zone 2 show a strong dominance of epiphytic organisms within all of the biotic communities examined. Foraminifers, molluscs, and ostracodes capable of subsisting in salinities slightly lower than normal marine begin to flourish in this time period, and there is a marked decline in infaunal molluscs. Zone 2 assemblages are replaced around 1900 AD by increased numbers of organisms that typify open-marine conditions and a return to decreased epiphytic assemblages, similar to zone 1. Zone 3 assemblages, however, show some strong dissimilarities from zone 1, including limited infaunal molluscs, increased abundances of the ostracode Malzella floridana, and a significant increase in molluscan faunal richness.

  12. DIYABC v2.0: a software to make approximate Bayesian computation inferences about population history using single nucleotide polymorphism, DNA sequence and microsatellite data.

    Science.gov (United States)

    Cornuet, Jean-Marie; Pudlo, Pierre; Veyssier, Julien; Dehne-Garcia, Alexandre; Gautier, Mathieu; Leblois, Raphaël; Marin, Jean-Michel; Estoup, Arnaud

    2014-04-15

    DIYABC is a software package for a comprehensive analysis of population history using approximate Bayesian computation on DNA polymorphism data. Version 2.0 implements a number of new features and analytical methods. It allows (i) the analysis of single nucleotide polymorphism data at a large number of loci, in addition to microsatellite and DNA sequence data, (ii) efficient Bayesian model choice using linear discriminant analysis on summary statistics and (iii) the serial launching of multiple post-processing analyses. DIYABC v2.0 also includes a user-friendly graphical interface with various new options. It can be run on three operating systems: GNU/Linux, Microsoft Windows and Apple OS X. Freely available with a detailed notice document and example projects to academic users at http://www1.montpellier.inra.fr/CBGP/diyabc. Contact: estoup@supagro.inra.fr. Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
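    The rejection flavor of approximate Bayesian computation on which tools like DIYABC build can be illustrated in a few lines. This is a hypothetical toy model (uniform prior, Poisson-distributed data, sample mean as the only summary statistic), not DIYABC's actual simulator or summary-statistic set:

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy "observed" dataset: the summary statistic (sample mean) of data
# generated under a known parameter value, so we can check the sketch.
true_theta = 4.0
observed = rng.poisson(true_theta, size=200).mean()

def abc_rejection(observed_stat, n_draws=20_000, tolerance=0.1):
    """Basic ABC rejection: draw theta from the prior, simulate a dataset
    for each draw, and keep the draws whose simulated summary statistic
    falls within `tolerance` of the observed one."""
    prior_draws = rng.uniform(0.0, 10.0, size=n_draws)   # flat prior on theta
    sim_stats = rng.poisson(prior_draws, size=(200, n_draws)).mean(axis=0)
    return prior_draws[np.abs(sim_stats - observed_stat) < tolerance]

posterior = abc_rejection(observed)
```

    The accepted draws approximate the posterior; model choice in the ABC setting compares how often simulations from competing scenarios land near the observed summaries.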

  13. A comprehensive and integrative reconstruction of evolutionary history for Anomura (Crustacea: Decapoda).

    Science.gov (United States)

    Bracken-Grissom, Heather D; Cannon, Maren E; Cabezas, Patricia; Feldmann, Rodney M; Schweitzer, Carrie E; Ahyong, Shane T; Felder, Darryl L; Lemaitre, Rafael; Crandall, Keith A

    2013-06-20

    The infraorder Anomura has long captivated the attention of evolutionary biologists due to its impressive morphological diversity and ecological adaptations. To date, 2500 extant species have been described but phylogenetic relationships at high taxonomic levels remain unresolved. Here, we reconstruct the evolutionary history (phylogeny, divergence times, character evolution and diversification) of this speciose clade. For this purpose, we sequenced two mitochondrial (16S and 12S) and three nuclear (H3, 18S and 28S) markers for 19 of the 20 extant families, using traditional Sanger and next-generation 454 sequencing methods. Molecular data were combined with 156 morphological characters in order to estimate the largest anomuran phylogeny to date. The anomuran fossil record allowed us to incorporate 31 fossils for divergence time analyses. Our best phylogenetic hypothesis (morphological + molecular data) supports most anomuran superfamilies and families as monophyletic. However, three families and eleven genera are recovered as para- and polyphyletic. Divergence time analysis dates the origin of Anomura to the Late Permian ~259 (224-296) MYA with many of the present day families radiating during the Jurassic and Early Cretaceous. Ancestral state reconstruction suggests that carcinization occurred independently 3 times within the group. The invasion of freshwater and terrestrial environments both occurred between the Late Cretaceous and Tertiary. Diversification analyses found the speciation rate to be low across Anomura, and we identify 2 major changes in the tempo of diversification; the most significant at the base of a clade that includes the squat-lobster family Chirostylidae. Our findings are compared against current classifications and previous hypotheses of anomuran relationships. Many families and genera appear to be poly- or paraphyletic suggesting a need for further taxonomic revisions at these levels. A divergence time analysis provides key insights

  14. Reconstructing the History of Mesoamerican Populations through the Study of the Mitochondrial DNA Control Region

    Science.gov (United States)

    Gorostiza, Amaya; Acunha-Alonzo, Víctor; Regalado-Liu, Lucía; Tirado, Sergio; Granados, Julio; Sámano, David; Rangel-Villalobos, Héctor; González-Martín, Antonio

    2012-01-01

    Genetic information makes it possible to reconstruct the history of human populations. We sequenced the entire mtDNA control region (positions 16,024 to 576 following the Cambridge Reference Sequence, CRS) of 605 individuals from seven Mesoamerican indigenous groups and one Aridoamerican group from the previously defined Greater Southwest, all within present-day Mexico. Samples were collected directly from the indigenous populations; an individual survey made it possible to exclude samples from related individuals or individuals of other origins. Diversity indices and demographic estimates were calculated, and AMOVAs were computed according to different criteria. An MDS plot, based on FST distances, was also built, and we constructed individual networks for the four Amerindian haplogroups detected. Finally, barrier software was applied to detect genetic boundaries among populations. The results suggest a common origin of the indigenous groups, a small degree of European admixture, and inter-ethnic gene flow. The process of Mesoamerica's human settlement took place quickly, influenced by the region's orography, which facilitated the development of genetic and cultural differences. We find that the genetic structure is related to the region's geography rather than to cultural parameters, such as language. The human population gradually became fragmented, though groups remained relatively isolated, and differentiated due to small population sizes and different survival strategies. Genetic differences were detected between Aridoamerica and Mesoamerica, which can be subdivided into “East”, “Center”, “West” and “Southeast”. The fragmentation process occurred mainly during the Mesoamerican Pre-Classic period, with the Otomí being one of the oldest groups. With an increased number of populations studied adding previously published data, there is no change in the conclusions, although significant genetic heterogeneity can be detected in Pima

  15. Reconstruction of the evolutionary history of the LexA-binding sequence.

    Science.gov (United States)

    Mazón, Gerard; Erill, Ivan; Campoy, Susana; Cortés, Pilar; Forano, Evelyne; Barbé, Jordi

    2004-11-01

    In recent years, the recognition sequence of the SOS repressor LexA protein has been identified for several bacterial clades, such as the Gram-positive, green non-sulfur bacteria and Cyanobacteria phyla, or the 'Alphaproteobacteria', 'Deltaproteobacteria' and 'Gammaproteobacteria' classes. Nevertheless, the evolutionary relationship among these sequences and the proteins that recognize them has not been analysed. Fibrobacter succinogenes is an anaerobic Gram-negative bacterium that branched from a common bacterial ancestor immediately before the Proteobacteria phylum. Taking advantage of its intermediate position in the phylogenetic tree, and in an effort to reconstruct the evolutionary history of LexA-binding sequences, the F. succinogenes lexA gene has been isolated and its product purified to identify its DNA recognition motif through electrophoretic mobility assays and footprinting experiments. After comparing the available LexA DNA-binding sequences with the F. succinogenes one, reported here, directed mutagenesis of the F. succinogenes LexA-binding sequence and phylogenetic analyses of LexA proteins have revealed the existence of two independent evolutionary lanes for the LexA recognition motif that emerged from the Gram-positive box: one generating the Cyanobacteria and 'Alphaproteobacteria' LexA-binding sequences, and the other giving rise to the F. succinogenes and Myxococcus xanthus ones, in a transitional step towards the current 'Gammaproteobacteria' LexA box. The contrast between the results reported here and the phylogenetic data available in the literature suggests that, some time after its emergence as a distinct bacterial class, the 'Alphaproteobacteria' lost its vertically received lexA gene, but received later through lateral gene transfer a new lexA gene belonging to either a cyanobacterium or a bacterial species closely related to this phylum. This constitutes the first report based on experimental evidence of lateral gene transfer in the

  16. RECONSTRUCTING THE SOLAR WIND FROM ITS EARLY HISTORY TO CURRENT EPOCH

    International Nuclear Information System (INIS)

    Airapetian, Vladimir S.; Usmanov, Arcadi V.

    2016-01-01

    Stellar winds from active solar-type stars can play a crucial role in removal of stellar angular momentum and erosion of planetary atmospheres. However, major wind properties except for mass-loss rates cannot be directly derived from observations. We employed a three-dimensional magnetohydrodynamic Alfvén wave driven solar wind model, ALF3D, to reconstruct the solar wind parameters including the mass-loss rate, terminal velocity, and wind temperature at 0.7, 2, and 4.65 Gyr. Our model treats the wind thermal electrons, protons, and pickup protons as separate fluids and incorporates turbulence transport, eddy viscosity, turbulent resistivity, and turbulent heating to properly describe proton and electron temperatures of the solar wind. To study the evolution of the solar wind, we specified three input model parameters, the plasma density, Alfvén wave amplitude, and the strength of the dipole magnetic field at the wind base for each of three solar wind evolution models that are consistent with observational constraints. Our model results show that in the Sun's early history at 0.7 Gyr, the paleo solar wind at 1 AU was twice as fast, ∼50 times denser, and twice as hot as the current solar wind. The theoretical calculations of mass-loss rate appear to be in agreement with the empirically derived values for stars of various ages. These results can provide realistic constraints for wind dynamic pressures on magnetospheres of (exo)planets around the young Sun and other active stars, which is crucial for a realistic assessment of the Joule heating of their ionospheres and the corresponding effects of atmospheric erosion.

  17. Reconstruction of fire history of the Yukon-Kuskokwim Delta, Alaska

    Science.gov (United States)

    Sae-lim, J.; Mann, P. J.; Russell, J. M.; Natali, S.; Vachula, R. S.; Schade, J. D.; Holmes, R. M.

    2017-12-01

    Wildfire is an important disturbance in Arctic ecosystems and can cause abrupt perturbations in global carbon cycling and atmospheric chemistry. Over the next few decades, Arctic fire frequency, intensity and extent are projected to increase due to anthropogenic climate change, as regional air temperatures are increasing at more than twice the global average. In order to more accurately predict the anthropogenic impacts of climate change on tundra fire regimes, it is critical to have detailed knowledge of the natural frequency and extent of past wildfires. However, reliable historical records only extend back a few hundred years, whereas climatic shifts have affected fire regimes for thousands of years. In this work we analyzed a lake sediment core collected from the Yukon-Kuskokwim (YK) Delta, Alaska, a region that has recently experienced fire and is predicted to see increasing fire frequency in the near future. Our primary lake site is situated adjacent to recently burned areas, providing a `calibration' point and also attesting to the sensitivity of the sites. We used charcoal counts alongside geochemical measurements (C:N, 13C, 15N, 210Pb, X-ray fluorescence analyses of elemental chemistry) to reconstruct past fire history and ecosystem responses during the late Holocene. Average C (%C) and N (%N) concentrations in the core were 8.10 ±0.74% and 0.48 ±0.05%, resulting in C:N ratios of 16.80 ±0.74. The values are generally stable, with no obvious trend in C, N or C:N with depth; however, fluctuations in sediment composition in other nearby lake sediment cores possibly suggest shifts in lake conditions (oxic, anoxic) over time that might result from paleofires. This study provides a baseline for estimated fire return intervals in the YK Delta that will support more accurate projections of tundra fire frequencies under a changing climate.

  18. Evolution of life history and behavior in Hominidae: towards phylogenetic reconstruction of the chimpanzee-human last common ancestor.

    Science.gov (United States)

    Duda, Pavel; Zrzavý, Jan

    2013-10-01

    The origin of the fundamental behavioral differences between humans and our closest living relatives is one of the central issues of evolutionary anthropology. The prominent, chimpanzee-based referential model of early hominin behavior has recently been challenged on the basis of broad multispecies comparisons and newly discovered fossil evidence. Here, we argue that while behavioral data on extant great apes are extremely relevant for reconstruction of ancestral behaviors, these behaviors should be reconstructed trait by trait using formal phylogenetic methods. Using the widely accepted hominoid phylogenetic tree, we perform a series of character optimization analyses using 65 selected life-history and behavioral characters for all extant hominid species. This analysis allows us to reconstruct the character states of the last common ancestors of Hominoidea, Hominidae, and the chimpanzee-human last common ancestor. Our analyses demonstrate that many fundamental behavioral and life-history attributes of hominids (including humans) are evidently ancient and likely inherited from the common ancestor of all hominids. However, numerous behaviors present in extant great apes represent their own terminal autapomorphies (both uniquely derived and homoplastic). Any evolutionary model that uses a single extant species to explain behavioral evolution of early hominins is therefore of limited use. In contrast, phylogenetic reconstruction of ancestral states is able to provide a detailed suite of behavioral, ecological and life-history characters for each hypothetical ancestor. The living great apes therefore play an important role for the confident identification of the traits found in the chimpanzee-human last common ancestor, some of which are likely to represent behaviors of the fossil hominins. Copyright © 2013 Elsevier Ltd. All rights reserved.
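    Ancestral state reconstruction of the kind described here can be illustrated with Fitch parsimony for a single binary trait. The tree topology and trait values below are a hypothetical toy example (a single made-up behavioral character on a four-taxon hominid tree), not the study's 65-character dataset or its optimization method:

```python
def fitch(tree, leaf_states):
    """Post-order Fitch pass for one character.

    `tree` is a nested 2-tuple of leaf names; `leaf_states` maps each
    leaf to its observed state. Returns (possible states at the root,
    minimum number of state changes on the tree)."""
    if isinstance(tree, str):                 # leaf: its state, zero cost
        return {leaf_states[tree]}, 0
    left, right = tree
    left_set, left_cost = fitch(left, leaf_states)
    right_set, right_cost = fitch(right, leaf_states)
    overlap = left_set & right_set
    if overlap:                               # children agree: keep overlap
        return overlap, left_cost + right_cost
    return left_set | right_set, left_cost + right_cost + 1  # one change

# Toy question: is a derived trait (1) ancestral for the hominid clade?
tree = ((("human", "chimp"), "gorilla"), "orangutan")
states = {"human": 1, "chimp": 1, "gorilla": 0, "orangutan": 0}
root_states, changes = fitch(tree, states)
```

    Here the most parsimonious reconstruction places the ancestral state at 0 with a single gain on the human-chimp branch, which is the trait-by-trait logic the authors advocate over single-species referential models.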

  19. Reconstruction of paleostorm history using geochemical proxies in sediment cores from Eastern Lake, Florida

    Science.gov (United States)

    Das, O.; Wang, Y.; Donoghue, J. F.; Coor, J. L.; Kish, S.; Elsner, J.; Hu, X. B.; Niedoroda, A. W.; Ye, M.; Xu, Y.

    2009-12-01

    Analysis of geochemical proxies of coastal lake sediments provides a useful tool for reconstructing paleostorm history. Such paleostorm records can help constrain models that are used to predict future storm events. In this study, we collected two sediment cores (60 and 103 cm long, respectively) from the center of Eastern Lake located on the Gulf coast of NW Florida. These cores, which are mainly composed of organic-rich mud and organic-poor sand, were sub-sampled at 2-3mm intervals for analyses of their organic carbon and nitrogen concentrations as well as δ13C and δ15N isotopic signatures. Selected samples were submitted for radiocarbon dating in order to establish a chronological framework for the interpretation of the geochemical data. There are significant variations in δ13C, δ15N, C%, N% and C/N with depth. The δ13C and δ15N values vary from -21.8‰ to -26.7‰ and 2.6‰ to 5‰, respectively. The stable isotopic signatures of carbon and nitrogen indicate that the sources of organic matter in sediments include terrestrial C3 type vegetation, marine input from Gulf of Mexico and biological productivity within the lake, such as phytoplankton and zooplankton growing in the lacustrine environment. The δ13C and δ15N values exhibit significant negative excursions by 2‰ in a 30 cm thick sand layer, bounded by a rapid return to the base value. A positive shift in the δ15N record observed in the upper part of the cores likely reflects increased anthropogenic input of N such as sewage or septic tank effluents associated with recent development of areas around the lake for human habitation. Similarly, organic C% and N% range from 5.8 to 0.4 and 0.4 to 0.1, respectively. A prominent negative shift by 2σ relative to the baseline in C% and N% has been observed at approx. 55 to 58 cm depth, consisting of an organic-poor sand layer. This shift in C% and N% can be correlated with the negative shift in the δ13C and δ15N values, indicating a major storm event

  20. Reconstructing the tectonic history of Fennoscandia from its margins: The past 100 million years

    International Nuclear Information System (INIS)

    Muir Wood, R.

    1995-12-01

    In the absence of onland late Mesozoic and Cenozoic geological formations, the tectonic history of the Baltic Shield over the past 100 million years can most readily be reconstructed from the thick sedimentary basins that surround Fennoscandia on three sides. Tectonic activity around Fennoscandia through this period has been diverse but can be divided into four main periods: a. pre North Atlantic spreading ridge (100-60 Ma), when transpressional deformation on the southern margins of Fennoscandia and transtensional activity to the west was associated with a NNE-SSW maximum compressive stress direction; b. the creation of the spreading ridge (60-45 Ma), when there was rifting along the western margin; c. the re-arrangement of spreading axes (45-25 Ma), when there was radial compression around Fennoscandia; and d. the re-emergence of the Iceland hot-spot (25-0 Ma), when the stress field has come to accord with ridge or plume 'push'. Since 60 Ma the Alpine plate boundary has had little influence on Fennoscandia. The highest levels of deformation on the margins of Fennoscandia were achieved around 85 Ma and 60-55 Ma, with strain rates around 10^-9/year. Within the Baltic Shield, long-term strain rates have been around 10^-11/year, with little evidence for significant deformations passing into the shield from the margins. Fennoscandian Border Zone activity, which was prominent from 90-60 Ma, was largely abandoned following the creation of the Norwegian Sea spreading ridge, and with the exception of the Lofoten margin there is subsequently very little evidence for deformation passing into Fennoscandia. Renewal of modest compressional deformation in the Voering Basin suggests that the 'Current Tectonic Regime' is of Quaternary age, although the orientation of the major stress axis has remained approximately consistent since around 10 Ma. The past pattern of changes suggests that in the geological near-future, variations are to be anticipated in the magnitude rather than the orientation of stresses.


  2. A practical exact maximum compatibility algorithm for reconstruction of recent evolutionary history

    OpenAIRE

    Cherry, Joshua L.

    2017-01-01

    Background Maximum compatibility is a method of phylogenetic reconstruction that is seldom applied to molecular sequences. It may be ideal for certain applications, such as reconstructing phylogenies of closely-related bacteria on the basis of whole-genome sequencing. Results Here I present an algorithm that rapidly computes phylogenies according to a compatibility criterion. Although based on solutions to the maximum clique problem, this algorithm deals properly with ambiguities in the data.
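    For binary characters, pairwise compatibility reduces to the four-gamete test, and a maximum compatible character set corresponds to a maximum clique in the pairwise-compatibility graph. A brute-force sketch of that connection on invented toy characters (the paper's algorithm is far more efficient and also handles ambiguous states, which this sketch does not):

```python
from itertools import combinations

def compatible(char_a, char_b):
    """Four-gamete test: two binary characters can fit on one tree without
    homoplasy iff fewer than all four state pairs occur across the taxa."""
    return len(set(zip(char_a, char_b))) < 4

def max_compatible_set(characters):
    """Exhaustive search for a largest pairwise-compatible subset, i.e. a
    maximum clique in the compatibility graph; fine for a handful of
    characters, exponential in general."""
    n = len(characters)
    for size in range(n, 0, -1):                  # largest subsets first
        for subset in combinations(range(n), size):
            if all(compatible(characters[i], characters[j])
                   for i, j in combinations(subset, 2)):
                return list(subset)
    return []

# Three toy characters scored for five taxa.
chars = [(0, 0, 1, 1, 1), (0, 1, 1, 1, 0), (0, 0, 0, 1, 1)]
best = max_compatible_set(chars)
```

    On these toy data, the second character conflicts with both others, so the largest compatible set is characters 0 and 2.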

  3. Cross-sectional follow-up of voice outcomes in children who have a history of airway reconstruction surgery.

    Science.gov (United States)

    Cohen, W; Wynne, D M; Lloyd, S; Townsley, R B

    2018-04-01

    This study reports vocal function in a cross-section of children with subglottic stenosis. Each child had a history of laryngotracheal reconstruction and/or cricotracheal resection surgery. Vocal function was measured using laryngoscopy, acoustic analysis, perceptual evaluation and impact of voice on quality of life. All patients aged >5 years with a history of laryngotracheal reconstruction and/or cricotracheal resection surgery at the Scottish National Complex Airways service were invited to participate. Data were gathered in the Royal Hospital for Children in Glasgow in a single outpatient appointment. Twelve of 56 former patients (aged 5-27) provided a voice sample and eleven consented to awake laryngoscopy. All consented to detailed evaluation of their medical records. Acoustic analysis of fundamental frequency and pitch perturbation was conducted on the sustained vowel [a]. Perceptual evaluation was conducted by 4 trained listeners on a series of spoken sentences. Impact on quality of life was measured using the paediatric voice-related quality of life questionnaire. Laryngeal function was descriptively evaluated. Four children had normal voice acoustically, perceptually and in relation to voice-related quality of life. One of these had vocal fold nodules unrelated to surgical history. Two other children had "near normal" vocal function, defined where most voice measurements fell within the normal range. Normal or "near normal" voice is a possible outcome for children who have had this surgery. Where there is an ongoing complex medical condition, voice outcome may be poorer. © 2017 John Wiley & Sons Ltd.

  4. A comparison in a youth population between those with and without a history of concussion using biomechanical reconstruction.

    Science.gov (United States)

    Post, Andrew; Hoshizaki, T Blaine; Gilchrist, Michael D; Koncan, David; Dawson, Lauren; Chen, Wesley; Ledoux, Andrée-Anne; Zemek, Roger

    2017-04-01

    OBJECTIVE Concussion is a common topic of research as a result of the short- and long-term effects it can have on the affected individual. Of particular interest is whether previous concussions can lead to a biomechanical susceptibility, or vulnerability, to incurring further head injuries, particularly for youth populations. The purpose of this research was to compare the impact biomechanics of a concussive event in terms of acceleration and brain strains of 2 groups of youths: those who had incurred a previous concussion and those who had not. It was hypothesized that the youths with a history of concussion would have lower-magnitude biomechanical impact measures than those who had never suffered a previous concussion. METHODS Youths who had suffered a concussion were recruited from emergency departments across Canada. This pool of patients was then separated into 2 categories based on their history of concussion: those who had incurred 1 or more previous concussions, and those who had never suffered a concussion. The impact event that resulted in the brain injury was reconstructed biomechanically using computational, physical, and finite element modeling techniques. The output of the events was measured in biomechanical parameters such as energy, force, acceleration, and brain tissue strain to determine if those patients who had a previous concussion sustained a brain injury at lower magnitudes than those who had no previously reported concussion. RESULTS The results demonstrated that no biomechanical variable could distinguish between the groups with and without a history of concussion. CONCLUSIONS The results suggest that there is no measurable biomechanical vulnerability to head impact related to a history of concussions in this youth population. 
This may be a reflection of the long time between the previous concussion and the one reconstructed in the laboratory, where such a long period has been associated with

  5. The Influence of Sampling Density on Bayesian Age-Depth Models and Paleoclimatic Reconstructions - Lessons Learned from Lake Titicaca - Bolivia/Peru

    Science.gov (United States)

    Salenbien, W.; Baker, P. A.; Fritz, S. C.; Guedron, S.

    2014-12-01

Lake Titicaca is one of the most important archives of paleoclimate in tropical South America, and prior studies have elucidated patterns of climate variation at varied temporal scales over the past 0.5 Ma. Yet, slow sediment accumulation rates in the main deeper basin of the lake have precluded analysis of the lake's most recent history at high resolution. To obtain a paleoclimate record of the last few millennia at multi-decadal resolution, we obtained five short cores, ranging from 139 to 181 cm in length, from the shallower Wiñaymarka sub-basin of Lake Titicaca, where sedimentation rates are higher than in the lake's main basin. Selected cores have been analyzed for their geochemical signature by scanning XRF, diatom stratigraphy, sedimentology, and for 14C age dating. A total of 72 samples were 14C-dated using a Gas Ion Source automated high-throughput method for carbonate samples (mainly Littoridina sp. and Taphius montanus gastropod shells) at NOSAMS (Woods Hole Oceanographic Institution) with an analytical precision of better than 2%. The method has lower analytical precision compared with traditional AMS radiocarbon dating, but the lower cost enables analysis of a larger number of samples, and the error associated with the lower precision is relatively small for younger samples (< ~8,000 years). A 172-cm-long core was divided into centimeter-long sections, and 47 14C dates were obtained from 1-cm intervals, averaging one date every 3-4 cm. The other cores were radiocarbon dated with a sparser sampling density that focused on visual unconformities and shell beds. The high-resolution radiocarbon analysis reveals complex sedimentation patterns in visually continuous sections, with abundant indicators of bioturbated or reworked sediments and periods of very rapid sediment accumulation. These features are not evident in the sparser sampling strategy but have significant implications for reconstructing past lake level and paleoclimatic history.

  6. Investigating Efficiency of Time Domain Curve fitters Versus Filtering for Rectification of Displacement Histories Reconstructed from Acceleration Measurements

    DEFF Research Database (Denmark)

    Sichani, Mahdi Teimouri; Brincker, Rune

    2008-01-01

Computing displacements of a structure from its measured accelerations has been a major concern in some fields of engineering, such as earthquake engineering. In vibration engineering, too, displacements are occasionally preferred to acceleration histories, e.g. in the determination of forces applied...... on a structure. In brief, the major problem that accompanies reconstruction of the true displacement from an acceleration record is the unreal drift observed in the doubly integrated acceleration. The purpose of the present work is to address the source of the problem, introduce its treatments, show how they work and compare...
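The drift phenomenon and its filtering treatment discussed in this abstract can be illustrated with a short sketch. Everything below is a hypothetical example (the signal, the bias magnitude, and the 0.5 Hz Butterworth corner are arbitrary choices, not values from the paper):

```python
import numpy as np
from scipy import signal

# Hypothetical record: a 2 Hz sinusoidal motion sampled at 200 Hz.
fs = 200.0                                    # sampling rate [Hz]
t = np.arange(0.0, 10.0, 1.0 / fs)
omega = 2.0 * np.pi * 2.0                     # angular frequency [rad/s]
true_disp = 0.01 * np.sin(omega * t)          # true displacement [m]
acc = -0.01 * omega**2 * np.sin(omega * t)    # exact acceleration

# A small constant sensor bias: double integration turns it into
# a spurious drift that grows like t**2.
acc_meas = acc + 1e-4

# Naive double integration (rectangle rule) -> drifting displacement.
disp_naive = np.cumsum(np.cumsum(acc_meas)) / fs**2

# One common treatment: zero-phase high-pass filtering (4th-order
# Butterworth, 0.5 Hz corner) after each integration stage.
b, a = signal.butter(4, 0.5 / (fs / 2.0), btype="highpass")
vel = signal.filtfilt(b, a, np.cumsum(acc_meas) / fs)
disp_filt = signal.filtfilt(b, a, np.cumsum(vel) / fs)

err_naive = np.max(np.abs(disp_naive - true_disp))
err_filt = np.max(np.abs(disp_filt - true_disp))
print(err_naive, err_filt)  # the filtered estimate tracks the true motion more closely
```

High-pass filtering after each integration stage is one of the standard treatments the abstract alludes to; time-domain curve fitting (e.g. subtracting a fitted low-order baseline) is the alternative the paper compares it against.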

7. RECONSTRUCTING THE EVOLUTIONARY HISTORY OF THE FOREST FUNGAL PATHOGEN, ARMILLARIA MELLEA, IN TEMPERATE WORLDWIDE POPULATIONS

    Science.gov (United States)

    The forest pathogen Armillaria mellea s.s. (Basidiomycota, Physalacriaceae) is among the most significant forest pathogens causing root rot in northern temperate forest trees worldwide. Phylogenetic reconstructions for A. mellea show distinct European, Asian and North American lineages. The North Am...

  8. Reconstructing Iconic Experiments in Electrochemistry: Experiences from a History of Science Course

    Science.gov (United States)

    Eggen, Per-Odd; Kvittingen, Lise; Lykknes, Annette; Wittje, Roland

    2012-01-01

    The decomposition of water by electricity, and the voltaic pile as a means of generating electricity, have both held an iconic status in the history of science as well as in the history of science teaching. These experiments featured in chemistry and physics textbooks, as well as in classroom teaching, throughout the nineteenth and twentieth…

9. ROLE OF ARCHIVAL DOCUMENTS IN RECONSTRUCTION OF HISTORY OF STUDENT’S DEPARTMENT OF LIBRARY OF THE NOVOROSSIYSK UNIVERSITY

    Directory of Open Access Journals (Sweden)

    М. О. Подрезова

    2016-12-01

Full Text Available This article attempts to unite separate data on the history and activity of the Student’s Library of the Odessa I. I. Mechnikov National University. With the help of newly opened archival materials, some moments from the history of the creation of the fund of the students' reading room in the Richelieu Lyceum are reconstructed. The role of the trustee of the Odessa educational district, N. I. Pirogov, in the creation of the Student’s Library, and the process of its further transformation into the student’s department of the library of the Novorossiysk University, is shown. The completion of the library's fund through donations and purchases of books in different years of its activity is considered. Data on the receipt of books and money under the will of the university doctor P. A. Ivanov, aimed at the development of the educational and auxiliary institutions of the Novorossiysk University, are stated in detail.

  10. Contact-induced change in Dolgan : an investigation into the role of linguistic data for the reconstruction of a people's (pre)history

    NARCIS (Netherlands)

    Stapert, Eugénie

    2013-01-01

    This study explores the role of linguistic data in the reconstruction of Dolgan (pre)history. While most ethno-linguistic groups have a longstanding history and a clear ethnic and linguistic affiliation, the formation of the Dolgans has been a relatively recent development, and their ethnic origins

  11. Bayesian and likelihood phylogenetic reconstructions of morphological traits are not discordant when taking uncertainty into consideration: a comment on Puttick et al.

    Science.gov (United States)

    Brown, Joseph W; Parins-Fukuchi, Caroline; Stull, Gregory W; Vargas, Oscar M; Smith, Stephen A

    2017-10-11

    Puttick et al. (2017 Proc. R. Soc. B 284 , 20162290 (doi:10.1098/rspb.2016.2290)) performed a simulation study to compare accuracy among methods of inferring phylogeny from discrete morphological characters. They report that a Bayesian implementation of the Mk model (Lewis 2001 Syst. Biol. 50 , 913-925 (doi:10.1080/106351501753462876)) was most accurate (but with low resolution), while a maximum-likelihood (ML) implementation of the same model was least accurate. They conclude by strongly advocating that Bayesian implementations of the Mk model should be the default method of analysis for such data. While we appreciate the authors' attempt to investigate the accuracy of alternative methods of analysis, their conclusion is based on an inappropriate comparison of the ML point estimate, which does not consider confidence, with the Bayesian consensus, which incorporates estimation credibility into the summary tree. Using simulation, we demonstrate that ML and Bayesian estimates are concordant when confidence and credibility are comparably reflected in summary trees, a result expected from statistical theory. We therefore disagree with the conclusions of Puttick et al. and consider their prescription of any default method to be poorly founded. Instead, we recommend caution and thoughtful consideration of the model or method being applied to a morphological dataset. © 2017 The Author(s).

  12. Bayesian biostatistics

    CERN Document Server

    Lesaffre, Emmanuel

    2012-01-01

    The growth of biostatistics has been phenomenal in recent years and has been marked by considerable technical innovation in both methodology and computational practicality. One area that has experienced significant growth is Bayesian methods. The growing use of Bayesian methodology has taken place partly due to an increasing number of practitioners valuing the Bayesian paradigm as matching that of scientific discovery. In addition, computational advances have allowed for more complex models to be fitted routinely to realistic data sets. Through examples, exercises and a combination of introd

  13. History of cervical spine surgery: from nihilism to advanced reconstructive surgery.

    Science.gov (United States)

    Dweik, A; Van den Brande, E; Kossmann, T; Maas, A I R

    2013-11-01

Review of literature. To review and analyze the evolution of cervical spine surgery from ancient times to current practice. The aim is to present an accessible overview, primarily intended for a broad readership. Descriptive literature review and analysis of the development of cervical spine surgery from the prehistoric era until today. The first evidence for surgical treatment of spinal disorders dates back to approximately 1500 BC. Conservative approaches to treatment have been the hallmark for thousands of years, but over the past 50 years progress has been rapid. We illustrate how nations have added elements to this complex subject and how knowledge has surpassed borders and language barriers. Transferral of knowledge occurred from Babylon (Baghdad) to ancient Egypt, to the Greek and Roman empires, and finally via the Middle East (Baghdad and Damascus) back to Europe. Recent advances in the fields of anesthesia, imaging and spinal instrumentation have changed long-standing nihilism in the treatment of cervical spine pathologies to the current practice of advanced reconstructive surgery of the cervical spine. A critical approach to the evaluation of benefits and complications of these advanced surgical techniques for treatment of cervical spine disorders is required. Advances in surgery now permit full mechanical reconstruction of the cervical spine. However, despite substantial experimental progress, spinal cord repair and restoration of lost functions remain a challenge. Modern surgeons are still looking for the best way to manage spine disorders.

  14. A practical exact maximum compatibility algorithm for reconstruction of recent evolutionary history.

    Science.gov (United States)

    Cherry, Joshua L

    2017-02-23

Maximum compatibility is a method of phylogenetic reconstruction that is seldom applied to molecular sequences. It may be ideal for certain applications, such as reconstructing phylogenies of closely-related bacteria on the basis of whole-genome sequencing. Here I present an algorithm that rapidly computes phylogenies according to a compatibility criterion. Although based on solutions to the maximum clique problem, this algorithm deals properly with ambiguities in the data. The algorithm is applied to bacterial data sets containing up to nearly 2000 genomes with several thousand variable nucleotide sites. Run times are several seconds or less. Computational experiments show that maximum compatibility is less sensitive than maximum parsimony to the inclusion of nucleotide data that, though derived from actual sequence reads, has been identified as likely to be misleading. Maximum compatibility is a useful tool for certain phylogenetic problems, such as inferring the relationships among closely-related bacteria from whole-genome sequence data. The algorithm presented here rapidly solves fairly large problems of this type, and provides robustness against misleading characters that can pollute large-scale sequencing data.
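As background to the compatibility criterion: two binary characters are mutually compatible (explainable on a single tree without homoplasy) exactly when they do not exhibit all four state combinations, the classical four-gamete test. The sketch below illustrates that pairwise test and a naive compatibility graph; it is not the author's clique-based algorithm, and treating non-0/1 symbols as missing is only a simplistic stand-in for the paper's ambiguity handling:

```python
from itertools import combinations

def compatible(c1, c2):
    """Four-gamete test: two binary characters are compatible iff
    fewer than all four state pairs (0,0),(0,1),(1,0),(1,1) occur.
    Symbols other than '0'/'1' (e.g. '?') are ignored as missing."""
    pairs = {(a, b) for a, b in zip(c1, c2) if a in "01" and b in "01"}
    return len(pairs) < 4

# Hypothetical alignment columns, one string per variable site (taxa in rows).
sites = ["00110", "00111", "01100"]

# Pairwise compatibility graph over the sites.
graph = {(i, j): compatible(sites[i], sites[j])
         for i, j in combinations(range(len(sites)), 2)}
print(graph)  # {(0, 1): True, (0, 2): False, (1, 2): False}
```

Maximum compatibility then amounts to finding a largest set of pairwise-compatible characters, which is why the maximum clique problem appears.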

  15. Revised palaeogeographical reconstruction and avulsion history of the Holocene Rhine-Meuse delta, The Netherlands

    NARCIS (Netherlands)

    Stouthamer, E.; Cohen, K.M.; Hoek, W.Z.; Pierik, H.J.; Taal, L.J.; Hijma, M.P.; Bos, I.J.

    2013-01-01

In the Holocene Rhine-Meuse delta, the geography, architecture, and chronology of the channel belts and their flood basins are known in exceptionally high detail. This is due to a long history of intensive geological, geomorphological, and archeological research by various universities

  16. Using sedimentary archives to reconstruct pollution history and sediment provenance: The Ohre River, Czech Republic

    Czech Academy of Sciences Publication Activity Database

    Matys Grygar, Tomáš; Elznicová, J.; Kiss, T.; Smith, H. G.

    2016-01-01

    Roč. 144, SEP (2016), s. 109-129 ISSN 0341-8162 R&D Projects: GA ČR GA15-00340S Institutional support: RVO:61388980 Keywords : Polluted floodplains * Pollution history * Fluvial archives * Sediment provenance * Environmental geochemistry Subject RIV: DD - Geochemistry Impact factor: 3.191, year: 2016

  17. Reconstruction of burial history, temperature, source rock maturity and hydrocarbon generation in the northwestern Dutch offshore

    NARCIS (Netherlands)

    Abdul Fattah, R.; Verweij, J.M.; Witmans, N.; Veen, J.H. ten

    2012-01-01

    3D basin modelling is used to investigate the history of maturation and hydrocarbon generation on the main platforms in the northwestern part of the offshore area of the Netherlands. The study area covers the Cleaverbank and Elbow Spit Platforms. Recently compiled maps and data are used to build the

  18. Reconstruction of exposure histories of meteorites from Antarctica and the Sahara

    International Nuclear Information System (INIS)

    Neupert, U.; Neumann, S.; Leya, I.; Michel, R.; Kubik, P.W.; Bonani, G.; Hajdas, I.; Suter, M.

    1997-01-01

10Be, 14C, and 26Al were analyzed in H-, L-, and LL-chondrites from the Acfer region in the Algerian Sahara and from the Allan Hills/Antarctica. Exposure histories and terrestrial ages could be determined. (author) 3 figs., 2 refs.

  19. Reconstruction of exposure histories of meteorites from Antarctica and the Sahara

    Energy Technology Data Exchange (ETDEWEB)

    Neupert, U.; Neumann, S.; Leya, I.; Michel, R. [Hannover Univ. (Germany). Zentraleinrichtung fuer Strahlenschutz (ZfS); Kubik, P.W. [Paul Scherrer Inst. (PSI), Villigen (Switzerland); Bonani, G.; Hajdas, I.; Suter, M. [Eidgenoessische Technische Hochschule, Zurich (Switzerland)

    1997-09-01

10Be, 14C, and 26Al were analyzed in H-, L-, and LL-chondrites from the Acfer region in the Algerian Sahara and from the Allan Hills/Antarctica. Exposure histories and terrestrial ages could be determined. (author) 3 figs., 2 refs.

20. Bayesian methods to restore and rebuild images: application to gammagraphy and to photofission tomography; Methodes bayesiennes pour la restauration et la reconstruction d'images application a la gammagraphie et a la tomographie par photofissions

    Energy Technology Data Exchange (ETDEWEB)

    Stawinski, G

    1998-10-26

Bayesian algorithms are developed to solve inverse problems in gamma imaging and photofission tomography. The first part of this work is devoted to the modeling of our measurement systems. Two models have been found for both applications: the first one is a simple conventional model and the second one is a cascaded point process model. EM and MCMC Bayesian algorithms for image restoration and image reconstruction have been developed for these models and compared. The cascaded point process model does not significantly improve the results previously obtained with the classical model. Two original approaches have been proposed, which improve on the results previously obtained. The first approach uses an inhomogeneous Markov Random Field as a prior law, and makes the regularization parameter vary spatially. However, the problem of the estimation of hyper-parameters has not been solved. In the case of the deconvolution of point sources, a second approach has been proposed, which introduces a high-level prior model. The picture is modeled as a list of objects, whose parameters and number are unknown. The results obtained with this method are more accurate than those obtained with the conventional Markov Random Field prior model and require less computational cost. (author)

  1. Genetic Structuration, Demography and Evolutionary History of Mycobacterium tuberculosis LAM9 Sublineage in the Americas as Two Distinct Subpopulations Revealed by Bayesian Analyses

    Science.gov (United States)

    Reynaud, Yann; Millet, Julie; Rastogi, Nalin

    2015-01-01

Tuberculosis (TB) remains broadly present in the Americas despite intense global efforts for its control and elimination. Starting from a large dataset comprising spoligotyping (n = 21183 isolates) and 12-loci MIRU-VNTR data (n = 4022 isolates) from a total of 31 countries of the Americas (data extracted from the SITVIT2 database), this study aimed to get an overview of lineages circulating in the Americas. A total of 17119 (80.8%) strains belonged to the Euro-American lineage 4, among which the most predominant genotypic family belonged to the Latin American and Mediterranean (LAM) lineage (n = 6386, 30.1% of strains). By combining classical phylogenetic analyses and Bayesian approaches, this study revealed for the first time a clear genetic structuration of the LAM9 sublineage into two subpopulations named LAM9C1 and LAM9C2, with distinct genetic characteristics. LAM9C1 was predominant in Chile, Colombia and USA, while LAM9C2 was predominant in Brazil, Dominican Republic, Guadeloupe and French Guiana. Globally, LAM9C2 was characterized by higher allelic richness as compared to LAM9C1 isolates. Moreover, the LAM9C2 sublineage appeared to expand close to twenty times more than LAM9C1 and showed older traces of expansion. Interestingly, a significant proportion of LAM9C2 isolates presented the typical signature of the ancestral LAM-RDRio MIRU-VNTR type (224226153321). Further studies based on Whole Genome Sequencing of LAM strains will provide the needed resolution to decipher the biogeographical structure and evolutionary history of this successful family. PMID:26517715

  2. From WWII to Kingston, Ontario: The History of Queen's University School of Medicine, Division of Plastic and Reconstructive Surgery.

    Science.gov (United States)

    Chung, Karen; Wyllie, Kenneth; Davidson, John

    2016-01-01

To describe the origin and development of the Division of Plastic and Reconstructive Surgery at the Queen's University School of Medicine (Kingston, Ontario). Research ethics board approval and privacy agreements from the Kingston General Hospital (KGH, Kingston, Ontario) medical archives were obtained. Primary and secondary data sources were identified. A systematic examination of newspaper archives, research literature, KGH medical advisory committee meeting minutes, and testimonies from Dr Kenneth Wyllie and Dr John Davidson was conducted. In 1949, Dr Albert Ross Tilley arrived at Queen's University in Kingston, Ontario. There, Tilley initiated the Burn Unit at the KGH and began monthly teaching during the academic semester. Ken Wyllie (Meds '55), Lloyd Carlson (Meds '57) and John Emery (Meds '57) were the notable progeny of his early initiatives. In 1963, Kenneth Wyllie founded the Division of Plastic and Reconstructive Surgery in Kingston, Ontario, having completed plastic surgery training in Toronto and Edinburgh with experiences in Stockholm (Sweden), Paris (France) and Baltimore (Maryland, USA). He was shortly joined by Pat Shoemaker (Meds '66). John Davidson (Meds '82) arrived in 1989, bringing an interest in microsurgery and critical inquiry to the division. Five notable surgeons, Cartotto (Meds '88), Watkins, Watters, Meathrel (Meds '03) and McKay, further enhanced the Division's clinical and academic mission. The collective activity of the Division of Plastic and Reconstructive Surgery at Queen's School of Medicine in its 66-year history has encouraged more than 40 others to pursue distinguished careers in the specialty throughout North America, including three past presidents of the Canadian Society of Plastic Surgeons.

  3. SAR wavefront reconstruction using motion-compensated phase history (polar format) data and DPCA-based GMTI

    Science.gov (United States)

    Soumekh, Mehrdad; Worrell, Steven W.; Zelnio, Edmund G.; Keaffaber, Brett L.

    2000-08-01

This paper addresses the problem of processing an X-band SAR database that was originally intended for processing via a polar format imaging algorithm. In our approach, we use the approximation-free SAR wavefront reconstruction. For this, the measured and motion-compensated phase history (polar format) data are processed in a multi-dimensional digital signal processing algorithm that yields alias-free slow-time samples. The resultant database is used for wavefront image formation. The X-band SAR system also provides a two-channel along-track monopulse database. The alias-free monopulse SAR data are used in a coherent signal subspace algorithm for Ground Moving Target Indication (GMTI). Results are provided.

  4. Assessing dorsal scute microchemistry for reconstruction of shortnose sturgeon life histories

    Science.gov (United States)

    Altenritter, Matthew E.; Kinnison, Michael T.; Zydlewski, Gayle B.; Secor, David H.; Zydlewski, Joseph D.

    2015-01-01

The imperiled status of sturgeons worldwide places priority on the identification and protection of critical habitats. We assessed the micro-structural and micro-chemical scope for a novel calcified structure, dorsal scutes, to be used for reconstruction of past habitat use and group separation in shortnose sturgeon (Acipenser brevirostrum). Dorsal scutes contained a dual-layered structure composed of a thin multi-layered translucent zone lying dorsally above a thicker multi-layered zone. Banding in the thick multi-layered zone correlated strongly with pectoral fin spine annuli, supporting the presence of chronological structuring that could contain a chemical record of past environmental exposure. Trace element profiles (Sr:Ca), collected using both wavelength-dispersive electron microprobe analysis and laser ablation inductively coupled plasma mass spectrometry, suggest scutes record elemental information useful for tracing transitions between freshwater and marine environments. Moreover, mirror-image-like Sr:Ca profiles were observed across the dual-zone structuring of the scute that may indicate duplication of the microchemical profile in a single structure. Additional element:calcium ratios measured in natal regions of dorsal scutes (Ba:Ca, Mg:Ca) suggest the potential for further refinement of techniques for identification of river systems of natal origin. In combination, our results provide proof of concept that dorsal scutes possess the necessary properties to be used as structures for reconstructions of past habitat use in sturgeons. Importantly, scutes may be collected non-lethally and with less injury than current structures, like otoliths and fin spines, affording an opportunity for broader application of microchemical techniques.

  5. Cosmology with hybrid expansion law: scalar field reconstruction of cosmic history and observational constraints

    International Nuclear Information System (INIS)

    Akarsu, Özgür; Kumar, Suresh; Myrzakulov, R.; Sami, M.; Xu, Lixin

    2014-01-01

    In this paper, we consider a simple form of expansion history of Universe referred to as the hybrid expansion law - a product of power-law and exponential type of functions. The ansatz by construction mimics the power-law and de Sitter cosmologies as special cases but also provides an elegant description of the transition from deceleration to cosmic acceleration. We point out the Brans-Dicke realization of the cosmic history under consideration. We construct potentials for quintessence, phantom and tachyon fields, which can give rise to the hybrid expansion law in general relativity. We investigate observational constraints on the model with hybrid expansion law applied to late time acceleration as well as to early Universe a la nucleosynthesis
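The hybrid expansion law named in this abstract can be sketched explicitly (the normalization below follows the commonly quoted form; $a_0$ and $t_0$ denote present-day values, and $\alpha, \beta \ge 0$ are free parameters):

```latex
a(t) = a_0 \left(\frac{t}{t_0}\right)^{\alpha}
       \exp\!\left[\beta\left(\frac{t}{t_0}-1\right)\right],
\qquad
H = \frac{\dot a}{a} = \frac{\alpha}{t} + \frac{\beta}{t_0},
\qquad
q = -1 - \frac{\dot H}{H^2} = -1 + \frac{\alpha}{\left(\alpha + \beta t/t_0\right)^2}.
```

Setting $\beta = 0$ recovers the power-law cosmology with constant $q = 1/\alpha - 1$, while $\alpha = 0$ gives de Sitter expansion ($q = -1$); for $0 < \alpha < 1$ and $\beta > 0$ the deceleration parameter evolves smoothly from positive to negative values, which is the deceleration-to-acceleration transition the abstract describes.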

  6. Reconstruction of burial history, temperature, source rock maturity and hydrocarbon generation in the northwestern Dutch offshore

    OpenAIRE

    Abdul Fattah, R.; Verweij, J.M.; Witmans, N.; Veen, J.H. ten

    2012-01-01

    3D basin modelling is used to investigate the history of maturation and hydrocarbon generation on the main platforms in the northwestern part of the offshore area of the Netherlands. The study area covers the Cleaverbank and Elbow Spit Platforms. Recently compiled maps and data are used to build the input geological model. An updated and refined palaeo water depth curve and newly refined sediment water interface temperatures (SWIT) are used in the simulation. Basal heat flow is calculated usi...

  7. History of consumer demand theory 1871-1971: A Neo-Kantian rational reconstruction

    OpenAIRE

    Ivan Moscati

    2005-01-01

    This paper examines the history of the neoclassical theory of consumer demand from 1871 to 1971 by bringing into play the knowledge theory of the Marburg School, a Neo-Kantian philosophical movement. The work aims to show the usefulness of a Marburg-inspired epistemology in rationalizing the development of consumer analysis and, more generally, to understand the principles that regulate the process of knowing in neoclassical economics.

  8. DNA analysis of ancient dogs of the Americas: identifying possible founding haplotypes and reconstructing population histories.

    Science.gov (United States)

    Witt, Kelsey E; Judd, Kathleen; Kitchen, Andrew; Grier, Colin; Kohler, Timothy A; Ortman, Scott G; Kemp, Brian M; Malhi, Ripan S

    2015-02-01

    As dogs have traveled with humans to every continent, they can potentially serve as an excellent proxy when studying human migration history. Past genetic studies into the origins of Native American dogs have used portions of the hypervariable region (HVR) of mitochondrial DNA (mtDNA) to indicate that prior to European contact the dogs of Native Americans originated in Eurasia. In this study, we summarize past DNA studies of both humans and dogs to discuss their population histories in the Americas. We then sequenced a portion of the mtDNA HVR of 42 pre-Columbian dogs from three sites located in Illinois, coastal British Columbia, and Colorado, and identify four novel dog mtDNA haplotypes. Next, we analyzed a dataset comprised of all available ancient dog sequences from the Americas to infer the pre-Columbian population history of dogs in the Americas. Interestingly, we found low levels of genetic diversity for some populations consistent with the possibility of deliberate breeding practices. Furthermore, we identified multiple putative founding haplotypes in addition to dog haplotypes that closely resemble those of wolves, suggesting admixture with North American wolves or perhaps a second domestication of canids in the Americas. Notably, initial effective population size estimates suggest at least 1000 female dogs likely existed in the Americas at the time of the first known canid burial, and that population size increased gradually over time before stabilizing roughly 1200 years before present. Copyright © 2014 Elsevier Ltd. All rights reserved.

9. Reconstructing and analyzing China's fifty-nine year (1951–2009) drought history using hydrological model simulation

    Directory of Open Access Journals (Sweden)

    Z. Y. Wu

    2011-09-01

Full Text Available The 1951–2009 drought history of China is reconstructed using daily soil moisture values generated by the Variable Infiltration Capacity (VIC) land surface macroscale hydrology model. VIC is applied over a grid of 10 458 points with a spatial resolution of 30 km × 30 km, and is driven by observed daily maximum and minimum air temperature and precipitation from 624 long-term meteorological stations. The VIC soil moisture is used to calculate the Soil Moisture Anomaly Percentage Index (SMAPI), which can be used as a measure of the severity of agricultural drought on a global basis. We have developed a SMAPI-based drought identification procedure for practical use in the identification of both grid-point and regional drought events. As a result, a total of 325 regional drought events varying in time and strength are identified from China's nine drought study regions. These drought events can thus be assessed quantitatively at different spatial and temporal scales. The result shows that the severe drought events of 1978, 2000 and 2006 are well reconstructed, which indicates that the SMAPI is capable of identifying the onset of a drought event, its progression, as well as its termination. Spatial and temporal variations of droughts in China's nine drought study regions are studied. Our result shows that on average, up to 30% of the total area of China is prone to drought. Regionally, an upward trend in drought-affected areas has been detected in three regions (Inner Mongolia, Northeast and North) from 1951–2009. However, the decadal variability of droughts has been weak in the remaining five regions (South, Southwest, East, Northwest, and Tibet). Xinjiang has even been growing steadily wetter since the 1950s. Two regional dry centres are discovered in China as the result of a combined analysis of the occurrence of drought events from both grid points and drought study regions. The first centre is located in the area partially covered by the North
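The SMAPI used above is conventionally defined as the percentage departure of soil moisture from its climatological mean for the same location and time of year. A minimal sketch with hypothetical numbers (the authors' event-identification thresholds and regional pooling are not reproduced here):

```python
import numpy as np

def smapi(sm, sm_clim):
    """Soil Moisture Anomaly Percentage Index: percentage departure
    of soil moisture from its climatological mean (elementwise)."""
    sm = np.asarray(sm, dtype=float)
    sm_clim = np.asarray(sm_clim, dtype=float)
    return 100.0 * (sm - sm_clim) / sm_clim

# Hypothetical daily soil-moisture values [mm] against a 30 mm climatology:
# a dry day, a normal day, and a wet day.
values = smapi([24.0, 30.0, 36.0], [30.0, 30.0, 30.0])
print(values)  # [-20.   0.  20.]
```

Negative SMAPI values flag moisture deficits, so a drought event at a grid point corresponds to a sustained run of sufficiently negative values.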

  10. Organic molecular paleohypsometry: A new approach to reconstructing the paleoelevation history of an orogen

    Science.gov (United States)

    Hren, M. T.; Ouimet, W. B.

    2017-12-01

Paleoelevation data are critical to understanding the links and feedbacks between rock-uplift and erosion, yet few approaches have proved successful in quantifying changes in paleoelevation in rapidly eroding, tropical landscapes. In addition, quantitative methods of reconstructing paleoelevation from marine sedimentary archives are lacking. Here we present a new approach to quantifying changes in paleoelevation that is based on the geochemical signature of organic matter exported via the main river networks of an orogen. This new approach builds on fundamentals of stable isotope paleoaltimetry and is akin to the theory behind cosmogenic isotope records of catchment-integrated erosion. Specifically, we utilize predictable patterns of precipitation and organic molecular biomarker stable isotopes to relate the hypsometry of organic matter in a catchment to the geochemical signal in exported organic carbon. We present data from two sites (the cold temperate White Mountains of New Hampshire, USA and the tropical, rapidly eroding landscape of Taiwan) to demonstrate this relationship between exported carbon geochemistry and catchment hypsometry and the validity of this approach.

  11. Reconstructing depositional processes and history from reservoir stratigraphy: Englebright Lake, Yuba River, northern California

    Science.gov (United States)

    Snyder, N.P.; Wright, S.A.; Alpers, Charles N.; Flint, L.E.; Holmes, C.W.; Rubin, D.M.

    2006-01-01

Reservoirs provide the opportunity to link watershed history with its stratigraphic record. We analyze sediment cores from a northern California reservoir in the context of hydrologic history, watershed management, and depositional processes. Observations of recent depositional patterns, sediment-transport calculations, and 137Cs geochronology support a conceptual model in which the reservoir delta progrades during floods of short duration (days) and is modified during prolonged (weeks to months) drawdowns that rework topset beds and transport sand from topsets to foresets. Sediment coarser than 0.25-0.5 mm deposits in foresets and topsets, and finer material falls out of suspension as bottomset beds. Simple hydraulic calculations indicate that fine sand (0.063-0.5 mm) is transported into the distal bottomset area only during floods. The overall stratigraphy suggests that two phases of delta building occurred in the reservoir. The first, from dam construction in 1940 to 1970, was heavily influenced by annual, prolonged >20 m drawdowns of the water level. The second, built on top of the first, reflects sedimentation from 1970 to 2002, when the influence of drawdowns was less. Sedimentation rates in the central part of the reservoir have declined ~25% since 1970, likely reflecting a combination of fewer large floods, changes in watershed management, and winnowing of stored hydraulic mining sediment. Copyright 2006 by the American Geophysical Union.

  12. Fiction as Reconstruction of History: Narratives of the Civil War in American Literature

    Directory of Open Access Journals (Sweden)

    Reinhard Isensee

    2009-09-01

    Full Text Available Even after more than 140 years, the American Civil War continues to serve as a major source of inspiration for a plethora of literature in various genres. While only amounting to a brief period in American history in terms of years, this war has proved to be one of the central moments for defining the American nation since the second half of the nineteenth century. The facets of the Civil War, its protagonists, places, events, and political, social and cultural underpinnings seem to hold an ongoing fascination for both academic studies and fictional representations. Thus, it has been considered by many to be the most written-about war in the United States.

  13. RECONSTRUCTING THE ACCRETION HISTORY OF THE GALACTIC STELLAR HALO FROM CHEMICAL ABUNDANCE RATIO DISTRIBUTIONS

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Duane M. [Key Laboratory for Research in Galaxies and Cosmology, Shanghai Astronomical Observatory, Chinese Academy of Sciences, 80 Nandan Road, Shanghai 200030 (China); Johnston, Kathryn V. [Department of Astronomy, Columbia University, New York City, NY 10027 (United States); Sen, Bodhisattva; Jessop, Will, E-mail: duane@shao.ac.cn [Department of Statistics, Columbia University, New York City, NY 10027 (United States)

    2015-03-20

    Observational studies of halo stars during the past two decades have placed some limits on the quantity and nature of accreted dwarf galaxy contributions to the Milky Way (MW) stellar halo by typically utilizing stellar phase-space information to identify the most recent halo accretion events. In this study we tested the prospects of using 2D chemical abundance ratio distributions (CARDs) found in stars of the stellar halo to determine its formation history. First, we used simulated data from 11 “MW-like” halos to generate satellite template sets (STSs) of 2D CARDs of accreted dwarf satellites, which are composed of accreted dwarfs from various mass regimes and epochs of accretion. Next, we randomly drew samples of ∼10³–10⁴ mock observations of stellar chemical abundance ratios ([α/Fe], [Fe/H]) from those 11 halos to generate samples of the underlying densities for our CARDs to be compared to our templates in our analysis. Finally, we used the expectation-maximization algorithm to derive accretion histories in relation to the STS used and the sample size. For certain STSs used we typically can identify the relative mass contributions of all accreted satellites to within a factor of two. We also find that this method is particularly sensitive to older accretion events involving low-luminosity dwarfs, e.g., ultra-faint dwarfs—precisely those events that are too ancient to be seen by phase-space studies of stars and too faint to be seen by high-z studies of the early universe. Since our results only exploit two chemical dimensions and near-future surveys promise to provide ∼6–9 dimensions, we conclude that these new high-resolution spectroscopic surveys of the stellar halo will allow us to recover its accretion history—and the luminosity function of infalling dwarf galaxies—across cosmic time.
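    The expectation-maximization step described above, estimating the relative mass contributions of accreted satellites as mixture weights over fixed template densities, can be sketched as follows. This is an illustrative reconstruction, not the authors' code; the binned-template representation and all function names here are assumptions.

```python
import numpy as np

def em_mixture_weights(data_bins, templates, n_iter=200):
    """Estimate mixing fractions pi_s for fixed template densities via EM.

    data_bins: (N,) index of the CARD bin each mock star falls in
    templates: (S, B) probability of each bin under each satellite template
               (rows sum to 1)
    Returns the (S,) vector of estimated mass/weight contributions.
    """
    S = templates.shape[0]
    pi = np.full(S, 1.0 / S)           # start from uniform weights
    probs = templates[:, data_bins]    # (S, N) likelihood of each star under each template
    for _ in range(n_iter):
        # E-step: responsibility of template s for each observed star
        weighted = pi[:, None] * probs
        resp = weighted / weighted.sum(axis=0, keepdims=True)
        # M-step: updated weights are the mean responsibilities
        pi = resp.mean(axis=1)
    return pi
```

    With well-separated templates and a few thousand mock stars, the recovered weights approach the true mixing fractions, mirroring the "within a factor of two" recovery discussed in the abstract.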

  14. Accelerating fDOT image reconstruction based on path-history fluorescence Monte Carlo model by using three-level parallel architecture.

    Science.gov (United States)

    Jiang, Xu; Deng, Yong; Luo, Zhaoyang; Luo, Qingming

    2015-10-05

    The excessive time required by fluorescence diffuse optical tomography (fDOT) image reconstruction based on path-history fluorescence Monte Carlo model is its primary limiting factor. Herein, we present a method that accelerates fDOT image reconstruction. We employ three-level parallel architecture including multiple nodes in cluster, multiple cores in central processing unit (CPU), and multiple streaming multiprocessors in graphics processing unit (GPU). Different GPU memories are selectively used, the data-writing time is effectively eliminated, and the data transport per iteration is minimized. Simulation experiments demonstrated that this method can utilize general-purpose computing platforms to efficiently implement and accelerate fDOT image reconstruction, thus providing a practical means of using path-history-based fluorescence Monte Carlo model for fDOT imaging.

  15. A Reconstruction of Development of the Periodic Table Based on History and Philosophy of Science and Its Implications for General Chemistry Textbooks

    Science.gov (United States)

    Brito, Angmary; Rodriguez, Maria A.; Niaz, Mansoor

    2005-01-01

    The objectives of this study are: (a) elaboration of a history and philosophy of science (HPS) framework based on a reconstruction of the development of the periodic table; (b) formulation of seven criteria based on the framework; and (c) evaluation of 57 freshman college-level general chemistry textbooks with respect to the presentation of the…

  16. Reconstructing the Life Histories of Spanish Primary School Teachers: A Novel Approach for the Study of the Teaching Profession and School Culture

    Science.gov (United States)

    Mahamud, Kira; Martínez Ruiz-Funes, María José

    2014-01-01

    This paper describes a study dealing with the reconstruction of the lives of two Spanish primary school teachers during the Franco dictatorship (1939-1975), in order to learn to what extent such a field of research can contribute to the history of education. Two family archives provide extraordinary and unique documentation to track down their…

  17. Late Pliocene Depositional History and Paleoclimate Reconstructions of the Southwest Pacific

    Science.gov (United States)

    Royce, B.; Patterson, M. O.; Pietras, J.

    2017-12-01

    Drift deposits off the eastern margin of New Zealand are important archives for the paleoclimate and paleoceanographic history of the southwest Pacific. Ocean Drilling Program (ODP) Site 1123 is located on the North Chatham rise drift just north of the westerly wind driven Subtropical Front (STF) and provides a record of near-continuous sediment deposition since the Miocene along the southwest Pacific deep western boundary current (DWBC). While the Miocene and Late Pleistocene portions of this record have been well studied, the Late Pliocene record is less well developed. Southern Ocean geological records demonstrate that Late Pliocene cooling spans the transition between the warmer-than-present Early Pliocene and bipolar glaciation at 2.7 Ma. A newly developed, robust, and astronomically tuned long-term record of benthic δ13C from ODP Site 1123 spanning the Early to Late Pliocene implies a reduction in Southern Ocean ventilation and lowering of preformed values from waters sourced along the Antarctic margin during the Late Pliocene. Thus, Late Pliocene Southern Hemisphere cooling and sea ice expansion may have drastically reduced outgassing and increased the burial of heat into the deep ocean. South Atlantic records off the west coast of Africa demonstrate an increase in the flux of iron to the open ocean during this time, potentially enhancing surface ocean productivity and providing an additional cooling mechanism. Currently, atmospheric transport of dust to the Southern Ocean is dominated by persistent mid-latitude circumpolar westerly winds; this is particularly relevant for dust sourced from New Zealand. The Late Pliocene to Early Pleistocene uplift of the North Island axial ranges and South Island Southern Alps potentially provided a greater amount of not only sediment to the deep ocean, but also wind-blown dust to the Pacific sector of the Southern Ocean. We will present a detailed high-resolution sedimentological study on the development of the Chatham

  18. An ontogenetic framework linking locomotion and trabecular bone architecture with applications for reconstructing hominin life history.

    Science.gov (United States)

    Raichlen, David A; Gordon, Adam D; Foster, Adam D; Webber, James T; Sukhdeo, Simone M; Scott, Robert S; Gosman, James H; Ryan, Timothy M

    2015-04-01

    The ontogeny of bipedal walking is considered uniquely challenging, due in part to the balance requirements of single limb support. Thus, locomotor development in humans and our bipedal ancestors may track developmental milestones including the maturation of the neuromuscular control system. Here, we examined the ontogeny of locomotor mechanics in children aged 1-8, and bone growth and development in an age-matched skeletal sample to identify bony markers of locomotor development. We show that step-to-step variation in mediolateral tibia angle relative to the vertical decreases with age, an indication that older children increase stability. Analyses of trabecular bone architecture in the distal tibia of an age-matched skeletal sample (the Norris Farms #36 archaeological skeletal collection) show a bony signal of this shift in locomotor stability. Using a grid of eleven cubic volumes of interest (VOI) in the distal metaphysis of each tibia, we show that the degree of anisotropy (DA) of trabecular struts changes with age. Intra-individual variation in DA across these VOIs is generally high at young ages, likely reflecting variation in loading due to kinematic instability. With increasing age, mean DA converges on higher values and becomes less variable across the distal tibia. We believe the ontogeny of distal tibia trabecular architecture reflects the development of locomotor stability in bipeds. We suggest this novel bony marker of development may be used to assess the relationship between locomotor development and other life history milestones in fossil hominins. Copyright © 2015 Elsevier Ltd. All rights reserved.
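    As a hedged illustration of the trabecular metric used above: the degree of anisotropy (DA) is commonly derived from the eigenvalues of a fabric tensor, for example one obtained from mean-intercept-length analysis of a VOI. The convention below (1 − min/max of the eigenvalues, as in the BoneJ software) is one of several in the literature, and may differ from the one used in this study.

```python
import numpy as np

def degree_of_anisotropy(fabric_tensor):
    """DA from a fabric tensor's eigenvalues (convention: 1 - min/max).

    An isotropic VOI gives DA ~ 0; strongly aligned trabecular struts
    give DA approaching 1.
    """
    w = np.linalg.eigvalsh(np.asarray(fabric_tensor, dtype=float))  # ascending order
    return 1.0 - w[0] / w[-1]
```

    Under this convention, an identity fabric tensor (isotropy) yields DA = 0, while a tensor with one dominant principal direction yields DA close to 1, so converging DA values across VOIs correspond to the locomotor stabilization described above.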

  19. Reconstruction of the cophylogenetic history of related phylogenetic trees with divergence timing information.

    Science.gov (United States)

    Merkle, Daniel; Middendorf, Martin

    2005-04-01

    In this paper, we present a method and a corresponding tool called Tarzan for cophylogeny analysis of phylogenetic trees where the nodes are labelled with divergence timing information. The tool can be used, for example, to infer the common history of hosts and their parasites, of insect-plant relations, or of symbiotic relationships. Our method does the reconciliation analysis using an event-based concept where each event is assigned a cost and cost-minimal solutions are sought. The events used by Tarzan are cospeciations, sortings, duplications, and (host) switches. Unlike existing tools, Tarzan can handle more complex timing information of the phylogenetic trees in the analysis. This is important because several recent studies of cophylogenetic relationships have shown that timing information can be very important for the correct interpretation of results from cophylogenetic analysis. We present two examples (one host-parasite system and one insect-plant system) that show how divergence timing information can be integrated into reconciliation analysis and how this influences the results.

  20. Reconstructing disturbance history for an intensively mined region by time-series analysis of Landsat imagery.

    Science.gov (United States)

    Li, Jing; Zipper, Carl E; Donovan, Patricia F; Wynne, Randolph H; Oliphant, Adam J

    2015-09-01

    Surface mining disturbances have attracted attention globally due to extensive influence on topography, land use, ecosystems, and human populations in mineral-rich regions. We analyzed a time series of Landsat satellite imagery to produce a 28-year disturbance history for surface coal mining in a segment of eastern USA's central Appalachian coalfield, southwestern Virginia. The method was developed and applied as a three-step sequence: vegetation index selection, persistent vegetation identification, and mined-land delineation by year of disturbance. The overall classification accuracy and kappa coefficient were 0.9350 and 0.9252, respectively. Most surface coal mines were identified correctly by location and by time of initial disturbance. More than 8% of southwestern Virginia's >4000 km² coalfield area was disturbed by surface coal mining over the 28-year period. Approximately 19.5% of the Appalachian coalfield surface within the most intensively mined county (Wise County) has been disturbed by mining. Mining disturbances expanded steadily and progressively over the study period. Information generated can be applied to gain further insight concerning mining influences on ecosystems and other essential environmental features.
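    The two headline accuracy figures in this abstract, overall accuracy and the kappa coefficient, are both derived from a classification confusion matrix. A minimal sketch of that standard computation (the function name is illustrative, not from the paper):

```python
import numpy as np

def accuracy_and_kappa(confusion):
    """Overall accuracy and Cohen's kappa from a square confusion matrix.

    confusion[i][j] = number of samples of true class i assigned to class j.
    """
    confusion = np.asarray(confusion, dtype=float)
    total = confusion.sum()
    observed = np.trace(confusion) / total  # overall accuracy
    # chance agreement expected from the row/column marginals
    expected = (confusion.sum(axis=0) * confusion.sum(axis=1)).sum() / total**2
    kappa = (observed - expected) / (1.0 - expected)
    return observed, kappa
```

    Kappa discounts agreement expected by chance, which is why it is reported alongside raw accuracy in land-cover change studies like this one.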

  1. The reconstructive study in archaeology: case histories in communication issues

    Directory of Open Access Journals (Sweden)

    Francesco Gabellone

    2011-09-01

    Full Text Available The most significant results obtained by the Information Technologies Lab (IBAM CNR - ITLab) in the construction of VR-based knowledge platforms have been achieved in projects such as ByHeriNet, Archeotour, Interadria, Interreg Greece-Italy, Iraq Virtual Museum, etc. These projects were guided by the belief that in order to be effective, the process of communicating Cultural Heritage to the wider public should be as free as possible from the sterile old VR interfaces of the 1990s. In operational terms, this translates into solutions that are as lifelike as possible and guarantee the maximum emotional involvement of the viewer, adopting the same techniques as are used in modern cinema. Communication thus becomes entertainment and a vehicle for high-quality content, aimed at the widest possible public and produced with the help of interdisciplinary tools and methods. In this context, high-end technologies are no longer the goal of research; rather they are the invisible engine of an unstoppable process that is making it harder and harder to distinguish between computer images and real objects. An emblematic case in this regard is the reconstructive study of ancient contexts, where three-dimensional graphics compensate for the limited expressive potential of two-dimensional drawings and allow for interpretative and representative solutions that were unimaginable a few years ago. The virtual space thus becomes an important opportunity for reflection and study, as well as constituting a revolutionary way to learn for the wider public.

  2. Reconstruction of El Niño history from reef corals

    Directory of Open Access Journals (Sweden)

    1993-01-01

    Full Text Available Of all the mineral phases found in the marine environment, carbonates of various biological origins have proven to be the richest repository of paleoceanographic and paleoclimatic information. Over the last two decades, a process framework has evolved to permit the interpretation of such information in aragonitic reef corals. Corals comprise an especially useful archive because of their wide distribution, their temporal banding, and a geochemistry well suited to recording environmental information. This article reviews these attributes from the perspective of the historical influences of the El Niño-Southern Oscillation on the tropical Pacific Ocean.

  3. Bayesian integration of radioisotope dating (210Pb, 137Cs, 241Am, 14C) and an 18-20th century mining history of Brotherswater, English Lake District

    Science.gov (United States)

    Schillereff, Daniel; Chiverrell, Richard; Macdonald, Neil; Hooke, Janet; Welsh, Katharine; Piliposyan, Gayane; Appleby, Peter

    2014-05-01

    Lake sediment records are often a useful tool for investigating landscape evolution as geomorphic changes in the catchment are reflected by altered sediment properties in the material transported through the watershed and deposited at the lake bed. Recent research at Brotherswater, an upland waterbody in the Lake District, northwest England, has focused on reconstructing historical floods from their sedimentary signatures and calculating long-term sediment and carbon budgets from fourteen sediment cores extracted from across the basin. Developing accurate chronological control is essential for these tasks. One sediment core (BW11-2; 3.5 m length) from the central basin has been dated using artificial radionuclide measurements (210Pb, 137Cs, 241Am) for the uppermost sediments and radiocarbon (14C) for lower sediments. The core appears to span the past 1500 years, however a number of problems have arisen. We present our explanations for these errors, the independent chronological techniques used to generate an accurate age-depth model for this core and methods for its transferral to the other 13 cores extracted from the basin. Two distinct 137Cs markers, corresponding to the 1986 Chernobyl disaster and 1960s weapons testing, confirm the 210Pb profile for sediment deposition since ~1950, but calculations prior to this appear erroneous, possibly due to a hiatus in the sediment record. We used high-resolution geochemical profiles (measured by XRF) to cross-correlate with a second 210Pb-dated chronology from a more distal location, which returned more sensible results. Unfortunately, the longer 14C sequence exhibits two age-reversals (radiocarbon dates that are too old). We believe the uppermost two dates are erroneous, due to a shift in inflow location as a flood prevention method ~1900 A.D., dated using information from historical maps. The lower age-reversal coincides with greater supply of terrigenous material to the lake (increased Zr, K, Ti concentrations
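    For readers unfamiliar with the 210Pb dating applied in this study, the widely used constant-rate-of-supply (CRS) age model computes the age at each depth from the unsupported 210Pb inventory remaining below it. The sketch below is a generic illustration under assumed inputs (per-slice unsupported activities and dry-mass increments), not the authors' code; note that the base of the deepest slice returns infinity, since in practice the inventory below the dated section must be estimated separately.

```python
import numpy as np

PB210_LAMBDA = np.log(2) / 22.3  # 210Pb decay constant (yr^-1), half-life 22.3 yr

def crs_ages(unsupported_activity, dry_mass_increments):
    """Constant Rate of Supply (CRS) 210Pb age model.

    unsupported_activity: unsupported 210Pb activity of each core slice
                          (Bq/kg), top slice first
    dry_mass_increments:  dry mass per unit area of each slice (kg/m^2)
    Returns the age (years before coring) at the base of each slice:
    t(z) = (1/lambda) * ln(A(0) / A(z)), with A(z) the inventory below z.
    """
    inventory = np.asarray(unsupported_activity) * np.asarray(dry_mass_increments)
    total = inventory.sum()                  # A(0): whole-core inventory (Bq/m^2)
    below = total - np.cumsum(inventory)     # A(z): inventory below each slice base
    with np.errstate(divide="ignore"):
        return np.log(total / below) / PB210_LAMBDA
```

    Independent markers such as the 1963 weapons-testing and 1986 Chernobyl 137Cs peaks are then used, as in this study, to check or correct the CRS chronology.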

  4. xenoGI: reconstructing the history of genomic island insertions in clades of closely related bacteria.

    Science.gov (United States)

    Bush, Eliot C; Clark, Anne E; DeRanek, Carissa A; Eng, Alexander; Forman, Juliet; Heath, Kevin; Lee, Alexander B; Stoebel, Daniel M; Wang, Zunyan; Wilber, Matthew; Wu, Helen

    2018-02-05

    Genomic islands play an important role in microbial genome evolution, providing a mechanism for strains to adapt to new ecological conditions. A variety of computational methods, both genome-composition based and comparative, have been developed to identify them. Some of these methods are explicitly designed to work in single strains, while others make use of multiple strains. In general, existing methods do not identify islands in the context of the phylogeny in which they evolved. Even multiple strain approaches are best suited to identifying genomic islands that are present in one strain but absent in others. They do not automatically recognize islands which are shared between some strains in the clade or determine the branch on which these islands inserted within the phylogenetic tree. We have developed a software package, xenoGI, that identifies genomic islands and maps their origin within a clade of closely related bacteria, determining which branch they inserted on. It takes as input a set of sequenced genomes and a tree specifying their phylogenetic relationships. Making heavy use of synteny information, the package builds gene families in a species-tree-aware way, and then attempts to combine into islands those families whose members are adjacent and whose most recent common ancestor is shared. The package provides a variety of text-based analysis functions, as well as the ability to export genomic islands into formats suitable for viewing in a genome browser. We demonstrate the capabilities of the package with several examples from enteric bacteria, including an examination of the evolution of the acid fitness island in the genus Escherichia. In addition we use output from simulations and a set of known genomic islands from the literature to show that xenoGI can accurately identify genomic islands and place them on a phylogenetic tree. xenoGI is an effective tool for studying the history of genomic island insertions in a clade of microbes. 
It identifies

  5. Reconstructing Watershed History from Reservoir Stratigraphy: Englebright Lake, Yuba River, Northern California

    Science.gov (United States)

    Snyder, N. P.; Alpers, C. N.; Childs, J. R.; Curtis, J. A.; Flint, L. E.; Holmes, C. W.; Rubin, D. M.; Wright, S. A.

    2004-12-01

    Reservoirs provide the opportunity to study fluvial processes and rates in a controlled setting because they are effective traps of sediment and are often well monitored. An extensive sediment coring and sampling campaign was done in Englebright Lake on the Yuba River in northern California as part of a fish-habitat restoration study. The Yuba watershed (particularly the southern part) was the site of intensive hydraulic gold mining in the 19th and early 20th century, and Englebright Dam was built in 1940 to trap mining debris. Results of a bathymetric survey in 2001 indicate that the reservoir was 26% full (22×10⁶ m³ of material). The physical properties of the entire deposit were extrapolated from ~300 m of cores collected at 7 sites along the longitudinal axis of the reservoir in 2002. The mass of the deposit is 26×10⁶ metric tons, of which 3.2% is organic. The sediment is ~65% sand and gravel, and distinct layers of differing grain size (sand-gravel, silt-clay, organics) are well preserved in the cores. The depositional chronology of the reservoir was established using 137Cs analysis and the relations between the cored stratigraphy and the hydrologic and impoundment history of the watershed. Deposits from three major flood events (1955, 1964, 1997; each with discharge >3,400 m³/s) were identified in the stratigraphy of most of the coring sites. Observations of recent (post-1997) depositional patterns are guiding the development of a conceptual model of reservoir-sedimentation processes during floods, drawdowns, and intraflood periods. Enlargement of an upstream dam on the North Yuba River in 1970 caused a decrease in flood frequency in the Yuba River and changed management of Englebright Lake (ending annual drawdowns). A relict topset-foreset-bottomset sequence observed in the cored stratigraphy is interpreted to correlate with this change in watershed management; a second deltaic sequence was deposited on top of the first after 1970.
Post-1970 average annual

  6. Prior approval: the growth of Bayesian methods in psychology.

    Science.gov (United States)

    Andrews, Mark; Baguley, Thom

    2013-02-01

    Within the last few years, Bayesian methods of data analysis in psychology have proliferated. In this paper, we briefly review the history or the Bayesian approach to statistics, and consider the implications that Bayesian methods have for the theory and practice of data analysis in psychology.

  7. Role of stable isotope analyses in reconstructing past life-histories and the provenancing of human skeletal remains: a review

    Directory of Open Access Journals (Sweden)

    Sehrawat Jagmahender Singh

    2017-09-01

    Full Text Available This article reviews the present scenario of the use of stable isotopes (mainly δ13C, δ15N, δ18O, 87Sr) to trace past life behaviours like breast feeding and weaning practices, the geographic origin, migration history, paleodiet and subsistence patterns of past populations from the chemical signatures of isotopes imprinted in human skeletal remains. This approach is based on the premise that food-web isotopic signatures are seen in human bones and teeth and that such signatures change in parallel with a variety of biogeochemical processes. By measuring δ13C and δ15N isotopic values of subadult tissues of different ages, the level of breast milk ingestion at particular ages and the components of the complementary foods can be assessed. Strontium and oxygen isotopic analyses have been used for determining the geographic origins and reconstructing the way of life of past populations, as these isotopes can map the isotopic outline of the area from where the person acquired water and food during early life. The strontium and oxygen isotopic values are considered specific to geographical areas and serve as reliable chemical signatures of the migration history of past human populations (local or non-local to the site). Previous isotopic studies show that the subsistence patterns of past human populations underwent extensive changes, from nomadic strategies to complete agricultural dependence. The carbon and nitrogen isotopic values of the local fauna of any archaeological site can be used to elucidate the prominence of freshwater resources in the diet of the past human populations found near the site. More extensive research covering isotopic descriptions of various prehistoric, historic and modern populations is needed to explore the role of stable isotope analysis in provenancing human skeletal remains and assessing human migration patterns/routes, geographic origins, paleodiet and subsistence practices of past populations.

  8. Cenozoic basin thermal history reconstruction and petroleum systems in the eastern Colombian Andes

    Science.gov (United States)

    Parra, Mauricio; Mora, Andres; Ketcham, Richard A.; Stockli, Daniel F.; Almendral, Ariel

    2017-04-01

    Late Mesozoic-Cenozoic retro-arc foreland basins along the eastern margin of the Andes in South America host the world's best detrital record for the study of subduction orogenesis. There, the world's most prolific petroleum systems occur in the northernmost of these foreland basin systems, in Ecuador, Colombia and Venezuela, yet over 90% of the hydrocarbons discovered there occur in a single province in northeastern Venezuela. A successful industry-academia collaboration applied a multidisciplinary approach to the study of the northern Andes with the aim of investigating both the driving mechanisms of orogenesis and their impact on hydrocarbon accumulation in eastern Colombia. The Eastern Cordillera is an inversion orogen located at the leading edge of the northern Andes. Syn-rift subsidence favored the accumulation of km-thick organic-matter-rich shales in a back-arc basin in the early Cretaceous. Subsequent late Cretaceous thermal subsidence prompted the accumulation of shallow marine sandstones and shales, the latter including the Turonian-Cenomanian main hydrocarbon source rock. Early Andean uplift since the Paleocene led to development of a flexural basin, filled with mainly non-marine strata. We have studied the Meso-Cenozoic thermal evolution of these basins through modeling of a large thermochronometric database including hundreds of apatite and zircon fission-track and (U-Th)/He data, as well as paleothermometric information based on vitrinite reflectance and present-day temperatures measured in boreholes. The detrital record of Andean construction was also investigated through detrital zircon U-Pb geochronometry in outcrop and borehole samples. A comprehensive burial/exhumation history has been accomplished through three main modeling strategies. First, one-dimensional subsidence was used to invert the pre-extensional lithospheric thicknesses, the magnitude of stretching, and the resulting heat flow associated with extension.
The amount of eroded section and

  9. Reconstruction of the pollution history of alkylphenols (4-tert-octylphenol, 4-nonylphenol) in the Baltic Sea.

    Science.gov (United States)

    Graca, Bożena; Staniszewska, Marta; Zakrzewska, Danuta; Zalewska, Tamara

    2016-06-01

    This paper reports the reconstruction of the pollution history of 4-tert-octylphenol (OP) and 4-nonylphenol (NP) in the Baltic Sea. Alkylphenols are endocrine-disrupting compounds and therefore toxic to aquatic organisms. Sediment cores were collected from regions with relatively stable sedimentation conditions. The cores were dated by the 210Pb method. OP and NP were determined using HPLC-FL. The highest inventory of these compounds was observed in the Gotland Deep (610 μg m⁻² of NP and 47 μg m⁻² of OP) and the lowest on the slope of the Gdansk Deep (24 μg m⁻² of NP and 16 μg m⁻² of OP). Such spatial distribution was probably, among other factors, the result of the uplift of the sea floor. The pollution trends of OP and NP in sediments coincided with the following: (1) the beginnings of eutrophication (1960s/1970s of the twentieth century) and (2) a strong increase in the areal extent and volume of hypoxia and anoxia in the Baltic (present century).

  10. PROBING THE EXPANSION HISTORY OF THE UNIVERSE BY MODEL-INDEPENDENT RECONSTRUCTION FROM SUPERNOVAE AND GAMMA-RAY BURST MEASUREMENTS

    Energy Technology Data Exchange (ETDEWEB)

    Feng, Chao-Jun; Li, Xin-Zhou, E-mail: fengcj@shnu.edu.cn, E-mail: kychz@shnu.edu.cn [Shanghai United Center for Astrophysics (SUCA), Shanghai Normal University, 100 Guilin Road, Shanghai 200234 (China)

    2016-04-10

    To probe the late evolution history of the universe, we adopt two kinds of optimal basis systems. One of them is constructed by performing principal component analysis, and the other is built by taking the multidimensional scaling approach. Cosmological observables such as the luminosity distance can be decomposed into these basis systems. These basis systems are optimized for different kinds of cosmological models that are based on different physical assumptions, even for a mixture model of them. Therefore, the so-called feature space that is projected from the basis systems is cosmological model independent, and it provides a parameterization for studying and reconstructing the Hubble expansion rate from the supernova luminosity distance and even gamma-ray burst (GRB) data with self-calibration. The circular problem when using GRBs as cosmological candles is naturally eliminated in this procedure. By using the Levenberg–Marquardt technique and the Markov Chain Monte Carlo method, we perform an observational constraint on this kind of parameterization. The data we used include the “joint light-curve analysis” data set that consists of 740 Type Ia supernovae and 109 long GRBs with the well-known Amati relation.
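    The basis-decomposition idea above, projecting an observable such as the luminosity distance onto an optimized basis to obtain a model-independent "feature space", can be illustrated with plain principal component analysis via the SVD. The paper's bases are further tailored to model families; the function names and data layout below are illustrative assumptions.

```python
import numpy as np

def pca_basis(model_curves, n_components=3):
    """Build an orthonormal basis from a set of model observable curves.

    model_curves: (M, Z) array, one curve (e.g. distance modulus) per
    candidate cosmological model, sampled on a common redshift grid.
    Returns the mean curve and the leading principal modes.
    """
    mean = model_curves.mean(axis=0)
    # Rows of vt are orthonormal principal directions of the centered curves
    _, _, vt = np.linalg.svd(model_curves - mean, full_matrices=False)
    return mean, vt[:n_components]

def project(curve, mean, modes):
    """Coordinates of an observed curve in the reduced feature space."""
    return modes @ (curve - mean)
```

    An observed supernova or GRB distance curve is then represented by a handful of coefficients in this feature space, independent of which model generated the basis.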

  11. Bayesian programming

    CERN Document Server

    Bessiere, Pierre; Ahuactzin, Juan Manuel; Mekhnacha, Kamel

    2013-01-01

    Probability as an Alternative to Boolean Logic. While logic is the mathematical foundation of rational reasoning and the fundamental principle of computing, it is restricted to problems where information is both complete and certain. However, many real-world problems, from financial investments to email filtering, are incomplete or uncertain in nature. Probability theory and Bayesian computing together provide an alternative framework to deal with incomplete and uncertain data. Decision-Making Tools and Methods for Incomplete and Uncertain Data. Emphasizing probability as an alternative to Boolean

  12. Reconstructing land use history from Landsat time-series. Case study of a swidden agriculture system in Brazil

    Science.gov (United States)

    Dutrieux, Loïc P.; Jakovac, Catarina C.; Latifah, Siti H.; Kooistra, Lammert

    2016-05-01

    We developed a method to reconstruct land use history from Landsat image time-series. The method uses a breakpoint detection framework derived from the econometrics field and applicable to time-series regression models. The Breaks For Additive Season and Trend (BFAST) framework is used to define the time-series regression models, which may contain trend and phenology terms, hence appropriately modelling intra- and inter-annual vegetation dynamics. All available Landsat data are used for a selected study area, and the time-series are partitioned into segments delimited by breakpoints. Segments can be associated with land use regimes, while the breakpoints then correspond to shifts in land use regimes. In order to further characterize these shifts, we classified the unlabelled breakpoints returned by the algorithm into their corresponding processes. We used a Random Forest classifier, trained from a set of visually interpreted time-series profiles, to infer the processes and assign labels to the breakpoints. The whole approach was applied to quantifying the number of cultivation cycles in a swidden agriculture system in Brazil (state of Amazonas). The number and frequency of cultivation cycles are of particular ecological relevance in these systems since they largely affect the capacity of the forest to regenerate after land abandonment. We applied the method to a Landsat time-series of the Normalized Difference Moisture Index (NDMI) spanning the 1984-2015 period and derived from it the number of cultivation cycles during that period at the individual field scale. Agricultural field boundaries used to apply the method were derived using a multi-temporal segmentation approach. We validated the number of cultivation cycles predicted by the method against in-situ information collected from farmers' interviews, resulting in a Normalized Root Mean Squared Error (NRMSE) of 0.25. Overall the method performed well, producing maps with coherent spatial patterns. We identified
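
    The segmentation idea can be illustrated with a toy single-breakpoint search: an exhaustive least-squares split on a synthetic NDMI-like series, standing in for the full BFAST framework (the series and noise level are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic NDMI-like series: stable forest, then a clearing at index 60.
t = np.arange(120)
series = np.where(t < 60, 0.6, 0.2) + rng.normal(0, 0.03, t.size)

def best_breakpoint(y, min_seg=5):
    """Exhaustive search for the single break minimising the residual sum
    of squares of two piecewise-constant segments -- a toy stand-in for
    BFAST segmentation, which additionally models trend and phenology."""
    best, best_sse = None, np.inf
    for b in range(min_seg, y.size - min_seg):
        sse = ((y[:b] - y[:b].mean()) ** 2).sum() \
            + ((y[b:] - y[b:].mean()) ** 2).sum()
        if sse < best_sse:
            best, best_sse = b, sse
    return best

break_idx = best_breakpoint(series)    # recovered near the true break at 60
```

    In the paper each recovered breakpoint would then be classified into a process (e.g. clearing or regrowth onset) by the Random Forest step.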

  13. Reconstructing Land Use History from Landsat Time-Series. Case study of Swidden Agriculture Intensification in Brazil

    Science.gov (United States)

    Dutrieux, L.; Jakovac, C. C.; Siti, L. H.; Kooistra, L.

    2015-12-01

    We developed a method to reconstruct land use history from Landsat image time-series. The method uses a breakpoint detection framework derived from the econometrics field and applicable to time-series regression models. The BFAST framework is used to define the time-series regression models, which may contain trend and phenology terms, hence appropriately modelling intra- and inter-annual vegetation dynamics. All available Landsat data are used, and the time-series are partitioned into segments delimited by breakpoints. Segments can be associated with land use regimes, while the breakpoints then correspond to shifts in regimes. To further characterize these shifts, we classified the unlabelled breakpoints returned by the algorithm into their corresponding processes. We used a Random Forest classifier, trained from a set of visually interpreted time-series profiles, to infer the processes and assign labels to the breakpoints. The whole approach was applied to quantifying the number of cultivation cycles in a swidden agriculture system in Brazil. The number and frequency of cultivation cycles are of particular ecological relevance in these systems since they largely affect the capacity of the forest to regenerate after abandonment. We applied the method to a Landsat time-series of the Normalized Difference Moisture Index (NDMI) spanning the 1984-2015 period and derived from it the number of cultivation cycles during that period at the individual field scale. Agricultural field boundaries used to apply the method were derived using a multi-temporal segmentation. We validated the number of cultivation cycles predicted against in-situ information collected from farmers' interviews, resulting in a Normalized RMSE of 0.25. Overall the method performed well, producing maps with coherent patterns. We identified various sources of error in the approach, including low data availability in the 1990s and sub-object mixture of land uses.
We conclude that the method holds great promise for

  14. Reconstructing the Evolutionary History of Powdery Mildew Lineages (Blumeria graminis) at Different Evolutionary Time Scales with NGS Data.

    Science.gov (United States)

    Menardo, Fabrizio; Wicker, Thomas; Keller, Beat

    2017-02-01

    Blumeria graminis (Ascomycota) includes fungal pathogens that infect numerous grasses and cereals. Despite its economic impact on agriculture and its scientific importance in plant-pathogen interaction studies, the evolution of different lineages with different host ranges is poorly understood. Moreover, the taxonomy of grass powdery mildew is rather exceptional: there is only one described species (B. graminis), subdivided into different formae speciales (ff.spp.), which are defined by their host range. In this study we applied phylogenomic and population genomic methods to whole genome sequence data of 31 isolates of B. graminis belonging to different ff.spp. and reconstructed the evolutionary relationships between different lineages. The results of the phylogenomic analysis support a pattern of co-evolution between some of the ff.spp. and their host plants. In addition, we identified exceptions to this pattern, namely host jump events and the recent radiation of a clade less than 280,000 years ago. Furthermore, we found a high level of gene tree incongruence localized in the youngest clade. To distinguish between incomplete lineage sorting and lateral gene flow, we applied a coalescent-based method of demographic inference and found evidence of horizontal gene flow between recently diverged lineages. Overall, we found that different processes shaped the diversification of B. graminis: co-evolution with the host species, host jumps and fast radiation. Our study is an example of how genomic data can resolve complex evolutionary histories of cryptic lineages at different time scales, dealing with incomplete lineage sorting and lateral gene flow. © The Author(s) 2017. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.

  15. Bayesian methods outperform parsimony but at the expense of precision in the estimation of phylogeny from discrete morphological data.

    Science.gov (United States)

    O'Reilly, Joseph E; Puttick, Mark N; Parry, Luke; Tanner, Alastair R; Tarver, James E; Fleming, James; Pisani, Davide; Donoghue, Philip C J

    2016-04-01

    Different analytical methods can yield competing interpretations of evolutionary history and, currently, there is no definitive method for phylogenetic reconstruction using morphological data. Parsimony has been the primary method for analysing morphological data, but there has been a resurgence of interest in the likelihood-based Mk-model. Here, we test the performance of the Bayesian implementation of the Mk-model relative to both equal and implied-weight implementations of parsimony. Using simulated morphological data, we demonstrate that the Mk-model outperforms equal-weights parsimony in terms of topological accuracy, and that implied-weights parsimony performs most poorly. However, the Mk-model produces phylogenies that have less resolution than parsimony methods. This difference in the accuracy and precision of parsimony and Bayesian approaches to topology estimation needs to be considered when selecting a method for phylogeny reconstruction. © 2016 The Authors.

  16. Bayesian Compressed Sensing with Unknown Measurement Noise Level

    DEFF Research Database (Denmark)

    Hansen, Thomas Lundgaard; Jørgensen, Peter Bjørn; Pedersen, Niels Lovmand

    2013-01-01

    In sparse Bayesian learning (SBL) approximate Bayesian inference is applied to find sparse estimates from observations corrupted by additive noise. Current literature only vaguely considers the case where the noise level is unknown a priori. We show that for most state-of-the-art reconstruction a...

  17. A decision-tree-based method for reconstructing disturbance history in the Russian boreal forests over 30 years

    Science.gov (United States)

    Chen, D.; Loboda, T. V.

    2012-12-01

    The boreal forest is one of the largest biomes on Earth and is of crucial significance in numerous respects. Located in the high-latitude region of the Northern Hemisphere, the boreal forest is predicted to be among the regions most affected by the changing climate, which may impose profound impacts on the global carbon and energy budget. Approximately two thirds of the entire boreal biome consists of the Russian boreal forest, which is also the largest forested zone in the world. Fire and logging have been the predominant disturbance types in the Russian boreal forest, accelerating the release of carbon into the atmosphere. To better understand these processes, records of past disturbance are in great need. However, there has been no comprehensive and unbiased multi-decadal record of forest disturbance in this region. This paper illustrates a method for reconstructing disturbance history in the Russian boreal forests over 30 years. This method takes advantage of data from both Landsat, which has a long data record but limited spatial coverage, and the Moderate Resolution Imaging Spectroradiometer (MODIS), which has wall-to-wall spatial coverage but a limited period of observations. We developed a standardized and semi-automated approach to extract training and validation data samples from Landsat imagery. Landsat data, dating back to 1984, were used to generate maps of forest disturbance using temporal shifts in the Disturbance Index through the multi-temporal stack of imagery in selected locations. The disturbed forests are attributed to logging or burning by means of visual examination. The Landsat-based disturbance maps are then used as reference data to train a decision tree classifier on 2003 MODIS data. This classifier utilizes multiple direct MODIS products including the BRDF-adjusted surface reflectance, a suite of vegetation indices, and land surface temperature. The algorithm also capitalizes on seasonal variability in class
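
    The final classification step, separating burned from logged pixels, can be sketched with a one-node decision tree on made-up MODIS-like features (the feature values and class separations below are assumptions for illustration, not real MODIS products):

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical MODIS-like features per disturbed pixel:
# column 0 = reflectance change, column 1 = land surface temperature anomaly.
n = 200
burned = np.column_stack([rng.normal(0.30, 0.05, n), rng.normal(4.0, 1.0, n)])
logged = np.column_stack([rng.normal(0.25, 0.05, n), rng.normal(0.5, 1.0, n)])
X = np.vstack([burned, logged])
y = np.array([1] * n + [0] * n)        # 1 = burned, 0 = logged

def stump(feature, labels):
    """Single decision-tree node: choose the threshold that maximises
    training accuracy (a real classifier would grow a full tree)."""
    best_t, best_acc = None, 0.0
    for t in np.unique(feature):
        acc = max(((feature > t) == labels).mean(),
                  ((feature <= t) == labels).mean())
        if acc > best_acc:
            best_t, best_acc = t, acc
    return best_t, best_acc

threshold, accuracy = stump(X[:, 1], y)   # split on temperature anomaly
```

    Here burning is separable mainly by the thermal anomaly, which is the intuition behind combining reflectance and land surface temperature products in the classifier.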

  18. Reconstruction of the Evolutionary History and Dispersal of Usutu Virus, a Neglected Emerging Arbovirus in Europe and Africa.

    Science.gov (United States)

    Engel, Dimitri; Jöst, Hanna; Wink, Michael; Börstler, Jessica; Bosch, Stefan; Garigliany, Mutien-Marie; Jöst, Artur; Czajka, Christina; Lühken, Renke; Ziegler, Ute; Groschup, Martin H; Pfeffer, Martin; Becker, Norbert; Cadar, Daniel; Schmidt-Chanasit, Jonas

    2016-02-02

    Usutu virus (USUV), one of the most neglected Old World encephalitic flaviviruses, causes epizootics among wild and captive birds and sporadic infection in humans. The dynamics of USUV spread and evolution in its natural hosts are unknown. Here, we present the phylogeny and evolutionary history of all available USUV strains, including 77 newly sequenced complete genomes from a variety of host species, at high temporal and spatial resolution. The results showed that USUV can be classified into six distinct lineages and that the most recent common ancestor of the recent European epizootics emerged in Africa at least 500 years ago. We demonstrated that USUV was introduced regularly from Africa into Europe in the last 50 years, and the genetic diversity of European lineages is shaped primarily by in situ evolution, while the African lineages have been driven by extensive gene flow. Most of the amino acid changes are deleterious polymorphisms removed by purifying selection, with adaptive evolution restricted to the NS5 gene and several others evolving under episodic directional selection, indicating that ecological or immunological factors were mostly the key determinants of USUV dispersal and outbreaks. Host-specific mutations have been detected, while the host transition analysis identified mosquitoes as the most likely origin of the common ancestor and birds as the source of the recent European USUV lineages. Our results suggest that the major migratory bird flyways could predict the continental and intercontinental dispersal patterns of USUV and that migratory birds might act as potential long-distance dispersal vehicles. Usutu virus (USUV), a mosquito-borne flavivirus of the Japanese encephalitis virus antigenic group, caused massive bird die-offs, mostly in Europe. There is increasing evidence that USUV appears to be pathogenic for humans, becoming a potential public health problem. The emergence of USUV in Europe allows us to understand how an arbovirus

  19. Introduction to Bayesian statistics

    CERN Document Server

    Bolstad, William M

    2017-01-01

    There is a strong upsurge in the use of Bayesian methods in applied statistical analysis, yet most introductory statistics texts only present frequentist methods. Bayesian statistics has many important advantages that students should learn about if they are going into fields where statistics will be used. In this Third Edition, four newly-added chapters address topics that reflect the rapid advances in the field of Bayesian statistics. The author continues to provide a Bayesian treatment of introductory statistical topics, such as scientific data gathering, discrete random variables, robust Bayesian methods, and Bayesian approaches to inference for discrete random variables, binomial proportion, Poisson, normal mean, and simple linear regression. In addition, newly-developing topics in the field are presented in four new chapters: Bayesian inference with unknown mean and variance; Bayesian inference for Multivariate Normal mean vector; Bayesian inference for Multiple Linear Regression Model; and Computati...

  20. A parametric reconstruction of the deceleration parameter

    Energy Technology Data Exchange (ETDEWEB)

    Al Mamon, Abdulla [Manipal University, Manipal Centre for Natural Sciences, Manipal (India); Visva-Bharati, Department of Physics, Santiniketan (India); Das, Sudipta [Visva-Bharati, Department of Physics, Santiniketan (India)

    2017-07-15

    The present work is based on a parametric reconstruction of the deceleration parameter q(z) in a model for the spatially flat FRW universe filled with dark energy and non-relativistic matter. In cosmology, the parametric reconstruction technique deals with an attempt to build up a model by choosing some specific evolution scenario for a cosmological parameter and then estimate the values of the parameters with the help of different observational datasets. In this paper, we have proposed a logarithmic parametrization of q(z) to probe the evolution history of the universe. Using the type Ia supernova, baryon acoustic oscillation and the cosmic microwave background datasets, the constraints on the arbitrary model parameters q0 and q1 are obtained (within 1σ and 2σ confidence limits) by the χ²-minimization technique. We have then reconstructed the deceleration parameter, the total EoS parameter ω_tot, the jerk parameter, and have compared the reconstructed results of q(z) with other well-known parametrizations of q(z). We have also shown that two model selection criteria (namely, the Akaike information criterion and the Bayesian information criterion) provide a clear indication that our reconstructed model is well consistent with other popular models. (orig.)
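
    The model-selection step can be reproduced with the standard chi-square-based definitions of the two criteria (the fit values below are hypothetical, not the paper's results):

```python
import math

def aic(chi2_min, k):
    """Akaike information criterion from a best-fit chi-square and k parameters."""
    return chi2_min + 2 * k

def bic(chi2_min, k, n):
    """Bayesian information criterion: penalises parameters by log(sample size)."""
    return chi2_min + k * math.log(n)

# Hypothetical two-parameter fit (q0, q1) to 580 data points.
chi2_min, k, n = 562.0, 2, 580
print(aic(chi2_min, k), bic(chi2_min, k, n))
```

    Models with lower AIC/BIC are preferred; BIC penalises extra parameters more heavily for large samples, which is why the two criteria are usually reported together.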

  1. Bayesian artificial intelligence

    CERN Document Server

    Korb, Kevin B

    2003-01-01

    As the power of Bayesian techniques has become more fully realized, the field of artificial intelligence has embraced Bayesian methodology and integrated it to the point where an introduction to Bayesian techniques is now a core course in many computer science programs. Unlike other books on the subject, Bayesian Artificial Intelligence keeps mathematical detail to a minimum and covers a broad range of topics. The authors integrate Bayesian network technology with techniques for learning Bayesian networks, and apply both to knowledge engineering. They emphasize understanding and intuition but also provide the algorithms and technical background needed for applications. Software, exercises, and solutions are available on the authors' website.

  2. Bayesian artificial intelligence

    CERN Document Server

    Korb, Kevin B

    2010-01-01

    Updated and expanded, Bayesian Artificial Intelligence, Second Edition provides a practical and accessible introduction to the main concepts, foundation, and applications of Bayesian networks. It focuses on both the causal discovery of networks and Bayesian inference procedures. Adopting a causal interpretation of Bayesian networks, the authors discuss the use of Bayesian networks for causal modeling. They also draw on their own applied research to illustrate various applications of the technology.New to the Second EditionNew chapter on Bayesian network classifiersNew section on object-oriente

  3. Testing Pixel Translation Digital Elevation Models to Reconstruct Slip Histories: An Example from the Agua Blanca Fault, Baja California, Mexico

    Science.gov (United States)

    Wilson, J.; Wetmore, P. H.; Malservisi, R.; Ferwerda, B. P.; Teran, O.

    2012-12-01

    We use recently collected slip vector and total offset data from the Agua Blanca fault (ABF) to constrain a pixel translation digital elevation model (DEM) to reconstruct the slip history of this fault. This model was constructed using a Perl script that reads a DEM file (Easting, Northing, Elevation) and a configuration file with coordinates that define the boundary of each fault segment. A pixel translation vector is defined as a magnitude of lateral offset in an azimuthal direction. The program translates pixels north of the fault and prints their pre-faulting position to a new DEM file that can be gridded and displayed. This analysis, where multiple DEMs are created with different translation vectors, allows us to identify areas of transtension or transpression while seeing the topographic expression in these areas. The benefit of this technique, in contrast to a simple block model, is that the DEM gives us a valuable graphic which can be used to pose new research questions. We have found that many topographic features correlate across the fault, i.e. valleys and ridges, which likely have implications for the age of the ABF and long-term landscape evolution rates, and potentially provide confirmation of total slip assessments. The ABF of northern Baja California, Mexico is an active, dextral strike-slip fault that transfers Pacific-North American plate boundary strain out of the Gulf of California and around the "Big Bend" of the San Andreas Fault. Total displacement on the ABF in the central and eastern parts of the fault is 10 +/- 2 km based on offset Early-Cretaceous features such as terrane boundaries and intrusive bodies (plutons and dike swarms). Where the fault bifurcates to the west, the northern strand (northern Agua Blanca fault or NABF) is constrained to 7 +/- 1 km.
We have not yet identified piercing points on the southern strand, the Santo Tomas fault (STF), but displacement is inferred to be ~4 km assuming that the sum of slip on the NABF and STF is
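
    A minimal version of the pixel-translation idea, here in Python rather than the authors' Perl, with the fault idealised as a horizontal line and an arbitrary slip vector (both assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy DEM: rows of (easting, northing, elevation).
dem = np.column_stack([rng.uniform(0, 100, (1000, 2)),
                       rng.uniform(0, 500, 1000)])

def restore(dem, fault_northing, slip_km, azimuth_deg):
    """Undo fault slip: translate every pixel north of the fault back along
    the slip azimuth (clockwise from north), leaving elevations unchanged."""
    az = np.radians(azimuth_deg)
    shift = np.array([slip_km * np.sin(az), slip_km * np.cos(az), 0.0])
    out = dem.copy()
    north = dem[:, 1] > fault_northing
    out[north] -= shift                # pre-faulting positions
    return out

# E.g. restore 10 km of slip along an azimuth of 100 degrees.
restored = restore(dem, fault_northing=50.0, slip_km=10.0, azimuth_deg=100.0)
```

    The restored point set can then be re-gridded and displayed; trying several translation vectors and checking which best aligns valleys and ridges across the fault mirrors the analysis described above.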

  4. The Development of Bayesian Theory and Its Applications in Business and Bioinformatics

    Science.gov (United States)

    Zhang, Yifei

    2018-03-01

    Bayesian Theory originated in an essay by the British mathematician Thomas Bayes, published in 1763, and after its development in the 20th century, Bayesian Statistics has played a significant part in the statistical study of all fields. Due to the recent breakthrough of high-dimensional integration, Bayesian Statistics has been improved and perfected, and now it can be used to solve problems that Classical Statistics failed to solve. This paper summarizes Bayesian Statistics' history, concepts and applications, which are illustrated in five parts: the history of Bayesian Statistics, the weaknesses of Classical Statistics, Bayesian Theory, its development, and its applications. The first two parts make a comparison between Bayesian Statistics and Classical Statistics from a macroscopic perspective. The last three parts focus on Bayesian Theory specifically -- from introducing particular Bayesian Statistics concepts to outlining their development and finally their applications.

  5. Traumatic brain injury and alcohol/substance abuse: A Bayesian meta-analysis comparing the outcomes of people with and without a history of abuse.

    Science.gov (United States)

    Unsworth, David J; Mathias, Jane L

    2017-08-01

    Alcohol and substance (drugs and/or alcohol) abuse are major risk factors for traumatic brain injury (TBI); however, it remains unclear whether outcomes differ for those with and without a history of preinjury abuse. A meta-analysis was performed to examine this issue. The PubMed, Embase, and PsycINFO databases were searched for research that compared the neuroradiological, cognitive, or psychological outcomes of adults with and without a documented history of alcohol and/or substance abuse who sustained nonpenetrating TBIs. Data from 22 studies were analyzed using a random-effects model: Hedges's g effect sizes measured the mean difference in outcomes of individuals with/without a history of preinjury abuse, and Bayes factors assessed the probability that the outcomes differed. Patients with a history of alcohol and/or substance abuse had poorer neuroradiological outcomes, including reduced hippocampal (g = -0.82) and gray matter volumes (g = -0.46 to -0.82), and enlarged cerebral ventricles (g = -0.73 to -0.80). There were limited differences in cognitive outcomes: Executive functioning (g = -0.51) and memory (g = -0.39 to -0.43) were moderately affected, but attention and reasoning were not. The findings for fine motor ability, construction, perception, general cognition, and language were inconclusive. Postinjury substance and alcohol use (g = -0.97 to -1.07) and emotional functioning (g = -0.29 to -0.44) were worse in those with a history of alcohol and/or substance abuse (psychological outcomes). This study highlighted the type and extent of post-TBI differences between persons with and without a history of alcohol or substance abuse, many of which may hamper recovery. However, variation in the criteria for premorbid abuse, limited information regarding the history of abuse, and an absence of preinjury baseline data prevented an assessment of whether the differences predated the TBI, occurred as a result of ongoing alcohol/substance abuse, or
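
    The effect-size computation that underlies such a meta-analysis can be sketched with the standard Hedges's g formula (the group means, standard deviations and sample sizes below are invented for illustration, not taken from the study):

```python
import math

def hedges_g(m1, sd1, n1, m2, sd2, n2):
    """Hedges's g: Cohen's d scaled by the small-sample bias correction J."""
    pooled_sd = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2)
                          / (n1 + n2 - 2))
    d = (m1 - m2) / pooled_sd
    j = 1 - 3 / (4 * (n1 + n2) - 9)    # small-sample bias-correction factor
    return j * d

# Hypothetical hippocampal volumes (cm^3): abuse history vs. no history.
g = hedges_g(3.0, 0.4, 30, 3.3, 0.4, 30)
print(round(g, 2))
```

    A negative g here indicates a poorer outcome in the group with a history of abuse, matching the sign convention of the effect sizes reported above.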

  6. 210Pb-derived ages for the reconstruction of terrestrial contaminant history into the Mexican Pacific coast: Potential and limitations

    International Nuclear Information System (INIS)

    Ruiz-Fernandez, A.C.; Hillaire-Marcel, C.

    2009-01-01

    210Pb is widely used for dating recent sediments in the aquatic environment; however, our experience working in shallow coastal environments on the Pacific coast of Mexico has demonstrated that the potential of 210Pb for reliable historical reconstruction may be limited by low 210Pb atmospheric fallout, sediment mixing, an abundance of coarse sediments, and the lack of a 137Cs signal for 210Pb corroboration. This work discusses the difficulties of obtaining adequate sedimentary records for geochronological reconstruction in such active and complex settings. It includes examples of 210Pb geochronologies based on sediment profiles collected in two contrasting coastal areas (mudflats associated with the coastal lagoons of Sinaloa State and the continental shelf of the Gulf of Tehuantepec), in which geochemical data were used to support the established temporal frame and the changes in sediment supply recorded in the cores, which were related to the development of land-based activities during the last century.

  7. Mercury contamination history of an estuarine floodplain reconstructed from a 210Pb-dated sediment core (Berg River, South Africa)

    CSIR Research Space (South Africa)

    Kading, TJ

    2009-01-01

    Full Text Available Mercury deposition histories have been scarcely documented in the southern hemisphere. A sediment core was collected from the ecologically important estuarine floodplain of the Berg River (South Africa). The authors establish the concentration of Hg...

  8. Reconstructing the phylogenetic history of long-term effective population size and life-history traits using patterns of amino acid replacement in mitochondrial genomes of mammals and birds.

    Science.gov (United States)

    Nabholz, Benoit; Uwimana, Nicole; Lartillot, Nicolas

    2013-01-01

    The nearly neutral theory, which proposes that most mutations are deleterious or close to neutral, predicts that the ratio of nonsynonymous over synonymous substitution rates (dN/dS), and potentially also the ratio of radical over conservative amino acid replacement rates (Kr/Kc), are negatively correlated with effective population size. Previous empirical tests, using life-history traits (LHT) such as body-size or generation-time as proxies for population size, have been consistent with these predictions. This suggests that large-scale phylogenetic reconstructions of dN/dS or Kr/Kc might reveal interesting macroevolutionary patterns in the variation in effective population size among lineages. In this work, we further develop an integrative probabilistic framework for phylogenetic covariance analysis introduced previously, so as to estimate the correlation patterns between dN/dS, Kr/Kc, and three LHT, in mitochondrial genomes of birds and mammals. Kr/Kc displays stronger and more stable correlations with LHT than does dN/dS, which we interpret as a greater robustness of Kr/Kc, compared with dN/dS, the latter being confounded by the high saturation of the synonymous substitution rate in mitochondrial genomes. The correlation of Kr/Kc with LHT was robust when controlling for the potentially confounding effects of nucleotide compositional variation between taxa. The positive correlation of the mitochondrial Kr/Kc with LHT is compatible with previous reports, and with a nearly neutral interpretation, although alternative explanations are also possible. The Kr/Kc model was finally used for reconstructing life-history evolution in birds and mammals. This analysis suggests a fairly large-bodied ancestor in both groups. In birds, life-history evolution seems to have occurred mainly through size reduction in Neoavian birds, whereas in placental mammals, body mass evolution shows disparate trends across subclades. 
Altogether, our work represents a further step toward a more

  9. Constraining the Deforestation History of Europe: Evaluation of Historical Land Use Scenarios with Pollen-Based Land Cover Reconstructions

    Directory of Open Access Journals (Sweden)

    Jed O. Kaplan

    2017-12-01

    Full Text Available Anthropogenic land cover change (ALCC) is the most important transformation of the Earth system that occurred in the preindustrial Holocene, with implications for carbon, water and sediment cycles, biodiversity and the provision of ecosystem services and regional and global climate. For example, anthropogenic deforestation in preindustrial Eurasia may have led to feedbacks to the climate system: both biogeophysical, regionally amplifying winter cold and summer warm temperatures, and biogeochemical, stabilizing atmospheric CO2 concentrations and thus influencing global climate. Quantification of these effects is difficult, however, because scenarios of anthropogenic land cover change over the Holocene vary widely, with increasing disagreement back in time. Because land cover change had such widespread ramifications for the Earth system, it is essential to assess current ALCC scenarios in light of observations and provide guidance on which models are most realistic. Here, we perform a systematic evaluation of two widely-used ALCC scenarios (KK10 and HYDE3.1) in northern and part of central Europe using an independent, pollen-based reconstruction of Holocene land cover (REVEALS). Considering that ALCC in Europe primarily resulted in deforestation, we compare modeled land use with the cover of non-forest vegetation inferred from the pollen data. Though neither land cover change scenario matches the pollen-based reconstructions precisely, KK10 correlates well with REVEALS at the country scale, while HYDE systematically underestimates land use with increasing magnitude with time in the past. Discrepancies between modeled and reconstructed land use are caused by a number of factors, including assumptions of per-capita land use and socio-cultural factors that cannot be predicted on the basis of the characteristics of the physical environment, including dietary preferences, long-distance trade, the location of urban areas and social organization.

  10. Niche partitioning in sympatric Gorilla and Pan from Cameroon: implications for life history strategies and for reconstructing the evolution of hominin life history.

    Science.gov (United States)

    Macho, Gabriele A; Lee-Thorp, Julia A

    2014-01-01

    Factors influencing the hominoid life histories are poorly understood, and little is known about how ecological conditions modulate the pace of their development. Yet our limited understanding of these interactions underpins life history interpretations in extinct hominins. Here we determined the synchronisation of dental mineralization/eruption with brain size in a 20th century museum collection of sympatric Gorilla gorilla and Pan troglodytes from Central Cameroon. Using δ13C and δ15N of individuals' hair, we assessed whether and how differences in diet and habitat use may have impacted on ape development. The results show that, overall, gorilla hair δ13C and δ15N values are more variable than those of chimpanzees, and that gorillas are consistently lower in δ13C and δ15N compared to chimpanzees. Within a restricted, isotopically-constrained area, gorilla brain development appears delayed relative to dental mineralization/eruption [or dental development is accelerated relative to brains]: only about 87.8% of adult brain size is attained by the time first permanent molars come into occlusion, whereas it is 92.3% in chimpanzees. Even when M1s are already in full functional occlusion, gorilla brains lag behind those of chimpanzee (91% versus 96.4%), relative to tooth development. Both bootstrap analyses and stable isotope results confirm that these results are unlikely due to sampling error. Rather, δ15N values imply that gorillas are not fully weaned (physiologically mature) until well after M1 are in full functional occlusion. In chimpanzees the transition from infant to adult feeding appears (a) more gradual and (b) earlier relative to somatic development. Taken together, the findings are consistent with life history theory that predicts delayed development when non-density dependent mortality is low, i.e. in closed habitats, and with the "risk aversion" hypothesis for frugivorous species as a means to avert starvation. Furthermore, the results highlight

  11. Vegetation history reconstructed from anthracology and pollen analysis at the rescue excavation of the MO Motorway, Hungary

    Science.gov (United States)

    Náfrádi, Katalin; Bodor, Elvira; Törőcsik, Tünde; Sümegi, Pál

    2011-12-01

    The significance of geoarchaeological investigations is indisputable in reconstructing the former environment and in studying the relationship between humans and their surroundings. Several disciplines have developed during the last few decades to give insight into earlier time periods and their climatic conditions (e.g. palynology, malacology, archaeobotany, phytology and animal osteology). Charcoal and pollen analytical studies from the rescue excavation of the MO motorway provide information about the vegetation changes of the past. These methods are used to reconstruct the environment of the former settlements and to detect the human impact and natural climatic changes. The sites examined span the periods of the Late-Copper Age, Late-Bronze Age, Middle-Iron Age, Late-Iron Age, Sarmatian period, Late Sarmatian period, Migration period, Late-Migration period and Middle Ages. The vegetation before the Copper Age is based only on pollen analytical data. Anthracological results show the overall dominance of Quercus and a great number of Ulmus, Fraxinus, Acer, Fagus, Alnus and Populus/Salix tree fossils, as well as the residues of fruit trees present in the charred wood assemblage.

  12. Niche partitioning in sympatric Gorilla and Pan from Cameroon: implications for life history strategies and for reconstructing the evolution of hominin life history.

    Directory of Open Access Journals (Sweden)

    Gabriele A Macho

Full Text Available Factors influencing the hominoid life histories are poorly understood, and little is known about how ecological conditions modulate the pace of their development. Yet our limited understanding of these interactions underpins life history interpretations in extinct hominins. Here we determined the synchronisation of dental mineralization/eruption with brain size in a 20th century museum collection of sympatric Gorilla gorilla and Pan troglodytes from Central Cameroon. Using δ13C and δ15N of individuals' hair, we assessed whether and how differences in diet and habitat use may have impacted on ape development. The results show that, overall, gorilla hair δ13C and δ15N values are more variable than those of chimpanzees, and that gorillas are consistently lower in δ13C and δ15N compared to chimpanzees. Within a restricted, isotopically-constrained area, gorilla brain development appears delayed relative to dental mineralization/eruption [or dental development is accelerated relative to brains]: only about 87.8% of adult brain size is attained by the time first permanent molars come into occlusion, whereas it is 92.3% in chimpanzees. Even when M1s are already in full functional occlusion, gorilla brains lag behind those of chimpanzee (91% versus 96.4%), relative to tooth development. Both bootstrap analyses and stable isotope results confirm that these results are unlikely due to sampling error. Rather, δ15N values imply that gorillas are not fully weaned (physiologically mature) until well after M1 are in full functional occlusion. In chimpanzees the transition from infant to adult feeding appears (a) more gradual and (b) earlier relative to somatic development. Taken together, the findings are consistent with life history theory that predicts delayed development when non-density dependent mortality is low, i.e. in closed habitats, and with the "risk aversion" hypothesis for frugivorous species as a means to avert starvation. Furthermore, the

  13. New Routes to Phylogeography: A Bayesian Structured Coalescent Approximation.

    Science.gov (United States)

    De Maio, Nicola; Wu, Chieh-Hsi; O'Reilly, Kathleen M; Wilson, Daniel

    2015-08-01

Phylogeographic methods aim to infer migration trends and the history of sampled lineages from genetic data. Applications of phylogeography are broad, and in the context of pathogens include the reconstruction of transmission histories and the origin and emergence of outbreaks. Phylogeographic inference based on bottom-up population genetics models is computationally expensive, and as a result faster alternatives based on the evolution of discrete traits have become popular. In this paper, we show that inference of migration rates and root locations based on discrete trait models is extremely unreliable and sensitive to biased sampling. To address this problem, we introduce BASTA (BAyesian STructured coalescent Approximation), a new approach implemented in BEAST2 that combines the accuracy of methods based on the structured coalescent with the computational efficiency required to handle more than just a few populations. We illustrate the potentially severe implications of poor model choice for phylogeographic analyses by investigating the zoonotic transmission of Ebola virus. Whereas the structured coalescent analysis correctly infers that successive human Ebola outbreaks have been seeded by a large unsampled non-human reservoir population, the discrete trait analysis implausibly concludes that undetected human-to-human transmission has allowed the virus to persist over the past four decades. As genomics takes on an increasingly prominent role informing the control and prevention of infectious diseases, it will be vital that phylogeographic inference provides robust insights into transmission history.
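    The discrete-trait model that BASTA is compared against treats sampling location as just another character evolving along the tree. A minimal sketch of that likelihood computation on a toy two-location, three-tip tree (the tree, migration rate, and tip locations are invented; this is the critiqued discrete-trait approach, not BASTA's structured coalescent, and uses no BEAST2 API):

```python
from math import exp

MU = 0.5  # symmetric migration rate between the two locations (assumed)

def p(same, t):
    """Transition probability of the 2-state symmetric CTMC after time t."""
    return 0.5 + (0.5 if same else -0.5) * exp(-2 * MU * t)

def tip(state):
    """Partial likelihoods at a sampled tip: certainty in one location."""
    return [1.0 if s == state else 0.0 for s in (0, 1)]

def up(children):
    """Felsenstein pruning: partial likelihoods at the parent node."""
    out = []
    for s in (0, 1):
        lk = 1.0
        for child, t in children:
            lk *= sum(p(s == x, t) * child[x] for x in (0, 1))
        out.append(lk)
    return out

# Tree ((A:1, B:1):1, C:2); A and B sampled in location 0, C in location 1
node = up([(tip(0), 1.0), (tip(0), 1.0)])
root = up([(node, 1.0), (tip(1), 2.0)])
likelihood = 0.5 * (root[0] + root[1])  # uniform root location frequencies
```

    The root partial likelihoods favour location 0 here simply because two of three tips were sampled there, which is exactly the sensitivity to biased sampling the paper demonstrates.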

  14. Progress on Bayesian Inference of the Fast Ion Distribution Function

    DEFF Research Database (Denmark)

Stagner, L.; Heidbrink, W.W.; Chen, X.

    2013-01-01

    . However, when theory and experiment disagree (for one or more diagnostics), it is unclear how to proceed. Bayesian statistics provides a framework to infer the DF, quantify errors, and reconcile discrepant diagnostic measurements. Diagnostic errors and weight functions that describe the phase space...... sensitivity of the measurements are incorporated into Bayesian likelihood probabilities. Prior probabilities describe physical constraints. This poster will show reconstructions of classically described, low-power, MHD-quiescent distribution functions from actual FIDA measurements. A description of the full...
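    In the linear-Gaussian case, the combination described above of weight functions, diagnostic errors, and prior constraints has a closed-form posterior. A minimal sketch with an invented 3-bin distribution function and two invented weight functions (nothing here reflects the actual FIDA geometry):

```python
import numpy as np

# Toy forward model: each diagnostic signal integrates the fast-ion
# distribution f against a phase-space weight function (rows of W).
# W, d, and the error magnitudes are invented for illustration.
W = np.array([[1.0, 0.5, 0.0],
              [0.0, 0.5, 1.0]])   # weight functions (2 diagnostics, 3 bins)
d = np.array([2.0, 1.5])          # measured signals
sigma_d = 0.1                     # diagnostic error
f_prior = np.ones(3)              # prior mean (stand-in for physical constraints)
sigma_p = 1.0                     # prior width

# Gaussian likelihood x Gaussian prior => Gaussian posterior:
#   precision A = W^T W / sigma_d^2 + I / sigma_p^2
A = W.T @ W / sigma_d**2 + np.eye(3) / sigma_p**2
b = W.T @ d / sigma_d**2 + f_prior / sigma_p**2
f_post = np.linalg.solve(A, b)    # posterior mean of the distribution function
cov_post = np.linalg.inv(A)       # posterior covariance quantifies the errors
```

    With only two measurements for three bins the problem is underdetermined; the prior fixes the unconstrained direction while the posterior mean still reproduces both signals to within the assumed noise.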

  15. Understanding Computational Bayesian Statistics

    CERN Document Server

    Bolstad, William M

    2011-01-01

    A hands-on introduction to computational statistics from a Bayesian point of view Providing a solid grounding in statistics while uniquely covering the topics from a Bayesian perspective, Understanding Computational Bayesian Statistics successfully guides readers through this new, cutting-edge approach. With its hands-on treatment of the topic, the book shows how samples can be drawn from the posterior distribution when the formula giving its shape is all that is known, and how Bayesian inferences can be based on these samples from the posterior. These ideas are illustrated on common statistic
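    The central idea above, drawing samples when only the formula for the posterior's shape is known, can be illustrated with a random-walk Metropolis sampler (a generic sketch with an invented target, not code from the book):

```python
import math
import random

def metropolis(log_unnorm, n, x0=0.0, step=1.0, seed=1):
    """Random-walk Metropolis: sample a density known only up to a constant."""
    rng = random.Random(seed)
    x, samples = x0, []
    for _ in range(n):
        prop = x + rng.gauss(0.0, step)
        # accept with probability min(1, target(prop) / target(x))
        if math.log(rng.random()) < log_unnorm(prop) - log_unnorm(x):
            x = prop
        samples.append(x)
    return samples

# Unnormalized shape of a N(3, 1) posterior: exp(-(x - 3)^2 / 2)
draws = metropolis(lambda x: -0.5 * (x - 3.0) ** 2, 20000)
mean = sum(draws[2000:]) / len(draws[2000:])  # discard burn-in, then estimate
```

    Any posterior expectation (mean, quantiles, credible intervals) is then estimated from the retained draws, which is exactly the workflow the book builds on.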

  16. Bayesian statistics an introduction

    CERN Document Server

    Lee, Peter M

    2012-01-01

    Bayesian Statistics is the school of thought that combines prior beliefs with the likelihood of a hypothesis to arrive at posterior beliefs. The first edition of Peter Lee’s book appeared in 1989, but the subject has moved ever onwards, with increasing emphasis on Monte Carlo based techniques. This new fourth edition looks at recent techniques such as variational methods, Bayesian importance sampling, approximate Bayesian computation and Reversible Jump Markov Chain Monte Carlo (RJMCMC), providing a concise account of the way in which the Bayesian approach to statistics develops as wel
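    Of the techniques named above, approximate Bayesian computation is the simplest to sketch: a rejection sampler that never evaluates a likelihood, only simulates data and keeps parameter draws whose simulations match the observation (toy coin-bias example, not taken from the book):

```python
import random

# ABC rejection: estimate a coin's bias p from 20 heads in 50 flips.
rng = random.Random(0)
observed_heads, n_flips = 20, 50
accepted = []
for _ in range(20000):
    p = rng.random()                          # draw p from a uniform prior
    heads = sum(rng.random() < p for _ in range(n_flips))  # simulate data
    if abs(heads - observed_heads) <= 1:      # tolerance on the summary statistic
        accepted.append(p)
posterior_mean = sum(accepted) / len(accepted)
```

    The accepted draws approximate the posterior of p; shrinking the tolerance tightens the approximation at the cost of more rejections.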

  17. Reconstructing the sedimentation history of the Bengal Delta Plain by means of geochemical and stable isotopic data

    International Nuclear Information System (INIS)

    Neidhardt, Harald; Biswas, Ashis; Freikowski, Dominik; Majumder, Santanu; Chatterjee, Debashis; Berner, Zsolt A.

    2013-01-01

    The purpose of this study is to examine the sedimentation history of the central floodplain area of the Bengal Delta Plain in West Bengal, India. Sediments from two boreholes were analyzed regarding lithology, geochemistry and the stable isotopic composition of embedded organic matter. Different lithofacies were distinguished that reflect frequent changes in the prevailing sedimentary depositional environment of the study area. The lowest facies comprises poorly sorted fluvial sediments composed of fine gravel to clay pointing at high transport energy and intense relocation processes. This facies is considered to belong to an early Holocene lowstand systems tract that followed the last glacial maximum. Fine to medium sands above it mark a gradual change towards a transgressive systems tract. Upwards increasing proportions of silt and the stable isotopic composition of embedded organic matter both indicate a gradual change from fluvial channel infill sediments towards more estuarine and marine influenced deposits. Youngest sediments are composed of clayey and silty overbank deposits of the Hooghly River that have formed a vast low-relief delta-floodplain. Close to the surface, small concretions of secondary Mn-oxides and Fe-(oxyhydr)oxides occur and mark the fluctuation range of the unsaturated zone. These concretions are accompanied by relatively high contents of trace elements such as Zn, Ni, Cu, and As. To sum up, the outcomes of this study provide new insights into the complex sedimentation history of the barely investigated central floodplain area of West Bengal

  18. Space Shuttle RTOS Bayesian Network

    Science.gov (United States)

    Morris, A. Terry; Beling, Peter A.

    2001-01-01

With shrinking budgets and the requirements to increase reliability and operational life of the existing orbiter fleet, NASA has proposed various upgrades for the Space Shuttle that are consistent with national space policy. The cockpit avionics upgrade (CAU), a high priority item, has been selected as the next major upgrade. The primary functions of cockpit avionics include flight control, guidance and navigation, communication, and orbiter landing support. Secondary functions include the provision of operational services for non-avionics systems such as data handling for the payloads and caution and warning alerts to the crew. Recently, a process to select the optimal commercial-off-the-shelf (COTS) real-time operating system (RTOS) for the CAU was conducted by United Space Alliance (USA) Corporation, which is a joint venture between Boeing and Lockheed Martin, the prime contractor for space shuttle operations. In order to independently assess the RTOS selection, NASA has used the Bayesian network-based scoring methodology described in this paper. Our two-stage methodology addresses the issue of RTOS acceptability by incorporating functional, performance and non-functional software measures related to reliability, interoperability, certifiability, efficiency, correctness, business, legal, product history, cost and life cycle. The first stage of the methodology involves obtaining scores for the various measures using a Bayesian network. The Bayesian network incorporates the causal relationships between the various and often competing measures of interest while also assisting the inherently complex decision analysis process with its ability to reason under uncertainty. The structure and selection of prior probabilities for the network are extracted from experts in the field of real-time operating systems. Scores for the various measures are computed using Bayesian probability.
In the second stage, multi-criteria trade-off analyses are performed between the scores
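    The first stage, scoring measures with a Bayesian network that reasons under uncertainty, can be sketched by exact enumeration over a deliberately tiny network (the structure and probabilities below are invented for illustration and are not NASA's elicited values):

```python
# Tiny Bayesian network:  Reliability -> Acceptable <- Certifiability
# Score an RTOS by the posterior probability of "acceptable" given evidence.
p_rel = {True: 0.8, False: 0.2}    # prior from (hypothetical) expert elicitation
p_cert = {True: 0.6, False: 0.4}
p_acc = {  # P(acceptable | reliability, certifiability)
    (True, True): 0.95, (True, False): 0.6,
    (False, True): 0.5, (False, False): 0.05,
}

def score(evidence):
    """Posterior P(acceptable=True | evidence) by full enumeration."""
    num = den = 0.0
    for rel in (True, False):
        for cert in (True, False):
            # skip assignments inconsistent with the observed evidence
            if any(evidence.get(k) not in (None, v)
                   for k, v in (("rel", rel), ("cert", cert))):
                continue
            w = p_rel[rel] * p_cert[cert]
            num += w * p_acc[(rel, cert)]
            den += w
    return num / den
```

    Observing that an RTOS is reliable raises its score relative to the no-evidence baseline, which is the behaviour the scoring stage exploits; real networks differ only in size and in where the priors come from.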

  19. Contribution of analytical nuclear techniques in the reconstruction of the Brazilian pre-history analysing archaeological ceramics of Tupiguarani tradition

    International Nuclear Information System (INIS)

    Faria, Gleikam Lopes de Oliveira; Menezes, Maria Angela de B.C.; Silva, Maria Aparecida

    2011-01-01

Due to the high importance of the material vestiges for the culture of a nation, the Brazilian Council for Environment determined that licenses to establish new enterprises are subject to a technical report concerning environmental impact, including archaeological sites affected by that enterprise. Therefore, answering the report related to the Program for Prospection and Rescue of the Archaeological Patrimony of the Areas impacted by the installation of the Second Line of Samarco Mining Pipeline, the archaeological interventions were carried out along the coast of Espirito Santo. Tupi-Guarani Tradition vestiges were found there, where the main evidence was an interesting ceramic assemblage. Archaeology can fill the gap between ancient populations and modern society by elucidating the evidence found in archaeological sites. In this context, several ceramic fragments found in the archaeological sites - Hiuton and Bota-Fora - were analyzed by the neutron activation technique, k0-standardization method, at CDTN using the TRIGA MARK I IPR-R1 reactor, in order to characterize their elemental composition. The elements As, Ba, Br, Ce, Co, Cr, Cs, Eu, Fe, Ga, Hf, K, La, Na, Nd, Rb, Sb, Sc, Sm, Ta, Tb, Th, U, Yb, Zn and Zr were determined. Applying a robust multivariate statistical analysis in R software, the results pointed out that the pottery from the sites was made with clay from different sources. X-ray powder diffraction analyses were carried out to determine the mineral composition, and Moessbauer spectroscopy was applied to provide information on both the degree of burning and the firing atmosphere, in order to reconstruct the firing temperatures the Indians used in pottery production. (author)

  20. Contribution of analytical nuclear techniques in the reconstruction of the Brazilian pre-history analysing archaeological ceramics of Tupiguarani tradition

    Energy Technology Data Exchange (ETDEWEB)

    Faria, Gleikam Lopes de Oliveira; Menezes, Maria Angela de B.C.; Silva, Maria Aparecida, E-mail: menezes@cdtn.br, E-mail: cida@cdtn.br [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil). Servico de Reator e Tecnicas Analiticas. Laboratorio de Ativacao Neutronica; Sabino, Claudia de V.S. [PUC-Minas, Belo Horizonte, MG (Brazil)

    2011-07-01

Due to the high importance of the material vestiges for the culture of a nation, the Brazilian Council for Environment determined that licenses to establish new enterprises are subject to a technical report concerning environmental impact, including archaeological sites affected by that enterprise. Therefore, answering the report related to the Program for Prospection and Rescue of the Archaeological Patrimony of the Areas impacted by the installation of the Second Line of Samarco Mining Pipeline, the archaeological interventions were carried out along the coast of Espirito Santo. Tupi-Guarani Tradition vestiges were found there, where the main evidence was an interesting ceramic assemblage. Archaeology can fill the gap between ancient populations and modern society by elucidating the evidence found in archaeological sites. In this context, several ceramic fragments found in the archaeological sites - Hiuton and Bota-Fora - were analyzed by the neutron activation technique, k{sub 0}-standardization method, at CDTN using the TRIGA MARK I IPR-R1 reactor, in order to characterize their elemental composition. The elements As, Ba, Br, Ce, Co, Cr, Cs, Eu, Fe, Ga, Hf, K, La, Na, Nd, Rb, Sb, Sc, Sm, Ta, Tb, Th, U, Yb, Zn and Zr were determined. Applying a robust multivariate statistical analysis in R software, the results pointed out that the pottery from the sites was made with clay from different sources. X-ray powder diffraction analyses were carried out to determine the mineral composition, and Moessbauer spectroscopy was applied to provide information on both the degree of burning and the firing atmosphere, in order to reconstruct the firing temperatures the Indians used in pottery production. (author)

  1. Bayesian Graphical Models

    DEFF Research Database (Denmark)

    Jensen, Finn Verner; Nielsen, Thomas Dyhre

    2016-01-01

    is largely due to the availability of efficient inference algorithms for answering probabilistic queries about the states of the variables in the network. Furthermore, to support the construction of Bayesian network models, learning algorithms are also available. We give an overview of the Bayesian network...

  2. The Bayesian Score Statistic

    NARCIS (Netherlands)

    Kleibergen, F.R.; Kleijn, R.; Paap, R.

    2000-01-01

We propose a novel Bayesian test under a (noninformative) Jeffreys' prior specification. We check whether the fixed scalar value of the so-called Bayesian Score Statistic (BSS) under the null hypothesis is a plausible realization from its known and standardized distribution under the alternative. Unlike

  3. Bayesian Mediation Analysis

    Science.gov (United States)

    Yuan, Ying; MacKinnon, David P.

    2009-01-01

    In this article, we propose Bayesian analysis of mediation effects. Compared with conventional frequentist mediation analysis, the Bayesian approach has several advantages. First, it allows researchers to incorporate prior information into the mediation analysis, thus potentially improving the efficiency of estimates. Second, under the Bayesian…

  4. Combining Livestock Production Information in a Process-Based Vegetation Model to Reconstruct the History of Grassland Management

    Science.gov (United States)

Chang, Jinfeng; Ciais, Philippe; Herrero, Mario; Havlik, Petr; Campioli, Matteo; Zhang, Xianzhou; Bai, Yongfei; Viovy, Nicolas; Joiner, Joanna; Wang, Xuhui

    2016-01-01

Grassland management type (grazed or mown) and intensity (intensive or extensive) play a crucial role in the greenhouse gas balance and surface energy budget of this biome, both at field scale and at large spatial scale. However, global gridded historical information on grassland management intensity is not available. Combining modelled grass-biomass productivity with statistics of the grass-biomass demand by livestock, we reconstruct gridded maps of grassland management intensity from 1901 to 2012. These maps include the minimum area of managed vs. maximum area of unmanaged grasslands and the fraction of mown vs. grazed area at a resolution of 0.5deg by 0.5deg. The grass-biomass demand is derived from a livestock dataset for 2000, extended to cover the period 1901-2012. The grass-biomass supply (i.e. forage grass from mown grassland and biomass grazed) is simulated by the process-based model ORCHIDEE-GM driven by historical climate change, rising CO2 concentration, and changes in nitrogen fertilization. The global area of managed grassland obtained in this study increases from 6.1 x 10(exp 6) km(exp 2) in 1901 to 12.3 x 10(exp 6) km(exp 2) in 2000, although the expansion pathway varies between different regions. ORCHIDEE-GM also simulated an increase in global mean productivity and herbage-use efficiency over managed grassland during the 20th century, indicating a general intensification of grassland management at global scale but with regional differences. The gridded grassland management intensity maps are model dependent because they depend on modelled productivity. Thus specific attention was given to the evaluation of modelled productivity against a series of observations, from site-level net primary productivity (NPP) measurements to two global satellite products of gross primary productivity (GPP) (MODIS-GPP and SIF data). Generally, ORCHIDEE-GM captures the spatial pattern, seasonal cycle, and inter-annual variability of grassland productivity at global

  5. Cryogenic brines as diagenetic fluids: Reconstructing the diagenetic history of the Victoria Land Basin using clumped isotopes

    Science.gov (United States)

    Staudigel, Philip T.; Murray, Sean; Dunham, Daniel P.; Frank, Tracy D.; Fielding, Christopher R.; Swart, Peter K.

    2018-03-01

    The isotopic analyses (δ13C, δ18O, and Δ47) of carbonate phases recovered from a core in McMurdo Sound by ANtarctic geologic DRILLing (ANDRILL-2A) indicate that the majority of secondary carbonate mineral formation occurred at cooler temperatures than the modern burial temperature, and in the presence of fluids with δ18Owater values ranging between -11 and -6‰ VSMOW. These fluids are interpreted as being derived from a cryogenic brine formed during the freezing of seawater. The Δ47 values were converted to temperature using an in-house calibration presented in this paper. Measurements of the Δ47 values in the cements indicate increasingly warmer crystallization temperatures with depth and, while roughly parallel to the observed geothermal gradient, consistently translate to temperatures that are cooler than the current burial temperature. The difference in temperature suggests that cements formed when they were ∼260 ± 100 m shallower than at the present day. This depth range corresponds to a period of minimal sediment accumulation from 3 to 11 Myr; it is therefore interpreted that the majority of cements formed during this time. This behavior is also predicted by time-integrated modeling of cementation at this site. If this cementation had occurred in the presence of these fluids, then the cryogenic brines have been a longstanding feature in the Victoria Land Basin. Brines such as those found at this site have been described in numerous modern high-latitude settings, and analogous fluids could have played a role in the diagenetic history of other ice-proximal sediments and basins during glacial intervals throughout geologic history. The agreement between the calculated δ18Owater value and the measured values in the pore fluids shows how the Δ47 proxy can be used to identify the origin of negative δ18O values in carbonate rocks and that extremely negative values do not necessarily need to be a result of the influence of meteoric fluids or reaction at

  6. Seeing is believing: Visualization of He distribution in zircon and implications for thermal history reconstruction on single crystals.

    Science.gov (United States)

    Danišík, Martin; McInnes, Brent I A; Kirkland, Christopher L; McDonald, Brad J; Evans, Noreen J; Becker, Thomas

    2017-02-01

    Zircon (U-Th)/He thermochronometry is an established radiometric dating technique used to place temporal constraints on a range of thermally sensitive geological events, such as crustal exhumation, volcanism, meteorite impact, and ore genesis. Isotopic, crystallographic, and/or mineralogical heterogeneities within analyzed grains can result in dispersed or anomalous (U-Th)/He ages. Understanding the effect of these grain-scale phenomena on the distribution of He in analyzed minerals should lead to improvements in data interpretation. We combine laser ablation microsampling and noble gas and trace element mass spectrometry to provide the first two-dimensional, grain-scale zircon He "maps" and quantify intragrain He distribution. These maps illustrate the complexity of intracrystalline He distribution in natural zircon and, combined with a correlated quantification of parent nuclide (U and Th) distribution, provide an opportunity to assess a number of crystal chemistry processes that can generate anomalous zircon (U-Th)/He ages. The technique provides new insights into fluid inclusions as potential traps of radiogenic He and confirms the effect of heterogeneity in parent-daughter isotope abundances and metamictization on (U-Th)/He systematics. Finally, we present a new inversion method where the He, U, and Th mapping data can be used to constrain the high- and low-temperature history of a single zircon crystal.

  7. Holocene history and environmental reconstruction of a Hercynian mire and surrounding mountain landscape based on multiple proxies

    Science.gov (United States)

    Dudová, Lydie; Hájková, Petra; Opravilová, Věra; Hájek, Michal

    2014-07-01

    We discovered the first peat section covering the entire Holocene in the Hrubý Jeseník Mountains, representing an island of unique alpine vegetation whose history may display transitional features between the Hercynian and Carpathian regions. We analysed pollen, plant macrofossils (more abundant in bottom layers), testate amoebae (more abundant in upper layers), peat stratigraphy and chemistry. We found that the landscape development indeed differed from other Hercynian mountains located westward. This is represented by Pinus cembra and Larix during the Pleistocene/Holocene transition, the early expansion of spruce around 10,450 cal yr BP, and survival of Larix during the climatic optimum. The early Holocene climatic fluctuations are traced in our profile by species compositions of both the mire and surrounding forests. The mire started to develop as a calcium-rich percolation fen with some species recently considered to be postglacial relicts (Meesia triquetra, Betula nana), shifted into ombrotrophy around 7450 cal yr BP by autogenic succession and changed into a pauperised, nutrient-enriched spruce woodland due to modern forestry activities. We therefore concluded that its recent vegetation is not a product of natural processes. From a methodological viewpoint we demonstrated how using multiple biotic proxies and extensive training sets in transfer functions may overcome taphonomic problems.

  8. Genome-wide genotype and sequence-based reconstruction of the 140,000 year history of modern human ancestry.

    Science.gov (United States)

    Shriner, Daniel; Tekola-Ayele, Fasil; Adeyemo, Adebowale; Rotimi, Charles N

    2014-08-13

    We investigated ancestry of 3,528 modern humans from 163 samples. We identified 19 ancestral components, with 94.4% of individuals showing mixed ancestry. After using whole genome sequences to correct for ascertainment biases in genome-wide genotype data, we dated the oldest divergence event to 140,000 years ago. We detected an Out-of-Africa migration 100,000-87,000 years ago, leading to peoples of the Americas, east and north Asia, and Oceania, followed by another migration 61,000-44,000 years ago, leading to peoples of the Caucasus, Europe, the Middle East, and south Asia. We dated eight divergence events to 33,000-20,000 years ago, coincident with the Last Glacial Maximum. We refined understanding of the ancestry of several ethno-linguistic groups, including African Americans, Ethiopians, the Kalash, Latin Americans, Mozabites, Pygmies, and Uygurs, as well as the CEU sample. Ubiquity of mixed ancestry emphasizes the importance of accounting for ancestry in history, forensics, and health.

  9. HIGH REPETITION JUMP TRAINING COUPLED WITH BODY WEIGHT SUPPORT IN A PATIENT WITH KNEE PAIN AND PRIOR HISTORY OF ANTERIOR CRUCIATE LIGAMENT RECONSTRUCTION: A CASE REPORT.

    Science.gov (United States)

    Elias, Audrey R C; Kinney, Anthony E; Mizner, Ryan L

    2015-12-01

Patients frequently experience long-term deficits in functional activity following anterior cruciate ligament reconstruction, and commonly present with decreased confidence and poor weight acceptance in the surgical knee. Adaptation of neuromuscular behaviors may be possible through plyometric training. Body weight support decreases the intensity of landing sufficiently to allow increased training repetition. The purpose of this case report is to describe the outcomes of a subject with a previous history of anterior cruciate ligament (ACL) reconstruction treated with high repetition jump training coupled with body weight support (BWS) as a primary intervention strategy. A 23-year-old female, who had right ACL reconstruction seven years prior, presented with anterior knee pain and effusion following initiation of a running program. Following visual assessment of poor mechanics in single leg closed chain activities, landing mechanics were assessed using 3-D motion analysis of single leg landing off a 20 cm box. She then participated in an eight-week plyometric training program using a custom-designed body weight support system. The International Knee Documentation Committee Subjective Knee Form (IKDC) and the ACL-Return to Sport Index (ACL-RSI) were administered at the start and end of treatment as well as at follow-up testing. The subject's IKDC and ACL-RSI scores increased with training from 68% and 43% to 90% and 84%, respectively, and were retained at follow-up testing. Peak knee and hip flexion angles during landing increased from 47° and 53° to 72° and 80°, respectively. Vertical ground reaction forces in landing decreased with training from 3.8 N/kg to 3.2 N/kg. All changes were retained two months following completion of training. The subject experienced meaningful changes in overall function. Retention of mechanical changes suggests that her new landing strategy had become a habitual pattern. Success with high volume plyometric training is

  10. New directions in hydro-climatic histories: observational data recovery, proxy records and the atmospheric circulation reconstructions over the earth (ACRE) initiative in Southeast Asia

    Science.gov (United States)

    Williamson, Fiona; Allan, Rob; Switzer, Adam D.; Chan, Johnny C. L.; Wasson, Robert James; D'Arrigo, Rosanne; Gartner, Richard

    2015-12-01

The value of historic observational weather data for reconstructing long-term climate patterns and the detailed analysis of extreme weather events has long been recognized (Le Roy Ladurie, 1972; Lamb, 1977). In some regions however, observational data has not been kept regularly over time, or its preservation and archiving has not been considered a priority by governmental agencies. This has been a particular problem in Southeast Asia, where there has been no systematic country-by-country method of keeping or preserving such data, the keeping of data only reaches back a few decades, or where instability has threatened the survival of historic records. As a result, past observational data are fragmentary, scattered, or even absent altogether. The further we go back in time, the more obvious the gaps. Observational data can be complemented, however, by historical documentary or proxy records of extreme events such as floods, droughts and other climatic anomalies. This review article highlights recent initiatives in sourcing, recovering, and preserving historical weather data and the potential for integrating the same with proxy (and other) records. In so doing, it focuses on regional initiatives for data research and recovery - particularly the work of the international Atmospheric Circulation Reconstructions over the Earth's (ACRE) Southeast Asian regional arm (ACRE SEA) - and the latter's role in bringing together disparate, but interrelated, projects working within this region. 
The overarching goal of the ACRE SEA initiative is to connect regional efforts and to build capacity within Southeast Asian institutions, agencies and National Meteorological and Hydrological Services (NMHS) to improve and extend historical instrumental, documentary and proxy databases of Southeast Asian hydroclimate, in order to contribute to the generation of high-quality, high-resolution historical hydroclimatic reconstructions (reanalyses) and, to build linkages with humanities researchers

  11. Bayesian modelling of fusion diagnostics

    Science.gov (United States)

    Fischer, R.; Dinklage, A.; Pasch, E.

    2003-07-01

    Integrated data analysis of fusion diagnostics is the combination of different, heterogeneous diagnostics in order to improve physics knowledge and reduce the uncertainties of results. One example is the validation of profiles of plasma quantities. Integration of different diagnostics requires systematic and formalized error analysis for all uncertainties involved. The Bayesian probability theory (BPT) allows a systematic combination of all information entering the measurement descriptive model that considers all uncertainties of the measured data, calibration measurements, physical model parameters and measurement nuisance parameters. A sensitivity analysis of model parameters allows crucial uncertainties to be found, which has an impact on both diagnostic improvement and design. The systematic statistical modelling within the BPT is used for reconstructing electron density and electron temperature profiles from Thomson scattering data from the Wendelstein 7-AS stellarator. The inclusion of different diagnostics and first-principle information is discussed in terms of improvements.

  12. ICDP project DeepCHALLA: reconstructing East African climate change and environmental history over the past 250,000 years

    Science.gov (United States)

    Verschuren, Dirk; Van Daele, Maarten; Wolff, Christian; Waldmann, Nicolas; Meyer, Inka; Ombori, Titus; Peterse, Francien; O'Grady, Ryan; Schnurrenberger, Doug; Olago, Daniel

    2017-04-01

Sediments on the bottom of Lake Challa, a 92-meter-deep crater lake on the border of Kenya and Tanzania near Mt. Kilimanjaro, contain a uniquely long and continuous record of past climate and environmental change. The near-equatorial location and exceptional quality of this natural archive provide great opportunities to study tropical climate variability at both short (inter-annual to decadal) and long (glacial-interglacial) time scales; and the influence of this climate variability on the region's freshwater resources, the functioning of terrestrial ecosystems, and the history of the East African landscape in which modern humans (our species, Homo sapiens) evolved and have lived ever since. Supported in part by the International Continental Scientific Drilling Programme (ICDP), the DeepCHALLA project has now recovered the sediment record of Lake Challa down to 214.8 meters below the lake floor, with almost certain 100% cover of the uppermost 121.3 meters (ca. 150,000 yr BP to present) and an estimated 85% cover over the lower part of the sequence, down to the lowermost distinct reflector in the available seismic stratigraphy. This reflector represents a 2-meter-thick layer of volcanic sand and silt deposited ca. 250,000 years ago, and overlies still older silty lacustrine clays deposited during early lake development. Down-hole logging produced continuous profiles of in-situ sediment composition that confer an absolute depth scale to both the recovered cores and their three-dimensional representation in seismic stratigraphy. As readily observed through the transparent core liners, Lake Challa sediments are finely laminated throughout most of the recovered sequence. 
Combined with the great time span, the exquisite temporal resolution of these sediments promises to greatly increase our understanding of tropical climate and ecosystem dynamics, and create a long-awaited equatorial counterpart to the high-latitude climate records extracted from the ice sheets of Greenland

  13. A phylogenetic Kalman filter for ancestral trait reconstruction using molecular data.

    Science.gov (United States)

    Lartillot, Nicolas

    2014-02-15

    Correlation between life history or ecological traits and genomic features such as nucleotide or amino acid composition can be used for reconstructing the evolutionary history of the traits of interest along phylogenies. Thus far, however, such ancestral reconstructions have been done using simple linear regression approaches that do not account for phylogenetic inertia. These reconstructions could instead be seen as a genuine comparative regression problem, such as formalized by classical generalized least-square comparative methods, in which the trait of interest and the molecular predictor are represented as correlated Brownian characters coevolving along the phylogeny. Here, a Bayesian sampler is introduced, representing an alternative and more efficient algorithmic solution to this comparative regression problem, compared with currently existing generalized least-square approaches. Technically, ancestral trait reconstruction based on a molecular predictor is shown to be formally equivalent to a phylogenetic Kalman filter problem, for which backward and forward recursions are developed and implemented in the context of a Markov chain Monte Carlo sampler. The comparative regression method results in more accurate reconstructions and a more faithful representation of uncertainty, compared with simple linear regression. Application to the reconstruction of the evolution of optimal growth temperature in Archaea, using GC composition in ribosomal RNA stems and amino acid composition of a sample of protein-coding genes, confirms previous findings, in particular, pointing to a hyperthermophilic ancestor for the kingdom. The program is freely available at www.phylobayes.org.
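The backward and forward recursions mentioned in the abstract are the standard Kalman filter and smoother recursions, applied along a phylogeny. As a rough illustration only (not the paper's implementation, which works on a full tree and is embedded in an MCMC sampler; see www.phylobayes.org for the actual program), the sketch below runs a 1-D Kalman filter with a Rauch-Tung-Striebel backward smoother along a single root-to-tip lineage, modeling the trait as a Brownian random walk observed with noise. All variable names, variances and data values are hypothetical.

```python
import numpy as np

def kalman_smooth(obs, obs_var, drift_var):
    """Forward Kalman filter plus Rauch-Tung-Striebel backward smoother
    for a 1-D Brownian (random-walk) trait observed with noise at each
    node of a single root-to-tip lineage."""
    n = len(obs)
    m, v = np.zeros(n), np.zeros(n)               # filtered means / variances
    m[0], v[0] = obs[0], obs_var                  # start at the first node
    for t in range(1, n):
        mp, vp = m[t - 1], v[t - 1] + drift_var   # predict (Brownian step)
        k = vp / (vp + obs_var)                   # Kalman gain
        m[t] = mp + k * (obs[t] - mp)             # update with observation
        v[t] = (1 - k) * vp
    ms, vs = m.copy(), v.copy()                   # backward (smoothing) pass
    for t in range(n - 2, -1, -1):
        c = v[t] / (v[t] + drift_var)             # smoother gain
        ms[t] = m[t] + c * (ms[t + 1] - m[t])
        vs[t] = v[t] + c**2 * (vs[t + 1] - (v[t] + drift_var))
    return ms, vs

# hypothetical noisy trait values along one lineage, root to tip
ms, vs = kalman_smooth(np.array([0.0, 0.4, 1.1, 0.9]),
                       obs_var=0.2, drift_var=0.1)
```

The smoothed means `ms` combine information flowing from both ends of the lineage, which is what distinguishes ancestral reconstruction from pure filtering.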

  14. Reconstruction of pollution history of organic contaminants in the upper Gulf of Thailand by using sediment cores: first report from Tropical Asia Core (TACO) project.

    Science.gov (United States)

    Boonyatumanond, Ruchaya; Wattayakorn, Gullaya; Amano, Atsuko; Inouchi, Yoshio; Takada, Hideshige

    2007-05-01

This paper reports the first reconstruction of a pollution history in tropical Asia from sediment cores. Four sediment core samples were collected from an offshore transect in the upper Gulf of Thailand and were analyzed for organic micropollutants. The cores were dated by measurement of (137)Cs and geochronometric molecular markers (linear alkylbenzenes, LABs; and tetrapropylene-type alkylbenzenes, TABs). Polychlorinated biphenyl (PCB) concentrations showed a subsurface maximum in layers corresponding to the 1970s, indicating the effectiveness of regulation of PCBs in Thailand. LAB concentrations increased over time, indicating the increase in input of sewage into the Gulf during the last 30 years. Hopanes, biomarkers of petroleum pollution, also increased over time, indicating that the inputs of automobile-derived hydrocarbons to the coastal zone have been increasing owing to the increased number of cars in Thailand since the 1950s. Polycyclic aromatic hydrocarbons (PAHs) increased in the layers corresponding to the 1950s and 1960s, probably because of the increased inputs of automobile-derived PAHs. PAH concentrations in the upper layers corresponding to the 1970s and later remained constant or increased. The absence of a subsurface maximum of PAHs contrasts with results observed in industrialized countries. This can be explained by the facts that the Thai economy did not depend on coal as an energy source in the 1960s and that economic growth has continued since the 1970s to the present. The deposition flux of PAHs and hopanes showed a dramatic offshore decrease, whereas that of LABs was uniform.
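Dating a core with a 137Cs marker works because the 1963 atmospheric-fallout maximum pins one depth to one calendar year. A minimal sketch of the resulting age-depth model, assuming a constant sedimentation rate between the core top and the peak (the depths and collection year below are invented, not from this study):

```python
def age_depth(depths_cm, peak_depth_cm, collection_year, peak_year=1963):
    """Assign calendar ages to core depths by linear interpolation,
    assuming a constant sedimentation rate anchored by the core top
    (collection year) and the depth of the 1963 137Cs fallout maximum."""
    rate = peak_depth_cm / (collection_year - peak_year)   # cm per year
    return [collection_year - d / rate for d in depths_cm]

# hypothetical core collected in 2003 with the 137Cs peak at 20 cm depth
ages = age_depth([0, 10, 20, 30], peak_depth_cm=20, collection_year=2003)
```

With those numbers the rate is 0.5 cm/yr, so 10 cm corresponds to 1983 and 30 cm to 1943; real chronologies add further tie points (e.g. the LAB/TAB markers mentioned above) rather than relying on a single anchor.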

  15. Reconstructing the history of maize streak virus strain a dispersal to reveal diversification hot spots and its origin in southern Africa.

    Science.gov (United States)

    Monjane, Adérito L; Harkins, Gordon W; Martin, Darren P; Lemey, Philippe; Lefeuvre, Pierre; Shepherd, Dionne N; Oluwafemi, Sunday; Simuyandi, Michelo; Zinga, Innocent; Komba, Ephrem K; Lakoutene, Didier P; Mandakombo, Noella; Mboukoulida, Joseph; Semballa, Silla; Tagne, Appolinaire; Tiendrébéogo, Fidèle; Erdmann, Julia B; van Antwerpen, Tania; Owor, Betty E; Flett, Bradley; Ramusi, Moses; Windram, Oliver P; Syed, Rizwan; Lett, Jean-Michel; Briddon, Rob W; Markham, Peter G; Rybicki, Edward P; Varsani, Arvind

    2011-09-01

    Maize streak virus strain A (MSV-A), the causal agent of maize streak disease, is today one of the most serious biotic threats to African food security. Determining where MSV-A originated and how it spread transcontinentally could yield valuable insights into its historical emergence as a crop pathogen. Similarly, determining where the major extant MSV-A lineages arose could identify geographical hot spots of MSV evolution. Here, we use model-based phylogeographic analyses of 353 fully sequenced MSV-A isolates to reconstruct a plausible history of MSV-A movements over the past 150 years. We show that since the probable emergence of MSV-A in southern Africa around 1863, the virus spread transcontinentally at an average rate of 32.5 km/year (95% highest probability density interval, 15.6 to 51.6 km/year). Using distinctive patterns of nucleotide variation caused by 20 unique intra-MSV-A recombination events, we tentatively classified the MSV-A isolates into 24 easily discernible lineages. Despite many of these lineages displaying distinct geographical distributions, it is apparent that almost all have emerged within the past 4 decades from either southern or east-central Africa. Collectively, our results suggest that regular analysis of MSV-A genomes within these diversification hot spots could be used to monitor the emergence of future MSV-A lineages that could affect maize cultivation in Africa.

  16. Thermal-history reconstruction of the Baiyun Sag in the deep-water area of the Pearl River Mouth Basin, northern South China Sea

    Science.gov (United States)

    Tang, Xiaoyin; Yang, Shuchun; Hu, Shengbiao

    2017-11-01

The Baiyun Sag, located in the deep-water area of the northern South China Sea, is the largest and deepest subbasin in the Pearl River Mouth Basin and one of the most important hydrocarbon-accumulation depression areas in China. Thermal history is widely thought to be of great importance in oil and gas potential assessment of a basin, as it controls the timing of hydrocarbon generation and expulsion from the source rock. In order to unravel the paleo-heat flow of the Baiyun Sag, we first analyzed the tectonic subsidence of 55 pseudo-wells constructed from newly interpreted seismic profiles, along with three drilled wells. We then carried out thermal modeling using the multi-stage finite stretching method and calibrated the results against measured present-day vitrinite reflectance and temperature data. Results indicate that the first and second heating episodes of the Baiyun Sag after 49 Ma ceased at 33.9 Ma and 23 Ma, respectively. Reconstructed average basal paleo-heat flow values at the end of the rifting periods are 57.7-86.2 mW/m² and 66.7-97.3 mW/m², respectively. Following the last heating period at 23 Ma, the study area has undergone a persistent thermal attenuation phase, and basal heat flow has declined to 64.0-79.2 mW/m² at present.

  17. Reconstruction of pollution history of organic contaminants in the upper Gulf of Thailand by using sediment cores: First report from Tropical Asia Core (TACO) project

    International Nuclear Information System (INIS)

    Boonyatumanond, Ruchaya; Wattayakorn, Gullaya; Amano, Atsuko; Inouchi, Yoshio; Takada, Hideshige

    2007-01-01

This paper reports the first reconstruction of a pollution history in tropical Asia from sediment cores. Four sediment core samples were collected from an offshore transect in the upper Gulf of Thailand and were analyzed for organic micropollutants. The cores were dated by measurement of 137Cs and geochronometric molecular markers (linear alkylbenzenes, LABs; and tetrapropylene-type alkylbenzenes, TABs). Polychlorinated biphenyl (PCB) concentrations showed a subsurface maximum in layers corresponding to the 1970s, indicating the effectiveness of regulation of PCBs in Thailand. LAB concentrations increased over time, indicating the increase in input of sewage into the Gulf during the last 30 years. Hopanes, biomarkers of petroleum pollution, also increased over time, indicating that the inputs of automobile-derived hydrocarbons to the coastal zone have been increasing owing to the increased number of cars in Thailand since the 1950s. Polycyclic aromatic hydrocarbons (PAHs) increased in the layers corresponding to the 1950s and 1960s, probably because of the increased inputs of automobile-derived PAHs. PAH concentrations in the upper layers corresponding to the 1970s and later remained constant or increased. The absence of a subsurface maximum of PAHs contrasts with results observed in industrialized countries. This can be explained by the facts that the Thai economy did not depend on coal as an energy source in the 1960s and that economic growth has continued since the 1970s to the present. The deposition flux of PAHs and hopanes showed a dramatic offshore decrease, whereas that of LABs was uniform.

  18. Bayesian data analysis for newcomers.

    Science.gov (United States)

    Kruschke, John K; Liddell, Torrin M

    2018-02-01

    This article explains the foundational concepts of Bayesian data analysis using virtually no mathematical notation. Bayesian ideas already match your intuitions from everyday reasoning and from traditional data analysis. Simple examples of Bayesian data analysis are presented that illustrate how the information delivered by a Bayesian analysis can be directly interpreted. Bayesian approaches to null-value assessment are discussed. The article clarifies misconceptions about Bayesian methods that newcomers might have acquired elsewhere. We discuss prior distributions and explain how they are not a liability but an important asset. We discuss the relation of Bayesian data analysis to Bayesian models of mind, and we briefly discuss what methodological problems Bayesian data analysis is not meant to solve. After you have read this article, you should have a clear sense of how Bayesian data analysis works and the sort of information it delivers, and why that information is so intuitive and useful for drawing conclusions from data.
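The kind of "simple example" the article describes is well illustrated by a conjugate beta-binomial update, where the posterior can be written down directly. This toy example is mine, not taken from the article; the prior and data values are hypothetical.

```python
from math import sqrt

def beta_binomial_update(a, b, successes, failures):
    """Conjugate update: a Beta(a, b) prior plus binomial data yields a
    Beta(a + successes, b + failures) posterior."""
    return a + successes, b + failures

# flat Beta(1, 1) prior, then observe 7 successes in 10 trials
a, b = beta_binomial_update(1, 1, 7, 3)
post_mean = a / (a + b)                             # = 2/3
post_sd = sqrt(a * b / ((a + b)**2 * (a + b + 1)))  # Beta posterior sd
```

The point the article makes carries over directly: the posterior mean and spread are immediately interpretable as beliefs about the success rate, with no reference to sampling distributions of test statistics.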

  19. Bayesian methods for data analysis

    CERN Document Server

    Carlin, Bradley P.

    2009-01-01

Approaches for statistical inference: Introduction; Motivating Vignettes; Defining the Approaches; The Bayes-Frequentist Controversy; Some Basic Bayesian Models. The Bayes approach: Introduction; Prior Distributions; Bayesian Inference; Hierarchical Modeling; Model Assessment; Nonparametric Methods. Bayesian computation: Introduction; Asymptotic Methods; Noniterative Monte Carlo Methods; Markov Chain Monte Carlo Methods. Model criticism and selection: Bayesian Modeling; Bayesian Robustness; Model Assessment; Bayes Factors via Marginal Density Estimation; Bayes Factors

  20. Statistics: a Bayesian perspective

    National Research Council Canada - National Science Library

    Berry, Donald A

    1996-01-01

...: it is the only introductory textbook based on Bayesian ideas, it combines concepts and methods, it presents statistics as a means of integrating data into the scientific process, it develops ideas...

  1. Noncausal Bayesian Vector Autoregression

    DEFF Research Database (Denmark)

    Lanne, Markku; Luoto, Jani

    We propose a Bayesian inferential procedure for the noncausal vector autoregressive (VAR) model that is capable of capturing nonlinearities and incorporating effects of missing variables. In particular, we devise a fast and reliable posterior simulator that yields the predictive distribution...

  2. Practical Bayesian tomography

    Science.gov (United States)

    Granade, Christopher; Combes, Joshua; Cory, D. G.

    2016-03-01

    In recent years, Bayesian methods have been proposed as a solution to a wide range of issues in quantum state and process tomography. State-of-the-art Bayesian tomography solutions suffer from three problems: numerical intractability, a lack of informative prior distributions, and an inability to track time-dependent processes. Here, we address all three problems. First, we use modern statistical methods, as pioneered by Huszár and Houlsby (2012 Phys. Rev. A 85 052120) and by Ferrie (2014 New J. Phys. 16 093035), to make Bayesian tomography numerically tractable. Our approach allows for practical computation of Bayesian point and region estimators for quantum states and channels. Second, we propose the first priors on quantum states and channels that allow for including useful experimental insight. Finally, we develop a method that allows tracking of time-dependent states and estimates the drift and diffusion processes affecting a state. We provide source code and animated visual examples for our methods.
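The numerical workhorse behind this line of work (Huszár and Houlsby; Ferrie) is sequential Monte Carlo: represent the posterior by weighted particles and reweight them by the likelihood of each new measurement outcome. The sketch below is a bare-bones version of that update step on a classical toy model (a coin bias standing in for a quantum state parameter); it omits the resampling and region-estimation machinery of the actual paper, and all particle values and data are hypothetical.

```python
def smc_update(particles, weights, outcome, likelihood):
    """One Bayesian update: reweight each particle by the likelihood of
    the observed outcome, then renormalize."""
    w = [wi * likelihood(p, outcome) for p, wi in zip(particles, weights)]
    total = sum(w)
    return [wi / total for wi in w]

particles = [i / 100 for i in range(1, 100)]       # candidate coin biases
weights = [1 / len(particles)] * len(particles)    # flat prior
lik = lambda p, x: p if x == 1 else 1 - p          # Bernoulli likelihood
for outcome in [1, 1, 1, 0]:                       # observed coin flips
    weights = smc_update(particles, weights, outcome, lik)
post_mean = sum(p * w for p, w in zip(particles, weights))
```

After three heads and one tail the weighted particles approximate a Beta(4, 2) posterior, so the posterior mean is close to 2/3; in tomography the particles are quantum states or channels and the likelihood comes from the Born rule.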

  3. Variational Bayesian Filtering

    Czech Academy of Sciences Publication Activity Database

    Šmídl, Václav; Quinn, A.

    2008-01-01

    Roč. 56, č. 10 (2008), s. 5020-5030 ISSN 1053-587X R&D Projects: GA MŠk 1M0572 Institutional research plan: CEZ:AV0Z10750506 Keywords : Bayesian filtering * particle filtering * Variational Bayes Subject RIV: BC - Control Systems Theory Impact factor: 2.335, year: 2008 http://library.utia.cas.cz/separaty/2008/AS/smidl-variational bayesian filtering.pdf

  4. Bayesian Networks An Introduction

    CERN Document Server

    Koski, Timo

    2009-01-01

    Bayesian Networks: An Introduction provides a self-contained introduction to the theory and applications of Bayesian networks, a topic of interest and importance for statisticians, computer scientists and those involved in modelling complex data sets. The material has been extensively tested in classroom teaching and assumes a basic knowledge of probability, statistics and mathematics. All notions are carefully explained and feature exercises throughout. Features include:.: An introduction to Dirichlet Distribution, Exponential Families and their applications.; A detailed description of learni

  5. Vegetation history and climate variability since 1.3kaBP reconstructed from high-resolution multiproxy analysis of mountainous peat sediment, Southeast China

    Science.gov (United States)

    Ma, Chunmei; Cui, Anning; Fang, Yiman; Zhao, Lin; Jia, Yulian

    2017-04-01

Climate change during the last two millennia is one of the most important focuses of the "Past Global Changes" (PAGES) initiative. In this study, vegetation history and climate variability since 1.3 ka BP were reconstructed from high-resolution multiproxy analysis of mountainous peat sediment from the central part of a swamp in Jiangxi Province, China. 210Pb, 137Cs and AMS 14C dating were used to build the age framework on the basis of the Bacon model. Pollen, humification degree (HD), loss-on-ignition (LOI), XRF-scanned elements and grain-size distribution were analyzed. During 637-800 AD, the vegetation consisted of upland herb taxa and scattered evergreen Quercus (Quercus E). However, the pollen concentration was very low and few plant genera were present. Since a harsh environment is not conducive to pollen preservation, the vegetation reconstructed from the pollen record may not reflect real climate change in this interval. During the Medieval Warm Period (MWP, 800-1250 AD), vegetation was abundant throughout the entire period: Quercus E was the dominant forest element, with sporadic Pinus and Castanopsis, and upland herbs grew vigorously beneath the forest. Peat began to accumulate on the high terrain of the basin, where wetland herbs grew vigorously. The climate during the MWP was warm and wet, with obvious secondary fluctuations. Dramatic vegetation changes were recorded during the Little Ice Age (LIA, 1340-1870 AD): the vegetation community was primarily dominated by Castanopsis, upland herbs thrived, and wetland herbs were sparse, fluctuating strongly with changes in humidity. Overall, during the LIA the temperature pattern featured four cold periods and three warm periods, and humidity conditions shifted from dry to wet. Periodic analysis of the moisture proxy (PCA 1) and temperature indicator (E/D: evergreen/deciduous tree pollen) shows cyclic fluctuations of 150 years in temperature and precipitation, which is
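The 150-year cyclicity reported here is the kind of signal a periodogram extracts from a proxy series. As a rough, self-contained illustration (not the authors' method: peat proxy records are unevenly sampled in time and are usually analyzed with Lomb-Scargle or wavelet techniques), the following sketch finds the dominant period of a synthetic, evenly sampled series with a built-in 150-year cycle; all numbers are invented.

```python
import numpy as np

def dominant_period(series, dt_years):
    """Period (in years) of the strongest spectral peak in an evenly
    sampled, mean-removed proxy series (simple FFT periodogram)."""
    x = np.asarray(series) - np.mean(series)
    power = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(len(x), d=dt_years)
    peak = np.argmax(power[1:]) + 1     # skip the zero-frequency bin
    return 1.0 / freqs[peak]

# synthetic proxy: a 150-year cycle sampled every 5 years over 1200 years
t = np.arange(0, 1200, 5)
proxy = (np.sin(2 * np.pi * t / 150)
         + 0.2 * np.random.default_rng(1).normal(size=t.size))
```

Applied to a real record, the interesting work is in significance testing of the peak against a red-noise background, which this sketch leaves out.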

  6. Entangled histories

    International Nuclear Information System (INIS)

    Cotler, Jordan; Wilczek, Frank

    2016-01-01

    We introduce quantum history states and their mathematical framework, thereby reinterpreting and extending the consistent histories approach to quantum theory. Through thought experiments, we demonstrate that our formalism allows us to analyze a quantum version of history in which we reconstruct the past by observations. In particular, we can pass from measurements to inferences about ‘what happened’ in a way that is sensible and free of paradox. Our framework allows for a richer understanding of the temporal structure of quantum theory, and we construct history states that embody peculiar, non-classical correlations in time. (paper)

  7. Reconstructing Native American population history.

    Science.gov (United States)

    Reich, David; Patterson, Nick; Campbell, Desmond; Tandon, Arti; Mazieres, Stéphane; Ray, Nicolas; Parra, Maria V; Rojas, Winston; Duque, Constanza; Mesa, Natalia; García, Luis F; Triana, Omar; Blair, Silvia; Maestre, Amanda; Dib, Juan C; Bravi, Claudio M; Bailliet, Graciela; Corach, Daniel; Hünemeier, Tábita; Bortolini, Maria Cátira; Salzano, Francisco M; Petzl-Erler, María Luiza; Acuña-Alonzo, Victor; Aguilar-Salinas, Carlos; Canizales-Quinteros, Samuel; Tusié-Luna, Teresa; Riba, Laura; Rodríguez-Cruz, Maricela; Lopez-Alarcón, Mardia; Coral-Vazquez, Ramón; Canto-Cetina, Thelma; Silva-Zolezzi, Irma; Fernandez-Lopez, Juan Carlos; Contreras, Alejandra V; Jimenez-Sanchez, Gerardo; Gómez-Vázquez, Maria José; Molina, Julio; Carracedo, Angel; Salas, Antonio; Gallo, Carla; Poletti, Giovanni; Witonsky, David B; Alkorta-Aranburu, Gorka; Sukernik, Rem I; Osipova, Ludmila; Fedorova, Sardana A; Vasquez, René; Villena, Mercedes; Moreau, Claudia; Barrantes, Ramiro; Pauls, David; Excoffier, Laurent; Bedoya, Gabriel; Rothhammer, Francisco; Dugoujon, Jean-Michel; Larrouy, Georges; Klitz, William; Labuda, Damian; Kidd, Judith; Kidd, Kenneth; Di Rienzo, Anna; Freimer, Nelson B; Price, Alkes L; Ruiz-Linares, Andrés

    2012-08-16

    The peopling of the Americas has been the subject of extensive genetic, archaeological and linguistic research; however, central questions remain unresolved. One contentious issue is whether the settlement occurred by means of a single migration or multiple streams of migration from Siberia. The pattern of dispersals within the Americas is also poorly understood. To address these questions at a higher resolution than was previously possible, we assembled data from 52 Native American and 17 Siberian groups genotyped at 364,470 single nucleotide polymorphisms. Here we show that Native Americans descend from at least three streams of Asian gene flow. Most descend entirely from a single ancestral population that we call 'First American'. However, speakers of Eskimo-Aleut languages from the Arctic inherit almost half their ancestry from a second stream of Asian gene flow, and the Na-Dene-speaking Chipewyan from Canada inherit roughly one-tenth of their ancestry from a third stream. We show that the initial peopling followed a southward expansion facilitated by the coast, with sequential population splits and little gene flow after divergence, especially in South America. A major exception is in Chibchan speakers on both sides of the Panama isthmus, who have ancestry from both North and South America.

  8. Reconstructing Native American Population History

    Science.gov (United States)

    Reich, David; Patterson, Nick; Campbell, Desmond; Tandon, Arti; Mazieres, Stéphane; Ray, Nicolas; Parra, Maria V.; Rojas, Winston; Duque, Constanza; Mesa, Natalia; García, Luis F.; Triana, Omar; Blair, Silvia; Maestre, Amanda; Dib, Juan C.; Bravi, Claudio M.; Bailliet, Graciela; Corach, Daniel; Hünemeier, Tábita; Bortolini, Maria-Cátira; Salzano, Francisco M.; Petzl-Erler, María Luiza; Acuña-Alonzo, Victor; Aguilar-Salinas, Carlos; Canizales-Quinteros, Samuel; Tusié-Luna, Teresa; Riba, Laura; Rodríguez-Cruz, Maricela; Lopez-Alarcón, Mardia; Coral-Vazquez, Ramón; Canto-Cetina, Thelma; Silva-Zolezzi, Irma; Fernandez-Lopez, Juan Carlos; Contreras, Alejandra V.; Jimenez-Sanchez, Gerardo; Gómez-Vázquez, María José; Molina, Julio; Carracedo, Ángel; Salas, Antonio; Gallo, Carla; Poletti, Giovanni; Witonsky, David B.; Alkorta-Aranburu, Gorka; Sukernik, Rem I.; Osipova, Ludmila; Fedorova, Sardana; Vasquez, René; Villena, Mercedes; Moreau, Claudia; Barrantes, Ramiro; Pauls, David; Excoffier, Laurent; Bedoya, Gabriel; Rothhammer, Francisco; Dugoujon, Jean Michel; Larrouy, Georges; Klitz, William; Labuda, Damian; Kidd, Judith; Kidd, Kenneth; Rienzo, Anna Di; Freimer, Nelson B.; Price, Alkes L.; Ruiz-Linares, Andrés

    2013-01-01

The peopling of the Americas has been the subject of extensive genetic, archaeological and linguistic research; however, central questions remain unresolved. One contentious issue is whether the settlement occurred via a single stream or multiple streams of migration from Siberia. The pattern of dispersals within the Americas is also poorly understood. To address these questions at higher resolution than was previously possible, we assembled data from 52 Native American and 17 Siberian groups genotyped at 364,470 single nucleotide polymorphisms. We show that Native Americans descend from at least three streams of Asian gene flow. Most descend entirely from a single ancestral population that we call "First American". However, speakers of Eskimo-Aleut languages from the Arctic inherit almost half their ancestry from a second stream of Asian gene flow, and the Na-Dene-speaking Chipewyan from Canada inherit roughly one-tenth of their ancestry from a third stream. We show that the initial peopling followed a southward expansion facilitated by the coast, with sequential population splits and little gene flow after divergence, especially in South America. A major exception is in Chibchan-speakers on both sides of the Panama Isthmus, who have ancestry from both North and South America. PMID:22801491

  9. Genome rearrangements and phylogeny reconstruction in Yersinia pestis.

    Science.gov (United States)

    Bochkareva, Olga O; Dranenko, Natalia O; Ocheredko, Elena S; Kanevsky, German M; Lozinsky, Yaroslav N; Khalaycheva, Vera A; Artamonova, Irena I; Gelfand, Mikhail S

    2018-01-01

Genome rearrangements have played an important role in the evolution of Yersinia pestis from its progenitor Yersinia pseudotuberculosis. Traditional phylogenetic trees for Y. pestis based on sequence comparison have short internal branches and low bootstrap supports, as only a small number of nucleotide substitutions have occurred. On the other hand, even a small number of genome rearrangements may resolve topological ambiguities in a phylogenetic tree. We reconstructed phylogenetic trees based on genome rearrangements using several popular approaches such as Maximum Likelihood for Gene Order and the Bayesian model of genome rearrangements by inversions. We also reconciled phylogenetic trees for each of the three CRISPR loci to obtain an integrated scenario of the CRISPR cassette evolution. Analysis of contradictions between the obtained evolutionary trees yielded numerous parallel inversions and gain/loss events. Our data indicate that an integrated analysis of sequence-based and inversion-based trees enhances the resolution of phylogenetic reconstruction. In contrast, reconstructions of strain relationships based solely on CRISPR loci may not be reliable, as the history is obscured by large deletions, obliterating the order of spacer gains. Similarly, numerous parallel gene losses preclude reconstruction of phylogeny based on gene content.
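Rearrangement-based phylogenetics starts from a notion of distance between gene orders. The inversion-based models used in the paper are involved, but a simpler, related quantity, the breakpoint distance between two signed gene orders, can be sketched in a few lines (this is an illustrative proxy, not the paper's method; the example gene orders are hypothetical).

```python
def breakpoint_distance(a, b):
    """Count adjacencies of signed gene order `a` that are not preserved
    (in either reading direction) in signed gene order `b` -- a simple
    proxy for rearrangement distance."""
    adj_b = set()
    for x, y in zip(b, b[1:]):
        adj_b.add((x, y))
        adj_b.add((-y, -x))   # the same adjacency read on the reverse strand
    return sum((x, y) not in adj_b for x, y in zip(a, a[1:]))

# identical orders share all adjacencies; inverting the block (2, 3)
# breaks exactly the two adjacencies flanking the inverted block
assert breakpoint_distance([1, 2, 3, 4], [1, 2, 3, 4]) == 0
assert breakpoint_distance([1, 2, 3, 4], [1, -3, -2, 4]) == 2
```

Pairwise distances of this kind can then feed a distance-based tree method, whereas the paper's likelihood and Bayesian approaches model the inversion events themselves.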

  10. Bayesian Exploratory Factor Analysis

    DEFF Research Database (Denmark)

    Conti, Gabriella; Frühwirth-Schnatter, Sylvia; Heckman, James J.

    2014-01-01

This paper develops and applies a Bayesian approach to Exploratory Factor Analysis that improves on ad hoc classical approaches. Our framework relies on dedicated factor models and simultaneously determines the number of factors, the allocation of each measurement to a unique factor, and the corresponding factor loadings. Classical identification criteria are applied and integrated into our Bayesian procedure to generate models that are stable and clearly interpretable. A Monte Carlo study confirms the validity of the approach. The method is used to produce interpretable low dimensional aggregates...

  11. Bayesian Decision Support

    Science.gov (United States)

    Berliner, M.

    2017-12-01

Bayesian statistical decision theory offers a natural framework for decision-policy making in the presence of uncertainty. Key advantages of the approach include efficient incorporation of information and observations. However, in complicated settings it is very difficult, perhaps essentially impossible, to formalize the mathematical inputs needed in the approach. Nevertheless, using the approach as a template is useful for decision support; that is, for organizing and communicating our analyses. Bayesian hierarchical modeling is valuable in quantifying and managing uncertainty in such cases. I review some aspects of the idea, emphasizing statistical model development and use in the context of sea-level rise.
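The core mechanic of Bayesian decision theory is picking the action that maximizes posterior expected utility. A toy sketch in the spirit of the sea-level example (all states, probabilities and losses below are invented for illustration, not from the abstract):

```python
def best_action(actions, states, posterior, utility):
    """Choose the action that maximizes posterior expected utility."""
    return max(actions,
               key=lambda a: sum(posterior[s] * utility(a, s) for s in states))

# hypothetical choice: protect infrastructure now at a fixed cost, or wait
# and risk a large loss if sea-level rise turns out to be high
posterior = {"high": 0.3, "low": 0.7}          # posterior over rise scenarios
losses = {("protect", "high"): -2, ("protect", "low"): -2,
          ("wait", "high"): -10, ("wait", "low"): 0}
choice = best_action(["protect", "wait"], list(posterior), posterior,
                     lambda a, s: losses[(a, s)])
```

With these numbers, protecting has expected utility -2 against -3 for waiting, so "protect" wins; the abstract's point is that specifying the posterior and the utilities honestly is the hard part, which is where hierarchical modeling enters.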

  12. Bayesian Exploratory Factor Analysis

    Science.gov (United States)

    Conti, Gabriella; Frühwirth-Schnatter, Sylvia; Heckman, James J.; Piatek, Rémi

    2014-01-01

    This paper develops and applies a Bayesian approach to Exploratory Factor Analysis that improves on ad hoc classical approaches. Our framework relies on dedicated factor models and simultaneously determines the number of factors, the allocation of each measurement to a unique factor, and the corresponding factor loadings. Classical identification criteria are applied and integrated into our Bayesian procedure to generate models that are stable and clearly interpretable. A Monte Carlo study confirms the validity of the approach. The method is used to produce interpretable low dimensional aggregates from a high dimensional set of psychological measurements. PMID:25431517

  13. Approximate Bayesian evaluations of measurement uncertainty

    Science.gov (United States)

    Possolo, Antonio; Bodnar, Olha

    2018-04-01

The Guide to the Expression of Uncertainty in Measurement (GUM) includes formulas that produce an estimate of a scalar output quantity that is a function of several input quantities, and an approximate evaluation of the associated standard uncertainty. This contribution presents approximate, Bayesian counterparts of those formulas for the case where the output quantity is a parameter of the joint probability distribution of the input quantities, also taking into account any information about the value of the output quantity available prior to measurement expressed in the form of a probability distribution on the set of possible values for the measurand. The approximate Bayesian estimates and uncertainty evaluations that we present have a long history and illustrious pedigree, and provide sufficiently accurate approximations in many applications, yet are very easy to implement in practice. Differently from exact Bayesian estimates, which involve either (analytical or numerical) integrations, or Markov Chain Monte Carlo sampling, the approximations that we describe involve only numerical optimization and simple algebra. Therefore, they make Bayesian methods widely accessible to metrologists. We illustrate the application of the proposed techniques in several instances of measurement: isotopic ratio of silver in a commercial silver nitrate; odds of cryptosporidiosis in AIDS patients; height of a manometer column; mass fraction of chromium in a reference material; and potential difference in a Zener voltage standard.
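"Only numerical optimization and simple algebra" is the hallmark of Laplace-type approximations: find the posterior mode, then convert the local curvature of the log-posterior into an approximate standard uncertainty. The sketch below is a generic grid-based version of that idea (not the specific GUM-counterpart formulas of the paper), checked on a conjugate-normal toy problem whose exact answer is known; all numbers are hypothetical.

```python
import numpy as np

def laplace_approx(log_post, grid):
    """Grid-based Laplace approximation: locate the posterior mode, then
    turn the local curvature of the log-posterior into an approximate
    standard uncertainty (i.e. fit a Gaussian at the mode)."""
    lp = np.array([log_post(th) for th in grid])
    i = int(np.argmax(lp))
    h = grid[1] - grid[0]
    curv = (lp[i + 1] - 2 * lp[i] + lp[i - 1]) / h**2   # d2/dth2 at the mode
    return grid[i], np.sqrt(-1.0 / curv)

# conjugate-normal check with a known answer: prior N(0, 1) and one
# observation y = 2 with sd 1 give posterior N(1, 1/sqrt(2))
log_post = lambda th: -0.5 * th**2 - 0.5 * (2.0 - th)**2
mode, sd = laplace_approx(log_post, np.linspace(-3, 3, 6001))
```

Because the toy log-posterior is exactly quadratic, the approximation here reproduces the exact posterior mean 1 and sd 1/sqrt(2); for non-Gaussian posteriors it is only accurate near the mode, which is the trade-off the abstract describes.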

  14. Bayesian hypothesis testing: Editorial to the Special Issue on Bayesian data analysis.

    Science.gov (United States)

    Hoijtink, Herbert; Chow, Sy-Miin

    2017-06-01

    In the past 20 years, there has been a steadily increasing attention and demand for Bayesian data analysis across multiple scientific disciplines, including psychology. Bayesian methods and the related Markov chain Monte Carlo sampling techniques offered renewed ways of handling old and challenging new problems that may be difficult or impossible to handle using classical approaches. Yet, such opportunities and potential improvements have not been sufficiently explored and investigated. This is 1 of 2 special issues in Psychological Methods dedicated to the topic of Bayesian data analysis, with an emphasis on Bayesian hypothesis testing, model comparison, and general guidelines for applications in psychology. In this editorial, we provide an overview of the use of Bayesian methods in psychological research and a brief history of the Bayes factor and the posterior predictive p value. Translational abstracts that summarize the articles in this issue in very clear and understandable terms are included in the Appendix. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
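The Bayes factor the editorial reviews compares how well two hypotheses predict the data via their marginal likelihoods. A minimal worked example for binomial data, testing a point null theta = 0.5 against a uniform Beta(1, 1) prior under the alternative (my own toy example with hypothetical counts, not from the editorial):

```python
from math import comb

def bayes_factor_01(k, n):
    """BF01 for k successes in n Bernoulli trials: point null theta = 0.5
    versus a uniform Beta(1, 1) prior on theta under H1."""
    m0 = comb(n, k) * 0.5**n   # marginal likelihood under H0
    m1 = 1.0 / (n + 1)         # under H1: integral of C(n,k) th^k (1-th)^(n-k)
    return m0 / m1

bf = bayes_factor_01(7, 10)    # 7 heads in 10 flips
```

Here bf is about 1.29, so the data very weakly favour the point null; unlike a p value, the same number can also be read as evidence for H0, which is one of the advantages the editorial highlights.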

  15. Bayesian phylogeography of the Arawak expansion in lowland South America.

    Science.gov (United States)

    Walker, Robert S; Ribeiro, Lincoln A

    2011-09-07

    Phylogenetic inference based on language is a vital tool for tracing the dynamics of human population expansions. The timescale of agriculture-based expansions around the world provides an informative amount of linguistic change ideal for reconstructing phylogeographies. Here we investigate the expansion of Arawak, one of the most widely dispersed language families in the Americas, scattered from the Antilles to Argentina. It has been suggested that Northwest Amazonia is the Arawak homeland based on the large number of diverse languages in the region. We generate language trees by coding cognates of basic vocabulary words for 60 Arawak languages and dialects to estimate the phylogenetic relationships among Arawak societies, while simultaneously implementing a relaxed random walk model to infer phylogeographic history. Estimates of the Arawak homeland exclude Northwest Amazonia and are bi-modal, with one potential homeland on the Atlantic seaboard and another more likely origin in Western Amazonia. Bayesian phylogeography better supports a Western Amazonian origin, and consequent dispersal to the Caribbean and across the lowlands. Importantly, the Arawak expansion carried with it not only language but also a number of cultural traits that contrast Arawak societies with other lowland cultures.

  16. Tectonic History and Deep Structure of the Demerara Plateau from Combined Wide-Angle and Reflection Seismic Data and Plate Kinematic Reconstructions

    Science.gov (United States)

    Klingelhoefer, F.; Museur, T.; Roest, W. R.; Graindorge, D.; Chauvet, F.; Loncke, L.; Basile, C.; Poetisi, E.; Deverchere, J.; Lebrun, J. F.; Perrot, J.; Heuret, A.

    2017-12-01

    Many transform margins have associated intermediate depth marginal plateaus, which are commonly located between two oceanic basins. The Demerara plateau is located offshore Surinam and French Guiana. Plate kinematic reconstructions show that the plateau is located between the central and equatorial Atlantic in a position conjugate to the Guinean Plateau. In the fall of 2016, the MARGATS cruise acquired geophysical data along the 400 km wide Demerara plateau. The main objective of the cruise was to image the deep structure of the Demerara plateau and to study its tectonic history. A set of 4 combined wide-angle and reflection seismic profiles was acquired along the plateau, using 80 ocean-bottom seismometers, a 3 km long seismic streamer and an 8000 cu inch tuned airgun array. Forward modelling of the wide-angle seismic data on a profile, located in the eastern part of the plateau and oriented in a NE-SW direction, images the crustal structure of the plateau, the transition zone and the neighbouring crust of oceanic origin, up to a depth of 40 km. The plateau itself is characterised by a crust of 30 km thickness, subdivided into three distinct layers. However, the velocities and velocity gradients do not fit typical continental crust, with a lower crustal layer showing atypically high velocities and an upper layer having a steep velocity gradient. From this model we propose that the lowermost layer is probably formed from volcanic underplated material and that the upper crustal layer likely consists of the corresponding extrusive volcanic material, forming thick seaward-dipping reflector sequences on the plateau. A basement high is imaged at the foot of the slope and forms the ocean-continent transition zone. Further oceanward, a 5-6 km thick crust is imaged with velocities and velocity gradients corresponding to a thin oceanic crust. A compilation of magnetic data from the MARGATS and 3 previous cruises shows a high amplitude magnetic anomaly along the northern

  17. Development and sustainability of NSF-funded climate change education efforts: lessons learned and strategies used to develop the Reconstructing Earth's Climate History (REaCH) curriculum (Invited)

    Science.gov (United States)

    St John, K. K.; Jones, M. H.; Leckie, R. M.; Pound, K. S.; Krissek, L. A.

    2013-12-01

    develop detailed instructor guides to accompany each module. After careful consideration of dissemination options, we chose to publish the full suite of exercise modules as a commercially-available book, Reconstructing Earth's Climate History, while also providing open online access to a subset of modules. Its current use in undergraduate paleoclimatology courses and the availability of select modules for use in other courses demonstrate that creative, hybrid options can be found for lasting dissemination, and thus sustainability. In achieving our goal of making science accessible, we believe we have followed a curriculum development process and sustainability path that can be used by others to meet needs in earth, ocean, and atmospheric science education. Next steps for REaCH include exploration of its use in blended learning classrooms, and at minority serving institutions.

  18. Reconstructing Southern Greenland Ice Sheet History During the Plio-Pleistocene Intensification of Northern Hemisphere Glaciation: Insights from IODP Site U1307

    Science.gov (United States)

    Blake-Mizen, K. R.; Hatfield, R. G.; Carlson, A. E.; Walczak, M. H.; Stoner, J. S.; Xuan, C.; Lawrence, K. T.; Bailey, I.

    2017-12-01

    Should it melt entirely, the Greenland Ice Sheet (GrIS) has the potential to raise global sea-level by 7 metres. With the Arctic continuing to warm at a remarkable rate, to better understand how the GrIS will respond to future anthropogenically-induced climate change we must constrain its natural variability in the geological past. In this regard, much uncertainty exists surrounding its pre-Quaternary history; particularly during the mid-Piacenzian warm period (mPWP; 3.3-3.0 Ma) - widely considered an analogue for near-future equilibrium climate with modern atmospheric CO2 levels and elevated temperatures relative to today - and the late Pliocene/early Pleistocene onset of widespread Northern Hemisphere glaciation (NHG, 2.7 Ma). GrIS reconstructions for these intervals have been largely hampered by a lack of well-dated, high-resolution records from suitable sites. To address this, we present new high-resolution, multi-proxy records from IODP Site U1307, a North Atlantic marine sediment core recovered from the Eirik Drift just south of Greenland. Generation of a new high-resolution relative palaeointensity (RPI)-based age-model - representing the first of its kind for high-latitude sediments deposited during NHG - has enabled strong orbital age control. Our ice-rafted debris (IRD) record confirms a 2.72 Ma initiation of major southern GrIS marine-terminating glaciations, which appear to persist even through interglacial periods up to at least 2.24 Ma. XRF-scanning and IRD evidence suggests, however, that an ephemeral ice-cap of likely considerable size persisted on southern Greenland prior to the mPWP. These data, together with the analysed provenance of individual IRD, indicate marine-based GrIS margins extended southward over the NHG interval and only occurred on Greenland's southern tip from 2.7 Ma. Despite a large increase in the deposition of GrIS-derived IRD from this time, bulk sedimentation rates and magnetic grain-size dropped significantly, implying that

  19. Bayesian methods for hackers probabilistic programming and Bayesian inference

    CERN Document Server

    Davidson-Pilon, Cameron

    2016-01-01

    Bayesian methods of inference are deeply natural and extremely powerful. However, most discussions of Bayesian inference rely on intensely complex mathematical analyses and artificial examples, making the subject inaccessible to anyone without a strong mathematical background. Now, though, Cameron Davidson-Pilon introduces Bayesian inference from a computational perspective, bridging theory to practice and freeing you to get results using computing power. Bayesian Methods for Hackers illuminates Bayesian inference through probabilistic programming with the powerful PyMC language and the closely related Python tools NumPy, SciPy, and Matplotlib. Using this approach, you can reach effective solutions in small increments, without extensive mathematical intervention. Davidson-Pilon begins by introducing the concepts underlying Bayesian inference, comparing it with other techniques and guiding you through building and training your first Bayesian model. Next, he introduces PyMC through a series of detailed examples a...

  20. Bayesian logistic regression analysis

    NARCIS (Netherlands)

    Van Erp, H.R.N.; Van Gelder, P.H.A.J.M.

    2012-01-01

    In this paper we present a Bayesian logistic regression analysis. It is found that if one wishes to derive the posterior distribution of the probability of some event, then, together with the traditional Bayes Theorem and the integrating out of nuissance parameters, the Jacobian transformation is an
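
    The entry above is truncated, but the core idea of Bayesian logistic regression, deriving a posterior distribution over the regression coefficients rather than point estimates, can be sketched with a minimal random-walk Metropolis sampler. This is an illustrative sketch, not the paper's method; the synthetic data, the flat prior, and the sampler settings are all assumptions made here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: binary outcomes whose log-odds follow -1 + 2*x.
x = rng.normal(size=200)
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-(-1.0 + 2.0 * x))))

def log_posterior(beta):
    """Log posterior up to a constant; a flat prior leaves the log-likelihood."""
    eta = beta[0] + beta[1] * x
    return np.sum(y * eta - np.logaddexp(0.0, eta))  # stable Bernoulli log-likelihood

# Random-walk Metropolis over (intercept, slope).
beta, lp = np.zeros(2), log_posterior(np.zeros(2))
samples = []
for _ in range(5000):
    prop = beta + 0.2 * rng.normal(size=2)
    lp_prop = log_posterior(prop)
    if np.log(rng.uniform()) < lp_prop - lp:  # accept with prob min(1, ratio)
        beta, lp = prop, lp_prop
    samples.append(beta)

post = np.array(samples[1000:])  # discard burn-in
print(post.mean(axis=0))         # posterior means near the true (-1, 2)
```

    In practice one would use a probabilistic programming tool such as PyMC or Stan and proper convergence diagnostics rather than a fixed burn-in; the point here is only the mechanics of sampling a posterior over coefficients.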

  1. Bayesian statistical inference

    Directory of Open Access Journals (Sweden)

    Bruno De Finetti

    2017-04-01

    Full Text Available This work was translated into English and published in the volume: Bruno De Finetti, Induction and Probability, Biblioteca di Statistica, eds. P. Monari, D. Cocchi, Clueb, Bologna, 1993. Bayesian Statistical Inference is one of the last fundamental philosophical papers in which we can find the essence of De Finetti's approach to statistical inference.

  2. Bayesian optimization for materials science

    CERN Document Server

    Packwood, Daniel

    2017-01-01

    This book provides a short and concise introduction to Bayesian optimization specifically for experimental and computational materials scientists. After explaining the basic idea behind Bayesian optimization and some applications to materials science in Chapter 1, the mathematical theory of Bayesian optimization is outlined in Chapter 2. Finally, Chapter 3 discusses an application of Bayesian optimization to a complicated structure optimization problem in computational surface science. Bayesian optimization is a promising global optimization technique that originates in the field of machine learning and is starting to gain attention in materials science. For the purpose of materials design, Bayesian optimization can be used to predict new materials with novel properties without extensive screening of candidate materials. For the purpose of computational materials science, Bayesian optimization can be incorporated into first-principles calculations to perform efficient, global structure optimizations. While re...

  3. Bayesian Independent Component Analysis

    DEFF Research Database (Denmark)

    Winther, Ole; Petersen, Kaare Brandt

    2007-01-01

    In this paper we present an empirical Bayesian framework for independent component analysis. The framework provides estimates of the sources, the mixing matrix and the noise parameters, and is flexible with respect to choice of source prior and the number of sources and sensors. Inside the engine … in a Matlab toolbox, is demonstrated for non-negative decompositions and compared with non-negative matrix factorization.

  4. Bayesian coronal seismology

    Science.gov (United States)

    Arregui, Iñigo

    2018-01-01

    In contrast to the situation in a laboratory, the study of the solar atmosphere has to be pursued without direct access to the physical conditions of interest. Information is therefore incomplete and uncertain, and inference methods need to be employed to diagnose the physical conditions and processes. One such method, solar atmospheric seismology, makes use of observed and theoretically predicted properties of waves to infer plasma and magnetic field properties. A recent development in solar atmospheric seismology is the use of inversion and model comparison methods based on Bayesian analysis. In this paper, the philosophy and methodology of Bayesian analysis are first explained. Then, we provide an account of what has been achieved so far from the application of these techniques to solar atmospheric seismology and a prospect of possible future extensions.

  5. Bayesian community detection.

    Science.gov (United States)

    Mørup, Morten; Schmidt, Mikkel N

    2012-09-01

    Many networks of scientific interest naturally decompose into clusters or communities with comparatively fewer external than internal links; however, current Bayesian models of network communities do not enforce this intuitive notion of communities. We formulate a nonparametric Bayesian model for community detection consistent with an intuitive definition of communities and present a Markov chain Monte Carlo procedure for inferring the community structure. A Matlab toolbox with the proposed inference procedure is available for download. On synthetic and real networks, our model detects communities consistent with ground truth, and on real networks, it outperforms existing approaches in predicting missing links. This suggests that community structure is an important structural property of networks that should be explicitly modeled.

  6. Probability and Bayesian statistics

    CERN Document Server

    1987-01-01

    This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics", which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985, the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are published especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum, ranging from foundations of probability, through psychological aspects of formulating subjective probability statements and abstract measure-theoretical considerations, to contributions to theoretical statistics an...

  7. Bayesian Hypothesis Testing

    Energy Technology Data Exchange (ETDEWEB)

    Andrews, Stephen A. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Sigeti, David E. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-11-15

    These are a set of slides about Bayesian hypothesis testing, where many hypotheses are tested. The conclusions are the following: The value of the Bayes factor obtained when using the median of the posterior marginal is almost the minimum value of the Bayes factor. The value of τ² which minimizes the Bayes factor is a reasonable choice for this parameter. This allows a likelihood ratio to be computed which is the least favorable to H0.
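
    The role of the prior-scale parameter τ² described in the slides can be illustrated in the simplest conjugate setting. The sketch below is an assumption-laden stand-in for the slides' actual model: it computes the Bayes factor for H0: μ = 0 against H1: μ ~ N(0, τ²) for a normal mean with known noise variance, then scans τ² for the value least favourable to H0 (the minimizing τ²).

```python
import numpy as np

def npdf(x, var):
    """Normal density with mean zero and variance var."""
    return np.exp(-0.5 * x * x / var) / np.sqrt(2.0 * np.pi * var)

def bayes_factor_01(xbar, n, sigma2, tau2):
    """BF_01 = p(xbar | H0) / p(xbar | H1) for H0: mu = 0 vs H1: mu ~ N(0, tau2)."""
    se2 = sigma2 / n                      # sampling variance of the mean
    return npdf(xbar, se2) / npdf(xbar, se2 + tau2)

# Illustrative data summary (assumed, not from the slides).
xbar, n, sigma2 = 0.5, 25, 1.0

tau2_grid = np.linspace(1e-4, 5.0, 2000)
bf = np.array([bayes_factor_01(xbar, n, sigma2, t) for t in tau2_grid])
tau2_min = tau2_grid[bf.argmin()]         # tau2 least favourable to H0
print(tau2_min, bf.min())                 # minimum near tau2 = xbar**2 - sigma2/n
```

    The minimizing τ² occurs where the marginal variance under H1 matches the squared observed mean, which is what makes the resulting Bayes factor a least-favorable bound for H0.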

  8. Bayesian networks in reliability

    Energy Technology Data Exchange (ETDEWEB)

    Langseth, Helge [Department of Mathematical Sciences, Norwegian University of Science and Technology, N-7491 Trondheim (Norway)]. E-mail: helgel@math.ntnu.no; Portinale, Luigi [Department of Computer Science, University of Eastern Piedmont ' Amedeo Avogadro' , 15100 Alessandria (Italy)]. E-mail: portinal@di.unipmn.it

    2007-01-15

    Over the last decade, Bayesian networks (BNs) have become a popular tool for modelling many kinds of statistical problems. We have also seen a growing interest for using BNs in the reliability analysis community. In this paper we will discuss the properties of the modelling framework that make BNs particularly well suited for reliability applications, and point to ongoing research that is relevant for practitioners in reliability.

  9. Subjective Bayesian Beliefs

    DEFF Research Database (Denmark)

    Antoniou, Constantinos; Harrison, Glenn W.; Lau, Morten I.

    2015-01-01

    A large literature suggests that many individuals do not apply Bayes’ Rule when making decisions that depend on them correctly pooling prior information and sample data. We replicate and extend a classic experimental study of Bayesian updating from psychology, employing the methods of experimental economics, with careful controls for the confounding effects of risk aversion. Our results show that risk aversion significantly alters inferences on deviations from Bayes’ Rule.

  10. Approximate Bayesian recursive estimation

    Czech Academy of Sciences Publication Activity Database

    Kárný, Miroslav

    2014-01-01

    Roč. 285, č. 1 (2014), s. 100-111 ISSN 0020-0255 R&D Projects: GA ČR GA13-13502S Institutional support: RVO:67985556 Keywords : Approximate parameter estimation * Bayesian recursive estimation * Kullback–Leibler divergence * Forgetting Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 4.038, year: 2014 http://library.utia.cas.cz/separaty/2014/AS/karny-0425539.pdf

  11. Reconstruction of the Transmission History of RNA Virus Outbreaks Using Full Genome Sequences: Foot-and-Mouth Disease Virus in Bulgaria in 2011

    DEFF Research Database (Denmark)

    Valdazo-González, Begoña; Polihronova, Lilyana; Alexandrov, Tsviatko

    2012-01-01

    Improvements to sequencing protocols and the development of computational phylogenetics have opened up opportunities to study the rapid evolution of RNA viruses in real time. In practical terms, these results can be combined with field data in order to reconstruct spatiotemporal scenarios that describe the origin and transmission pathways of viruses during an epidemic. In the case of notifiable diseases, such as foot-and-mouth disease (FMD), these analyses provide important insights into the epidemiology of field outbreaks that can support disease control programmes. This study reconstructs...

  12. Bayesian theory and applications

    CERN Document Server

    Dellaportas, Petros; Polson, Nicholas G; Stephens, David A

    2013-01-01

    The development of hierarchical models and Markov chain Monte Carlo (MCMC) techniques forms one of the most profound advances in Bayesian analysis since the 1970s and provides the basis for advances in virtually all areas of applied and theoretical Bayesian statistics. This volume guides the reader along a statistical journey that begins with the basic structure of Bayesian theory, and then provides details on most of the past and present advances in this field. The book has a unique format. There is an explanatory chapter devoted to each conceptual advance followed by journal-style chapters that provide applications or further advances on the concept. Thus, the volume is both a textbook and a compendium of papers covering a vast range of topics. It is appropriate for a well-informed novice interested in understanding the basic approach, methods and recent applications. Because of its advanced chapters and recent work, it is also appropriate for a more mature reader interested in recent applications and devel...

  13. Comparing and improving reconstruction methods for proxies based on compositional data

    Science.gov (United States)

    Nolan, C.; Tipton, J.; Booth, R.; Jackson, S. T.; Hooten, M.

    2017-12-01

    Many types of studies in paleoclimatology and paleoecology involve compositional data. Often, these studies aim to use compositional data to reconstruct an environmental variable of interest; the reconstruction is usually done via the development of a transfer function. Transfer functions have been developed using many different methods. Existing methods tend to relate the compositional data and the reconstruction target in very simple ways. Additionally, the results from different methods are rarely compared. Here we seek to address these two issues. First, we introduce a new hierarchical Bayesian multivariate Gaussian process model; this model allows for the relationship between each species in the compositional dataset and the environmental variable to be modeled in a way that captures the underlying complexities. Then, we compare this new method to machine learning techniques and commonly used existing methods. The comparisons are based on reconstructing the water table depth history of Caribou Bog (an ombrotrophic Sphagnum peat bog in Old Town, Maine, USA) from a new 7500 year long record of testate amoebae assemblages. The resulting reconstructions from different methods diverge in both their resulting means and uncertainties. In particular, uncertainty tends to be drastically underestimated by some common methods. These results will help to improve inference of water table depth from testate amoebae. Furthermore, this approach can be applied to test and improve inferences of past environmental conditions from a broad array of paleo-proxies based on compositional data.

  14. River history.

    Science.gov (United States)

    Vita-Finzi, Claudio

    2012-05-13

    During the last half century, advances in geomorphology-abetted by conceptual and technical developments in geophysics, geochemistry, remote sensing, geodesy, computing and ecology-have enhanced the potential value of fluvial history for reconstructing erosional and depositional sequences on the Earth and on Mars and for evaluating climatic and tectonic changes, the impact of fluvial processes on human settlement and health, and the problems faced in managing unstable fluvial systems. This journal is © 2012 The Royal Society

  15. Regional variation in otolith Sr:Ca ratios of African longfinned eel Anguilla mossambica and mottled eel Anguilla marmorata: a challenge to the classic tool for reconstructing migratory histories of fishes.

    Science.gov (United States)

    Lin, Y-J; Jessop, B M; Weyl, O L F; Iizuka, Y; Lin, S-H; Tzeng, W-N; Sun, C-L

    2012-07-01

    Otolith Sr:Ca ratios of the African longfinned eel Anguilla mossambica and giant mottled eel Anguilla marmorata from nine freshwater sites in four rivers of South Africa were analysed to reconstruct their migratory life histories between freshwater and saltwater habitats. For A. mossambica, the Sr:Ca ratios in the otolith edge differed significantly among rivers and had large effect sizes, but did not differ among sites within a river. Otolith Sr:Ca ratios did not differ among rivers for A. marmorata. When rivers were pooled, the edge Sr:Ca ratios of A. mossambica were not significantly different from those of A. marmorata. According to the river-specific critical Sr:Ca ratio distinguishing freshwater from saltwater residence, most A. mossambica and A. marmorata had saltwater habitat experience after settlement in fresh water. This was primarily during their elver stage or early in the yellow eel stage. During the middle and late yellow eel stage, freshwater residency was preferred and only sporadic visits were made to saltwater habitats. The data also suggest that regional variations in otolith Sr:Ca ratios affect the critical Sr:Ca value and are a challenge for the reconstruction of migratory life histories that should be explicitly considered to avoid bias and uncertainty. © 2012 The Authors. Journal of Fish Biology © 2012 The Fisheries Society of the British Isles.

  16. From the history of the recognitions of the remains to the reconstruction of the face of Dante Alighieri by means of techniques of virtual reality and forensic anthropology

    Directory of Open Access Journals (Sweden)

    Stefano Benazzi

    2007-07-01

    Full Text Available The work consists of the reconstruction of the face of the great poet Dante Alighieri through a multidisciplinary approach that combines traditional manual techniques, as usually used in forensic anthropology, with digital methodologies that take advantage of technologies originating in the manufacturing and military fields but increasingly applied to the cultural heritage field. Since the original skull of Dante could not be obtained, the work started from the data and the elements collected by Fabio Frassetto and Giuseppe Sergi, two important anthropologists at the Universities of Bologna and Rome respectively, in an investigation carried out in 1921, the sixth centenary of his death, on the remains of the poet held in Ravenna. Thanks to this, we have a very accurate description of Dante's bones, including 297 metric data points covering the whole skeleton, scaled photographs of the skull in the various norms and of many other bones, as well as a model of the skull subsequently realized by Frassetto. Based on this information, a geometric reconstruction of Dante Alighieri's skull, including the jaw, was carried out by employing and integrating virtual reality tools and technologies, and the corresponding physical model was then produced by rapid prototyping. An important aspect of the work concerns in particular the proposed 3D modelling methodology for the new reconstruction of the jaw (which was not found during the 1921 recognition), starting from a reference model. The prototyped skull model then served as the basis for the subsequent stage of facial reconstruction using the traditional techniques of forensic art.

  17. Applied Bayesian modelling

    CERN Document Server

    Congdon, Peter

    2014-01-01

    This book provides an accessible approach to Bayesian computing and data analysis, with an emphasis on the interpretation of real data sets. Following in the tradition of the successful first edition, this book aims to make a wide range of statistical modeling applications accessible using tested code that can be readily adapted to the reader's own applications. The second edition has been thoroughly reworked and updated to take account of advances in the field. A new set of worked examples is included. The novel aspect of the first edition was the coverage of statistical modeling using WinBUGS...

  18. Bayesian nonparametric data analysis

    CERN Document Server

    Müller, Peter; Jara, Alejandro; Hanson, Tim

    2015-01-01

    This book reviews nonparametric Bayesian methods and models that have proven useful in the context of data analysis. Rather than providing an encyclopedic review of probability models, the book’s structure follows a data analysis perspective. As such, the chapters are organized by traditional data analysis problems. In selecting specific nonparametric models, simpler and more traditional models are favored over specialized ones. The discussed methods are illustrated with a wealth of examples, including applications ranging from stylized examples to case studies from recent literature. The book also includes an extensive discussion of computational methods and details on their implementation. R code for many examples is included in on-line software pages.

  19. Classification using Bayesian neural nets

    NARCIS (Netherlands)

    J.C. Bioch (Cor); O. van der Meer; R. Potharst (Rob)

    1995-01-01

    textabstractRecently, Bayesian methods have been proposed for neural networks to solve regression and classification problems. These methods claim to overcome some difficulties encountered in the standard approach such as overfitting. However, an implementation of the full Bayesian approach to

  20. Bayesian Data Analysis (lecture 1)

    CERN Multimedia

    CERN. Geneva

    2018-01-01

    framework but we will also go into more detail and discuss for example the role of the prior. The second part of the lecture will cover further examples and applications that heavily rely on the bayesian approach, as well as some computational tools needed to perform a bayesian analysis.

  1. Bayesian Data Analysis (lecture 2)

    CERN Multimedia

    CERN. Geneva

    2018-01-01

    framework but we will also go into more detail and discuss for example the role of the prior. The second part of the lecture will cover further examples and applications that heavily rely on the bayesian approach, as well as some computational tools needed to perform a bayesian analysis.

  2. Delayed breast implant reconstruction

    DEFF Research Database (Denmark)

    Hvilsom, Gitte B.; Hölmich, Lisbet R.; Steding-Jessen, Marianne

    2011-01-01

    Studies of complications following reconstructive surgery with implants among women with breast cancer are needed. As the, to our knowledge, first prospective long-term study we evaluated the occurrence of complications following delayed breast reconstruction separately for one- and two-stage procedures. From the Danish Registry for Plastic Surgery of the Breast, which has prospectively registered data for women undergoing breast implantations since 1999, we identified 559 women without a history of radiation therapy undergoing 592 delayed breast reconstructions following breast cancer during … of reoperation was significantly higher following the one-stage procedure. For both procedures, the majority of reoperations were due to asymmetry or displacement of the implant. In conclusion, non-radiated one- and two-stage delayed breast implant reconstructions are associated with substantial risks...

  3. The Bayesian Covariance Lasso.

    Science.gov (United States)

    Khondker, Zakaria S; Zhu, Hongtu; Chu, Haitao; Lin, Weili; Ibrahim, Joseph G

    2013-04-01

    Estimation of sparse covariance matrices and their inverse subject to positive definiteness constraints has drawn a lot of attention in recent years. The abundance of high-dimensional data, where the sample size ( n ) is less than the dimension ( d ), requires shrinkage estimation methods since the maximum likelihood estimator is not positive definite in this case. Furthermore, when n is larger than d but not sufficiently larger, shrinkage estimation is more stable than maximum likelihood as it reduces the condition number of the precision matrix. Frequentist methods have utilized penalized likelihood methods, whereas Bayesian approaches rely on matrix decompositions or Wishart priors for shrinkage. In this paper we propose a new method, called the Bayesian Covariance Lasso (BCLASSO), for the shrinkage estimation of a precision (covariance) matrix. We consider a class of priors for the precision matrix that leads to the popular frequentist penalties as special cases, develop a Bayes estimator for the precision matrix, and propose an efficient sampling scheme that does not precalculate boundaries for positive definiteness. The proposed method is permutation invariant and performs shrinkage and estimation simultaneously for non-full rank data. Simulations show that the proposed BCLASSO performs similarly as frequentist methods for non-full rank data.

  4. Approximate Bayesian computation.

    Directory of Open Access Journals (Sweden)

    Mikael Sunnåker

    Full Text Available Approximate Bayesian computation (ABC) constitutes a class of computational methods rooted in Bayesian statistics. In all model-based statistical inference, the likelihood function is of central importance, since it expresses the probability of the observed data under a particular statistical model, and thus quantifies the support data lend to particular values of parameters and to choices among different models. For simple models, an analytical formula for the likelihood function can typically be derived. However, for more complex models, an analytical formula might be elusive or the likelihood function might be computationally very costly to evaluate. ABC methods bypass the evaluation of the likelihood function. In this way, ABC methods widen the realm of models for which statistical inference can be considered. ABC methods are mathematically well-founded, but they inevitably make assumptions and approximations whose impact needs to be carefully assessed. Furthermore, the wider application domain of ABC exacerbates the challenges of parameter estimation and model selection. ABC has rapidly gained popularity in recent years, in particular for the analysis of complex problems arising in the biological sciences (e.g., in population genetics, ecology, epidemiology, and systems biology).
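
    The mechanism the entry describes, bypassing the likelihood by simulating from the model, is easiest to see in rejection ABC, the simplest member of the family: draw parameters from the prior, simulate data, and keep only draws whose simulated summary statistic lands close to the observed one. The model, prior, summary statistic, and tolerance below are illustrative assumptions, not taken from the article.

```python
import numpy as np

rng = np.random.default_rng(1)

# "Observed" data with an unknown mean; the sample mean is the summary statistic.
observed = rng.normal(3.0, 1.0, size=100)
s_obs = observed.mean()

accepted = []
for _ in range(20000):
    theta = rng.uniform(-10.0, 10.0)        # draw from a flat prior
    sim = rng.normal(theta, 1.0, size=100)  # simulate data; no likelihood evaluated
    if abs(sim.mean() - s_obs) < 0.1:       # keep draws whose summary is close
        accepted.append(theta)

accepted = np.array(accepted)
print(accepted.mean(), accepted.size)       # approximate posterior mean near 3
```

    The accepted draws approximate the posterior; shrinking the tolerance improves the approximation at the cost of a lower acceptance rate, which is exactly the trade-off the more elaborate ABC variants try to manage.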

  5. Bayesian inference with ecological applications

    CERN Document Server

    Link, William A

    2009-01-01

    This text is written to provide a mathematically sound but accessible and engaging introduction to Bayesian inference specifically for environmental scientists, ecologists and wildlife biologists. It emphasizes the power and usefulness of Bayesian methods in an ecological context. The advent of fast personal computers and easily available software has simplified the use of Bayesian and hierarchical models. One obstacle remains for ecologists and wildlife biologists, namely the near absence of Bayesian texts written specifically for them. The book includes many relevant examples, is supported by software and examples on a companion website and will become an essential grounding in this approach for students and research ecologists. Engagingly written text specifically designed to demystify a complex subject. Examples drawn from ecology and wildlife research. An essential grounding for graduate and research ecologists in the increasingly prevalent Bayesian approach to inference. Companion website with analyt...

  6. Bayesian Inference on Gravitational Waves

    Directory of Open Access Journals (Sweden)

    Asad Ali

    2015-12-01

    The Bayesian approach is becoming increasingly popular among the astrophysics data analysis communities. However, the Pakistan statistics communities are unaware of this fertile interaction between the two disciplines. Bayesian methods have been in use to address astronomical problems since the very birth of Bayesian probability in the eighteenth century. Today the Bayesian methods for the detection and parameter estimation of gravitational waves have solid theoretical grounds and strong promise for realistic applications. This article aims to introduce the Pakistan statistics communities to the applications of Bayesian Monte Carlo methods in the analysis of gravitational wave data, with an overview of Bayesian signal detection and estimation methods and a demonstration with a couple of simplified examples.

  7. Profile reconstruction from neutron reflectivity data and a priori knowledge

    International Nuclear Information System (INIS)

    Leeb, H.

    2008-01-01

    The problem of incomplete and noisy information in profile reconstruction from neutron reflectometry data is considered. In particular, methods of Bayesian statistics in combination with modelling or inverse scattering techniques are considered in order to properly include the required a priori knowledge and obtain quantitatively reliable estimates of the reconstructed profiles. Applying Bayes' theorem, the results of different experiments on the same sample can be consistently included in the profile reconstruction.
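Consistently combining several experiments on the same sample via Bayes' theorem can be illustrated with a toy conjugate-Gaussian update, where the posterior from one measurement becomes the prior for the next. The scalar "thickness" parameter, the prior, and the measurement variances below are invented for the sketch and are not taken from any actual reflectometry model.

```python
# Sequential conjugate update: combine two noisy measurements of the same
# scalar quantity (say, a layer thickness) via Bayes' theorem.
def update(prior_mean, prior_var, meas, meas_var):
    """Posterior of a Gaussian-distributed quantity after one Gaussian
    measurement; the closed form follows from multiplying the two densities."""
    w = prior_var / (prior_var + meas_var)          # weight given to the data
    post_mean = prior_mean + w * (meas - prior_mean)
    post_var = prior_var * meas_var / (prior_var + meas_var)
    return post_mean, post_var

mean, var = 50.0, 100.0                      # broad prior: N(50, 10^2)
mean, var = update(mean, var, 42.0, 4.0)     # first experiment
mean, var = update(mean, var, 44.0, 9.0)     # second experiment, same sample
print(round(mean, 1), round(var, 2))
```

Each update shrinks the posterior variance, so adding a second experiment always sharpens the estimate; the order of the updates does not matter.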

  8. Vertex Reconstruction for AEGIS’ FACT Detector

    CERN Document Server

    Themistokleous, Neofytos

    2017-01-01

    My project dealt with the development of a vertex reconstruction technique to discriminate antihydrogen from background signals in the AEGIS apparatus. It involved the creation of a toy Monte Carlo to simulate particle annihilation events, and a vertex reconstruction utility based on the Bayesian theory of probability. The first results, based on 107 generated events with a single track in the detector, are encouraging. For such events, the algorithm can reconstruct the z-coordinate accurately, while for the r-coordinate the result is less accurate.

  9. Impact of Quaternary climatic changes and interspecific competition on the demographic history of a highly mobile generalist carnivore, the coyote.

    Science.gov (United States)

    Koblmüller, Stephan; Wayne, Robert K; Leonard, Jennifer A

    2012-08-23

    Recurrent cycles of climatic change during the Quaternary period have dramatically affected the population genetic structure of many species. We reconstruct the recent demographic history of the coyote (Canis latrans) through the use of Bayesian techniques to examine the effects of Late Quaternary climatic perturbations on the genetic structure of a highly mobile generalist species. Our analysis reveals a lack of phylogeographic structure throughout the range but past population size changes correlated with climatic changes. We conclude that even generalist carnivorous species are very susceptible to environmental changes associated with climatic perturbations. This effect may be enhanced in coyotes by interspecific competition with larger carnivores.

  10. Specialist and generalist symbionts show counterintuitive levels of genetic diversity and discordant demographic histories along the Florida Reef Tract

    Science.gov (United States)

    Titus, Benjamin M.; Daly, Marymegan

    2017-03-01

    Specialist and generalist life histories are expected to result in contrasting levels of genetic diversity at the population level, and symbioses are expected to lead to patterns that reflect a shared biogeographic history and co-diversification. We test these assumptions using mtDNA sequencing and a comparative phylogeographic approach for six co-occurring crustacean species that are symbiotic with sea anemones on western Atlantic coral reefs, yet vary in their host specificities: four are host specialists and two are host generalists. We first conducted species discovery analyses to delimit cryptic lineages, followed by classic population genetic diversity analyses for each delimited taxon, and then reconstructed the demographic history for each taxon using traditional summary statistics, Bayesian skyline plots, and approximate Bayesian computation to test for signatures of recent and concerted population expansion. The genetic diversity values recovered here contravene the expectations of the specialist-generalist variation hypothesis and classic population genetics theory; all specialist lineages had greater genetic diversity than generalists. Demography suggests recent population expansions in all taxa, although Bayesian skyline plots and approximate Bayesian computation suggest the timing and magnitude of these events were idiosyncratic. These results do not meet the a priori expectation of concordance among symbiotic taxa and suggest that intrinsic aspects of species biology may contribute more to phylogeographic history than extrinsic forces that shape whole communities. The recovery of two cryptic specialist lineages adds an additional layer of biodiversity to this symbiosis and contributes to an emerging pattern of cryptic speciation in the specialist taxa. Our results underscore the differences in the evolutionary processes acting on marine systems from the terrestrial processes that often drive theory. 
Finally, we continue to highlight the Florida Reef

  11. Bayesian nonparametric hierarchical modeling.

    Science.gov (United States)

    Dunson, David B

    2009-04-01

    In biomedical research, hierarchical models are very widely used to accommodate dependence in multivariate and longitudinal data and for borrowing of information across data from different sources. A primary concern in hierarchical modeling is sensitivity to parametric assumptions, such as linearity and normality of the random effects. Parametric assumptions on latent variable distributions can be challenging to check and are typically unwarranted, given available prior knowledge. This article reviews some recent developments in Bayesian nonparametric methods motivated by complex, multivariate and functional data collected in biomedical studies. The author provides a brief review of flexible parametric approaches relying on finite mixtures and latent class modeling. Dirichlet process mixture models are motivated by the need to generalize these approaches to avoid assuming a fixed finite number of classes. Focusing on an epidemiology application, the author illustrates the practical utility and potential of nonparametric Bayes methods.

  12. Bayesian grid matching

    DEFF Research Database (Denmark)

    Hartelius, Karsten; Carstensen, Jens Michael

    2003-01-01

    A method for locating distorted grid structures in images is presented. The method is based on the theories of template matching and Bayesian image restoration. The grid is modeled as a deformable template. Prior knowledge of the grid is described through a Markov random field (MRF) model which represents the spatial coordinates of the grid nodes. Knowledge of how grid nodes are depicted in the observed image is described through the observation model. The prior consists of a node prior and an arc (edge) prior, both modeled as Gaussian MRFs. The node prior models variations in the positions of grid nodes and the arc prior models variations in row and column spacing across the grid. Grid matching is done by placing an initial rough grid over the image and applying an ensemble annealing scheme to maximize the posterior distribution of the grid. The method can be applied to noisy images with missing...

  13. Bayesian supervised dimensionality reduction.

    Science.gov (United States)

    Gönen, Mehmet

    2013-12-01

    Dimensionality reduction is commonly used as a preprocessing step before training a supervised learner. However, coupled training of dimensionality reduction and supervised learning steps may improve the prediction performance. In this paper, we introduce a simple and novel Bayesian supervised dimensionality reduction method that combines linear dimensionality reduction and linear supervised learning in a principled way. We present both Gibbs sampling and variational approximation approaches to learn the proposed probabilistic model for multiclass classification. We also extend our formulation toward model selection using automatic relevance determination in order to find the intrinsic dimensionality. Classification experiments on three benchmark data sets show that the new model significantly outperforms seven baseline linear dimensionality reduction algorithms on very low dimensions in terms of generalization performance on test data. The proposed model also obtains the best results on an image recognition task in terms of classification and retrieval performances.

  14. Bayesian Geostatistical Design

    DEFF Research Database (Denmark)

    Diggle, Peter; Lophaven, Søren Nymand

    2006-01-01

    This paper describes the use of model-based geostatistics for choosing the set of sampling locations, collectively called the design, to be used in a geostatistical analysis. Two types of design situation are considered. These are retrospective design, which concerns the addition of sampling locations to, or deletion of locations from, an existing design, and prospective design, which consists of choosing positions for a new set of sampling locations. We propose a Bayesian design criterion which focuses on the goal of efficient spatial prediction whilst allowing for the fact that model parameter values are unknown. The results show that in this situation a wide range of interpoint distances should be included in the design, and the widely used regular design is often not the best choice.

  15. Delayed breast implant reconstruction

    DEFF Research Database (Denmark)

    Hvilsom, Gitte B.; Hölmich, Lisbet R.; Steding-Jessen, Marianne

    2012-01-01

    We evaluated the association between radiation therapy and severe capsular contracture or reoperation after 717 delayed breast implant reconstruction procedures (288 1- and 429 2-stage procedures) identified in the prospective database of the Danish Registry for Plastic Surgery of the Breast during...... of radiation therapy was associated with a non-significantly increased risk of reoperation after both 1-stage (HR = 1.4; 95% CI: 0.7-2.5) and 2-stage (HR = 1.6; 95% CI: 0.9-3.1) procedures. Reconstruction failure was highest (13.2%) in the 2-stage procedures with a history of radiation therapy. Breast...

  16. Bayesian adaptive methods for clinical trials

    CERN Document Server

    Berry, Scott M; Muller, Peter

    2010-01-01

    Already popular in the analysis of medical device trials, adaptive Bayesian designs are increasingly being used in drug development for a wide variety of diseases and conditions, from Alzheimer's disease and multiple sclerosis to obesity, diabetes, hepatitis C, and HIV. Written by leading pioneers of Bayesian clinical trial designs, Bayesian Adaptive Methods for Clinical Trials explores the growing role of Bayesian thinking in the rapidly changing world of clinical trial analysis. The book first summarizes the current state of clinical trial design and analysis and introduces the main ideas and potential benefits of a Bayesian alternative. It then gives an overview of basic Bayesian methodological and computational tools needed for Bayesian clinical trials. With a focus on Bayesian designs that achieve good power and Type I error, the next chapters present Bayesian tools useful in early (Phase I) and middle (Phase II) clinical trials as well as two recent Bayesian adaptive Phase II studies: the BATTLE and ISP...

  17. Current trends in Bayesian methodology with applications

    CERN Document Server

    Upadhyay, Satyanshu K; Dey, Dipak K; Loganathan, Appaia

    2015-01-01

    Collecting Bayesian material scattered throughout the literature, Current Trends in Bayesian Methodology with Applications examines the latest methodological and applied aspects of Bayesian statistics. The book covers biostatistics, econometrics, reliability and risk analysis, spatial statistics, image analysis, shape analysis, Bayesian computation, clustering, uncertainty assessment, high-energy astrophysics, neural networking, fuzzy information, objective Bayesian methodologies, empirical Bayes methods, small area estimation, and many more topics. Each chapter is self-contained and focuses on

  18. Bayesian Inference: with ecological applications

    Science.gov (United States)

    Link, William A.; Barker, Richard J.

    2010-01-01

    This text provides a mathematically rigorous yet accessible and engaging introduction to Bayesian inference with relevant examples that will be of interest to biologists working in the fields of ecology, wildlife management and environmental studies as well as students in advanced undergraduate statistics. This text opens the door to Bayesian inference, taking advantage of modern computational efficiencies and easily accessible software to evaluate complex hierarchical models.

  19. Bayesian image restoration, using configurations

    OpenAIRE

    Thorarinsdottir, Thordis

    2006-01-01

    In this paper, we develop a Bayesian procedure for removing noise from images that can be viewed as noisy realisations of random sets in the plane. The procedure utilises recent advances in configuration theory for noise free random sets, where the probabilities of observing the different boundary configurations are expressed in terms of the mean normal measure of the random set. These probabilities are used as prior probabilities in a Bayesian image restoration approach. Estimation of the re...

  20. Bayesian Cross-Entropy Reconstruction of Complex Images

    Science.gov (United States)

    1992-11-16

    Only figure-caption fragments survive in this record: "...over that in Fig. 4. The isolated point source is restored, and the edges are crisper, but suffer from overshoot artifacts just within the airplane outline. 6. GMEMK-median window output from data in Fig. 3. The modulus is shown. The reference image was as before. The edges are crisper than in Fig..."

  1. Marine Environmental History

    DEFF Research Database (Denmark)

    Poulsen, Bo

    2012-01-01

    This essay provides an overview of recent trends in the historiography of marine environmental history, a sub-field of environmental history which has grown tremendously in scope and size over the last c. 15 years. The object of marine environmental history is the changing relationship between human society and natural marine resources. Within this broad topic, several trends and objectives are discernible. The essay argues that the so-called material marine environmental history has its main focus on trying to reconstruct the presence, development and environmental impact of past fisheries and whaling operations. This ambition often entails a reconstruction also of how marine life has changed over time. The time frame ranges from the Paleolithic to the present era. The field of marine environmental history also includes a more culturally oriented environmental history, which mainly has come...

  2. Evaluation of Bayesian tensor estimation using tensor coherence

    Science.gov (United States)

    Kim, Dae-Jin; Kim, In-Young; Jeong, Seok-Oh; Park, Hae-Jeong

    2009-06-01

    Fiber tractography, a unique and non-invasive method to estimate axonal fibers within white matter, constructs putative streamlines from diffusion tensor MRI by interconnecting voxels according to the propagation direction defined by the diffusion tensor. This direction has uncertainties due to the properties of the underlying fiber bundles, neighboring structures and image noise. Therefore, robust estimation of the diffusion direction is essential to reconstruct reliable fiber pathways. For this purpose, we propose a tensor estimation method using a Bayesian framework, which includes an a priori probability distribution based on tensor coherence indices, to utilize both the neighborhood direction information and the inertia moment as regularization terms. The reliability of the proposed tensor estimation was evaluated using Monte Carlo simulations in terms of accuracy and precision, with four synthetic tensor fields at various SNRs and in vivo human data of brain and calf muscle. The proposed Bayesian estimation demonstrated robustness to noise and higher reliability compared to simple tensor regression.

  3. Evaluation of Bayesian tensor estimation using tensor coherence

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Dae-Jin; Park, Hae-Jeong [Laboratory of Molecular Neuroimaging Technology, Brain Korea 21 Project for Medical Science, Yonsei University, College of Medicine, Seoul (Korea, Republic of); Kim, In-Young [Department of Biomedical Engineering, Hanyang University, Seoul (Korea, Republic of); Jeong, Seok-Oh [Department of Statistics, Hankuk University of Foreign Studies, Yongin (Korea, Republic of)], E-mail: parkhj@yuhs.ac

    2009-06-21

    Fiber tractography, a unique and non-invasive method to estimate axonal fibers within white matter, constructs putative streamlines from diffusion tensor MRI by interconnecting voxels according to the propagation direction defined by the diffusion tensor. This direction has uncertainties due to the properties of the underlying fiber bundles, neighboring structures and image noise. Therefore, robust estimation of the diffusion direction is essential to reconstruct reliable fiber pathways. For this purpose, we propose a tensor estimation method using a Bayesian framework, which includes an a priori probability distribution based on tensor coherence indices, to utilize both the neighborhood direction information and the inertia moment as regularization terms. The reliability of the proposed tensor estimation was evaluated using Monte Carlo simulations in terms of accuracy and precision, with four synthetic tensor fields at various SNRs and in vivo human data of brain and calf muscle. The proposed Bayesian estimation demonstrated robustness to noise and higher reliability compared to simple tensor regression.

  4. Bayesian tomography and integrated data analysis in fusion diagnostics

    Science.gov (United States)

    Li, Dong; Dong, Y. B.; Deng, Wei; Shi, Z. B.; Fu, B. Z.; Gao, J. M.; Wang, T. B.; Zhou, Yan; Liu, Yi; Yang, Q. W.; Duan, X. R.

    2016-11-01

    In this article, a Bayesian tomography method using a non-stationary Gaussian process as a prior is introduced. The Bayesian formalism allows quantities which bear uncertainty to be expressed in probabilistic form, so that the uncertainty of a final solution can be fully resolved from the confidence interval of a posterior probability. Moreover, a consistency check of that solution can be performed by checking whether the misfits between predicted and measured data are reasonably within an assumed data error. In particular, the accuracy of reconstructions is significantly improved by using the non-stationary Gaussian process, which can adapt to the varying smoothness of the emission distribution. The implementation of this method for soft X-ray diagnostics on HL-2A has been used to explore relevant physics in equilibrium and MHD instability modes. This project is carried out within a large-scale inference framework, aiming at an integrated analysis of heterogeneous diagnostics.

  5. Sparse Bayesian Learning for DOA Estimation with Mutual Coupling

    Directory of Open Access Journals (Sweden)

    Jisheng Dai

    2015-10-01

    Sparse Bayesian learning (SBL) has given renewed interest to the problem of direction-of-arrival (DOA) estimation. It is generally assumed that the measurement matrix in SBL is precisely known. Unfortunately, this assumption may be invalid in practice due to the imperfect manifold caused by unknown or misspecified mutual coupling. This paper describes a modified SBL method for joint estimation of DOAs and mutual coupling coefficients with uniform linear arrays (ULAs). Unlike the existing method that only uses stationary priors, our new approach utilizes a hierarchical form of the Student's t prior to enforce the sparsity of the unknown signal more heavily. We also provide a distinct Bayesian inference for the expectation-maximization (EM) algorithm, which can update the mutual coupling coefficients more efficiently. Another difference is that our method uses an additional singular value decomposition (SVD) to reduce the computational complexity of the signal reconstruction process and the sensitivity to the measurement noise.

  6. The Examination of Patient-Reported Outcomes and Postural Control Measures in Patients With and Without a History of ACL Reconstruction: A Case Control Study.

    Science.gov (United States)

    Hoch, Johanna M; Sinnott, Cori W; Robinson, Kendall P; Perkins, William O; Hartman, Jonathan W

    2018-03-08

    There is a lack of literature to support the diagnostic accuracy and cut-off scores of commonly used patient-reported outcome measures (PROMs) and clinician-oriented outcomes such as postural-control assessments (PCAs) when treating post-ACL-reconstruction (ACLR) patients. These scores could help tailor treatments, enhance patient-centered care and may identify individuals in need of additional rehabilitation. To determine if differences in 4 PROMs and 3 PCAs exist between post-ACLR and healthy participants, and to determine the diagnostic accuracy and cut-off scores of these outcomes. Case control. Laboratory. A total of 20 post-ACLR and 40 healthy control participants. The participants completed 4 PROMs (the Disablement in the Physically Active Scale [DPA], the Fear-Avoidance Beliefs Questionnaire [FABQ], the Knee injury and Osteoarthritis Outcome Score [KOOS] subscales, and the Tampa Scale of Kinesiophobia [TSK-11]) and 3 PCAs (the Balance Error Scoring System [BESS], the modified Star Excursion Balance Test [SEBT], and static balance on an instrumented force plate). Mann-Whitney U tests examined differences between groups. Receiver operating characteristic (ROC) curves were employed to determine sensitivity and specificity. The area under the curve (AUC) was calculated to determine the diagnostic accuracy of each instrument. The Youden Index was used to determine cut-off scores. Alpha was set a priori at P < 0.05. There were significant differences between groups for all PROMs (P < 0.05). There were no differences in PCAs between groups. The cut-off scores should be interpreted with caution for some instruments, as the scores may not be clinically applicable. Post-ACLR participants have decreased self-reported function and health-related quality of life. The PROMs are capable of discriminating between groups. Clinicians should consider using the cut-off scores in clinical practice. Further use of the instruments to examine detriments after completion of standard

  7. History of human activity in last 800 years reconstructed from combined archive data and high-resolution analyses of varved lake sediments from Lake Czechowskie, Northern Poland

    Science.gov (United States)

    Słowiński, Michał; Tyszkowski, Sebastian; Ott, Florian; Obremska, Milena; Kaczmarek, Halina; Theuerkauf, Martin; Wulf, Sabine; Brauer, Achim

    2016-04-01

    The aim of the study was to reconstruct human and landscape development in the Tuchola Pinewoods (Northern Poland) during the last 800 years. We apply an approach that combines historic maps and documents with pollen data. Pollen data were obtained from varved lake sediments at a resolution of 5 years. The chronology of the sediment record is based on varve counting, AMS 14C dating, 137Cs activity concentration measurements and tephrochronology (Askja AD 1875). We applied the REVEALS model to translate pollen percentage data into regional plant abundances. The interpretation of the pollen record is furthermore based on pollen accumulation rate data. The pollen record and historic documents show similar trends in vegetation development. During the first phase (AD 1200-1412), the Lake Czechowskie area was still largely forested with Quercus, Carpinus and Pinus forests. Vegetation was more open during the second phase (AD 1412-1776), and reached maximum openness during the third phase (AD 1776-1905). Furthermore, intensified forest management led to a transformation from mixed to pine dominated forests during this period. Since the early 20th century, the forest cover increased again with dominance of the Scots pine in the stand. While pollen and historic data show similar trends, they differ substantially in the degree of openness during the four phases with pollen data commonly suggesting more open conditions. We discuss potential causes for this discrepancy, which include unsuitable parameters settings in REVEALS and unknown changes in forest structure. Using pollen accumulation data as a third proxy record we aim to identify the most probable causes. Finally, we discuss the observed vegetation change in relation the socio-economic development of the area. This study is a contribution to the Virtual Institute of Integrated Climate and Landscape Evolution Analysis - ICLEA- of the Helmholtz Association and National Science Centre, Poland (grant No. 2011/01/B/ST10

  8. Approximate Bayesian Image Interpretation using Generative Probabilistic Graphics Programs

    OpenAIRE

    Mansinghka, Vikash K.; Kulkarni, Tejas D.; Perov, Yura N.; Tenenbaum, Joshua B.

    2013-01-01

    The idea of computer vision as the Bayesian inverse problem to computer graphics has a long history and an appealing elegance, but it has proved difficult to directly implement. Instead, most vision tasks are approached via complex bottom-up processing pipelines. Here we show that it is possible to write short, simple probabilistic graphics programs that define flexible generative models and to automatically invert them to interpret real-world images. Generative probabilistic graphics program...

  9. Bayesian seismic AVO inversion

    Energy Technology Data Exchange (ETDEWEB)

    Buland, Arild

    2002-07-01

    A new linearized AVO inversion technique is developed in a Bayesian framework. The objective is to obtain posterior distributions for P-wave velocity, S-wave velocity and density. Distributions for other elastic parameters can also be assessed, for example acoustic impedance, shear impedance and P-wave to S-wave velocity ratio. The inversion algorithm is based on the convolutional model and a linearized weak contrast approximation of the Zoeppritz equation. The solution is represented by a Gaussian posterior distribution with explicit expressions for the posterior expectation and covariance, hence exact prediction intervals for the inverted parameters can be computed under the specified model. The explicit analytical form of the posterior distribution provides a computationally fast inversion method. Tests on synthetic data show that all inverted parameters were almost perfectly retrieved when the noise approached zero. With realistic noise levels, acoustic impedance was the best determined parameter, while the inversion provided practically no information about the density. The inversion algorithm has also been tested on a real 3-D dataset from the Sleipner Field. The results show good agreement with well logs but the uncertainty is high. The stochastic model includes uncertainties of both the elastic parameters, the wavelet and the seismic and well log data. The posterior distribution is explored by Markov chain Monte Carlo simulation using the Gibbs sampler algorithm. The inversion algorithm has been tested on a seismic line from the Heidrun Field with two wells located on the line. The uncertainty of the estimated wavelet is low. In the Heidrun examples the effect of including uncertainty of the wavelet and the noise level was marginal with respect to the AVO inversion results. We have developed a 3-D linearized AVO inversion method with spatially coupled model parameters where the objective is to obtain posterior distributions for P-wave velocity, S
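The closed-form Gaussian posterior that makes this kind of linearized inversion computationally fast can be sketched for a generic linear-Gaussian model d = Gm + e. The toy forward matrix, prior, and noise level below are invented stand-ins, not the actual convolutional AVO operator or the Zoeppritz linearization used in the record above.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linearized forward model d = G m + e: m is the parameter vector,
# e is Gaussian noise. G stands in for a linearized physics operator.
n_data, n_model = 50, 10
G = rng.normal(size=(n_data, n_model))
m_true = rng.normal(size=n_model)
noise_sd = 0.1
d = G @ m_true + rng.normal(scale=noise_sd, size=n_data)

# Gaussian prior m ~ N(mu_pr, C_pr) and noise e ~ N(0, C_e) give a Gaussian
# posterior with explicit mean and covariance -- no sampling required.
mu_pr = np.zeros(n_model)
C_pr = np.eye(n_model)
C_e = noise_sd**2 * np.eye(n_data)

C_post = np.linalg.inv(G.T @ np.linalg.inv(C_e) @ G + np.linalg.inv(C_pr))
mu_post = C_post @ (G.T @ np.linalg.inv(C_e) @ d + np.linalg.inv(C_pr) @ mu_pr)

# Exact 95% credible half-widths follow directly from the posterior covariance.
half_width = 1.96 * np.sqrt(np.diag(C_post))
print(np.mean(np.abs(mu_post - m_true)))  # small: posterior mean near m_true
```

Because the posterior is available in closed form, uncertainty intervals come for free from `C_post`; sampling (e.g., the Gibbs exploration mentioned for the stochastic wavelet/noise model) is only needed once those extra unknowns are added.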

  10. Bayesian microsaccade detection

    Science.gov (United States)

    Mihali, Andra; van Opheusden, Bas; Ma, Wei Ji

    2017-01-01

    Microsaccades are high-velocity fixational eye movements, with special roles in perception and cognition. The default microsaccade detection method is to determine when the smoothed eye velocity exceeds a threshold. We have developed a new method, Bayesian microsaccade detection (BMD), which performs inference based on a simple statistical model of eye positions. In this model, a hidden state variable changes between drift and microsaccade states at random times. The eye position is a biased random walk with different velocity distributions for each state. BMD generates samples from the posterior probability distribution over the eye state time series given the eye position time series. Applied to simulated data, BMD recovers the “true” microsaccades with fewer errors than alternative algorithms, especially at high noise. Applied to EyeLink eye tracker data, BMD detects almost all the microsaccades detected by the default method, but also apparent microsaccades embedded in high noise—although these can also be interpreted as false positives. Next we apply the algorithms to data collected with a Dual Purkinje Image eye tracker, whose higher precision justifies defining the inferred microsaccades as ground truth. When we add artificial measurement noise, the inferences of all algorithms degrade; however, at noise levels comparable to EyeLink data, BMD recovers the “true” microsaccades with 54% fewer errors than the default algorithm. Though unsuitable for online detection, BMD has other advantages: It returns probabilities rather than binary judgments, and it can be straightforwardly adapted as the generative model is refined. We make our algorithm available as a software package. PMID:28114483
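The drift/microsaccade state inference can be illustrated with a simplified two-state hidden Markov model on eye speed. This is a sketch of the general idea (Gaussian emissions, forward-backward smoothing yielding per-sample probabilities rather than binary judgments), not the authors' exact biased-random-walk generative model; all parameter values are invented.

```python
import math
import random

random.seed(0)

# Simulate eye-speed samples: a hidden state switches between drift (0, low
# speed) and microsaccade (1, high speed); emissions are Gaussian.
means = {0: 0.5, 1: 6.0}     # deg/s (invented values)
sd = 1.0
stay = {0: 0.98, 1: 0.80}    # probability of staying in the current state

states, speeds, s = [], [], 0
for _ in range(400):
    if random.random() > stay[s]:
        s = 1 - s
    states.append(s)
    speeds.append(random.gauss(means[s], sd))

def gauss_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def forward_backward(obs):
    """Posterior P(state_t = microsaccade | all observations)."""
    n = len(obs)
    # Forward pass (normalized per step for numerical stability).
    alpha = [[0.0, 0.0] for _ in range(n)]
    for k in (0, 1):
        alpha[0][k] = 0.5 * gauss_pdf(obs[0], means[k], sd)
    for t in range(1, n):
        for k in (0, 1):
            trans = alpha[t-1][k] * stay[k] + alpha[t-1][1-k] * (1 - stay[1-k])
            alpha[t][k] = trans * gauss_pdf(obs[t], means[k], sd)
        z = alpha[t][0] + alpha[t][1]
        alpha[t] = [a / z for a in alpha[t]]
    # Backward pass.
    beta = [[1.0, 1.0] for _ in range(n)]
    for t in range(n - 2, -1, -1):
        for k in (0, 1):
            beta[t][k] = sum(
                (stay[k] if j == k else 1 - stay[k])
                * gauss_pdf(obs[t+1], means[j], sd) * beta[t+1][j]
                for j in (0, 1))
        z = beta[t][0] + beta[t][1]
        beta[t] = [b / z for b in beta[t]]
    # Combine into smoothed posterior state probabilities.
    post = []
    for t in range(n):
        p = [alpha[t][k] * beta[t][k] for k in (0, 1)]
        post.append(p[1] / (p[0] + p[1]))
    return post

posterior = forward_backward(speeds)
detected = [p > 0.5 for p in posterior]
accuracy = sum(d == bool(s) for d, s in zip(detected, states)) / len(states)
print(round(accuracy, 2))  # most samples labeled correctly
```

Thresholding the posterior at 0.5 recovers a binary labeling for comparison with velocity-threshold detectors, but the graded probabilities themselves are the useful output when noise is high.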

  11. History of land use in India during 1880-2010: Large-scale land transformations reconstructed from satellite data and historical archives

    Science.gov (United States)

    Tian, Hanqin; Banger, Kamaljit; Bo, Tao; Dadhwal, Vinay K.

    2014-10-01

    In India, the human population has increased six-fold, from 200 million to 1200 million, which, coupled with economic growth, has resulted in significant land use and land cover (LULC) changes during 1880-2010. However, large discrepancies in the existing LULC datasets have hindered efforts to better understand interactions among human activities, climate systems, and ecosystems in India. In this study, we incorporated high-resolution remote sensing datasets from Resourcesat-1 and historical archives at district (N = 590) and state (N = 30) levels to generate LULC datasets at 5 arc minute resolution for 1880-2010 in India. Results show that a significant loss of forests (from 89 million ha to 63 million ha) occurred during the study period. Interestingly, the deforestation rate was relatively greater under British rule (1880-1950s) and in the early decades after independence, and then decreased after the 1980s due to government policies to protect the forests. In contrast to forests, cropland area increased from 92 million ha to 140.1 million ha during 1880-2010. Greater cropland expansion occurred during the 1950-1980s, coinciding with the period of farm mechanization, electrification, and introduction of high-yielding crop varieties as a result of government policies to achieve self-sufficiency in food production. The rate of urbanization was slower during 1880-1940 but increased significantly after the 1950s, probably due to the rapid increase in population and economic growth in India. Our study provides the most reliable estimates of historical LULC at the regional scale in India. This is the first attempt to incorporate newly developed high-resolution remote sensing datasets and inventory archives to reconstruct a time series of LULC records for such a long period in India.
The spatial and temporal information on LULC derived from this study could be used by ecosystem, hydrological, and climate modeling as well as by policy makers for assessing the

  12. Bayesian nonparametric dictionary learning for compressed sensing MRI.

    Science.gov (United States)

    Huang, Yue; Paisley, John; Lin, Qin; Ding, Xinghao; Fu, Xueyang; Zhang, Xiao-Ping

    2014-12-01

We develop a Bayesian nonparametric model for reconstructing magnetic resonance images (MRIs) from highly undersampled k-space data. We perform dictionary learning as part of the image reconstruction process. To this end, we use the beta process as a nonparametric dictionary learning prior for representing an image patch as a sparse combination of dictionary elements. The size of the dictionary and the patch-specific sparsity pattern are inferred from the data, in addition to other dictionary learning variables. Dictionary learning is performed directly on the compressed image, and so is tailored to the MRI being considered. In addition, we investigate a total variation penalty term in combination with the dictionary learning model, and show how the denoising property of dictionary learning removes dependence on regularization parameters in the noisy setting. We derive a stochastic optimization algorithm based on Markov chain Monte Carlo for the Bayesian model, and use the alternating direction method of multipliers for efficiently performing total variation minimization. We present empirical results on several MRIs, which show that the proposed regularization framework can improve reconstruction accuracy over other methods.
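The sparse patch representation at the heart of this approach can be illustrated with a fixed dictionary. The sketch below is not the paper's beta-process model (which also learns the dictionary and its size); it simply codes a patch `y` as a sparse combination of atoms of a hypothetical random dictionary `D` via greedy orthogonal matching pursuit. All names and sizes are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: an 8x8 patch (64 pixels), 128 atoms, sparsity level 5.
n_pix, n_atoms, k = 64, 128, 5
D = rng.standard_normal((n_pix, n_atoms))
D /= np.linalg.norm(D, axis=0)                   # unit-norm dictionary atoms
x_true = np.zeros(n_atoms)
x_true[rng.choice(n_atoms, k, replace=False)] = rng.standard_normal(k)
y = D @ x_true                                   # the "patch" to be coded

def omp(D, y, k):
    """Greedy orthogonal matching pursuit: select k atoms, refit by least squares."""
    support, r = [], y.copy()
    for _ in range(k):
        support.append(int(np.argmax(np.abs(D.T @ r))))   # most correlated atom
        coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        r = y - D[:, support] @ coef                      # orthogonalized residual
    x = np.zeros(D.shape[1])
    x[support] = coef
    return x

x_hat = omp(D, y, k)
print(np.linalg.norm(y - D @ x_hat))             # residual of the sparse code
```

In the Bayesian nonparametric setting described above, the fixed `k` and `D` would instead be inferred from the data under a beta-process prior.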

  13. ICDP Project DeepCHALLA: Reconstructing 250,000 Years of Climate Change and Environmental History on the East African Equator

    Science.gov (United States)

    Wolff, C.; Verschuren, D.; Van Daele, M. E.; Waldmann, N.; Meyer, I.; Lane, C. S.; Van der Meeren, T.; Ombori, T.; Kasanzu, C.; Olago, D.

    2017-12-01

Sediments on the bottom of Lake Challa, a 92-m deep crater lake on the border of Kenya and Tanzania near Mt. Kilimanjaro, contain a uniquely long and continuous record of past climate and environmental change in easternmost equatorial Africa. Supported in part by the International Continental Scientific Drilling Programme (ICDP), the DeepCHALLA project has now recovered this sediment record down to 214.8 m below the lake floor, with 100% recovery of the uppermost 121.3 m (the last 160 kyr BP) and ca. 85% recovery of the older part of the sequence, down to the lowermost distinct reflector identified in seismic stratigraphy. This acoustic basement represents a ca. 2-m thick layer of coarsely laminated, diatom-rich organic mud mixed with volcanic sand and silt deposited 250 kyr ago, overlying an estimated 20-30 m of unsampled lacustrine deposits representing the earliest phase of lake development. Down-hole logging produced profiles of in-situ sediment composition that confer an absolute depth-scale to both the recovered cores and the seismic stratigraphy. An estimated 74% of the recovered sequence is finely laminated (varved), and continuously so over the upper 72.3 m (the last 90 kyr). All other sections display at least cm-scale lamination, demonstrating persistence of a tranquil, profundal depositional environment throughout lake history. The sequence is interrupted only by 32 visible tephra layers 2 to 9 mm thick, and by several dozen fine-grained turbidites up to 108 cm thick, most of which are clearly bracketed between a non-erosive base and a diatom-laden cap. Tie points between sediment markers and the corresponding seismic reflectors support a preliminary age model inferring a near-constant rate of sediment accumulation over at least the last glacial cycle (140 kyr BP to present). 
This great time span combined with the exquisite temporal resolution of the Lake Challa sediments provides great opportunities to study past tropical climate dynamics at both short

  14. Kernel Bayesian ART and ARTMAP.

    Science.gov (United States)

    Masuyama, Naoki; Loo, Chu Kiong; Dawood, Farhan

    2018-02-01

Adaptive Resonance Theory (ART) is one of the successful approaches to resolving "the plasticity-stability dilemma" in neural networks, and its supervised learning model, ARTMAP, is a powerful tool for classification. Among several improvements, such as Fuzzy- or Gaussian-based models, the state-of-the-art model is the Bayesian-based one, which resolves the drawbacks of the others. However, the Bayesian approach is known to incur high computational cost for high-dimensional data and large numbers of samples, and the covariance matrix in the likelihood can become unstable. This paper introduces Kernel Bayesian ART (KBA) and ARTMAP (KBAM) by integrating Kernel Bayes' Rule (KBR) and the Correntropy-Induced Metric (CIM) into Bayesian ART (BA) and ARTMAP (BAM), respectively, while maintaining the properties of BA and BAM. The kernel frameworks in KBA and KBAM avoid the curse of dimensionality. In addition, the covariance-free Bayesian computation by KBR provides efficient and stable computation in KBA and KBAM. Furthermore, the correntropy-based similarity measurement improves noise-reduction ability even in high-dimensional spaces. Simulation experiments show that KBA exhibits superior self-organizing capability compared with BA, and KBAM provides superior classification ability compared with BAM. Copyright © 2017 Elsevier Ltd. All rights reserved.
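The Correntropy-Induced Metric mentioned here has a compact closed form. As a minimal sketch (the kernel width `sigma` and the test vectors are hypothetical, not from the paper), CIM with a Gaussian kernel is:

```python
import numpy as np

def cim(x, y, sigma=1.0):
    """Correntropy-Induced Metric between vectors x and y.

    CIM(x, y) = sqrt(kappa(0) - mean_i kappa(x_i - y_i)), with the Gaussian
    kernel kappa(e) = exp(-e^2 / (2 sigma^2)), so kappa(0) = 1.
    """
    e = np.asarray(x, dtype=float) - np.asarray(y, dtype=float)
    return float(np.sqrt(1.0 - np.mean(np.exp(-e ** 2 / (2.0 * sigma ** 2)))))

x = np.zeros(4)
print(cim(x, x))                 # identical vectors give distance 0.0
print(cim(x, x + 100.0))         # large differences saturate near kappa(0) = 1
```

Because the Gaussian kernel is bounded by kappa(0) = 1, the metric saturates for large differences, which is what gives correntropy its robustness to outliers and noise.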

  15. Iterative image reconstruction in ECT

    International Nuclear Information System (INIS)

    Chintu Chen; Ordonez, C.E.; Wernick, M.N.; Aarsvold, J.N.; Gunter, D.L.; Wong, W.H.; Kapp, O.H.; Xiaolong Ouyang; Levenson, M.; Metz, C.E.

    1992-01-01

A series of preliminary studies has been performed in the authors' laboratories to explore the use of a priori information in Bayesian image restoration and reconstruction. One piece of a priori information is the fact that intensities of neighboring pixels tend to be similar if they belong to the same region within which similar tissue characteristics are exhibited. This property of local continuity can be modeled by the use of Gibbs priors, as first suggested by Geman and Geman. In their investigation, they also included line sites between each pair of neighboring pixels in the Gibbs prior and used discrete binary numbers to indicate the absence or presence of boundaries between regions. These two features of the a priori model permit averaging within boundaries of homogeneous regions to alleviate the degradation caused by Poisson noise. With the use of this Gibbs prior in combination with the technique of stochastic relaxation, Geman and Geman demonstrated that noise levels can be reduced significantly in 2-D image restoration. The authors have developed a Bayesian method that utilizes a Gibbs prior to describe the spatial correlation of neighboring regions and takes into account the effect of limited spatial resolution as well. The statistical framework of the proposed approach is based on the data augmentation scheme suggested by Tanner and Wong. Briefly outlined here, this Bayesian method is based on Geman and Geman's approach
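The smoothing effect of a quadratic Gauss-Markov prior on neighboring values can be sketched in one dimension. The paper's setting is 2-D ECT with Gibbs priors and line sites; this simplified analogue drops the line sites, and the smoothing weight `beta` and the synthetic signal are hypothetical.

```python
import numpy as np

def map_smooth(y, beta=5.0):
    """MAP estimate under a quadratic Gauss-Markov (smoothing) prior.

    Minimizes ||y - x||^2 + beta * sum_i (x[i+1] - x[i])^2, whose normal
    equations are (I + beta * L) x = y with L the chain-graph Laplacian,
    so the estimate is a single linear solve.
    """
    n = len(y)
    L = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
    L[0, 0] = L[-1, -1] = 1.0          # free boundaries
    return np.linalg.solve(np.eye(n) + beta * L, y)

rng = np.random.default_rng(1)
truth = np.repeat([1.0, 4.0], 50)      # two homogeneous "regions"
noisy = truth + rng.normal(0.0, 0.8, truth.size)
x_hat = map_smooth(noisy)
print(np.mean((noisy - truth) ** 2), np.mean((x_hat - truth) ** 2))
```

The line sites described in the abstract exist precisely to switch this smoothing off across region boundaries, so edges are not blurred the way this purely quadratic prior blurs the jump between the two regions.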

  16. Bayesian analyses of Yemeni mitochondrial genomes suggest multiple migration events with Africa and Western Eurasia.

    Science.gov (United States)

    Vyas, Deven N; Kitchen, Andrew; Miró-Herrans, Aida T; Pearson, Laurel N; Al-Meeri, Ali; Mulligan, Connie J

    2016-03-01

Anatomically modern humans are thought to have migrated out of Africa ∼60,000 years ago in the first successful global dispersal. This initial migration may have passed through Yemen, a region that has experienced multiple migration events with Africa and Eurasia throughout human history. We use Bayesian phylogenetics to determine how ancient and recent migrations have shaped Yemeni mitogenomic variation. We sequenced 113 mitogenomes from multiple Yemeni regions with a focus on haplogroups M, N, and L3(xM,N) as these groups have the oldest evolutionary history outside of Africa. We performed Bayesian evolutionary analyses to generate time-measured phylogenies calibrated by Neanderthal and Denisovan mitogenomes in order to determine the age of Yemeni-specific clades. As defined by Yemeni monophyly, Yemeni in situ evolution is limited to the Holocene or latest Pleistocene (ages of clades in subhaplogroups L3b1a1a, L3h2, L3x1, M1a1f, M1a5, N1a1a3, and N1a3 range from 2 to 14 kya) and is often situated within broader Horn of Africa/southern Arabia in situ evolution (L3h2, L3x1, M1a1f, M1a5, and N1a1a3 ages range from 7 to 29 kya). Five subhaplogroups show no monophyly and are candidates for Holocene migration into Yemen (L0a2a2a, L3d1a1a, L3i2, M1a1b, and N1b1a). Yemeni mitogenomes are largely the product of Holocene migration, and subsequent in situ evolution, from Africa and western Eurasia. However, we hypothesize that recent population movements may obscure the genetic signature of more ancient migrations. Additional research, e.g., analyses of Yemeni nuclear genetic data, is needed to better reconstruct the complex population and migration histories associated with Out of Africa. © 2015 Wiley Periodicals, Inc.

  17. Precipitation history of the central Atacama Desert since the Miocene as reconstructed from clay pan records of the Coastal Cordillera/N Chile

    Science.gov (United States)

    Wennrich, V.; Melles, M.; Diederich, J. L.; Fernández Galego, E.; Ritter, B.; Brill, D.; Niemann, K.; Rolf, C.; Dunai, T. J.

    2017-12-01

Hyperaridity is a major limitation on Earth-surface processes and biological activity in the Atacama Desert of N Chile, one of the oldest and driest deserts on Earth. But even the hyperarid core of the Atacama Desert has experienced severe precipitation events, e.g., during the flash floods in 2015. On geological timescales, the overall aridity, which is postulated to have lasted at least since the early Miocene, was punctuated by distinct pluvial events. Such wetter conditions, e.g. during the Miocene, caused widespread lake formation in the Central Depression and Coastal Cordillera, but also amplified surface processes, changed vegetation dynamics, and enabled the dispersal of species. Unfortunately, due to the limited number and heterogeneous appearance of climate archives from the central Atacama, its longer-scale precipitation history is still a matter of controversy. This study aims to establish continuous long-term (Pleistocene-Miocene) paleoclimatic and environmental records from the hyperarid core of the Atacama Desert covering the last >10 Ma. Therefore, we investigate clay pan records from endorheic basins in the Coastal Cordillera, mostly formed by blocking of drainage by tectonic movement. The clay pans under study are located along a latitudinal transect across the hyperarid core of the Atacama, and thus are assumed to have recorded local and regional precipitation variations on different timescales. The investigated sequences exhibit significant changes in sedimentological, geochemical, and mineralogical properties due to changes in precipitation, but also to changes in weathering and erosion in the catchments. Diatom and phytolith remains preserved in these records clearly point to significant water bodies and significant vegetation cover during the wettest periods. The results shed new light on the timing, frequency, and driving mechanisms of the intervening pluvial phases.

  18. Genomic Reconstruction of the History of Native Sheep Reveals the Peopling Patterns of Nomads and the Expansion of Early Pastoralism in East Asia.

    Science.gov (United States)

    Zhao, Yong-Xin; Yang, Ji; Lv, Feng-Hua; Hu, Xiao-Ju; Xie, Xing-Long; Zhang, Min; Li, Wen-Rong; Liu, Ming-Jun; Wang, Yu-Tao; Li, Jin-Quan; Liu, Yong-Gang; Ren, Yan-Ling; Wang, Feng; Hehua, EEr; Kantanen, Juha; Arjen Lenstra, Johannes; Han, Jian-Lin; Li, Meng-Hua

    2017-09-01

China has a rich resource of native sheep (Ovis aries) breeds associated with historical movements of several nomadic societies. However, the history of sheep and the associated nomadic societies in ancient China remains poorly understood. Here, we studied the genomic diversity of Chinese sheep using genome-wide SNPs, mitochondrial and Y-chromosomal variations in > 1,000 modern samples. Population genomic analyses combined with archeological records and historical ethnic demographic data revealed genetic signatures of the origins, secondary expansions and admixtures of Chinese sheep, thereby revealing the peopling patterns of nomads and the expansion of early pastoralism in East Asia. Originating from the Mongolian Plateau ∼5,000‒5,700 years ago, Chinese sheep were inferred to have spread to the upper and middle reaches of the Yellow River ∼3,000‒5,000 years ago following the expansions of the Di-Qiang people. Sheep were then inferred to have reached the Qinghai-Tibetan and Yunnan-Kweichow plateaus ∼2,000‒2,600 years ago by following the north-to-southwest routes of the Di-Qiang migration. We also unveiled two subsequent waves of migrations of fat-tailed sheep into northern China, which were largely commensurate with the migrations of ancestors of Hui Muslims eastward and Mongols southward during the 12th‒13th centuries. Furthermore, we revealed signs of argali introgression into domestic sheep, extensive historical mixtures among domestic populations and strong artificial selection for tail type and other traits, reflecting various breeding strategies by nomadic societies in ancient China. © The Author 2017. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.

  19. Genetic characterisation of Porcine circovirus type 2 (PCV2) strains from feral pigs in the Brazilian Pantanal: An opportunity to reconstruct the history of PCV2 evolution.

    Science.gov (United States)

    Franzo, Giovanni; Cortey, Martí; de Castro, Alessandra Marnie Martins Gomes; Piovezan, Ubiratan; Szabo, Matias Pablo Juan; Drigo, Michele; Segalés, Joaquim; Richtzenhain, Leonardo José

    2015-07-09

Since its discovery, Porcine circovirus type 2 (PCV2) has emerged as one of the most relevant swine pathogens, causing substantial economic losses for the pig industry. While four genotypes have been identified, only three (PCV2a, PCV2b and PCV2d) are currently circulating, and these display a worldwide distribution. Another genotype, PCV2c, has been described only once, in Danish archive samples collected between 1980 and 1990. In addition to commercial pigs, PCV2 has been demonstrated to infect wild boars and other wild species, which can potentially serve as a reservoir for domestic populations. In this study, eight sequences obtained from feral pigs in the Pantanal region (Mato Grosso do Sul State, Brazil) were compared with reference sequences and other Brazilian sequences, and the results revealed remarkable genetic diversity, with all four currently recognised genotypes being detected (PCV2a, PCV2b, PCV2c and PCV2d). This finding represents a remarkable discovery, as it is the first detection of PCV2c since 1990 and the first-ever detection of PCV2c in live animals. The peculiar population history and ecological scenario of feral pigs in the Pantanal, coupled with the complex and still only partially known relationship of feral pigs with other PCV2-susceptible species (i.e., domestic pigs, wild boars and peccaries), open exciting questions concerning PCV2 origin and evolution. Overall, the results of the present study led us to form the following hypothesis: the PCV2 strains found in feral pigs may be the last descendants of the strains that circulated among European pigs in the past, or they may have infected these feral pigs more recently through a bridge species. Copyright © 2015 Elsevier B.V. All rights reserved.

  20. Climate Reconstructions

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The NOAA Paleoclimatology Program archives reconstructions of past climatic conditions derived from paleoclimate proxies, in addition to the Program's large holdings...

  1. Bayesian analysis of CCDM models

    Energy Technology Data Exchange (ETDEWEB)

    Jesus, J.F. [Universidade Estadual Paulista (Unesp), Câmpus Experimental de Itapeva, Rua Geraldo Alckmin 519, Vila N. Sra. de Fátima, Itapeva, SP, 18409-010 Brazil (Brazil); Valentim, R. [Departamento de Física, Instituto de Ciências Ambientais, Químicas e Farmacêuticas—ICAQF, Universidade Federal de São Paulo (UNIFESP), Unidade José Alencar, Rua São Nicolau No. 210, Diadema, SP, 09913-030 Brazil (Brazil); Andrade-Oliveira, F., E-mail: jfjesus@itapeva.unesp.br, E-mail: valentim.rodolfo@unifesp.br, E-mail: felipe.oliveira@port.ac.uk [Institute of Cosmology and Gravitation—University of Portsmouth, Burnaby Road, Portsmouth, PO1 3FX United Kingdom (United Kingdom)

    2017-09-01

Creation of Cold Dark Matter (CCDM), in the context of Einstein Field Equations, produces a negative pressure term which can be used to explain the accelerated expansion of the Universe. In this work we tested six different spatially flat models for matter creation using statistical criteria, in light of SNe Ia data: the Akaike Information Criterion (AIC), the Bayesian Information Criterion (BIC) and the Bayesian Evidence (BE). These criteria allow us to compare models by considering goodness of fit and number of free parameters, penalizing excess complexity. We find that the JO model is slightly favoured over the LJO/ΛCDM model; however, neither of these, nor the Γ = 3α H {sub 0} model, can be discarded by the current analysis. Three other scenarios are discarded either because of poor fit or because of an excess of free parameters. A method of increasing Bayesian evidence through reparameterization in order to reduce parameter degeneracy is also developed.
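The first two criteria are simple functions of the maximized log-likelihood, the number of free parameters, and the sample size. A minimal sketch (the numbers are hypothetical, not the paper's fits):

```python
import numpy as np

def aic(lnL_max, k):
    """Akaike Information Criterion: AIC = 2k - 2 ln L_max."""
    return 2.0 * k - 2.0 * lnL_max

def bic(lnL_max, k, n):
    """Bayesian Information Criterion: BIC = k ln n - 2 ln L_max."""
    return k * np.log(n) - 2.0 * lnL_max

# Hypothetical values: model B fits slightly better but carries two extra
# free parameters; n is the number of data points.
n = 580
lnL_A, k_A = -270.1, 1
lnL_B, k_B = -269.4, 3
print(aic(lnL_A, k_A), aic(lnL_B, k_B))       # AIC ≈ 542.2 vs ≈ 544.8: A preferred
print(bic(lnL_A, k_A, n), bic(lnL_B, k_B, n))
```

Note how BIC, with its ln n factor, penalizes the extra parameters more heavily than AIC does, which is the "penalizing excess complexity" behaviour the abstract refers to.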

  2. Bayesian modeling using WinBUGS

    CERN Document Server

    Ntzoufras, Ioannis

    2009-01-01

A hands-on introduction to the principles of Bayesian modeling using WinBUGS. Bayesian Modeling Using WinBUGS provides an easily accessible introduction to the use of WinBUGS programming techniques in a variety of Bayesian modeling settings. The author provides an accessible treatment of the topic, offering readers a smooth introduction to the principles of Bayesian modeling with detailed guidance on the practical implementation of key principles. The book begins with a basic introduction to Bayesian inference and the WinBUGS software and goes on to cover key topics, including: Markov Chain Monte Carlo algorithms in Bayesian inference; generalized linear models; Bayesian hierarchical models; predictive distribution and model checking; Bayesian model and variable evaluation. Computational notes and screen captures illustrate the use of both WinBUGS and R software to apply the discussed techniques. Exercises at the end of each chapter allow readers to test their understanding of the presented concepts and all ...

  3. A Bayesian Network View on Nested Effects Models

    Directory of Open Access Journals (Sweden)

    Fröhlich Holger

    2009-01-01

Nested effects models (NEMs) are a class of probabilistic models that were designed to reconstruct a hidden signalling structure from a large set of observable effects caused by active interventions into the signalling pathway. We give a more flexible formulation of NEMs in the language of Bayesian networks. Our framework constitutes a natural generalization of the original NEM model, since it explicitly states the assumptions that are tacitly underlying the original version. Our approach gives rise to new learning methods for NEMs, which have been implemented in the Bioconductor package nem. We validate these methods in a simulation study and apply them to a synthetic lethality dataset in yeast.

  4. 3D Bayesian contextual classifiers

    DEFF Research Database (Denmark)

    Larsen, Rasmus

    2000-01-01

We extend a series of multivariate Bayesian 2-D contextual classifiers to 3-D by specifying a simultaneous Gaussian distribution for the feature vectors as well as a prior distribution of the class variables of a pixel and its 6 nearest 3-D neighbours.

  5. Bayesian image restoration, using configurations

    DEFF Research Database (Denmark)

    Thorarinsdottir, Thordis Linda

    2006-01-01

In this paper, we develop a Bayesian procedure for removing noise from images that can be viewed as noisy realisations of random sets in the plane. The procedure utilises recent advances in configuration theory for noise free random sets, where the probabilities of observing the different boundary configurations are expressed in terms of the mean normal measure of the random set. These probabilities are used as prior probabilities in a Bayesian image restoration approach. Estimation of the remaining parameters in the model is outlined for the salt and pepper noise. The inference in the model is discussed...

  6. Bayesian image restoration, using configurations

    DEFF Research Database (Denmark)

    Thorarinsdottir, Thordis

In this paper, we develop a Bayesian procedure for removing noise from images that can be viewed as noisy realisations of random sets in the plane. The procedure utilises recent advances in configuration theory for noise free random sets, where the probabilities of observing the different boundary configurations are expressed in terms of the mean normal measure of the random set. These probabilities are used as prior probabilities in a Bayesian image restoration approach. Estimation of the remaining parameters in the model is outlined for salt and pepper noise. The inference in the model is discussed...

  7. Bayesian variable selection in regression

    Energy Technology Data Exchange (ETDEWEB)

    Mitchell, T.J.; Beauchamp, J.J.

    1987-01-01

This paper is concerned with the selection of subsets of "predictor" variables in a linear regression model for the prediction of a "dependent" variable. We take a Bayesian approach and assign a probability distribution to the dependent variable through a specification of prior distributions for the unknown parameters in the regression model. The appropriate posterior probabilities are derived for each submodel and methods are proposed for evaluating the family of prior distributions. Examples are given that show the application of the Bayesian methodology. 23 refs., 3 figs.
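One common way to turn such submodel comparisons into approximate posterior model probabilities is to weight each subset by exp(-BIC/2) and normalize, a large-sample surrogate for the marginal likelihoods the paper derives. The sketch below uses synthetic data and this BIC approximation, not the paper's conjugate-prior derivation; all names are illustrative.

```python
import itertools
import numpy as np

rng = np.random.default_rng(2)
n = 100
X = rng.standard_normal((n, 3))
y = 2.0 * X[:, 0] + rng.normal(0.0, 1.0, n)   # only predictor 0 matters

def bic_of(subset):
    """BIC of the linear model with an intercept plus the given predictors."""
    Z = np.column_stack([np.ones(n)] + [X[:, j] for j in subset])
    beta_hat, *_ = np.linalg.lstsq(Z, y, rcond=None)
    resid = y - Z @ beta_hat
    k = Z.shape[1]
    return n * np.log(np.mean(resid ** 2)) + k * np.log(n)  # Gaussian log-lik up to constants

# All 2^3 predictor subsets, scored and converted to posterior weights.
subsets = [s for r in range(4) for s in itertools.combinations(range(3), r)]
bics = np.array([bic_of(s) for s in subsets])
weights = np.exp(-0.5 * (bics - bics.min()))
post = weights / weights.sum()
print(subsets[int(np.argmax(post))])   # the subset containing only predictor 0 should dominate
```

Enumerating all subsets is feasible only for a handful of predictors; with many predictors one would sample the model space instead.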

  8. Inference in hybrid Bayesian networks

    DEFF Research Database (Denmark)

    Lanseth, Helge; Nielsen, Thomas Dyhre; Rumí, Rafael

    2009-01-01

Since the 1980s, Bayesian Networks (BNs) have become increasingly popular for building statistical models of complex systems. This is particularly true for boolean systems, where BNs often prove to be a more efficient modelling framework than traditional reliability techniques (like fault trees and reliability block diagrams). However, limitations in the BNs' calculation engine have prevented BNs from becoming equally popular for domains containing mixtures of both discrete and continuous variables (so-called hybrid domains). In this paper we focus on these difficulties, and summarize some of the last decade's research on inference in hybrid Bayesian networks. The discussions are linked to an example model for estimating human reliability.

  9. Coupling erosion and topographic development in the rainiest place on Earth: Reconstructing the Shillong Plateau uplift history with in-situ cosmogenic 10Be

    Science.gov (United States)

    Rosenkranz, Ruben; Schildgen, Taylor; Wittmann, Hella; Spiegel, Cornelia

    2018-02-01

    The uplift of the Shillong Plateau, in northeast India between the Bengal floodplain and the Himalaya Mountains, has had a significant impact on regional precipitation patterns, strain partitioning, and the path of the Brahmaputra River. Today, the plateau receives the highest measured yearly rainfall in the world and is tectonically active, having hosted one of the strongest intra-plate earthquakes ever recorded. Despite the unique tectonic and climatic setting of this prominent landscape feature, its exhumation and surface uplift history are poorly constrained. We collected 14 detrital river sand and 3 bedrock samples from the southern margin of the Shillong Plateau to measure erosion rates using the terrestrial cosmogenic nuclide 10Be. The calculated bedrock erosion rates range from 2.0 to 5.6 m My-1, whereas catchment average erosion rates from detrital river sands range from 48 to 214 m My-1. These rates are surprisingly low in the context of steep, tectonically active slopes and extreme rainfall. Moreover, the highest among these rates, which occur on the low-relief plateau surface, appear to have been affected by anthropogenic land-use change. To determine the onset of surface uplift, we coupled the catchment averaged erosion rates with topographic analyses of the plateau's southern margin. We interpolated an inclined, pre-incision surface from minimally eroded remnants along the valley interfluves and calculated the eroded volume of the valleys carved beneath the surface. The missing volume was then divided by the volume flux derived from the erosion rates to obtain the onset of uplift. The results of this calculation, ranging from 3.0 to 5.0 Ma for individual valleys, are in agreement with several lines of stratigraphic evidence from the Brahmaputra and Bengal basin that constrain the onset of topographic uplift, specifically the onset of flexural loading and the transgression from deltaic to marine deposition. Ultimately, our data corroborate the

  10. Reconstruction of the Earthquake History of Limestone Fault Scarps in Knidos Fault Zone Using in-situ Chlorine-36 Exposure Dating and "R" Programming Language

    Science.gov (United States)

    Sahin, Sefa; Yildirim, Cengiz; Akif Sarikaya, Mehmet; Tuysuz, Okan; Genc, S. Can; Ersen Aksoy, Murat; Ertekin Doksanalti, Mustafa

    2016-04-01

Cosmogenic surface exposure dating is based on the production of rare nuclides in exposed rocks, which interact with cosmic rays. Through modelling of measured 36Cl concentrations, we can obtain information on the history of earthquake activity. Yet, there are several factors that may affect the production of rare nuclides, such as the geometry of the fault, topography, the geographic location of the study area, temporal variations of the Earth's magnetic field, self-cover, and the denudation rate on the scarp. Recently developed models provide a method to infer the timing of earthquakes and slip rates on limited scales by taking these parameters into account. Our study area, the Knidos Fault Zone, is located on the Datça Peninsula in Southwestern Anatolia and contains several normal fault scarps formed within limestone, which are appropriate for generating cosmogenic chlorine-36 (36Cl) dating models. Since it has a well-preserved scarp, we have focused on the Mezarlık Segment of the fault zone, which has an average length of 300 m and height of 12-15 m. 128 continuous samples from top to bottom of the fault scarp were collected to carry out analysis of cosmogenic 36Cl isotope concentrations. The main purpose of this study is to analyze the factors affecting the production rates and the amount of cosmogenic 36Cl nuclide concentration. Concentrations of 36Cl isotopes are measured by AMS laboratories. From the local production rates and the measured concentrations of the cosmogenic isotopes, we can calculate exposure ages of the samples. Recent research has elucidated each step of the application of this method in the Matlab programming language (e.g. Schlagenhauf et al., 2010), which is vitally helpful for generating models of Quaternary activity of normal faults. We, however, wanted to build a user-friendly program in an open source programming language, "R" (GNU Project), that might be able to help those without knowledge of complex math programming, making calculations as easy and understandable as
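For the idealized case of a constant production rate with no erosion or shielding, the exposure age follows from inverting the radioactive ingrowth equation. The sketch below is a drastic simplification of full scarp-dating models such as Schlagenhauf et al.'s (which handle fault geometry, shielding, and multiple production pathways); the production rate `P` is hypothetical.

```python
import numpy as np

# 36Cl decay constant from its ~301 kyr half-life (units: 1/yr).
LAMBDA_CL36 = np.log(2.0) / 3.01e5

def exposure_age(N, P, lam=LAMBDA_CL36):
    """Invert N(t) = (P / lam) * (1 - exp(-lam t)) for the exposure time t (yr)."""
    return -np.log(1.0 - N * lam / P) / lam

# Round-trip check with a hypothetical production rate P (atoms/g/yr).
P, t_true = 50.0, 12_000.0
N = (P / LAMBDA_CL36) * (1.0 - np.exp(-LAMBDA_CL36 * t_true))
print(round(exposure_age(N, P)))   # → 12000
```

In the actual fault-scarp application, a profile of such concentrations up the scarp is modelled jointly, so that steps in the 36Cl profile record individual exhumation (earthquake) events.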

  11. The subjectivity of scientists and the Bayesian approach

    CERN Document Server

    Press, James S

    2001-01-01

Comparing and contrasting the reality of subjectivity in the work of history's great scientists and the modern Bayesian approach to statistical analysis. Scientists and researchers are taught to analyze their data from an objective point of view, allowing the data to speak for themselves rather than assigning them meaning based on expectations or opinions. But scientists have never behaved fully objectively. Throughout history, some of our greatest scientific minds have relied on intuition, hunches, and personal beliefs to make sense of empirical data, and these subjective influences have often a

  12. Bayesian methods for proteomic biomarker development

    Directory of Open Access Journals (Sweden)

    Belinda Hernández

    2015-12-01

    In this review we provide an introduction to Bayesian inference and demonstrate some of the advantages of using a Bayesian framework. We summarize how Bayesian methods have been used previously in proteomics and other areas of bioinformatics. Finally, we describe some popular and emerging Bayesian models from the statistical literature and provide a worked tutorial including code snippets to show how these methods may be applied for the evaluation of proteomic biomarkers.

  13. Bayesian variable order Markov models: Towards Bayesian predictive state representations

    NARCIS (Netherlands)

    Dimitrakakis, C.

    2009-01-01

    We present a Bayesian variable order Markov model that shares many similarities with predictive state representations. The resulting models are compact and much easier to specify and learn than classical predictive state representations. Moreover, we show that they significantly outperform a more

  14. The humble Bayesian : Model checking from a fully Bayesian perspective

    NARCIS (Netherlands)

    Morey, Richard D.; Romeijn, Jan-Willem; Rouder, Jeffrey N.

    Gelman and Shalizi (2012) criticize what they call the usual story in Bayesian statistics: that the distribution over hypotheses or models is the sole means of statistical inference, thus excluding model checking and revision, and that inference is inductivist rather than deductivist. They present

  15. Bayesian Model Averaging for Propensity Score Analysis

    Science.gov (United States)

    Kaplan, David; Chen, Jianshen

    2013-01-01

    The purpose of this study is to explore Bayesian model averaging in the propensity score context. Previous research on Bayesian propensity score analysis does not take into account model uncertainty. In this regard, an internally consistent Bayesian framework for model building and estimation must also account for model uncertainty. The…

  16. Bayesian models in cognitive neuroscience: A tutorial

    NARCIS (Netherlands)

    O'Reilly, J.X.; Mars, R.B.

    2015-01-01

    This chapter provides an introduction to Bayesian models and their application in cognitive neuroscience. The central feature of Bayesian models, as opposed to other classes of models, is that Bayesian models represent the beliefs of an observer as probability distributions, allowing them to

  17. A Bayesian framework for risk perception

    NARCIS (Netherlands)

    van Erp, H.R.N.

    2017-01-01

    We present here a Bayesian framework of risk perception. This framework encompasses plausibility judgments, decision making, and question asking. Plausibility judgments are modeled by way of Bayesian probability theory, decision making is modeled by way of a Bayesian decision theory, and relevancy

  18. Phylodynamic reconstruction of O CATHAY topotype foot-and-mouth disease virus epidemics in the Philippines.

    Science.gov (United States)

    Di Nardo, Antonello; Knowles, Nick J; Wadsworth, Jemma; Haydon, Daniel T; King, Donald P

    2014-08-24

    Reconstructing the evolutionary history, demographic signal and dispersal processes from viral genome sequences contributes to our understanding of the epidemiological dynamics underlying epizootic events. In this study, a Bayesian phylogenetic framework was used to explore the phylodynamics and spatio-temporal dispersion of the O CATHAY topotype of foot-and-mouth disease virus (FMDV) that caused epidemics in the Philippines between 1994 and 2005. Sequences of the FMDV genome encoding the VP1 showed that the O CATHAY FMD epizootic in the Philippines resulted from a single introduction and was characterised by three main transmission hubs in Rizal, Bulacan and Manila Provinces. From a wider regional perspective, phylogenetic reconstruction of all available O CATHAY VP1 nucleotide sequences identified three distinct sub-lineages associated with country-based clusters originating in Hong Kong Special Administrative Region (SAR), the Philippines and Taiwan. The root of this phylogenetic tree was located in Hong Kong SAR, representing the most likely source for the introduction of this lineage into the Philippines and Taiwan. The reconstructed O CATHAY phylodynamics revealed three chronologically distinct evolutionary phases, culminating in a reduction in viral diversity over the final 10 years. The analysis suggests that viruses from the O CATHAY topotype have been continually maintained within swine industries close to Hong Kong SAR, following the extinction of virus lineages from the Philippines and the reduced number of FMD cases in Taiwan.

  19. 87Sr/86Sr isotope ratio analysis by laser ablation MC-ICP-MS in scales, spines, and fin rays as a nonlethal alternative to otoliths for reconstructing fish life history

    Science.gov (United States)

    Willmes, Malte; Glessner, Justin J. G.; Carleton, Scott A.; Gerrity, Paul C.; Hobbs, James A.

    2016-01-01

    Strontium isotope ratios (87Sr/86Sr) in otoliths are a well-established tool to determine origins and movement patterns of fish. However, otolith extraction requires sacrificing fish, and when working with protected or endangered species, the use of nonlethal samples such as scales, spines, and fin rays is preferred. Unlike otoliths that are predominantly aragonite, these tissues are composed of biological apatite. Laser ablation multicollector inductively coupled plasma mass spectrometry (LA-MC-ICP-MS) analysis of biological apatite can induce significant interference on mass 87, causing inaccurate 87Sr/86Sr measurements. To quantify this interference, we applied LA-MC-ICP-MS to three marine samples (white seabass (Atractoscion nobilis) otolith; green sturgeon (Acipenser medirostris) pectoral fin ray; salmon shark (Lamna ditropis) tooth) and to freshwater walleye (Sander vitreus) otoliths, scales, and spines. Instrument conditions that maximize signal intensity resulted in elevated 87Sr/86Sr isotope ratios in the bioapatite samples, related to a polyatomic interference (40Ca31P16O, 40Ar31P16O). Retuning instrument conditions to reduce oxide levels removed this interference, resulting in accurate 87Sr/86Sr ratios across all tissue samples. This method provides a novel, nonlethal alternative to otolith analysis to reconstruct fish life histories.

  20. Human migration patterns in Yemen and implications for reconstructing prehistoric population movements.

    Directory of Open Access Journals (Sweden)

    Aida T Miró-Herrans

    Population migration has played an important role in human evolutionary history and in the patterning of human genetic variation. A deeper and empirically-based understanding of human migration dynamics is needed in order to interpret genetic and archaeological evidence and to accurately reconstruct the prehistoric processes that comprise human evolutionary history. Current empirical estimates of migration include either short time frames (i.e. within one generation) or partial knowledge about migration, such as proportion of migrants or distance of migration. An analysis of migration that includes both proportion of migrants and distance, and direction over multiple generations would better inform prehistoric reconstructions. To evaluate human migration, we use GPS coordinates from the place of residence of the Yemeni individuals sampled in our study, their birthplaces and their parents' and grandparents' birthplaces to calculate the proportion of migrants, as well as the distance and direction of migration events between each generation. We test for differences in these values between the generations and identify factors that influence the probability of migration. Our results show that the proportion and distance of migration between females and males is similar within generations. In contrast, the proportion and distance of migration is significantly lower in the grandparents' generation, most likely reflecting the decreasing effect of technology. Based on our results, we calculate the proportion of migration events (0.102) and mean and median distances of migration (96 km and 26 km) for the grandparents' generation to represent early times in human evolution. These estimates can serve to set parameter values of demographic models in model-based methods of prehistoric reconstruction, such as approximate Bayesian computation. Our study provides the first empirically-based estimates of human migration over multiple generations in a developing

  1. Human migration patterns in Yemen and implications for reconstructing prehistoric population movements.

    Science.gov (United States)

    Miró-Herrans, Aida T; Al-Meeri, Ali; Mulligan, Connie J

    2014-01-01

    Population migration has played an important role in human evolutionary history and in the patterning of human genetic variation. A deeper and empirically-based understanding of human migration dynamics is needed in order to interpret genetic and archaeological evidence and to accurately reconstruct the prehistoric processes that comprise human evolutionary history. Current empirical estimates of migration include either short time frames (i.e. within one generation) or partial knowledge about migration, such as proportion of migrants or distance of migration. An analysis of migration that includes both proportion of migrants and distance, and direction over multiple generations would better inform prehistoric reconstructions. To evaluate human migration, we use GPS coordinates from the place of residence of the Yemeni individuals sampled in our study, their birthplaces and their parents' and grandparents' birthplaces to calculate the proportion of migrants, as well as the distance and direction of migration events between each generation. We test for differences in these values between the generations and identify factors that influence the probability of migration. Our results show that the proportion and distance of migration between females and males is similar within generations. In contrast, the proportion and distance of migration is significantly lower in the grandparents' generation, most likely reflecting the decreasing effect of technology. Based on our results, we calculate the proportion of migration events (0.102) and mean and median distances of migration (96 km and 26 km) for the grandparents' generation to represent early times in human evolution. These estimates can serve to set parameter values of demographic models in model-based methods of prehistoric reconstruction, such as approximate Bayesian computation. Our study provides the first empirically-based estimates of human migration over multiple generations in a developing country and these
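    The distance-of-migration step described above amounts to great-circle distances between GPS coordinates. A minimal haversine sketch (city coordinates below are approximate and purely illustrative, not from the study's dataset):

```python
# Great-circle (haversine) distance between two (lat, lon) points in degrees.
import math

def haversine_km(lat1, lon1, lat2, lon2, r=6371.0):
    """Distance in km along the Earth's surface (mean radius r)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# e.g. Sana'a to Aden (approximate city-centre coordinates), roughly 300 km
print(round(haversine_km(15.37, 44.19, 12.79, 45.03), 1))
```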

  2. Differentiated Bayesian Conjoint Choice Designs

    NARCIS (Netherlands)

    Z. Sándor (Zsolt); M. Wedel (Michel)

    2003-01-01

    Previous conjoint choice design construction procedures have produced a single design that is administered to all subjects. This paper proposes to construct a limited set of different designs. The designs are constructed in a Bayesian fashion, taking into account prior uncertainty about

  3. Bayesian networks in levee reliability

    NARCIS (Netherlands)

    Roscoe, K.; Hanea, A.

    2015-01-01

    We applied a Bayesian network to a system of levees for which the results of traditional reliability analysis showed high failure probabilities, which conflicted with the intuition and experience of those managing the levees. We made use of forty proven strength observations - high water levels with

  4. Bayesian Classification of Image Structures

    DEFF Research Database (Denmark)

    Goswami, Dibyendu; Kalkan, Sinan; Krüger, Norbert

    2009-01-01

    In this paper, we describe work on Bayesian classifiers for distinguishing between homogeneous structures, textures, edges and junctions. We build semi-local classifiers from hand-labeled images to distinguish between these four different kinds of structures based on the concept of intrinsic dimensionality. The built classifier is tested on standard and non-standard images...

  5. Computational Neuropsychology and Bayesian Inference.

    Science.gov (United States)

    Parr, Thomas; Rees, Geraint; Friston, Karl J

    2018-01-01

    Computational theories of brain function have become very influential in neuroscience. They have facilitated the growth of formal approaches to disease, particularly in psychiatric research. In this paper, we provide a narrative review of the body of computational research addressing neuropsychological syndromes, and focus on those that employ Bayesian frameworks. Bayesian approaches to understanding brain function formulate perception and action as inferential processes. These inferences combine 'prior' beliefs with a generative (predictive) model to explain the causes of sensations. Under this view, neuropsychological deficits can be thought of as false inferences that arise due to aberrant prior beliefs (that are poor fits to the real world). This draws upon the notion of a Bayes optimal pathology - optimal inference with suboptimal priors - and provides a means for computational phenotyping. In principle, any given neuropsychological disorder could be characterized by the set of prior beliefs that would make a patient's behavior appear Bayes optimal. We start with an overview of some key theoretical constructs and use these to motivate a form of computational neuropsychology that relates anatomical structures in the brain to the computations they perform. Throughout, we draw upon computational accounts of neuropsychological syndromes. These are selected to emphasize the key features of a Bayesian approach, and the possible types of pathological prior that may be present. They range from visual neglect through hallucinations to autism. Through these illustrative examples, we review the use of Bayesian approaches to understand the link between biology and computation that is at the heart of neuropsychology.

  6. Bayesian Alternation During Tactile Augmentation

    Directory of Open Access Journals (Sweden)

    Caspar Mathias Goeke

    2016-10-01

    A large number of studies suggest that the integration of multisensory signals by humans is well described by Bayesian principles. However, there are very few reports about cue combination between a native and an augmented sense. In particular, we asked the question whether adult participants are able to integrate an augmented sensory cue with existing native sensory information. Hence, for the purpose of this study we built a tactile augmentation device. Consequently, we compared different hypotheses of how untrained adult participants combine information from a native and an augmented sense. In a two-interval forced choice (2IFC) task, while subjects were blindfolded and seated on a rotating platform, our sensory augmentation device translated information on whole body yaw rotation to tactile stimulation. Three conditions were realized: tactile stimulation only (augmented condition), rotation only (native condition), and both augmented and native information (bimodal condition). Participants had to choose the one of two consecutive rotations with the higher angular rotation. For the analysis, we fitted the participants' responses with a probit model and calculated the just-noticeable difference (JND). Then we compared several models for predicting bimodal from unimodal responses. An objective Bayesian alternation model yielded a better prediction (χ²_red = 1.67) than the Bayesian integration model (χ²_red = 4.34). A non-Bayesian winner-takes-all model (χ²_red = 1.64), which used either only native or only augmented values per subject for prediction, showed slightly higher accuracy. However, the performance of the Bayesian alternation model could be substantially improved (χ²_red = 1.09) by utilizing subjective weights obtained by a questionnaire. As a result, the subjective Bayesian alternation model predicted bimodal performance most accurately among all tested models. These results suggest that information from augmented and existing sensory modalities in
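    The probit-fitting and JND step described in this abstract can be sketched as follows. The response counts are made up, and a crude grid search stands in for whatever optimiser the authors actually used:

```python
# Fit a probit (cumulative-Gaussian) psychometric function to hypothetical
# 2IFC data by maximum likelihood, then read off the JND as the stimulus
# change from the 50% point to the 75% point of the fitted curve.
from statistics import NormalDist
import math

# (stimulus difference, count of "second larger" responses, trials) -- invented
data = [(-8, 2, 20), (-4, 5, 20), (0, 10, 20), (4, 16, 20), (8, 19, 20)]

def neg_log_lik(mu, sd):
    nll = 0.0
    for x, k, n in data:
        p = min(max(NormalDist(mu, sd).cdf(x), 1e-9), 1 - 1e-9)
        nll -= k * math.log(p) + (n - k) * math.log(1 - p)
    return nll

# crude grid search over (mu, sd); a real analysis would use a proper optimiser
mu_hat, sd_hat = min(((mu, sd) for mu in [i * 0.1 for i in range(-20, 21)]
                               for sd in [i * 0.1 for i in range(10, 101)]),
                     key=lambda ms: neg_log_lik(*ms))

jnd = sd_hat * NormalDist().inv_cdf(0.75)   # 75% point of the fitted probit
print(round(mu_hat, 1), round(sd_hat, 1), round(jnd, 2))
```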

  7. Simultaneous reconstruction and segmentation of CT scans with shadowed data

    DEFF Research Database (Denmark)

    Lauze, Francois Bernard; Quéau, Yvain; Plenge, Esben

    2017-01-01

    We propose a variational approach for simultaneous reconstruction and multiclass segmentation of X-ray CT images with limited field of view and missing data. The formulation is a simple energy minimisation, loosely based on a Bayesian rationale. The resulting non-convex problem is solved...

  8. Topics in Bayesian statistics and maximum entropy

    International Nuclear Information System (INIS)

    Mutihac, R.; Cicuttin, A.; Cerdeira, A.; Stanciulescu, C.

    1998-12-01

    Notions of Bayesian decision theory and maximum entropy methods are reviewed with particular emphasis on probabilistic inference and Bayesian modeling. The axiomatic approach is considered as the best justification of Bayesian analysis and maximum entropy principle applied in natural sciences. Particular emphasis is put on solving the inverse problem in digital image restoration and Bayesian modeling of neural networks. Further topics addressed briefly include language modeling, neutron scattering, multiuser detection and channel equalization in digital communications, genetic information, and Bayesian court decision-making. (author)
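    The maximum entropy principle mentioned here has a classic worked example (Jaynes' die with a constrained mean): among all distributions over the faces 1..6 with mean 4.5, the entropy-maximising one has the Gibbs form p_k ∝ exp(λk). A few lines solve for λ numerically:

```python
# Maximum-entropy distribution over die faces 1..6 subject to a mean
# constraint, found by bisection on the Lagrange multiplier lambda
# (mean_for is monotonically increasing in lambda).
import math

faces = [1, 2, 3, 4, 5, 6]
target_mean = 4.5

def mean_for(lam):
    w = [math.exp(lam * k) for k in faces]
    z = sum(w)
    return sum(k * wk for k, wk in zip(faces, w)) / z

lo, hi = -10.0, 10.0
for _ in range(100):
    mid = (lo + hi) / 2
    if mean_for(mid) < target_mean:
        lo = mid
    else:
        hi = mid
lam = (lo + hi) / 2

w = [math.exp(lam * k) for k in faces]
p = [wk / sum(w) for wk in w]
print(round(lam, 3), [round(x, 3) for x in p])
```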

  9. Bayesian networks for evaluation of evidence from forensic entomology.

    Science.gov (United States)

    Andersson, M Gunnar; Sundström, Anders; Lindström, Anders

    2013-09-01

    In the aftermath of a CBRN incident, there is an urgent need to reconstruct events in order to bring the perpetrators to court and to take preventive actions for the future. The challenge is to discriminate, based on available information, between alternative scenarios. Forensic interpretation is used to evaluate to what extent results from the forensic investigation favor the prosecutors' or the defendants' arguments, using the framework of Bayesian hypothesis testing. Recently, several new scientific disciplines have been used in a forensic context. In the AniBioThreat project, the framework was applied to veterinary forensic pathology, tracing of pathogenic microorganisms, and forensic entomology. Forensic entomology is an important tool for estimating the postmortem interval in, for example, homicide investigations as a complement to more traditional methods. In this article we demonstrate the applicability of the Bayesian framework for evaluating entomological evidence in a forensic investigation through the analysis of a hypothetical scenario involving suspect movement of carcasses from a clandestine laboratory. Probabilities of different findings under the alternative hypotheses were estimated using a combination of statistical analysis of data, expert knowledge, and simulation, and entomological findings are used to update the beliefs about the prosecutors' and defendants' hypotheses and to calculate the value of evidence. The Bayesian framework proved useful for evaluating complex hypotheses using findings from several insect species, accounting for uncertainty about development rate, temperature, and precolonization. The applicability of the forensic statistic approach to evaluating forensic results from a CBRN incident is discussed.
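    The Bayesian hypothesis-testing framework described above boils down to updating prior odds with a likelihood ratio. A minimal sketch with invented probabilities (none of the numbers come from the AniBioThreat project):

```python
# Forensic "value of evidence": the likelihood ratio LR = P(E|Hp)/P(E|Hd),
# i.e. how much more probable the entomological findings E are under the
# prosecution hypothesis Hp than under the defence hypothesis Hd.

def likelihood_ratio(p_e_given_hp: float, p_e_given_hd: float) -> float:
    """Value of evidence for findings E under competing hypotheses."""
    return p_e_given_hp / p_e_given_hd

def posterior_odds(prior_odds: float, lr: float) -> float:
    """Bayes' rule in odds form: posterior odds = prior odds * LR."""
    return prior_odds * lr

# Hypothetical example: the observed insect succession is judged 20 times
# more probable if the carcass was moved (Hp) than if it was not (Hd).
lr = likelihood_ratio(0.40, 0.02)             # = 20 (up to float rounding)
post = posterior_odds(prior_odds=0.5, lr=lr)  # prior odds 1:2 -> posterior 10:1
print(lr, post)
```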

  10. Bayesian analysis of rare events

    Science.gov (United States)

    Straub, Daniel; Papaioannou, Iason; Betz, Wolfgang

    2016-06-01

    In many areas of engineering and science there is an interest in predicting the probability of rare events, in particular in applications related to safety and security. Increasingly, such predictions are made through computer models of physical systems in an uncertainty quantification framework. Additionally, with advances in IT, monitoring and sensor technology, an increasing amount of data on the performance of the systems is collected. This data can be used to reduce uncertainty, improve the probability estimates and consequently enhance the management of rare events and associated risks. Bayesian analysis is the ideal method to include the data into the probabilistic model. It ensures a consistent probabilistic treatment of uncertainty, which is central in the prediction of rare events, where extrapolation from the domain of observation is common. We present a framework for performing Bayesian updating of rare event probabilities, termed BUS. It is based on a reinterpretation of the classical rejection-sampling approach to Bayesian analysis, which enables the use of established methods for estimating probabilities of rare events. By drawing upon these methods, the framework makes use of their computational efficiency. These methods include the First-Order Reliability Method (FORM), tailored importance sampling (IS) methods and Subset Simulation (SuS). In this contribution, we briefly review these methods in the context of the BUS framework and investigate their applicability to Bayesian analysis of rare events in different settings. We find that, for some applications, FORM can be highly efficient and is surprisingly accurate, enabling Bayesian analysis of rare events with just a few model evaluations. In a general setting, BUS implemented through IS and SuS is more robust and flexible.
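    The rejection-sampling reinterpretation of Bayesian updating behind BUS can be illustrated on a toy one-parameter problem. The prior, observation, and "rare" event below are invented for illustration and are not from the paper:

```python
# Toy sketch of Bayesian updating via rejection sampling.
# Prior: theta ~ N(0, 1).  One noisy observation y = 1.0 with sigma = 0.5.
# Posterior samples are prior samples accepted with probability L(theta)/c.
import math
import random

random.seed(1)
y, sigma = 1.0, 0.5

def likelihood(theta):
    return math.exp(-0.5 * ((y - theta) / sigma) ** 2)  # up to a constant

c = 1.0  # upper bound on the unnormalised likelihood
posterior = []
for _ in range(200_000):
    theta = random.gauss(0.0, 1.0)
    if random.random() < likelihood(theta) / c:
        posterior.append(theta)

# Posterior probability of a tail ("rare") event theta > 2; the analytic
# posterior here is N(0.8, 1/sqrt(5)), so this should be about 0.004.
p_rare = sum(t > 2.0 for t in posterior) / len(posterior)
print(len(posterior), round(p_rare, 4))
```

In a real BUS application the crude accept/reject loop is replaced by FORM, importance sampling, or Subset Simulation, which is what makes rare-event posteriors tractable.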

  11. Polytomies and Bayesian phylogenetic inference.

    Science.gov (United States)

    Lewis, Paul O; Holder, Mark T; Holsinger, Kent E

    2005-04-01

    Bayesian phylogenetic analyses are now very popular in systematics and molecular evolution because they allow the use of much more realistic models than currently possible with maximum likelihood methods. There are, however, a growing number of examples in which large Bayesian posterior clade probabilities are associated with very short branch lengths and low values for non-Bayesian measures of support such as nonparametric bootstrapping. For the four-taxon case when the true tree is the star phylogeny, Bayesian analyses become increasingly unpredictable in their preference for one of the three possible resolved tree topologies as data set size increases. This leads to the prediction that hard (or near-hard) polytomies in nature will cause unpredictable behavior in Bayesian analyses, with arbitrary resolutions of the polytomy receiving very high posterior probabilities in some cases. We present a simple solution to this problem involving a reversible-jump Markov chain Monte Carlo (MCMC) algorithm that allows exploration of all of tree space, including unresolved tree topologies with one or more polytomies. The reversible-jump MCMC approach allows prior distributions to place some weight on less-resolved tree topologies, which eliminates misleadingly high posteriors associated with arbitrary resolutions of hard polytomies. Fortunately, assigning some prior probability to polytomous tree topologies does not appear to come with a significant cost in terms of the ability to assess the level of support for edges that do exist in the true tree. Methods are discussed for applying arbitrary prior distributions to tree topologies of varying resolution, and an empirical example showing evidence of polytomies is analyzed and discussed.

  12. Bayesian methods for measures of agreement

    CERN Document Server

    Broemeling, Lyle D

    2009-01-01

    Using WinBUGS to implement Bayesian inferences of estimation and testing hypotheses, Bayesian Methods for Measures of Agreement presents useful methods for the design and analysis of agreement studies. It focuses on agreement among the various players in the diagnostic process.The author employs a Bayesian approach to provide statistical inferences based on various models of intra- and interrater agreement. He presents many examples that illustrate the Bayesian mode of reasoning and explains elements of a Bayesian application, including prior information, experimental information, the likelihood function, posterior distribution, and predictive distribution. The appendices provide the necessary theoretical foundation to understand Bayesian methods as well as introduce the fundamentals of programming and executing the WinBUGS software.Taking a Bayesian approach to inference, this hands-on book explores numerous measures of agreement, including the Kappa coefficient, the G coefficient, and intraclass correlation...

  13. Efficient Bayesian inference under the structured coalescent.

    Science.gov (United States)

    Vaughan, Timothy G; Kühnert, Denise; Popinga, Alex; Welch, David; Drummond, Alexei J

    2014-08-15

    Population structure significantly affects evolutionary dynamics. Such structure may be due to spatial segregation, but may also reflect any other gene-flow-limiting aspect of a model. In combination with the structured coalescent, this fact can be used to inform phylogenetic tree reconstruction, as well as to infer parameters such as migration rates and subpopulation sizes from annotated sequence data. However, conducting Bayesian inference under the structured coalescent is impeded by the difficulty of constructing Markov Chain Monte Carlo (MCMC) sampling algorithms (samplers) capable of efficiently exploring the state space. In this article, we present a new MCMC sampler capable of sampling from posterior distributions over structured trees: timed phylogenetic trees in which lineages are associated with the distinct subpopulation in which they lie. The sampler includes a set of MCMC proposal functions that offer significant mixing improvements over a previously published method. Furthermore, its implementation as a BEAST 2 package ensures maximum flexibility with respect to model and prior specification. We demonstrate the usefulness of this new sampler by using it to infer migration rates and effective population sizes of H3N2 influenza between New Zealand, New York and Hong Kong from publicly available hemagglutinin (HA) gene sequences under the structured coalescent. The sampler has been implemented as a publicly available BEAST 2 package that is distributed under version 3 of the GNU General Public License at http://compevol.github.io/MultiTypeTree. © The Author 2014. Published by Oxford University Press.

  14. Inside the Melanoplinae: new molecular evidence for the evolutionary history of the Eurasian Podismini (Orthoptera: Acrididae).

    Science.gov (United States)

    Chintauan-Marquier, Ioana C; Amédégnato, Christiane; Nichols, Richard A; Pompanon, François; Grandcolas, Philippe; Desutter-Grandcolas, Laure

    2014-02-01

    The Podismini are melanopline grasshoppers with a Holarctic distribution, well represented in the Eurasian fauna. To investigate their controversial taxonomy and evolutionary history, we studied 86%, 78% and 33% of the Eurasian, European and Asian Palaearctic genera, respectively (Otte, 1995; Eades et al., 2013). We reconstructed parsimony, maximum likelihood and Bayesian phylogenies using fragments of four genes (ITS1, 16S, 12S, CO2). We applied a Bayesian molecular clock to estimate the times of species divergence, and the event-based parsimony method to depict the biogeographic framework of the diversification. Our results suggest that the selected Eurasian Podismini constitute a monophyletic group inside the Melanoplinae, provided it includes the North American genus Phaulotettix. The clades proposed by the present study inside the Podismini do not fit the older morphological or cytological classifications, but are in agreement with more recent proposals. Furthermore, our results can be explained by a plausible biogeographic history in which the present geographical distribution of the Eurasian Podismini resulted from known changes to the Cenozoic climate and vegetation, induced by major geological events including the genesis of high mountain chains (e.g., Himalayas, Altay, Alps) and large deserts (e.g., Gobi, Karakoum, Taklamakan), and the opening of marginal seas (e.g., Bering, Japanese and Yellow Seas). Copyright © 2014. Published by Elsevier Inc.

  15. A full-capture Hierarchical Bayesian model of Pollock's Closed Robust Design and application to dolphins

    Directory of Open Access Journals (Sweden)

    Robert William Rankin

    2016-03-01

    We present a Hierarchical Bayesian version of Pollock's Closed Robust Design (PCRD) for studying the survival, temporary migration, and abundance of marked animals. Through simulations and analyses of a bottlenose dolphin photo-identification dataset, we compare several estimation frameworks, including Maximum Likelihood estimation (ML), model-averaging by AICc, as well as Bayesian and Hierarchical Bayesian (HB) procedures. Our results demonstrate a number of advantages of the Bayesian framework over other popular methods. First, for simple fixed-effect models, we show the near-equivalence of Bayesian and ML point-estimates and confidence/credibility intervals. Second, we demonstrate that there is an inherent correlation among temporary-migration and survival parameter estimates in the PCRD, and while this can lead to serious convergence issues and singularities among MLEs, we show that the Bayesian estimates were more reliable. Third, we demonstrate that a Hierarchical Bayesian model with carefully thought-out hyperpriors can lead to similar parameter estimates and conclusions as multi-model inference by AICc model-averaging. This latter point is especially interesting for mark-recapture practitioners, for whom model uncertainty and multi-model inference have become a major preoccupation. Lastly, we extend the Hierarchical Bayesian PCRD to include full-capture histories (i.e., by modelling a recruitment process) and individual-level heterogeneity in detection probabilities, which can have important consequences for the range of phenomena studied by the PCRD, as well as lead to large differences in abundance estimates. For example, we estimate 8%-24% more bottlenose dolphins in the western gulf of Shark Bay than previously estimated by ML and AICc-based model-averaging. Other important extensions are discussed. Our Bayesian PCRD models are written in the BUGS-like JAGS language for easy dissemination and customization by the community of capture

  16. ACL Reconstruction

    Science.gov (United States)

    ... in moderate exercise and recreational activities, or play sports that put less stress on the knees. ACL reconstruction is generally recommended if: You're an athlete and want to continue in your sport, especially if the sport involves jumping, cutting or ...

  17. Project Reconstruct.

    Science.gov (United States)

    Helisek, Harriet; Pratt, Donald

    1994-01-01

    Presents a project in which students monitor their use of trash, input and analyze information via a database and computerized graphs, and "reconstruct" extinct or endangered animals from recyclable materials. The activity was done with second-grade students over a period of three to four weeks. (PR)

  18. Reconstructing the Limfjord’s history

    DEFF Research Database (Denmark)

    Philippsen, Bente

    The Limfjord is a sound in Northern Jutland, Denmark, connecting the North Sea with the Kattegatt. The complex interplay of eustatic sea level changes and isostatic land-rise caused the relative sea level of the region to fluctuate throughout the later part of the Holocene. Consequently, the regi...

  19. Bayesian Model Averaging for Propensity Score Analysis.

    Science.gov (United States)

    Kaplan, David; Chen, Jianshen

    2014-01-01

    This article considers Bayesian model averaging as a means of addressing uncertainty in the selection of variables in the propensity score equation. We investigate an approximate Bayesian model averaging approach based on the model-averaged propensity score estimates produced by the R package BMA but that ignores uncertainty in the propensity score. We also provide a fully Bayesian model averaging approach via Markov chain Monte Carlo sampling (MCMC) to account for uncertainty in both parameters and models. A detailed study of our approach examines the differences in the causal estimate when incorporating noninformative versus informative priors in the model averaging stage. We examine these approaches under common methods of propensity score implementation. In addition, we evaluate the impact of changing the size of Occam's window used to narrow down the range of possible models. We also assess the predictive performance of both Bayesian model averaging propensity score approaches and compare it with the case without Bayesian model averaging. Overall, results show that both Bayesian model averaging propensity score approaches recover the treatment effect estimates well and generally provide larger uncertainty estimates, as expected. Both Bayesian model averaging approaches offer slightly better prediction of the propensity score compared with the Bayesian approach with a single propensity score equation. Covariate balance checks for the case study show that both Bayesian model averaging approaches offer good balance. The fully Bayesian model averaging approach also provides posterior probability intervals of the balance indices.
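    One common shortcut for Bayesian model averaging is to approximate posterior model probabilities with BIC weights and average each unit's propensity score across models. The sketch below uses that shortcut with invented BIC values and scores; the paper's own implementation uses the R package BMA and MCMC, not this approximation:

```python
# BIC-weighted model averaging of propensity scores (illustrative numbers).
import math

# hypothetical candidate propensity-score models and their BICs
bics = {"m1": 210.3, "m2": 212.1, "m3": 215.7}

# per-unit propensity scores predicted by each model (two units shown)
scores = {"m1": [0.31, 0.64], "m2": [0.28, 0.60], "m3": [0.40, 0.71]}

# posterior model probability ~ exp(-BIC/2), normalised over the model set
# (subtracting the minimum BIC first for numerical stability)
raw = {m: math.exp(-0.5 * (b - min(bics.values()))) for m, b in bics.items()}
z = sum(raw.values())
weights = {m: r / z for m, r in raw.items()}

# model-averaged propensity score for each unit
n = len(next(iter(scores.values())))
avg = [sum(weights[m] * scores[m][i] for m in scores) for i in range(n)]
print({m: round(w, 3) for m, w in weights.items()}, [round(a, 3) for a in avg])
```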

  20. Pedestrian dynamics via Bayesian networks

    Science.gov (United States)

    Venkat, Ibrahim; Khader, Ahamad Tajudin; Subramanian, K. G.

    2014-06-01

    Studies on pedestrian dynamics have vital applications in crowd control management relevant to organizing safer large-scale gatherings, including pilgrimages. Reasoning about pedestrian motion via computational intelligence techniques can be posed as a potential research problem within the realms of Artificial Intelligence. In this contribution, we propose a "Bayesian Network Model for Pedestrian Dynamics" (BNMPD) to reason about the vast uncertainty imposed by pedestrian motion. With reference to key findings from the literature, which include simulation studies, we systematically identify: what are the various factors that could contribute to the prediction of crowd flow status? The proposed model unifies these factors in a cohesive manner using Bayesian Networks (BNs) and serves as a sophisticated probabilistic tool to simulate vital cause-and-effect relationships entailed in the pedestrian domain.
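    The kind of cause-and-effect reasoning a Bayesian network supports can be shown with a toy two-parent network; the structure and the probability tables below are invented, not from the BNMPD model:

```python
# Tiny Bayesian network: Density -> CrowdFlow <- EventSize.
# Query P(CrowdFlow = congested | Density) by enumerating the hidden parent.
P_density = {"high": 0.3, "low": 0.7}
P_event = {"large": 0.4, "small": 0.6}
# conditional probability table P(CrowdFlow = congested | Density, EventSize)
P_congested = {("high", "large"): 0.9, ("high", "small"): 0.6,
               ("low", "large"): 0.4, ("low", "small"): 0.1}

def p_congested_given_density(d):
    # marginalise over the unobserved parent EventSize
    return sum(P_event[e] * P_congested[(d, e)] for e in P_event)

print(round(p_congested_given_density("high"), 2))  # → 0.72
```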

  1. Bayesian Networks and Influence Diagrams

    DEFF Research Database (Denmark)

    Kjærulff, Uffe Bro; Madsen, Anders Læsø

     Probabilistic networks, also known as Bayesian networks and influence diagrams, have become one of the most promising technologies in the area of applied artificial intelligence, offering intuitive, efficient, and reliable methods for diagnosis, prediction, decision making, classification, troubleshooting, and data mining under uncertainty. Bayesian Networks and Influence Diagrams: A Guide to Construction and Analysis provides a comprehensive guide for practitioners who wish to understand, construct, and analyze intelligent systems for decision support based on probabilistic networks. Intended primarily for practitioners, this book does not require sophisticated mathematical skills or deep understanding of the underlying theory and methods, nor does it discuss alternative technologies for reasoning under uncertainty. The theory and methods presented are illustrated through more than 140 examples...

  2. BAYESIAN IMAGE RESTORATION, USING CONFIGURATIONS

    Directory of Open Access Journals (Sweden)

    Thordis Linda Thorarinsdottir

    2011-05-01

In this paper, we develop a Bayesian procedure for removing noise from images that can be viewed as noisy realisations of random sets in the plane. The procedure utilises recent advances in configuration theory for noise-free random sets, where the probabilities of observing the different boundary configurations are expressed in terms of the mean normal measure of the random set. These probabilities are used as prior probabilities in a Bayesian image restoration approach. Estimation of the remaining parameters in the model is outlined for salt-and-pepper noise. The inference in the model is discussed in detail for 3 × 3 and 5 × 5 configurations, and examples of the performance of the procedure are given.

  3. Bayesian Inference on Proportional Elections

    Science.gov (United States)

    Brunello, Gabriel Hideki Vatanabe; Nakano, Eduardo Yoshio

    2015-01-01

Polls for majoritarian voting systems usually show estimates of the percentage of votes for each candidate. However, proportional vote systems do not necessarily guarantee that the candidate with the highest percentage of votes will be elected. Thus, traditional methods used in majoritarian elections cannot be applied to proportional elections. In this context, the purpose of this paper was to perform Bayesian inference on proportional elections considering the Brazilian system of seat distribution. More specifically, a methodology was developed to estimate the probability that a given party will have representation in the Chamber of Deputies. Inferences were made in a Bayesian scenario using the Monte Carlo simulation technique, and the developed methodology was applied to data from the 2010 Brazilian elections for Members of the Legislative Assembly and Federal Chamber of Deputies. A performance rate was also presented to evaluate the efficiency of the methodology. Calculations and simulations were carried out using the free R statistical software. PMID:25786259
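
The paper's exact seat-allocation rules are not reproduced here; as a rough sketch of the idea, the following assumes a D'Hondt highest-averages rule (a common proportional method) and a Dirichlet posterior over vote shares under a uniform prior, with hypothetical poll counts:

```python
import random

def dhondt(votes, seats):
    """Allocate seats by the D'Hondt highest-averages method."""
    alloc = [0] * len(votes)
    for _ in range(seats):
        # next seat goes to the party with the highest quotient v / (a + 1)
        quotients = [v / (a + 1) for v, a in zip(votes, alloc)]
        alloc[quotients.index(max(quotients))] += 1
    return alloc

def prob_representation(counts, seats, draws=10_000, rng=random.Random(42)):
    """Estimate P(each party wins >= 1 seat), integrating over a
    Dirichlet posterior on vote shares (uniform prior, multinomial data)."""
    hits = [0] * len(counts)
    for _ in range(draws):
        # Dirichlet(counts + 1) sample via normalized Gamma draws
        g = [rng.gammavariate(c + 1, 1.0) for c in counts]
        shares = [x / sum(g) for x in g]
        for i, s in enumerate(dhondt(shares, seats)):
            if s > 0:
                hits[i] += 1
    return [h / draws for h in hits]

poll = [520, 310, 120, 50]   # hypothetical poll counts for four parties
print(prob_representation(poll, seats=10))
```

A Monte Carlo estimate like this answers a question a point estimate of vote share cannot: how likely each party is to clear the effective seat threshold.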

  4. Bayesian analyses of cognitive architecture.

    Science.gov (United States)

    Houpt, Joseph W; Heathcote, Andrew; Eidels, Ami

    2017-06-01

    The question of cognitive architecture-how cognitive processes are temporally organized-has arisen in many areas of psychology. This question has proved difficult to answer, with many proposed solutions turning out to be spurious. Systems factorial technology (Townsend & Nozawa, 1995) provided the first rigorous empirical and analytical method of identifying cognitive architecture, using the survivor interaction contrast (SIC) to determine when people are using multiple sources of information in parallel or in series. Although the SIC is based on rigorous nonparametric mathematical modeling of response time distributions, for many years inference about cognitive architecture has relied solely on visual assessment. Houpt and Townsend (2012) recently introduced null hypothesis significance tests, and here we develop both parametric and nonparametric (encompassing prior) Bayesian inference. We show that the Bayesian approaches can have considerable advantages. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  5. Deep Learning and Bayesian Methods

    Directory of Open Access Journals (Sweden)

    Prosper Harrison B.

    2017-01-01

A revolution is underway in which deep neural networks are routinely used to solve difficult problems such as face recognition and natural language understanding. Particle physicists have taken notice and have started to deploy these methods, achieving results that suggest a potentially significant shift in how data might be analyzed in the not too distant future. We discuss a few recent developments in the application of deep neural networks and then indulge in speculation about how such methods might be used to automate certain aspects of data analysis in particle physics. Next, the connection to Bayesian methods is discussed, and the paper ends with thoughts on a significant practical issue, namely how, from a Bayesian perspective, one might optimize the construction of deep neural networks.

  7. Multiview Bayesian Correlated Component Analysis

    DEFF Research Database (Denmark)

    Kamronn, Simon Due; Poulsen, Andreas Trier; Hansen, Lars Kai

    2015-01-01

    are identical. Here we propose a hierarchical probabilistic model that can infer the level of universality in such multiview data, from completely unrelated representations, corresponding to canonical correlation analysis, to identical representations as in correlated component analysis. This new model, which...... we denote Bayesian correlated component analysis, evaluates favorably against three relevant algorithms in simulated data. A well-established benchmark EEG data set is used to further validate the new model and infer the variability of spatial representations across multiple subjects....

  8. Reliability analysis with Bayesian networks

    OpenAIRE

    Zwirglmaier, Kilian Martin

    2017-01-01

Bayesian networks (BNs) represent a probabilistic modeling tool with large potential for reliability engineering. While BNs have been successfully applied to reliability engineering, there are remaining issues, some of which are addressed in this work. Firstly, a classification of BN elicitation approaches is proposed. Secondly, two approximate inference approaches are proposed, one based on discretization and the other on sampling. These approaches are applicable to hybrid/con...

  9. Interim Bayesian Persuasion: First Steps

    OpenAIRE

    Perez, Eduardo

    2015-01-01

    This paper makes a first attempt at building a theory of interim Bayesian persuasion. I work in a minimalist model where a low or high type sender seeks validation from a receiver who is willing to validate high types exclusively. After learning her type, the sender chooses a complete conditional information structure for the receiver from a possibly restricted feasible set. I suggest a solution to this game that takes into account the signaling potential of the sender's choice.

  10. Bayesian Sampling using Condition Indicators

    DEFF Research Database (Denmark)

    Faber, Michael H.; Sørensen, John Dalsgaard

    2002-01-01

    . This allows for a Bayesian formulation of the indicators whereby the experience and expertise of the inspection personnel may be fully utilized and consistently updated as frequentistic information is collected. The approach is illustrated on an example considering a concrete structure subject to corrosion....... It is shown how half-cell potential measurements may be utilized to update the probability of excessive repair after 50 years....

  11. Computational Neuropsychology and Bayesian Inference

    Directory of Open Access Journals (Sweden)

    Thomas Parr

    2018-02-01

Computational theories of brain function have become very influential in neuroscience. They have facilitated the growth of formal approaches to disease, particularly in psychiatric research. In this paper, we provide a narrative review of the body of computational research addressing neuropsychological syndromes, and focus on those that employ Bayesian frameworks. Bayesian approaches to understanding brain function formulate perception and action as inferential processes. These inferences combine 'prior' beliefs with a generative (predictive) model to explain the causes of sensations. Under this view, neuropsychological deficits can be thought of as false inferences that arise due to aberrant prior beliefs (that are poor fits to the real world). This draws upon the notion of a Bayes-optimal pathology – optimal inference with suboptimal priors – and provides a means for computational phenotyping. In principle, any given neuropsychological disorder could be characterized by the set of prior beliefs that would make a patient's behavior appear Bayes optimal. We start with an overview of some key theoretical constructs and use these to motivate a form of computational neuropsychology that relates anatomical structures in the brain to the computations they perform. Throughout, we draw upon computational accounts of neuropsychological syndromes. These are selected to emphasize the key features of a Bayesian approach, and the possible types of pathological prior that may be present. They range from visual neglect through hallucinations to autism. Through these illustrative examples, we review the use of Bayesian approaches to understand the link between biology and computation that is at the heart of neuropsychology.

  12. Bayesian methods applied to GWAS.

    Science.gov (United States)

    Fernando, Rohan L; Garrick, Dorian

    2013-01-01

Bayesian multiple-regression methods are being successfully used for genomic prediction and selection. These regression models simultaneously fit many more markers than the number of observations available for the analysis. Thus, Bayes' theorem is used to combine prior beliefs about marker effects, which are expressed in terms of prior distributions, with information from the data for inference. Often, the analyses are too complex for closed-form solutions, and Markov chain Monte Carlo (MCMC) sampling is used to draw inferences from posterior distributions. This chapter describes how these Bayesian multiple-regression analyses can be used for GWAS. In most GWAS, false positives are controlled by limiting the genome-wise error rate, which is the probability of one or more false-positive results, to a small value. As the number of tests in GWAS is very large, this results in very low power. Here we show how, in Bayesian GWAS, false positives can be controlled by limiting the proportion of false-positive results among all positives to some small value. The advantage of this approach is that the power of detecting associations is not inversely related to the number of markers.
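
The control rule described in the abstract, limiting the expected proportion of false positives among declared associations, can be sketched as follows. The posterior probabilities below are hypothetical; in practice they would come from the MCMC output of the multiple-regression model:

```python
def bayes_fdr_select(ppa, fdr_max=0.05):
    """Select markers so that the estimated proportion of false positives
    among all declared positives stays below fdr_max.

    ppa : posterior probabilities of association per marker (e.g. posterior
          inclusion probabilities from a Bayesian multiple-regression model).
    """
    # rank markers by decreasing posterior probability of association
    order = sorted(range(len(ppa)), key=lambda i: -ppa[i])
    selected, err_sum = [], 0.0
    for i in order:
        err_sum += 1.0 - ppa[i]              # expected false positives so far
        if err_sum / (len(selected) + 1) > fdr_max:
            break                            # adding marker i would exceed the bound
        selected.append(i)
    return selected

ppa = [0.99, 0.97, 0.90, 0.60, 0.20, 0.05]   # hypothetical marker probabilities
print(bayes_fdr_select(ppa, 0.05))           # -> [0, 1, 2]
```

Unlike a genome-wise error rate, this bound does not tighten as more markers are tested, which is why the power of the Bayesian approach does not shrink with the number of markers.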

  13. Nonlinear reconstruction

    Science.gov (United States)

    Zhu, Hong-Ming; Yu, Yu; Pen, Ue-Li; Chen, Xuelei; Yu, Hao-Ran

    2017-12-01

We present a direct approach to nonparametrically reconstruct the linear density field from an observed nonlinear map. We solve for the unique displacement potential consistent with the nonlinear density and positive definite coordinate transformation using a multigrid algorithm. We show that we recover the linear initial conditions up to the nonlinear scale (cross-correlation r_{δδ_L} > 0.5 for k ≲ 1 h/Mpc) with minimal computational cost. This reconstruction approach generalizes the linear displacement theory to fully nonlinear fields, potentially substantially expanding the baryon acoustic oscillations and redshift space distortions information content of dense large scale structure surveys, including for example the SDSS main sample and 21 cm intensity mapping initiatives.

  14. 12th Brazilian Meeting on Bayesian Statistics

    CERN Document Server

    Louzada, Francisco; Rifo, Laura; Stern, Julio; Lauretto, Marcelo

    2015-01-01

    Through refereed papers, this volume focuses on the foundations of the Bayesian paradigm; their comparison to objectivistic or frequentist Statistics counterparts; and the appropriate application of Bayesian foundations. This research in Bayesian Statistics is applicable to data analysis in biostatistics, clinical trials, law, engineering, and the social sciences. EBEB, the Brazilian Meeting on Bayesian Statistics, is held every two years by the ISBrA, the International Society for Bayesian Analysis, one of the most active chapters of the ISBA. The 12th meeting took place March 10-14, 2014 in Atibaia. Interest in foundations of inductive Statistics has grown recently in accordance with the increasing availability of Bayesian methodological alternatives. Scientists need to deal with the ever more difficult choice of the optimal method to apply to their problem. This volume shows how Bayes can be the answer. The examination and discussion on the foundations work towards the goal of proper application of Bayesia...

  15. Compiling Relational Bayesian Networks for Exact Inference

    DEFF Research Database (Denmark)

    Jaeger, Manfred; Chavira, Mark; Darwiche, Adnan

    2004-01-01

We describe a system for exact inference with relational Bayesian networks as defined in the publicly available Primula tool. The system is based on compiling propositional instances of relational Bayesian networks into arithmetic circuits and then performing online inference by evaluating...... and differentiating these circuits in time linear in their size. We report on experimental results showing the successful compilation, and efficient inference, on relational Bayesian networks whose Primula-generated propositional instances have thousands of variables, and whose jointrees have clusters...

  16. Bayesian Posterior Distributions Without Markov Chains

    OpenAIRE

    Cole, Stephen R.; Chu, Haitao; Greenland, Sander; Hamra, Ghassan; Richardson, David B.

    2012-01-01

    Bayesian posterior parameter distributions are often simulated using Markov chain Monte Carlo (MCMC) methods. However, MCMC methods are not always necessary and do not help the uninitiated understand Bayesian inference. As a bridge to understanding Bayesian inference, the authors illustrate a transparent rejection sampling method. In example 1, they illustrate rejection sampling using 36 cases and 198 controls from a case-control study (1976–1983) assessing the relation between residential ex...
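
The authors' worked examples are not reproduced here; the sketch below applies the same transparent rejection idea to a simplified binomial model with a Uniform(0, 1) prior (the 36-of-234 figures merely echo the case and subject counts mentioned in the abstract, treated here as binomial data for illustration):

```python
import math
import random

def rejection_sample_posterior(successes, n, draws=5_000, rng=random.Random(0)):
    """Transparent rejection sampling of p(theta | data) for a binomial
    likelihood with a Uniform(0, 1) prior -- no Markov chain required."""
    def loglik(t):
        return successes * math.log(t) + (n - successes) * math.log(1.0 - t)
    log_lmax = loglik(successes / n)     # likelihood peaks at the MLE
    samples = []
    while len(samples) < draws:
        t = rng.random()                 # candidate drawn from the prior
        u = rng.random()
        if t == 0.0 or u == 0.0:
            continue                     # avoid log(0) on boundary draws
        # accept with probability L(t) / L_max
        if math.log(u) < loglik(t) - log_lmax:
            samples.append(t)
    return samples

# Hypothetical binomial reading of 36 cases among 234 subjects
post = rejection_sample_posterior(successes=36, n=234)
print(sum(post) / len(post))   # close to the Beta(37, 199) posterior mean, 37/236
```

Every accepted draw is an exact, independent sample from the posterior, which is what makes the method a useful teaching bridge before introducing MCMC.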

  17. Robustness of ancestral sequence reconstruction to phylogenetic uncertainty.

    Science.gov (United States)

    Hanson-Smith, Victor; Kolaczkowski, Bryan; Thornton, Joseph W

    2010-09-01

    Ancestral sequence reconstruction (ASR) is widely used to formulate and test hypotheses about the sequences, functions, and structures of ancient genes. Ancestral sequences are usually inferred from an alignment of extant sequences using a maximum likelihood (ML) phylogenetic algorithm, which calculates the most likely ancestral sequence assuming a probabilistic model of sequence evolution and a specific phylogeny--typically the tree with the ML. The true phylogeny is seldom known with certainty, however. ML methods ignore this uncertainty, whereas Bayesian methods incorporate it by integrating the likelihood of each ancestral state over a distribution of possible trees. It is not known whether Bayesian approaches to phylogenetic uncertainty improve the accuracy of inferred ancestral sequences. Here, we use simulation-based experiments under both simplified and empirically derived conditions to compare the accuracy of ASR carried out using ML and Bayesian approaches. We show that incorporating phylogenetic uncertainty by integrating over topologies very rarely changes the inferred ancestral state and does not improve the accuracy of the reconstructed ancestral sequence. Ancestral state reconstructions are robust to uncertainty about the underlying tree because the conditions that produce phylogenetic uncertainty also make the ancestral state identical across plausible trees; conversely, the conditions under which different phylogenies yield different inferred ancestral states produce little or no ambiguity about the true phylogeny. Our results suggest that ML can produce accurate ASRs, even in the face of phylogenetic uncertainty. Using Bayesian integration to incorporate this uncertainty is neither necessary nor beneficial.
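
The contrast between conditioning on the ML tree and integrating over trees reduces to a weighted average. The numbers below are hypothetical, chosen to mirror the paper's finding that plausible trees imply nearly the same ancestral state:

```python
# Hypothetical posterior weights of three candidate trees and the
# probability of ancestral state 'A' at a node under each tree.
tree_post = [0.70, 0.20, 0.10]          # P(tree | data)
p_A_given_tree = [0.95, 0.93, 0.94]     # P(state = 'A' | tree, data)

# ML practice: condition on the single best-supported tree
p_A_ml = p_A_given_tree[0]

# Bayesian practice: integrate over tree uncertainty
p_A_bayes = sum(w * p for w, p in zip(tree_post, p_A_given_tree))

print(p_A_ml, round(p_A_bayes, 4))      # 0.95 0.945
```

When the per-tree state probabilities agree, as here, the integral barely moves the answer, which is the paper's explanation for why ML-based ASR is robust to phylogenetic uncertainty.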

  18. 3rd Bayesian Young Statisticians Meeting

    CERN Document Server

    Lanzarone, Ettore; Villalobos, Isadora; Mattei, Alessandra

    2017-01-01

    This book is a selection of peer-reviewed contributions presented at the third Bayesian Young Statisticians Meeting, BAYSM 2016, Florence, Italy, June 19-21. The meeting provided a unique opportunity for young researchers, M.S. students, Ph.D. students, and postdocs dealing with Bayesian statistics to connect with the Bayesian community at large, to exchange ideas, and to network with others working in the same field. The contributions develop and apply Bayesian methods in a variety of fields, ranging from the traditional (e.g., biostatistics and reliability) to the most innovative ones (e.g., big data and networks).

  19. Learning dynamic Bayesian networks with mixed variables

    DEFF Research Database (Denmark)

    Bøttcher, Susanne Gammelgaard

This paper considers dynamic Bayesian networks for discrete and continuous variables. We only treat the case where the distribution of the variables is conditional Gaussian. We show how to learn the parameters and structure of a dynamic Bayesian network and also how the Markov order can be learned..... An automated procedure for specifying prior distributions for the parameters in a dynamic Bayesian network is presented. It is a simple extension of the procedure for ordinary Bayesian networks. Finally, the Wölfer sunspot numbers are analyzed....

  20. Reconstructing the post-LGM decay of the Eurasian Ice Sheets with Ice Sheet Models; data-model comparison and focus on the Storfjorden (Svalbard) ice stream dynamics history

    Science.gov (United States)

    Petrini, Michele; Kirchner, Nina; Colleoni, Florence; Camerlenghi, Angelo; Rebesco, Michele; Lucchi, Renata G.; Forte, Emanuele; Colucci, Renato R.

    2017-04-01

Reconstructing the past growth and decay of palaeo-ice sheets is a critical task for better understanding mechanisms of present and future global climate change. The Last Glacial Maximum (LGM) and the subsequent deglaciation until Pre-Industrial time (PI) represent an excellent testing ground for numerical Ice Sheet Models (ISMs), due to the abundant data available that can be used in an ISM as boundary conditions, forcings, or constraints to test the ISMs' results. In our study, we simulate with ISMs the post-LGM decay of the Eurasian Ice Sheets, with a focus on the marine-based Svalbard-Barents Sea-Kara Sea Ice Sheet. In particular, we aim to reconstruct the Storfjorden ice stream dynamics history by comparing the model results with the marine geological data (MSGLs, GZWs, sediment core analyses) available from the area, e.g., Pedrosa et al. 2011, Rebesco et al. 2011, 2013, Lucchi et al. 2013. Two hybrid SIA/SSA ISMs are employed: GRISLI, Ritz et al. 2001, and PSU, Pollard & DeConto 2012. These models differ mainly in the complexity with which grounding-line migration is treated. Climate forcing is interpolated by means of climate indexes between LGM and PI climate. Regional climate indexes are constructed based on the non-accelerated deglaciation transient experiment carried out with CCSM3, Liu et al. 2009. Indexes representative of the climate evolution over Siberia, Svalbard and Scandinavia are employed. The impact of such a refined representation, as opposed to the common use of the NGRIP δ18O index for transient experiments, is analysed. In this study, the ice-ocean interaction is crucial to reconstructing the Storfjorden ice stream dynamics history. To investigate the sensitivity of the ice shelf/stream retreat to ocean temperature, we allow for a space-time variation of basal melting under the ice shelves by testing two-equation implementations based on Martin et al. 2011 forced with simulated ocean temperature and salinity from the TraCE-21ka coupled

  1. ABCtoolbox: a versatile toolkit for approximate Bayesian computations.

    Science.gov (United States)

    Wegmann, Daniel; Leuenberger, Christoph; Neuenschwander, Samuel; Excoffier, Laurent

    2010-03-04

The estimation of demographic parameters from genetic data often requires the computation of likelihoods. However, the likelihood function is computationally intractable for many realistic evolutionary models, and the use of Bayesian inference has therefore been limited to very simple models. The situation changed recently with the advent of Approximate Bayesian Computation (ABC) algorithms, which allow one to obtain parameter posterior distributions based on simulations, without likelihood computations. Here we present ABCtoolbox, a series of open-source programs to perform Approximate Bayesian Computations (ABC). It implements various ABC algorithms, including rejection sampling, MCMC without likelihood, a particle-based sampler, and ABC-GLM. ABCtoolbox is bundled with, but not limited to, a program that allows parameter inference in a population genetics context and the simultaneous use of different types of markers with different ploidy levels. In addition, ABCtoolbox can also interact with most simulation and summary-statistics computation programs. The usability of ABCtoolbox is demonstrated by inferring the evolutionary history of two evolutionary lineages of Microtus arvalis. Using nuclear microsatellites and mitochondrial sequence data in the same estimation procedure enabled us to infer sex-specific population sizes and migration rates, and to find that males show smaller population sizes but much higher levels of migration than females. ABCtoolbox allows a user to perform all the necessary steps of a full ABC analysis, from sampling parameters from prior distributions and simulating data, through computing summary statistics, estimating posterior distributions, and performing model choice, to validating the estimation procedure and visualizing the results.
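
ABCtoolbox's own algorithms are not reproduced here; the basic rejection flavour it implements can be sketched on a toy Gaussian model (the model, summary statistic, and all numbers below are illustrative):

```python
import random
import statistics

def abc_rejection(observed, simulate, prior_draw, tol, draws=20_000,
                  rng=random.Random(1)):
    """Basic ABC rejection: keep parameter values whose simulated summary
    statistic lands within tol of the observed one -- no likelihood needed."""
    s_obs = statistics.mean(observed)            # summary statistic of the data
    accepted = []
    for _ in range(draws):
        theta = prior_draw(rng)                  # sample from the prior
        data = simulate(theta, len(observed), rng)
        if abs(statistics.mean(data) - s_obs) < tol:
            accepted.append(theta)               # approximate posterior draw
    return accepted

# Toy model: data ~ Normal(theta, 1), prior theta ~ Uniform(-10, 10)
obs = [2.1, 1.8, 2.4, 2.0, 1.9, 2.2, 2.3, 1.7]
post = abc_rejection(
    obs,
    simulate=lambda t, n, rng: [rng.gauss(t, 1.0) for _ in range(n)],
    prior_draw=lambda rng: rng.uniform(-10, 10),
    tol=0.2,
)
print(statistics.mean(post))   # sits near the sample mean of obs
```

The accepted values approximate the posterior without ever evaluating a likelihood; tightening `tol` improves the approximation at the cost of a lower acceptance rate.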

  2. Bayesian flood forecasting methods: A review

    Science.gov (United States)

    Han, Shasha; Coulibaly, Paulin

    2017-08-01

Over the past few decades, floods have been among the most common and widely distributed natural disasters in the world. If floods could be accurately forecasted in advance, their negative impacts could be greatly reduced. It is widely recognized that quantification and reduction of the uncertainty associated with hydrologic forecasts is of great importance for flood estimation and rational decision making. The Bayesian forecasting system (BFS) offers an ideal theoretical framework for uncertainty quantification that can be developed for probabilistic flood forecasting via any deterministic hydrologic model. It provides a suitable theoretical structure, empirically validated models, and reasonable analytic-numerical computation methods, and can be developed into various Bayesian forecasting approaches. This paper presents a comprehensive review of Bayesian forecasting approaches applied in flood forecasting from 1999 until now. The review starts with an overview of the fundamentals of BFS and recent advances in BFS, followed by BFS applications in river stage forecasting and real-time flood forecasting; it then moves to a critical analysis evaluating the advantages and limitations of Bayesian forecasting methods and other predictive-uncertainty assessment approaches in flood forecasting, and finally discusses future research directions in Bayesian flood forecasting. Results show that the Bayesian flood forecasting approach is an effective and advanced way to perform flood estimation: it considers all sources of uncertainty and produces a predictive distribution of the river stage, river discharge, or runoff, thus giving more accurate and reliable flood forecasts. Some emerging Bayesian forecasting methods (e.g. the ensemble Bayesian forecasting system and Bayesian multi-model combination) were shown to overcome the limitations of a single model or fixed model weights and to effectively reduce predictive uncertainty.
In recent years, various Bayesian flood forecasting approaches have been

  3. Bayesian inference for Hawkes processes

    DEFF Research Database (Denmark)

    Rasmussen, Jakob Gulddahl

    2013-01-01

    The Hawkes process is a practically and theoretically important class of point processes, but parameter-estimation for such a process can pose various problems. In this paper we explore and compare two approaches to Bayesian inference. The first approach is based on the so-called conditional...... intensity function, while the second approach is based on an underlying clustering and branching structure in the Hawkes process. For practical use, MCMC (Markov chain Monte Carlo) methods are employed. The two approaches are compared numerically using three examples of the Hawkes process....
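
The conditional-intensity route to inference rests on two quantities that have closed forms for an exponential excitation kernel: the intensity itself and the point-process log-likelihood a Bayesian sampler would evaluate. A minimal sketch (parameters and event times are made up):

```python
import math

def hawkes_intensity(t, events, mu, alpha, beta):
    """Conditional intensity of a Hawkes process with exponential kernel:
    lambda(t) = mu + alpha * sum_{t_i < t} exp(-beta * (t - t_i))."""
    return mu + alpha * sum(math.exp(-beta * (t - ti)) for ti in events if ti < t)

def hawkes_loglik(events, T, mu, alpha, beta):
    """Point-process log-likelihood on [0, T]:
    sum_i log lambda(t_i) minus the integral of lambda over [0, T]
    (the compensator, available in closed form for this kernel)."""
    log_sum = sum(math.log(hawkes_intensity(ti, events, mu, alpha, beta))
                  for ti in events)
    compensator = mu * T + (alpha / beta) * sum(
        1.0 - math.exp(-beta * (T - ti)) for ti in events)
    return log_sum - compensator

events = [0.5, 1.1, 1.3, 4.0, 4.2]   # illustrative event times
print(hawkes_intensity(2.0, events, mu=0.5, alpha=0.8, beta=1.5))
print(hawkes_loglik(events, T=5.0, mu=0.5, alpha=0.8, beta=1.5))
```

Wrapping `hawkes_loglik` with priors on (mu, alpha, beta) inside an MCMC loop gives the first of the two inference approaches the paper compares; the cluster-based approach instead augments the data with the latent branching structure.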

  5. Attention in a bayesian framework

    DEFF Research Database (Denmark)

    Whiteley, Louise Emma; Sahani, Maneesh

    2012-01-01

    , and include both selective phenomena, where attention is invoked by cues that point to particular stimuli, and integrative phenomena, where attention is invoked dynamically by endogenous processing. However, most previous Bayesian accounts of attention have focused on describing relatively simple experimental...... settings, where cues shape expectations about a small number of upcoming stimuli and thus convey "prior" information about clearly defined objects. While operationally consistent with the experiments it seeks to describe, this view of attention as prior seems to miss many essential elements of both its...

  6. Robust bayesian inference of generalized Pareto distribution ...

    African Journals Online (AJOL)

    Abstract. In this work, robust Bayesian estimation of the generalized Pareto distribution is proposed. The methodology is presented in terms of oscillation of posterior risks of the Bayesian estimators. By using a Monte Carlo simulation study, we show that, under a suitable generalized loss function, we can obtain a robust ...

  7. Bayesian Decision Theoretical Framework for Clustering

    Science.gov (United States)

    Chen, Mo

    2011-01-01

In this thesis, we establish a novel probabilistic framework for the data clustering problem from the perspective of Bayesian decision theory. The Bayesian decision theory view clarifies the important questions of what a cluster is and what a clustering algorithm should optimize. We prove that the spectral clustering (to be specific, the…

  8. Using Bayesian belief networks in adaptive management.

    Science.gov (United States)

    J.B. Nyberg; B.G. Marcot; R. Sulyma

    2006-01-01

Bayesian belief and decision networks are relatively new modeling methods that are especially well suited to adaptive-management applications, but they appear not to have been widely used in adaptive management to date. Bayesian belief networks (BBNs) can serve many purposes for practitioners of adaptive management, from illustrating system relations conceptually to...

  9. Calibration in a Bayesian modelling framework

    NARCIS (Netherlands)

    Jansen, M.J.W.; Hagenaars, T.H.J.

    2004-01-01

Bayesian statistics may constitute the core of a consistent and comprehensive framework for the statistical aspects of modelling complex processes that involve many parameters whose values are derived from many sources. Bayesian statistics holds great promise for model calibration, provides the

  10. Particle identification in ALICE: a Bayesian approach

    NARCIS (Netherlands)

    Adam, J.; Adamova, D.; Aggarwal, M. M.; Rinella, G. Aglieri; Agnello, M.; Agrawal, N.; Ahammed, Z.; Ahn, S. U.; Aiola, S.; Akindinov, A.; Alam, S. N.; Albuquerque, D. S. D.; Aleksandrov, D.; Alessandro, B.; Alexandre, D.; Alfaro Molina, R.; Alici, A.; Alkin, A.; Almaraz, J. R. M.; Alme, J.; Alt, T.; Altinpinar, S.; Altsybeev, I.; Alves Garcia Prado, C.; Andrei, C.; Andronic, A.; Anguelov, V.; Anticic, T.; Antinori, F.; Antonioli, P.; Aphecetche, L.; Appelshaeuser, H.; Arcelli, S.; Arnaldi, R.; Arnold, O. W.; Arsene, I. C.; Arslandok, M.; Audurier, B.; Augustinus, A.; Averbeck, R.; Azmi, M. D.; Badala, A.; Baek, Y. W.; Bagnasco, S.; Bailhache, R.; Bala, R.; Balasubramanian, S.; Baldisseri, A.; Baral, R. C.; Barbano, A. M.; Barbera, R.; Barile, F.; Barnafoeldi, G. G.; Barnby, L. S.; Barret, V.; Bartalini, P.; Barth, K.; Bartke, J.; Bartsch, E.; Basile, M.; Bastid, N.; Bathen, B.; Batigne, G.; Camejo, A. Batista; Batyunya, B.; Batzing, P. C.; Bearden, I. G.; Beck, H.; Bedda, C.; Behera, N. K.; Belikov, I.; Bellini, F.; Bello Martinez, H.; Bellwied, R.; Belmont, R.; Belmont-Moreno, E.; Belyaev, V.; Benacek, P.; Bencedi, G.; Beole, S.; Berceanu, I.; Bercuci, A.; Berdnikov, Y.; Berenyi, D.; Bertens, R. A.; Berzano, D.; Betev, L.; Bhasin, A.; Bhat, I. R.; Bhati, A. K.; Bhattacharjee, B.; Bhom, J.; Bianchi, L.; Bianchi, N.; Bianchin, C.; Bielcik, J.; Bielcikova, J.; Bilandzic, A.; Biro, G.; Biswas, R.; Biswas, S.; Bjelogrlic, S.; Blair, J. T.; Blau, D.; Blume, C.; Bock, F.; Bogdanov, A.; Boggild, H.; Boldizsar, L.; Bombara, M.; Book, J.; Borel, H.; Borissov, A.; Borri, M.; Bossu, F.; Botta, E.; Bourjau, C.; Braun-Munzinger, P.; Bregant, M.; Breitner, T.; Broker, T. A.; Browning, T. A.; Broz, M.; Brucken, E. J.; Bruna, E.; Bruno, G. E.; Budnikov, D.; Buesching, H.; Bufalino, S.; Buncic, P.; Busch, O.; Buthelezi, Z.; Butt, J. B.; Buxton, J. T.; Cabala, J.; Caffarri, D.; Cai, X.; Caines, H.; Diaz, L. 
Calero; Caliva, A.; Calvo Villar, E.; Camerini, P.; Carena, F.; Carena, W.; Carnesecchi, F.; Castellanos, J. Castillo; Castro, A. J.; Casula, E. A. R.; Sanchez, C. Ceballos; Cepila, J.; Cerello, P.; Cerkala, J.; Chang, B.; Chapeland, S.; Chartier, M.; Charvet, J. L.; Chattopadhyay, S.; Chattopadhyay, S.; Chauvin, A.; Chelnokov, V.; Cherney, M.; Cheshkov, C.; Cheynis, B.; Barroso, V. Chibante; Chinellato, D. D.; Cho, S.; Chochula, P.; Choi, K.; Chojnacki, M.; Choudhury, S.; Christakoglou, P.; Christensen, C. H.; Christiansen, P.; Chujo, T.; Cicalo, C.; Cifarelli, L.; Cindolo, F.; Cleymans, J.; Colamaria, F.; Colella, D.; Collu, A.; Colocci, M.; Balbastre, G. Conesa; del Valle, Z. Conesa; Connors, M. E.; Contreras, J. G.; Cormier, T. M.; Morales, Y. Corrales; Cortes Maldonado, I.; Cortese, P.; Cosentino, M. R.; Costa, F.; Crochet, P.; Cruz Albino, R.; Cuautle, E.; Cunqueiro, L.; Dahms, T.; Dainese, A.; Danisch, M. C.; Danu, A.; Das, I.; Das, S.; Dash, A.; Dash, S.; De, S.; De Caro, A.; de Cataldo, G.; de Conti, C.; de Cuveland, J.; De Falco, A.; De Gruttola, D.; De Marco, N.; De Pasquale, S.; Deisting, A.; Deloff, A.; Denes, E.; Deplano, C.; Dhankher, P.; Di Bari, D.; Di Mauro, A.; Di Nezza, P.; Corchero, M. A. Diaz; Dietel, T.; Dillenseger, P.; Divia, R.; Djuvsland, O.; Dobrin, A.; Gimenez, D. Domenicis; Doenigus, B.; Dordic, O.; Drozhzhova, T.; Dubey, A. K.; Dubla, A.; Ducroux, L.; Dupieux, P.; Ehlers, R. J.; Elia, D.; Endress, E.; Engel, H.; Epple, E.; Erazmus, B.; Erdemir, I.; Erhardt, F.; Espagnon, B.; Estienne, M.; Esumi, S.; Eum, J.; Evans, D.; Evdokimov, S.; Eyyubova, G.; Fabbietti, L.; Fabris, D.; Faivre, J.; Fantoni, A.; Fasel, M.; Feldkamp, L.; Feliciello, A.; Feofilov, G.; Ferencei, J.; Fernandez Tellez, A.; Ferreiro, E. G.; Ferretti, A.; Festanti, A.; Feuillard, V. J. G.; Figiel, J.; Figueredo, M. A. S.; Filchagin, S.; Finogeev, D.; Fionda, F. M.; Fiore, E. M.; Fleck, M. 
G.; Floris, M.; Foertsch, S.; Foka, P.; Fokin, S.; Fragiacomo, E.; Francescon, A.; Frankenfeld, U.; Fronze, G. G.; Fuchs, U.; Furget, C.; Furs, A.; Girard, M. Fusco; Gaardhoje, J. J.; Gagliardi, M.; Gago, A. M.; Gallio, M.; Gangadharan, D. R.; Ganoti, P.; Gao, C.; Garabatos, C.; Garcia-Solis, E.; Gargiulo, C.; Gasik, P.; Gauger, E. F.; Germain, M.; Gheata, A.; Gheata, M.; Gianotti, P.; Giubellino, P.; Giubilato, P.; Gladysz-Dziadus, E.; Glaessel, P.; Gomez Coral, D. M.; Ramirez, A. Gomez; Gonzalez, A. S.; Gonzalez, V.; Gonzalez-Zamora, P.; Gorbunov, S.; Goerlich, L.; Gotovac, S.; Grabski, V.; Grachov, O. A.; Graczykowski, L. K.; Graham, K. L.; Grelli, A.; Grigoras, A.; Grigoras, C.; Grigoriev, V.; Grigoryan, A.; Grigoryan, S.; Grinyov, B.; Grion, N.; Gronefeld, J. M.; Grosse-Oetringhaus, J. F.; Grosso, R.; Guber, F.; Guernane, R.; Guerzoni, B.; Gulbrandsen, K.; Gunji, T.; Gupta, A.; Haake, R.; Haaland, O.; Hadjidakis, C.; Haiduc, M.; Hamagaki, H.; Hamar, G.; Hamon, J. C.; Harris, J. W.; Harton, A.; Hatzifotiadou, D.; Hayashi, S.; Heckel, S. T.; Hellbaer, E.; Helstrup, H.; Herghelegiu, A.; Herrera Corral, G.; Hess, B. A.; Hetland, K. F.; Hillemanns, H.; Hippolyte, B.; Horak, D.; Hosokawa, R.; Hristov, P.; Humanic, T. J.; Hussain, N.; Hussain, T.; Hutter, D.; Hwang, D. S.; Ilkaev, R.; Inaba, M.; Incani, E.; Ippolitov, M.; Irfan, M.; Ivanov, M.; Ivanov, V.; Izucheev, V.; Jacazio, N.; Jadhav, M. B.; Jadlovska, S.; Jadlovsky, J.; Jahnke, C.; Jakubowska, M. J.; Jang, H. J.; Janik, M. A.; Jayarathna, P. H. S. Y.; Jena, C.; Jena, S.; Bustamante, R. T. Jimenez; Jones, P. G.; Jusko, A.; Kalinak, P.; Kalweit, A.; Kamin, J.; Kaplin, V.; Kar, S.; Uysal, A. Karasu; Karavichev, O.; Karavicheva, T.; Karayan, L.; Karpechev, E.; Kebschull, U.; Keidel, R.; Keijdener, D. L. D.; Keil, M.; Khan, M. Mohisin; Khan, P.; Khan, S. A.; Khanzadeev, A.; Kharlov, Y.; Kileng, B.; Kim, D. W.; Kim, D. J.; Kim, D.; Kim, J. 
S.; Kim, M.; Kim, T.; Kirsch, S.; Kisel, I.; Kiselev, S.; Kisiel, A.; Kiss, G.; Klay, J. L.; Klein, C.; Klein-Boesing, C.; Klewin, S.; Kluge, A.; Knichel, M. L.; Knospe, A. G.; Kobdaj, C.; Kofarago, M.; Kollegger, T.; Kolojvari, A.; Kondratiev, V.; Kondratyeva, N.; Kondratyuk, E.; Konevskikh, A.; Kopcik, M.; Kostarakis, P.; Kour, M.; Kouzinopoulos, C.; Kovalenko, O.; Kovalenko, V.; Kowalski, M.; Meethaleveedu, G. Koyithatta; Kralik, I.; Kravcakova, A.; Krivda, M.; Krizek, F.; Kryshen, E.; Krzewicki, M.; Kubera, A. M.; Kucera, V.; Kuijer, P. G.; Kumar, J.; Kumar, L.; Kumar, S.; Kurashvili, P.; Kurepin, A.; Kurepin, A. B.; Kuryakin, A.; Kweon, M. J.; Kwon, Y.; La Pointe, S. L.; La Rocca, P.; Ladron de Guevara, P.; Lagana Fernandes, C.; Lakomov, I.; Langoy, R.; Lara, C.; Lardeux, A.; Lattuca, A.; Laudi, E.; Lea, R.; Leardini, L.; Lee, G. R.; Lee, S.; Lehas, F.; Lemmon, R. C.; Lenti, V.; Leogrande, E.; Monzon, I. Leon; Leon Vargas, H.; Leoncino, M.; Levai, P.; Lien, J.; Lietava, R.; Lindal, S.; Lindenstruth, V.; Lippmann, C.; Lisa, M. A.; Ljunggren, H. M.; Lodato, D. F.; Loenne, P. I.; Loginov, V.; Loizides, C.; Lopez, X.; Torres, E. Lopez; Lowe, A.; Luettig, P.; Lunardon, M.; Luparello, G.; Lutz, T. H.; Maevskaya, A.; Mager, M.; Mahajan, S.; Mahmood, S. M.; Maire, A.; Majka, R. D.; Malaev, M.; Maldonado Cervantes, I.; Malinina, L.; Mal'Kevich, D.; Malzacher, P.; Mamonov, A.; Manko, V.; Manso, F.; Manzari, V.; Marchisone, M.; Mares, J.; Margagliotti, G. V.; Margotti, A.; Margutti, J.; Marin, A.; Markert, C.; Marquard, M.; Martin, N. A.; Blanco, J. Martin; Martinengo, P.; Martinez, M. I.; Garcia, G. Martinez; Pedreira, M. Martinez; Mas, A.; Masciocchi, S.; Masera, M.; Masoni, A.; Mastroserio, A.; Matyja, A.; Mayer, C.; Mazer, J.; Mazzoni, M. A.; Mcdonald, D.; Meddi, F.; Melikyan, Y.; Menchaca-Rocha, A.; Meninno, E.; Perez, J. Mercado; Meres, M.; Miake, Y.; Mieskolainen, M. M.; Mikhaylov, K.; Milano, L.; Milosevic, J.; Mischke, A.; Mishra, A. 
N.; Miskowiec, D.; Mitra, J.; Mitu, C. M.; Mohammadi, N.; Mohanty, B.; Molnar, L.; Montano Zetina, L.; Montes, E.; De Godoy, D. A. Moreira; Moreno, L. A. P.; Moretto, S.; Morreale, A.; Morsch, A.; Muccifora, V.; Mudnic, E.; Muehlheim, D.; Muhuri, S.; Mukherjee, M.; Mulligan, J. D.; Munhoz, M. G.; Munzer, R. H.; Murakami, H.; Murray, S.; Musa, L.; Musinsky, J.; Naik, B.; Nair, R.; Nandi, B. K.; Nania, R.; Nappi, E.; Naru, M. U.; Natal da Luz, H.; Nattrass, C.; Navarro, S. R.; Nayak, K.; Nayak, R.; Nayak, T. K.; Nazarenko, S.; Nedosekin, A.; Nellen, L.; Ng, F.; Nicassio, M.; Niculescu, M.; Niedziela, J.; Nielsen, B. S.; Nikolaev, S.; Nikulin, S.; Nikulin, V.; Noferini, F.; Nomokonov, P.; Nooren, G.; Noris, J. C. C.; Norman, J.; Nyanin, A.; Nystrand, J.; Oeschler, H.; Oh, S.; Oh, S. K.; Ohlson, A.; Okatan, A.; Okubo, T.; Olah, L.; Oleniacz, J.; Oliveira Da Silva, A. C.; Oliver, M. H.; Onderwaater, J.; Oppedisano, C.; Orava, R.; Oravec, M.; Ortiz Velasquez, A.; Oskarsson, A.; Otwinowski, J.; Oyama, K.; Ozdemir, M.; Pachmayer, Y.; Pagano, D.; Pagano, P.; Paic, G.; Pal, S. K.; Pan, J.; Papikyan, V.; Pappalardo, G. S.; Pareek, P.; Park, W. J.; Parmar, S.; Passfeld, A.; Paticchio, V.; Patra, R. N.; Paul, B.; Pei, H.; Peitzmann, T.; Da Costa, H. Pereira; Peresunko, D.; Lara, C. E. Perez; Lezama, E. Perez; Peskov, V.; Pestov, Y.; Petracek, V.; Petrov, V.; Petrovici, M.; Petta, C.; Piano, S.; Pikna, M.; Pillot, P.; Pimentel, L. O. D. L.; Pinazza, O.; Pinsky, L.; Piyarathna, D. B.; Ploskon, M.; Planinic, M.; Pluta, J.; Pochybova, S.; Podesta-Lerma, P. L. M.; Poghosyan, M. G.; Polichtchouk, B.; Poljak, N.; Poonsawat, W.; Pop, A.; Porteboeuf-Houssais, S.; Porter, J.; Pospisil, J.; Prasad, S. K.; Preghenella, R.; Prino, F.; Pruneau, C. A.; Pshenichnov, I.; Puccio, M.; Puddu, G.; Pujahari, P.; Punin, V.; Putschke, J.; Qvigstad, H.; Rachevski, A.; Raha, S.; Rajput, S.; Rak, J.; Rakotozafindrabe, A.; Ramello, L.; Rami, F.; Raniwala, R.; Raniwala, S.; Raesaenen, S. S.; Rascanu, B. 
T.; Rathee, D.; Read, K. F.; Redlich, K.; Reed, R. J.; Reichelt, P.; Reidt, F.; Ren, X.; Renfordt, R.; Reolon, A. R.; Reshetin, A.; Reygers, K.; Riabov, V.; Ricci, R. A.; Richert, T.; Richter, M.; Riedler, P.; Riegler, W.; Riggi, F.; Ristea, C.; Rocco, E.; Rodriguez Cahuantzi, M.; Manso, A. Rodriguez; Roed, K.; Rogochaya, E.; Rohr, D.; Roehrich, D.; Ronchetti, F.; Ronflette, L.; Rosnet, P.; Rossi, A.; Roukoutakis, F.; Roy, A.; Roy, C.; Roy, P.; Montero, A. J. Rubio; Rui, R.; Russo, R.; Ryabinkin, E.; Ryabov, Y.; Rybicki, A.; Saarinen, S.; Sadhu, S.; Sadovsky, S.; Safarik, K.; Sahlmuller, B.; Sahoo, P.; Sahoo, R.; Sahoo, S.; Sahu, P. K.; Saini, J.; Sakai, S.; Saleh, M. A.; Salzwedel, J.; Sambyal, S.; Samsonov, V.; Sandor, L.; Sandoval, A.; Sano, M.; Sarkar, D.; Sarkar, N.; Sarma, P.; Scapparone, E.; Scarlassara, F.; Schiaua, C.; Schicker, R.; Schmidt, C.; Schmidt, H. R.; Schuchmann, S.; Schukraft, J.; Schulc, M.; Schutz, Y.; Schwarz, K.; Schweda, K.; Scioli, G.; Scomparin, E.; Scott, R.; Sefcik, M.; Seger, J. E.; Sekiguchi, Y.; Sekihata, D.; Selyuzhenkov, I.; Senosi, K.; Senyukov, S.; Serradilla, E.; Sevcenco, A.; Shabanov, A.; Shabetai, A.; Shadura, O.; Shahoyan, R.; Shahzad, M. I.; Shangaraev, A.; Sharma, M.; Sharma, M.; Sharma, N.; Sheikh, A. I.; Shigaki, K.; Shou, Q.; Shtejer, K.; Sibiriak, Y.; Siddhanta, S.; Sielewicz, K. M.; Siemiarczuk, T.; Silvermyr, D.; Silvestre, C.; Simatovic, G.; Simonetti, G.; Singaraju, R.; Singh, R.; Singha, S.; Singhal, V.; Sinha, B. C.; Sinha, T.; Sitar, B.; Sitta, M.; Skaali, T. B.; Slupecki, M.; Smirnov, N.; Snellings, R. J. M.; Snellman, T. W.; Song, J.; Song, M.; Song, Z.; Soramel, F.; Sorensen, S.; de Souza, R. D.; Sozzi, F.; Spacek, M.; Spiriti, E.; Sputowska, I.; Spyropoulou-Stassinaki, M.; Stachel, J.; Stan, I.; Stankus, P.; Stenlund, E.; Steyn, G.; Stiller, J. H.; Stocco, D.; Strmen, P.; Suaide, A. A. 
P.; Sugitate, T.; Suire, C.; Suleymanov, M.; Suljic, M.; Sultanov, R.; Sumbera, M.; Sumowidagdo, S.; Szabo, A.; Szanto de Toledo, A.; Szarka, I.; Szczepankiewicz, A.; Szymanski, M.; Tabassam, U.; Takahashi, J.; Tambave, G. J.; Tanaka, N.; Tarhini, M.; Tariq, M.; Tarzila, M. G.; Tauro, A.; Tejeda Munoz, G.; Telesca, A.; Terasaki, K.; Terrevoli, C.; Teyssier, B.; Thaeder, J.; Thakur, D.; Thomas, D.; Tieulent, R.; Timmins, A. R.; Toia, A.; Trogolo, S.; Trombetta, G.; Trubnikov, V.; Trzaska, W. H.; Tsuji, T.; Tumkin, A.; Turrisi, R.; Tveter, T. S.; Ullaland, K.; Uras, A.; Usai, G. L.; Utrobicic, A.; Vala, M.; Palomo, L. Valencia; Vallero, S.; Van Der Maarel, J.; Van Hoorne, J. W.; van Leeuwen, M.; Vanat, T.; Vyvre, P. Vande; Varga, D.; Vargas, A.; Vargyas, M.; Varma, R.; Vasileiou, M.; Vasiliev, A.; Vauthier, A.; Vechernin, V.; Veen, A. M.; Veldhoen, M.; Velure, A.; Vercellin, E.; Vergara Limon, S.; Vernet, R.; Verweij, M.; Vickovic, L.; Viesti, G.; Viinikainen, J.; Vilakazi, Z.; Baillie, O. Villalobos; Villatoro Tello, A.; Vinogradov, A.; Vinogradov, L.; Vinogradov, Y.; Virgili, T.; Vislavicius, V.; Viyogi, Y. P.; Vodopyanov, A.; Voelkl, M. A.; Voloshin, K.; Voloshin, S. A.; Volpe, G.; von Haller, B.; Vorobyev, I.; Vranic, D.; Vrlakova, J.; Vulpescu, B.; Wagner, B.; Wagner, J.; Wang, H.; Watanabe, D.; Watanabe, Y.; Weiser, D. F.; Westerhoff, U.; Whitehead, A. M.; Wiechula, J.; Wikne, J.; Wilk, G.; Wilkinson, J.; Williams, M. C. S.; Windelband, B.; Winn, M.; Yang, H.; Yano, S.; Yasin, Z.; Yokoyama, H.; Yoo, I. -K.; Yoon, J. H.; Yurchenko, V.; Yushmanov, I.; Zaborowska, A.; Zaccolo, V.; Zaman, A.; Zampolli, C.; Zanoli, H. J. C.; Zaporozhets, S.; Zardoshti, N.; Zarochentsev, A.; Zavada, P.; Zaviyalov, N.; Zbroszczyk, H.; Zgura, I. S.; Zhalov, M.; Zhang, C.; Zhao, C.; Zhigareva, N.; Zhou, Y.; Zhou, Z.; Zhu, H.; Zichichi, A.; Zimmermann, A.; Zimmermann, M. B.; Zinovjev, G.; Zyzak, M.; Collaboration, ALICE

    2016-01-01

    We present a Bayesian approach to particle identification (PID) within the ALICE experiment. The aim is to more effectively combine the particle identification capabilities of its various detectors. After a brief explanation of the adopted methodology and formalism, the performance of the Bayesian

  11. Bayesian Network for multiple hypothesis tracking

    NARCIS (Netherlands)

    Zajdel, W.P.; Kröse, B.J.A.; Blockeel, H.; Denecker, M.

    2002-01-01

    For flexible camera-to-camera tracking of multiple objects we model the objects' behavior with a Bayesian network and combine it with the multiple hypothesis framework that associates observations with objects. Bayesian networks offer a possibility to factor complex, joint distributions into a

  12. Bayesian learning theory applied to human cognition.

    Science.gov (United States)

    Jacobs, Robert A; Kruschke, John K

    2011-01-01

    Probabilistic models based on Bayes' rule are an increasingly popular approach to understanding human cognition. Bayesian models allow immense representational latitude and complexity. Because they use normative Bayesian mathematics to process those representations, they define optimal performance on a given task. This article focuses on key mechanisms of Bayesian information processing, and provides numerous examples illustrating Bayesian approaches to the study of human cognition. We start by providing an overview of Bayesian modeling and Bayesian networks. We then describe three types of information processing operations, namely inference, parameter learning, and structure learning, in both Bayesian networks and human cognition. This is followed by a discussion of the important roles of prior knowledge and of active learning. We conclude by outlining some challenges for Bayesian models of human cognition that will need to be addressed by future research. WIREs Cogn Sci 2011 2 8-21 DOI: 10.1002/wcs.80
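The inference operation described above is just Bayes' rule applied to a discrete hypothesis set. A minimal sketch; the two-category task and all numbers are invented for illustration, not taken from the article:

```python
# Bayes' rule over a discrete hypothesis set: posterior ∝ prior × likelihood.
# The two-category task and all numbers are illustrative assumptions.

def posterior(prior, likelihoods):
    """Normalize prior-times-likelihood into a posterior over hypotheses."""
    unnorm = [p * l for p, l in zip(prior, likelihoods)]
    z = sum(unnorm)
    return [u / z for u in unnorm]

# Two hypotheses (say, category A vs. B) start equally probable; the observed
# feature is twice as likely under A as under B, so the posterior shifts to A.
post = posterior([0.5, 0.5], [0.8, 0.4])
print(post)  # [2/3, 1/3] up to floating point
```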

  13. Properties of the Bayesian Knowledge Tracing Model

    Science.gov (United States)

    van de Sande, Brett

    2013-01-01

    Bayesian Knowledge Tracing is used very widely to model student learning. It comes in two different forms: The first form is the Bayesian Knowledge Tracing "hidden Markov model" which predicts the probability of correct application of a skill as a function of the number of previous opportunities to apply that skill and the model…
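The hidden Markov form of Bayesian Knowledge Tracing admits a compact per-opportunity update: a Bayes step conditioning on the observed response, followed by a learning transition. A sketch using the standard BKT parameter names; the numeric values are illustrative, not taken from the record:

```python
# Bayesian Knowledge Tracing update (hidden Markov form): condition on the
# observed response, then apply the learning transition. Parameter names
# (p_guess, p_slip, p_learn) follow the standard BKT formulation; the
# numeric values are illustrative assumptions.

def bkt_update(p_known, correct, p_guess=0.2, p_slip=0.1, p_learn=0.3):
    """Return P(skill known) after one observed response at one opportunity."""
    if correct:
        num = p_known * (1.0 - p_slip)
        post = num / (num + (1.0 - p_known) * p_guess)
    else:
        num = p_known * p_slip
        post = num / (num + (1.0 - p_known) * (1.0 - p_guess))
    # Transition: a still-unknown skill may be learned at this opportunity.
    return post + (1.0 - post) * p_learn

p = 0.4                      # P(known) before any practice (p_init)
for obs in [True, True, False, True]:
    p = bkt_update(p, obs)
print(round(p, 3))           # mastery estimate after four opportunities
```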

  14. Plug & Play object oriented Bayesian networks

    DEFF Research Database (Denmark)

    Bangsø, Olav; Flores, J.; Jensen, Finn Verner

    2003-01-01

    and secondly, to gain efficiency during modification of an object oriented Bayesian network. To accomplish these two goals we have exploited a mechanism allowing local triangulation of instances to develop a method for updating the junction trees associated with object oriented Bayesian networks in highly...

  15. Using Bayesian Networks to Improve Knowledge Assessment

    Science.gov (United States)

    Millan, Eva; Descalco, Luis; Castillo, Gladys; Oliveira, Paula; Diogo, Sandra

    2013-01-01

    In this paper, we describe the integration and evaluation of an existing generic Bayesian student model (GBSM) into an existing computerized testing system within the Mathematics Education Project (PmatE--Projecto Matematica Ensino) of the University of Aveiro. This generic Bayesian student model had been previously evaluated with simulated…

  16. Bayesian models: A statistical primer for ecologists

    Science.gov (United States)

    Hobbs, N. Thompson; Hooten, Mevin B.

    2015-01-01

    Bayesian modeling has become an indispensable tool for ecological research because it is uniquely suited to deal with complexity in a statistically coherent way. This textbook provides a comprehensive and accessible introduction to the latest Bayesian methods—in language ecologists can understand. Unlike other books on the subject, this one emphasizes the principles behind the computations, giving ecologists a big-picture understanding of how to implement this powerful statistical approach. Bayesian Models is an essential primer for non-statisticians. It begins with a definition of probability and develops a step-by-step sequence of connected ideas, including basic distribution theory, network diagrams, hierarchical models, Markov chain Monte Carlo, and inference from single and multiple models. This unique book places less emphasis on computer coding, favoring instead a concise presentation of the mathematical statistics needed to understand how and why Bayesian analysis works. It also explains how to write out properly formulated hierarchical Bayesian models and use them in computing, research papers, and proposals. This primer enables ecologists to understand the statistical principles behind Bayesian modeling and apply them to research, teaching, policy, and management. Presents the mathematical and statistical foundations of Bayesian modeling in language accessible to non-statisticians. Covers basic distribution theory, network diagrams, hierarchical models, Markov chain Monte Carlo, and more. Deemphasizes computer coding in favor of basic principles. Explains how to write out properly factored statistical expressions representing Bayesian models.

  17. Modeling Diagnostic Assessments with Bayesian Networks

    Science.gov (United States)

    Almond, Russell G.; DiBello, Louis V.; Moulder, Brad; Zapata-Rivera, Juan-Diego

    2007-01-01

    This paper defines Bayesian network models and examines their applications to IRT-based cognitive diagnostic modeling. These models are especially suited to building inference engines designed to be synchronous with the finer grained student models that arise in skills diagnostic assessment. Aspects of the theory and use of Bayesian network models…

  18. Flexible Bayesian Human Fecundity Models.

    Science.gov (United States)

    Kim, Sungduk; Sundaram, Rajeshwari; Buck Louis, Germaine M; Pyper, Cecilia

    2012-12-01

    Human fecundity is an issue of considerable interest for both epidemiological and clinical audiences, and is dependent upon a couple's biologic capacity for reproduction coupled with behaviors that place a couple at risk for pregnancy. Bayesian hierarchical models have been proposed to better model the conception probabilities by accounting for the acts of intercourse around the day of ovulation, i.e., during the fertile window. These models can be viewed in the framework of a generalized nonlinear model with an exponential link. However, a fixed choice of link function may not always provide the best fit, leading to potentially biased estimates for probability of conception. Motivated by this, we propose a general class of models for fecundity by relaxing the choice of the link function under the generalized nonlinear model framework. We use a sample from the Oxford Conception Study (OCS) to illustrate the utility and fit of this general class of models for estimating human conception. Our findings reinforce the need for attention to be paid to the choice of link function in modeling conception, as it may bias the estimation of conception probabilities. Various properties of the proposed models are examined and a Markov chain Monte Carlo sampling algorithm was developed for implementing the Bayesian computations. The deviance information criterion measure and logarithm of pseudo marginal likelihood are used for guiding the choice of links. The supplemental material section contains technical details of the proof of the theorem stated in the paper, and contains further simulation results and analysis.

  19. Bayesian Nonparametric Longitudinal Data Analysis.

    Science.gov (United States)

    Quintana, Fernando A; Johnson, Wesley O; Waetjen, Elaine; Gold, Ellen

    2016-01-01

    Practical Bayesian nonparametric methods have been developed across a wide variety of contexts. Here, we develop a novel statistical model that generalizes standard mixed models for longitudinal data that include flexible mean functions as well as combined compound symmetry (CS) and autoregressive (AR) covariance structures. AR structure is often specified through the use of a Gaussian process (GP) with covariance functions that allow longitudinal data to be more correlated if they are observed closer in time than if they are observed farther apart. We allow for AR structure by considering a broader class of models that incorporates a Dirichlet Process Mixture (DPM) over the covariance parameters of the GP. We are able to take advantage of modern Bayesian statistical methods in making full predictive inferences about characteristics of longitudinal profiles and their differences across covariate combinations. We also take advantage of the generality of our model, which provides for estimation of a variety of covariance structures. We observe that models that fail to incorporate CS or AR structure can result in very poor estimation of a covariance or correlation matrix. In our illustration using hormone data observed on women through the menopausal transition, biology dictates the use of a generalized family of sigmoid functions as a model for time trends across subpopulation categories.

  20. BELM: Bayesian extreme learning machine.

    Science.gov (United States)

    Soria-Olivas, Emilio; Gómez-Sanchis, Juan; Martín, José D; Vila-Francés, Joan; Martínez, Marcelino; Magdalena, José R; Serrano, Antonio J

    2011-03-01

    The theory of the extreme learning machine (ELM) has become very popular over the last few years. ELM is a new approach for learning the parameters of the hidden layers of a multilayer neural network (such as the multilayer perceptron or the radial basis function neural network). Its main advantage is its lower computational cost, which is especially relevant when dealing with many patterns defined in a high-dimensional space. This brief proposes a Bayesian approach to ELM, which presents several advantages over other approaches: it allows the introduction of a priori knowledge; it obtains confidence intervals (CIs) without the need for computationally intensive methods such as the bootstrap; and it presents high generalization capability. Bayesian ELM is benchmarked against classical ELM on several artificial and real datasets that are widely used for the evaluation of machine learning algorithms. The achieved results show that the proposed approach produces competitive accuracy with some additional advantages, namely, automatic production of CIs, a reduced probability of model overfitting, and the use of a priori knowledge.

  1. 2nd Bayesian Young Statisticians Meeting

    CERN Document Server

    Bitto, Angela; Kastner, Gregor; Posekany, Alexandra

    2015-01-01

    The Second Bayesian Young Statisticians Meeting (BAYSM 2014) and the research presented here facilitate connections among researchers using Bayesian Statistics by providing a forum for the development and exchange of ideas. WU Vienna University of Business and Economics hosted BAYSM 2014 from September 18th to 19th. The guidance of renowned plenary lecturers and senior discussants is a critical part of the meeting and this volume, which follows publication of contributions from BAYSM 2013. The meeting's scientific program reflected the variety of fields in which Bayesian methods are currently employed or could be introduced in the future. Three brilliant keynote lectures by Chris Holmes (University of Oxford), Christian Robert (Université Paris-Dauphine), and Mike West (Duke University), were complemented by 24 plenary talks covering the major topics Dynamic Models, Applications, Bayesian Nonparametrics, Biostatistics, Bayesian Methods in Economics, and Models and Methods, as well as a lively poster session ...

  2. Bayesian natural language semantics and pragmatics

    CERN Document Server

    Zeevat, Henk

    2015-01-01

    The contributions in this volume focus on the Bayesian interpretation of natural languages, which is widely used in areas of artificial intelligence, cognitive science, and computational linguistics. This is the first volume to take up topics in Bayesian Natural Language Interpretation and make proposals based on information theory, probability theory, and related fields. The methodologies offered here extend to the target semantic and pragmatic analyses of computational natural language interpretation. Bayesian approaches to natural language semantics and pragmatics are based on methods from signal processing and the causal Bayesian models pioneered especially by Pearl. In signal processing, the Bayesian method finds the most probable interpretation by finding the one that maximizes the product of the prior probability and the likelihood of the interpretation. It thus stresses the importance of a production model for interpretation, as in Grice's contributions to pragmatics or in interpretation by abduction.

  3. Crystal structure prediction accelerated by Bayesian optimization

    Science.gov (United States)

    Yamashita, Tomoki; Sato, Nobuya; Kino, Hiori; Miyake, Takashi; Tsuda, Koji; Oguchi, Tamio

    2018-01-01

    We propose a crystal structure prediction method based on Bayesian optimization. Our method is a selection-type algorithm, in contrast to evolution-type algorithms such as evolutionary algorithms and particle swarm optimization. Crystal structure prediction with Bayesian optimization can efficiently select the most stable structure from a large number of candidate structures with fewer search trials by using a machine learning technique. Crystal structure prediction using Bayesian optimization combined with random search is applied to known systems such as NaCl and Y2Co17 to assess the efficiency of Bayesian optimization. These results demonstrate that Bayesian optimization can significantly reduce the number of search trials required to find the global minimum structure, by 30-40% in comparison with pure random search, which leads to much lower computational cost.
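The selection-type idea above can be caricatured in a few lines: keep a cheap surrogate over a fixed pool of candidate "structures" and always evaluate the candidate the surrogate deems most promising. The quadratic energy, the nearest-neighbor surrogate, and the distance-based exploration bonus below are stand-ins for the first-principles energies and Gaussian-process surrogate of the actual method:

```python
import random

# Toy selection-type search: instead of evaluating random candidates,
# pick the next candidate by a surrogate score. Everything here (the
# quadratic "energy", the nearest-neighbor prediction, the distance-based
# exploration bonus) is an illustrative stand-in, not the paper's method.

def energy(x):
    """Hypothetical stability objective: lower is more stable."""
    return (x - 0.3) ** 2

def select_next(candidates, evaluated):
    def score(x):
        # Predict with the nearest evaluated candidate, minus an
        # uncertainty bonus that grows with distance (exploration).
        dist, e = min((abs(x - xe), ye) for xe, ye in evaluated.items())
        return e - dist
    return min((x for x in candidates if x not in evaluated), key=score)

random.seed(0)
pool = [i / 100 for i in range(101)]                        # candidate pool
evaluated = {x: energy(x) for x in random.sample(pool, 3)}  # random init
for _ in range(10):                                         # guided trials
    x = select_next(pool, evaluated)
    evaluated[x] = energy(x)

best = min(evaluated, key=evaluated.get)
print(best, energy(best))
```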

  4. Phylogenetic systematics of the colorful, cyanide-producing millipedes of Appalachia (Polydesmida, Xystodesmidae, Apheloriini) using a total evidence Bayesian approach.

    Science.gov (United States)

    Marek, Paul E; Bond, Jason E

    2006-12-01

    Here, we provide an exemplar-approach phylogeny of the xystodesmid millipede tribe Apheloriini with a focus on genus-group relationships, particularly of the genus Brachoria. Exemplars for the phylogenetic analysis were chosen to represent the maximum breadth of morphological diversity within all nominal genera in the tribe Apheloriini, and to broadly sample the genus Brachoria. In addition, three closely related tribes were used (Rhysodesmini, Nannariini, and Pachydesmini). Morphological and DNA sequence data were scored for Bayesian inference of phylogeny. Phylogenetic analysis resulted in polyphyletic genera Brachoria and Sigmoria, a monophyletic Apheloriini, and a "southern clade" that contains most of the tribal species diversity. We used this phylogeny to track morphological character histories and reconstruct ancestral states using stochastic character mapping. Based on the findings from the character mapping study, the diagnostic feature of the genus Brachoria, the cingulum, evolved independently in two lineages. We compared our phylogeny against prior classifications using Bayes factor hypothesis-testing and found that our phylogenetic hypothesis is inconsistent with the previous hypotheses underlying the most recent classification. With our preferred total-evidence phylogeny as a framework for taxonomic modifications, we describe a new genus, Appalachioria; supply phylogenetic diagnoses of monophyletic taxa; and provide a phylogeny-based classification for the tribe Apheloriini.

  5. Clinical outcome prediction in aneurysmal subarachnoid hemorrhage using Bayesian neural networks with fuzzy logic inferences.

    Science.gov (United States)

    Lo, Benjamin W Y; Macdonald, R Loch; Baker, Andrew; Levine, Mitchell A H

    2013-01-01

    The novel clinical prediction approach of Bayesian neural networks with fuzzy logic inferences is created and applied to derive prognostic decision rules in cerebral aneurysmal subarachnoid hemorrhage (aSAH). The approach of Bayesian neural networks with fuzzy logic inferences was applied to data from five trials of Tirilazad for aneurysmal subarachnoid hemorrhage (3551 patients). Bayesian meta-analyses of observational studies on aSAH prognostic factors gave generalizable posterior distributions of population mean log odd ratios (ORs). Similar trends were noted in Bayesian and linear regression ORs. Significant outcome predictors include normal motor response, cerebral infarction, history of myocardial infarction, cerebral edema, history of diabetes mellitus, fever on day 8, prior subarachnoid hemorrhage, admission angiographic vasospasm, neurological grade, intraventricular hemorrhage, ruptured aneurysm size, history of hypertension, vasospasm day, age and mean arterial pressure. Heteroscedasticity was present in the nontransformed dataset. Artificial neural networks found nonlinear relationships with 11 hidden variables in 1 layer, using the multilayer perceptron model. Fuzzy logic decision rules (centroid defuzzification technique) denoted cut-off points for poor prognosis at greater than 2.5 clusters. This aSAH prognostic system makes use of existing knowledge, recognizes unknown areas, incorporates one's clinical reasoning, and compensates for uncertainty in prognostication.

  6. Clinical Outcome Prediction in Aneurysmal Subarachnoid Hemorrhage Using Bayesian Neural Networks with Fuzzy Logic Inferences

    Directory of Open Access Journals (Sweden)

    Benjamin W. Y. Lo

    2013-01-01

    Full Text Available Objective. The novel clinical prediction approach of Bayesian neural networks with fuzzy logic inferences is created and applied to derive prognostic decision rules in cerebral aneurysmal subarachnoid hemorrhage (aSAH). Methods. The approach of Bayesian neural networks with fuzzy logic inferences was applied to data from five trials of Tirilazad for aneurysmal subarachnoid hemorrhage (3551 patients). Results. Bayesian meta-analyses of observational studies on aSAH prognostic factors gave generalizable posterior distributions of population mean log odd ratios (ORs). Similar trends were noted in Bayesian and linear regression ORs. Significant outcome predictors include normal motor response, cerebral infarction, history of myocardial infarction, cerebral edema, history of diabetes mellitus, fever on day 8, prior subarachnoid hemorrhage, admission angiographic vasospasm, neurological grade, intraventricular hemorrhage, ruptured aneurysm size, history of hypertension, vasospasm day, age and mean arterial pressure. Heteroscedasticity was present in the nontransformed dataset. Artificial neural networks found nonlinear relationships with 11 hidden variables in 1 layer, using the multilayer perceptron model. Fuzzy logic decision rules (centroid defuzzification technique) denoted cut-off points for poor prognosis at greater than 2.5 clusters. Discussion. This aSAH prognostic system makes use of existing knowledge, recognizes unknown areas, incorporates one's clinical reasoning, and compensates for uncertainty in prognostication.

  7. Group Tracking of Space Objects within Bayesian Framework

    Directory of Open Access Journals (Sweden)

    Huang Jian

    2013-03-01

    Full Text Available Efficiently tracking and cataloguing extensive, dense groups of space objects is imperative for space surveillance. As the main instrument for Low Earth Orbit (LEO) surveillance, ground-based radar is usually limited by its resolving power when tracking small, densely populated space debris, so much of the detection and observation information is missed, which makes traditional tracking methods inefficient. We therefore introduce the concept of group tracking: the overall motion of the group is the primary focus, while each individual object is still tracked in effect. The tracking procedure is based on the Bayesian framework. By exploiting the constraints between the group center and the multi-target observations, the reconstruction of the number of targets and the estimation of individual trajectories can be made considerably more accurate and robust under high missed-detection rates. The Markov Chain Monte Carlo Particle (MCMC-Particle) algorithm is used to solve the Bayesian integration problem. Finally, a simulation of group space-object tracking is carried out to validate the efficiency of the proposed method.

  8. THz-SAR Vibrating Target Imaging via the Bayesian Method

    Directory of Open Access Journals (Sweden)

    Bin Deng

    2017-01-01

    Full Text Available Target vibration bears important information for target recognition, and terahertz, due to significant micro-Doppler effects, has strong advantages for remotely sensing vibrations. In this paper, the imaging characteristics of vibrating targets with THz-SAR are first analyzed. An improved algorithm based on an excellent Bayesian approach, namely the expansion-compression variance-component (ExCoV) method, is proposed for reconstructing the scattering coefficients of vibrating targets; it provides more robust and efficient initialization and overcomes the sidelobe and artifact deficiencies of the traditional correlation method. A real vibration measurement experiment on idling cars was performed to validate the range model. Simulated SAR data of vibrating targets and of a tank model against a real background at 220 GHz show good performance at low SNR. Rapidly evolving high-power terahertz devices will make THz-SAR viable at distances of several kilometers.

  9. PET reconstruction via nonlocal means induced prior.

    Science.gov (United States)

    Hou, Qingfeng; Huang, Jing; Bian, Zhaoying; Chen, Wufan; Ma, Jianhua

    2015-01-01

    The traditional Bayesian priors for maximum a posteriori (MAP) reconstruction methods usually incorporate local neighborhood interactions that penalize large deviations in parameter estimates for adjacent pixels; therefore, only local pixel differences are utilized, which limits their ability to penalize image roughness. To achieve high-quality PET image reconstruction, this study investigates a MAP reconstruction strategy that incorporates a nonlocal means induced (NLMi) prior (NLMi-MAP), which exploits global similarity information in the image. The present NLMi prior approximates the derivative of the Gibbs energy function by an NLM filtering process. Specifically, the NLMi prior is obtained by subtracting the current image estimate from its NLM-filtered version and feeding the residual error back into the reconstruction to yield the new image estimate. We tested the present NLMi-MAP method with simulated and real PET datasets. Comparison studies with conventional filtered backprojection (FBP) and several iterative reconstruction methods clearly demonstrate that the NLMi-MAP method performs better at lowering noise and preserving image edges, and achieves a higher signal-to-noise ratio (SNR). Extensive experimental results show that the NLMi-MAP method outperforms the existing methods in terms of cross profile, noise reduction, SNR, root mean square error (RMSE) and correlation coefficient (CORR).
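The NLM filtering step that induces the prior can be sketched in one dimension; the NLMi prior term is then the residual between the current estimate and its filtered version. The patch size, bandwidth, and boundary clamping below are illustrative choices, not the paper's implementation:

```python
import math

# One-dimensional sketch of nonlocal means (NLM) filtering: each sample
# becomes a weighted average of all samples, with weights set by patch
# similarity rather than spatial proximity. Patch size, bandwidth h, and
# boundary clamping are illustrative assumptions.

def nlm_1d(signal, patch=1, h=0.5):
    n = len(signal)
    out = []
    for i in range(n):
        weights = []
        for j in range(n):
            # Squared distance between the patches centered at i and j
            # (indices clamped at the boundaries).
            d = sum((signal[min(max(i + k, 0), n - 1)]
                     - signal[min(max(j + k, 0), n - 1)]) ** 2
                    for k in range(-patch, patch + 1))
            weights.append(math.exp(-d / (h * h)))
        total = sum(weights)
        out.append(sum(w * s for w, s in zip(weights, signal)) / total)
    return out

noisy = [0.0, 0.1, -0.1, 1.0, 1.1, 0.9]              # two flat regions plus noise
smoothed = nlm_1d(noisy)
residual = [x - y for x, y in zip(noisy, smoothed)]  # the NLMi prior term
```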

  10. Breast reconstruction - implants

    Science.gov (United States)

    Breast implants surgery; Mastectomy - breast reconstruction with implants; Breast cancer - breast reconstruction with implants ... to close the skin flaps. Breast reconstruction with implants is usually done in two stages, or surgeries. ...

  11. Breast Reconstruction with Implants

    Science.gov (United States)

    ... What you can expect Breast reconstruction begins with placement of a breast implant or tissue expander, either at the time of your mastectomy surgery (immediate reconstruction) or during a later procedure (delayed reconstruction). ...

  12. Bayesian Approach to Inverse Problems

    CERN Document Server

    2008-01-01

    Many scientific, medical or engineering problems raise the issue of recovering some physical quantities from indirect measurements; for instance, detecting or quantifying flaws or cracks within a material from acoustic or electromagnetic measurements at its surface is an essential problem of non-destructive evaluation. The concept of inverse problems originates precisely from the idea of inverting the laws of physics to recover a quantity of interest from measurable data. Unfortunately, most inverse problems are ill-posed, which means that precise and stable solutions are not easy to devise. Regularization is the key concept for solving inverse problems. The goal of this book is to deal with inverse problems and regularized solutions using Bayesian statistical tools, with a particular view to signal and image estimation.

  13. Bayesian networks in educational assessment

    CERN Document Server

    Almond, Russell G; Steinberg, Linda S; Yan, Duanli; Williamson, David M

    2015-01-01

    Bayesian inference networks, a synthesis of statistics and expert systems, have advanced reasoning under uncertainty in medicine, business, and social sciences. This innovative volume is the first comprehensive treatment exploring how they can be applied to design and analyze innovative educational assessments. Part I develops Bayes nets’ foundations in assessment, statistics, and graph theory, and works through the real-time updating algorithm. Part II addresses parametric forms for use with assessment, model-checking techniques, and estimation with the EM algorithm and Markov chain Monte Carlo (MCMC). A unique feature is the volume’s grounding in Evidence-Centered Design (ECD) framework for assessment design. This “design forward” approach enables designers to take full advantage of Bayes nets’ modularity and ability to model complex evidentiary relationships that arise from performance in interactive, technology-rich assessments such as simulations. Part III describes ECD, situates Bayes nets as ...

  14. Bayesian Networks and Influence Diagrams

    DEFF Research Database (Denmark)

    Kjærulff, Uffe Bro; Madsen, Anders Læsø

    Bayesian Networks and Influence Diagrams: A Guide to Construction and Analysis, Second Edition, provides a comprehensive guide for practitioners who wish to understand, construct, and analyze intelligent systems for decision support based on probabilistic networks. This new edition contains six new sections, in addition to fully updated examples, tables, figures, and a revised appendix. Intended primarily for practitioners, this book does not require sophisticated mathematical skills or deep understanding of the underlying theory and methods, nor does it discuss alternative technologies for reasoning under uncertainty. The theory and methods presented are illustrated through more than 140 examples, and exercises are included for the reader to check his or her level of understanding. The techniques and methods presented on model construction and verification, modeling techniques and tricks, learning...

  15. On Bayesian System Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Soerensen Ringi, M.

    1995-05-01

    The view taken in this thesis is that reliability, the probability that a system will perform a required function for a stated period of time, depends on a person's state of knowledge. Reliability changes as this state of knowledge changes, i.e. when new relevant information becomes available. Most existing models for system reliability prediction are developed in a classical framework of probability theory and they overlook some information that is always present. Probability is just an analytical tool to handle uncertainty, based on judgement and subjective opinions. It is argued that the Bayesian approach gives a much more comprehensive understanding of the foundations of probability than the so called frequentistic school. A new model for system reliability prediction is given in two papers. The model encloses the fact that component failures are dependent because of a shared operational environment. The suggested model also naturally permits learning from failure data of similar components in non identical environments. 85 refs.
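    The thesis's own model handles dependence through a shared operational environment; as a much simpler, generic illustration of how a Bayesian failure-rate estimate is updated as the state of knowledge changes, one might sketch a standard conjugate Gamma-Poisson update (the numbers below are hypothetical and this is not the model proposed in the papers):

```python
from dataclasses import dataclass

@dataclass
class Gamma:
    a: float  # shape: prior pseudo-count of failures
    b: float  # rate: prior pseudo-exposure (e.g. component-hours)

def update(prior: Gamma, failures: int, exposure: float) -> Gamma:
    """Conjugate update for a Poisson failure process observed
    over a given exposure time."""
    return Gamma(prior.a + failures, prior.b + exposure)

def mean_rate(g: Gamma) -> float:
    return g.a / g.b

# hypothetical numbers: a weak prior of ~1e-3 failures/hour,
# then 2 observed failures in 5000 component-hours
prior = Gamma(a=1.0, b=1000.0)
post = update(prior, failures=2, exposure=5000.0)
print(mean_rate(post))  # 0.0005
```

    Pooling failure data from similar components in different environments would, in such a scheme, amount to letting several datasets contribute to the same (or hierarchically linked) posterior.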

  16. Nonparametric Bayesian inference in biostatistics

    CERN Document Server

    Müller, Peter

    2015-01-01

    Nonparametric Bayesian (BNP) approaches play an ever-expanding role in biostatistical inference, from proteomics to clinical trials. As the chapters in this book demonstrate, BNP has important uses in the clinical sciences and in inference for problems such as unknown partitions in genomics. Many research problems involve an abundance of data and require flexible, complex probability models beyond the traditional parametric approaches. As this book's expert contributors show, BNP approaches can be the answer. Survival analysis, in particular survival regression, has traditionally used BNP, but BNP's potential is now very broad. This applies to important tasks like arrangement of patients into clinically meaningful subpopulations and segmenting the genome into functionally distinct regions. This book is designed to both review and introduce application areas for BNP. While existing books provide theoretical foundations, this book connects theory to practice through engaging examples and research questions. Chapters c...

  17. Bayesian Kernel Mixtures for Counts.

    Science.gov (United States)

    Canale, Antonio; Dunson, David B

    2011-12-01

    Although Bayesian nonparametric mixture models for continuous data are well developed, there is a limited literature on related approaches for count data. A common strategy is to use a mixture of Poissons, which unfortunately is quite restrictive in not accounting for distributions having variance less than the mean. Other approaches include mixing multinomials, which requires finite support, and using a Dirichlet process prior with a Poisson base measure, which does not allow smooth deviations from the Poisson. As a broad class of alternative models, we propose to use nonparametric mixtures of rounded continuous kernels. An efficient Gibbs sampler is developed for posterior computation, and a simulation study is performed to assess performance. Focusing on the rounded Gaussian case, we generalize the modeling framework to account for multivariate count data, joint modeling with continuous and categorical variables, and other complications. The methods are illustrated through applications to a developmental toxicity study and marketing data. This article has supplementary material online.
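    The rounded-kernel construction can be sketched generatively: draw a latent value from a continuous mixture and round it to a count. The threshold choice below (y = 0 for negative draws, otherwise the floor) is one simple option; the paper allows general rounding thresholds, and the mixture parameters here are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(42)

def rounded_mixture_counts(n, weights, means, sds):
    """Counts via a rounded continuous kernel: draw z from a Gaussian
    mixture, then set y = floor(z) clipped at zero (one simple choice
    of rounding thresholds)."""
    comp = rng.choice(len(weights), size=n, p=weights)
    z = rng.normal(np.asarray(means)[comp], np.asarray(sds)[comp])
    return np.maximum(np.floor(z), 0).astype(int)

# hypothetical two-component mixture: many small counts plus a bump near 6,
# a shape a single Poisson cannot capture (variance need not equal mean)
y = rounded_mixture_counts(10_000, [0.6, 0.4], [0.5, 6.0], [1.0, 1.5])
print(y.mean())
```

    Posterior computation in the paper inverts this construction: a Gibbs sampler imputes the latent continuous values given the observed counts and updates the mixture.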

  18. On Bayesian System Reliability Analysis

    International Nuclear Information System (INIS)

    Soerensen Ringi, M.

    1995-01-01

    The view taken in this thesis is that reliability, the probability that a system will perform a required function for a stated period of time, depends on a person's state of knowledge. Reliability changes as this state of knowledge changes, i.e. when new relevant information becomes available. Most existing models for system reliability prediction are developed in a classical framework of probability theory and they overlook some information that is always present. Probability is just an analytical tool to handle uncertainty, based on judgement and subjective opinions. It is argued that the Bayesian approach gives a much more comprehensive understanding of the foundations of probability than the so called frequentistic school. A new model for system reliability prediction is given in two papers. The model encloses the fact that component failures are dependent because of a shared operational environment. The suggested model also naturally permits learning from failure data of similar components in non identical environments. 85 refs

  19. A Bayesian Reflection on Surfaces

    Directory of Open Access Journals (Sweden)

    David R. Wolf

    1999-10-01

    Abstract: The topic of this paper is a novel Bayesian continuous-basis field representation and inference framework. Within this paper several problems are solved: the maximally informative inference of continuous-basis fields, where the basis for the field is itself a continuous object and not representable in a finite manner; the tradeoff between accuracy of representation, in terms of information learned, and memory or storage capacity in bits; the approximation of probability distributions so that a maximal amount of information about the object being inferred is preserved; and an information-theoretic justification for multigrid methodology. The maximally informative field inference framework is described in full generality and denoted the Generalized Kalman Filter. The Generalized Kalman Filter allows the update of field knowledge from previous knowledge at any scale, and new data, to new knowledge at any other scale. An application example, the inference of continuous surfaces from measurements (for example, camera image data), is presented.

  20. Reconstructive Urology

    Directory of Open Access Journals (Sweden)

    Fikret Fatih Önol

    2014-11-01

    In the treatment of urethral stricture, buccal mucosa graft (BMG) reconstruction is applied with different patch techniques. This recently popular approach places the BMG as a "ventral onlay" in bulbar urethral strictures, and as a "dorsal inlay" anastomosis in the pendulous urethra, where the spongiosum, which supports and nourishes the graft, is thinner. Cordon et al. compared conventional BMG onlay urethroplasty with "pseudo-spongioplasty", in which periurethral vascular tissues are closed over the graft to nourish it. In repairs of the anterior urethra where spongiosal supportive tissue is insufficient, this method mobilizes the surrounding dartos and Buck's fascia and joins them over the BMG patch. Between 2007 and 2012, 56 patients treated with conventional ventral onlay BMG urethroplasty and 46 patients treated with pseudo-spongioplasty were reported to have similar success rates (80% vs. 84%) at an average 3.5-year follow-up. While 74% of the pseudo-spongioplasty patients had disease of the distal urethra (pendulous, bulbopendulous), 82% of the conventional onlay patients had proximal (bulbar) strictures. Stricture length in the pseudo-spongioplasty group was also significantly greater (5.8 cm vs. 4.7 cm on average, p=0.028). This study by Cordon et al. shows that, where conventional spongioplasty is not possible, periurethral vascular tissues are adequate to nourish a BMG. Although it is an important technique that brings a new perspective to current practice, data on complications that may follow pseudo-spongioplasty for long distal strictures (e.g. formation of urethral diverticula) were not reported. Along with this, we think that, providing an opportunity to patch directly

  1. Robust bayesian analysis of an autoregressive model with ...

    African Journals Online (AJOL)

    In this work, robust Bayesian analysis of the Bayesian estimation of an autoregressive model with exponential innovations is performed. Using a Bayesian robustness methodology, we show that, using a suitable generalized quadratic loss, we obtain optimal Bayesian estimators of the parameters corresponding to the ...

  2. CURRENT CONCEPTS IN ACL RECONSTRUCTION

    Directory of Open Access Journals (Sweden)

    Freddie H. Fu

    2008-09-01

    Current Concepts in ACL Reconstruction is a complete reference text containing one of the most thorough collections of topics on the ACL and its surgical reconstruction, with contributions from some of the world's experts and most experienced ACL surgeons. Various procedures mentioned throughout the text are also demonstrated on an accompanying video CD-ROM. PURPOSE: By composing a single, comprehensive information source on the ACL, covering basic science, clinical issues, the latest concepts, and surgical techniques, from evaluation to outcome and from history to future, the editors and contributors aim to keep the audience abreast of the latest concepts and techniques for evaluating and treating ACL injuries. FEATURES: The text comprises 27 chapters in 6 sections. The first section covers basic science, along with the history of the ACL, imaging, and the clinical approach to adolescent and pediatric patients. The second section discusses graft choices and arthroscopy portals for ACL reconstruction. The third section covers the technique and outcomes of single-bundle ACL reconstruction, and the fourth the techniques and outcomes of double-bundle ACL reconstruction. The fifth section addresses revision, navigation technology, rehabilitation, and evaluation of the outcome of ACL reconstruction. The sixth and last section looks to future advances: what we have learned and the future of ACL reconstruction. AUDIENCE: Orthopedic residents, sports traumatology and knee surgery fellows, orthopedic surgeons, and scientists or clinicians studying or planning research on the ACL form the audience of this book. ASSESSMENT: This is the latest and most comprehensive textbook of ACL reconstruction, produced by the editorial work of two pioneers and masters, Freddie H. Fu MD and Steven B. Cohen MD, with the contribution of world

  3. Bayesian models a statistical primer for ecologists

    CERN Document Server

    Hobbs, N Thompson

    2015-01-01

    Bayesian modeling has become an indispensable tool for ecological research because it is uniquely suited to deal with complexity in a statistically coherent way. This textbook provides a comprehensive and accessible introduction to the latest Bayesian methods, in language ecologists can understand. Unlike other books on the subject, this one emphasizes the principles behind the computations, giving ecologists a big-picture understanding of how to implement this powerful statistical approach. Bayesian Models is an essential primer for non-statisticians. It begins with a definition of probabili

  4. Compiling Relational Bayesian Networks for Exact Inference

    DEFF Research Database (Denmark)

    Jaeger, Manfred; Darwiche, Adnan; Chavira, Mark

    2006-01-01

    We describe in this paper a system for exact inference with relational Bayesian networks as defined in the publicly available PRIMULA tool. The system is based on compiling propositional instances of relational Bayesian networks into arithmetic circuits and then performing online inference by evaluating and differentiating these circuits in time linear in their size. We report on experimental results showing successful compilation and efficient inference on relational Bayesian networks whose PRIMULA-generated propositional instances have thousands of variables, and whose jointrees have clusters...

  5. Bayesian tomography by interacting Markov chains

    Science.gov (United States)

    Romary, T.

    2017-12-01

    In seismic tomography, we seek to determine the velocity of the underground from noisy first-arrival travel time observations. In most situations, this is an ill-posed inverse problem that admits several imperfect solutions. Given an a priori distribution over the parameters of the velocity model, the Bayesian formulation allows us to state this problem as a probabilistic one, with a solution in the form of a posterior distribution. The posterior distribution is generally high dimensional and may exhibit multimodality. Moreover, as it is known only up to a constant, the only sensible way to address this problem is to try to generate simulations from the posterior. The natural tools to perform these simulations are Markov chain Monte Carlo (MCMC) methods. Classical implementations of MCMC algorithms generally suffer from slow mixing: the generated states are slow to enter the stationary regime, that is, to fit the observations, and when one mode of the posterior is eventually identified, it may become difficult to visit others. Using a varying temperature parameter that relaxes the constraint on the data may help to enter the stationary regime. Besides, the sequential nature of MCMC makes it ill suited to parallel implementation. Running a large number of chains in parallel may be suboptimal, as the information gathered by each chain is not mutualized. Parallel tempering (PT) can be seen as a first attempt to make parallel chains at different temperatures communicate, but it only exchanges information between current states. In this talk, I will show that PT actually belongs to a general class of interacting Markov chain algorithms. I will also show that this class enables the design of interacting schemes that can take advantage of the whole history of the chain, by authorizing exchanges toward already visited states. The algorithms will be illustrated with toy examples and an application to first-arrival traveltime tomography.
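    The parallel tempering baseline the abstract refers to can be sketched on a toy bimodal target (the temperatures, step size, and target below are hypothetical illustrations, not the talk's tomography setup): hot chains explore a flattened posterior and exchange states with the cold chain through a Metropolis-style swap.

```python
import numpy as np

rng = np.random.default_rng(1)

def log_target(x):
    # toy bimodal "posterior" with well-separated modes at -4 and +4
    return np.logaddexp(-0.5 * (x - 4) ** 2, -0.5 * (x + 4) ** 2)

def parallel_tempering(n_iter=5000, temps=(1.0, 3.0, 9.0), step=1.0):
    x = np.zeros(len(temps))        # one state per temperature
    samples = []
    for _ in range(n_iter):
        # within-chain Metropolis move at each temperature
        for k, T in enumerate(temps):
            prop = x[k] + step * rng.normal()
            if np.log(rng.random()) < (log_target(prop) - log_target(x[k])) / T:
                x[k] = prop
        # propose swapping states between a random adjacent pair
        k = rng.integers(len(temps) - 1)
        log_acc = (1 / temps[k] - 1 / temps[k + 1]) * (
            log_target(x[k + 1]) - log_target(x[k]))
        if np.log(rng.random()) < log_acc:
            x[k], x[k + 1] = x[k + 1], x[k]
        samples.append(x[0])        # record the cold chain
    return np.array(samples)

samples = parallel_tempering()
```

    The interacting-chains schemes discussed in the talk generalize the swap step to exchanges with the chains' whole histories rather than only their current states.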

  6. Bayesian estimation and modeling: Editorial to the second special issue on Bayesian data analysis.

    Science.gov (United States)

    Chow, Sy-Miin; Hoijtink, Herbert

    2017-12-01

    This editorial accompanies the second special issue on Bayesian data analysis published in this journal. The emphases of this issue are on Bayesian estimation and modeling. In this editorial, we outline the basics of current Bayesian estimation techniques and some notable developments in the statistical literature, as well as adaptations and extensions by psychological researchers to better tailor to the modeling applications in psychology. We end with a discussion on future outlooks of Bayesian data analysis in psychology. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  7. A Bayesian approach to model uncertainty

    International Nuclear Information System (INIS)

    Buslik, A.

    1994-01-01

    A Bayesian approach to model uncertainty is taken. For the case of a finite number of alternative models, the model uncertainty is equivalent to parameter uncertainty. A derivation based on Savage's partition problem is given

  8. Bayesian analysis for the social sciences

    CERN Document Server

    Jackman, Simon

    2009-01-01

    Bayesian methods are increasingly being used in the social sciences, as the problems encountered lend themselves so naturally to the subjective qualities of Bayesian methodology. This book provides an accessible introduction to Bayesian methods, tailored specifically for social science students. It contains lots of real examples from political science, psychology, sociology, and economics, exercises in all chapters, and detailed descriptions of all the key concepts, without assuming any background in statistics beyond a first course. It features examples of how to implement the methods using WinBUGS - the most-widely used Bayesian analysis software in the world - and R - an open-source statistical software. The book is supported by a Website featuring WinBUGS and R code, and data sets.

  9. Bayesian optimization for computationally extensive probability distributions.

    Science.gov (United States)

    Tamura, Ryo; Hukushima, Koji

    2018-01-01

    An efficient method for finding a better maximizer of computationally extensive probability distributions is proposed on the basis of a Bayesian optimization technique. A key idea of the proposed method is to use extreme values of acquisition functions by Gaussian processes for the next training phase, which should be located near a local maximum or a global maximum of the probability distribution. Our Bayesian optimization technique is applied to the posterior distribution in the effective physical model estimation, which is a computationally extensive probability distribution. Even when the number of sampling points on the posterior distributions is fixed to be small, the Bayesian optimization provides a better maximizer of the posterior distributions in comparison to those by the random search method, the steepest descent method, or the Monte Carlo method. Furthermore, the Bayesian optimization improves the results efficiently by combining the steepest descent method and thus it is a powerful tool to search for a better maximizer of computationally extensive probability distributions.
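    The core loop (fit a Gaussian process to the evaluated points, then evaluate next where the acquisition function is extremal) can be sketched generically; this is not the authors' code, and the target, lengthscale, and acquisition constant below are hypothetical:

```python
import numpy as np

def gp_posterior(X, y, Xs, ell=0.1, jitter=1e-6):
    """GP regression with a unit-variance RBF kernel (1-D inputs)."""
    k = lambda a, b: np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell ** 2)
    K = k(X, X) + jitter * np.eye(len(X))
    Ks = k(X, Xs)
    sol = np.linalg.solve(K, Ks)               # K^{-1} Ks
    mu = sol.T @ y
    var = np.maximum(1.0 - np.sum(Ks * sol, axis=0), 0.0)
    return mu, np.sqrt(var)

def log_posterior(x):
    # stand-in for a computationally extensive log-probability
    return -(x - 0.3) ** 2

grid = np.linspace(0.0, 1.0, 201)
X = np.array([0.05, 0.95])                     # small initial design
y = log_posterior(X)
for _ in range(15):                            # Bayesian-optimization loop
    mu, sd = gp_posterior(X, y, grid)
    x_next = grid[np.argmax(mu + 2.0 * sd)]    # acquisition (UCB) extremum
    X = np.append(X, x_next)
    y = np.append(y, log_posterior(x_next))

best = X[np.argmax(y)]                         # best maximizer found
```

    Even with few evaluations, the acquisition extrema concentrate near the maximizer, which is the behavior the abstract contrasts with random search and steepest descent.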

  10. The Origins of Bayesian Statistics (Bayesian Statistics の源流)

    OpenAIRE

    新家, 健精

    1991-01-01


  11. An overview on Approximate Bayesian computation*

    Directory of Open Access Journals (Sweden)

    Baragatti Meïli

    2014-01-01

    Approximate Bayesian computation techniques, also called likelihood-free methods, are among the most satisfactory approaches to intractable likelihood problems. This overview presents recent results since the approach's introduction in population genetics about ten years ago.

  12. Implementing the Bayesian paradigm in risk analysis

    International Nuclear Information System (INIS)

    Aven, T.; Kvaloey, J.T.

    2002-01-01

    The Bayesian paradigm comprises a unified and consistent framework for analyzing and expressing risk. Yet, we see rather few examples of applications where the full Bayesian setting has been adopted with specifications of priors of unknown parameters. In this paper, we discuss some of the practical challenges of implementing Bayesian thinking and methods in risk analysis, emphasizing the introduction of probability models and parameters and associated uncertainty assessments. We conclude that there is a need for a pragmatic view in order to 'successfully' apply the Bayesian approach, such that we can do the assignments of some of the probabilities without adopting the somewhat sophisticated procedure of specifying prior distributions of parameters. A simple risk analysis example is presented to illustrate ideas.

  13. A Bayesian concept learning approach to crowdsourcing

    DEFF Research Database (Denmark)

    Viappiani, P.; Zilles, S.; Hamilton, H.J.

    2011-01-01

    We develop a Bayesian approach to concept learning for crowdsourcing applications. A probabilistic belief over possible concept definitions is maintained and updated according to (noisy) observations from experts, whose behaviors are modeled using discrete types. We propose recommendation...

  14. An Intuitive Dashboard for Bayesian Network Inference

    International Nuclear Information System (INIS)

    Reddy, Vikas; Farr, Anna Charisse; Wu, Paul; Mengersen, Kerrie; Yarlagadda, Prasad K D V

    2014-01-01

    Current Bayesian network software packages provide good graphical interfaces for users who design and develop Bayesian networks for various applications. However, the intended end-users of these networks may not necessarily find such an interface appealing, and at times it can be overwhelming, particularly when the number of nodes in the network is large. To circumvent this problem, this paper presents an intuitive dashboard, which provides an additional layer of abstraction, enabling end-users to easily perform inferences over the Bayesian networks. Unlike most software packages, which display the nodes and arcs of the network, the developed tool organises the nodes based on the cause-and-effect relationship, making the user interaction more intuitive and friendly. In addition to performing various types of inference, users can conveniently use the tool to verify the behaviour of the developed Bayesian network. The tool has been developed using the QT and SMILE libraries in C++.

  15. A Bayesian Network Approach to Ontology Mapping

    National Research Council Canada - National Science Library

    Pan, Rong; Ding, Zhongli; Yu, Yang; Peng, Yun

    2005-01-01

    .... In this approach, the source and target ontologies are first translated into Bayesian networks (BN); the concept mapping between the two ontologies are treated as evidential reasoning between the two translated BNs...

  16. Learning Bayesian networks for discrete data

    KAUST Repository

    Liang, Faming

    2009-02-01

    Bayesian networks have received much attention in the recent literature. In this article, we propose an approach to learn Bayesian networks using the stochastic approximation Monte Carlo (SAMC) algorithm. Our approach has two nice features. Firstly, it possesses a self-adjusting mechanism and thus essentially avoids the local-trap problem suffered by conventional MCMC simulation-based approaches to learning Bayesian networks. Secondly, it falls into the class of dynamic importance sampling algorithms; the network features can be inferred by dynamically weighted averaging of the samples generated in the learning process, and the resulting estimates can have much lower variation than single model-based estimates. The numerical results indicate that our approach can mix much faster over the space of Bayesian networks than the conventional MCMC simulation-based approaches. © 2008 Elsevier B.V. All rights reserved.

  17. Posterior consistency for Bayesian inverse problems through stability and regression results

    International Nuclear Information System (INIS)

    Vollmer, Sebastian J

    2013-01-01

    In the Bayesian approach, the a priori knowledge about the input of a mathematical model is described via a probability measure. The joint distribution of the unknown input and the data is then conditioned, using Bayes’ formula, giving rise to the posterior distribution on the unknown input. In this setting we prove posterior consistency for nonlinear inverse problems: a sequence of data is considered, with diminishing fluctuations around a single truth and it is then of interest to show that the resulting sequence of posterior measures arising from this sequence of data concentrates around the truth used to generate the data. Posterior consistency justifies the use of the Bayesian approach very much in the same way as error bounds and convergence results for regularization techniques do. As a guiding example, we consider the inverse problem of reconstructing the diffusion coefficient from noisy observations of the solution to an elliptic PDE in divergence form. This problem is approached by splitting the forward operator into the underlying continuum model and a simpler observation operator based on the output of the model. In general, these splittings allow us to conclude posterior consistency provided a deterministic stability result for the underlying inverse problem and a posterior consistency result for the Bayesian regression problem with the push-forward prior. Moreover, we prove posterior consistency for the Bayesian regression problem based on the regularity, the tail behaviour and the small ball probabilities of the prior. (paper)

  18. Radiation dose reduction in computed tomography perfusion using spatial-temporal Bayesian methods

    Science.gov (United States)

    Fang, Ruogu; Raj, Ashish; Chen, Tsuhan; Sanelli, Pina C.

    2012-03-01

    In current computed tomography (CT) examinations, the associated X-ray radiation dose is of significant concern to patients and operators, especially in CT perfusion (CTP) imaging, which has a higher radiation dose due to its cine scanning technique. A simple and cost-effective means of performing the examinations is to lower the milliampere-seconds (mAs) parameter as far as reasonably achievable in data acquisition. However, lowering the mAs parameter unavoidably increases data noise and greatly degrades CT perfusion maps if no adequate noise control is applied during image reconstruction. To capture the essential dynamics of CT perfusion, a simple spatial-temporal Bayesian method that uses a piecewise parametric model of the residual function is employed, and the model parameters are then estimated from a Bayesian formulation with prior smoothness constraints on the perfusion parameters. From the fitted residual function, reliable CTP parameter maps are obtained from low-dose CT data. The merit of this scheme lies in combining an analytical piecewise residual function with a Bayesian framework using a simple prior spatial constraint for the CT perfusion application. On a dataset of 22 patients, this dynamic spatial-temporal Bayesian model yielded a 78% increase in signal-to-noise ratio (SNR) and a 40% decrease in mean square error (MSE) at a low radiation dose of 43 mA.

  19. Assessing the accuracy of ancestral protein reconstruction methods.

    Directory of Open Access Journals (Sweden)

    Paul D Williams

    2006-06-01

    The phylogenetic inference of ancestral protein sequences is a powerful technique for the study of molecular evolution, but any conclusions drawn from such studies are only as good as the accuracy of the reconstruction method. Every inference method leads to errors in the ancestral protein sequence, resulting in potentially misleading estimates of the ancestral protein's properties. To assess the accuracy of ancestral protein reconstruction methods, we performed computational population evolution simulations featuring near-neutral evolution under purifying selection, speciation, and divergence using an off-lattice protein model where fitness depends on the ability to be stable in a specified target structure. We were thus able to compare the thermodynamic properties of the true ancestral sequences with the properties of "ancestral sequences" inferred by maximum parsimony, maximum likelihood, and Bayesian methods. Surprisingly, we found that methods such as maximum parsimony and maximum likelihood that reconstruct a "best guess" amino acid at each position overestimate thermostability, while a Bayesian method that sometimes chooses less-probable residues from the posterior probability distribution does not. Maximum likelihood and maximum parsimony apparently tend to eliminate variants at a position that are slightly detrimental to structural stability simply because such detrimental variants are less frequent. Other properties of ancestral proteins might be similarly overestimated. This suggests that ancestral reconstruction studies require greater care to come to credible conclusions regarding functional evolution. Inferred functional patterns that mimic reconstruction bias should be reevaluated.

  20. Assessing the accuracy of ancestral protein reconstruction methods.

    Science.gov (United States)

    Williams, Paul D; Pollock, David D; Blackburne, Benjamin P; Goldstein, Richard A

    2006-06-23

    The phylogenetic inference of ancestral protein sequences is a powerful technique for the study of molecular evolution, but any conclusions drawn from such studies are only as good as the accuracy of the reconstruction method. Every inference method leads to errors in the ancestral protein sequence, resulting in potentially misleading estimates of the ancestral protein's properties. To assess the accuracy of ancestral protein reconstruction methods, we performed computational population evolution simulations featuring near-neutral evolution under purifying selection, speciation, and divergence using an off-lattice protein model where fitness depends on the ability to be stable in a specified target structure. We were thus able to compare the thermodynamic properties of the true ancestral sequences with the properties of "ancestral sequences" inferred by maximum parsimony, maximum likelihood, and Bayesian methods. Surprisingly, we found that methods such as maximum parsimony and maximum likelihood that reconstruct a "best guess" amino acid at each position overestimate thermostability, while a Bayesian method that sometimes chooses less-probable residues from the posterior probability distribution does not. Maximum likelihood and maximum parsimony apparently tend to eliminate variants at a position that are slightly detrimental to structural stability simply because such detrimental variants are less frequent. Other properties of ancestral proteins might be similarly overestimated. This suggests that ancestral reconstruction studies require greater care to come to credible conclusions regarding functional evolution. Inferred functional patterns that mimic reconstruction bias should be reevaluated.
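    The bias described here can be illustrated with a deliberately simplified sketch (a hypothetical two-state per-site posterior, not the paper's simulation): a "best guess" rule always returns the majority residue, so minority variants vanish from the reconstruction, whereas sampling from the posterior preserves their expected frequency.

```python
import numpy as np

rng = np.random.default_rng(0)

# hypothetical per-site posterior over two residues: a "stable" one (S)
# with probability 0.7 at every site, and a slightly destabilizing one
n_sites = 1000
p_S = np.full(n_sites, 0.7)

map_seq = p_S >= 0.5                 # "best guess": always picks S
bayes_seq = rng.random(n_sites) < p_S  # sample each site from the posterior

print(map_seq.mean())    # fraction of S under MAP: 1.0
print(bayes_seq.mean())  # fraction of S under sampling: ~0.7
```

    Since slightly destabilizing variants are systematically the minority, the MAP-style reconstruction contains none of them, which is exactly why it overestimates thermostability.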

  1. Structure-aware Bayesian compressive sensing for frequency-hopping spectrum estimation

    Science.gov (United States)

    Liu, Shengheng; Zhang, Yimin D.; Shan, Tao; Qin, Si; Amin, Moeness G.

    2016-05-01

    Frequency-hopping (FH) is one of the commonly used spread spectrum techniques that finds wide applications in communications and radar systems due to its capability of low probability of intercept, reduced interference, and desirable ambiguity property. In this paper, we consider the blind estimation of the instantaneous FH spectrum without the knowledge of hopping patterns. The FH signals are analyzed in the joint time-frequency domain, where FH signals manifest themselves as sparse entries, thus inviting compressive sensing and sparse reconstruction techniques for FH spectrum estimation. In particular, the signals' piecewise-constant frequency characteristics are exploited in the reconstruction of sparse quadratic time-frequency representations. The Bayesian compressive sensing methods are applied to provide high-resolution frequency estimation. The FH spectrum characteristics are used in the design of signal-dependent kernel within the framework of structure-aware sparse reconstruction.

  2. Bayesian networks for management of industrial risk

    International Nuclear Information System (INIS)

    Munteanu, P.; Debache, G.; Duval, C.

    2008-01-01

    This article presents the outlines of Bayesian network modelling and argues for its usefulness in probabilistic studies of industrial risk and reliability. A practical case representative of this type of study is presented in support of the argument. The article concludes with some research directions aimed at improving the performance of methods relying on Bayesian networks and at widening their application area in risk management. (authors)

  3. MCMC for parameters estimation by bayesian approach

    International Nuclear Information System (INIS)

    Ait Saadi, H.; Ykhlef, F.; Guessoum, A.

    2011-01-01

    This article discusses parameter estimation for dynamic systems by a Bayesian approach associated with Markov Chain Monte Carlo (MCMC) methods. The MCMC methods are powerful for approximating complex integrals, simulating joint distributions, and estimating marginal posterior distributions or posterior means. The Metropolis-Hastings algorithm has been widely used in Bayesian inference to approximate posterior densities. Calibrating the proposal distribution is one of the main issues of MCMC simulation in order to accelerate the convergence.
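The Metropolis-Hastings step the abstract refers to can be sketched in a few lines. Everything below is an illustrative assumption rather than the article's implementation: a Gaussian random-walk proposal with a fixed step size, applied to a toy standard-normal target.

```python
import math
import random

def metropolis_hastings(log_post, x0, n_samples, step=0.5, seed=0):
    """Random-walk Metropolis-Hastings: draw samples from a 1-D density
    known only up to a constant, via its log-density `log_post`."""
    rng = random.Random(seed)
    x, lp = x0, log_post(x0)
    samples = []
    for _ in range(n_samples):
        prop = x + rng.gauss(0.0, step)      # symmetric Gaussian proposal
        lp_prop = log_post(prop)
        # Accept with probability min(1, p(prop) / p(x))
        if math.log(rng.random()) < lp_prop - lp:
            x, lp = prop, lp_prop
        samples.append(x)
    return samples

# Toy target: standard normal, log p(x) = -x^2 / 2 up to a constant
chain = metropolis_hastings(lambda x: -0.5 * x * x, x0=3.0, n_samples=20000)
burned = chain[2000:]                        # discard burn-in
mean = sum(burned) / len(burned)             # should settle near 0
```

Because the proposal is symmetric, the acceptance test reduces to a comparison of log-posterior values; tuning `step` is exactly the proposal-calibration issue the article highlights.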

  4. Fully probabilistic design of hierarchical Bayesian models

    Czech Academy of Sciences Publication Activity Database

    Quinn, A.; Kárný, Miroslav; Guy, Tatiana Valentine

    2016-01-01

    Roč. 369, č. 1 (2016), s. 532-547 ISSN 0020-0255 R&D Projects: GA ČR GA13-13502S Institutional support: RVO:67985556 Keywords : Fully probabilistic design * Ideal distribution * Minimum cross-entropy principle * Bayesian conditioning * Kullback-Leibler divergence * Bayesian nonparametric modelling Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 4.832, year: 2016 http://library.utia.cas.cz/separaty/2016/AS/karny-0463052.pdf

  5. Capturing Business Cycles from a Bayesian Viewpoint

    OpenAIRE

    大鋸, 崇

    2011-01-01

    This paper is a survey of empirical studies analyzing business cycles from the perspective of Bayesian econometrics. Kim and Nelson (1998) use a hybrid model combining the dynamic factor model of Stock and Watson (1989) and the Markov switching model of Hamilton (1989). From this point of view, dealing with non-linear and non-Gaussian econometric models has recently become more important. Although classical econometric approaches have difficulty with these models, Bayesian approaches handle them easily. The fact leads heavy u...

  6. Variations on Bayesian Prediction and Inference

    Science.gov (United States)

    2016-05-09

    There are a number of statistical inference problems that are not generally formulated via a full probability model. For the problem of inference about an unknown parameter, the Bayesian approach requires a full probability model/likelihood, which can be an obstacle.

  7. A Bayesian classifier for symbol recognition

    OpenAIRE

    Barrat , Sabine; Tabbone , Salvatore; Nourrissier , Patrick

    2007-01-01

    URL: http://www.buyans.com/POL/UploadedFile/134_9977.pdf; International audience; We present in this paper an original adaptation of Bayesian networks to the symbol recognition problem. More precisely, a descriptor combination method is presented that significantly improves the recognition rate compared to the rates obtained by each descriptor alone. In this perspective, we use a simple Bayesian classifier, called naive Bayes. In fact, probabilistic graphical models, more spec...

  8. Bayesian Inference of Tumor Hypoxia

    Science.gov (United States)

    Gunawan, R.; Tenti, G.; Sivaloganathan, S.

    2009-12-01

    Tumor hypoxia is a state of oxygen deprivation in tumors. It has been associated with aggressive tumor phenotypes and with increased resistance to conventional cancer therapies. In this study, we report on the application of Bayesian sequential analysis in estimating the most probable value of tumor hypoxia quantification based on immunohistochemical assays of a biomarker. The 'gold standard' of tumor hypoxia assessment is a direct measurement of pO2 in vivo by the Eppendorf polarographic electrode, which is an invasive technique restricted to accessible sites and living tissues. An attractive alternative is immunohistochemical staining to detect proteins expressed by cells during hypoxia. Carbonic anhydrase IX (CAIX) is an enzyme expressed on the cell membrane during hypoxia to balance the immediate extracellular microenvironment. CAIX is widely regarded as a surrogate marker of chronic hypoxia in various cancers. The study was conducted with two different experimental procedures. The first data set was a group of three patients with invasive cervical carcinomas, from which five biopsies were obtained. Each of the biopsies was fully sectioned and from each section, the proportion of CAIX-positive cells was estimated. Measurements were made by image analysis of multiple deep sections cut through these biopsies, labeled for CAIX using both immunofluorescence and immunohistochemical techniques [1]. The second data set was a group of 24 patients, also with invasive cervical carcinomas, from which two biopsies were obtained. Bayesian parameter estimation was applied to obtain a reliable inference about the proportion of CAIX-positive cells within the carcinomas, based on the available biopsies. From the first data set, two to three biopsies were found to be sufficient to infer the overall CAIX percentage in the simple form: best estimate ± uncertainty. The second data set led to a similar result in 70% of the cases. In the remaining cases Bayes' theorem warned us
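Sequential Bayesian estimation of a proportion, such as the fraction of CAIX-positive cells, is classically handled with a conjugate Beta-Binomial update. The sketch below uses hypothetical biopsy counts for illustration, not the study's data:

```python
def beta_update(alpha, beta, positives, total):
    """Conjugate update: a Beta(alpha, beta) prior with Binomial data gives
    a Beta(alpha + positives, beta + total - positives) posterior."""
    return alpha + positives, beta + total - positives

def beta_summary(alpha, beta):
    """Posterior mean and standard deviation of Beta(alpha, beta)."""
    mean = alpha / (alpha + beta)
    var = alpha * beta / ((alpha + beta) ** 2 * (alpha + beta + 1))
    return mean, var ** 0.5

# Hypothetical biopsy counts: (CAIX-positive cells, total cells scored)
biopsies = [(120, 400), (95, 350), (130, 420)]

a, b = 1.0, 1.0                  # flat Beta(1, 1) prior
for pos, n in biopsies:          # sequential updating, one biopsy at a time
    a, b = beta_update(a, b, pos, n)

mean, sd = beta_summary(a, b)    # "best estimate" and its uncertainty
```

Watching how `sd` shrinks as each biopsy is folded in is the sequential-analysis logic behind asking how many biopsies suffice for a stable estimate.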

  9. Philosophy and the practice of Bayesian statistics.

    Science.gov (United States)

    Gelman, Andrew; Shalizi, Cosma Rohilla

    2013-02-01

    A substantial school in the philosophy of science identifies Bayesian inference with inductive inference and even rationality as such, and seems to be strengthened by the rise and practical success of Bayesian statistics. We argue that the most successful forms of Bayesian statistics do not actually support that particular philosophy but rather accord much better with sophisticated forms of hypothetico-deductivism. We examine the actual role played by prior distributions in Bayesian models, and the crucial aspects of model checking and model revision, which fall outside the scope of Bayesian confirmation theory. We draw on the literature on the consistency of Bayesian updating and also on our experience of applied work in social science. Clarity about these matters should benefit not just philosophy of science, but also statistical practice. At best, the inductivist view has encouraged researchers to fit and compare models without checking them; at worst, theorists have actively discouraged practitioners from performing model checking because it does not fit into their framework. © 2012 The British Psychological Society.

  10. Philosophy and the practice of Bayesian statistics

    Science.gov (United States)

    Gelman, Andrew; Shalizi, Cosma Rohilla

    2015-01-01

    A substantial school in the philosophy of science identifies Bayesian inference with inductive inference and even rationality as such, and seems to be strengthened by the rise and practical success of Bayesian statistics. We argue that the most successful forms of Bayesian statistics do not actually support that particular philosophy but rather accord much better with sophisticated forms of hypothetico-deductivism. We examine the actual role played by prior distributions in Bayesian models, and the crucial aspects of model checking and model revision, which fall outside the scope of Bayesian confirmation theory. We draw on the literature on the consistency of Bayesian updating and also on our experience of applied work in social science. Clarity about these matters should benefit not just philosophy of science, but also statistical practice. At best, the inductivist view has encouraged researchers to fit and compare models without checking them; at worst, theorists have actively discouraged practitioners from performing model checking because it does not fit into their framework. PMID:22364575

  11. Inferring the history of population size change from genome-wide SNP data.

    Science.gov (United States)

    Theunert, Christoph; Tang, Kun; Lachmann, Michael; Hu, Sile; Stoneking, Mark

    2012-12-01

    Dense, genome-wide single-nucleotide polymorphism (SNP) data can be used to reconstruct the demographic history of human populations. However, demographic inferences from such data are complicated by recombination and ascertainment bias. We introduce two new statistics, allele frequency-identity by descent (AF-IBD) and allele frequency-identity by state (AF-IBS), that make use of linkage disequilibrium information and show defined relationships to the time of coalescence. These statistics, when conditioned on the derived allele frequency, are able to infer complex population size changes. Moreover, the AF-IBS statistic, which is based on genome-wide SNP data, is robust to varying ascertainment conditions. We constructed an efficient approximate Bayesian computation (ABC) pipeline based on AF-IBD and AF-IBS that can accurately estimate demographic parameters, even for fairly complex models. Finally, we applied this ABC approach to genome-wide SNP data and inferred the demographic histories of two human populations, Yoruba and French. Our results suggest a rather stable ancestral population size with a mild recent expansion for Yoruba, whereas the French seemingly experienced a long-lasting severe bottleneck followed by a drastic population growth. This approach should prove useful for new insights into populations, especially those with complex demographic histories.

  12. Bayesian inference of synaptic quantal parameters from correlated vesicle release

    Directory of Open Access Journals (Sweden)

    Alexander D Bird

    2016-11-01

    Full Text Available Synaptic transmission is both history-dependent and stochastic, resulting in varying responses to presentations of the same presynaptic stimulus. This complicates attempts to infer synaptic parameters and has led to the proposal of a number of different strategies for their quantification. Recently Bayesian approaches have been applied to make more efficient use of the data collected in paired intracellular recordings. Methods have been developed that either provide a complete model of the distribution of amplitudes for isolated responses or approximate the amplitude distributions of a train of post-synaptic potentials, with correct short-term synaptic dynamics but neglecting correlations. In both cases the methods provided significantly improved inference of model parameters as compared to existing mean-variance fitting approaches. However, for synapses with high release probability, low vesicle number or relatively low restock rate and for data in which only one or few repeats of the same pattern are available, correlations between serial events can allow for the extraction of significantly more information from experiment: a more complete Bayesian approach would take this into account also. This has not been possible previously because of the technical difficulty in calculating the likelihood of amplitudes seen in correlated post-synaptic potential trains; however, recent theoretical advances have now rendered the likelihood calculation tractable for a broad class of synaptic dynamics models. Here we present a compact mathematical form for the likelihood in terms of a matrix product and demonstrate how marginals of the posterior provide information on covariance of parameter distributions. The associated computer code for Bayesian parameter inference for a variety of models of synaptic dynamics is provided in the supplementary material allowing for quantal and dynamical parameters to be readily inferred from experimental data sets.

  13. ABCtoolbox: a versatile toolkit for approximate Bayesian computations

    Directory of Open Access Journals (Sweden)

    Neuenschwander Samuel

    2010-03-01

    Full Text Available Abstract Background The estimation of demographic parameters from genetic data often requires the computation of likelihoods. However, the likelihood function is computationally intractable for many realistic evolutionary models, and the use of Bayesian inference has therefore been limited to very simple models. The situation changed recently with the advent of Approximate Bayesian Computation (ABC) algorithms allowing one to obtain parameter posterior distributions based on simulations not requiring likelihood computations. Results Here we present ABCtoolbox, a series of open source programs to perform Approximate Bayesian Computations (ABC). It implements various ABC algorithms including rejection sampling, MCMC without likelihood, a Particle-based sampler and ABC-GLM. ABCtoolbox is bundled with, but not limited to, a program that allows parameter inference in a population genetics context and the simultaneous use of different types of markers with different ploidy levels. In addition, ABCtoolbox can also interact with most simulation and summary statistics computation programs. The usability of ABCtoolbox is demonstrated by inferring the evolutionary history of two evolutionary lineages of Microtus arvalis. Using nuclear microsatellites and mitochondrial sequence data in the same estimation procedure enabled us to infer sex-specific population sizes and migration rates and to find that males show smaller population sizes but much higher levels of migration than females. Conclusion ABCtoolbox allows a user to perform all the necessary steps of a full ABC analysis: parameter sampling from prior distributions, data simulation, computation of summary statistics, estimation of posterior distributions, model choice, validation of the estimation procedure, and visualization of the results.
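The rejection-sampling variant of ABC mentioned above can be illustrated on a toy problem. The normal-mean model, uniform prior, and tolerance below are assumptions chosen for illustration and are unrelated to the Microtus arvalis analysis:

```python
import random
import statistics

def abc_rejection(observed, simulate, prior_sample, n_draws, tol, seed=1):
    """Rejection ABC: draw theta from the prior, simulate data under theta,
    and keep theta when the summary statistic falls within `tol` of the
    observed one. No likelihood evaluation is ever needed."""
    rng = random.Random(seed)
    s_obs = statistics.mean(observed)          # summary statistic
    accepted = []
    for _ in range(n_draws):
        theta = prior_sample(rng)
        sim = simulate(theta, len(observed), rng)
        if abs(statistics.mean(sim) - s_obs) < tol:
            accepted.append(theta)
    return accepted                             # approximate posterior sample

# Toy model: data ~ Normal(theta, 1), uniform prior on [-5, 5]
rng0 = random.Random(42)
data = [rng0.gauss(2.0, 1.0) for _ in range(50)]
post = abc_rejection(
    data,
    simulate=lambda th, n, rng: [rng.gauss(th, 1.0) for _ in range(n)],
    prior_sample=lambda rng: rng.uniform(-5, 5),
    n_draws=20000,
    tol=0.1,
)
estimate = statistics.mean(post)   # posterior mean, should sit near 2.0
```

Shrinking `tol` sharpens the approximation at the cost of fewer accepted draws; the more elaborate samplers listed in the abstract (MCMC without likelihood, ABC-GLM) exist precisely to soften that trade-off.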

  14. A Bayesian Algorithm for Assessing Uncertainty in Radionuclide Source Terms

    Science.gov (United States)

    Robins, Peter

    2015-04-01

    Inferring source term parameters for a radionuclide release is difficult, due to the large uncertainties in forward dispersion modelling as a consequence of imperfect knowledge pertaining to wind vector fields and turbulent diffusion in the Earth's atmosphere. Additional sources of error include the radionuclide measurements obtained from sensors. These measurements may either be subject to random fluctuations or are simple indications that the true, unobserved quantity is below a detection limit. Consequent large reconstruction uncertainties can render a "best" estimate meaningless. A Markov Chain Monte Carlo (MCMC) Bayesian algorithm is presented that attempts to account for uncertainties in atmospheric transport modelling and radionuclide sensor measurements to quantify uncertainties in radionuclide release source term parameters. Prior probability distributions are created for likely release locations at existing nuclear facilities and seismic events. Likelihood models are constructed using CTBTO adjoint modelling output and probability distributions of sensor response. Samples are generated from the resulting posterior probability distribution over multi-isotope source term parameters and can be used to make probabilistic statements about the source term. Examples are given of marginal probability distributions obtained from simulated sensor data. The consequences of errors in numerical weather prediction wind fields are demonstrated with a reconstruction of the Fukushima nuclear reactor accident from International Monitoring System radionuclide particulate sensor data.

  15. Aggregate Measures of Watershed Health from Reconstructed ...

    Science.gov (United States)

    Risk-based indices such as reliability, resilience, and vulnerability (R-R-V) have the potential to serve as watershed health assessment tools. Recent research has demonstrated the applicability of such indices for water quality (WQ) constituents such as total suspended solids and nutrients on an individual basis. However, the calculations can become tedious when time-series data for several WQ constituents have to be evaluated individually. Also, comparisons between locations with different sets of constituent data can prove difficult. In this study, data reconstruction using a relevance vector machine algorithm was combined with dimensionality reduction via variational Bayesian noisy principal component analysis to reconstruct and condense sparse multidimensional WQ data sets into a single time series. The methodology allows incorporation of uncertainty in both the reconstruction and dimensionality-reduction steps. The R-R-V values were calculated using the aggregate time series at multiple locations within two Indiana watersheds, serving as motivating examples. Results showed that uncertainty present in the reconstructed WQ data set propagates to the aggregate time series and subsequently to the aggregate R-R-V values as well. Locations with different WQ constituents and different standards for impairment were successfully combined to provide aggregate measures of R-R-V values. Comparisons with individual constituent R-R-V values showed that v

  16. EXONEST: The Bayesian Exoplanetary Explorer

    Directory of Open Access Journals (Sweden)

    Kevin H. Knuth

    2017-10-01

    Full Text Available The fields of astronomy and astrophysics are currently engaged in an unprecedented era of discovery as recent missions have revealed thousands of exoplanets orbiting other stars. While the Kepler Space Telescope mission has enabled most of these exoplanets to be detected by identifying transiting events, exoplanets often exhibit additional photometric effects that can be used to improve the characterization of exoplanets. The EXONEST Exoplanetary Explorer is a Bayesian exoplanet inference engine based on nested sampling and originally designed to analyze archived Kepler Space Telescope and CoRoT (Convection Rotation et Transits planétaires) exoplanet mission data. We discuss the EXONEST software package and describe how it accommodates plug-and-play models of exoplanet-associated photometric effects for the purpose of exoplanet detection, characterization and scientific hypothesis testing. The current suite of models allows for both circular and eccentric orbits in conjunction with photometric effects, such as the primary transit and secondary eclipse, reflected light, thermal emissions, ellipsoidal variations, Doppler beaming and superrotation. We discuss our new efforts to expand the capabilities of the software to include more subtle photometric effects involving reflected and refracted light. We discuss the EXONEST inference engine design and introduce our plans to port the current MATLAB-based EXONEST software package over to the next generation Exoplanetary Explorer, which will be a Python-based open source project with the capability to employ third-party plug-and-play models of exoplanet-related photometric effects.

  17. Maximum entropy and Bayesian methods

    International Nuclear Information System (INIS)

    Smith, C.R.; Erickson, G.J.; Neudorfer, P.O.

    1992-01-01

    Bayesian probability theory and Maximum Entropy methods are at the core of a new view of scientific inference. These 'new' ideas, along with the revolution in computational methods afforded by modern computers, allow astronomers, electrical engineers, image processors of any type, NMR chemists and physicists, and anyone at all who has to deal with incomplete and noisy data, to take advantage of methods that, in the past, have been applied only in some areas of theoretical physics. The title workshops have been the focus of a group of researchers from many different fields, and this diversity is evident in this book. There are tutorial and theoretical papers, and applications in a very wide variety of fields. Almost any instance of dealing with incomplete and noisy data can be usefully treated by these methods, and many areas of theoretical research are being enhanced by the thoughtful application of Bayes' theorem. Contributions contained in this volume present a state-of-the-art overview that will be influential and useful for many years to come

  18. Reconstructing the Alcatraz escape

    Science.gov (United States)

    Baart, F.; Hoes, O.; Hut, R.; Donchyts, G.; van Leeuwen, E.

    2014-12-01

    In the night of June 12, 1962 three inmates used a raft made of raincoats to escape the ultimate maximum-security prison island Alcatraz in San Francisco, United States. History is unclear about what happened to the escapees. At what time did they step into the water? Did they survive, and if so, where did they reach land? The fate of the escapees has been the subject of much debate: did they make landfall on Angel Island, or did the current sweep them out of the bay and into the cold Pacific Ocean? In this presentation, we try to shed light on this historic case using a visualization of a high-resolution hydrodynamic simulation of the San Francisco Bay, combined with historical tidal records. By reconstructing the hydrodynamic conditions and using a particle-based simulation of the escapees we show possible scenarios. The interactive model is visualized using both a 3D photorealistic and a web-based visualization. The "Escape from Alcatraz" scenario demonstrates the capabilities of the 3Di platform. This platform is normally used for overland flooding (1D/2D). The model engine uses a quad-tree structure, resulting in an order-of-magnitude speedup. The subgrid approach takes detailed bathymetry information into account. The inter-model variability is tested by comparing the results with the DFlow Flexible Mesh (DFlowFM) San Francisco Bay model. Interactivity is implemented by converting the models from static programs to interactive libraries, adhering to the Basic Model Interface (BMI). Interactive models are more suitable for answering exploratory research questions such as this reconstruction effort. Although these hydrodynamic simulations only provide circumstantial evidence for solving the mystery of what happened during the foggy dark night of June 12, 1962, they can be used as guidance and provide an interesting test case for applying interactive modelling.

  19. Bayesian soft x-ray tomography and MHD mode analysis on HL-2A

    Science.gov (United States)

    Li, Dong; Liu, Yi; Svensson, J.; Liu, Y. Q.; Song, X. M.; Yu, L. M.; Mao, Rui; Fu, B. Z.; Deng, Wei; Yuan, B. S.; Ji, X. Q.; Xu, Yuan; Chen, Wei; Zhou, Yan; Yang, Q. W.; Duan, X. R.; Liu, Yong; HL-2A Team

    2016-03-01

    A Bayesian tomography method using so-called Gaussian processes (GPs) for the emission model has been applied to the soft x-ray (SXR) diagnostics on the HL-2A tokamak. To improve the accuracy of reconstructions, the standard GP is extended to a non-stationary version so that different smoothness between the plasma center and the edge can be taken into account in the algorithm. The uncertainty in the reconstruction arising from measurement errors and limited measurement capability can be fully analyzed by the usage of Bayesian probability theory. In this work, the SXR reconstructions by this non-stationary Gaussian processes tomography (NSGPT) method have been compared with the equilibrium magnetic flux surfaces, generally achieving a satisfactory agreement in terms of both shape and position. In addition, singular-value-decomposition (SVD) and Fast Fourier Transform (FFT) techniques have been applied for the analysis of SXR and magnetic diagnostics, in order to explore the spatial and temporal features of the saturated long-lived magnetohydrodynamics (MHD) instability induced by energetic particles during neutral beam injection (NBI) on HL-2A. The result shows that this ideal internal kink instability has a dominant m/n  =  1/1 mode structure along with a harmonic m/n  =  2/2, which are coupled near the q  =  1 surface with a rotation frequency of 12 kHz.

  20. Holocene flooding history of the Lower Tagus Valley (Portugal)

    NARCIS (Netherlands)

    Vis, G.-J.; Bohncke, S.J.P.; Schneider, H.; Kasse, C.; Coenraads-Nederveen, S.; Zuurbier, K.; Rozema, J.

    2010-01-01

    The present paper aims to reconstruct the Lower Tagus Valley flooding history for the last ca. 6500 a, to explore the suitability of pollen-based local vegetation development in supporting the reconstruction of flooding history, and to explain fluvial activity changes in terms of allogenic (climate,

  1. Authenticating History With Oral Narratives: The Example of Ekajuk ...

    African Journals Online (AJOL)

    It is generally accepted that oral narratives serve as a veritable means for historical reconstruction. This holds true, particularly in societies where written documents do not subsist. The Ekajuk community, though very warlike, is a relatively small community that lacks a written history. The attempt to reconstruct the history of ...

  2. Dimensionality reduction in Bayesian estimation algorithms

    Directory of Open Access Journals (Sweden)

    G. W. Petty

    2013-09-01

    Full Text Available An idealized synthetic database loosely resembling 3-channel passive microwave observations of precipitation against a variable background is employed to examine the performance of a conventional Bayesian retrieval algorithm. For this dataset, algorithm performance is found to be poor owing to an irreconcilable conflict between the need to find matches in the dependent database versus the need to exclude inappropriate matches. It is argued that the likelihood of such conflicts increases sharply with the dimensionality of the observation space of real satellite sensors, which may utilize 9 to 13 channels to retrieve precipitation, for example. An objective method is described for distilling the relevant information content from N real channels into a much smaller number (M) of pseudochannels while also regularizing the background (geophysical plus instrument) noise component. The pseudochannels are linear combinations of the original N channels obtained via a two-stage principal component analysis of the dependent dataset. Bayesian retrievals based on a single pseudochannel applied to the independent dataset yield striking improvements in overall performance. The differences between the conventional Bayesian retrieval and reduced-dimensional Bayesian retrieval suggest that a major potential problem with conventional multichannel retrievals – whether Bayesian or not – lies in the common but often inappropriate assumption of diagonal error covariance. The dimensional reduction technique described herein avoids this problem by, in effect, recasting the retrieval problem in a coordinate system in which the desired covariance is lower-dimensional, diagonal, and unit magnitude.

  3. Dimensionality reduction in Bayesian estimation algorithms

    Science.gov (United States)

    Petty, G. W.

    2013-09-01

    An idealized synthetic database loosely resembling 3-channel passive microwave observations of precipitation against a variable background is employed to examine the performance of a conventional Bayesian retrieval algorithm. For this dataset, algorithm performance is found to be poor owing to an irreconcilable conflict between the need to find matches in the dependent database versus the need to exclude inappropriate matches. It is argued that the likelihood of such conflicts increases sharply with the dimensionality of the observation space of real satellite sensors, which may utilize 9 to 13 channels to retrieve precipitation, for example. An objective method is described for distilling the relevant information content from N real channels into a much smaller number (M) of pseudochannels while also regularizing the background (geophysical plus instrument) noise component. The pseudochannels are linear combinations of the original N channels obtained via a two-stage principal component analysis of the dependent dataset. Bayesian retrievals based on a single pseudochannel applied to the independent dataset yield striking improvements in overall performance. The differences between the conventional Bayesian retrieval and reduced-dimensional Bayesian retrieval suggest that a major potential problem with conventional multichannel retrievals - whether Bayesian or not - lies in the common but often inappropriate assumption of diagonal error covariance. The dimensional reduction technique described herein avoids this problem by, in effect, recasting the retrieval problem in a coordinate system in which the desired covariance is lower-dimensional, diagonal, and unit magnitude.
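The core of the pseudochannel construction, projecting the N channels onto leading principal components of the dependent dataset, can be sketched as a single-stage PCA. The method described above is two-stage and also regularizes the noise component; the synthetic 3-channel data here are hypothetical:

```python
import numpy as np

def pca_pseudochannels(X, m):
    """Project N-channel observations onto the m leading principal
    components ("pseudochannels") of the sample covariance."""
    Xc = X - X.mean(axis=0)            # center each channel
    cov = np.cov(Xc, rowvar=False)     # N x N channel covariance
    vals, vecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
    top = vecs[:, ::-1][:, :m]         # m leading eigenvectors
    return Xc @ top                    # samples x m pseudochannels

# Synthetic 3-channel data where one common direction carries most variance
rng = np.random.default_rng(0)
base = rng.normal(size=(500, 1))
X = np.hstack([base + 0.05 * rng.normal(size=(500, 1)) for _ in range(3)])
Z = pca_pseudochannels(X, m=1)         # distill 3 channels into 1
```

By construction the pseudochannels are uncorrelated, which is what makes the diagonal-covariance assumption criticized in the abstract defensible in the reduced coordinate system.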

  4. Classifying emotion in Twitter using Bayesian network

    Science.gov (United States)

    Surya Asriadie, Muhammad; Syahrul Mubarok, Mohamad; Adiwijaya

    2018-03-01

    Language is used to express not only facts but also emotions. Emotions are noticeable in behavior and even in the social media statuses a person writes. Analysis of emotions in text is done in a variety of media such as Twitter. This paper studies classification of emotions on Twitter using Bayesian networks because of their ability to model uncertainty and relationships between features. The result is two models based on Bayesian networks: Full Bayesian Network (FBN) and Bayesian Network with Mood Indicator (BNM). FBN is a massive Bayesian network where each word is treated as a node. The study shows that the method used to train FBN is not very effective at creating the best model, and it performs worse than Naive Bayes: the F1-score for FBN is 53.71%, while for Naive Bayes it is 54.07%. BNM is proposed as an alternative method based on an improvement of Multinomial Naive Bayes, with much lower computational complexity than FBN. Even though it is not better than FBN, the resulting model successfully improves the performance of Multinomial Naive Bayes: the F1-score for the Multinomial Naive Bayes model is 51.49%, while for BNM it is 52.14%.
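The Multinomial Naive Bayes baseline that BNM improves on can be written compactly. The toy documents and emotion labels below are hypothetical, not the Twitter dataset from the paper:

```python
import math
from collections import Counter

class MultinomialNB:
    """Multinomial naive Bayes with Laplace (add-one) smoothing:
    P(c | doc) is proportional to P(c) * product over words w of P(w | c)."""

    def fit(self, docs, labels):
        self.classes = set(labels)
        self.prior = {c: math.log(labels.count(c) / len(labels))
                      for c in self.classes}
        self.word_counts = {c: Counter() for c in self.classes}
        for doc, c in zip(docs, labels):
            self.word_counts[c].update(doc.split())
        self.vocab = {w for cnt in self.word_counts.values() for w in cnt}
        return self

    def predict(self, doc):
        def log_post(c):
            cnt = self.word_counts[c]
            total = sum(cnt.values())
            return self.prior[c] + sum(
                math.log((cnt[w] + 1) / (total + len(self.vocab)))
                for w in doc.split() if w in self.vocab)
        return max(self.classes, key=log_post)

clf = MultinomialNB().fit(
    ["so happy great day", "love this awesome", "sad and angry", "terrible awful day"],
    ["joy", "joy", "anger", "anger"])
label = clf.predict("what an awesome great day")   # expected: "joy"
```

Treating every word as conditionally independent given the emotion is exactly the assumption the paper's Bayesian-network models relax by modelling relationships between features.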

  5. How few? Bayesian statistics in injury biomechanics.

    Science.gov (United States)

    Cutcliffe, Hattie C; Schmidt, Allison L; Lucas, Joseph E; Bass, Cameron R

    2012-10-01

    In injury biomechanics, there are currently no general a priori estimates of how few specimens are necessary to obtain sufficiently accurate injury risk curves for a given underlying distribution. Further, several methods are available for constructing these curves, and recent methods include Bayesian survival analysis. This study used statistical simulations to evaluate the fidelity of different injury risk methods using limited sample sizes across four different underlying distributions. Five risk curve techniques were evaluated, including Bayesian techniques. For the Bayesian analyses, various prior distributions were assessed, each incorporating more accurate information. Simulated subject injury and biomechanical input values were randomly sampled from each underlying distribution, and injury status was determined by comparing these values. Injury risk curves were developed for this data using each technique for various small sample sizes; for each, analyses on 2000 simulated data sets were performed. Resulting median predicted risk values and confidence intervals were compared with the underlying distributions. Across conditions, the standard and Bayesian survival analyses better represented the underlying distributions included in this study, especially for extreme (1, 10, and 90%) risk. This study demonstrates that the value of the Bayesian analysis is the use of informed priors. As the mean of the prior approaches the actual value, the sample size necessary for good reproduction of the underlying distribution with small confidence intervals can be as small as 2. This study provides estimates of confidence intervals and number of samples to allow the selection of the most appropriate sample sizes given known information.

  6. A default Bayesian hypothesis test for mediation.

    Science.gov (United States)

    Nuijten, Michèle B; Wetzels, Ruud; Matzke, Dora; Dolan, Conor V; Wagenmakers, Eric-Jan

    2015-03-01

    In order to quantify the relationship between multiple variables, researchers often carry out a mediation analysis. In such an analysis, a mediator (e.g., knowledge of a healthy diet) transmits the effect from an independent variable (e.g., classroom instruction on a healthy diet) to a dependent variable (e.g., consumption of fruits and vegetables). Almost all mediation analyses in psychology use frequentist estimation and hypothesis-testing techniques. A recent exception is Yuan and MacKinnon (Psychological Methods, 14, 301-322, 2009), who outlined a Bayesian parameter estimation procedure for mediation analysis. Here we complete the Bayesian alternative to frequentist mediation analysis by specifying a default Bayesian hypothesis test based on the Jeffreys-Zellner-Siow approach. We further extend this default Bayesian test by allowing a comparison to directional or one-sided alternatives, using Markov chain Monte Carlo techniques implemented in JAGS. All Bayesian tests are implemented in the R package BayesMed (Nuijten, Wetzels, Matzke, Dolan, & Wagenmakers, 2014).

  7. Computationally efficient Bayesian inference for inverse problems.

    Energy Technology Data Exchange (ETDEWEB)

    Marzouk, Youssef M.; Najm, Habib N.; Rahn, Larry A.

    2007-10-01

    Bayesian statistics provides a foundation for inference from noisy and incomplete data, a natural mechanism for regularization in the form of prior information, and a quantitative assessment of uncertainty in the inferred results. Inverse problems - representing indirect estimation of model parameters, inputs, or structural components - can be fruitfully cast in this framework. Complex and computationally intensive forward models arising in physical applications, however, can render a Bayesian approach prohibitive. This difficulty is compounded by high-dimensional model spaces, as when the unknown is a spatiotemporal field. We present new algorithmic developments for Bayesian inference in this context, showing strong connections with the forward propagation of uncertainty. In particular, we introduce a stochastic spectral formulation that dramatically accelerates the Bayesian solution of inverse problems via rapid evaluation of a surrogate posterior. We also explore dimensionality reduction for the inference of spatiotemporal fields, using truncated spectral representations of Gaussian process priors. These new approaches are demonstrated on scalar transport problems arising in contaminant source inversion and in the inference of inhomogeneous material or transport properties. We also present a Bayesian framework for parameter estimation in stochastic models, where intrinsic stochasticity may be intermingled with observational noise. Evaluation of a likelihood function may not be analytically tractable in these cases, and thus several alternative Markov chain Monte Carlo (MCMC) schemes, operating on the product space of the observations and the parameters, are introduced.

  8. iSEDfit: Bayesian spectral energy distribution modeling of galaxies

    Science.gov (United States)

    Moustakas, John

    2017-08-01

    iSEDfit uses Bayesian inference to extract the physical properties of galaxies from their observed broadband photometric spectral energy distribution (SED). In its default mode, the inputs to iSEDfit are the measured photometry (fluxes and corresponding inverse variances) and a measurement of the galaxy redshift. Alternatively, iSEDfit can be used to estimate photometric redshifts from the input photometry alone. After the priors have been specified, iSEDfit calculates the marginalized posterior probability distributions for the physical parameters of interest, including the stellar mass, star-formation rate, dust content, star formation history, and stellar metallicity. iSEDfit also optionally computes K-corrections and produces multiple "quality assurance" (QA) plots at each stage of the modeling procedure to aid in the interpretation of the prior parameter choices and subsequent fitting results. The software is distributed as part of the impro IDL suite.

  9. Bayesians versus frequentists a philosophical debate on statistical reasoning

    CERN Document Server

    Vallverdú, Jordi

    2016-01-01

    This book analyzes the origins of statistical thinking as well as its related philosophical questions, such as causality, determinism or chance. Bayesian and frequentist approaches are subjected to a historical, cognitive and epistemological analysis, making it possible to not only compare the two competing theories, but to also find a potential solution. The work pursues a naturalistic approach, proceeding from the existence of numerosity in natural environments to the existence of contemporary formulas and methodologies to heuristic pragmatism, a concept introduced in the book’s final section. This monograph will be of interest to philosophers and historians of science and students in related fields. Despite the mathematical nature of the topic, no statistical background is required, making the book a valuable read for anyone interested in the history of statistics and human cognition.

  10. A Sparse Bayesian Imaging Technique for Efficient Recovery of Reservoir Channels With Time-Lapse Seismic Measurements

    KAUST Repository

    Sana, Furrukh

    2016-06-01

    Subsurface reservoir flow channels are characterized by high permeability values and serve as preferred pathways for fluid propagation. Accurate estimation of their geophysical structures is thus of great importance for the oil industry. The ensemble Kalman filter (EnKF) is a widely used statistical technique for estimating subsurface reservoir model parameters. However, accurate reconstruction of the subsurface geological features with the EnKF is challenging because of the limited measurements available from the wells and the smoothing effects imposed by the ℓ2-norm nature of its update step. A new EnKF scheme based on sparse domain representation was introduced by Sana et al. (2015) to incorporate useful prior structural information in the estimation process for efficient recovery of subsurface channels. In this paper, we extend this work in two ways: 1) investigate the effects of incorporating time-lapse seismic data on the channel reconstruction; and 2) explore a Bayesian sparse reconstruction algorithm with the potential ability to reduce the computational requirements. Numerical results suggest that the performance of the new sparse Bayesian EnKF scheme is enhanced with the availability of seismic measurements, leading to further improvement in the recovery of flow channel structures. The sparse Bayesian approach further provides a computationally efficient framework for enforcing a sparse solution, especially with the possibility of using high sparsity rates through the inclusion of seismic data.

  11. Re-telling, Re-evaluating and Re-constructing

    Directory of Open Access Journals (Sweden)

    Gorana Tolja

    2013-11-01

    Full Text Available Graphic History: Essays on Graphic Novels and/as History (2012) is a collection of 14 unique essays, edited by scholar Richard Iadonisi, that explores a variety of complex issues within the graphic novel medium as a means of historical narration. The essays address the issues of accuracy in recounting history, history as re-constructed, and the ethics surrounding historical narration.

  12. Bayesian analysis of MEG visual evoked responses

    Energy Technology Data Exchange (ETDEWEB)

    Schmidt, D.M.; George, J.S.; Wood, C.C.

    1999-04-01

    The authors developed a method for analyzing neural electromagnetic data that allows probabilistic inferences to be drawn about regions of activation. The method involves the generation of a large number of possible solutions which both fit the data and match prior expectations about the nature of probable solutions, made explicit by a Bayesian formalism. In addition, they have introduced a model for the current distributions that produce MEG and EEG data that allows extended regions of activity, and can easily incorporate prior information such as anatomical constraints from MRI. To evaluate the feasibility and utility of the Bayesian approach with actual data, they analyzed MEG data from a visual evoked response experiment. They compared Bayesian analyses of MEG responses to visual stimuli in the left and right visual fields, in order to examine the sensitivity of the method to detect known features of human visual cortex organization. They also examined the changing pattern of cortical activation as a function of time.

  13. Empirical Bayesian inference and model uncertainty

    International Nuclear Information System (INIS)

    Poern, K.

    1994-01-01

    This paper presents a hierarchical or multistage empirical Bayesian approach for the estimation of uncertainty concerning the intensity of a homogeneous Poisson process. A class of contaminated gamma distributions is considered to describe the uncertainty concerning the intensity. These distributions in turn are defined through a set of secondary parameters, the knowledge of which is also described and updated via Bayes' formula. This two-stage Bayesian approach is an example where the modeling uncertainty is treated in a comprehensive way. Each contaminated gamma distribution, represented by a point in the 3D space of secondary parameters, can be considered as a specific model of the uncertainty about the Poisson intensity. Then, by the empirical Bayesian method, each individual model is assigned a posterior probability.

  14. Bayesian modeling of unknown diseases for biosurveillance.

    Science.gov (United States)

    Shen, Yanna; Cooper, Gregory F

    2009-11-14

    This paper investigates Bayesian modeling of unknown causes of events in the context of disease-outbreak detection. We introduce a Bayesian approach that models and detects both (1) known diseases (e.g., influenza and anthrax) by using informative prior probabilities and (2) unknown diseases (e.g., a new, highly contagious respiratory virus that has never been seen before) by using relatively non-informative prior probabilities. We report the results of simulation experiments showing that this modeling method can improve the detection of new disease outbreaks in a population. A key contribution of this paper is that it introduces a Bayesian approach for jointly modeling both known and unknown causes of events. Such modeling has broad applicability in medical informatics, where the space of known causes of outcomes of interest is seldom complete.

  15. Breast reconstruction after mastectomy

    Directory of Open Access Journals (Sweden)

    Daniel eSchmauss

    2016-01-01

    Full Text Available Breast cancer is the leading cause of cancer death in women worldwide. Its surgical approach has become less and less mutilating in the last decades. However, the overall number of breast reconstructions has significantly increased lately. Nowadays breast reconstruction should be individualized at its best, first of all taking into consideration oncological aspects of the tumor, neo-/adjuvant treatment and genetic predisposition, but also its timing (immediate versus delayed breast reconstruction), as well as the patient's condition and wishes. This article gives an overview of the various possibilities of breast reconstruction, including implant- and expander-based reconstruction, flap-based reconstruction (vascularized autologous tissue), the combination of implant and flap, reconstruction using non-vascularized autologous fat, as well as refinement surgery after breast reconstruction.

  16. Head and face reconstruction

    Science.gov (United States)

    ... this page: //medlineplus.gov/ency/article/002980.htm Head and face reconstruction is surgery to repair or ...

  17. Breast reconstruction - natural tissue

    Science.gov (United States)

    ... After a mastectomy, some women choose to have cosmetic surgery to remake their breast. This type of surgery ...

  18. Bayesian disease mapping: hierarchical modeling in spatial epidemiology

    National Research Council Canada - National Science Library

    Lawson, Andrew

    2013-01-01

    ... Exploring these new developments, Bayesian Disease Mapping: Hierarchical Modeling in Spatial Epidemiology, Second Edition provides an up-to-date, cohesive account of the full range of Bayesian disease mapping methods and applications...

  19. Bayesian Inference in Polling Technique: 1992 Presidential Polls.

    Science.gov (United States)

    Satake, Eiki

    1994-01-01

    Explores the potential utility of Bayesian statistical methods in determining the predictability of multiple polls. Compares Bayesian techniques to the classical statistical method employed by pollsters. Considers these questions in the context of the 1992 presidential elections. (HB)

  20. The Bayesian Approach to Association

    Science.gov (United States)

    Arora, N. S.

    2017-12-01

    The Bayesian approach to association focuses mainly on quantifying the physics of the domain. In the case of seismic association, for instance, let X be the set of all significant events (above some threshold) and their attributes, such as location, time, and magnitude; Y1 the set of detections that are caused by significant events and their attributes, such as seismic phase, arrival time, amplitude, etc.; Y2 the set of detections that are not caused by significant events; and finally Y the set of observed detections. We then define the joint distribution P(X, Y1, Y2, Y) = P(X) P(Y1 | X) P(Y2) I(Y = Y1 + Y2), where the last term simply states that Y1 and Y2 are a partitioning of Y. Given this joint distribution, the inference problem is to find the X, Y1, and Y2 that maximize the posterior probability P(X, Y1, Y2 | Y), which reduces to maximizing P(X) P(Y1 | X) P(Y2) I(Y = Y1 + Y2). In this expression P(X) captures our prior belief about event locations; P(Y1 | X) captures notions of travel time, residual error distributions, and detection and mis-detection probabilities; and P(Y2) captures the false detection rate of our seismic network. The elegance of this approach is that all of the assumptions are stated clearly in the models for P(X), P(Y1 | X), and P(Y2); the implementation of the inference is merely a by-product of this model. In contrast, some other methods, such as GA, hide a number of assumptions in the implementation details of the inference - the so-called "driver cells," for example. Another important aspect of this approach is that all seismic knowledge, including knowledge from other domains such as infrasound and hydroacoustics, can be included in the same model, so we do not need to separately account for misdetections or merge seismic and infrasound events as a separate step.
Finally, it should be noted that the objective of automatic association is to simplify the job of humans who are publishing seismic bulletins based on this
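The joint model above can be exercised on a toy problem: a minimal sketch assuming a single fixed candidate event (so P(X) is constant and omitted), one known predicted travel time, Gaussian arrival-time residuals for event-caused detections, and a uniform-in-window Poisson model for false alarms; all numeric values are invented for illustration. Enumerating every partition of the detections into Y1 and Y2 and scoring log P(Y1 | X) + log P(Y2) recovers the most probable association.

```python
import itertools, math

# Toy setup: one candidate event at origin time T0; the station's
# predicted travel time is TRAVEL, with Gaussian residual SIGMA.
T0, TRAVEL, SIGMA = 100.0, 30.0, 1.0     # invented toy values
WINDOW, FALSE_RATE = 200.0, 0.02         # obs window (s), false alarms / s

def log_gauss(x, mu, sigma):
    return -0.5 * math.log(2 * math.pi * sigma ** 2) \
           - (x - mu) ** 2 / (2 * sigma ** 2)

def log_score(detections, caused):
    """log P(Y1 | X) + log P(Y2) for one partition of the detections."""
    score, n_false = 0.0, 0
    for t, is_caused in zip(detections, caused):
        if is_caused:
            score += log_gauss(t, T0 + TRAVEL, SIGMA)   # arrival residual
        else:
            n_false += 1
            score += math.log(1.0 / WINDOW)             # uniform false time
    lam = FALSE_RATE * WINDOW                           # expected false count
    score += n_false * math.log(lam) - lam - math.lgamma(n_false + 1)
    return score

detections = [130.4, 57.1, 171.9]   # arrival times in seconds
best = max(itertools.product([False, True], repeat=len(detections)),
           key=lambda caused: log_score(detections, caused))
print(best)
```

Only the first arrival time is close to the predicted 130.0 s, so the maximizing partition marks it as event-caused and the other two as false alarms; real systems replace the brute-force enumeration with MCMC or greedy search.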

  1. Image reconstruction using neutrongraphy

    International Nuclear Information System (INIS)

    Crispim, V.R.; Lopes, R.T.; Borges, J.C.

    1986-01-01

    Many factors influence the determination of projections in the process of image reconstruction using the neutrongraphy technique. In this work the Wiener filter was applied to the projections obtained from an object, in order to minimize the effect of these factors on the quality of the reconstructed image. The MART (Multiplicative Algebraic Reconstruction Technique) algorithm was used. Qualitative and quantitative comparisons were made between the original images and those reconstructed using the MART algorithm with and without the filter. (Author) [pt

  2. Bayesian analysis of spatial point processes in the neighbourhood of Voronoi networks

    DEFF Research Database (Denmark)

    Skare, Øivind; Møller, Jesper; Vedel Jensen, Eva B.

    A model for an inhomogeneous Poisson process with high intensity near the edges of a Voronoi tessellation in 2D or 3D is proposed. The model is analysed in a Bayesian setting with priors on nuclei of the Voronoi tessellation and other model parameters. An MCMC algorithm is constructed to sample...... from the posterior, which contains information about the unobserved Voronoi tessellation and the model parameters. A major element of the MCMC algorithm is the reconstruction of the Voronoi tessellation after a proposed local change of the tessellation. A simulation study and examples of applications...

  3. Bayesian analysis of spatial point processes in the neighbourhood of Voronoi networks

    DEFF Research Database (Denmark)

    Skare, Øivind; Møller, Jesper; Jensen, Eva Bjørn Vedel

    2007-01-01

    A model for an inhomogeneous Poisson process with high intensity near the edges of a Voronoi tessellation in 2D or 3D is proposed. The model is analysed in a Bayesian setting with priors on nuclei of the Voronoi tessellation and other model parameters. An MCMC algorithm is constructed to sample...... from the posterior, which contains information about the unobserved Voronoi tessellation and the model parameters. A major element of the MCMC algorithm is the reconstruction of the Voronoi tessellation after a proposed local change of the tessellation. A simulation study and examples of applications...

  4. Bayesian estimation and tracking a practical guide

    CERN Document Server

    Haug, Anton J

    2012-01-01

    A practical approach to estimating and tracking dynamic systems in real-world applications. Much of the literature on performing estimation for non-Gaussian systems is short on practical methodology, while Gaussian methods often lack a cohesive derivation. Bayesian Estimation and Tracking addresses the gap in the field on both accounts, providing readers with a comprehensive overview of methods for estimating both linear and nonlinear dynamic systems driven by Gaussian and non-Gaussian noises. Featuring a unified approach to Bayesian estimation and tracking, the book emphasizes the derivation

  5. Nonparametric Bayesian Modeling of Complex Networks

    DEFF Research Database (Denmark)

    Schmidt, Mikkel Nørgaard; Mørup, Morten

    2013-01-01

    Modeling structure in complex networks using Bayesian nonparametrics makes it possible to specify flexible model structures and infer the adequate model complexity from the observed data. This article provides a gentle introduction to nonparametric Bayesian modeling of complex networks: Using an infinite mixture model as running example, we go through the steps of deriving the model as an infinite limit of a finite parametric model, inferring the model parameters by Markov chain Monte Carlo, and checking the model's fit and predictive performance. We explain how advanced nonparametric models...

  6. Motion Learning Based on Bayesian Program Learning

    Directory of Open Access Journals (Sweden)

    Cheng Meng-Zhen

    2017-01-01

    Full Text Available The concept of the virtual human has been highly anticipated since the 1980s. Using computer technology, human motion simulation can generate authentic visual effects that convincingly fool the human eye. Bayesian Program Learning trains on one or a few motion sequences and generates new motion data by decomposing and recombining them, and the generated motion is more realistic and natural than the traditional one. In this paper, motion learning based on Bayesian Program Learning allows us to quickly generate new motion data, reduce workload, improve work efficiency, reduce the cost of motion capture, and improve the reusability of data.

  7. Bayesian inference and the parametric bootstrap

    Science.gov (United States)

    Efron, Bradley

    2013-01-01

    The parametric bootstrap can be used for the efficient computation of Bayes posterior distributions. Importance sampling formulas take on an easy form relating to the deviance in exponential families, and are particularly simple starting from Jeffreys invariant prior. Because of the i.i.d. nature of bootstrap sampling, familiar formulas describe the computational accuracy of the Bayes estimates. Besides computational methods, the theory provides a connection between Bayesian and frequentist analysis. Efficient algorithms for the frequentist accuracy of Bayesian inferences are developed and demonstrated in a model selection example. PMID:23843930
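A minimal sketch of the core idea, using an assumed toy Normal-mean model with known variance and a conjugate Normal prior (invented numbers, not Efron's worked example): parametric-bootstrap replications of the MLE are reweighted by importance weights to approximate the posterior. For this particular model the bootstrap sampling density has the same shape as the likelihood, so the importance weight reduces to the prior evaluated at each replication.

```python
import random, math

# Toy model: x_i ~ N(theta, sigma^2) with sigma known.
rng = random.Random(1)
sigma, n = 2.0, 25
theta_true = 3.0
data = [rng.gauss(theta_true, sigma) for _ in range(n)]
theta_hat = sum(data) / n
se = sigma / math.sqrt(n)

m0, s0 = 0.0, 1.0                      # conjugate N(m0, s0^2) prior
def prior(t):
    return math.exp(-(t - m0) ** 2 / (2 * s0 ** 2))

# Parametric bootstrap: theta* ~ N(theta_hat, sigma^2/n).
# Importance weight = prior * likelihood / bootstrap density; here the
# last two cancel in shape, leaving just the prior.
B = 100000
boot = [rng.gauss(theta_hat, se) for _ in range(B)]
weights = [prior(t) for t in boot]
post_mean = sum(w * t for w, t in zip(weights, boot)) / sum(weights)

# Conjugate closed form for comparison
analytic = (m0 / s0**2 + n * theta_hat / sigma**2) / (1 / s0**2 + n / sigma**2)
print(post_mean, analytic)
```

The same reweighting works when no closed form exists, which is the practical point of the paper; the conjugate case is used here only so the answer can be checked.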

  8. Length Scales in Bayesian Automatic Adaptive Quadrature

    Directory of Open Access Journals (Sweden)

    Adam Gh.

    2016-01-01

    Full Text Available Two conceptual developments in the Bayesian automatic adaptive quadrature approach to the numerical solution of one-dimensional Riemann integrals [Gh. Adam, S. Adam, Springer LNCS 7125, 1–16 (2012)] are reported. First, it is shown that the numerical quadrature which avoids the overcomputing and minimizes the hidden floating point loss of precision asks for the consideration of three classes of integration domain lengths endowed with specific quadrature sums: microscopic (trapezoidal rule), mesoscopic (Simpson rule), and macroscopic (quadrature sums of high algebraic degrees of precision). Second, sensitive diagnostic tools for the Bayesian inference on macroscopic ranges, coming from the use of Clenshaw-Curtis quadrature, are derived.

  9. Length Scales in Bayesian Automatic Adaptive Quadrature

    Science.gov (United States)

    Adam, Gh.; Adam, S.

    2016-02-01

    Two conceptual developments in the Bayesian automatic adaptive quadrature approach to the numerical solution of one-dimensional Riemann integrals [Gh. Adam, S. Adam, Springer LNCS 7125, 1-16 (2012)] are reported. First, it is shown that the numerical quadrature which avoids the overcomputing and minimizes the hidden floating point loss of precision asks for the consideration of three classes of integration domain lengths endowed with specific quadrature sums: microscopic (trapezoidal rule), mesoscopic (Simpson rule), and macroscopic (quadrature sums of high algebraic degrees of precision). Second, sensitive diagnostic tools for the Bayesian inference on macroscopic ranges, coming from the use of Clenshaw-Curtis quadrature, are derived.
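The degree-of-precision distinction between the microscopic and mesoscopic quadrature sums can be illustrated directly (a generic sketch, not the authors' code): the trapezoidal rule is exact for polynomials up to degree 1, while the Simpson rule is exact up to degree 3.

```python
def trapezoid(f, a, b):
    """Trapezoidal rule: exact for polynomials up to degree 1."""
    return (b - a) * (f(a) + f(b)) / 2.0

def simpson(f, a, b):
    """Simpson rule: exact for polynomials up to degree 3."""
    m = (a + b) / 2.0
    return (b - a) * (f(a) + 4.0 * f(m) + f(b)) / 6.0

# Algebraic degrees of precision on [0, 1]:
print(trapezoid(lambda x: 2*x + 1, 0.0, 1.0))   # exact: 2.0
print(simpson(lambda x: x**3, 0.0, 1.0))        # exact: 0.25
print(trapezoid(lambda x: x**3, 0.0, 1.0))      # overestimates: 0.5
```

This is why, in the approach above, the cheap trapezoidal sum suffices only on microscopic lengths, while longer subranges warrant Simpson or higher-degree quadrature sums.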

  10. Modelling access to renal transplantation waiting list in a French healthcare network using a Bayesian method.

    Science.gov (United States)

    Bayat, Sahar; Cuggia, Marc; Kessler, Michel; Briançon, Serge; Le Beux, Pierre; Frimat, Luc

    2008-01-01

    Evaluation of adult candidates for kidney transplantation diverges from one centre to another. Our purpose was to assess the suitability of a Bayesian method for describing the factors associated with registration on the waiting list in a French healthcare network. We have found no published paper using Bayesian methods in this domain. Eight hundred and nine patients starting renal replacement therapy were included in the analysis. The data were extracted from the information system of the healthcare network. We performed conventional statistical analysis and data mining analysis using mainly Bayesian networks. The Bayesian model showed that the probability of registration on the waiting list is associated with age, cardiovascular disease, diabetes, serum albumin level, respiratory disease, physical impairment, follow-up in the department performing transplantation and past history of malignancy. These results are similar to those of the conventional statistical method. The comparison between conventional analysis and data mining analysis showed us the contribution of the data mining method for sorting variables and giving a global view of the variables' associations. Moreover, these approaches constitute an essential step toward a decisional information system for healthcare networks.

  11. Accelerated median root prior reconstruction for pinhole single-photon emission tomography (SPET)

    Energy Technology Data Exchange (ETDEWEB)

    Sohlberg, Antti [Department of Clinical Physiology and Nuclear Medicine, Kuopio University Hospital, PO Box 1777 FIN-70211, Kuopio (Finland); Ruotsalainen, Ulla [Institute of Signal Processing, DMI, Tampere University of Technology, PO Box 553 FIN-33101, Tampere (Finland); Watabe, Hiroshi [National Cardiovascular Center Research Institute, 5-7-1 Fujisihro-dai, Suita City, Osaka 565-8565 (Japan); Iida, Hidehiro [National Cardiovascular Center Research Institute, 5-7-1 Fujisihro-dai, Suita City, Osaka 565-8565 (Japan); Kuikka, Jyrki T [Department of Clinical Physiology and Nuclear Medicine, Kuopio University Hospital, PO Box 1777 FIN-70211, Kuopio (Finland)

    2003-07-07

    Pinhole collimation can be used to improve spatial resolution in SPET. However, the resolution improvement is achieved at the cost of reduced sensitivity, which leads to projection images with poor statistics. Images reconstructed from these projections using the maximum likelihood expectation maximization (ML-EM) algorithms, which have been used to reduce the artefacts generated by the filtered backprojection (FBP) based reconstruction, suffer from noise/bias trade-off: noise contaminates the images at high iteration numbers, whereas early abortion of the algorithm produces images that are excessively smooth and biased towards the initial estimate of the algorithm. To limit the noise accumulation we propose the use of the pinhole median root prior (PH-MRP) reconstruction algorithm. MRP is a Bayesian reconstruction method that has already been used in PET imaging and shown to possess good noise reduction and edge preservation properties. In this study the PH-MRP algorithm was accelerated with the ordered subsets (OS) procedure and compared to the FBP, OS-EM and conventional Bayesian reconstruction methods in terms of noise reduction, quantitative accuracy, edge preservation and visual quality. The results showed that the accelerated PH-MRP algorithm was very robust. It provided visually pleasing images with lower noise level than the FBP or OS-EM and with smaller bias and sharper edges than the conventional Bayesian methods.
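A minimal 1D sketch of a median-root-prior correction of the kind described (an assumed one-step-late form with invented parameter values, not the PH-MRP implementation): the EM-updated value is divided by a penalty that grows when the current estimate deviates from the median of its neighbourhood, so locally flat regions pass through unchanged while isolated spikes are damped.

```python
def median(xs):
    s = sorted(xs)
    return s[len(s) // 2]

def mrp_step(em_update, current, beta=0.3, half_window=1):
    """One median-root-prior correction of an EM-updated image (1D sketch):
    divide each pixel by 1 + beta * (relative deviation from local median)."""
    out = []
    for i, em in enumerate(em_update):
        lo = max(0, i - half_window)
        hi = min(len(current), i + half_window + 1)
        m = median(current[lo:hi])
        out.append(em / (1.0 + beta * (current[i] - m) / m))
    return out

flat = [10.0, 10.0, 10.0, 10.0]
print(mrp_step(flat, flat))      # a locally flat image is unchanged

spiky = [10.0, 10.0, 30.0, 10.0, 10.0]
print(mrp_step(spiky, spiky))    # the isolated spike is pulled down
```

The edge-preservation property mentioned in the abstract comes from the median: a step edge is locally monotonic, so each pixel equals (or is near) its window median and incurs little penalty.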

  12. Ancestral sequence reconstruction in primate mitochondrial DNA: compositional bias and effect on functional inference.

    Science.gov (United States)

    Krishnan, Neeraja M; Seligmann, Hervé; Stewart, Caro-Beth; De Koning, A P Jason; Pollock, David D

    2004-10-01

    Reconstruction of ancestral DNA and amino acid sequences is an important means of inferring information about past evolutionary events. Such reconstructions suggest changes in molecular function and evolutionary processes over the course of evolution and are used to infer adaptation and convergence. Maximum likelihood (ML) is generally thought to provide relatively accurate reconstructed sequences compared to parsimony, but both methods lead to the inference of multiple directional changes in nucleotide frequencies in primate mitochondrial DNA (mtDNA). To better understand this surprising result, as well as to better understand how parsimony and ML differ, we constructed a series of computationally simple "conditional pathway" methods that differed in the number of substitutions allowed per site along each branch, and we also evaluated the entire Bayesian posterior frequency distribution of reconstructed ancestral states. We analyzed primate mitochondrial cytochrome b (Cyt-b) and cytochrome oxidase subunit I (COI) genes and found that ML reconstructs ancestral frequencies that are often more different from tip sequences than are parsimony reconstructions. In contrast, frequency reconstructions based on the posterior ensemble more closely resemble extant nucleotide frequencies. Simulations indicate that these differences in ancestral sequence inference are probably due to deterministic bias caused by high uncertainty in the optimization-based ancestral reconstruction methods (parsimony, ML, Bayesian maximum a posteriori). In contrast, ancestral nucleotide frequencies based on an average of the Bayesian set of credible ancestral sequences are much less biased. The methods involving simpler conditional pathway calculations have slightly reduced likelihood values compared to full likelihood calculations, but they can provide fairly unbiased nucleotide reconstructions and may be useful in more complex phylogenetic analyses than considered here due to their speed and

  13. Accurate phylogenetic tree reconstruction from quartets: a heuristic approach.

    Science.gov (United States)

    Reaz, Rezwana; Bayzid, Md Shamsuzzoha; Rahman, M Sohel

    2014-01-01

    Supertree methods construct trees on a set of taxa (species) by combining many smaller trees on overlapping subsets of the entire set of taxa. A 'quartet' is an unrooted tree over 4 taxa; hence quartet-based supertree methods combine many 4-taxon unrooted trees into a single and coherent tree over the complete set of taxa. Quartet-based phylogeny reconstruction methods have received considerable attention in recent years. An accurate and efficient quartet-based method might be competitive with the current best phylogenetic tree reconstruction methods (such as maximum likelihood or Bayesian MCMC analyses), without being as computationally intensive. In this paper, we present a novel and highly accurate quartet-based phylogenetic tree reconstruction method. We performed an extensive experimental study to evaluate the accuracy and scalability of our approach on both simulated and biological datasets.

  14. A Fast Iterative Bayesian Inference Algorithm for Sparse Channel Estimation

    DEFF Research Database (Denmark)

    Pedersen, Niels Lovmand; Manchón, Carles Navarro; Fleury, Bernard Henri

    2013-01-01

    representation of the Bessel K probability density function; a highly efficient, fast iterative Bayesian inference method is then applied to the proposed model. The resulting estimator outperforms other state-of-the-art Bayesian and non-Bayesian estimators, either by yielding lower mean squared estimation error...

  15. A Gentle Introduction to Bayesian Analysis : Applications to Developmental Research

    NARCIS (Netherlands)

    Van de Schoot, Rens|info:eu-repo/dai/nl/304833207; Kaplan, David; Denissen, Jaap; Asendorpf, Jens B.; Neyer, Franz J.; van Aken, Marcel A G|info:eu-repo/dai/nl/081831218

    2014-01-01

    Bayesian statistical methods are becoming ever more popular in applied and fundamental research. In this study a gentle introduction to Bayesian analysis is provided. It is shown under what circumstances it is attractive to use Bayesian estimation, and how to interpret properly the results. First,

  16. A gentle introduction to Bayesian analysis : Applications to developmental research

    NARCIS (Netherlands)

    van de Schoot, R.; Kaplan, D.; Denissen, J.J.A.; Asendorpf, J.B.; Neyer, F.J.; van Aken, M.A.G.

    2014-01-01

    Bayesian statistical methods are becoming ever more popular in applied and fundamental research. In this study a gentle introduction to Bayesian analysis is provided. It is shown under what circumstances it is attractive to use Bayesian estimation, and how to interpret properly the results. First,

  17. Histories electromagnetism

    International Nuclear Information System (INIS)

    Burch, Aidan

    2004-01-01

    Working within the HPO (History Projection Operator) Consistent Histories formalism, we follow the work of Savvidou on (scalar) field theory [J. Math. Phys. 43, 3053 (2002)] and that of Savvidou and Anastopoulos on (first-class) constrained systems [Class. Quantum Grav. 17, 2463 (2000)] to write a histories theory (both classical and quantum) of electromagnetism. We focus particularly on the foliation-dependence of the histories phase space/Hilbert space and the action thereon of the two Poincaré groups that arise in histories field theory. We quantize in the spirit of the Dirac scheme for constrained systems.

  18. Further case histories

    CSIR Research Space (South Africa)

    Van Schoor, Abraham M

    2015-01-01

    Full Text Available ... derived from the raw data. An obvious disadvantage of these technologies is that boreholes are required. Furthermore, to reconstruct an accurate representation of the bords and pillars in an area, one needs several strategically placed...

  19. Morphological homoplasy, life history evolution, and historical biogeography of plethodontid salamanders inferred from complete mitochondrial genomes

    Energy Technology Data Exchange (ETDEWEB)

    Mueller, Rachel Lockridge; Macey, J. Robert; Jaekel, Martin; Wake, David B.; Boore, Jeffrey L.

    2004-08-01

    The evolutionary history of the largest salamander family (Plethodontidae) is characterized by extreme morphological homoplasy. Analysis of the mechanisms generating such homoplasy requires an independent, molecular phylogeny. To this end, we sequenced 24 complete mitochondrial genomes (22 plethodontids and two outgroup taxa), added data for three species from GenBank, and performed partitioned and unpartitioned Bayesian, ML, and MP phylogenetic analyses. We explored four dataset partitioning strategies to account for evolutionary process heterogeneity among genes and codon positions, all of which yielded increased model likelihoods and decreased numbers of supported nodes in the topologies (PP > 0.95) relative to the unpartitioned analysis. Our phylogenetic analyses yielded congruent trees that contrast with the traditional morphology-based taxonomy; the monophyly of three out of four major groups is rejected. Reanalysis of current hypotheses in light of these new evolutionary relationships suggests that (1) a larval life history stage re-evolved from a direct-developing ancestor multiple times, (2) there is no phylogenetic support for the "Out of Appalachia" hypothesis of plethodontid origins, and (3) novel scenarios must be reconstructed for the convergent evolution of projectile tongues, reduction in toe number, and specialization for defensive tail loss. Some of these novel scenarios imply morphological transformation series that proceed in the opposite direction than was previously thought. In addition, they suggest surprising evolutionary lability in traits previously interpreted to be conservative.

  20. High migration rates shape the postglacial history of amphi-Atlantic bryophytes.

    Science.gov (United States)

    Désamoré, Aurélie; Patiño, Jairo; Mardulyn, Patrick; Mcdaniel, Stuart F; Zanatta, Florian; Laenen, Benjamin; Vanderpoorten, Alain

    2016-11-01

    Paleontological evidence and current patterns of angiosperm species richness suggest that European biota experienced more severe bottlenecks than North American ones during the last glacial maximum. How well this pattern fits other plant species is less clear. Bryophytes offer a unique opportunity to contrast the impact of the last glacial maximum in North America and Europe because about 60% of the European bryoflora is shared with North America. Here, we use population genetic analyses based on approximate Bayesian computation on eight amphi-Atlantic species to test the hypothesis that North American populations were less impacted by the last glacial maximum, exhibiting higher levels of genetic diversity than European ones and ultimately serving as a refugium for the postglacial recolonization of Europe. In contrast with this hypothesis, the best-fit demographic model involved similar patterns of population size contractions, comparable levels of genetic diversity, and balanced migration rates between European and North American populations. Our results thus suggest that bryophytes have experienced comparable demographic glacial histories on both sides of the Atlantic. Although a weak but significant genetic structure was systematically recovered between European and North American populations, evidence for migration from and towards both continents suggests that amphi-Atlantic bryophyte populations may function as a metapopulation network. Reconstructing the biogeographic history of either North American or European bryophyte populations therefore requires a large, trans-Atlantic geographic framework. © 2016 John Wiley & Sons Ltd.

  1. The genetic diversity and evolutionary history of hepatitis C virus in Vietnam.

    Science.gov (United States)

    Li, Chunhua; Yuan, Manqiong; Lu, Ling; Lu, Teng; Xia, Wenjie; Pham, Van H; Vo, An X D; Nguyen, Mindie H; Abe, Kenji

    2014-11-01

    Vietnam has a unique history of association with foreign countries, which may have resulted in multiple introductions of alien HCV strains that mixed with indigenous ones. In this study, we characterized HCV sequences in the Core-E1 and NS5B regions from 236 Vietnamese individuals. We identified multiple HCV lineages; 6a, 6e, 6h, 6k, 6l, 6o, 6p, and two novel variants may represent the indigenous strains; 1a was probably introduced from the US; 1b and 2a possibly originated in East Asia; while 2i, 2j, and 2m were likely brought by French explorers. We inferred the evolutionary history for four major subtypes: 1a, 1b, 6a, and 6e. The obtained Bayesian Skyline Plots (BSPs) consistently showed rapid HCV population growth beginning between 1955 and 1963 and lasting until 1984 or later, corresponding to the era of the Vietnam War. We also estimated HCV growth rates and reconstructed phylogeographic trees for comparing subtypes 1a, 1b, and HCV-2. Copyright © 2014 Elsevier Inc. All rights reserved.

  2. A Bayesian perspective on some replacement strategies

    International Nuclear Information System (INIS)

    Mazzuchi, Thomas A.; Soyer, Refik

    1996-01-01

    In this paper we present a Bayesian decision theoretic approach for determining optimal replacement strategies. This approach enables us to formally incorporate, express, and update our uncertainty when determining optimal replacement strategies. We develop relevant expressions for both the block replacement protocol with minimal repair and the age replacement protocol and illustrate the use of our approach with real data
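A minimal sketch of the idea behind such a decision-theoretic treatment (not the paper's actual formulation): under an age-replacement protocol, a unit is replaced preventively at age t (cost C_P) or on failure (cost C_F > C_P), and parameter uncertainty is expressed by averaging the long-run cost rate over posterior draws of the lifetime parameters. The Weibull model, the posterior draws, and all cost values below are illustrative assumptions.

```python
import math

# Hedged sketch of Bayesian age replacement.  For a Weibull lifetime with
# reliability R(u) = exp(-(u/eta)**beta), the long-run expected cost per
# unit time of replacing at age t is
#   c(t) = (C_P * R(t) + C_F * (1 - R(t))) / E[min(T, t)],
# and uncertainty about (beta, eta) is handled by averaging c(t) over
# posterior draws.  All numbers are illustrative, not fitted to data.

C_P, C_F = 1.0, 5.0
posterior_draws = [(2.0, 100.0), (2.2, 95.0), (1.8, 110.0)]  # (beta, eta)

def reliability(u, beta, eta):
    return math.exp(-((u / eta) ** beta))

def expected_cycle_length(t, beta, eta, steps=200):
    # E[min(T, t)] = integral of R(u) du over [0, t], trapezoidal rule.
    h = t / steps
    total = 0.5 * (reliability(0.0, beta, eta) + reliability(t, beta, eta))
    for i in range(1, steps):
        total += reliability(i * h, beta, eta)
    return h * total

def posterior_cost_rate(t):
    # Average the cost rate over the posterior draws.
    rates = []
    for beta, eta in posterior_draws:
        r = reliability(t, beta, eta)
        rates.append((C_P * r + C_F * (1 - r)) / expected_cycle_length(t, beta, eta))
    return sum(rates) / len(rates)

# Grid search for the replacement age minimizing posterior-expected cost.
best_t = min(range(10, 200, 5), key=posterior_cost_rate)
```

Because the cost rate is averaged over the posterior rather than evaluated at a point estimate, the chosen age reflects parameter uncertainty as well as the cost trade-off.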

  3. Posterior Predictive Model Checking in Bayesian Networks

    Science.gov (United States)

    Crawford, Aaron

    2014-01-01

    This simulation study compared the utility of various discrepancy measures within a posterior predictive model checking (PPMC) framework for detecting different types of data-model misfit in multidimensional Bayesian network (BN) models. The investigated conditions were motivated by an applied research program utilizing an operational complex…

  4. Sequential Bayesian technique: An alternative approach for ...

    Indian Academy of Sciences (India)

    This paper proposes a sequential Bayesian approach, similar to a Kalman filter, for estimating the reliability growth or decay of software. The main advantage of the proposed method is that it shows the variation of the parameter over time as new failure data become available. The usefulness of the method is demonstrated with ...
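The flavor of such a Kalman-like sequential update can be sketched generically (this is not the paper's model, and the process/observation variances below are illustrative assumptions): a latent reliability parameter drifts as a random walk, and each new noisy observation updates the posterior mean and variance in closed form.

```python
# Hedged sketch of a scalar Kalman-style sequential Bayesian update.
# The latent parameter theta_t drifts as a random walk with process
# variance q; each observation y_t (variance r) shifts the posterior
# mean m toward it by the Kalman gain k.

def sequential_update(observations, m0=0.0, v0=1.0, q=0.01, r=0.25):
    """Return the (posterior mean, posterior variance) after each observation."""
    m, v = m0, v0
    trajectory = []
    for y in observations:
        v_pred = v + q               # predict: the parameter may have drifted
        k = v_pred / (v_pred + r)    # Kalman gain
        m = m + k * (y - m)          # correct toward the new observation
        v = (1 - k) * v_pred
        trajectory.append((m, v))
    return trajectory

# The posterior mean tracks a shift in the underlying parameter over time,
# which is the "variation of the parameter" the abstract refers to.
traj = sequential_update([1.0] * 10 + [2.0] * 10)
```

The nonzero process variance q is what keeps the filter responsive: with q = 0 the gain would shrink toward zero and the estimate would stop adapting to new failure data.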

  5. Sequential Bayesian technique: An alternative approach for ...

    Indian Academy of Sciences (India)

    MS received 8 October 2007; revised 15 July 2008. Abstract. This paper proposes a sequential Bayesian approach, similar to a Kalman filter, for estimating the reliability growth or decay of software. The main advantage of the proposed method is that it shows the variation of the parameter over time as new failure data become ...

  6. Theory change and Bayesian statistical inference

    NARCIS (Netherlands)

    Romeijn, Jan-Willem

    2005-01-01

    This paper addresses the problem that Bayesian statistical inference cannot accommodate theory change, and proposes a framework for dealing with such changes. It first presents a scheme for generating predictions from observations by means of hypotheses. An example shows how the hypotheses represent

  7. Bayesian mixture models for partially verified data

    DEFF Research Database (Denmark)

    Kostoulas, Polychronis; Browne, William J.; Nielsen, Søren Saxmose

    2013-01-01

    Bayesian mixture models can be used to discriminate between the distributions of continuous test responses for different infection stages. These models are particularly useful in case of chronic infections with a long latent period, like Mycobacterium avium subsp. paratuberculosis (MAP) infection...

  8. Non-Linear Approximation of Bayesian Update

    KAUST Repository

    Litvinenko, Alexander

    2016-06-23

    We develop a non-linear approximation of the expensive Bayesian update formula. This non-linear approximation is applied directly to the polynomial chaos coefficients. In this way, we avoid Monte Carlo sampling and sampling error. We show that the well-known Kalman update formula is a particular case of this update.
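The special case the abstract alludes to can be verified in a few lines (a hedged illustration, not the paper's polynomial-chaos construction): for a Gaussian prior and a Gaussian likelihood with a linear observation, the exact conjugate Bayesian posterior coincides with the Kalman update. All numbers are illustrative.

```python
# Hedged sketch: for prior theta ~ N(m, v) and observation y ~ N(theta, r),
# the exact Bayesian posterior (precision-weighted combination) and the
# Kalman update give the same answer -- the linear-Gaussian special case.

def conjugate_posterior(m, v, y, r):
    """Exact Bayes: precisions add, means combine precision-weighted."""
    post_v = 1.0 / (1.0 / v + 1.0 / r)
    post_m = post_v * (m / v + y / r)
    return post_m, post_v

def kalman_update(m, v, y, r):
    """Kalman form: shift the prior mean by the gain times the innovation."""
    k = v / (v + r)                  # Kalman gain
    return m + k * (y - m), (1 - k) * v

# Illustrative values: both routes yield the same posterior.
m, v, y, r = 0.5, 2.0, 3.0, 1.0
exact = conjugate_posterior(m, v, y, r)
kalman = kalman_update(m, v, y, r)
```

The paper's contribution, as described, is to go beyond this linear case; the identity above is only the baseline it generalizes.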

  9. Bayesian approach and application to operation safety

    International Nuclear Information System (INIS)

    Procaccia, H.; Suhner, M.Ch.

    2003-01-01

    The management of industrial risks requires the development of statistical and probabilistic analyses which use all the conveniently available information in order to compensate for insufficient experience feedback, in a domain where accidents and incidents remain too scarce for a classical statistical frequency analysis. The Bayesian decision approach is well adapted to this problem because it integrates both expertise and experience feedback. The domain of knowledge is widened, forecasting studies become possible, and decisions and remedial actions are strengthened thanks to risk-cost-benefit optimization analyses. This book presents the bases of the Bayesian approach and its concrete applications in various industrial domains. After a mathematical presentation of the industrial operation safety concepts and of the Bayesian approach principles, the book treats some of the problems that can be solved with this approach: software reliability, controls linked with equipment warranty, dynamic updating of databases, expertise modeling and weighting, and Bayesian optimization in the domains of maintenance, quality control, tests, and design of new equipment. A synthesis of the mathematical formulae used in this approach is given in conclusion. (J.S.)

  10. Comparison between Fisherian and Bayesian approach to ...

    African Journals Online (AJOL)

    ... of its simplicity and optimality properties, is normally used for two-group cases. However, the Bayesian approach is found to be better than Fisher's approach because of its low misclassification error rate. Keywords: variance-covariance matrices, centroids, prior probability, Mahalanobis distance, probability of misclassification ...

  11. Neural network classification - A Bayesian interpretation

    Science.gov (United States)

    Wan, Eric A.

    1990-01-01

    The relationship between minimizing a mean squared error and finding the optimal Bayesian classifier is reviewed. This provides a theoretical interpretation for the process by which neural networks are used in classification. A number of confidence measures are proposed to evaluate the performance of the neural network classifier within a statistical framework.
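The core relationship the abstract reviews can be demonstrated without any network at all (a hedged toy illustration with made-up probabilities): the predictor that minimizes mean squared error against 0/1 class labels is the conditional class probability P(class = 1 | x), i.e., the Bayesian posterior that a trained network approximates.

```python
import random

# Hedged toy illustration: for each discrete input x, the MSE-minimizing
# output is the sample mean of the 0/1 labels, which converges to the
# posterior P(class = 1 | x).  TRUE_POSTERIOR values are assumptions.

random.seed(0)
TRUE_POSTERIOR = {0: 0.2, 1: 0.8}   # assumed P(class = 1 | x)

# Simulate labeled data from the assumed posterior.
data = [(x, 1 if random.random() < TRUE_POSTERIOR[x] else 0)
        for _ in range(20000) for x in (0, 1)]

def mse_optimal(x):
    # The constant minimizing sum((y - c)**2) over a set of labels y is
    # their mean -- here, the empirical class-1 frequency given x.
    labels = [y for xi, y in data if xi == x]
    return sum(labels) / len(labels)
```

A network trained with squared error on one-hot targets is doing the same thing with a flexible function of x, which is why its outputs can be read (approximately) as posterior class probabilities.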

  12. Bayesian Estimation of Item Response Curves.

    Science.gov (United States)

    Tsutakawa, Robert K.; Lin, Hsin Ying

    1986-01-01

    Item response curves for a set of binary responses are studied from a Bayesian viewpoint of estimating the item parameters. For the two-parameter logistic model with normally distributed ability, restricted bivariate beta priors are used to illustrate the computation of the posterior mode via the EM algorithm. (Author/LMO)
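The item response curve at the heart of the model is compact enough to state directly (the Bayesian machinery of the paper, restricted beta priors and EM posterior-mode estimation, is not reproduced here):

```python
import math

# The two-parameter logistic (2PL) item response function: the probability
# of a correct response given ability theta, item discrimination a, and
# item difficulty b.  At theta == b the probability is exactly 0.5, and a
# controls how steeply the curve rises around that point.

def irt_2pl(theta, a, b):
    """P(correct | theta) = 1 / (1 + exp(-a * (theta - b)))."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))
```

For example, irt_2pl(0.0, 1.5, 0.0) returns 0.5, and increasing theta past b pushes the probability toward 1.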

  13. Speech Segmentation Using Bayesian Autoregressive Changepoint Detector

    Directory of Open Access Journals (Sweden)

    P. Sovka

    1998-12-01

    Full Text Available This paper is devoted to the study of the Bayesian autoregressive changepoint detector (BCD) and its use for speech segmentation. Results of applying the detector to autoregressive signals as well as to real speech are given. Basic properties of the BCD are described and discussed. A novel two-step algorithm, consisting of cepstral analysis and BCD, is suggested for automatic speech segmentation.
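A drastically simplified cousin of such a detector can convey the Bayesian changepoint idea (this is a mean-change model with known noise variance, not the autoregressive BCD of the paper; the prior variance tau2 and the test signal are assumptions): score each candidate split by the marginal likelihood of the two segments, with the segment mean integrated out.

```python
import math

# Hedged sketch of Bayesian changepoint detection by marginal likelihood.
# Each segment is modeled as y_i ~ N(mu, 1) with prior mu ~ N(0, tau2);
# integrating out mu gives a closed-form segment evidence.

def log_marginal(ys, tau2=10.0):
    n = len(ys)
    s = sum(ys)
    ss = sum(y * y for y in ys)
    # Marginal: y ~ N(0, I + tau2 * 11'), so det = 1 + n*tau2 and the
    # quadratic form simplifies via the matrix inversion lemma.
    return (-0.5 * n * math.log(2 * math.pi)
            - 0.5 * math.log(1 + n * tau2)
            - 0.5 * (ss - tau2 * s * s / (1 + n * tau2)))

def best_changepoint(ys):
    # Evidence for a changepoint at k: two independent segments split at k.
    scores = {k: log_marginal(ys[:k]) + log_marginal(ys[k:])
              for k in range(2, len(ys) - 1)}
    return max(scores, key=scores.get)

# Toy signal: mean jumps from ~0 to ~2 at index 5.
signal = [0.1, -0.2, 0.0, 0.2, -0.1, 2.1, 1.9, 2.2, 1.8, 2.0]
```

The paper's detector plays the same game with autoregressive segment models (and the two-step algorithm applies it to cepstral features), but the evidence-comparison structure is the common core.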

  14. Bayesian networks: a combined tuning heuristic

    NARCIS (Netherlands)

    Bolt, J.H.

    2016-01-01

    One of the issues in tuning an output probability of a Bayesian network by changing multiple parameters is the relative amount of the individual parameter changes. In an existing heuristic parameters are tied such that their changes induce locally a maximal change of the tuned probability. This

  15. Exploiting structure in cooperative Bayesian games

    NARCIS (Netherlands)

    Oliehoek, F.A.; Whiteson, S.; Spaan, M.T.J.; de Freitas, N.; Murphy, K.

    2012-01-01

    Cooperative Bayesian games (BGs) can model decision-making problems for teams of agents under imperfect information, but require space and computation time that is exponential in the number of agents. While agent independence has been used to mitigate these problems in perfect information settings,

  16. BAYESIAN ESTIMATION OF THERMONUCLEAR REACTION RATES

    Energy Technology Data Exchange (ETDEWEB)

    Iliadis, C.; Anderson, K. S. [Department of Physics and Astronomy, University of North Carolina at Chapel Hill, Chapel Hill, NC 27599-3255 (United States); Coc, A. [Centre de Sciences Nucléaires et de Sciences de la Matière (CSNSM), CNRS/IN2P3, Univ. Paris-Sud, Université Paris–Saclay, Bâtiment 104, F-91405 Orsay Campus (France); Timmes, F. X.; Starrfield, S., E-mail: iliadis@unc.edu [School of Earth and Space Exploration, Arizona State University, Tempe, AZ 85287-1504 (United States)

    2016-11-01

    The problem of estimating non-resonant astrophysical S-factors and thermonuclear reaction rates, based on measured nuclear cross sections, is of major interest for nuclear energy generation, neutrino physics, and element synthesis. Many different methods have been applied to this problem in the past, almost all of them based on traditional statistics. Bayesian methods, on the other hand, are now in widespread use in the physical sciences. In astronomy, for example, Bayesian statistics is applied to the observation of extrasolar planets, gravitational waves, and Type Ia supernovae. However, nuclear physics, in particular, has been slow to adopt Bayesian methods. We present astrophysical S-factors and reaction rates based on Bayesian statistics. We develop a framework that incorporates robust parameter estimation, systematic effects, and non-Gaussian uncertainties in a consistent manner. The method is applied to the reactions d(p,γ)³He, ³He(³He,2p)⁴He, and ³He(α,γ)⁷Be, important for deuterium burning, solar neutrinos, and Big Bang nucleosynthesis.

  17. An Approximate Bayesian Fundamental Frequency Estimator

    DEFF Research Database (Denmark)

    Nielsen, Jesper Kjær; Christensen, Mads Græsbøll; Jensen, Søren Holdt

    2012-01-01

    Joint fundamental frequency and model order estimation is an important problem in several applications such as speech and music processing. In this paper, we develop an approximate estimation algorithm of these quantities using Bayesian inference. The inference about the fundamental frequency...

  18. Erratum Bayesian and Dempster–Shafer fusion

    Indian Academy of Sciences (India)

    (1) The paper “Bayesian and Dempster–Shafer fusion” contains a mistake in Appendix A, although this has not affected anything in the body of the paper. On page 172, the authors state correctly that the matrix F is, in general, not square, but then in (A.22) they take its determinant. This confusion resulted because the ...

  19. On local optima in learning bayesian networks

    DEFF Research Database (Denmark)

    Dalgaard, Jens; Kocka, Tomas; Pena, Jose

    2003-01-01

    This paper proposes and evaluates the k-greedy equivalence search algorithm (KES) for learning Bayesian networks (BNs) from complete data. The main characteristic of KES is that it allows a trade-off between greediness and randomness, thus exploring different good local optima. When greediness...

  20. Comprehension and computation in Bayesian problem solving

    Directory of Open Access Journals (Sweden)

    Eric D. Johnson

    2015-07-01

    Full Text Available Humans have long been characterized as poor probabilistic reasoners when presented with explicit numerical information. Bayesian word problems provide a well-known example of this, where even highly educated and cognitively skilled individuals fail to adhere to mathematical norms. It is widely agreed that natural frequencies can facilitate Bayesian reasoning relative to normalized formats (e.g., probabilities, percentages), both by clarifying logical set-subset relations and by simplifying numerical calculations. Nevertheless, between-study performance on transparent Bayesian problems varies widely, and generally remains rather unimpressive. We suggest there has been an over-focus on this representational facilitator (i.e., transparent problem structures) at the expense of the specific logical and numerical processing requirements, and the corresponding individual abilities and skills necessary for producing Bayesian-like output from specific verbal and numerical input. We further suggest that understanding this task-individual pairing could benefit from considerations from the literature on mathematical cognition, which emphasizes text comprehension and problem solving, along with contributions of online executive working memory, metacognitive regulation, and relevant stored knowledge and skills. We conclude by offering avenues for future research aimed at identifying the stages in problem solving at which correct versus incorrect reasoners depart, and how individual differences might influence this time point.
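The format contrast the abstract discusses can be made concrete with the classic (hypothetical) medical-screening numbers often used in this literature: a 1% base rate, 80% sensitivity, and a 9.6% false-positive rate. Both formats compute the same posterior; the natural-frequency version reduces Bayes' rule to counting.

```python
# Probability format: apply Bayes' rule to normalized probabilities.
p_disease = 0.01
p_pos_given_disease = 0.8
p_pos_given_healthy = 0.096
p_pos = (p_disease * p_pos_given_disease
         + (1 - p_disease) * p_pos_given_healthy)
posterior = p_disease * p_pos_given_disease / p_pos

# Natural-frequency format: the same computation as counts out of 1000
# people -- the set-subset representation that facilitates reasoning.
sick = 10                  # 1% of 1000
sick_and_pos = 8           # 80% of the 10 sick people test positive
healthy_and_pos = 95       # ~9.6% of the 990 healthy people (rounded)
posterior_freq = sick_and_pos / (sick_and_pos + healthy_and_pos)
```

Both routes give a posterior of roughly 8%, far below the intuitive answers most participants produce; the frequency version differs only by rounding counts to whole people.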