Bayesian History Reconstruction of Complex Human Gene Clusters on a Phylogeny
Vinař, Tomáš; Song, Giltae; Siepel, Adam
2009-01-01
Clusters of genes that have evolved by repeated segmental duplication present difficult challenges throughout genomic analysis, from sequence assembly to functional analysis. Improved understanding of these clusters is of utmost importance, since they have been shown to be the source of evolutionary innovation, and have been linked to multiple diseases, including HIV and a variety of cancers. Previously, Zhang et al. (2008) developed an algorithm for reconstructing parsimonious evolutionary histories of such gene clusters, using only human genomic sequence data. In this paper, we propose a probabilistic model for the evolution of gene clusters on a phylogeny, and an MCMC algorithm for reconstruction of duplication histories from genomic sequences in multiple species. Several projects are underway to obtain high quality BAC-based assemblies of duplicated clusters in multiple species, and we anticipate that our method will be useful in analyzing these valuable new data sets.
Bayesian tomographic reconstruction of microsystems
Salem, Sofia Fekih; Vabre, Alexandre; Mohammad-Djafari, Ali
2007-11-01
X-ray transmission microtomography plays an increasingly important role in the study and understanding of microsystems. Within this framework, an experimental setup for high-resolution X-ray microtomography was developed at CEA-List to quantify the physical parameters related to fluid flow in microsystems. Several difficulties arise from the nature of the experimental data collected on this setup: increased measurement errors due to various physical phenomena occurring during image formation (diffusion, beam hardening), and specificities of the setup (limited angle, partial view of the object, weak contrast). To reconstruct the object we must solve an inverse problem, which is known to be ill-posed and therefore needs to be regularized by introducing prior information. The main prior information we account for is that the object is composed of a finite, known number of different materials distributed in compact regions. This a priori information is introduced via a Gauss-Markov field for the contrast distributions, with a hidden Potts-Markov field for the material classes, in the Bayesian estimation framework. The computations are done using an appropriate Markov chain Monte Carlo (MCMC) technique. In this paper, we first present the basic steps of the proposed algorithms. We then focus on one of the main steps in any iterative reconstruction method: the computation of the forward and adjoint operators (projection and backprojection). A fast implementation of these two operators is crucial for real applications of the method. We give some details on the fast computation of these steps and show some preliminary simulation results.
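The matched forward/adjoint operator pair mentioned above can be illustrated with a toy sketch. This is not the CEA-List implementation; `project` is a stand-in "two-view" line-sum operator, and the key property any projection/backprojection pair used in iterative reconstruction must satisfy is adjointness, <P x, y> = <x, Pᵀ y>, checked below.

```python
import numpy as np

n = 8  # toy image is n x n

def project(img):
    # two toy "views": sums along columns and sums along rows
    return np.concatenate([img.sum(axis=0), img.sum(axis=1)])

def backproject(sino):
    # adjoint of project: spread each view value back along its line
    v_cols, v_rows = sino[:n], sino[n:]
    return np.tile(v_cols, (n, 1)) + np.tile(v_rows, (n, 1)).T

rng = np.random.default_rng(0)
x = rng.random((n, n))
y = rng.random(2 * n)
lhs = float(project(x) @ y)              # <P x, y>
rhs = float((x * backproject(y)).sum())  # <x, P^T y>
```

If the backprojector is not the exact adjoint of the projector, gradient-based and MCMC reconstruction schemes silently optimize the wrong objective, which is why this check is worth automating.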
Bayesian image reconstruction: Application to emission tomography
Nunez, J.; Llacer, J.
1989-02-01
In this paper we propose a maximum a posteriori (MAP) method of image reconstruction in the Bayesian framework for the Poisson noise case. We use entropy to define the prior probability and the likelihood to define the conditional probability. The method uses sharpness parameters which can be theoretically computed or adjusted, allowing us to obtain MAP reconstructions without the problem of the "grey" reconstructions associated with pre-Bayesian reconstructions. We have developed several ways to solve the reconstruction problem and propose a new iterative algorithm which is stable, maintains positivity, and converges to feasible images faster than the maximum likelihood estimate method. We have successfully applied the new method to emission tomography, both with simulated and real data.
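The maximum-likelihood baseline that this paper's MAP algorithm is compared against is the classic multiplicative MLEM update. A minimal sketch, with a hypothetical random system matrix standing in for a real tomographic projector:

```python
import numpy as np

# Minimal MLEM sketch for Poisson emission data: y ~ Poisson(A x).
# A is a toy system matrix, not a real tomographic geometry.
rng = np.random.default_rng(0)
A = rng.random((30, 10))          # 30 detector bins, 10 image pixels
x_true = rng.random(10) + 0.5
y = rng.poisson(A @ x_true)       # simulated Poisson counts

x = np.ones(10)                   # positive initial image
sens = A.sum(axis=0)              # sensitivity image (column sums)
for _ in range(50):
    ratio = y / np.maximum(A @ x, 1e-12)
    x = x / sens * (A.T @ ratio)  # multiplicative update keeps x >= 0
```

The update is multiplicative, so positivity is maintained automatically; the likelihood is guaranteed non-decreasing at each step, which is the EM monotonicity property the paper's faster MAP iteration improves upon.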
Bayesian Image Reconstruction Based on Voronoi Diagrams
Cabrera, G F; Hitschfeld, N
2007-01-01
We present a Bayesian Voronoi image reconstruction technique (VIR) for interferometric data. Bayesian analysis applied to the inverse problem allows us to derive the a-posteriori probability of a novel parameterization of interferometric images. We use a variable Voronoi diagram as our model in place of the usual fixed pixel grid. A quantization of the intensity field allows us to calculate the likelihood function and a-priori probabilities. The Voronoi image is optimized including the number of polygons as free parameters. We apply our algorithm to deconvolve simulated interferometric data. Residuals, restored images and chi^2 values are used to compare our reconstructions with fixed grid models. VIR has the advantage of modeling the image with few parameters, obtaining a better image from a Bayesian point of view.
Structure-based bayesian sparse reconstruction
Quadeer, Ahmed Abdul
2012-12-01
Sparse signal reconstruction algorithms have attracted research attention due to their wide applications in various fields. In this paper, we present a simple Bayesian approach that utilizes the sparsity constraint and a priori statistical information (Gaussian or otherwise) to obtain near optimal estimates. In addition, we make use of the rich structure of the sensing matrix encountered in many signal processing applications to develop a fast sparse recovery algorithm. The computational complexity of the proposed algorithm is very low compared with the widely used convex relaxation methods as well as greedy matching pursuit techniques, especially at high sparsity.
Food Reconstruction Using Isotopic Transferred Signals (FRUITS): A Bayesian Model for Diet Reconstruction
Fernandes, Ricardo; Millard, Andrew R.; Brabec, Marek; Nadeau, Marie-Josée; Grootes, Pieter
2014-01-01
Human and animal diet reconstruction studies that rely on tissue chemical signatures aim to provide estimates of the relative intake of potential food groups. However, several sources of uncertainty need to be considered when handling data. Bayesian mixing models provide a natural platform to handle diverse sources of uncertainty while allowing the user to contribute prior expert information. The Bayesian mixing model FRUITS (Food Reconstruction Using Isotopic Transferred Signals) was ...
Hierarchical Bayesian sparse image reconstruction with application to MRFM
Dobigeon, Nicolas; Tourneret, Jean-Yves
2008-01-01
This paper presents a hierarchical Bayesian model to reconstruct sparse images when the observations are obtained from linear transformations and corrupted by an additive white Gaussian noise. Our hierarchical Bayes model is well suited to such naturally sparse image applications as it seamlessly accounts for properties such as sparsity and positivity of the image via appropriate Bayes priors. We propose a prior that is based on a weighted mixture of a positive exponential distribution and a mass at zero. The prior has hyperparameters that are tuned automatically by marginalization over the hierarchical Bayesian model. To overcome the complexity of the posterior distribution, a Gibbs sampling strategy is proposed. The Gibbs samples can be used to estimate the image to be recovered, e.g. by maximizing the estimated posterior distribution. In our fully Bayesian approach the posteriors of all the parameters are available. Thus our algorithm provides more information than other previously proposed sparse reconstr...
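The sparsity prior described above, a weighted mixture of a mass at zero and a positive exponential distribution, can be sampled directly. The weights and rate below are illustrative values, not the paper's tuned hyperparameters:

```python
import numpy as np

# Draw pixels from a spike-and-slab prior: exactly zero with
# probability w (the "mass at zero"), otherwise a positive
# exponential "slab". Values of w and lam are illustrative.
rng = np.random.default_rng(1)
w, lam, n = 0.8, 2.0, 100_000     # P(pixel = 0), exponential rate, draws

zero = rng.random(n) < w          # spike indicator
x = np.where(zero, 0.0, rng.exponential(1.0 / lam, size=n))

sparsity = float(np.mean(x == 0.0))   # empirical mass at zero, approx w
mean_nonzero = float(x[x > 0].mean()) # approx 1 / lam
```

Because the prior puts exact probability mass at zero, posterior samples under it are genuinely sparse (not merely shrunk toward zero), which is what makes the Gibbs-sampled estimates in the paper naturally sparse and positive.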
Chakraborty, Shubhankar; Roy Chaudhuri, Partha; Das, Prasanta Kr.
2016-07-01
In this communication, a novel optical technique has been proposed for the reconstruction of the shape of a Taylor bubble using measurements from multiple arrays of optical sensors. The deviation of an optical beam passing through the bubble depends on the contour of the bubble surface. A theoretical model has been developed for the deviation of a beam as a Taylor bubble traverses it. Using this model and the time history of the deviation captured by the sensor array, the bubble shape has been reconstructed. The reconstruction has been performed using an inverse algorithm based on a Bayesian inference technique and a Markov chain Monte Carlo sampling algorithm. The reconstructed nose shape has been compared with the true shape, extracted through image processing of high-speed images. Finally, an error analysis has been performed to pinpoint the sources of the errors.
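The Bayesian-inversion step described above, inferring shape parameters from deviation measurements via a forward model and MCMC, can be sketched generically with a Metropolis-Hastings sampler. The forward model below is a made-up stand-in, not the paper's optical beam-deviation model:

```python
import numpy as np

rng = np.random.default_rng(2)

def forward(theta, t):
    # hypothetical forward model: "deviation" decaying over time
    return theta * np.exp(-t)

t = np.linspace(0, 3, 40)
theta_true, sigma = 1.5, 0.05
data = forward(theta_true, t) + rng.normal(0, sigma, t.size)

def log_post(theta):
    # Gaussian likelihood, flat prior on theta > 0
    if theta <= 0:
        return -np.inf
    r = data - forward(theta, t)
    return -0.5 * np.sum(r**2) / sigma**2

theta, samples = 1.0, []
lp = log_post(theta)
for _ in range(5000):                    # Metropolis-Hastings loop
    prop = theta + rng.normal(0, 0.1)    # random-walk proposal
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    samples.append(theta)

est = float(np.mean(samples[1000:]))     # posterior mean after burn-in
```

The same accept/reject skeleton applies when `theta` is a vector of shape parameters; only the forward model and proposal change.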
Fernandes, R.; Millard, A.R.; Brabec, Marek; Nadeau, M.J.; Grootes, P.
2014-01-01
Vol. 9, No. 2 (2014), Art. no. e87436. E-ISSN 1932-6203. Keywords: ancient diet reconstruction, stable isotope measurements, mixture model, Bayesian estimation, Dirichlet prior. Impact factor: 3.234 (2014).
A Nonparametric Bayesian Approach For Emission Tomography Reconstruction
Barat, Éric; Dautremer, Thomas
2007-11-01
We introduce a PET reconstruction algorithm following a nonparametric Bayesian (NPB) approach. In contrast with expectation maximization (EM), the proposed technique does not rely on any space discretization. Namely, the activity distribution (the normalized emission intensity of the spatial Poisson process) is considered as a spatial probability density, and observations are the projections of random emissions whose distribution has to be estimated. This approach is nonparametric in the sense that the quantity of interest belongs to the set of probability measures on R^k (for reconstruction in k dimensions), and it is Bayesian in the sense that we define a prior directly on this spatial measure. In this context, we propose to model the nonparametric probability density as an infinite mixture of multivariate normal distributions. As a prior for this mixture we consider a Dirichlet process mixture (DPM) with a Normal-Inverse Wishart (NIW) model as base distribution of the Dirichlet process. As in EM-family reconstruction, we use a data augmentation scheme where the set of hidden variables is the emission locations for each observed line of response in the continuous object space. Thanks to the data augmentation, we propose a Markov chain Monte Carlo (MCMC) algorithm (Gibbs sampler) which is able to generate draws from the posterior distribution of the spatial intensity. A difference with EM is that one step of the Gibbs sampler corresponds to the generation of emission locations, while only the expected number of emissions per pixel/voxel is used in EM. Another key difference is that the estimated spatial intensity is a continuous function, such that there is no need to compute a projection matrix. Finally, draws from the intensity posterior distribution allow the estimation of posterior functionals such as the variance or confidence intervals. Results are presented for simulated data based on a 2D brain phantom and compared to Bayesian MAP-EM.
Bayesian 3D velocity field reconstruction with VIRBIUS
Lavaux, Guilhem
2016-03-01
I describe a new Bayesian algorithm to infer the full three-dimensional velocity field from observed distances and spectroscopic galaxy catalogues. In addition to the velocity field itself, the algorithm reconstructs true distances, some cosmological parameters and specific non-linearities in the velocity field. The algorithm takes care of selection effects and miscalibration issues and can be easily extended to handle direct fitting of e.g. the inverse Tully-Fisher relation. I first describe the algorithm in detail alongside its performance. The algorithm is implemented in the VIRBIUS (VelocIty Reconstruction using Bayesian Inference Software) software package. I then test it on different mock distance catalogues with varying complexity of observational issues. The model proved to give robust measurements of velocities for mock catalogues of 3000 galaxies. I expect the core of the algorithm to scale to tens of thousands of galaxies. It holds the promise of giving a better handle on future large and deep distance surveys, for which individual errors on distance would impede velocity field inference.
A Bayesian Method for Estimating Evolutionary History
Kim, Joungyoun; Anthony, Nicola M; Larget, Bret R.
2012-01-01
Phylogeography is the study of evolutionary history among populations in a species associated with geographic genetic variation. This paper examines the phylogeography of three African gorilla subspecies based on two types of DNA sequence data. One type is HV1, the first hyper-variable region in the control region of the mitochondrial genome. The other type is nuclear mitochondrial DNA (Numt DNA), which results from the introgression of a copy of HV1 from the mitochondrial genome into the nuc...
A new Bayesian approach to the reconstruction of spectral functions
Burnier, Yannis
2013-01-01
We present a novel approach for the reconstruction of spectra from Euclidean correlator data that makes close contact to modern Bayesian concepts. It is based upon an axiomatically justified dimensionless prior distribution, which in the case of a constant prior function $m(\omega)$ only imprints smoothness on the reconstructed spectrum. In addition we are able to analytically integrate out the only relevant overall hyperparameter $\alpha$ in the prior, removing the necessity for Gaussian approximations found e.g. in the Maximum Entropy Method. Using a quasi-Newton minimizer and high-precision arithmetic, we are then able to find the unique global extremum of $P[\rho|D]$ in the full $N_\omega \gg N_\tau$ dimensional search space. The method yields gradually improving reconstruction results as the quality of the supplied input data increases, without introducing the artificial peak structures often encountered in the MEM. To support these statements we present mock data analyses for the case of zero width ...
Reconstruction of Zeff profiles at TEXTOR through Bayesian source separation
We describe a work in progress on the reconstruction of radial profiles for the ion effective charge Zeff on the TEXTOR tokamak, using statistical data analysis techniques. We introduce our diagnostic for the measurement of Bremsstrahlung emissivity signals. Zeff profiles can be determined by Abel inversion of line-integrated measurements of the Bremsstrahlung emissivity (εff) from the plasma and the plasma electron density (ne) and temperature (Te). However, at the plasma edge only estimated values are routinely used for ne and Te, which are moreover determined at different toroidal locations. These various uncertainties hinder the interpretation of a Zeff profile outside the central plasma. In order to circumvent this problem, we propose several scenarios meant to allow the extraction by (Bayesian) Blind Source Separation techniques of either (line-integrated) Zeff wave shapes or absolutely calibrated signals from (line-integrated) emissivity signals, using also density and temperature signals, as required.
Cahill, N.; Kemp, A. C.; Horton, B. P.; Parnell, A.C.
2015-01-01
We present a holistic Bayesian hierarchical model for reconstructing the continuous and dynamic evolution of relative sea-level (RSL) change with fully quantified uncertainty. The reconstruction is produced from biological (foraminifera) and geochemical (δ13C) sea-level indicators preserved in dated cores of salt-marsh sediment. Our model comprises three modules: (1) a Bayesian transfer function for the calibration of foraminifera into tidal elevation,...
Cahill, Niamh; Kemp, Andrew C.; Horton, Benjamin P.; Andrew C Parnell
2016-01-01
We present a Bayesian hierarchical model for reconstructing the continuous and dynamic evolution of relative sea-level (RSL) change with quantified uncertainty. The reconstruction is produced from biological (foraminifera) and geochemical (δ13C) sea-level indicators preserved in dated cores of salt-marsh sediment. Our model comprises three modules: (1) a new Bayesian transfer function (B-TF) for the calibration of biological indicators into tidal elevation, which is fl...
Hominin life history: reconstruction and evolution.
Robson, Shannen L; Wood, Bernard
2008-04-01
In this review we attempt to reconstruct the evolutionary history of hominin life history from extant and fossil evidence. We utilize demographic life history theory and distinguish life history variables, traits such as weaning, age at sexual maturity, and life span, from life history-related variables such as body mass, brain growth, and dental development. The latter are either linked with, or can be used to make inferences about, life history, thus providing an opportunity for estimating life history parameters in fossil taxa. We compare the life history variables of modern great apes and identify traits that are likely to be shared by the last common ancestor of Pan-Homo and those likely to be derived in hominins. All great apes exhibit slow life histories and we infer this to be true of the last common ancestor of Pan-Homo and the stem hominin. Modern human life histories are even slower, exhibiting distinctively long post-menopausal life spans and later ages at maturity, pointing to a reduction in adult mortality since the Pan-Homo split. We suggest that lower adult mortality, distinctively short interbirth intervals, and early weaning characteristic of modern humans are derived features resulting from cooperative breeding. We evaluate the fidelity of three life history-related variables, body mass, brain growth and dental development, with the life history parameters of living great apes. We found that body mass is the best predictor of great ape life history events. Brain growth trajectories and dental development and eruption are weakly related proxies and inferences from them should be made with caution. We evaluate the evidence of life history-related variables available for extinct species and find that prior to the transitional hominins there is no evidence of any hominin taxon possessing a body size, brain size or aspects of dental development much different from what we assume to be the primitive life history pattern for the Pan-Homo clade. Data for
Efficient reconstruction of contaminant release history
Alexander, Francis (Los Alamos National Laboratory); Anghel, Marian (Los Alamos National Laboratory); Gulbahce, Natali; Tartakovsky, Daniel
2009-01-01
We present a generalized hybrid Monte Carlo (GHMC) method for fast, statistically optimal reconstruction of release histories of reactive contaminants. The approach is applicable to large-scale, strongly nonlinear systems with parametric uncertainties and data corrupted by measurement errors. The use of discrete adjoint equations facilitates numerical implementation of GHMC, without putting any restrictions on the degree of nonlinearity of the advection-dispersion-reaction equations that are used to describe contaminant transport in the subsurface. To demonstrate the salient features of the proposed algorithm, we identify the spatial extent of a distributed source of contamination from concentration measurements of a reactive solute.
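The hybrid (Hamiltonian) Monte Carlo core underlying GHMC can be sketched on a toy target. This shows only the leapfrog integration and accept/reject step on a standard 2-D Gaussian; the paper's generalized variant, adjoint-computed gradients, and transport model are far richer:

```python
import numpy as np

rng = np.random.default_rng(3)

def neg_log_p(q):
    # toy target: standard 2-D Gaussian, -log p(q) up to a constant
    return 0.5 * np.dot(q, q)

def grad_neg_log_p(q):
    return q

def hmc_step(q, eps=0.15, L=20):
    p = rng.normal(size=q.size)                 # fresh momentum
    H0 = neg_log_p(q) + 0.5 * np.dot(p, p)
    qn, pn = q.copy(), p.copy()
    pn -= 0.5 * eps * grad_neg_log_p(qn)        # leapfrog half step
    for _ in range(L - 1):
        qn += eps * pn
        pn -= eps * grad_neg_log_p(qn)
    qn += eps * pn
    pn -= 0.5 * eps * grad_neg_log_p(qn)        # closing half step
    H1 = neg_log_p(qn) + 0.5 * np.dot(pn, pn)
    # Metropolis correction for integration error
    return qn if np.log(rng.random()) < H0 - H1 else q

q, chain = np.zeros(2), []
for _ in range(4000):
    q = hmc_step(q)
    chain.append(q)
chain = np.array(chain)
```

For a real release-history problem, `grad_neg_log_p` would be supplied by the discrete adjoint of the transport equations, which is exactly what makes GHMC practical at scale.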
Bayesian inference of the demographic history of chimpanzees.
Wegmann, Daniel; Excoffier, Laurent
2010-06-01
Due to an almost complete absence of a fossil record, the evolutionary history of chimpanzees has only been studied recently on the basis of genetic data. Although the general topology of the chimpanzee phylogeny is well established, uncertainties remain concerning the size of current and past populations, the occurrence of bottlenecks or population expansions, or about divergence times and migration rates between subspecies. Here, we present a novel attempt at globally inferring the detailed evolution of the Pan genus based on approximate Bayesian computation, an approach preferentially applied to complex models where the likelihood cannot be computed analytically. Based on two microsatellite and DNA sequence data sets and adjusting simulated data for local levels of inbreeding and patterns of missing data, we find support for several new features of chimpanzee evolution as compared with previous studies based on smaller data sets and simpler evolutionary models. We find that the central chimpanzees are certainly the oldest population of all P. troglodytes subspecies and that the other two P. t. subspecies diverged from the central chimpanzees by founder events. We also find an older divergence time (1.6 million years [My]) between common chimpanzees and Bonobos than previous studies (0.9-1.3 My), but this divergence appears to have been very progressive with the maintenance of relatively high levels of gene flow between the ancestral chimpanzee population and the Bonobos. Finally, we could also confirm the existence of strong unidirectional gene flow from the western into the central chimpanzees. These results show that interesting and innovative features of chimpanzee history emerge when considering their whole evolutionary history in a single analysis, rather than relying on simpler models involving several comparisons of pairs of populations. PMID:20118191
Reconstructing the history of dark energy using maximum entropy
Zunckel, C.; Trotta, R.
2007-01-01
We present a Bayesian technique based on a maximum entropy method to reconstruct the dark energy equation of state $w(z)$ in a non-parametric way. This MaxEnt technique allows us to incorporate relevant prior information while adjusting the degree of smoothing of the reconstruction in response to the structure present in the data. After demonstrating the method on synthetic data, we apply it to current cosmological data, separately analysing type Ia supernova measurements from the HST/GOODS pro...
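The entropy-regularized inversion idea can be shown on a toy deconvolution problem: minimize chi²/2 − αS(f), with S the Shannon entropy relative to a flat default model m. Everything here (signal, kernel, step size) is an illustrative stand-in; the paper reconstructs w(z), not this toy signal:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 20
idx = np.arange(n)
# normalized Gaussian blur kernel (the "instrument response")
K = np.exp(-0.5 * ((idx[:, None] - idx[None, :]) / 2.0) ** 2)
K /= K.sum(axis=1, keepdims=True)
f_true = np.ones(n)
f_true[8:12] = 3.0                      # a bump to recover
sigma = 0.01
d = K @ f_true + rng.normal(0, sigma, n)

alpha, m = 1e-3, 1.0                    # entropy weight, default model
f = np.ones(n)
step = 1e-4                             # below 2 / ||K^T K / sigma^2||
for _ in range(2000):
    # gradient of chi^2/2 - alpha * S(f), with dS/df = -log(f/m)
    grad = K.T @ (K @ f - d) / sigma**2 + alpha * np.log(f / m)
    f = np.clip(f - step * grad, 1e-8, None)   # projected step, f > 0
```

The entropy term pulls the reconstruction toward the default model wherever the data are uninformative, which is the "adjusting the degree of smoothing in response to the structure present in the data" behavior described above.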
Theoretical evaluation of the detectability of random lesions in bayesian emission reconstruction
Detecting cancerous lesions is an important task in positron emission tomography (PET). Bayesian methods based on the maximum a posteriori principle (also called penalized maximum likelihood methods) have been developed to deal with the low signal-to-noise ratio in the emission data. Similar to the filter cut-off frequency in the filtered backprojection method, the prior parameters in Bayesian reconstruction control the resolution and noise trade-off and hence affect the detectability of lesions in reconstructed images. Bayesian reconstructions are difficult to analyze because the resolution and noise properties are nonlinear and object-dependent. Most research has been based on Monte Carlo simulations, which are very time consuming. Building on recent progress in the theoretical analysis of image properties of statistical reconstructions and the development of numerical observers, here we develop a theoretical approach for fast computation of lesion detectability in Bayesian reconstruction. The results can be used to choose the optimum hyperparameter for maximum lesion detectability. New in this work is the use of theoretical expressions that explicitly model the statistical variation of the lesion and background without assuming that the object variation is (locally) stationary. The theoretical results are validated using Monte Carlo simulations. The comparisons show good agreement between the theoretical predictions and the Monte Carlo results
A novel reconstruction algorithm for bioluminescent tomography based on Bayesian compressive sensing
Wang, Yaqi; Feng, Jinchao; Jia, Kebin; Sun, Zhonghua; Wei, Huijun
2016-03-01
Bioluminescence tomography (BLT) is becoming a promising tool because it can resolve the biodistribution of bioluminescent reporters associated with cellular and subcellular function through several millimeters to centimeters of tissue in vivo. However, BLT reconstruction is an ill-posed problem. By incorporating sparse a priori information about the bioluminescent source, sparsity-based reconstruction algorithms obtain enhanced image quality, so such algorithms have great potential for BLT. Here, we propose a novel reconstruction method based on Bayesian compressive sensing and investigate its feasibility and effectiveness with a heterogeneous phantom. The results demonstrate the potential and merits of the proposed algorithm.
Sparse reconstruction using distribution agnostic bayesian matching pursuit
Masood, Mudassir
2013-11-01
A fast matching pursuit method using a Bayesian approach is introduced for sparse signal recovery. This method performs Bayesian estimates of sparse signals even when the signal prior is non-Gaussian or unknown. It is agnostic on signal statistics and utilizes a priori statistics of additive noise and the sparsity rate of the signal, which are shown to be easily estimated from data if not available. The method utilizes a greedy approach and order-recursive updates of its metrics to find the most dominant sparse supports to determine the approximate minimum mean-square error (MMSE) estimate of the sparse signal. Simulation results demonstrate the power and robustness of our proposed estimator.
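The greedy support search at the core of matching-pursuit recovery can be sketched with plain orthogonal matching pursuit, shown here as a stand-in for the paper's Bayesian variant (which additionally uses noise statistics and MMSE averaging over candidate supports). The dimensions and amplitudes are illustrative:

```python
import numpy as np

rng = np.random.default_rng(5)
n, m, k = 128, 64, 3                        # atoms, measurements, sparsity
A = rng.normal(size=(m, n)) / np.sqrt(m)    # random sensing matrix
x = np.zeros(n)
support_true = rng.choice(n, size=k, replace=False)
x[support_true] = np.array([3.0, -2.0, 1.5])
y = A @ x + rng.normal(0, 0.01, m)          # noisy measurements

support, r = [], y.copy()
for _ in range(k):
    j = int(np.argmax(np.abs(A.T @ r)))     # atom most correlated with r
    support.append(j)
    sub = A[:, support]
    coef, *_ = np.linalg.lstsq(sub, y, rcond=None)
    r = y - sub @ coef                      # orthogonalized residual
x_hat = np.zeros(n)
x_hat[support] = coef
```

Order-recursive metric updates, as in the paper, avoid re-solving the least-squares problem from scratch at each greedy step; the sketch above trades that efficiency for clarity.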
Comparing Nonparametric Bayesian Tree Priors for Clonal Reconstruction of Tumors
Deshwar, Amit G; Vembu, Shankar; Morris, Quaid
2014-01-01
Statistical machine learning methods, especially nonparametric Bayesian methods, have become increasingly popular to infer clonal population structure of tumors. Here we describe the treeCRP, an extension of the Chinese restaurant process (CRP), a popular construction used in nonparametric mixture models, to infer the phylogeny and genotype of major subclonal lineages represented in the population of cancer cells. We also propose new split-merge updates tailored to the subclonal reconstructio...
Texture-preserving Bayesian image reconstruction for low-dose CT
Zhang, Hao; Han, Hao; Hu, Yifan; Liu, Yan; Ma, Jianhua; Li, Lihong; Moore, William; Liang, Zhengrong
2016-03-01
Markov random field (MRF) models have been widely used in Bayesian image reconstruction to reconstruct piecewise smooth images in the presence of noise, such as in low-dose X-ray computed tomography (LdCT). While an MRF can preserve edge sharpness via an edge-preserving potential function, its regional smoothing may sacrifice tissue image textures, which have been recognized as useful imaging biomarkers, and thus it compromises clinical tasks such as differentiating malignant vs. benign lesions, e.g., lung nodules or colon polyps. This study aims to shift the edge-preserving regional noise smoothing paradigm to a texture-preserving framework for LdCT image reconstruction while retaining the advantage of the MRF's neighborhood system for edge preservation. Specifically, we adapted the MRF model to incorporate the image textures of lung, bone, fat, muscle, etc. from a previous full-dose CT scan as a priori knowledge for texture-preserving Bayesian reconstruction of current LdCT images. To show the feasibility of the proposed reconstruction framework, experiments using clinical patient scans (with lung nodules or colon polyps) were conducted. The experimental outcomes showed a noticeable gain from the a priori knowledge for LdCT image reconstruction with the well-known Haralick texture measures. Thus, it is conjectured that texture-preserving LdCT reconstruction has advantages over the edge-preserving regional smoothing paradigm for texture-specific clinical applications.
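The MRF neighborhood regularizer at the heart of such Bayesian reconstructions is a potential function summed over neighboring-pixel differences. A minimal sketch with a quadratic potential over a 4-neighbor system; the paper replaces this uniform regional smoothing with texture-derived weights learned from a prior full-dose scan:

```python
import numpy as np

def mrf_penalty(u, beta=0.1):
    # quadratic MRF potential: beta * sum of squared differences
    # over horizontal and vertical neighbor pairs (each pair once)
    dh = np.diff(u, axis=1)        # horizontal neighbor differences
    dv = np.diff(u, axis=0)        # vertical neighbor differences
    return float(beta * (np.sum(dh**2) + np.sum(dv**2)))

rng = np.random.default_rng(7)
noisy = rng.random((16, 16))
flat = np.full((16, 16), noisy.mean())   # perfectly smooth image
```

A flat image incurs zero penalty while a noisy one is penalized, so adding `mrf_penalty` to the data-fit term pushes the MAP solution toward piecewise-smooth images; edge- and texture-preserving variants simply swap the quadratic potential for one that charges large differences less.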
A novel Bayesian approach to spectral function reconstruction
Burnier, Yannis
2013-01-01
We present a novel approach to the inference of spectral functions from Euclidean time correlator data that makes close contact with modern Bayesian concepts. Our method differs significantly from the maximum entropy method (MEM). A new set of axioms is postulated for the prior probability, leading to an improved expression, which is devoid of the asymptotically flat directions present in the Shannon-Jaynes entropy. Hyperparameters are integrated out explicitly, liberating us from the Gaussian approximations underlying the evidence approach of the MEM. We present a realistic test of our method in the context of the non-perturbative extraction of the heavy quark potential. Based on hard-thermal-loop correlator mock data, we establish firm requirements in the number of data points and their accuracy for a successful extraction of the potential from lattice QCD. An improved potential estimation from previously investigated quenched lattice QCD correlators is provided.
Automated comparison of Bayesian reconstructions of experimental profiles with physical models
In this work we developed an expert system that carries out, in an integrated and fully automated way: (i) a reconstruction of plasma profiles from the measurements, using Bayesian analysis; (ii) a prediction of the reconstructed quantities, according to some models; and (iii) an intelligent comparison of the first two steps. This system includes systematic checking of the internal consistency of the reconstructed quantities, enables automated model validation and, if a well-validated model is used, can be applied to help detect interesting new physics in an experiment. The work shows three applications of this quite general system. The expert system can successfully detect failures in the automated plasma reconstruction and provide (on successful reconstruction cases) statistics of agreement of the models with the experimental data, i.e. information on the model validity.
Milestones in the History of Ear Reconstruction.
Berghaus, Alexander; Nicoló, Marion San
2015-12-01
The reconstruction of ear deformities has challenged plastic surgeons for centuries. However, it was only in the 19th century that reports on partial and total ear reconstruction began to increase. In the quest for an aesthetically pleasing and natural-looking result, surgeons worked on the perfect framework and skin coverage. Different materials and flap techniques have evolved. Some were abandoned out of frustration, while others kept evolving over the years. In this article, we discuss the milestones in ear reconstruction, from ancient times to early attempts in Western civilization to the key chapters of ear reconstruction in the 20th century leading to the current techniques.
Bayesian PET image reconstruction incorporating anato-functional joint entropy
Tang, Jing; Rahmim, Arman
2009-12-01
We developed a maximum a posteriori (MAP) reconstruction method for positron emission tomography (PET) image reconstruction incorporating magnetic resonance (MR) image information, with the joint entropy between the PET and MR image features serving as the regularization constraint. A non-parametric method was used to estimate the joint probability density of the PET and MR images. Using realistically simulated PET and MR human brain phantoms, the quantitative performance of the proposed algorithm was investigated. Incorporation of the anatomic information via this technique, after parameter optimization, was seen to dramatically improve the noise versus bias tradeoff in every region of interest, compared to the result from using conventional MAP reconstruction. In particular, hot lesions in the FDG PET image, which had no anatomical correspondence in the MR image, also had improved contrast versus noise tradeoff. Corrections were made to figures 3, 4 and 6, and to the second paragraph of section 3.1 on 13 November 2009. The corrected electronic version is identical to the print version.
N. Cahill
2015-10-01
We present a holistic Bayesian hierarchical model for reconstructing the continuous and dynamic evolution of relative sea-level (RSL) change with fully quantified uncertainty. The reconstruction is produced from biological (foraminifera) and geochemical (δ13C) sea-level indicators preserved in dated cores of salt-marsh sediment. Our model is comprised of three modules: (1) a Bayesian transfer function for the calibration of foraminifera into tidal elevation, which is flexible enough to formally accommodate additional proxies (in this case bulk-sediment δ13C values); (2) a chronology developed from an existing Bchron age-depth model; and (3) an existing errors-in-variables integrated Gaussian process (EIV-IGP) model for estimating rates of sea-level change. We illustrate our approach using a case study of Common Era sea-level variability from New Jersey, USA. We develop a new Bayesian transfer function (B-TF), with and without the δ13C proxy, and compare our results to those from a widely used weighted-averaging transfer function (WA-TF). The formal incorporation of a second proxy into the B-TF model results in smaller vertical uncertainties and improved accuracy for reconstructed RSL. The vertical uncertainty from the multi-proxy B-TF is ∼28% smaller on average compared to the WA-TF. When evaluated against historic tide-gauge measurements, the multi-proxy B-TF most accurately reconstructs the RSL changes observed in the instrumental record (MSE = 0.003 m²). The holistic model provides a single, unifying framework for reconstructing and analysing sea level through time. This approach is suitable for reconstructing other paleoenvironmental variables using biological proxies.
Cahill, Niamh; Kemp, Andrew C.; Horton, Benjamin P.; Parnell, Andrew C.
2016-02-01
We present a Bayesian hierarchical model for reconstructing the continuous and dynamic evolution of relative sea-level (RSL) change with quantified uncertainty. The reconstruction is produced from biological (foraminifera) and geochemical (δ13C) sea-level indicators preserved in dated cores of salt-marsh sediment. Our model is comprised of three modules: (1) a new Bayesian transfer function (B-TF) for the calibration of biological indicators into tidal elevation, which is flexible enough to formally accommodate additional proxies; (2) an existing chronology developed using the Bchron age-depth model; and (3) an existing errors-in-variables integrated Gaussian process (EIV-IGP) model for estimating rates of sea-level change. Our approach is illustrated using a case study of Common Era sea-level variability from New Jersey, USA. We develop a new B-TF using foraminifera, with and without the additional δ13C proxy, and compare our results to those from a widely used weighted-averaging transfer function (WA-TF). The formal incorporation of a second proxy into the B-TF model results in smaller vertical uncertainties and improved accuracy for reconstructed RSL. The vertical uncertainty from the multi-proxy B-TF is ∼28% smaller on average compared to the WA-TF. When evaluated against historic tide-gauge measurements, the multi-proxy B-TF most accurately reconstructs the RSL changes observed in the instrumental record (mean square error = 0.003 m²). The Bayesian hierarchical model provides a single, unifying framework for reconstructing and analyzing sea-level change through time. This approach is suitable for reconstructing other paleoenvironmental variables (e.g., temperature) using biological proxies.
Hydrogen is the main constituent of plasmas in the HANBIT magnetic mirror device; therefore, measurement of the emission from excited levels of hydrogen atoms is an important diagnostic tool. From the emissivity of Hα radiation one can derive quantities such as the neutral hydrogen density and the source rate. An unbiased and consistent probability-theory-based approach within the framework of Bayesian inference is applied to the reconstruction of Hα emissivity profiles and hydrogen neutral density profiles in the HANBIT magnetic mirror device.
Optimization of Bayesian Emission tomographic reconstruction for region of interest quantitation
Qi, Jinyi
2003-01-10
Region of interest (ROI) quantitation is an important task in emission tomography (e.g., positron emission tomography and single photon emission computed tomography). It is essential for exploring clinical factors such as tumor activity, growth rate, and the efficacy of therapeutic interventions. Bayesian methods based on the maximum a posteriori principle (also called penalized maximum likelihood methods) have been developed for emission image reconstruction to deal with the low signal to noise ratio of the emission data. Similar to the filter cut-off frequency in the filtered backprojection method, the smoothing parameter of the image prior in Bayesian reconstruction controls the resolution and noise trade-off and hence affects ROI quantitation. In this paper we present an approach for choosing the optimum smoothing parameter in Bayesian reconstruction for ROI quantitation. Bayesian reconstructions are difficult to analyze because the resolution and noise properties are nonlinear and object-dependent. Building on the recent progress on deriving the approximate expressions for the local impulse response function and the covariance matrix, we derived simplified theoretical expressions for the bias, the variance, and the ensemble mean squared error (EMSE) of the ROI quantitation. One problem in evaluating ROI quantitation is that the truth is often required for calculating the bias. This is overcome by using the ensemble distribution of the activity inside the ROI and computing the average EMSE. The resulting expressions allow fast evaluation of the image quality for different smoothing parameters. The optimum smoothing parameter of the image prior can then be selected to minimize the EMSE.
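The selection rule described above can be illustrated with a toy numerical sketch: pick the smoothing parameter β of a quadratic prior that minimizes the ensemble mean squared error of an ROI mean. The 1-D Gaussian-blur system model, noise level, and ROI below are illustrative assumptions, not the paper's actual PET system model.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 64
x_true = np.zeros(n)
x_true[24:40] = 1.0                       # activity image with a "hot" ROI
# toy system matrix: Gaussian blur (stand-in for the tomographic projector)
A = np.array([[np.exp(-0.5 * ((i - j) / 2.0) ** 2) for j in range(n)]
              for i in range(n)])
sigma = 0.1                               # measurement noise std (assumed)
roi = slice(24, 40)

# quadratic-smoothness MAP estimator: x_hat = (A^T A + beta L^T L)^-1 A^T y
L = (np.eye(n) - np.eye(n, k=1))[:-1]     # first-difference operator

def emse(beta, trials=200):
    """Monte Carlo ensemble MSE of the ROI mean for a given beta."""
    H = np.linalg.inv(A.T @ A + beta * (L.T @ L)) @ A.T
    errs = []
    for _ in range(trials):
        y = A @ x_true + sigma * rng.standard_normal(n)
        x_hat = H @ y
        errs.append((x_hat[roi].mean() - x_true[roi].mean()) ** 2)
    return float(np.mean(errs))

betas = [0.01, 0.1, 1.0, 10.0]
best = min(betas, key=emse)               # EMSE-optimal smoothing parameter
print("EMSE-optimal beta among candidates:", best)
```

In the paper the EMSE is evaluated from closed-form approximations of the local impulse response and covariance rather than by Monte Carlo; the sketch only shows the selection criterion itself.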
J. P. Werner
2015-03-01
Reconstructions of the late-Holocene climate rely heavily upon proxies that are assumed to be accurately dated by layer counting, such as measurements of tree rings, ice cores, and varved lake sediments. Considerable advances could be achieved if time-uncertain proxies could be included within these multiproxy reconstructions, and if time uncertainties were recognized and correctly modeled for proxies commonly treated as free of age model errors. Current approaches for accounting for time uncertainty are generally limited to repeating the reconstruction using each one of an ensemble of age models, thereby inflating the final estimated uncertainty – in effect, each possible age model is given equal weighting. Uncertainties can be reduced by exploiting the inferred space–time covariance structure of the climate to re-weight the possible age models. Here, we demonstrate how Bayesian hierarchical climate reconstruction models can be augmented to account for time-uncertain proxies. Critically, although a priori all age models are given equal probability of being correct, the probabilities associated with the age models are formally updated within the Bayesian framework, thereby reducing uncertainties. Numerical experiments show that updating the age model probabilities decreases uncertainty in the resulting reconstructions, as compared with the current de facto standard of sampling over all age models, provided there is sufficient information from other data sources in the spatial region of the time-uncertain proxy. This approach can readily be generalized to non-layer-counted proxies, such as those derived from marine sediments.
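The age-model reweighting idea above can be sketched in a few lines: each member of an ensemble of candidate age models starts with equal prior probability, and the weights are updated by how well the proxy, placed on that age model, matches the climate signal inferred from other records. The synthetic sinusoidal "field", the pure time-shift age models, and the Gaussian likelihood are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(2)
t = np.linspace(0, 10, 50)
field = np.sin(t)                    # climate signal from well-dated proxies
true_shift = 0.8                     # unknown dating error of this record
proxy = np.sin(t - true_shift) + 0.1 * rng.standard_normal(t.size)

# ensemble of candidate age models, here parameterized as uniform shifts
shifts = np.linspace(-2, 2, 41)

# equal priors; Gaussian likelihood of the proxy under each age model
log_w = np.array([
    -0.5 * np.sum((proxy - np.sin(t - s)) ** 2) / 0.1 ** 2 for s in shifts
])
w = np.exp(log_w - log_w.max())
w /= w.sum()                          # posterior age-model weights
print("most probable shift:", shifts[np.argmax(w)])
```

Real layer-counted chronologies have correlated, cumulative counting errors rather than a single shift, and the paper embeds this update inside a full hierarchical space–time model; the sketch isolates only the reweighting step.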
Bayesian image reconstruction for emission tomography based on median root prior
The aim of the present study was to investigate a new type of Bayesian one-step late reconstruction method which utilizes a median root prior (MRP). The method favours images which have locally monotonous radioactivity concentrations. The new reconstruction algorithm was applied to ideal simulated data, phantom data and some patient examinations with PET. The same projection data were reconstructed with filtered back-projection (FBP) and maximum likelihood-expectation maximization (ML-EM) methods for comparison. The MRP method provided good-quality images with a similar resolution to the FBP method with a ramp filter, and at the same time the noise properties were as good as with Hann-filtered FBP images. The typical artefacts seen in FBP reconstructed images outside of the object were completely removed, as was the grainy noise inside the object. Quantitatively, the resulting average regional radioactivity concentrations in a large region of interest in images produced by the MRP method corresponded to the FBP and ML-EM results, but at the pixel by pixel level the MRP method proved to be the most accurate of the tested methods. In contrast to other iterative reconstruction methods, e.g. ML-EM, the MRP method was not sensitive to the number of iterations nor to the adjustment of reconstruction parameters. Only the Bayesian parameter β had to be set. The proposed MRP method is much simpler to calculate than the methods described previously, both with regard to the parameter settings and in terms of general use. The new MRP reconstruction method was shown to produce high-quality quantitative emission images with only one parameter setting in addition to the number of iterations. (orig.)
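A minimal sketch of the one-step-late MRP iteration for a toy 1-D emission problem follows. The update is the commonly published OSL form, in which the ML-EM sensitivity term is augmented by β(λ − med(λ))/med(λ); the 3-pixel system blur, Poisson data, and median window size are illustrative assumptions, not the clinical PET setup of the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 32
x_true = np.ones(n)
x_true[10:20] = 4.0                           # hot region on a flat background
# crude 3-pixel blur as a stand-in "projector"
A = (np.abs(np.subtract.outer(np.arange(n), np.arange(n))) <= 1).astype(float) / 3.0
y = rng.poisson(A @ x_true).astype(float)     # Poisson emission data

def med3(x):
    """3-point running median with edge replication (the MRP neighbourhood)."""
    p = np.pad(x, 1, mode="edge")
    return np.median(np.stack([p[:-2], p[1:-1], p[2:]]), axis=0)

beta = 0.3                                    # the single Bayesian parameter
x = np.ones(n)
for _ in range(50):
    m = np.maximum(med3(x), 1e-9)
    denom = A.sum(axis=0) + beta * (x - m) / m   # OSL-modified sensitivity
    ratio = A.T @ (y / np.maximum(A @ x, 1e-9))  # ML-EM backprojected ratio
    x = x / np.maximum(denom, 1e-9) * ratio

print("mean in hot region:", x[10:20].mean())
```

As the abstract notes, β is the only tuning parameter: the prior penalizes only deviations from the local median, so locally monotonic profiles (edges included) pass through unpenalized.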
Bayesian inference of population size history from multiple loci
Drummond Alexei J
2008-10-01
Background Effective population size (Ne) is related to genetic variability and is a basic parameter in many models of population genetics. A number of methods for inferring current and past population sizes from genetic data have been developed since JFC Kingman introduced the n-coalescent in 1982. Here we present the Extended Bayesian Skyline Plot, a non-parametric Bayesian Markov chain Monte Carlo algorithm that extends a previous coalescent-based method in several ways, including the ability to analyze multiple loci. Results Through extensive simulations we show the accuracy and limitations of inferring population size as a function of the amount of data, including recovering information about evolutionary bottlenecks. We also analyzed two real data sets to demonstrate the behavior of the new method: a single-gene Hepatitis C virus data set sampled from Egypt and a 10-locus Drosophila ananassae data set representing 16 different populations. Conclusion The results demonstrate the essential role of multiple loci in recovering population size dynamics. Multi-locus data from a small number of individuals can precisely recover past bottlenecks in population size which cannot be characterized by analysis of a single locus. We also demonstrate that sequence data quality is important because even moderate levels of sequencing errors result in a considerable decrease in estimation accuracy for realistic levels of population genetic variability.
A Bayesian Hierarchical Model for Reconstructing Sea Levels: From Raw Data to Rates of Change
Cahill, Niamh; Horton, Benjamin P; Parnell, Andrew C
2015-01-01
We present a holistic Bayesian hierarchical model for reconstructing the continuous and dynamic evolution of relative sea-level (RSL) change with fully quantified uncertainty. The reconstruction is produced from biological (foraminifera) and geochemical (δ13C) sea-level indicators preserved in dated cores of salt-marsh sediment. Our model is comprised of three modules: (1) A Bayesian transfer function for the calibration of foraminifera into tidal elevation, which is flexible enough to formally accommodate additional proxies (in this case bulk-sediment δ13C values); (2) A chronology developed from an existing Bchron age-depth model, and (3) An existing errors-in-variables integrated Gaussian process (EIV-IGP) model for estimating rates of sea-level change. We illustrate our approach using a case study of Common Era sea-level variability from New Jersey, U.S.A. We develop a new Bayesian transfer function (B-TF), with and without the δ13C proxy and compare our results to those from a widely...
Reconstructing the evolutionary history of natural languages
Warnow, T.; Ringe, D.; Taylor, A. [Univ. of Pennsylvania, Philadelphia, PA (United States)
1996-12-31
In this paper we present a new methodology for determining the evolutionary history of related languages. Our methodology uses linguistic information encoded as qualitative characters, and provides much greater precision than previous methods. Our analysis of Indo-European (IE) languages resolves questions that have troubled scholars for over a century.
Bayesian Multi-Energy Computed Tomography reconstruction approaches based on decomposition models
Multi-Energy Computed Tomography (MECT) makes it possible to get multiple fractions of basis materials without segmentation. In medical applications, one is the soft-tissue equivalent water fraction and the other is the hard-matter equivalent bone fraction. Practical MECT measurements are usually obtained with polychromatic X-ray beams. Existing reconstruction approaches based on linear forward models that do not account for the beam polychromaticity fail to estimate the correct decomposition fractions and result in Beam-Hardening Artifacts (BHA). The existing BHA correction approaches either need to refer to calibration measurements or suffer from the noise amplification caused by the negative-log pre-processing and the water and bone separation problem. To overcome these problems, statistical DECT reconstruction approaches based on non-linear forward models accounting for the beam polychromaticity show great potential for giving accurate fraction images. This work proposes a full-spectral Bayesian reconstruction approach which allows the reconstruction of high-quality fraction images from ordinary polychromatic measurements. This approach is based on a Gaussian noise model with unknown variance assigned directly to the projections without taking the negative-log. Referring to Bayesian inference, the decomposition fractions and observation variance are estimated by using the joint Maximum A Posteriori (MAP) estimation method. Subject to an adaptive prior model assigned to the variance, the joint estimation problem is then simplified into a single estimation problem. It transforms the joint MAP estimation problem into a minimization problem with a non-quadratic cost function. To solve it, the use of a monotone Conjugate Gradient (CG) algorithm with suboptimal descent steps is proposed. The performances of the proposed approach are analyzed with both simulated and experimental data. The results show that the proposed Bayesian approach is robust to noise and materials. It is also
Accurate reconstruction of insertion-deletion histories by statistical phylogenetics.
Oscar Westesson
The Multiple Sequence Alignment (MSA) is a computational abstraction that represents a partial summary either of indel history, or of structural similarity. Taking the former view (indel history), it is possible to use formal automata theory to generalize the phylogenetic likelihood framework for finite substitution models (Dayhoff's probability matrices and Felsenstein's pruning algorithm) to arbitrary-length sequences. In this paper, we report results of a simulation-based benchmark of several methods for reconstruction of indel history. The methods tested include a relatively new algorithm for statistical marginalization of MSAs that sums over a stochastically-sampled ensemble of the most probable evolutionary histories. For mammalian evolutionary parameters on several different trees, the single most likely history sampled by our algorithm appears less biased than histories reconstructed by other MSA methods. The algorithm can also be used for alignment-free inference, where the MSA is explicitly summed out of the analysis. As an illustration of our method, we discuss reconstruction of the evolutionary histories of human protein-coding genes.
Laplanche, Christophe
2012-01-01
The author describes and evaluates a Bayesian method to reconstruct three-dimensional toothed whale trajectories from a series of echolocation signals. Localization by using passive acoustic data (time of arrival of source signals at receptors) is assisted by using visual data (coordinates of the whale when diving and resurfacing) and tag information (movement statistics). The efficiency of the Bayesian method is compared to the standard minimum mean squared error st...
Dries, M; Koopmans, L V E
2016-01-01
Recent studies based on the integrated light of distant galaxies suggest that the initial mass function (IMF) might not be universal. Variations of the IMF with galaxy type and/or formation time may have important consequences for our understanding of galaxy evolution. We have developed a new stellar population synthesis (SPS) code specifically designed to reconstruct the IMF. We implement a novel approach combining regularization with hierarchical Bayesian inference. Within this approach we use a parametrized IMF prior to regulate a direct inference of the IMF. This direct inference gives more freedom to the IMF and allows the model to deviate from parametrized models when demanded by the data. We use Markov Chain Monte Carlo sampling techniques to reconstruct the best parameters for the IMF prior, the age, and the metallicity of a single stellar population. We present our code and apply our model to a number of mock single stellar populations with different ages, metallicities, and IMFs. When systematic unc...
Improving the quality of small animal brain pinhole SPECT imaging by Bayesian reconstruction
Sohlberg, Antti [Department of Clinical Physiology and Nuclear Medicine, Kuopio University Hospital, P.O. Box 1777, 70211, Kuopio (Finland); Lensu, Sanna [Department of Pharmacology and Toxicology, University of Kuopio, Kuopio (Finland); Department of Environmental Health, National Public Health Institute, Kuopio (Finland); Jolkkonen, Jukka [Department of Neuroscience and Neurology, University of Kuopio, Kuopio (Finland); Tuomisto, Leena [Department of Pharmacology and Toxicology, University of Kuopio, Kuopio (Finland); Ruotsalainen, Ulla [Institute of Signal Processing, DMI, Tampere University of Technology, Tampere (Finland); Kuikka, Jyrki T. [Department of Clinical Physiology and Nuclear Medicine, Kuopio University Hospital, P.O. Box 1777, 70211, Kuopio (Finland); Niuvanniemi Hospital, Kuopio (Finland)
2004-07-01
The possibility of using existing hardware makes pinhole single-photon emission computed tomography (SPECT) attractive when pursuing the ultra-high resolution required for small animal brain imaging. Unfortunately, the poor sensitivity and the heavy weight of the collimator hamper the use of pinhole SPECT in animal studies by generating noisy and misaligned projections. To improve the image quality we have developed a new Bayesian reconstruction method, pinhole median root prior (PH-MRP), which prevents the excessive noise accumulation from the projections to the reconstructed image. The PH-MRP algorithm was used to reconstruct data acquired with our small animal rotating device, which was designed to reduce the rotation orbit misalignments. Phantom experiments were performed to test the device and compare the PH-MRP with the conventional Feldkamp-Davis-Kress (FDK) and pinhole ordered subsets maximum likelihood expectation maximisation (PH-OSEM) reconstruction algorithms. The feasibility of the system for small animal brain imaging was studied with Han-Wistar rats injected with 123I-epidepride and 99mTc-hydroxy methylene diphosphonate. Considering all the experiments, no shape distortions due to orbit misalignments were encountered and remarkable improvements in noise characteristics and also in overall image quality were observed when the PH-MRP was applied instead of the FDK or PH-OSEM. In addition, the proposed methods utilise existing hardware and require only a certain amount of construction and programming work, making them easy to implement. (orig.)
An Improved Approximate-Bayesian Model-choice Method for Estimating Shared Evolutionary History
Oaks, Jamie R.
2014-01-01
Background To understand biological diversification, it is important to account for large-scale processes that affect the evolutionary history of groups of co-distributed populations of organisms. Such events predict temporally clustered divergence times, a pattern that can be estimated using genetic data from co-distributed species. I introduce a new approximate-Bayesian method for comparative phylogeographical model-choice that estimates the temporal distribution of divergences across taxa...
Cornuet, Jean-Marie; Santos, Filipe; Beaumont, Mark A; Robert, Christian P.; Marin, Jean-Michel; Balding, David J.; Guillemaud, Thomas; Estoup, Arnaud
2008-01-01
Summary: Genetic data obtained on population samples convey information about their evolutionary history. Inference methods can extract part of this information but they require sophisticated statistical techniques that have been made available to the biologist community (through computer programs) only for simple and standard situations typically involving a small number of samples. We propose here a computer program (DIY ABC) for inference based on approximate Bayesian computation (ABC), in...
Contaminant source reconstruction by empirical Bayes and Akaike's Bayesian Information Criterion
Zanini, Andrea; Woodbury, Allan D.
2016-02-01
The objective of the paper is to present an empirical Bayesian method combined with Akaike's Bayesian Information Criterion (ABIC) to estimate the contaminant release history of a source in groundwater starting from a few concentration measurements in space and/or in time. From the Bayesian point of view, the ABIC considers prior information on the unknown function, such as the prior distribution (assumed Gaussian) and the covariance function. The unknown statistical quantities, such as the noise variance and the covariance function parameters, are computed through the process; moreover, the method also quantifies the estimation error through confidence intervals. The methodology was successfully tested on three test cases: the classic Skaggs and Kabala release function, three sharp releases (both cases regard transport in a one-dimensional homogeneous medium), and data collected from laboratory equipment that consists of a two-dimensional homogeneous unconfined aquifer. The performance of the method was tested with two different covariance functions (Gaussian and exponential) and also with large measurement error. The obtained results were discussed and compared to the geostatistical approach of Kitanidis (1995).
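The linear Bayesian estimate at the core of such source-reconstruction methods can be sketched as follows: recover a smooth release history s from sparse, noisy concentrations y = Hs + e using a Gaussian prior with an exponential covariance. The toy transport kernel H, the prior parameters, and the noise level here are illustrative assumptions, not the paper's setup, and the ABIC step that selects those parameters is omitted.

```python
import numpy as np

rng = np.random.default_rng(3)
nt, nobs = 60, 8
times = np.arange(nt, dtype=float)
s_true = np.exp(-0.5 * ((times - 20) / 4.0) ** 2)   # smooth release pulse

# toy transport response: each observation sees a lagged, smeared source
obs_t = np.linspace(25, 59, nobs)
H = np.exp(-0.5 * ((obs_t[:, None] - 10 - times[None, :]) / 6.0) ** 2)
H /= H.sum(axis=1, keepdims=True)
sigma = 0.01
y = H @ s_true + sigma * rng.standard_normal(nobs)

# exponential prior covariance; posterior mean is a kriging-type estimate
ell, var = 5.0, 1.0
Q = var * np.exp(-np.abs(times[:, None] - times[None, :]) / ell)
R = sigma ** 2 * np.eye(nobs)
s_hat = Q @ H.T @ np.linalg.solve(H @ Q @ H.T + R, y)
print("estimated release peaks near t =", times[np.argmax(s_hat)])
```

In the full method the covariance parameters and noise variance are not fixed a priori but selected by maximizing the ABIC marginal likelihood, which also yields the confidence intervals mentioned in the abstract.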
In this paper, we extend our earlier work on the reconstruction of the (time-averaged) one-dimensional velocity distribution of Galactic Weakly Interacting Massive Particles (WIMPs) and introduce a Bayesian fitting procedure for the theoretically predicted velocity distribution functions. In this reconstruction process, the (rough) velocity distribution reconstructed directly from raw data of direct Dark Matter detection experiments, i.e. measured recoil energies, with one or more different target materials, is used as ''reconstructed-input'' information. By assuming a fitting velocity distribution function and scanning the parameter space based on Bayesian analysis, the astronomical characteristic parameters, e.g. the Solar and Earth's Galactic velocities, can be pinned down as the output results.
Laplanche, Christophe
2012-11-01
The author describes and evaluates a Bayesian method to reconstruct three-dimensional toothed whale trajectories from a series of echolocation signals. Localization by using passive acoustic data (time of arrival of source signals at receptors) is assisted by using visual data (coordinates of the whale when diving and resurfacing) and tag information (movement statistics). The efficiency of the Bayesian method is compared to the standard minimum mean squared error statistical approach by comparing the reconstruction results of 48 simulated sperm whale (Physeter macrocephalus) trajectories. The use of the advanced Bayesian method reduces bias (standard deviation) with respect to the standard method up to a factor of 8.9 (13.6). The author provides open-source software which is functional with acoustic data which would be collected in the field from any three-dimensional receptor array design. This approach renews passive acoustics as a valuable tool to study the underwater behavior of toothed whales.
Modified gravity and its reconstruction from the universe expansion history
Nojiri, S; Nojiri, Shin'ichi; Odintsov, Sergei D.
2006-01-01
We develop the reconstruction program for a number of modified gravities: scalar-tensor theory, $f(R)$, $F(G)$ and string-inspired, scalar-Gauss-Bonnet gravity. The known (classical) universe expansion history is used for the explicit and successful reconstruction of some versions (of special form or with specific potentials) of all the above modified gravities. It is demonstrated that the cosmological sequence of matter dominance, deceleration-acceleration transition and acceleration era may always emerge as cosmological solutions of such theories. Moreover, the late-time dark energy FRW universe may have an approximate or exact $\\Lambda$CDM form consistent with the three-year WMAP data. The principal possibility of extending this reconstruction scheme to include the radiation-dominated era and inflation is briefly mentioned. Finally, it is indicated how even a modified gravity which does not describe the matter-dominated epoch may have such a solution before the acceleration era, at the price of the introduction of compensa...
Evolutionary History Reconstruction for Mammalian Complex Gene Clusters
Zhang, Yu; Song, Giltae; Vinař, Tomáš; Green, Eric D; Siepel, Adam; Miller, Webb
2009-01-01
Clusters of genes that evolved from single progenitors via repeated segmental duplications present significant challenges to the generation of a truly complete human genome sequence. Such clusters can confound both accurate sequence assembly and downstream computational analysis, yet they represent a hotbed of functional innovation, making them of extreme interest. We have developed an algorithm for reconstructing the evolutionary history of gene clusters using only human genomic sequence dat...
Reconstructing the modular recombination history of Staphylococcus aureus phages
Swenson, Krister M; Guertin, Paul; Deschênes, Hugo; Bergeron, Anne
2013-01-01
Background Viruses that infect bacteria, called phages, are well-known for their extreme mosaicism, in which an individual genome shares many different parts with many others. The mechanisms for creating these mosaics are largely unknown but are believed to be recombinations, either illegitimate, or partly homologous. In order to reconstruct the history of these recombinations, we need to identify the positions where recombinations may have occurred, and develop algorithms to generate and exp...
Reconstructing the history of dark energy using maximum entropy
Zunckel, Caroline; Trotta, Roberto
2007-09-01
We present a Bayesian technique based on a maximum-entropy method to reconstruct the dark energy equation of state (EOS) w(z) in a non-parametric way. This Maximum Entropy (MaxEnt) technique allows us to incorporate relevant prior information while adjusting the degree of smoothing of the reconstruction in response to the structure present in the data. After demonstrating the method on synthetic data, we apply it to current cosmological data, separately analysing Type Ia supernova measurements from the HST/GOODS programme and the first-year Supernovae Legacy Survey (SNLS), complemented by cosmic microwave background and baryonic acoustic oscillation data. We find that the SNLS data are compatible with w(z) = -1 at all redshifts 0 <= z <~ 1100, with error bars of order 20 per cent for the most constraining choice of priors and model. The HST/GOODS data exhibit a slight (about 1 sigma significance) preference for w > -1 at z ~ 0.5 and a drift towards w > -1 at larger redshifts which, however, is not robust with respect to changes in our prior specifications. We employ both a constant EOS prior model and a slowly varying w(z) and find that our conclusions are only mildly dependent on this choice at high redshifts. Our method highlights the danger of employing parametric fits for the unknown EOS, which can potentially miss or underestimate real structure in the data.
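As a rough illustration of entropy-regularized reconstruction (not the authors' implementation), one can minimize a chi-squared misfit plus an entropic penalty that pulls the solution toward a default model m; the weight alpha controls the degree of smoothing. The function name, data, and parameter values below are invented for the sketch.

```python
import math

def maxent_fit(y, sigma, m, alpha, steps=2000, lr=0.01):
    """Minimize chi^2/2 - alpha*S by crude gradient descent, where the
    entropy S = sum(p - m - p*log(p/m)) pulls p toward the default
    model m.  A small floor keeps p positive."""
    p = list(m)  # start from the default model
    for _ in range(steps):
        p = [max(pi - lr * ((pi - yi) / sigma**2 + alpha * math.log(pi / mi)),
                 1e-8)
             for pi, yi, mi in zip(p, y, m)]
    return p

y = [1.2, 0.8, 1.1, 0.9]   # noisy "data", e.g. binned w(z) estimates
m = [1.0] * 4              # flat default model
p = maxent_fit(y, sigma=0.2, m=m, alpha=5.0)
```

Larger alpha drives the reconstruction toward the default model; smaller alpha follows the data more closely, which is the smoothing trade-off the abstract describes.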
In order to improve the quality of 3D X-ray tomography reconstruction for Non-Destructive Testing (NDT), we investigate in this paper hierarchical Bayesian methods. In NDT, useful prior information on the volume, such as the limited number of materials or the presence of homogeneous areas, can be included in the iterative reconstruction algorithms. In hierarchical Bayesian methods, not only the volume is estimated thanks to the prior model of the volume, but also the hyperparameters of this prior. This additional complexity in the reconstruction methods, when applied to large volumes (from 512^3 to 8192^3 voxels), results in an increasing computational cost. To reduce it, the hierarchical Bayesian methods investigated in this paper lead to an algorithmic acceleration by Variational Bayesian Approximation (VBA) [1] and a hardware acceleration thanks to projection and back-projection operators parallelized on many-core processors such as GPUs [2]. In this paper, we consider a Student-t prior on the gradient of the image, implemented in a hierarchical way [3, 4, 1]. The operators H (forward or projection) and Ht (adjoint or back-projection) implemented on multi-GPU [2] have been used in this study. Different methods are evaluated on the synthetic 'Shepp and Logan' volume in terms of quality and time of reconstruction. We used several simple regularizations of order 1 and order 2. Other prior models also exist [5]. Sometimes, for a discrete image, we can perform segmentation and reconstruction at the same time; then the reconstruction can be done with fewer projections
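The Student-t prior on the image gradient mentioned above can be sketched in one dimension: a MAP denoising estimate penalizes finite differences with the heavy-tailed Student-t log-density, which smooths noise while tolerating sharp jumps. This toy gradient-descent version (hypothetical signal, parameters, and step size; the paper itself uses VBA and GPU-parallelized projectors) only shows the shape of the objective.

```python
import math, random

def studentt_map_denoise(y, sigma=0.2, nu=3.0, s=0.1, lr=0.003, steps=3000):
    """MAP estimate with a Gaussian likelihood and a Student-t prior
    on the signal gradient (finite differences), via gradient descent."""
    x = list(y)
    for _ in range(steps):
        g = [(xi - yi) / sigma**2 for xi, yi in zip(x, y)]
        for k in range(len(x) - 1):
            d = x[k + 1] - x[k]
            psi = (nu + 1.0) * d / (nu * s**2 + d * d)  # derivative of t log-density
            g[k] -= psi
            g[k + 1] += psi
        x = [xi - lr * gi for xi, gi in zip(x, g)]
    return x

def objective(x, y, sigma=0.2, nu=3.0, s=0.1):
    data = sum((xi - yi) ** 2 for xi, yi in zip(x, y)) / (2 * sigma**2)
    prior = sum((nu + 1.0) / 2.0
                * math.log(1.0 + (x[k + 1] - x[k]) ** 2 / (nu * s**2))
                for k in range(len(x) - 1))
    return data + prior

random.seed(0)
truth = [0.0] * 10 + [1.0] * 10                   # a single sharp edge
y = [t + random.gauss(0.0, 0.2) for t in truth]   # noisy observation
x = studentt_map_denoise(y)
```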
Bai, Ying; Lan, JieQin; Gao, WeiWei
2016-01-01
A toy detector array has been designed to simulate the detection of cosmic rays in Extensive Air Shower (EAS) experiments for ground-based TeV astrophysics. The primary energies of protons from the Monte Carlo simulation have been reconstructed with the algorithm of Bayesian neural networks (BNNs) and with a standard method like that of the LHAASO experiment, respectively. The result of the energy reconstruction using BNNs has been compared with that of the standard method. Compared to the standard method, the energy resolutions are significantly improved using BNNs, and the improvement is more pronounced for high-energy protons than for low-energy ones.
Benchmarking the Bayesian reconstruction of the non-perturbative heavy $Q\\bar{Q}$ potential
Burnier, Yannis
2013-01-01
The extraction of the finite temperature heavy quark potential from lattice QCD relies on a spectral analysis of the real-time Wilson loop. Through its position and shape, the lowest-lying spectral peak encodes the real and imaginary parts of this complex potential. We benchmark this extraction strategy using leading-order hard-thermal-loop (HTL) calculations; i.e., we analytically calculate the Wilson loop and determine the corresponding spectrum. By fitting its lowest-lying peak we obtain the real and imaginary parts and confirm that knowledge of the lowest peak alone is sufficient for obtaining the potential. We deploy a novel Bayesian approach to the reconstruction of spectral functions on HTL correlators in Euclidean time and observe how well the known spectral function and the values of the real and imaginary parts are reproduced. Finally, we apply the method to quenched lattice QCD data and perform an improved estimate of both the real and imaginary parts of the non-perturbative heavy $Q\\bar{Q}$ potential.
Chan, M.T. [Univ. of Southern California, Los Angeles, CA (United States); Herman, G.T. [Univ. of Pennsylvania, Philadelphia, PA (United States); Levitan, E. [Technion, Haifa (Israel)
1996-12-31
We demonstrate that (i) classical methods of image reconstruction from projections can be improved upon by considering the output of such a method as a distorted version of the original image and applying a Bayesian approach to estimate from it the original image (based on a model of distortion and on a Gibbs distribution as the prior), and (ii) by selecting an "image-modeling" prior distribution (i.e., one such that a random sample from it is likely to share important characteristics of the images of the application area) one can improve over another Gibbs prior formulated using only pairwise interactions. We illustrate our approach using simulated Positron Emission Tomography (PET) data from realistic brain phantoms. Since algorithm performance ultimately depends on the diagnostic task being performed, we examine a number of different medically relevant figures of merit to give a fair comparison. Based on a training-and-testing evaluation strategy, we demonstrate that statistically significant improvements can be obtained using the proposed approach.
Werhli, Adriano V; Husmeier, Dirk
2008-06-01
There have been various attempts to improve the reconstruction of gene regulatory networks from microarray data by the systematic integration of biological prior knowledge. Our approach is based on pioneering work by Imoto et al. where the prior knowledge is expressed in terms of energy functions, from which a prior distribution over network structures is obtained in the form of a Gibbs distribution. The hyperparameters of this distribution represent the weights associated with the prior knowledge relative to the data. We have derived and tested a Markov chain Monte Carlo (MCMC) scheme for sampling networks and hyperparameters simultaneously from the posterior distribution, thereby automatically learning how to trade off information from the prior knowledge and the data. We have extended this approach to a Bayesian coupling scheme for learning gene regulatory networks from a combination of related data sets, which were obtained under different experimental conditions and are therefore potentially associated with different active subpathways. The proposed coupling scheme is a compromise between (1) learning networks from the different subsets separately, whereby no information between the different experiments is shared; and (2) learning networks from a monolithic fusion of the individual data sets, which does not provide any mechanism for uncovering differences between the network structures associated with the different experimental conditions. We have assessed the viability of all proposed methods on data related to the Raf signaling pathway, generated both synthetically and in cytometry experiments. PMID:18574862
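The idea of sampling parameters from a posterior with MCMC, as in the scheme described above, can be shown with a generic Metropolis-Hastings sketch on a toy one-dimensional posterior (here, the mean of Gaussian observations with known noise). This is not the authors' network-and-hyperparameter sampler; all data and tuning constants are invented.

```python
import math, random

def metropolis(logpost, x0, prop_sd, n_samples, burn_in, seed=0):
    """Random-walk Metropolis-Hastings: propose x' ~ N(x, prop_sd),
    accept with probability min(1, post(x')/post(x))."""
    rng = random.Random(seed)
    x, lp = x0, logpost(x0)
    samples, accepted = [], 0
    for i in range(n_samples + burn_in):
        prop = x + rng.gauss(0.0, prop_sd)
        lp_prop = logpost(prop)
        if math.log(rng.random()) < lp_prop - lp:
            x, lp = prop, lp_prop
            accepted += 1
        if i >= burn_in:
            samples.append(x)
    return samples, accepted / (n_samples + burn_in)

data = [1.0, 1.2, 0.8, 1.1, 0.9]
sigma = 0.5  # known observation noise
logpost = lambda mu: -sum((y - mu) ** 2 for y in data) / (2 * sigma ** 2)
samples, acc = metropolis(logpost, x0=0.0, prop_sd=0.5,
                          n_samples=20000, burn_in=2000)
```

In the paper's setting the state would be a network structure plus hyperparameters rather than a scalar, but the accept/reject mechanics are the same.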
Van Nguyen, Linh; Chainais, Pierre
2015-01-01
The study of turbulent flows calls for measurements with high resolution in both space and time. We propose a new approach to reconstruct high-temporal-high-spatial-resolution velocity fields by combining two sources of information that are well resolved either in space or in time: the Low-Temporal-High-Spatial (LTHS) and the High-Temporal-Low-Spatial (HTLS) resolution measurements. In the framework of co-design between sensing and data post-processing, this work extensively investigates a Bayesian reconstruction approach using a simulated database. A Bayesian fusion model is developed to solve the inverse problem of data reconstruction. The model uses a Maximum A Posteriori estimate, which yields the most probable field given the measurements. The DNS of a wall-bounded turbulent flow at moderate Reynolds number is used to validate and assess the performance of the present approach. Low-resolution measurements are subsampled in time and space from the fully resolved data. Reconstructed velocities ar...
Reconstructing the invasion history of Heracleum persicum (Apiaceae) into Europe
Rijal, D. P.; Alm, T.; Jahodová, Šárka; Stenoien, H. K.; Alsos, I. G.
2015-01-01
Vol. 24, No. 22 (2015), pp. 5522-5543. ISSN 0962-1083. Keywords: approximate Bayesian computation; genetic variation; population genetics.
Reconstructing the Population Genetic History of the Caribbean
Moreno-Estrada, Andrés; Gravel, Simon; Zakharia, Fouad; McCauley, Jacob L.; Byrnes, Jake K.; Gignoux, Christopher R.; Ortiz-Tello, Patricia A.; Martínez, Ricardo J.; Hedges, Dale J.; Morris, Richard W.; Eng, Celeste; Sandoval, Karla; Acevedo-Acevedo, Suehelay; Norman, Paul J.; Layrisse, Zulay; Parham, Peter; Martínez-Cruzado, Juan Carlos; Burchard, Esteban González; Cuccaro, Michael L.; Martin, Eden R.; Bustamante, Carlos D.
2013-01-01
The Caribbean basin is home to some of the most complex interactions in recent history among previously diverged human populations. Here, we investigate the population genetic history of this region by characterizing patterns of genome-wide variation among 330 individuals from three of the Greater Antilles (Cuba, Puerto Rico, Hispaniola), two mainland (Honduras, Colombia), and three Native South American (Yukpa, Bari, and Warao) populations. We combine these data with a unique database of genomic variation in over 3,000 individuals from diverse European, African, and Native American populations. We use local ancestry inference and tract length distributions to test different demographic scenarios for the pre- and post-colonial history of the region. We develop a novel ancestry-specific PCA (ASPCA) method to reconstruct the sub-continental origin of Native American, European, and African haplotypes from admixed genomes. We find that the most likely source of the indigenous ancestry in Caribbean islanders is a Native South American component shared among inland Amazonian tribes, Central America, and the Yucatan peninsula, suggesting extensive gene flow across the Caribbean in pre-Columbian times. We find evidence of two pulses of African migration. The first pulse—which today is reflected by shorter, older ancestry tracts—consists of a genetic component more similar to coastal West African regions involved in early stages of the trans-Atlantic slave trade. The second pulse—reflected by longer, younger tracts—is more similar to present-day West-Central African populations, supporting historical records of later transatlantic deportation. Surprisingly, we also identify a Latino-specific European component that has significantly diverged from its parental Iberian source populations, presumably as a result of small European founder population size. We demonstrate that the ancestral components in admixed genomes can be traced back to distinct sub
Cai, C. [CEA, LIST, 91191 Gif-sur-Yvette, France and CNRS, SUPELEC, UNIV PARIS SUD, L2S, 3 rue Joliot-Curie, 91192 Gif-sur-Yvette (France); Rodet, T.; Mohammad-Djafari, A. [CNRS, SUPELEC, UNIV PARIS SUD, L2S, 3 rue Joliot-Curie, 91192 Gif-sur-Yvette (France); Legoupil, S. [CEA, LIST, 91191 Gif-sur-Yvette (France)
2013-11-15
Purpose: Dual-energy computed tomography (DECT) makes it possible to obtain two fractions of basis materials without segmentation: the soft-tissue-equivalent water fraction and the hard-matter-equivalent bone fraction. Practical DECT measurements are usually obtained with polychromatic x-ray beams. Existing reconstruction approaches based on linear forward models that do not account for beam polychromaticity fail to estimate the correct decomposition fractions and result in beam-hardening artifacts (BHA). The existing BHA correction approaches either need to refer to calibration measurements or suffer from the noise amplification caused by the negative-log preprocessing and the ill-conditioned water and bone separation problem. To overcome these problems, statistical DECT reconstruction approaches based on nonlinear forward models that account for beam polychromaticity show great potential for giving accurate fraction images. Methods: This work proposes a full-spectral Bayesian reconstruction approach which allows the reconstruction of high-quality fraction images from ordinary polychromatic measurements. This approach is based on a Gaussian noise model with unknown variance assigned directly to the projections, without taking the negative log. Following Bayesian inference, the decomposition fractions and observation variance are estimated by using the joint maximum a posteriori (MAP) estimation method. Subject to an adaptive prior model assigned to the variance, the joint estimation problem is simplified into a single estimation problem; this transforms the joint MAP estimation problem into a minimization problem with a nonquadratic cost function. To solve it, the use of a monotone conjugate gradient algorithm with suboptimal descent steps is proposed. Results: The performance of the proposed approach is analyzed with both simulated and experimental data. The results show that the proposed Bayesian approach is robust to noise and materials. It is also
Bai, Y.; Xu, Y.; Pan, J.; Lan, J. Q.; Gao, W. W.
2016-07-01
A toy detector array is designed to detect showers generated by the interaction between TeV cosmic rays and the atmosphere. In the present paper, the primary energies of showers detected by the array are reconstructed with the algorithm of Bayesian neural networks (BNNs) and a standard method like that of the LHAASO experiment [1], respectively. Compared to the standard method, the energy resolutions are significantly improved using the BNNs, and the improvement is more pronounced for high-energy showers than for low-energy ones.
Shiraki, Yoshifumi; Kabashima, Yoshiyuki
2016-06-01
A signal model called joint sparse model 2 (JSM-2), or the multiple measurement vector problem, in which all sparse signals share their support, is important for dealing with practical signal processing problems. In this paper, we investigate the typical reconstruction performance of noisy measurement JSM-2 problems for $\\ell_{2,1}$-norm regularized least-squares reconstruction and for the Bayesian optimal reconstruction scheme in terms of mean square error. Employing the replica method, we show that these schemes, which exploit the knowledge of the shared signal support, can recover the signals more precisely as the number of channels increases. In addition, we compare the reconstruction performance of two different ensembles of observation matrices: one is composed of independent and identically distributed random Gaussian entries, and the other is designed so that row vectors are orthogonal to one another. As reported for the single-channel case in earlier studies, our analysis indicates that the latter ensemble offers better performance than the former for the noisy JSM-2 problem. The results of numerical experiments with a computationally feasible approximation algorithm we developed for this study agree with the theoretical estimation.
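The row sparsity induced by the $\ell_{2,1}$ norm is easiest to see in its proximal operator, which soft-thresholds each row of a multi-channel signal by its Euclidean norm, driving whole rows (the shared support across channels) to zero. A quick sketch with a hypothetical toy matrix:

```python
import math

def prox_l21(rows, lam):
    """Proximal operator of lam * sum_j ||row_j||_2 (group soft-
    thresholding): each row is shrunk by lam in Euclidean norm,
    and rows with norm below lam are set to zero."""
    out = []
    for row in rows:
        norm = math.sqrt(sum(v * v for v in row))
        scale = max(0.0, 1.0 - lam / norm) if norm > 0 else 0.0
        out.append([scale * v for v in row])
    return out

V = [[3.0, 4.0],    # row norm 5   -> shrunk but kept
     [0.3, 0.4]]    # row norm 0.5 -> zeroed entirely
W = prox_l21(V, lam=1.0)
```

This operator is the building block of proximal-gradient solvers for the $\ell_{2,1}$-regularized least-squares problem analysed in the abstract.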
Grzegorczyk, M.; Husmeier, D.
2009-01-01
Feedback loops and recurrent structures are essential to the regulation and stable control of complex biological systems. The application of dynamic as opposed to static Bayesian networks is promising in that, in principle, these feedback loops can be learned. However, we show that the widely applied BGe score is susceptible to learning spurious feedback loops, which are a consequence of non-linear regulation and autocorrelation in the data. We propose a non-linear generalisation of the BGe m...
Reconstruction of large-scale gene regulatory networks using Bayesian model averaging.
Kim, Haseong; Gelenbe, Erol
2012-09-01
Gene regulatory networks provide a systematic view of molecular interactions in a complex living system. However, constructing large-scale gene regulatory networks is one of the most challenging problems in systems biology, and large bursts of biological data require a proper integration technique for reliable gene regulatory network construction. Here we present a new reverse engineering approach based on Bayesian model averaging which attempts to combine all the appropriate models describing interactions among genes. This Bayesian approach, with a prior based on the Gibbs distribution, provides an efficient means to integrate multiple sources of biological data. In a simulation study with a maximum of 2000 genes, our method shows better sensitivity than previous elastic-net and Gaussian graphical models, at a fixed specificity of 0.99. The study also shows that the proposed method outperforms the other standard methods on a DREAM dataset generated by nonlinear stochastic models. In brain tumor data analysis, three large-scale networks consisting of 4422 genes were built using the gene expression of non-tumor, low-grade, and high-grade tumor mRNA expression samples, along with DNA-protein binding affinity information. We found that genes having a large variation of degree distribution among the three tumor networks are the ones most involved in regulatory and developmental processes, which possibly gives novel insight beyond conventional differentially expressed gene analysis. PMID:22987132
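Bayesian model averaging in its simplest form weights candidate models by approximate posterior probabilities; a common sketch uses BIC scores, with w_k proportional to exp(-BIC_k/2) under equal prior model probabilities. The scores below are made up, and the paper's Gibbs-prior data integration is considerably more elaborate than this illustration.

```python
import math

def bma_weights(bic_scores):
    """Approximate posterior model probabilities from BIC scores:
    w_k ∝ exp(-BIC_k / 2), assuming equal prior model probabilities.
    Subtracting the best score first avoids numerical underflow."""
    best = min(bic_scores)
    raw = [math.exp(-(b - best) / 2.0) for b in bic_scores]
    total = sum(raw)
    return [r / total for r in raw]

# Three hypothetical candidate regulatory models and their BIC scores
weights = bma_weights([102.3, 100.0, 110.7])
```

An averaged prediction is then the weight-sum of the individual models' predictions, rather than a winner-take-all choice.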
Accurate Reconstruction of Insertion-Deletion Histories by Statistical Phylogenetics
Westesson, O; Lunter, G.; Paten, B; Holmes, I
2012-01-01
The Multiple Sequence Alignment (MSA) is a computational abstraction that represents a partial summary either of indel history, or of structural similarity. Taking the former view (indel history), it is possible to use formal automata theory to generalize the phylogenetic likelihood framework for finite substitution models (Dayhoff's probability matrices and Felsenstein's pruning algorithm) to arbitrary-length sequences. In this paper, we report results of a simulation-based benchmark of seve...
Hierarchical Bayesian Model for Simultaneous EEG Source and Forward Model Reconstruction (SOFOMORE)
Stahlhut, Carsten; Mørup, Morten; Winther, Ole; Hansen, Lars Kai
In this paper we propose an approach to handle forward model uncertainty for EEG source reconstruction. A stochastic forward model is motivated by the many uncertain contributions that form the forward propagation model including the tissue conductivity distribution, the cortical surface, and...
Reconstructing a School's Past Using Oral Histories and GIS Mapping.
Alibrandi, Marsha; Beal, Candy; Thompson, Ann; Wilson, Anna
2000-01-01
Describes an interdisciplinary project that incorporated language arts, social studies, instructional technology, and science where middle school students were involved in oral history, Geographic Information System (GIS) mapping, architectural research, the science of dendrochronology, and the creation of an archival school Web site. (CMK)
Wahl, E. R.
2008-12-01
A strict process model for pollen as a climate proxy is currently not approachable beyond localized spatial scales; more generally, the canonical model for vegetation-pollen registration itself requires assimilation of empirically-derived information. In this paper, a taxonomically "reduced-space" climate-pollen forward model is developed, based on the performance of a parallel inverse model. The goal is inclusion of the forward model in a Bayesian climate reconstruction framework, following a 4-step process. (1) Ratios of pollen types calibrated to temperature are examined to determine if they can equal or surpass the skill of multi-taxonomic calibrations using the modern analog technique (MAT) optimized with receiver operating characteristic (ROC) analysis. The first phase of this examination, using modern pollen data from SW N America, demonstrates that the ratio method can give calibrations as skillful as the MAT when vegetation representation (and associated climate gradients) are characterized by two dominant pollen taxa, in this case pine and oak. Paleotemperature reconstructions using the ratio method also compare well to MAT reconstructions, showing very minor differences. [Ratio values are defined as pine/(pine + oak), so they vary between 0 and 1.] (2) Uncertainty analysis is carried out in independent steps, which are combined to give overall probabilistic confidence ranges. Monte Carlo (MC) analysis utilizing Poisson distributions to model the inherent variability of pollen representation in relation to climate (assuming defined temperature normals at the modern calibration sites) allows independent statistical estimation of this component of uncertainty, for both the modern calibration and fossil pollen data sets. In turn, MC analysis utilizing normal distributions allows independent estimation of the addition to overall uncertainty from climate variation itself. (3) Because the quality tests in (1) indicate the ratio method has the capacity to carry
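Step (1)'s ratio values, pine/(pine + oak), and step (2)'s Poisson Monte Carlo can be sketched together: draw Poisson-perturbed counts, recompute the ratio, and read off empirical confidence bounds. The counts and the simple Poisson sampler here are illustrative, not the paper's calibration.

```python
import math, random

def pollen_ratio_ci(pine, oak, n_draws=20000, seed=1):
    """Point estimate r = pine/(pine+oak) plus an empirical 95% range,
    modelling count variability as Poisson (cf. step 2 of the text)."""
    rng = random.Random(seed)

    def poisson(lam):
        # Knuth's multiplication method; adequate for moderate means
        limit, k, prod = math.exp(-lam), 0, rng.random()
        while prod > limit:
            k += 1
            prod *= rng.random()
        return k

    draws = []
    for _ in range(n_draws):
        p, o = poisson(pine), poisson(oak)
        if p + o > 0:
            draws.append(p / (p + o))
    draws.sort()
    return (pine / (pine + oak),
            draws[int(0.025 * len(draws))],
            draws[int(0.975 * len(draws))])

ratio, lo, hi = pollen_ratio_ci(60, 40)
```

By construction the ratio lies in [0, 1], and the Monte Carlo bounds capture the count-level uncertainty that the text combines with the climate-variation term.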
Holocene fire history reconstruction using Tibetan lacustrine sediments
Callegaro, Alice; Kirchgeorg, Torben; Battistel, Dario; Bird, Broxton; Barbante, Carlo
2016-04-01
The important role that biomass burning plays in influencing the Holocene climate is still under discussion. The present work gives information about past biomass burning events in the Tibetan Plateau and helps to increase the understanding of the interaction between climate, humans and fire activity during the Holocene. The Asian region is one of the centers of the advent of agriculture and pastoralism, and it is a strategic area for understanding the interaction between humans and fire during the Holocene. We reconstructed past biomass burning events and vegetation from sediments collected from Lake Paru Co, a small moraine-dammed lake located in the Tibetan Plateau at 4845 m above sea level. We extracted lake sediment samples by accelerated solvent extraction and analysed different organic molecular proxies by GC-MS and IC-MS. We used monosaccharide anhydrides, levoglucosan and its isomers, as proxies for biomass burning. These are specific molecular markers originating from the pyrolysis of cellulose; they indicate significant fire events and changes in the burned fuel. Furthermore, we analysed polycyclic aromatic hydrocarbons (PAHs) as additional combustion proxies. For a better understanding of changes in vegetation and of human habitation at the lake shore, we analysed n-alkanes and sterols. Comparing the data of this multi-proxy approach with climatic and meteorological literature data, reconstruction and contextualization of past fire events are possible: we can see the agreement between dry climate periods and the presence of more intense fire events, especially in the Early Holocene.
Free Radicals in Organic Matter for Thermal History Reconstruction of Carbonate Succession
[Anonymous]
2007-01-01
Geothermometry is one of the most useful methods to reconstruct the thermal history of sedimentary basins. This paper introduces the application of the free-radical concentration of organic matter as a thermal indicator in the thermal history reconstruction of carbonate successions, based on anhydrous thermal simulation results for type I and type II1 kerogen. A series of free-radical data were obtained under thermal simulation at different heating temperatures and times, and quantitative models between the free-radical concentration (Ng) of organic matter and the time-temperature index (TTI) for type I and type II1 kerogen were also obtained. This Ng-TTI relation was used to model the Ordovician thermal gradients of Well TZ12 in the Tarim Basin. The modeling result is consistent with the results obtained from apatite fission track data and published data. This new method of thermal history reconstruction will benefit hydrocarbon generation and accumulation studies and resource assessment of carbonate successions.
Moriya, Toshio; Acar, Erman; Cheng, R Holland; Ruotsalainen, Ulla
2015-09-01
In single particle reconstruction, the initial 3D structure often suffers from limited angular sampling artifacts. Selecting 2D class averages of particle images generally improves the accuracy and efficiency of the reference-free 3D angle estimation, but causes insufficient angular sampling to fill in the information of the target object in 3D frequency space. Similarly, the initial 3D structure from random-conical tilt reconstruction has the well-known "missing cone" artifact. Here, we attempted to solve the limited angular sampling problem by sequentially applying a maximum a posteriori estimate with an expectation-maximization algorithm (sMAP-EM). Using both simulated and experimental cryo-electron microscope images, sMAP-EM was compared to the direct Fourier method on the basis of reconstruction error and resolution. To establish selection criteria for the final regularization weight of the sMAP-EM, the effects of noise level and sampling sparseness on the reconstructions were examined with evenly distributed sampling simulations. The frequency information filled in the missing cone of the conical tilt sampling simulations was assessed by developing new quantitative measurements. All the results of visual and numerical evaluations showed that the sMAP-EM performed better than the direct Fourier method, regardless of the sampling method, noise level, and sampling sparseness. Furthermore, the frequency domain analysis demonstrated that the sMAP-EM can fill in meaningful information in the unmeasured angular space without detailed a priori knowledge of the objects. The current research demonstrated that the sMAP-EM has high potential to facilitate the determination of 3D protein structures at near atomic resolution. PMID:26193484
MAGIC: Exact Bayesian Covariance Estimation and Signal Reconstruction for Gaussian Random Fields
Wandelt, Benjamin D.
2004-01-01
In this talk I describe MAGIC, an efficient approach to covariance estimation and signal reconstruction for Gaussian random fields (MAGIC Allows Global Inference of Covariance). It solves a long-standing problem in the field of cosmic microwave background (CMB) data analysis but is in fact a general technique that can be applied to noisy, contaminated and incomplete or censored measurements of either spatial or temporal Gaussian random fields. In this talk I will phrase the method in a way th...
Collage as a Way to Reconstruct the History of Education
Elena Penskaja
2012-01-01
Elena Penskaja, D.Sc. in Philology, Dean of the Faculty of Philology, Head of the Literature Department, National Research University - Higher School of Economics, Moscow, Russian Federation. Email: The paper reviews the methods used to describe historical processes in education and analyzes examples of historical material falsification. The lack of qualitative studies in the history of Russian education is a great hindrance to educational reforms. The author identifies and descr...
Reconstructing the history of structure formation using redshift distortions
Song, Yong-Seon; Percival, Will
2008-01-01
Measuring the statistics of galaxy peculiar velocities using redshift-space distortions is an excellent way of probing the history of structure formation. Because galaxies are expected to act as test particles within the flow of matter, this method avoids uncertainties due to an unknown galaxy density bias. We show that the parameter combination measured by redshift-space distortions, fσ8^mass, provides a good test of dark energy models, even without the knowledge of bias or σ...
Bayesian approach to spectral function reconstruction for Euclidean quantum field theories.
Burnier, Yannis; Rothkopf, Alexander
2013-11-01
We present a novel approach to the inference of spectral functions from Euclidean time correlator data that makes close contact with modern Bayesian concepts. Our method differs significantly from the maximum entropy method (MEM). A new set of axioms is postulated for the prior probability, leading to an improved expression, which is devoid of the asymptotically flat directions present in the Shannon-Jaynes entropy. Hyperparameters are integrated out explicitly, liberating us from the Gaussian approximations underlying the evidence approach of the maximum entropy method. We present a realistic test of our method in the context of the nonperturbative extraction of the heavy quark potential. Based on hard-thermal-loop correlator mock data, we establish firm requirements in the number of data points and their accuracy for a successful extraction of the potential from lattice QCD. Finally we reinvestigate quenched lattice QCD correlators from a previous study and provide an improved potential estimation at T = 2.33 Tc. PMID:24237510
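The underlying inverse problem, recovering a spectral function ρ(ω) from correlator data D(τ) = ∫ dω K(τ,ω) ρ(ω), can be made concrete with a deliberately generic Tikhonov-regularized solve. This is not the Bayesian method of the paper (whose prior replaces such generic regulators); the zero-temperature kernel K(τ,ω) = exp(-ωτ), the grids, the mock spectral peak, and the regularization weight below are all illustrative assumptions.

```python
import math

def solve(M, b):
    """Gaussian elimination with partial pivoting for a small dense system."""
    n = len(b)
    M = [row[:] + [b[i]] for i, row in enumerate(M)]
    for c in range(n):
        piv = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[piv] = M[piv], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

taus = [0.1 * (i + 1) for i in range(8)]      # Euclidean times (toy grid)
omegas = [0.5 * (j + 1) for j in range(8)]    # frequency grid
dw = 0.5
K = [[math.exp(-w * t) * dw for w in omegas] for t in taus]
rho_true = [math.exp(-((w - 2.0) ** 2) / 0.5) for w in omegas]   # mock peak
D = [sum(K[i][j] * rho_true[j] for j in range(8)) for i in range(8)]

# Ridge / Tikhonov regularization: (K^T K + alpha I) rho = K^T D
alpha = 1e-4
KtK = [[sum(K[i][a] * K[i][b] for i in range(8)) + (alpha if a == b else 0.0)
        for b in range(8)] for a in range(8)]
KtD = [sum(K[i][a] * D[i] for i in range(8)) for a in range(8)]
rho_est = solve(KtK, KtD)
D_fit = [sum(K[i][j] * rho_est[j] for j in range(8)) for i in range(8)]
```

The exponential kernel makes this system badly conditioned, which is exactly why the choice of regulator (here a crude ridge term, in the paper a dedicated prior) dominates the quality of the reconstruction.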
Lattice NRQCD study on in-medium bottomonium spectra using a novel Bayesian reconstruction approach
We present recent results on the in-medium modification of S- and P-wave bottomonium states around the deconfinement transition. Our study uses lattice QCD with Nf = 2 + 1 light quark flavors to describe the non-perturbative thermal QCD medium between 140 MeV < T < 249 MeV, and deploys lattice-regularized non-relativistic QCD (NRQCD) effective field theory to capture the physics of heavy quark bound states immersed therein. The spectral functions of the ³S₁ (ϒ) and ³P₁ (χb1) bottomonium states are extracted from Euclidean time Monte Carlo simulations using a novel Bayesian prescription, which provides higher accuracy than the Maximum Entropy Method. Based on a systematic comparison of interacting and free spectral functions, we conclude that the ground states of both the S-wave (ϒ) and P-wave (χb1) channels survive up to T = 249 MeV. Stringent upper limits on the size of the in-medium modification of bottomonium masses and widths are provided.
Sujit K. Sahu
2005-01-01
In response to the drought experienced in Southern Italy, a rain seeding project was set up and developed during the years 1989-1994. The initiative was taken with the purpose of applying existing methods of rain enhancement technology to regions of southern Italy, including Puglia. The aim of this study is to provide statistical support for the evaluation of the experimental part of the project. In particular, our aim is to reconstruct rainfall fields by combining two data sources: rainfall intensity as measured by ground raingauges, and radar reflectivity. A difficulty in modeling the rainfall data here comes from the rounding of many of the recorded raingauge measurements. The rounding of the rainfall measurements makes the data essentially discrete, and models based on continuous distributions are not suitable for these discrete data. In this study we extend two recently developed spatio-temporal models for continuous data to accommodate rounded rainfall measurements taking discrete values with positive probabilities. We use MCMC methods to implement the models and obtain forecasts in space and time together with their standard errors. We compare the two models using predictive Bayesian methods. The benefits of our modeling extensions are seen in accurate predictions of dry periods with no positive prediction standard errors.
Tilley, Alexander; López-Angarita, Juliana; Turner, John R
2013-01-01
The trophic ecology of epibenthic mesopredators is not well understood in terms of prey partitioning with sympatric elasmobranchs or their effects on prey communities, yet the importance of omnivores in community trophic dynamics is being increasingly realised. This study used stable isotope analysis of ¹⁵N and ¹³C to model diet composition of wild southern stingrays Dasyatis americana and compare trophic niche space to nurse sharks Ginglymostoma cirratum and Caribbean reef sharks Carcharhinus perezi on Glovers Reef Atoll, Belize. Bayesian stable isotope mixing models were used to investigate prey choice as well as viable diet-tissue discrimination factors for use with stingrays. Stingray δ¹⁵N values showed the greatest variation and a positive relationship with size, with an isotopic niche width approximately twice that of sympatric species. Shark species exhibited comparatively restricted δ¹⁵N values and greater δ¹³C variation, with very little overlap of stingray niche space. Mixing models suggest bivalves and annelids are proportionally more important prey in the stingray diet than crustaceans and teleosts at Glovers Reef, in contrast to all but one published diet study using stomach contents from other locations. Incorporating gut contents information from the literature, we suggest diet-tissue discrimination factor values of Δ¹⁵N ≈ 2.7‰ and Δ¹³C ≈ 0.9‰ for stingrays in the absence of validation experiments. The wide trophic niche and lower trophic level exhibited by stingrays compared to sympatric sharks supports their putative role as important base stabilisers in benthic systems, with the potential to absorb trophic perturbations through numerous opportunistic prey interactions. PMID:24236144
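The accept/reject core of a Bayesian isotope mixing model can be sketched as follows: draw source proportions from a flat Dirichlet prior, predict the consumer signature after adding discrimination factors, and keep draws consistent with the observation. The source signatures and true proportions below are invented; only the discrimination factors Δ¹⁵N ≈ 2.7‰ and Δ¹³C ≈ 0.9‰ come from the abstract, and real analyses use dedicated mixing-model software.

```python
import random

random.seed(1)
# Hypothetical source signatures (per mil) for three prey groups
dN = [6.0, 10.0, 14.0]        # source delta-15N
dC = [-20.0, -13.0, -12.0]    # source delta-13C
TDF_N, TDF_C = 2.7, 0.9       # discrimination factors from the abstract
p_true = [0.5, 0.3, 0.2]      # invented "true" diet proportions
obs_N = sum(p * d for p, d in zip(p_true, dN)) + TDF_N   # 11.5
obs_C = sum(p * d for p, d in zip(p_true, dC)) + TDF_C   # -15.4

accepted = []
for _ in range(100000):
    g = [random.gammavariate(1.0, 1.0) for _ in range(3)]
    s = sum(g)
    p = [v / s for v in g]                 # flat Dirichlet draw on the simplex
    pred_N = sum(pi * d for pi, d in zip(p, dN)) + TDF_N
    pred_C = sum(pi * d for pi, d in zip(p, dC)) + TDF_C
    if abs(pred_N - obs_N) < 0.4 and abs(pred_C - obs_C) < 0.4:
        accepted.append(p)

post_mean = [sum(p[k] for p in accepted) / len(accepted) for k in range(3)]
```

With two tracers and three sources the observation pins down the proportions up to the tolerance band, so the posterior mean lands near the generating mixture.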
Chang, Joshua C; Chou, Tom
2015-01-01
Quantifying the forces between and within macromolecules is a necessary first step in understanding the mechanics of molecular structure, protein folding, and enzyme function and performance. In such macromolecular settings, dynamic single-molecule force spectroscopy (DFS) has been used to distort bonds. The resulting responses, in the form of rupture forces, work applied, and trajectories of displacements, have been used to reconstruct bond potentials. Such approaches often rely on simple parameterizations of one-dimensional bond potentials, assumptions on equilibrium starting states, and/or large amounts of trajectory data. Parametric approaches typically fail at inferring complex-shaped bond potentials with multiple minima, while piecewise estimation may not guarantee smooth results with the appropriate behavior at large distances. Existing techniques, particularly those based on work theorems, also do not address spatial variations in the diffusivity that may arise from spatially inhomogeneous coupling to...
Niaz, Mansoor; Klassen, Stephen; McMillan, Barbara; Metz, Don
2010-01-01
The photoelectric effect is an important part of general physics textbooks. To study the presentation of this phenomenon, we have reconstructed six essential, history and philosophy of science (HPS)-related aspects of the events that culminated in Einstein proposing his hypothesis of light quanta and the ensuing controversy within the scientific…
Reconstruction to Progressivism: Booklet 3. Critical Thinking in American History. [Student Manual].
O'Reilly, Kevin
One of a series of curriculum materials in U.S. history designed to teach critical thinking skills systematically, this teacher's guide presents high school students with supplementary lessons on the Reconstruction period, industrialism, labor and immigration, and progressivism and populism. The student booklet begins with a guide to critical…
O'Reilly, Kevin
One of a series of curriculum materials in U.S. history designed to teach critical thinking skills systematically, this teacher's guide presents supplementary lesson plans for teaching high school students about the Reconstruction period, industrialism, labor and immigration, and progressivism and populism. The booklet begins with a guide to…
Whole-History Rating: A Bayesian Rating System for Players of Time-Varying Strength
Coulom, Rémi
2008-01-01
Whole-History Rating (WHR) is a new method to estimate the time-varying strengths of players involved in paired comparisons. Like many variations of the Elo rating system, the whole-history approach is based on the dynamic Bradley-Terry model. But, instead of using incremental approximations, WHR directly computes the exact maximum a posteriori over the whole rating history of all players. This additional accuracy comes at a higher computational cost than traditional methods, but computation ...
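WHR builds on the Bradley-Terry model, in which P(i beats j) = 1/(1 + exp(-(r_i - r_j))). A minimal static sketch, with invented match counts, plain gradient ascent, and a weak Gaussian prior to pin down the scale, illustrates the MAP estimation idea; the real method additionally places a Wiener-process prior over each player's rating history and uses Newton updates.

```python
import math

# Hypothetical results: wins[i][j] = games player i won against player j
wins = [[0, 8, 9],
        [2, 0, 8],
        [1, 2, 0]]
n = 3
r = [0.0] * n      # ratings on a log-odds scale
lam = 0.1          # Gaussian prior precision (also fixes the overall scale)

def p_win(ri, rj):
    """Bradley-Terry win probability for rating ri against rj."""
    return 1.0 / (1.0 + math.exp(-(ri - rj)))

# Gradient ascent on the log-posterior (log-likelihood + Gaussian prior)
for _ in range(2000):
    grad = [-lam * r[i] for i in range(n)]
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            n_ij = wins[i][j] + wins[j][i]
            grad[i] += wins[i][j] - n_ij * p_win(r[i], r[j])
    r = [r[i] + 0.05 * grad[i] for i in range(n)]
```

The MAP ratings order the players by strength; WHR's contribution is to make r_i a function of time and maximize over entire rating trajectories instead of single values.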
Reconstructing the history of structure formation using redshift distortions
Measuring the statistics of galaxy peculiar velocities using redshift-space distortions is an excellent way of probing the history of structure formation. Because galaxies are expected to act as test particles within the flow of matter, this method avoids uncertainties due to an unknown galaxy density bias. We show that the parameter combination measured by redshift-space distortions, fσ8^mass, provides a good test of dark energy models, even without the knowledge of bias or σ8^mass required to extract f from this measurement (here f is the logarithmic derivative of the linear growth rate, and σ8^mass is the root-mean-square mass fluctuation in spheres with radius 8 h⁻¹ Mpc). We argue that redshift-space distortion measurements will help to determine the physics behind the cosmic acceleration, testing whether it is related to dark energy or modified gravity, and will provide an opportunity to test possible dark energy clumping or coupling between dark energy and dark matter. If we can measure galaxy bias in addition, simultaneous measurement of both the overdensity and velocity fields can be used to test the validity of the equivalence principle, through the continuity equation.
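The quoted combination fσ8 can be computed for an assumed background model using the common approximation f(z) ≈ Ω_m(z)^0.55 together with a growth factor obtained by integrating dlnD/dlna = f. The flat-ΛCDM inputs below (Ω_m = 0.3, σ8 = 0.8) are illustrative, not values from the paper.

```python
import math

OM, S8 = 0.3, 0.8          # illustrative flat-LCDM parameters

def omega_m(a):
    """Matter fraction at scale factor a in flat LCDM."""
    return OM * a ** -3 / (OM * a ** -3 + (1.0 - OM))

def f_growth(a, gamma=0.55):
    """Standard growth-index approximation f = Omega_m(a)^gamma."""
    return omega_m(a) ** gamma

def growth_D(a, steps=2000):
    """Growth factor D(a) from ln D = integral of f dln a, with D(1) = 1."""
    def integral(a_end):
        lna0, lna1 = math.log(1e-3), math.log(a_end)
        h = (lna1 - lna0) / steps
        s = 0.5 * (f_growth(math.exp(lna0)) + f_growth(math.exp(lna1)))
        s += sum(f_growth(math.exp(lna0 + k * h)) for k in range(1, steps))
        return s * h
    return math.exp(integral(a) - integral(1.0))

def f_sigma8(z):
    a = 1.0 / (1.0 + z)
    return f_growth(a) * S8 * growth_D(a)

fs0 = f_sigma8(0.0)        # roughly 0.3**0.55 * 0.8, about 0.41
```

At high redshift f approaches 1 (matter domination), and the combination f(z)·σ8·D(z) is exactly what the redshift-space distortion measurements described above constrain, without reference to galaxy bias.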
Metagenomic reconstructions of bacterial CRISPR loci constrain population histories.
Sun, Christine L; Thomas, Brian C; Barrangou, Rodolphe; Banfield, Jillian F
2016-04-01
Bacterial CRISPR-Cas systems provide insight into recent population history because they rapidly incorporate, in a unidirectional manner, short fragments (spacers) from coexisting infective virus populations into host chromosomes. Immunity is achieved by sequence identity between transcripts of spacers and their targets. Here, we used metagenomics to study the stability and dynamics of the type I-E CRISPR-Cas locus of Leptospirillum group II bacteria in biofilms sampled over 5 years from an acid mine drainage (AMD) system. Despite recovery of 452,686 spacers from CRISPR amplicons and metagenomic data, rarefaction curves of spacers show no saturation. The vast repertoire of spacers is attributed to phage/plasmid population diversity and retention of old spacers, despite rapid evolution of the targeted phage/plasmid genome regions (proto-spacers). The oldest spacers (spacers found at the trailer end) are conserved for at least 5 years, and 12% of these retain perfect or near-perfect matches to proto-spacer targets. The majority of proto-spacer regions contain an AAG proto-spacer adjacent motif (PAM). Spacers throughout the locus target the same phage population (AMDV1), but there are blocks of consecutive spacers without AMDV1 target sequences. Results suggest long-term coexistence of Leptospirillum with AMDV1 and periods when AMDV1 was less dominant. Metagenomics can be applied to millions of cells in a single sample to provide an extremely large spacer inventory, allow identification of phage/plasmids and enable analysis of previous phage/plasmid exposure. Thus, this approach can provide insights into prior bacterial environment and genetic interplay between hosts and their viruses. PMID:26394009
Fortunato, Laura
2011-02-01
Explanations for the emergence of monogamous marriage have focused on the cross-cultural distribution of marriage strategies, thus failing to account for their history. In this paper I reconstruct the pattern of change in marriage strategies in the history of societies speaking Indo-European languages, using cross-cultural data in the systematic and explicitly historical framework afforded by the phylogenetic comparative approach. The analysis provides evidence in support of Proto-Indo-European monogamy, and that this pattern may have extended back to Proto-Indo-Hittite. These reconstructions push the origin of monogamous marriage into prehistory, well beyond the earliest instances documented in the historical record; this, in turn, challenges notions that the cross-cultural distribution of monogamous marriage reflects features of social organization typically associated with Eurasian societies, and with "societal complexity" and "modernization" more generally. I discuss implications of these findings in the context of the archaeological and genetic evidence on prehistoric social organization. PMID:21453006
Memory, History and Narrative: Shifts of Meaning when (Re)constructing the Past
Ignacio Brescó de Luna
2012-05-01
This paper is devoted to the examination of some socio-cultural dimensions of memory, focusing on narratives as a mediational tool (Vygotsky, 1978) for the construction of past events and attribution of meaning. The five elements of Kenneth Burke's Grammar of Motives (1969) are taken as a framework for the examination of reconstructions of the past, and particularly of histories, namely: (1) the interpretative and reconstructive action of (2) a positioned agent operating (3) through narrative means (4) addressed to particular purposes (5) within a concrete social and temporal scenery. The reflexive character of such an approach opens the ground for considering remembering as a kind of act performed within the context of a set of on-going actions, so that remembrances play a directive role for action and so have an unavoidable moral dimension. This is particularly relevant for some kinds of social memory, such as history teaching and its effects upon identity.
Stouthamer, E.; Cohen, K.M.; Hoek, W.Z.; Pierik, H.J.; Taal, L.J.; Hijma, M.P.; Bos, I.J.
2013-01-01
In the Holocene Rhine-Meuse delta, the geography, architecture, and chronology of the channel belts and their flood basins are known in exceptionally high detail. This is due to a long history of intensive geological, geomorphological, and archeological research by various universities, knowledge institutes, and archaeological consultancy companies. A first reconstruction showing the build-up and palaeogeographical development of the delta in 500-year time-slices was published in 2001 by Beren...
Reconstructing a website's lost past: Methodological issues concerning the history of www.unibo.it
Nanni, Federico
2016-01-01
This paper describes how born digital primary sources could be used to reconstruct the recent history of scientific institutions. The case study is an analysis of the first 25 years online of the University of Bologna. The focus of this work is primarily methodological: several different issues are presented, starting with the fact that the University of Bologna website has been excluded for thirteen years from the Internet Archive's Wayback Machine, and possible solutions are proposed and ap...
Konečný, A.; Bryja, Josef
Brno: Ústav biologie obratlovců AV ČR, 2012 - (Bryja, J.; Albrechtová, J.; Tkadlec, E.). s. 97 ISBN 978-80-87189-11-5. [Zoologické dny. 09.02.2012-10.02.2012, Olomouc] R&D Projects: GA ČR GAP506/10/0983 Institutional research plan: CEZ:AV0Z60930519 Keywords : colonization history * black rat Subject RIV: EG - Zoology
Takebayashi Naoki
2007-07-01
Background: Although testing for simultaneous divergence (vicariance) across different population-pairs that span the same barrier to gene flow is of central importance to evolutionary biology, researchers often equate the gene tree and population/species tree, thereby ignoring stochastic coalescent variance in their conclusions of temporal incongruence. In contrast to other available phylogeographic software packages, msBayes is the only one that analyses data from multiple species/population pairs under a hierarchical model. Results: msBayes employs approximate Bayesian computation (ABC) under a hierarchical coalescent model to test for simultaneous divergence (TSD) in multiple co-distributed population-pairs. Simultaneous isolation is tested by estimating three hyper-parameters that characterize the degree of variability in divergence times across co-distributed population pairs, while allowing for variation in various within-population-pair demographic parameters (sub-parameters) that can affect the coalescent. msBayes is a software package consisting of several C and R programs that are run with a Perl "front-end". Conclusion: The method reasonably distinguishes simultaneous isolation from temporal incongruence in the divergence of co-distributed population pairs, even with sparse sampling of individuals. Because the estimation step is decoupled from the simulation step, one can rapidly evaluate different ABC acceptance/rejection conditions and the choice of summary statistics. Given the complex and idiosyncratic nature of testing multi-species biogeographic hypotheses, we envision msBayes as a powerful and flexible tool for tackling a wide array of difficult research questions that use population genetic data from multiple co-distributed species. The msBayes pipeline is available for download at http://msbayes.sourceforge.net/ under an open source license (GNU Public License). The msBayes pipeline is comprised of several C and R programs that
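The rejection step at the heart of approximate Bayesian computation, which msBayes wraps in a hierarchical coalescent model, can be sketched with a toy model: a single divergence time τ with normally distributed per-locus estimates. Everything about this toy model (prior range, noise level, summary statistic, tolerance) is an assumption for illustration, not msBayes itself.

```python
import random

random.seed(42)
N_LOCI, SIGMA = 50, 1.0

def simulate(tau):
    """Toy data model: per-locus divergence estimates scattered around tau."""
    return [random.gauss(tau, SIGMA) for _ in range(N_LOCI)]

def summary(data):
    """Summary statistic: mean per-locus divergence."""
    return sum(data) / len(data)

obs = summary(simulate(4.0))        # pseudo-observed data, true tau = 4

accepted = []
for _ in range(20000):
    tau = random.uniform(0.0, 10.0)               # draw from the prior
    if abs(summary(simulate(tau)) - obs) < 0.2:   # ABC rejection step
        accepted.append(tau)

post_mean = sum(accepted) / len(accepted)
```

Because the estimation step only compares summary statistics, the same accepted draws can be re-filtered under different tolerances or statistics without re-simulating, which is the decoupling the abstract highlights.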
Kevin McNally
2012-01-01
There are numerous biomonitoring programs, both recent and ongoing, to evaluate environmental exposure of humans to chemicals. Due to the lack of exposure and kinetic data, the correlation of biomarker levels with exposure concentrations leads to difficulty in utilizing biomonitoring data for biological guidance values. Exposure reconstruction or reverse dosimetry is the retrospective interpretation of external exposure consistent with biomonitoring data. We investigated the integration of physiologically based pharmacokinetic modelling, global sensitivity analysis, Bayesian inference, and Markov chain Monte Carlo simulation to obtain a population estimate of inhalation exposure to m-xylene. We used exhaled breath and venous blood m-xylene and urinary 3-methylhippuric acid measurements from a controlled human volunteer study in order to evaluate the ability of our computational framework to predict known inhalation exposures. We also investigated the importance of model structure and dimensionality with respect to its ability to reconstruct exposure.
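The reverse-dosimetry loop can be illustrated with a drastically simplified stand-in for the PBPK model: a single proportionality between exposure and biomarker level with lognormal measurement error, sampled by random-walk Metropolis. All constants and data below are hypothetical; a real application would call a full PBPK model in place of `toy_pbpk`.

```python
import math
import random

random.seed(7)
RATIO = 0.5          # hypothetical biomarker level per unit exposure
SIGMA = 0.1          # lognormal measurement error (log scale)

def toy_pbpk(exposure):
    """Stand-in for a PBPK model: predicted biomarker at steady state."""
    return RATIO * exposure

# hypothetical biomarker measurements from a true exposure of 10 units
true_exposure = 10.0
data = [toy_pbpk(true_exposure) * math.exp(random.gauss(0.0, SIGMA))
        for _ in range(5)]

def log_post(log_e):
    """Log-posterior: flat prior on log-exposure + lognormal likelihood."""
    if not (math.log(0.1) < log_e < math.log(1000.0)):
        return -math.inf
    mu = math.log(toy_pbpk(math.exp(log_e)))
    return sum(-(math.log(y) - mu) ** 2 / (2 * SIGMA ** 2) for y in data)

# random-walk Metropolis over log-exposure
cur = math.log(1.0)
cur_lp = log_post(cur)
samples = []
for step in range(6000):
    prop = cur + random.gauss(0.0, 0.2)
    lp = log_post(prop)
    if math.log(random.random()) < lp - cur_lp:
        cur, cur_lp = prop, lp
    if step >= 1000:               # discard burn-in
        samples.append(math.exp(cur))

post_mean = sum(samples) / len(samples)
```

The posterior over exposure concentrates around the value that generated the biomarkers; in the real framework the forward model is far more expensive, which is why sensitivity analysis and model dimensionality matter so much.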
Barrès, B; Carlier, J; Seguin, M; Fenouillet, C; Cilas, C; Ravigné, V
2012-11-01
Understanding the processes by which new diseases are introduced in previously healthy areas is of major interest in elaborating prevention and management policies, as well as in understanding the dynamics of pathogen diversity at large spatial scale. In this study, we aimed to decipher the dispersal processes that have led to the emergence of the plant pathogenic fungus Microcyclus ulei, which is responsible for the South American Leaf Blight (SALB). This fungus has devastated rubber tree plantations across Latin America since the beginning of the twentieth century. As only imprecise historical information is available, the study of population evolutionary history based on population genetics appeared most appropriate. The distribution of genetic diversity in a continental sampling of four countries (Brazil, Ecuador, Guatemala and French Guiana) was studied using a set of 16 microsatellite markers developed specifically for this purpose. A very strong genetic structure was found (F_ST = 0.70), demonstrating that there has been no regular gene flow between Latin American M. ulei populations. Strong bottlenecks probably occurred at the foundation of each population. The most likely scenario of colonization identified by the Approximate Bayesian Computation (ABC) method implemented in DIYABC suggested two independent sources from the Amazonian endemic area. The Brazilian, Ecuadorian and Guatemalan populations might stem from serial introductions through human-mediated movement of infected plant material from an unsampled source population, whereas the French Guiana population seems to have arisen from an independent colonization event through spore dispersal. PMID:22828899
Gene Transfer and the Reconstruction of Life's Early History from Genomic Data
Gogarten, J. Peter; Fournier, Gregory; Zhaxybayeva, Olga
2008-03-01
The metaphor of the unique and strictly bifurcating tree of life, suggested by Charles Darwin, needs to be replaced (or at least amended) to reflect and include processes that lead to the merging of and communication between independent lines of descent. Gene histories include and reflect processes such as gene transfer, symbioses and lineage fusion. No single molecule can serve as a proxy for the tree of life. Individual gene histories can be reconstructed from the growing molecular databases containing sequence and structural information. With some simplifications these gene histories can be represented by furcating trees; however, merging these gene histories into web-like organismal histories, including the transfer of metabolic pathways and cell biological innovations from now-extinct lineages, has yet to be accomplished. Because of these difficulties in interpreting the record retained in molecular sequences, correlations with biochemical fossils and with the geological record need to be interpreted with caution. Advances to detect and pinpoint transfer events promise to untangle at least a few of the intertwined histories of individual genes within organisms and trace them to the organismal ancestors. Furthermore, analysis of the shape of molecular phylogenetic trees may point towards organismal radiations that might reflect early mass extinction events that occurred on a planetary scale.
Simon Boitard
2016-03-01
Inferring the ancestral dynamics of effective population size is a long-standing question in population genetics, which can now be tackled much more accurately thanks to the massive genomic data available in many species. Several promising methods that take advantage of whole-genome sequences have been recently developed in this context. However, they can only be applied to rather small samples, which limits their ability to estimate recent population size history. Besides, they can be very sensitive to sequencing or phasing errors. Here we introduce a new approximate Bayesian computation approach named PopSizeABC that allows estimating the evolution of the effective population size through time, using a large sample of complete genomes. This sample is summarized using the folded allele frequency spectrum and the average zygotic linkage disequilibrium at different bins of physical distance, two classes of statistics that are widely used in population genetics and can be easily computed from unphased and unpolarized SNP data. Our approach provides accurate estimations of past population sizes, from the very first generations before present back to the expected time to the most recent common ancestor of the sample, as shown by simulations under a wide range of demographic scenarios. When applied to samples of 15 or 25 complete genomes in four cattle breeds (Angus, Fleckvieh, Holstein and Jersey), PopSizeABC revealed a series of population declines, related to historical events such as domestication or modern breed creation. We further highlight that our approach is robust to sequencing errors, provided summary statistics are computed from SNPs with common alleles.
Fortunato, Laura
2011-02-01
Linguists and archaeologists have used reconstructions of early Indo-European residence strategies to constrain hypotheses about the homeland and trajectory of dispersal of Indo-European languages; however, these reconstructions are largely based on unsystematic and a historical use of the linguistic and ethnographic evidence, coupled with substantial bias in interpretation. Here I use cross-cultural data in a phylogenetic comparative framework to reconstruct the pattern of change in residence strategies in the history of societies speaking Indo-European languages. The analysis provides evidence in support of prevailing virilocality with alternative neolocality for Proto-Indo-European, and that this pattern may have extended back to Proto-Indo-Hittite. These findings bolster interpretations of the archaeological evidence that emphasize the "non-matricentric" structure of early Indo-European society; however, they also counter the notion that early Indo-European society was strongly "patricentric." I discuss implications of these findings in the context of the archaeological and genetic evidence on prehistoric social organization. PMID:21453007
Thomas Katagis
2014-06-01
In this study, the capability of geographic object-based image analysis (GEOBIA) in the reconstruction of the recent fire history of a typical Mediterranean area was investigated. More specifically, a semi-automated GEOBIA procedure was developed and tested on archived and newly acquired Landsat Multispectral Scanner (MSS), Thematic Mapper (TM), and Operational Land Imager (OLI) images in order to accurately map burned areas in the Mediterranean island of Thasos. The developed GEOBIA ruleset was built with the use of the TM image and then applied to the other two images. This process of transferring the ruleset did not require substantial adjustments or any replacement of the initially selected features used for the classification, thus displaying reduced complexity in processing the images. As a result, burned area maps of very high accuracy (over 94% overall) were produced. In addition to the standard error matrix, the employment of additional measures of agreement between the produced maps and the reference data revealed that "spatial misplacement" was the main source of classification error. It can be concluded that the proposed approach can be potentially used for reconstructing the recent (40-year) fire history in the Mediterranean, based on extended time series of Landsat or similar data.
Xu, Yunfei; Dass, Sarat; Maiti, Tapabrata
2016-01-01
This brief introduces a class of problems and models for the prediction of the scalar field of interest from noisy observations collected by mobile sensor networks. It also introduces the problem of optimal coordination of robotic sensors to maximize the prediction quality subject to communication and mobility constraints either in a centralized or distributed manner. To solve such problems, fully Bayesian approaches are adopted, allowing various sources of uncertainties to be integrated into an inferential framework effectively capturing all aspects of variability involved. The fully Bayesian approach also allows the most appropriate values for additional model parameters to be selected automatically by data, and the optimal inference and prediction for the underlying scalar field to be achieved. In particular, spatio-temporal Gaussian process regression is formulated for robotic sensors to fuse multifactorial effects of observations, measurement noise, and prior distributions for obtaining the predictive di...
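At its core, the Gaussian process regression described above conditions a multivariate Gaussian: the posterior mean at a test input x* is k*ᵀ(K + σ_n²I)⁻¹y. A one-dimensional sketch with a squared-exponential kernel, with all hyperparameters hand-picked rather than inferred as in the fully Bayesian treatment, is:

```python
import math

def rbf(x1, x2, ell=1.0):
    """Squared-exponential (RBF) covariance between two scalar inputs."""
    return math.exp(-(x1 - x2) ** 2 / (2.0 * ell ** 2))

def solve(M, b):
    """Gaussian elimination with partial pivoting for small dense systems."""
    n = len(b)
    M = [row[:] + [b[i]] for i, row in enumerate(M)]
    for c in range(n):
        piv = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[piv] = M[piv], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def gp_predict(xs, ys, x_star, noise=1e-4):
    """GP posterior mean at x_star given noisy observations (xs, ys)."""
    K = [[rbf(a, b) + (noise if i == j else 0.0)
          for j, b in enumerate(xs)] for i, a in enumerate(xs)]
    alpha = solve(K, ys)                  # (K + noise*I)^{-1} y
    return sum(rbf(x_star, xi) * ai for xi, ai in zip(xs, alpha))

xs = [0.0, 0.6, 1.2, 1.8, 2.4, 3.0]      # synthetic sensor locations
ys = [math.sin(x) for x in xs]           # sampled scalar field (toy)
pred = gp_predict(xs, ys, 0.9)           # predict the field between sensors
```

The fully Bayesian treatment in the book additionally places priors on `ell` and the noise level and integrates them out; the sketch fixes them by hand to keep the conditioning step visible.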
A general reconstruction of the recent expansion history of the universe
Vitenti, S. D. P.; Penna-Lima, M.
2015-09-01
Distance measurements are currently the most powerful tool to study the expansion history of the universe without specifying its matter content nor any theory of gravitation. Assuming only an isotropic, homogeneous and flat universe, in this work we introduce a model-independent method to reconstruct directly the deceleration function via a piecewise function. Including a penalty factor, we are able to vary continuously the complexity of the deceleration function from a linear case to an arbitrary (n+1)-knot spline interpolation. We carry out a Monte Carlo (MC) analysis to determine the best penalty factor, evaluating the bias-variance trade-off, given the uncertainties of the SDSS-II and SNLS supernova combined sample (JLA), compilations of baryon acoustic oscillation (BAO) and H(z) data. The bias-variance analysis is done for three fiducial models with different features in the deceleration curve. We perform the MC analysis generating mock catalogs and computing their best-fit. For each fiducial model, we test different reconstructions using, in each case, more than 10⁴ catalogs in a total of about 5×10⁵. This investigation proved to be essential in determining the best reconstruction to study these data. We show that, evaluating a single fiducial model, the conclusions about the bias-variance ratio are misleading. We determine the reconstruction method in which the bias represents at most 10% of the total uncertainty. In all statistical analyses, we fit the coefficients of the deceleration function along with four nuisance parameters of the supernova astrophysical model. For the full sample, we also fit H0 and the sound horizon rs(zd) at the drag redshift. The bias-variance trade-off analysis shows that, apart from the deceleration function, all other estimators are unbiased. Finally, we apply the Ensemble Sampler Markov Chain Monte Carlo (ESMCMC) method to explore the posterior of the deceleration function up to redshift 1.3 (using only JLA) and 2.3 (JLA
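The mapping from a deceleration function to distances can be sketched directly: dlnE/dln(1+z) = 1 + q(z) with E = H/H0, and the flat-universe comoving distance is D_C = (c/H0) ∫ dz/E. The piecewise-constant q below is an illustrative stand-in for the paper's penalized splines; the Einstein-de Sitter case q = 1/2 provides the analytic check E(z) = (1+z)^{3/2}.

```python
import math

def expand_history(z_max, q_func, steps=4000):
    """Trapezoid-integrate ln E and the comoving distance (units of c/H0)."""
    h = z_max / steps
    lnE, dist = 0.0, 0.0
    g_prev = (1.0 + q_func(0.0))   # integrand (1+q)/(1+z) at z = 0
    inv_prev = 1.0                 # 1/E at z = 0
    for k in range(1, steps + 1):
        z = k * h
        g = (1.0 + q_func(z)) / (1.0 + z)
        lnE += 0.5 * (g_prev + g) * h
        inv_E = math.exp(-lnE)
        dist += 0.5 * (inv_prev + inv_E) * h
        g_prev, inv_prev = g, inv_E
    return math.exp(lnE), dist

# piecewise-constant deceleration function (knot and values invented)
def q_piecewise(z):
    return -0.55 if z < 0.6 else 0.1

E1, D1 = expand_history(1.0, lambda z: 0.5)   # EdS check: E(1) = 2**1.5
E2, D2 = expand_history(1.0, q_piecewise)     # toy accelerating model
```

Inverting this map, i.e. fitting the q(z) coefficients to observed distances, is the reconstruction the paper carries out with its penalized-spline parameterization and MCMC.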
Wei Song Hwang
Assassin bugs are one of the most successful clades of predatory animals based on their species numbers (∼6,800 spp.) and wide distribution in terrestrial ecosystems. Various novel prey capture strategies and remarkable prey specializations contribute to their appeal as a model to study evolutionary pathways involved in predation. Here, we reconstruct the most comprehensive reduviid phylogeny (178 taxa, 18 subfamilies) to date based on molecular data (5 markers). This phylogeny tests current hypotheses on reduviid relationships, emphasizing the polyphyletic Reduviinae and the blood-feeding, disease-vectoring Triatominae, and allows us, for the first time in assassin bugs, to reconstruct ancestral states of prey associations and microhabitats. Using a fossil-calibrated molecular tree, we estimated divergence times for key events in the evolutionary history of Reduviidae. Our results indicate that the polyphyletic Reduviinae fall into 11-14 separate clades. Triatominae are paraphyletic with respect to the reduviine genus Opisthacidius in the maximum likelihood analyses; this result is in contrast to prior hypotheses that found Triatominae to be monophyletic or polyphyletic and may be due to the more comprehensive taxon and character sampling in this study. The evolution of blood-feeding may thus have occurred once or twice independently among predatory assassin bugs. All prey specialists evolved from generalist ancestors, with multiple evolutionary origins of termite and ant specializations. A bark-associated life style on tree trunks is ancestral for most of the lineages of Higher Reduviidae; living on foliage has evolved at least six times independently. Reduviidae originated in the Middle Jurassic (178 Ma), but significant lineage diversification only began in the Late Cretaceous (97 Ma). The integration of molecular phylogenetics with fossil and life history data as presented in this paper provides insights into the evolutionary history of
Alperson-Afil, Nira
2012-07-01
Concepts that are common in the reconstruction of fire histories are employed here for the purpose of interpreting fires identified at archaeological sites. When attempting to evaluate the fire history of ancient occupations we are limited by the amount and quality of the available data. Furthermore, the identification of archaeological burned materials, such as stone, wood, and charcoal, is adequate for the general assumption of a "fire history", but the agent responsible - anthropogenic or natural - cannot be inferred from the mere presence of burned items. The large body of scientific data that has accumulated, primarily through efforts to prevent future fire disasters, enables us to reconstruct scenarios of past natural fires. Adopting this line of thought, this paper attempts to evaluate the circumstances in which a natural fire may have ignited and spread at the 0.79 Ma occupation site of Gesher Benot Ya'aqov (Israel), resulting in burned wood and burned flint within the archaeological layers. At Gesher Benot Ya'aqov, possible remnants of hearths are explored through analyses of the spatial distribution of burned flint-knapping waste products. These occur in dense clusters in each of the archaeological occupations throughout the long stratigraphic sequence. In this study, the combination of the spatial analysis results, paleoenvironmental information, and the various factors involved in the complex process of fire ignition, combustion, and behavior has enabled the firm rejection of recurrent natural fires as the agent responsible for the burned materials. It also suggests that, particularly at early sites where evidence for burning is present yet scarce, data on fire ecology can be especially useful when considered in relation to paleoenvironmental information.
Avanzo, Salvatore; Barbera, Roberto; de Mattia, Francesco; Rocca, Giuseppe La; Sorrentino, Mariapaola; Vicinanza, Domenico
ASTRA (Ancient instruments Sound/Timbre Reconstruction Application) is a project coordinated at Conservatory of Music of Parma which aims to bring history to life. Ancient musical instruments can now be heard for the first time in hundreds of years, thanks to the successful synergy between art/humanities and science. The Epigonion, an instrument of the past, has been digitally recreated using gLite, an advanced middleware developed in the context of the EGEE project and research networks such as GÉANT2 in Europe and EUMEDCONNECT2 in the Mediterranean region. GÉANT2 and EUMEDCONNECT2, by connecting enormous and heterogeneous computing resources, provided the needed infrastructures to speed up the overall computation time and enable the computer-intensive modeling of musical sounds. This paper summarizes the most recent outcomes of the project underlining how the Grid aspect of the computation can support the Cultural Heritage community.
Hughes, Gareth J; Páez, Andrés; Bóshell, Jorge; Rupprecht, Charles E
2004-03-01
Historically, canine rabies in Colombia has been caused by two geographically distinct canine variants of rabies virus (RV) which between 1992 and 2002 accounted for approximately 95% of Colombian rabies cases. Genetic variant 1 (GV1) has been isolated up until 1997 in the Central Region and the Department of Arauca, and is now considered extinct through a successful vaccination program. Genetic variant 2 (GV2) has been isolated from the northern Caribbean Region and continues to circulate at present. Here we have analyzed two sets of sequence data based upon either a 147 nucleotide region of the glycoprotein (G) gene or a 258 nucleotide region that combines a fragment of the non-coding intergenic region and a fragment of the polymerase gene. Using both maximum likelihood (ML) and Markov chain Monte Carlo (MCMC) methods we have estimated the time of the most recent common ancestor (MRCA) of the two variants to be between 1983 and 1988. Reconstructions of the population history suggest that GV2 has been circulating in Colombia since the 1960s and that GV1 evolved as a separate lineage from GV2. Estimations of the effective population size at present show the GV2 outbreak to be approximately 20 times greater than that of GV1. Demographic reconstructions were unable to detect a decrease in population size concurrent with the elimination of GV1. We find a raised rate of nucleotide substitution for GV1 gene sequences when compared to that of GV2, although all estimates have wide confidence limits. We demonstrate that phylogenetic reconstructions and sequence analysis can be used to support incidence data from the field in the assessment of RV epidemiology. PMID:15019589
A number of proxies, including the carbon to nitrogen ratio (C:N) and stable isotopes (δ13C and δ15N), have been used to reconstruct organic matter (OM) profiles from lake sediments, but these proxies, individually or in combination, cannot clearly discriminate different sources. Here we present an alternative approach to elucidate this problem from lake sediments as a function of watershed-scale land use changes. Stable isotope signatures of defined OM sources from the study watersheds, Shawnigan Lake (SHL) and Elk Lake (ELL), were compared with sedimentary proxy records. Results from this study reveal that terrestrial inputs and catchment soil, coinciding with the watershed disturbance histories, probably contributed to the recent trophic enrichment in SHL. In contrast, cultural eutrophication in ELL was partially a result of input from catchment soil (agricultural activities), with significant input from lake primary production as well. Results were consistent in both IsoSource (version 1.2, a Visual Basic program used for source separation; http://www.epa.gov/wed/pages/models/isosource/isosource.htm) and discriminant analysis (a statistical classification technique). - The study shows an alternative approach to reconstruct organic matter accumulation using stable isotopes from lake sediments
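The source apportionment that IsoSource automates reduces, in the simplest case, to a linear two-end-member mixing model. The sketch below uses assumed δ13C end-member values, not the study's measurements, purely for illustration:

```python
# Minimal two-source, one-tracer mixing model in the spirit of IsoSource.
# End-member values are hypothetical placeholders, not the study's data.
delta13c_terrestrial = -28.0  # per mil, assumed terrestrial OM end-member
delta13c_algal = -22.0        # per mil, assumed lake primary production

def source_fraction(delta_mixture, delta_a, delta_b):
    """Fraction of source A in a two-source linear mixing model."""
    return (delta_mixture - delta_b) / (delta_a - delta_b)

# A sediment sample at -26.5 per mil implies a 75% terrestrial contribution.
f_terr = source_fraction(-26.5, delta13c_terrestrial, delta13c_algal)
```

With more sources than tracers the system becomes underdetermined, which is why IsoSource enumerates all feasible source combinations rather than solving a single equation.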
A general reconstruction of the recent expansion history of the universe
Vitenti, S D P
2015-01-01
Distance measurements are currently the most powerful tool to study the expansion history of the universe without specifying its matter content or any theory of gravitation. Assuming only an isotropic, homogeneous and flat universe, in this work we introduce a model-independent method to directly reconstruct the deceleration function via a piecewise function. Including a penalty factor, we are able to vary continuously the complexity of the deceleration function from a linear case to an arbitrary $(n+1)$-knots spline interpolation. We carry out a Monte Carlo analysis to determine the best penalty factor, evaluating the bias-variance trade-off, given the uncertainties of the SDSS-II and SNLS supernova combined sample (JLA), compilations of baryon acoustic oscillation (BAO) and $H(z)$ data. We show that, evaluating a single fiducial model, the conclusions about the bias-variance ratio are misleading. We determine the reconstruction method in which the bias represents at most $10\%$ of the total uncertainty. In...
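A penalized piecewise reconstruction of this kind can be sketched with a hat-function (piecewise-linear) basis and a second-difference roughness penalty. The synthetic data, knot count, and penalty values below are illustrative assumptions, not the paper's setup:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic noisy samples standing in for distance-derived quantities;
# knot values parametrize a piecewise-linear reconstruction.
x = np.linspace(0.0, 1.0, 60)
y = np.sin(2 * np.pi * x) + rng.normal(0.0, 0.2, x.size)

knots = np.linspace(0.0, 1.0, 12)

def design_matrix(x, knots):
    """Hat-function basis: column j is the piecewise-linear bump at knot j."""
    B = np.zeros((x.size, knots.size))
    for j in range(knots.size):
        B[:, j] = np.interp(x, knots, np.eye(knots.size)[j])
    return B

def penalized_fit(x, y, knots, penalty):
    """Least squares with a second-difference roughness penalty.

    A larger `penalty` shrinks the fit toward a straight line, i.e. the
    bias-variance trade-off tuned by Monte Carlo in the abstract.
    """
    B = design_matrix(x, knots)
    D = np.diff(np.eye(knots.size), n=2, axis=0)  # second differences
    lhs = B.T @ B + penalty * D.T @ D
    rhs = B.T @ y
    return np.linalg.solve(lhs, rhs)

coef_smooth = penalized_fit(x, y, knots, penalty=10.0)
coef_rough = penalized_fit(x, y, knots, penalty=1e-6)
```

Increasing the penalty continuously dials the reconstruction from an arbitrary multi-knot spline toward the linear case, which is the complexity control the abstract describes.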
Han, Y. M.; Wei, C.; Huang, R.-J.; Bandowe, B. A. M.; Ho, S. S. H.; Cao, J. J.; Jin, Z. D.; Xu, B. Q.; Gao, S. P.; Tie, X. X.; An, Z. S.; Wilcke, W.
2016-01-01
Historical reconstruction of atmospheric black carbon (BC, in the form of char and soot) is still constrained for inland areas. Here we determined and compared the past 150-yr records of BC and polycyclic aromatic compounds (PACs) in sediments from two representative lakes, Huguangyan (HGY) and Chaohu (CH), in eastern China. HGY only receives atmospheric deposition while CH is influenced by riverine input. BC, char, and soot have similar vertical concentration profiles as PACs in both lakes. Abrupt increases in concentrations and mass accumulation rates (MARs) of soot have mainly occurred since ~1950, the establishment of the People’s Republic of China, when energy usage changed to more fossil fuel contributions reflected by the variations in the concentration ratios of char/soot and individual PACs. In HGY, soot MARs increased by ~7.7 times in the period 1980-2012 relative to the period 1850-1950. Similar increases (~6.7 times) were observed in CH. The increase in soot MARs is also in line with the emission inventory records in the literature and the fact that the submicrometer-sized soot particles can be dispersed regionally. The study provides an alternative method to reconstruct the atmospheric soot history in populated inland areas.
Confederate Immigration to Brazil: A Cross-Cultural Approach to Reconstruction and Public History
Karina Esposito
2015-12-01
Full Text Available Given the interconnectedness of the contemporary world, it is imperative that historians place their studies within a global context, connecting domestic and foreign events in order to offer a thorough picture of the past. As historians, we should aim at exploring transnational connections in our published research and incorporating the same methodologies in the classroom, as well as in the field of Public History. Cross-cultural collaboration and transnational studies are challenging, but exceptionally effective approaches to developing a comprehensive understanding of the past and connecting people to their history. Important recent scholarship has placed the American Civil War in a broad international and transnational context. This article argues for the importance of continuing this trend, pointing to a unique case study: the confederate migration to Brazil during and after the Civil War. This episode can help us understand the international impact of the War in the western hemisphere. These confederates attempted to preserve some aspects of their Southern society by migrating to Brazil, one of the remaining slaveholding societies in the hemisphere at the time. Moreover, the descendants that remained in Brazil have engaged in a unique process of remembering and commemorating their heritage over the years. Exploring this migration will enhance Civil War and Reconstruction historiography, as well as commemoration, heritage and memory studies.
Pujolar, J M; Zane, L; Congiu, L
2012-06-01
The aim of our study is to examine the phylogenetic relationships, divergence times and demographic history of the five closely related Mediterranean and North-eastern Atlantic species/forms of Atherina using the full Bayesian framework for species tree estimation recently implemented in ∗BEAST. The inference is made possible by multilocus data using three mitochondrial genes (12S rRNA, 16S rRNA, control region) and one nuclear gene (rhodopsin) from multiple individuals per species available in GenBank. Bayesian phylogenetic analysis of the complete gene dataset produced a tree with strong support for the monophyly of each species, as well as high support for higher-level nodes. An old origin of the Atherina group was suggested (19.2 MY), with deep split events within the Atherinidae predating the Messinian Salinity Crisis. Regional genetic substructuring was observed among populations of A. boyeri, with AMOVA and MultiDimensional Scaling suggesting the existence of five groupings (Atlantic/West Mediterranean, Adriatic, Greece, Black Sea and Tunis). The level of subdivision found might be a consequence of the hydrographic isolation within the Mediterranean Sea. Bayesian inference of past demographic histories showed a clear signature of demographic expansion for the European coast populations of A. presbyter, possibly linked to post-glacial colonizations, but not for the Azores/Canary Islands, as expected for isolated populations given the impossibility of finding new habitats. Within the Mediterranean, signatures of recent demographic expansion were only found for the Adriatic population of A. boyeri, which could be associated with the relatively recent emergence of the Adriatic Sea. PMID:22425706
Jelinek, A. R.; Chemale, F., Jr.
2012-12-01
In this work we deal with the Phanerozoic history of the Southern Mantiqueira Province and adjacent areas after the orogen collapse of the Brasiliano orogenic mountains in southern Brazil and Uruguay, based on thermochronological data (fission track and U-Th/He on apatite) and thermal history modelling. During the Paleozoic, intraplate sedimentary basins formed mainly bordering the orogenic systems, and thus these regions have not been overprinted by younger orogenic processes. In the Meso-Cenozoic this region was affected by later fragmentation and dispersal due to the separation of South America and Africa. The denudation history of both margins, quantified on the basis of thermal history modeling of apatite fission track thermochronology, indicates that the margin of southeastern Brazil and Uruguay experienced a minimum of 3.5 to 4.5 km of denudation, which included the main exposure area of the Brasiliano orogenic belts and adjacent areas. The Phanerozoic evolution of West Gondwana is thus recorded first by the orogenic collapses of the Brasiliano and Pan-African belts, which at that time formed a single mountain system in the Cambrian-Ordovician period. Subsequently, intraplate basins formed, such as the Paraná in southeastern Brazil and the Congo, with some records of the Table Mountain Group and the upper section of Karoo units in southwestern Africa. In the Permotriassic period, the collision of the Cape Fold Belt and Sierra de la Ventana Belt at the margins of the West Gondwana supercontinent resulted in elastic deformation in the cratonic areas, where intraplate deposition occurred, as well as subsidence and uplift of the already established Pan-African-Brasiliano belts. Younger denudation events, due to continental margin uplift and basin subsidence, occurred during the rifting and dispersal of the South American and African plates, which can be well defined by the integration of the passive-margin sedimentation of the Pelotas and Santos basins and apatite fission
Burnier, Yannis; Kaczmarek, Olaf; Rothkopf, Alexander
2016-01-01
We report recent results of a non-perturbative determination of the static heavy-quark potential in quenched and dynamical lattice QCD at finite temperature. The real and imaginary part of this complex quantity are extracted from the spectral function of Wilson line correlators in Coulomb gauge. To obtain spectral information from Euclidean time numerical data, our study relies on a novel Bayesian prescription that differs from the Maximum Entropy Method. We perform simulations on quenched 32³ × Nτ (β = 7.0, ξ = 3.5) lattices with Nτ = 24, …, 96, which cover 839MeV ≥ T ≥ 210MeV. To investigate the potential in a quark-gluon plasma with light u,d and s quarks we utilize Nf = 2 + 1 ASQTAD lattices with ml = ms/20 by the HotQCD collaboration, giving access to temperatures between 286MeV ≥ T ≥ 148MeV. The real part of the potential exhibits a clean transition from a linear, confining behavior in the hadronic phase to a Debye screened form above deconfinement. Interestingly its values lie close to the color singlet free energies in Coulomb gauge at all temperatures. We estimate the imaginary part on quenched lattices and find that it is of the same order of magnitude as in hard-thermal loop perturbation theory. From among all the systematic checks carried out in our study, we discuss explicitly the dependence of the result on the default model and the number of datapoints.
Matthieu Vignes
Full Text Available Modern technologies and especially next generation sequencing facilities are giving a cheaper access to genotype and genomic data measured on the same sample at once. This creates an ideal situation for multifactorial experiments designed to infer gene regulatory networks. The fifth "Dialogue for Reverse Engineering Assessments and Methods" (DREAM5 challenges are aimed at assessing methods and associated algorithms devoted to the inference of biological networks. Challenge 3 on "Systems Genetics" proposed to infer causal gene regulatory networks from different genetical genomics data sets. We investigated a wide panel of methods ranging from Bayesian networks to penalised linear regressions to analyse such data, and proposed a simple yet very powerful meta-analysis, which combines these inference methods. We present results of the Challenge as well as more in-depth analysis of predicted networks in terms of structure and reliability. The developed meta-analysis was ranked first among the 16 teams participating in Challenge 3A. It paves the way for future extensions of our inference method and more accurate gene network estimates in the context of genetical genomics.
Burnier, Yannis; Rothkopf, Alexander
2014-01-01
We report recent results of a non-perturbative determination of the static heavy-quark potential in quenched and dynamical lattice QCD at finite temperature. The real and imaginary part of this complex quantity are extracted from the spectral function of Wilson line correlators in Coulomb gauge. To obtain spectral information from Euclidean time numerical data, our study relies on a novel Bayesian prescription that differs from the Maximum Entropy Method. We perform simulations on quenched $32^3\\times N_\\tau$ $(\\beta=7.0,\\xi=3.5)$ lattices with $N_\\tau=24,...,96$, which cover $839{\\rm MeV} \\geq T\\geq 210 {\\rm MeV}$. To investigate the potential in a quark-gluon plasma with light u,d and s quarks we utilize $N_f=2+1$ ASQTAD lattices with $m_l=m_s/20$ by the HotQCD collaboration, giving access to temperatures between $286 {\\rm MeV} \\geq T\\geq 148{\\rm MeV}$. The real part of the potential exhibits a clean transition from a linear, confining behavior in the hadronic phase to a Debye screened form above deconfinem...
Zhao, F.; Huang, C.; Zhu, Z.
2014-12-01
The Greater Yellowstone Ecosystem (GYE), located in the Central Rocky Mountains of the United States, has a complex ecological and land management history along with different land ownerships. What are the effects of the different land management practices (such as those of national parks vs. national forests) on ecosystem disturbances and carbon balance? We present here the methods and results of a study of forest disturbance history over the GYE from 1984 to 2010, reconstructed from Landsat time series stacks and local management records. Annual forest fire, harvest and other disturbances were tracked and separated by integrating a model called the Vegetation Change Tracker with the Support Vector Machine algorithm. Local management records were split into training and validation data for the disturbance maps. Area statistics and rates of disturbances were quantified and compared across GYE land ownerships over the multi-decade period and interpreted for the implications of these changes for forest management and carbon analysis. Our results indicate that during the study interval (1984-2010), GYE National Parks (NPs) and Wilderness Area (WA) had higher percentages of forest area disturbed compared to GYE National Forests (NF). Within the GYE NPs, over 45% of the forest lands were disturbed at least once during the study period, the majority (37%) by wildfire. For the GYE wilderness area, the total disturbance was 30% of forest, with 19.4% by wildfire and 10.6% by other disturbances. In Bridger-Teton NF, 14.7% of forest was disturbed: 3.6%, 0.5% and 10.6% by fire, harvest and other disturbances, respectively. For Caribou-Targhee NF, 25% of total forest was disturbed during this time interval: 1.5%, 6.4% and 17.1% by fire, harvest and other disturbances, respectively.
RECONSTRUCTING THE PHOTOMETRIC LIGHT CURVES OF EARTH AS A PLANET ALONG ITS HISTORY
By utilizing satellite-based estimations of the distribution of clouds, we have studied Earth's large-scale cloudiness behavior according to latitude and surface types (ice, water, vegetation, and desert). These empirical relationships are used here to reconstruct the possible cloud distribution of historical epochs of Earth's history such as the Late Cretaceous (90 Ma ago), the Late Triassic (230 Ma ago), the Mississippian (340 Ma ago), and the Late Cambrian (500 Ma ago), when the landmass distributions were different from today's. With this information, we have been able to simulate the globally integrated photometric variability of the planet at these epochs. We find that our simple model reproduces well the observed cloud distribution and albedo variability of the modern Earth. Moreover, the model suggests that the photometric variability of the Earth was probably much larger in past epochs. This enhanced photometric variability could improve the chances for the difficult determination of the rotational period and the identification of continental landmasses of distant planets.
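Globally integrated photometry of this kind ultimately rests on area-weighted albedo bookkeeping over surface types and clouds. A toy single-snapshot version is sketched below; all surface fractions and albedo values are illustrative placeholders, not the model's inputs:

```python
# Toy globally integrated albedo: an area-weighted mix of surface types
# under a single uniform cloud layer. All numbers are assumptions chosen
# only to illustrate the bookkeeping, not values from the study.
surface = {            # surface type: (area fraction, clear-sky albedo)
    "ocean":      (0.70, 0.06),
    "vegetation": (0.20, 0.15),
    "desert":     (0.07, 0.35),
    "ice":        (0.03, 0.60),
}
cloud_fraction = 0.6
cloud_albedo = 0.50

def planetary_albedo(surface, cloud_fraction, cloud_albedo):
    """Area-weighted albedo: cloudy fraction reflects at the cloud albedo,
    the clear fraction at the surface-weighted clear-sky albedo."""
    clear = sum(frac * alb for frac, alb in surface.values())
    return cloud_fraction * cloud_albedo + (1 - cloud_fraction) * clear

A = planetary_albedo(surface, cloud_fraction, cloud_albedo)
```

Rotating the visible hemisphere over differently weighted longitude slices (continents vs. ocean, with latitude-dependent cloudiness) is what produces the photometric light curve the abstract simulates.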
Quantitative study on pollen-based reconstructions of vegetation history from central Canada
Hart, Catherina; Vetter, Mary; Sauchyn, David
2008-01-01
Based on high-resolution pollen records from lake cores in central Canada, the present study assigned pollen taxa to ecosystem groups, applied the modern analogue technique, reported major results of quantitative reconstructions of vegetation history during the last 1000 years, and discussed the validation of the simulated vegetation. The results showed that in central North America (115°-95°W, 40°-60°N), the best analogue match to modern vegetation is 81% for boreal forest, 72% for parkland, and 94% for grassland-parkland, consistent with the vegetation distributions of the North American Ecosystem II. Simulations of past vegetation from the sedimentary pollen revealed climate changes during the past 1000 years: it was warm and dry in the Medieval Warm Period, cold and wet in the earlier part and cold and dry in the later part of the Little Ice Age, and warming and drought increased markedly in the 20th century. The present study provides a scientific basis for understanding vegetation and climate changes during the last 1000 years in a characteristic region and on 10-100 year time scales.
Reconstructing the complex evolutionary history of mobile plasmids in red algal genomes.
Lee, JunMo; Kim, Kyeong Mi; Yang, Eun Chan; Miller, Kathy Ann; Boo, Sung Min; Bhattacharya, Debashish; Yoon, Hwan Su
2016-01-01
The integration of foreign DNA into algal and plant plastid genomes is a rare event, with only a few known examples of horizontal gene transfer (HGT). Plasmids, which are well-studied drivers of HGT in prokaryotes, have been reported previously in red algae (Rhodophyta). However, the distribution of these mobile DNA elements and their sites of integration into the plastid (ptDNA), mitochondrial (mtDNA), and nuclear genomes of Rhodophyta remain unknown. Here we reconstructed the complex evolutionary history of plasmid-derived DNAs in red algae. Comparative analysis of 21 rhodophyte ptDNAs, including new genome data for 5 species, turned up 22 plasmid-derived open reading frames (ORFs) that showed syntenic and copy number variation among species, but were conserved within different individuals in three lineages. Several plasmid-derived homologs were found not only in ptDNA but also in mtDNA and in the nuclear genome of green plants, stramenopiles, and rhizarians. Phylogenetic and plasmid-derived ORF analyses showed that the majority of plasmid DNAs originated within red algae, whereas others were derived from cyanobacteria, other bacteria, and viruses. Our results elucidate the evolution of plasmid DNAs in red algae and suggest that they spread as parasitic genetic elements. This hypothesis is consistent with their sporadic distribution within Rhodophyta. PMID:27030297
Bayesian phylogeography finds its roots.
Philippe Lemey
2009-09-01
Full Text Available As a key factor in endemic and epidemic dynamics, the geographical distribution of viruses has been frequently interpreted in the light of their genetic histories. Unfortunately, inference of historical dispersal or migration patterns of viruses has mainly been restricted to model-free heuristic approaches that provide little insight into the temporal setting of the spatial dynamics. The introduction of probabilistic models of evolution, however, offers unique opportunities to engage in this statistical endeavor. Here we introduce a Bayesian framework for inference, visualization and hypothesis testing of phylogeographic history. By implementing character mapping in Bayesian software that samples time-scaled phylogenies, we enable the reconstruction of timed viral dispersal patterns while accommodating phylogenetic uncertainty. Standard Markov model inference is extended with a stochastic search variable selection procedure that identifies the parsimonious descriptions of the diffusion process. In addition, we propose priors that can incorporate geographical sampling distributions or characterize alternative hypotheses about the spatial dynamics. To visualize the spatial and temporal information, we summarize inferences using virtual globe software. We describe how Bayesian phylogeography compares with previous parsimony analysis in the investigation of the influenza A H5N1 origin and H5N1 epidemiological linkage among sampling localities. Analysis of rabies in West African dog populations reveals how virus diffusion may enable endemic maintenance through continuous epidemic cycles. From these analyses, we conclude that our phylogeographic framework will be an important asset in molecular epidemiology that can be easily generalized to infer biogeography from genetic data for many organisms.
Burnier, Yannis [Institut de Théorie des Phénomènes Physiques, Ecole Polytechnique Fédérale de Lausanne, CH-1015, Lausanne (Switzerland); Kaczmarek, Olaf [Fakultät für Physik, Universität Bielefeld, D-33615 Bielefeld (Germany); Rothkopf, Alexander [Institute for Theoretical Physics, Heidelberg University, Philosophenweg 16, D-69120 Heidelberg (Germany)
2016-01-22
We report recent results of a non-perturbative determination of the static heavy-quark potential in quenched and dynamical lattice QCD at finite temperature. The real and imaginary part of this complex quantity are extracted from the spectral function of Wilson line correlators in Coulomb gauge. To obtain spectral information from Euclidean time numerical data, our study relies on a novel Bayesian prescription that differs from the Maximum Entropy Method. We perform simulations on quenched 32³ × Nτ (β = 7.0, ξ = 3.5) lattices with Nτ = 24, …, 96, which cover 839MeV ≥ T ≥ 210MeV. To investigate the potential in a quark-gluon plasma with light u,d and s quarks we utilize Nf = 2 + 1 ASQTAD lattices with ml = ms/20 by the HotQCD collaboration, giving access to temperatures between 286MeV ≥ T ≥ 148MeV. The real part of the potential exhibits a clean transition from a linear, confining behavior in the hadronic phase to a Debye screened form above deconfinement. Interestingly its values lie close to the color singlet free energies in Coulomb gauge at all temperatures. We estimate the imaginary part on quenched lattices and find that it is of the same order of magnitude as in hard-thermal loop perturbation theory. From among all the systematic checks carried out in our study, we discuss explicitly the dependence of the result on the default model and the number of datapoints.
Giulia Carreras
2012-09-01
Full Text Available
Background: parameter uncertainty in the Markov model's description of a disease course is addressed. Probabilistic sensitivity analysis (PSA) is now considered the only tool that properly permits examination of parameter uncertainty. It consists of sampling values from the parameters' probability distributions.
Methods: Markov models fitted with microsimulation were considered, and methods for carrying out a PSA on transition probabilities were studied. Two Bayesian solutions were developed: for each row of the modeled transition matrix, the prior distribution was assumed to be a product of Betas or a Dirichlet. The two solutions differ in the source of information: several different sources for each transition in the Beta approach, and a single source for all transitions from a given health state in the Dirichlet. The two methods were applied to a simple cervical cancer model.
Results: differences between posterior estimates from the two methods were negligible. Results showed that the prior variability highly influences the posterior distribution.
Conclusions: the novelty of this work is the Bayesian approach that integrates the two prior distributions with a product-of-Binomials likelihood. Such methods could also be applied to cohort data, and their application to more complex models could be useful and unique in the cervical cancer context, as well as in other disease modeling.
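The Dirichlet variant of such a PSA can be sketched as row-wise posterior sampling of a transition matrix. The three-state layout and transition counts below are hypothetical, not the cervical cancer model's data:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical 3-state model (healthy, ill, dead); rows are observed
# transition counts out of each non-absorbing state (illustrative only).
observed_counts = np.array([
    [900,  80,  20],   # transitions out of "healthy"
    [100, 700, 200],   # transitions out of "ill"
])

def sample_transition_matrices(counts, n_draws=1000, prior=1.0):
    """Draw transition-matrix rows from independent Dirichlet posteriors.

    Each row gets a Dirichlet(prior + counts) posterior, so every draw is
    a valid probability vector; this mirrors a single-source Dirichlet
    approach to transition-probability uncertainty.
    """
    return np.stack(
        [rng.dirichlet(prior + row, size=n_draws) for row in counts],
        axis=1,
    )  # shape: (n_draws, n_rows, n_states)

samples = sample_transition_matrices(observed_counts)
```

Feeding each sampled matrix through the microsimulation then propagates transition-probability uncertainty into the model outputs, which is the essence of PSA.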
Significance of "stretched" mineral inclusions for reconstructing P-T exhumation history
Ashley, Kyle T.; Darling, Robert S.; Bodnar, Robert J.; Law, Richard D.
2015-06-01
Analysis of mineral inclusions in chemically and physically resistant hosts has proven to be valuable for reconstructing the P-T exhumation history of high-grade metamorphic rocks. The occurrence of cristobalite-bearing inclusions in garnets from Gore Mountain, New York, is unexpected because the peak metamorphic conditions reached are well removed (>600 °C too cold) from the stability field of this low-density silica polymorph that typically forms in high temperature volcanic environments. A previous study of samples from this area interpreted polymineralic inclusions consisting of cristobalite, albite and ilmenite as representing crystallized droplets of melt generated during a garnet-in reaction, followed by water loss from the inclusion to explain the reduction in inclusion pressure that drove the transformation of quartz to cristobalite. However, the recent discovery of monomineralic inclusions of cristobalite from the nearby Hooper Mine cannot be explained by this process. For these inclusions, we propose that the volume response to pressure and temperature changes during exhumation to Earth's surface resulted in large tensile stresses within the silica phase that would be sufficient to cause transformation to the low-density (low-pressure) form. Elastic modeling of other common inclusion-host systems suggests that this quartz-to-cristobalite example may not be a unique case. The aluminosilicate polymorph kyanite also has the capacity to retain tensile stresses if exhumed to Earth's surface after being trapped as an inclusion in plagioclase at P-T conditions within the kyanite stability field, with the stresses developed during exhumation sufficient to produce a transformation to andalusite. These results highlight the elastic environment that may arise during exhumation and provide a potential explanation of observed inclusions whose stability fields are well removed from P-T paths followed during exhumation.
Reconstructing the Solar Wind from Its Early History to Current Epoch
Airapetian, Vladimir S.; Usmanov, Arcadi V.
2016-02-01
Stellar winds from active solar-type stars can play a crucial role in removal of stellar angular momentum and erosion of planetary atmospheres. However, major wind properties except for mass-loss rates cannot be directly derived from observations. We employed a three-dimensional magnetohydrodynamic Alfvén wave driven solar wind model, ALF3D, to reconstruct the solar wind parameters including the mass-loss rate, terminal velocity, and wind temperature at 0.7, 2, and 4.65 Gyr. Our model treats the wind thermal electrons, protons, and pickup protons as separate fluids and incorporates turbulence transport, eddy viscosity, turbulent resistivity, and turbulent heating to properly describe proton and electron temperatures of the solar wind. To study the evolution of the solar wind, we specified three input model parameters, the plasma density, Alfvén wave amplitude, and the strength of the dipole magnetic field at the wind base for each of three solar wind evolution models that are consistent with observational constraints. Our model results show that in the Sun's early history, at 0.7 Gyr, the solar wind at 1 AU was twice as fast, ∼50 times denser and 2 times hotter than at the current epoch. The theoretical calculations of mass-loss rate appear to be in agreement with the empirically derived values for stars of various ages. These results can provide realistic constraints for wind dynamic pressures on magnetospheres of (exo)planets around the young Sun and other active stars, which is crucial in realistic assessment of the Joule heating of their ionospheres and corresponding effects of atmospheric erosion.
Amaya Gorostiza
The study of genetic information makes it possible to reconstruct the history of human populations. We sequenced the entire mtDNA control region (positions 16024 to 576 following the Cambridge Reference Sequence, CRS) of 605 individuals from seven Mesoamerican indigenous groups and one Aridoamerican group from the previously defined Greater Southwest, all within present-day Mexico. Samples were collected directly from the indigenous populations; the application of an individual survey made it possible to exclude related samples or those of other origins. Diversity indices and demographic estimates were calculated. AMOVAs were also calculated according to different criteria, and an MDS plot, based on FST distances, was built. We constructed individual networks for the four Amerindian haplogroups detected. Finally, barrier software was applied to detect genetic boundaries among populations. The results suggest a common origin of the indigenous groups, a small degree of European admixture, and inter-ethnic gene flow. The process of Mesoamerica's human settlement took place quickly, influenced by the region's orography, which facilitated the development of genetic and cultural differences. We find that the genetic structure is related to the region's geography rather than to cultural parameters, such as language. The human population gradually became fragmented; the groups remained relatively isolated and differentiated owing to small population sizes and different survival strategies. Genetic differences were detected between Aridoamerica and Mesoamerica, which can be subdivided into "East", "Center", "West" and "Southeast". The fragmentation process occurred mainly during the Mesoamerican Pre-Classic period, with the Otomí being one of the oldest groups. With an increased number of populations studied adding previously published data, there is no change in the conclusions, although significant genetic heterogeneity can be detected in Pima and
Reconstructing the geological history of the Egyptian Nile. Aswan - Kom Ombo phase
The Nile is the longest river in the world, stretching north for approximately 4,000 miles from East Africa to the Mediterranean. Over the past several million years the Nile has gradually changed its location and size. Reconstructing the geological history of the Nile and identifying the location of abandoned and now buried paleo-channels and deltas is an essential step in constructing land use maps. An initial study area between Aswan and Kom Ombo, Egypt was selected for a geologic and geophysical field survey supported with interpretation of Landsat TM, ASTER and radar SIR-C/X-SAR images. Simultaneously, gravity and magnetotelluric data were acquired along two traverses; one following Wadi Abu Subbaira, east of the Nile, and the other crossing the Wadi Kubania pre-Nile drainage system, to the west. Gravity data were collected using a Scintrex CG-5 gravimeter and a differential GPS, whereas the magnetotelluric data were collected using a controlled source audio magnetotelluric stratagem system. Integration of geologic field mapping, geophysical investigations, and interpretation of different types of remote sensing images was used to construct an improved geological and structural map of the study area. The constructed map reveals: (1) this area is strongly controlled by NE-SW, NW-SE and N-S trending basement structures, largely faults; (2) the evolution of these distinct fault sets was largely controlled by the Red Sea tectonics which started ∼22 Ma ago; separation of the Arabian plate from the African plate provided NE-SW extension which subsequently resulted in the development of deep NW-SE grabens (e.g. the Kubania graben) where strain was closely localized because of the presence of older NW-SE trending Precambrian structures; (3) there is a prominent pre-Nile drainage system dominated by W- and NW-trending drainages emerging from the uplifted Red Sea Hills prior to opening of the Red Sea; (4) Wadi Abu Subbaira
Reconstructing the tectonic history of Fennoscandia from its margins: The past 100 million years
Muir Wood, R. [EQE International Ltd (United Kingdom)
1995-12-01
In the absence of onland late Mesozoic and Cenozoic geological formations, the tectonic history of the Baltic Shield over the past 100 million years can be reconstructed from the thick sedimentary basins that surround Fennoscandia on three sides. Tectonic activity around Fennoscandia through this period has been diverse but can be divided into four main periods: a. pre North Atlantic spreading ridge (100-60 Ma), when transpressional deformation on the southern margins of Fennoscandia and transtensional activity to the west was associated with a NNE-SSW maximum compressive stress direction; b. the creation of the spreading ridge (60-45 Ma), when there was rifting along the western margin; c. the re-arrangement of spreading axes (45-25 Ma), when there was radial compression around Fennoscandia; and d. the re-emergence of the Iceland hot-spot (25-0 Ma), when the stress-field has come to accord with ridge or plume 'push'. Since 60 Ma the Alpine plate boundary has had little influence on Fennoscandia. The highest levels of deformation on the margins of Fennoscandia were achieved around 85 Ma and 60-55 Ma, with strain-rates around 10{sup -9}/year. Within the Baltic Shield, long-term strain rates have been around 10{sup -11}/year, with little evidence for significant deformations passing into the shield from the margins. Fennoscandian Border Zone activity, which was prominent from 90-60 Ma, was largely abandoned following the creation of the Norwegian Sea spreading ridge, and with the exception of the Lofoten margin, there is subsequently little evidence for deformation passing into Fennoscandia. Renewal of modest compressional deformation in the Voering Basin suggests that the 'Current Tectonic Regime' is of Quaternary age, although the orientation of the major stress axis has remained consistent since around 10 Ma. The past pattern of changes suggests that in the geological near-future, variations are to be anticipated in the magnitude rather than the orientation of stresses.
A unique opportunity to reconstruct the volcanic history of the island of Nevis, Lesser Antilles
Saginor, I.; Gazel, E.
2012-12-01
We report twelve new ICP-MS analyses and two 40Ar/39Ar ages for the Caribbean island of Nevis, located in the Lesser Antilles. These data show a very strong fractionation trend, suggesting that along-strike variations may be primarily controlled by the interaction of rising magma with the upper plate. If this fractionation trend is shown to correlate with age, it may suggest that underplating of the crust is responsible for variations in the makeup of erupted lava over time, particularly with respect to silica content. We have recently been given permission to sample a series of cores being drilled by a geothermal company, with the goal of reconstructing the volcanic history of the island. Drilling is often cost-prohibitive, making this a truly unique opportunity. Nevis has received little recent attention from researchers because it has not been active for at least 100,000 years and because of its proximity to the highly active Montserrat, which boasts its very own volcano observatory. However, there are a number of good reasons that make this region, and Nevis in particular, an ideal location for further analysis. First, and most importantly, is the access to thousands of meters of drill cores provided by a local geothermal company. Second, a robust earthquake catalog exists (Bengoubou-Valerius et al., 2008), so the dip and depth of the subducting slab are well known. These are fundamental parameters that influence the mechanics of a subduction zone; it would therefore be difficult to proceed if they were poorly constrained. Third, prior sampling of Nevis has been limited since Hutton and Nockolds (1978) published the only extensive petrologic study ever performed on the island. That paper contained only 43 geochemical analyses and 6 K-Ar ages, which are less reliable than more modern Ar-Ar ages. Subsequent studies tended to focus on water geochemistry (GeothermEx, 2005), geothermal potential (Geotermica Italiana, 1992; Huttrer, 1998
Inverse method for reconstruction of ground surface temperature history from borehole temperatures
Šafanda, Jan; Correia, A.; Majorowicz, J.; Rajver, D.
Matsuyama, 2003 - (Yamano, M.; Nagao, T.; Sweda, T.), s. 163-178 [Geothermal/dendrochronological paleoclimate reconstruction across eastern margin of Eurasia. Matsuyama (JP), 28.11.2002-30.11.2002] R&D Projects: GA AV ČR IAA3012005 Institutional research plan: CEZ:AV0Z3012916 Keywords: borehole temperature profiles * inversion method * climate reconstruction Subject RIV: DC - Seismology, Volcanology, Earth Structure
Cui, Qiaoyu; Gaillard, Marie-José; Lemdahl, Geoffrey; Olsson, Fredrik; Sugita, Shinya
2010-05-01
Quantitative reconstruction of past vegetation using fossil pollen has long been problematic. It is well known that pollen percentages and pollen accumulation rates do not represent vegetation abundance properly, because pollen values are influenced by many factors, of which inter-taxonomic differences in pollen productivity and vegetation structure are the most important. It is also recognized that pollen assemblages from large sites (lakes or bogs) record the characteristics of the regional vegetation, while pollen assemblages from small sites record local features. Based on the theoretical understanding of the factors and mechanisms that affect pollen representation of vegetation, Sugita (2007a, b) proposed the Landscape Reconstruction Algorithm (LRA) to estimate vegetation abundance in percentage cover for well-defined spatial scales. The LRA includes two models, REVEALS and LOVE. REVEALS estimates regional vegetation abundance at a spatial scale of 100 km × 100 km. LOVE estimates local vegetation abundance at the spatial scale of the relevant source area of pollen (RSAP sensu Sugita 1993) of the pollen site. REVEALS estimates are needed to apply LOVE in order to calculate the RSAP and the vegetation cover within the RSAP. The two models have been validated theoretically and empirically. Two small bogs in southern Sweden were studied for pollen, plant macrofossils, charcoal, and coleoptera in order to reconstruct the local Holocene forest and fire history (e.g. Greisman and Gaillard 2009; Olsson et al. 2009). We applied the LOVE model in order to (1) compare the LOVE estimates with pollen percentages for a better understanding of the local forest history, and (2) obtain more precise information on the local vegetation to explain between-site differences in fire history. We used pollen records from two large lakes in Småland to obtain REVEALS estimates for twelve continuous 500-yr time windows. Following the strategy of the Swedish VR LANDCLIM project (see Gaillard
Hiroshi Saito
2014-03-01
The decision making behaviors of humans and animals adapt and then satisfy an "operant matching law" in certain types of tasks. This was first pointed out by Herrnstein in his foraging experiments on pigeons. The matching law has been a landmark for elucidating the underlying processes of decision making and its learning in the brain. An interesting question is whether decisions are made deterministically or probabilistically. Conventional learning models of the matching law are based on the latter idea; they assume that subjects learn choice probabilities of the respective alternatives and decide stochastically with those probabilities. However, it is unknown whether the matching law can be accounted for by a deterministic strategy. To answer this question, we propose several deterministic Bayesian decision making models that hold certain incorrect beliefs about an environment. We show that a simple model produces behavior satisfying the matching law in static settings of a foraging task, but not in dynamic settings. We found that a model holding the belief that the environment is volatile works well in the dynamic foraging task and exhibits undermatching, a slight deviation from the matching law observed in many experiments. This model also demonstrates the double-exponential reward history dependency of a choice and a heavier-tailed run-length distribution, as has recently been reported in experiments on monkeys.
Reconstruction of bomb 14C time history recorded in the stalagmite from Postojna Cave
Karstic caves provide valuable archives for reconstructing past environmental conditions on the continents, thanks to the great stability of climatic conditions within a cave. Secondary minerals deposited in caves, known as speleothems, preserve records of long-term climatic and environmental changes at the site of their deposition and in its vicinity. The purity of speleothems and their chemical and physical stability make them exceptionally well suited for detailed geochemical and isotopic analysis.
Brescó, Ignacio
2016-01-01
Innis’ and Brinkmann’s papers (this issue) tackle two key aspects in cultural psychology: the mediating role played by the different systems of meanings throughout history in making sense of the world, and the normative role of those systems, including psychology itself. This paper offers a reflection on these two issues. It begins by highlighting the contribution of psychology and history, as emerging disciplines in the 19th Century, to the creation of a normative framework for the subject of modernity according to the needs of modern nation states. It also alludes to both disciplines’ common pursuit of a reference point in natural science in order to achieve a status that is on a par with the latter’s. It is argued that this resulted in an objectivist stance that equates the study of memory and history with an accurate reproduction of the past, thus concealing the mediated nature of past...
Sdrolias, M.; Müller, R.
2006-05-01
The South American-Antarctic margin has been characterised by numerous episodes of volcanic arc activity and ore deposit formation throughout much of the Mesozoic and Cenozoic. Although its Cenozoic subduction history is relatively well known, placing the Mesozoic arc-related volcanics and the emplacement of ore bodies in their plate tectonic context remains poorly constrained. We use a merged moving hotspot (Late Cretaceous-present) and palaeomagnetic/fixed hotspot (Early Cretaceous) reference frame, coupled with reconstructed spreading histories of the Pacific, Phoenix and Farallon plates, to understand the convergence history of the South American and Antarctic margins. We compute the age-area distribution of oceanic lithosphere through time, including subducting oceanic lithosphere, and estimate convergence rates along the margin. Additionally, we map the location and migration of spreading ridges along the margin and relate this to processes on the overriding plate. The South American-Antarctic margin in the late Jurassic-early Cretaceous was dominated by rapid convergence and the subduction of relatively young oceanic lithosphere ("Rocas Verdes" in southern South America). The speed of subduction increased again along the South American-Antarctic margin at ~105 Ma after another change in tectonic regime. Newly created crust from the Farallon-Phoenix ridge continued to be subducted along southern South America until the cessation of the Farallon-Phoenix ridge in the latest Cretaceous/beginning of the Cenozoic. The age of the subducting oceanic lithosphere along the South American-Antarctic margin has increased steadily through time.
Abdul Fattah, R.; Verweij, J.M.; Witmans, N.; Veen, J.H. ten
2012-01-01
3D basin modelling is used to investigate the history of maturation and hydrocarbon generation on the main platforms in the northwestern part of the offshore area of the Netherlands. The study area covers the Cleaverbank and Elbow Spit Platforms. Recently compiled maps and data are used to build the
Reconstructing Iconic Experiments in Electrochemistry: Experiences from a History of Science Course
Eggen, Per-Odd; Kvittingen, Lise; Lykknes, Annette; Wittje, Roland
2011-04-01
The decomposition of water by electricity, and the voltaic pile as a means of generating electricity, have both held an iconic status in the history of science as well as in the history of science teaching. These experiments featured in chemistry and physics textbooks, as well as in classroom teaching, throughout the nineteenth and twentieth centuries. This paper deals with our experiences in restaging the decomposition of water as part of a history of science course at the Norwegian University of Science and Technology, Trondheim, Norway. For the experiment we used an apparatus from our historical teaching collection and built a replica of a voltaic pile. We also traced the uses and meanings of decomposition of water within science and science teaching in schools and higher education in local institutions. Building the pile, and carrying out the experiments, held a few surprises that we did not anticipate through our study of written sources. The exercise gave us valuable insight into the nature of the devices and the experiment, and our students appreciated an experience of a different kind in a history of science course.
Reconstruction of exposure histories of meteorites from Antarctica and the Sahara
Neupert, U.; Neumann, S.; Leya, I.; Michel, R. [Hannover Univ. (Germany). Zentraleinrichtung fuer Strahlenschutz (ZfS); Kubik, P.W. [Paul Scherrer Inst. (PSI), Villigen (Switzerland); Bonani, G.; Hajdas, I.; Suter, M. [Eidgenoessische Technische Hochschule, Zurich (Switzerland)
1997-09-01
{sup 10}Be, {sup 14}C, and {sup 26}Al were analyzed in H-, L-, and LL-chondrites from the Acfer region in the Algerian Sahara and from the Allan Hills/Antarctica. Exposure histories and terrestrial ages could be determined. (author) 3 figs., 2 refs.
Lesaffre, Emmanuel
2012-01-01
The growth of biostatistics has been phenomenal in recent years and has been marked by considerable technical innovation in both methodology and computational practicality. One area that has experienced significant growth is Bayesian methods. The growing use of Bayesian methodology has taken place partly due to an increasing number of practitioners valuing the Bayesian paradigm as matching that of scientific discovery. In addition, computational advances have allowed for more complex models to be fitted routinely to realistic data sets. Through examples, exercises and a combination of introd
Draper, D.
2001-01-01
Article Outline: Glossary; Definition of the Subject and Introduction; The Bayesian Statistical Paradigm; Three Examples; Comparison with the Frequentist Statistical Paradigm; Future Directions; Bibliography.
In this paper, we consider a simple form of the expansion history of the Universe, referred to as the hybrid expansion law: a product of power-law and exponential functions. By construction, the ansatz mimics the power-law and de Sitter cosmologies as special cases, but also provides an elegant description of the transition from deceleration to cosmic acceleration. We point out the Brans-Dicke realization of the cosmic history under consideration. We construct potentials for quintessence, phantom and tachyon fields, which can give rise to the hybrid expansion law in general relativity. We investigate observational constraints on the model with hybrid expansion law applied to late-time acceleration as well as to the early Universe a la nucleosynthesis.
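One common way to write such a product-of-power-law-and-exponential ansatz (the symbols below follow general usage in the literature and are an illustrative sketch, not necessarily this record's exact notation) is:

```latex
% Hybrid expansion law: scale factor as power law times exponential
a(t) = a_0 \left(\frac{t}{t_0}\right)^{\alpha}
       \exp\!\left[\beta\left(\frac{t}{t_0} - 1\right)\right],
\qquad \alpha, \beta \geq 0,

% so the Hubble rate cleanly separates the two regimes:
H(t) = \frac{\dot a}{a} = \frac{\alpha}{t} + \frac{\beta}{t_0}.
```

At early times the power-law term α/t dominates (decelerated, matter- or radiation-like expansion); at late times the constant β/t₀ term dominates (de Sitter-like acceleration), which is what produces the deceleration-to-acceleration transition the abstract describes.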
Pasquato, Mario; Chung, Chul
2016-01-01
Context. Machine-Learning (ML) solves problems by learning patterns from data, with limited or no human guidance. In Astronomy, it is mainly applied to large observational datasets, e.g. for morphological galaxy classification. Aims. We apply ML to gravitational N-body simulations of star clusters that are either formed by merging two progenitors or evolved in isolation, planning to later identify Globular Clusters (GCs) that may have a history of merging from observational data. Methods. We ...
Feng, Chao-Jun
2016-01-01
To probe the late evolution history of the Universe, we adopt two kinds of optimal basis systems. One of them is constructed by performing principal component analysis (PCA) and the other is built by taking the multidimensional scaling (MDS) approach. Cosmological observables such as the luminosity distance can be decomposed into these basis systems. These bases are optimized for different kinds of cosmological models based on different physical assumptions, and even for a mixture model of them. Therefore, the so-called feature space projected from the basis systems is cosmological-model independent, and it provides a parameterization for studying and reconstructing the Hubble expansion rate from the supernova luminosity distance and even gamma-ray burst (GRB) data with self-calibration. The circular problem when using GRBs as cosmological candles is naturally eliminated in this procedure. By using the Levenberg-Marquardt (LM) technique and the Markov Chain Monte Carlo (MCMC) method, we perform an ...
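As a minimal sketch of the PCA half of this basis-building idea (the toy data and all names here are invented for illustration; this is not the paper's pipeline): an ensemble of distance-like curves is decomposed into an optimal linear basis via the SVD, and each curve is then represented by a few coefficients in that basis.

```python
import numpy as np

rng = np.random.default_rng(0)
z = np.linspace(0.01, 2.0, 50)          # redshift grid (hypothetical)

# mock ensemble of smooth distance-like curves with varied parameters
params = rng.normal(0.0, 0.1, (200, 2))
curves = np.array([(1 + a) * z + b * z**2 for a, b in params])

# PCA via SVD of the mean-centered ensemble
mean = curves.mean(axis=0)
centered = curves - mean
_, s, vt = np.linalg.svd(centered, full_matrices=False)
explained = s**2 / np.sum(s**2)         # variance fraction per component

# project each curve onto the top-3 basis functions and reconstruct
coeffs = centered @ vt[:3].T
recon = mean + coeffs @ vt[:3]
err = np.max(np.abs(recon - curves))    # tiny: curves span only 2 modes
```

Because the mock curves are exact linear combinations of z and z², two principal components already capture them, so the truncated reconstruction is essentially exact; with real supernova data one would instead keep as many components as the noise level justifies.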
Demographic History of the Genus Pan Inferred from Whole Mitochondrial Genome Reconstructions
Tucci, Serena; de Manuel, Marc; Ghirotto, Silvia; Benazzo, Andrea; Prado-Martinez, Javier; Lorente-Galdos, Belen; Nam, Kiwoong; Dabad, Marc; Hernandez-Rodriguez, Jessica; Comas, David; Navarro, Arcadi; Schierup, Mikkel H.; Andres, Aida M.; Barbujani, Guido; Hvilsom, Christina; Marques-Bonet, Tomas
2016-01-01
The genus Pan is the closest genus to our own and it includes two species, Pan paniscus (bonobos) and Pan troglodytes (chimpanzees). The latter comprises four subspecies, all highly endangered. The study of the genus Pan has been persistently complicated by the intricate relationship among subspecies and the statistical limitations imposed by the reduced number of samples or genomic markers analyzed. Here, we present a new method to reconstruct complete mitochondrial genomes (mitogenomes) from whole genome shotgun (WGS) datasets, mtArchitect, showing that its reconstructions are highly accurate and consistent with long-range PCR mitogenomes. We used this approach to build the mitochondrial genomes of 20 newly sequenced samples which, together with available genomes, allowed us to analyze the hitherto most complete Pan mitochondrial genome dataset, including 156 chimpanzee and 44 bonobo individuals, with a proportional contribution from all chimpanzee subspecies. We estimated the separation time between chimpanzees and bonobos at around 1.15 million years ago (Mya) [0.81–1.49]. Further, we found that under the most probable genealogical model the two clades of chimpanzees, Western + Nigeria-Cameroon and Central + Eastern, separated at 0.59 Mya [0.41–0.78], with further internal separations at 0.32 Mya [0.22–0.43] and 0.16 Mya [0.17–0.34], respectively. Finally, for a subset of our samples, we compared nuclear versus mitochondrial genomes and found that chimpanzee subspecies have different patterns of nuclear and mitochondrial diversity, which could be a result either of processes affecting the mitochondrial genome, such as hitchhiking or background selection, or of population dynamics. PMID:27345955
Reconstructing the star formation history of the Milky Way disc(s) from chemical abundances
Snaith, O.; Haywood, M.; Di Matteo, P.; Lehnert, M. D.; Combes, F.; Katz, D.; Gómez, A.
2015-06-01
We develop a chemical evolution model to study the star formation history of the Milky Way. Our model assumes that the Milky Way has formed from a closed-box-like system in the inner regions, while the outer parts of the disc have experienced some accretion. Unlike the usual procedure, we do not fix the star formation prescription (e.g. Kennicutt law) to reproduce the chemical abundance trends. Instead, we fit the abundance trends with age to recover the star formation history of the Galaxy. Our method enables us to recover the star formation history of the Milky Way in the first Gyrs with unprecedented accuracy in both the inner (R < 9-10 kpc) and outer (R > 9-10 kpc) discs, as sampled in the solar vicinity. We show that half the stellar mass formed during the thick-disc phase in the inner galaxy during the first 4-5 Gyr. This phase was followed by a significant dip in star formation activity (at 8-9 Gyr) and a period of roughly constant lower-level star formation for the remaining 8 Gyr. The thick-disc phase has produced as many metals in 4 Gyr as the thin-disc phase in the remaining 8 Gyr. Our results suggest that a closed-box model is able to fit all the available constraints in the inner disc. A closed-box system is qualitatively equivalent to a regime where the accretion rate maintains a high gas fraction in the inner disc at high redshift. In these conditions the SFR is mainly governed by the high turbulence of the interstellar medium. By z ~ 1 it is possible that most of the accretion takes place in the outer disc, while the star formation activity in the inner disc is mostly sustained by the gas that is not consumed during the thick-disc phase and the continuous ejecta from earlier generations of stars. The outer disc follows a star formation history very similar to that of the inner disc, although initiated at z ~ 2, about 2 Gyr before the onset of the thin-disc formation in the inner disc.
A Blacker and Browner Shade of Pale: Reconstructing Punk Rock History
Pietschmann, Franziska
2010-01-01
Embedded in the transatlantic history of rock ‘n’ roll, punk rock has not only been regarded as a watershed moment in terms of music, aesthetics and music-related cultural practices, it has also been perceived as a subversive white cultural phenomenon. A Blacker and Browner Shade of Pale challenges this widespread and shortsighted assumption. People of color, particularly black Americans and Britons, and Latina/os have pro-actively contributed to punk’s evolution and shaped punk music culture...
Z. Y. Wu
2011-09-01
The 1951–2009 drought history of China is reconstructed using daily soil moisture values generated by the Variable Infiltration Capacity (VIC) land surface macroscale hydrology model. VIC is applied over a grid of 10 458 points with a spatial resolution of 30 km × 30 km, and is driven by observed daily maximum and minimum air temperature and precipitation from 624 long-term meteorological stations. The VIC soil moisture is used to calculate the Soil Moisture Anomaly Percentage Index (SMAPI), which can be used as a measure of the severity of agricultural drought on a global basis. We have developed a SMAPI-based drought identification procedure for practical uses in the identification of both grid point and regional drought events. As a result, a total of 325 regional drought events varying in time and strength are identified from China's nine drought study regions. These drought events can thus be assessed quantitatively at different spatial and temporal scales. The result shows that the severe drought events of 1978, 2000 and 2006 are well reconstructed, which indicates that the SMAPI is capable of identifying the onset of a drought event, its progression, as well as its termination. Spatial and temporal variations of droughts in China's nine drought study regions are studied. Our result shows that on average, up to 30% of the total area of China is prone to drought. Regionally, an upward trend in drought-affected areas has been detected in three regions (Inner Mongolia, Northeast and North) from 1951–2009. However, the decadal variability of droughts has been weak in the remaining five regions (South, Southwest, East, Northwest, and Tibet). Xinjiang has even become steadily wetter since the 1950s. Two regional dry centres are discovered in China as the result of a combined analysis on the occurrence of drought events from both grid points and drought study regions. The first centre is located in the area partially covered by the North
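The SMAPI used above is simply the percentage departure of soil moisture from its long-term climatological mean. A minimal sketch of that index, plus a run-length style drought-event identification (the threshold and minimum duration below are illustrative assumptions, not the paper's values):

```python
def smapi(theta, theta_clim):
    """Soil Moisture Anomaly Percentage Index: percent departure of
    soil moisture from its long-term climatological mean."""
    return 100.0 * (theta - theta_clim) / theta_clim

def drought_spells(series, threshold=-10.0, min_len=3):
    """Identify drought events as runs of SMAPI below `threshold`
    lasting at least `min_len` time steps (illustrative values only)."""
    events, start = [], None
    for i, v in enumerate(series):
        if v < threshold and start is None:
            start = i
        elif v >= threshold and start is not None:
            if i - start >= min_len:
                events.append((start, i - 1))
            start = None
    if start is not None and len(series) - start >= min_len:
        events.append((start, len(series) - 1))
    return events

# Synthetic soil moisture series against a climatology of 0.30.
vals = [smapi(t, 0.30) for t in [0.31, 0.26, 0.25, 0.24, 0.29, 0.31]]
print(drought_spells(vals))  # one event spanning indices 1-3
```

A gridded application would run this per grid point and then merge spatially contiguous point events into the regional events the abstract describes.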
Testing sex and gender in sports; reinventing, reimagining and reconstructing histories.
Heggie, Vanessa
2010-12-01
Most international sports organisations work on the premise that human beings come in one of two genders: male or female. Consequently, all athletes, including intersex and transgender individuals, must be assigned to compete in one or other category. Since the 1930s (not, as is popularly suggested, the 1960s) these organisations have relied on scientific and medical professionals to provide an 'objective' judgement of an athlete's eligibility to compete in women's national and international sporting events. The changing nature of these judgements reflects a great deal about our cultural, social and national prejudices, while the matter of testing itself has become a site of conflict for feminists and human rights activists. Because of the sensitive nature of this subject, histories of sex testing are difficult to write and research; this has led to the repetition of inaccurate information and false assertions about gender fraud, particularly in relation to the 'classic' cases of Stella Walsh and Heinrich/Hermann/Dora Ratjen. As historians, we need to be extremely careful to differentiate between mythologies and histories. PMID:20980057
Rosenberg-Wohl, David Michael
2014-01-01
Abstract: Reconstructing Jewish Identity on the Foundations of Hellenistic History: Azariah de' Rossi's Me'or `Enayim in Late 16th Century Northern Italy, by David Michael Rosenberg-Wohl, Doctor of Philosophy in Jewish Studies, the Graduate Theological Union; Professor Erich S. Gruen, Chair. Me'or `Enayim is conventionally considered to be early modern Jewish history. Recent scholarship tends to consider the work Renaissance historiography, Counter-Reformation apology or some combination of the two. The...
Reconstructing the history of 14C discharges from Sellafield: Part 1--atmospheric discharges
14C specific activities, above ambient background levels, were determined in individual tree-rings (corresponding to the years 1950-1999) sectioned from an oak tree that was felled in autumn 1999, from a location 1.5 km east of the Sellafield nuclear fuel reprocessing plant in Cumbria, north-west England. The data were used to produce a new, improved, reconstruction of Sellafield's annual atmospheric 14C discharges between 1951 and 1999, using the most reliable discharge data set (1994-1999) as the primary basis for the determination of a new calibration factor that relates excess 14C activity in individual tree rings to the annual discharge during the corresponding year. The results indicate that the current British Nuclear Fuels plc (BNFL) estimate of total 14C discharges to the atmosphere prior to 1978 is significantly overestimated, while the current estimate of total 14C discharges after 1978 is very similar to that determined in this study. In this study, the total activity of 14C discharged to the atmosphere from Sellafield between 1951 and 1999 is estimated to be 259±63 TBq (at 2 std. dev.). The BNFL current estimate is 360 TBq
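The calibration step described above can be sketched as follows: a factor relating excess tree-ring 14C to the corresponding year's discharge is estimated from the most reliable recent period (1994-1999), then inverted to reconstruct earlier annual discharges. All numbers below are hypothetical placeholders, not the study's data:

```python
# Hypothetical values for illustration; the paper calibrates measured
# tree-ring excess 14C against BNFL's reported 1994-1999 discharges.
excess_recent = {1994: 12.0, 1995: 10.5, 1996: 9.0,
                 1997: 8.0, 1998: 7.5, 1999: 7.0}    # Bq/kg C above background
discharge_recent = {1994: 6.0, 1995: 5.2, 1996: 4.5,
                    1997: 4.0, 1998: 3.8, 1999: 3.5}  # TBq/yr

# Calibration factor: excess tree-ring activity per unit annual discharge.
k = sum(excess_recent[y] / discharge_recent[y]
        for y in excess_recent) / len(excess_recent)

def reconstruct_discharge(excess_activity):
    """Annual discharge (TBq) inferred from a ring's excess 14C activity."""
    return excess_activity / k

print(round(k, 2), round(reconstruct_discharge(4.0), 2))
```

Summing the reconstructed annual values over 1951-1999 would give the total-release estimate quoted in the abstract; the uncertainty there propagates from both the activity measurements and the scatter in the calibration ratios.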
Smith, Oliver; Clapham, Alan; Rose, Pam; Liu, Yuan; Wang, Jun; Allaby, Robin G
2014-01-01
The origins of many plant diseases appear to be recent and associated with the rise of domestication, the spread of agriculture or recent global movements of crops. Distinguishing between these possibilities is problematic because of the difficulty of determining rates of molecular evolution over short time frames. Heterochronous approaches using recent and historical samples show that plant viruses exhibit highly variable and often rapid rates of molecular evolution. The accuracy of estimated evolution rates and age of origin can be greatly improved with the inclusion of older molecular data from archaeological material. Here we present the first reconstruction of an archaeological RNA genome, which is of Barley Stripe Mosaic Virus (BSMV) isolated from barley grain ~750 years of age. Phylogenetic analysis of BSMV that includes this genome indicates the divergence of BSMV and its closest relative prior to this time, most likely around 2000 years ago. However, exclusion of the archaeological data results in an apparently much more recent origin of the virus that postdates even the archaeological sample. We conclude that this viral lineage originated in the Near East or North Africa, and spread to North America and East Asia with their hosts along historical trade routes. PMID:24499968
Regional reconstruction of flash flood history in the Guadarrama range (Central System, Spain).
Rodriguez-Morata, C; Ballesteros-Cánovas, J A; Trappmann, D; Beniston, M; Stoffel, M
2016-04-15
Flash floods are a common natural hazard in Mediterranean mountain environments and responsible for serious economic and human disasters. The study of flash flood dynamics and their triggers is a key issue; however, the retrieval of historical data is often limited in mountain regions as a result of short time series and the systematic lack of historical data. In this study, we attempt to overcome data deficiency by supplementing existing records with dendrogeomorphic techniques which were employed in seven mountain streams along the northern slopes of the Guadarrama Mountain range. Here we present results derived from the tree-ring analysis of 117 samples from 63 Pinus sylvestris L. trees injured by flash floods, to complement existing flash flood records covering the last ~200 years and comment on their hydro-meteorological triggers. To understand the varying number of reconstructed flash flood events in each of the catchments, we also performed a comparative analysis of geomorphic catchment characteristics, land use evolution and forest management. Furthermore, we discuss the limitations of dendrogeomorphic techniques applied in managed forests. PMID:26845178
van Eaton, A. R.; Zimmerman, A.; Brenner, M.; Kenney, W.; Jaeger, J. M.
2006-12-01
Historical reconstructions of aquatic systems have commonly depended on short-lived radioisotopes (e.g. Pb-210 and Cs-137) to provide a temporal framework for disturbances over the past 100 years. However, applications of these radiotracers to highly variable systems such as estuaries are often problematic. Hydrologic systems prone to rapid shifts in sediment composition and grain size distribution may yield low and erratic isotopic activities with depth in sediment. Additionally, the marine influence on coastal systems and preferential adsorption of radionuclides by organic matter may violate assumptions of the CIC and CRS dating models. Whereas these sediment cores are often deemed "undateable", we propose a modeling technique that accounts for textural and compositional variation, providing insight into the depositional patterns and disturbance records of these dynamic environments. Here, the technique is applied to sediment cores collected from five regions of Naples Bay estuary in southwest Florida. The significant positive correlation between excess Pb-210 activities and organic matter content in each core provides evidence for strong lithologic control on radioisotope scavenging, supporting the use of organic matter-normalized excess Pb-210 activity profiles when modeling sediment accumulation rates in predominantly sandy estuaries. Using this approach, episodes of increased sedimentation rate were established that correspond to periods of heightened anthropogenic disturbance (canal dredging and development) in the Naples Bay watershed during the mid-1900s.
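The normalization proposed above can be sketched as a CF:CS-style estimate in which excess Pb-210 activity is divided by the organic matter fraction before the usual log-linear fit against depth. The decay constant is standard, but the core data below are synthetic, constructed so that the true rate is known:

```python
import math

LAMBDA_PB210 = math.log(2) / 22.3  # Pb-210 decay constant, 1/yr

def accumulation_rate(depths_cm, excess_pb210, organic_frac):
    """CF:CS-style sediment accumulation rate (cm/yr) after normalizing
    excess Pb-210 by organic matter content, as suggested for sandy
    estuarine cores: fit ln(A/f_OM) vs depth and invert the slope."""
    y = [math.log(a / f) for a, f in zip(excess_pb210, organic_frac)]
    n = len(depths_cm)
    mx = sum(depths_cm) / n
    my = sum(y) / n
    slope = sum((x - mx) * (yy - my) for x, yy in zip(depths_cm, y)) / \
            sum((x - mx) ** 2 for x in depths_cm)
    return -LAMBDA_PB210 / slope

# Synthetic core: activity decays with depth but is modulated by a
# variable organic fraction, mimicking a sandy, lithologically noisy core.
depths = [2, 6, 10, 14, 18]
om = [0.05, 0.04, 0.06, 0.05, 0.04]
true_rate = 0.5  # cm/yr, known by construction
act = [f * math.exp(-LAMBDA_PB210 * d / true_rate)
       for f, d in zip(om, depths)]
print(round(accumulation_rate(depths, act, om), 2))
```

Without the normalization, the organic-fraction wiggles would appear as erratic activity with depth; dividing them out recovers the underlying decay profile.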
Stawinski, G
1998-10-26
Bayesian algorithms are developed to solve inverse problems in gamma imaging and photofission tomography. The first part of this work is devoted to the modeling of our measurement systems. Two models have been found for both applications: the first one is a simple conventional model and the second one is a cascaded point process model. EM and MCMC Bayesian algorithms for image restoration and image reconstruction have been developed for these models and compared. The cascaded point process model does not improve significantly the results previously obtained by the classical model. Two original approaches have been proposed, which improve on the results previously obtained. The first approach uses an inhomogeneous Markov Random Field as a prior law, and makes the regularization parameter spatially vary. However, the problem of the estimation of hyper-parameters has not been solved. In the case of the deconvolution of point sources, a second approach has been proposed, which introduces a high level prior model. The picture is modeled as a list of objects, whose parameters and number are unknown. The results obtained with this method are more accurate than those obtained with the conventional Markov Random Field prior model and require less computational costs. (author)
How to make judicious use of current physics in reconstructing its history
Janssen, Michel
2013-04-01
Using three concrete examples, I illustrate both benefits and pitfalls of approaching the history of relativity and quantum theory with current textbook knowledge of these subjects. First, I show how knowing something about energy-momentum tensors in special relativity makes it easy to see that special relativity did not, as has been suggested, simply kill the program of Abraham and others at the beginning of the 20th century to reduce all of physics to electrodynamics, but co-opted key elements of it. Second, I show how knowing something about coordinate conditions in general relativity can be an obstacle to seeing why Einstein initially rejected field equations based on the Ricci tensor. Third, I show how knowing something about Hilbert space can be an obstacle to seeing the logic behind Jordan's statistical transformation theory. These three examples suggest that knowledge of modern physics is beneficial for historians, but only when used judiciously.
Pasquato, Mario
2016-01-01
Context. Machine-Learning (ML) solves problems by learning patterns from data, with limited or no human guidance. In Astronomy, it is mainly applied to large observational datasets, e.g. for morphological galaxy classification. Aims. We apply ML to gravitational N-body simulations of star clusters that are either formed by merging two progenitors or evolved in isolation, planning to later identify Globular Clusters (GCs) that may have a history of merging from observational data. Methods. We create mock-observations from simulated GCs, from which we measure a set of parameters (also called features in the machine-learning field). After dimensionality reduction on the feature space, the resulting datapoints are fed to various classification algorithms. Using repeated random subsampling validation we check whether the groups identified by the algorithms correspond to the underlying physical distinction between mergers and monolithically evolved simulations. Results. The three algorithms we considered (C5.0 tree...
Mahamud, Kira; Martínez Ruiz-Funes, María José
2014-01-01
This paper describes a study dealing with the reconstruction of the lives of two Spanish primary school teachers during the Franco dictatorship (1939-1975), in order to learn to what extent such a field of research can contribute to the history of education. Two family archives provide extraordinary and unique documentation to track down their…
Brito, Angmary; Rodriguez, Maria A.; Niaz, Mansoor
2005-01-01
The objectives of this study are: (a) elaboration of a history and philosophy of science (HPS) framework based on a reconstruction of the development of the periodic table; (b) formulation of seven criteria based on the framework; and (c) evaluation of 57 freshman college-level general chemistry textbooks with respect to the presentation of the…
Jiang, Xu; Deng, Yong; Luo, Zhaoyang; Luo, Qingming
2015-10-01
The excessive time required by fluorescence diffuse optical tomography (fDOT) image reconstruction based on path-history fluorescence Monte Carlo model is its primary limiting factor. Herein, we present a method that accelerates fDOT image reconstruction. We employ three-level parallel architecture including multiple nodes in cluster, multiple cores in central processing unit (CPU), and multiple streaming multiprocessors in graphics processing unit (GPU). Different GPU memories are selectively used, the data-writing time is effectively eliminated, and the data transport per iteration is minimized. Simulation experiments demonstrated that this method can utilize general-purpose computing platforms to efficiently implement and accelerate fDOT image reconstruction, thus providing a practical means of using path-history-based fluorescence Monte Carlo model for fDOT imaging. PMID:26480115
Reconstruction of the paleothermal history of the sub-basin, Lower Guajira, Colombia
The paleothermal history of the Lower Guajira sub-basin was reconstructed using three methods: 1) calculation of the present geothermal gradient and heat flow, 2) vitrinite reflectance, and 3) fission-track analysis on apatite and zircon grains. New analytical data from vitrinite reflectance and fission tracks allowed the identification of four thermal events with the following features: the oldest thermal event took place during the Late Cretaceous, between 95 and 65 Ma. The second thermal event occurred during the late Eocene, between 40 and 35 Ma. The third thermal event occurred in the early to middle Miocene, between 22 and 15 Ma. The fourth and last thermal event took place in the late Miocene, between 10 and 5 Ma. The cooling events match four unconformities previously identified: 1) the Late Cretaceous unconformity at the top of the Guaralamai Formation, 2) the late Eocene unconformity at the base of the Siamana Formation, 3) the early to middle Miocene unconformity at the top of the Siamana Formation, and 4) the late Miocene unconformity at the top of the Castilletes Formation.
Pasquato, Mario; Chung, Chul
2016-05-01
Context. Machine-learning (ML) solves problems by learning patterns from data with limited or no human guidance. In astronomy, ML is mainly applied to large observational datasets, e.g. for morphological galaxy classification. Aims: We apply ML to gravitational N-body simulations of star clusters that are either formed by merging two progenitors or evolved in isolation, planning to later identify globular clusters (GCs) that may have a history of merging from observational data. Methods: We create mock-observations from simulated GCs, from which we measure a set of parameters (also called features in the machine-learning field). After carrying out dimensionality reduction on the feature space, the resulting datapoints are fed into various classification algorithms. Using repeated random subsampling validation, we check whether the groups identified by the algorithms correspond to the underlying physical distinction between mergers and monolithically evolved simulations. Results: The three algorithms we considered (C5.0 trees, k-nearest neighbour, and support-vector machines) all achieve a test misclassification rate of about 10% without parameter tuning, with support-vector machines slightly outperforming the others. The first principal component of feature space correlates with cluster concentration. If we exclude it from the regression, the performance of the algorithms is only slightly reduced.
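The validation scheme described above, repeated random subsampling, can be sketched with a toy nearest-centroid classifier standing in for C5.0, k-NN, and SVMs. The two synthetic "feature space" clusters below are assumptions for illustration, not the paper's simulated GC features:

```python
import random

def nearest_centroid_fit(X, y):
    """Toy stand-in for the paper's classifiers: one centroid per class."""
    cents = {}
    for label in set(y):
        pts = [x for x, l in zip(X, y) if l == label]
        cents[label] = [sum(c) / len(c) for c in zip(*pts)]
    return cents

def predict(cents, x):
    return min(cents, key=lambda l: sum((a - b) ** 2
                                        for a, b in zip(x, cents[l])))

def repeated_subsampling_error(X, y, n_rep=50, test_frac=0.3, seed=0):
    """Repeated random subsampling validation: average test
    misclassification rate over many random train/test splits."""
    rng = random.Random(seed)
    errs = []
    for _ in range(n_rep):
        idx = list(range(len(X)))
        rng.shuffle(idx)
        cut = int(len(X) * test_frac)
        test, train = idx[:cut], idx[cut:]
        cents = nearest_centroid_fit([X[i] for i in train],
                                     [y[i] for i in train])
        wrong = sum(predict(cents, X[i]) != y[i] for i in test)
        errs.append(wrong / len(test))
    return sum(errs) / len(errs)

# Two well-separated synthetic clusters (merger vs monolithic evolution).
rng = random.Random(1)
X = [[rng.gauss(0, 1), rng.gauss(0, 1)] for _ in range(40)] + \
    [[rng.gauss(5, 1), rng.gauss(5, 1)] for _ in range(40)]
y = ["monolithic"] * 40 + ["merger"] * 40
print(repeated_subsampling_error(X, y))
```

Averaging over many random splits, rather than one fixed hold-out, is what gives the ~10% misclassification figure in the abstract its stability.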
The reconstructive study in archaeology: case histories in communication issues
Francesco Gabellone
2011-09-01
The most significant results obtained by the Information Technologies Lab (IBAM CNR - ITLab) in the construction of VR-based knowledge platforms have been achieved in projects such as ByHeriNet, Archeotour, Interadria, Interreg Greece-Italy, Iraq Virtual Museum, etc. These projects were guided by the belief that in order to be effective, the process of communicating Cultural Heritage to the wider public should be as free as possible from the sterile old VR interfaces of the 1990s. In operational terms, this translates into solutions that are as lifelike as possible and guarantee the maximum emotional involvement of the viewer, adopting the same techniques as are used in modern cinema. Communication thus becomes entertainment and a vehicle for high-quality content, aimed at the widest possible public and produced with the help of interdisciplinary tools and methods. In this context, high-end technologies are no longer the goal of research; rather they are the invisible engine of an unstoppable process that is making it harder and harder to distinguish between computer images and real objects. An emblematic case in this regard is the reconstructive study of ancient contexts, where three-dimensional graphics compensate for the limited expressive potential of two-dimensional drawings and allows for interpretative and representative solutions that were unimaginable a few years ago. The virtual space thus becomes an important opportunity for reflection and study, as well as constituting a revolutionary way to learn for the wider public.
Feng, Chao-Jun; Li, Xin-Zhou
2016-04-01
To probe the late evolution history of the universe, we adopt two kinds of optimal basis systems. One of them is constructed by performing the principal component analysis, and the other is built by taking the multidimensional scaling approach. Cosmological observables such as the luminosity distance can be decomposed into these basis systems. These basis systems are optimized for different kinds of cosmological models that are based on different physical assumptions, even for a mixture model of them. Therefore, the so-called feature space that is projected from the basis systems is cosmological model independent, and it provides a parameterization for studying and reconstructing the Hubble expansion rate from the supernova luminosity distance and even gamma-ray burst (GRB) data with self-calibration. The circular problem when using GRBs as cosmological candles is naturally eliminated in this procedure. By using the Levenberg–Marquardt technique and the Markov Chain Monte Carlo method, we perform an observational constraint on this kind of parameterization. The data we used include the “joint light-curve analysis” data set that consists of 740 Type Ia supernovae and 109 long GRBs with the well-known Amati relation.
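The basis-decomposition idea above can be sketched in miniature: build an orthonormal basis from sampled model curves (Gram-Schmidt here, as a simple stand-in for the paper's PCA/MDS constructions) and represent an observable by its projection coefficients, which play the role of the model-independent feature space:

```python
import math

def gram_schmidt(curves):
    """Orthonormal basis from sampled model curves (stand-in for the
    paper's PCA/MDS-optimized basis systems)."""
    basis = []
    for c in curves:
        v = list(c)
        for b in basis:
            proj = sum(x * y for x, y in zip(v, b))
            v = [x - proj * y for x, y in zip(v, b)]
        norm = math.sqrt(sum(x * x for x in v))
        if norm > 1e-12:
            basis.append([x / norm for x in v])
    return basis

def decompose(obs, basis):
    """Coefficients of an observable (e.g. luminosity-distance samples)
    in the basis: its 'feature space' representation."""
    return [sum(x * y for x, y in zip(obs, b)) for b in basis]

# Toy model curves sampled on a redshift grid (purely illustrative).
z = [0.1 * i for i in range(1, 11)]
m1 = [zz for zz in z]        # toy model curve 1
m2 = [zz ** 2 for zz in z]   # toy model curve 2
basis = gram_schmidt([m1, m2])
obs = [2.0 * a + 0.5 * b for a, b in zip(m1, m2)]  # mixture of both models
coeffs = decompose(obs, basis)
recon = [sum(c * b[i] for c, b in zip(coeffs, basis)) for i in range(len(z))]
residual = max(abs(o - r) for o, r in zip(obs, recon))
print(residual < 1e-9)
```

Because the mixture observable lies in the span of the model curves, its basis coefficients reconstruct it exactly; real data would leave a residual that the MCMC fit then constrains.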
Brunschön, Corinna; Behling, Hermann
2009-11-01
The last ca. 20,000 yr of palaeoenvironmental conditions in Podocarpus National Park in the southeastern Ecuadorian Andes have been reconstructed from two pollen records from Cerro Toledo (04°22'28.6"S, 79°06'41.5"W) at 3150 m and 3110 m elevation. Páramo vegetation with high proportions of Plantago rigida characterised the last glacial maximum (LGM), reflecting cold and wet conditions. The upper forest line was at markedly lower elevations than present. After ca. 16,200 cal yr BP, páramo vegetation decreased slightly while mountain rainforest developed, suggesting rising temperatures. The trend of increasing temperatures and mountain rainforest expansion continued until ca. 8500 cal yr BP, while highest temperatures probably occurred from 9300 to 8500 cal yr BP. From ca. 8500 cal yr BP, páramo vegetation re-expanded with dominance of Poaceae, suggesting a change to cooler conditions. During the late Holocene after ca. 1800 cal yr BP, a decrease in páramo indicates a change to warmer conditions. Anthropogenic impact near the study site is indicated for times after 2300 cal yr BP. The regional environmental history indicates that through time the eastern Andean Cordillera in South Ecuador was influenced by eastern Amazonian climates rather than western Pacific climates.
Kirstein, Roland
2005-01-01
This paper presents a modification of the inspection game: The 'Bayesian Monitoring' model rests on the assumption that judges are interested in enforcing compliant behavior and making correct decisions. They may base their judgements on an informative but imperfect signal which can be generated costlessly. In the original inspection game, monitoring is costly and generates a perfectly informative signal. While the inspection game has only one mixed strategy equilibrium, three Perfect Bayesia...
Müller, S.; Tarasov, P. E.; Andreev, A. A.; Diekmann, B.
2009-04-01
The study presented here is part of the IPY project 106 "Lake Records of late Quaternary Climate Variability in northeastern Siberia" and the German Research Foundation project RI 809/17-1,2 "Late Quaternary environmental history of interstadial and interglacial periods in the Arctic reconstructed from bioindicators in permafrost sequences in NE Siberia". Both projects focus on generating high-resolution vegetation and climate proxy records mainly from lacustrine sediments along a north-south transect from Yakutia, Republic of Russia. This region is known for its climate extremes, with the Verkhoyansk Mountain Range being the coldest area in the Northern Hemisphere - "Pole of Cold". Radiocarbon-dated pollen records from Lake Billyakh (65°17'N, 126°47'E; 340 m a.s.l.) located in the central part of the Verkhoyansk Mountains were used to reconstruct vegetation and climate changes. The longest and oldest sediment core from the lake reaches back to >30 kyr BP, thus covering the last two Late Pleistocene Interstadials in Siberia. The pollen record and pollen-based biome reconstruction of the core PG 1756, which covers the last 15 kyr BP, suggest that open cool steppe and grass and sedge tundra communities with Poaceae, Cyperaceae, Artemisia, Chenopodiaceae, Caryophyllaceae and Selaginella rupestris dominated the area from 15 to 13.5 kyr BP. On the other hand, the constant presence of Larix pollen in quantities comparable to today's values points to the constant presence of boreal deciduous conifer trees in the regional vegetation during the last glaciation. A major spread of shrub tundra communities, including birch (Betula sect. Nanae), alder (Duschekia fruticosa) and willow (Salix) species, is dated to 13.5-12.7 kyr BP, indicating a noticeable increase in precipitation toward the end of the last glaciation, particularly during the Allerød Interstadial. Between 12.7 and 11.4 kyr BP pollen percentages of herbaceous taxa rapidly increased, whereas shrub taxa
Anonymous
2000-01-01
By investigating typical Palaeozoic and Mesozoic petroleum-bearing basins in China and using thermal maturation theories of organic matter to improve the conventional Karweil method, a new method to reconstruct the hydrocarbon-generating histories of source rocks has been suggested. This method, combining geological background with geochemical information, makes the calculated VRo closer to the measured one. Moreover, it enables us to make clear the hydrocarbon generation trend of source rocks during geological history. The method has the merits of simple calculation and objective presentation, and is especially suitable for basins whose sedimentation and tectonic movements are complicated.
Dutrieux, Loïc P.; Jakovac, Catarina C.; Latifah, Siti H.; Kooistra, Lammert
2016-05-01
We developed a method to reconstruct land use history from Landsat images time-series. The method uses a breakpoint detection framework derived from the econometrics field and applicable to time-series regression models. The Breaks For Additive Season and Trend (BFAST) framework is used for defining the time-series regression models which may contain trend and phenology, hence appropriately modelling vegetation intra and inter-annual dynamics. All available Landsat data are used for a selected study area, and the time-series are partitioned into segments delimited by breakpoints. Segments can be associated to land use regimes, while the breakpoints then correspond to shifts in land use regimes. In order to further characterize these shifts, we classified the unlabelled breakpoints returned by the algorithm into their corresponding processes. We used a Random Forest classifier, trained from a set of visually interpreted time-series profiles to infer the processes and assign labels to the breakpoints. The whole approach was applied to quantifying the number of cultivation cycles in a swidden agriculture system in Brazil (state of Amazonas). Number and frequency of cultivation cycles is of particular ecological relevance in these systems since they largely affect the capacity of the forest to regenerate after land abandonment. We applied the method to a Landsat time-series of Normalized Difference Moisture Index (NDMI) spanning the 1984-2015 period and derived from it the number of cultivation cycles during that period at the individual field scale level. Agricultural fields boundaries used to apply the method were derived using a multi-temporal segmentation approach. We validated the number of cultivation cycles predicted by the method against in-situ information collected from farmers interviews, resulting in a Normalized Residual Mean Squared Error (NRMSE) of 0.25. Overall the method performed well, producing maps with coherent spatial patterns. We identified
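The cycle-counting step above can be caricatured without the full BFAST machinery: if clearing events appear as abrupt drops in an NDMI-like series, counting drops beyond a threshold approximates the number of cultivation cycles. The threshold and the series below are illustrative assumptions, not the paper's breakpoint model:

```python
def count_cycles(ndmi, drop=0.2):
    """Crude stand-in for BFAST breakpoint detection: count abrupt
    decreases in an NDMI-like time series; each sharp drop is read as
    a clearing event starting one cultivation cycle (threshold assumed)."""
    return sum(1 for a, b in zip(ndmi, ndmi[1:]) if a - b > drop)

# Synthetic field trajectory: forest (high NDMI), two clearing events,
# each followed by gradual regrowth.
series = [0.8, 0.8, 0.3, 0.45, 0.6, 0.75, 0.8, 0.35, 0.5, 0.65]
print(count_cycles(series))  # two cultivation cycles detected
```

The real method instead fits piecewise regression models with trend and seasonal terms and classifies the resulting breakpoints with a Random Forest, which is far more robust to noise and phenology than a fixed drop threshold.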
Boelter, Fred W; Persky, Jacob D; Podraza, Daniel M; Bullock, William H
2016-02-01
Our reconstructed historical work scenarios incorporating a vintage 1950s locomotive can assist in better understanding the historical asbestos exposures associated with past maintenance and repairs and fill a literature data gap. Air sampling data collected during the exposure scenarios and analyzed by NIOSH 7400 (PCM) and 7402 (PCME) methodologies show personal breathing zone asbestiform fiber exposures were below the current OSHA eight-hour time-weighted average (TWA) permissible exposure limit (PEL) of 0.1 f/cc. The locomotive contained woven tape lagging that may have been chrysotile asbestos and was handled, removed, and reinstalled during repair and maintenance activities. We reconstructed historical work scenarios containing asbestos woven tape pipe lagging that have not been characterized in the published literature. The historical work scenarios were conducted by a retired railroad pipefitter with 37 years of experience working with materials and locomotives. PMID:26255644
Bal, Marie; Bal, Marie-Claude; Rendu, Christine; Ruas, Marie-Pierre; Campmajo, Pierre
2010-01-01
This article uses a method that combines pedoanthracological and pedo-archaeological approaches to terraces, complemented with archaeological pastoral data, in order to reconstruct the history of ancient agricultural terraces on a slope of the Enveitg Mountain in the French Pyrenees. Four excavations revealed two stages of terrace construction that have been linked with vegetation dynamics, which had been established by analyses of charcoal from the paleosols and soi...
Zhang, Hao; Han, Hao; Liang, Zhengrong; Hu, Yifan; Liu, Yan; Moore, William; Ma, Jianhua; Lu, Hongbing
2015-01-01
The Markov random field (MRF) model has been widely employed in edge-preserving regional noise smoothing penalty to reconstruct piece-wise smooth images in the presence of noise, such as in low-dose computed tomography (LdCT). While it preserves edge sharpness, its regional smoothing may sacrifice tissue image textures, which have been recognized as useful imaging biomarkers, and thus it may compromise clinical tasks such as differentiating malignant vs. benign lesions, e.g., lung nodules or colo...
Bessiere, Pierre; Ahuactzin, Juan Manuel; Mekhnacha, Kamel
2013-01-01
Probability as an Alternative to Boolean Logic. While logic is the mathematical foundation of rational reasoning and the fundamental principle of computing, it is restricted to problems where information is both complete and certain. However, many real-world problems, from financial investments to email filtering, are incomplete or uncertain in nature. Probability theory and Bayesian computing together provide an alternative framework to deal with incomplete and uncertain data. Decision-Making Tools and Methods for Incomplete and Uncertain Data. Emphasizing probability as an alternative to Boolean
O'Reilly, Joseph E; Puttick, Mark N; Parry, Luke; Tanner, Alastair R; Tarver, James E; Fleming, James; Pisani, Davide; Donoghue, Philip C J
2016-04-01
Different analytical methods can yield competing interpretations of evolutionary history and, currently, there is no definitive method for phylogenetic reconstruction using morphological data. Parsimony has been the primary method for analysing morphological data, but there has been a resurgence of interest in the likelihood-based Mk-model. Here, we test the performance of the Bayesian implementation of the Mk-model relative to both equal and implied-weight implementations of parsimony. Using simulated morphological data, we demonstrate that the Mk-model outperforms equal-weights parsimony in terms of topological accuracy, while implied-weights parsimony performs the most poorly. However, the Mk-model produces phylogenies that have less resolution than parsimony methods. This difference in the accuracy and precision of parsimony and Bayesian approaches to topology estimation needs to be considered when selecting a method for phylogeny reconstruction. PMID:27095266
Frederick, Katy; Schul, Johannes
2016-01-01
The katydid genus Neoconocephalus is characterized by high diversity of the acoustic communication system. Both male signals and female preferences have been thoroughly studied in the past. This study used Bayesian character state reconstruction to elucidate the evolutionary history of diverse call traits, based on an existing, well supported phylogenetic hypothesis. The most common male call pattern consisted of continuous calls comprising one fast pulse rate; this pattern is the likely ance...
Bayesian large-scale structure inference and cosmic web analysis
Leclercq, Florent
2015-01-01
Surveys of the cosmic large-scale structure carry opportunities for building and testing cosmological theories about the origin and evolution of the Universe. This endeavor requires appropriate data assimilation tools, for establishing the contact between survey catalogs and models of structure formation. In this thesis, we present an innovative statistical approach for the ab initio simultaneous analysis of the formation history and morphology of the cosmic web: the BORG algorithm infers the primordial density fluctuations and produces physical reconstructions of the dark matter distribution that underlies observed galaxies, by assimilating the survey data into a cosmological structure formation model. The method, based on Bayesian probability theory, provides accurate means of uncertainty quantification. We demonstrate the application of BORG to the Sloan Digital Sky Survey data and describe the primordial and late-time large-scale structure in the observed volume. We show how the approach has led to the fi...
Miraldo, Andreia; Hewitt, Godfrey M; Dear, Paul H; Paulo, Octavio S; Emerson, Brent C
2012-01-01
numts in L3 after secondary contact occurred prior to, or coincident with, the northward expansion of L3. This study shows that, in the context of phylogeographic analysis, numts can provide evidence for past demographic events and can be useful tools for the reconstruction of complex evolutionary...
Bayesian artificial intelligence
Korb, Kevin B
2010-01-01
Updated and expanded, Bayesian Artificial Intelligence, Second Edition provides a practical and accessible introduction to the main concepts, foundation, and applications of Bayesian networks. It focuses on both the causal discovery of networks and Bayesian inference procedures. Adopting a causal interpretation of Bayesian networks, the authors discuss the use of Bayesian networks for causal modeling. They also draw on their own applied research to illustrate various applications of the technology. New to the Second Edition: a new chapter on Bayesian network classifiers; a new section on object-oriented
Jensen, Finn Verner; Nielsen, Thomas Dyhre
2016-01-01
Mathematically, a Bayesian graphical model is a compact representation of the joint probability distribution for a set of variables. The most frequently used type of Bayesian graphical models are Bayesian networks. The structural part of a Bayesian graphical model is a graph consisting of nodes and ... largely due to the availability of efficient inference algorithms for answering probabilistic queries about the states of the variables in the network. Furthermore, to support the construction of Bayesian network models, learning algorithms are also available. We give an overview of the Bayesian network...
210Pb is widely used for dating recent sediments in the aquatic environment; however, our experiences working in shallow coastal environments on the Pacific coast of Mexico have demonstrated that the potential of 210Pb for reliable historical reconstructions might be limited by the low 210Pb atmospheric fallout, sediment mixing, abundance of coarse sediments and the lack of a 137Cs signal for 210Pb corroboration. This work discusses the difficulties in obtaining adequate sedimentary records for geochronological reconstruction in such active and complex settings, including examples of 210Pb geochronologies based on sediment profiles collected in two contrasting coastal areas (mudflats associated with the coastal lagoons of Sinaloa State and the continental shelf of the Gulf of Tehuantepec), in which geochemical data were used to support the temporal framework established and the changes in sediment supply recorded in the sediment cores, which were related to the development of land-based activities during the last century.
Nabholz, Benoit; Lartillot, Nicolas
2013-01-01
The nearly neutral theory, which proposes that most mutations are deleterious or close to neutral, predicts that the ratio of nonsynonymous over synonymous substitution rates (dN/dS), and potentially also the ratio of radical over conservative amino acid replacement rates (Kr/Kc), are negatively correlated with effective population size. Previous empirical tests, using life-history traits (LHT) such as body-size or generation-time as proxies for population size, have been consistent with these predictions. This suggests that large-scale phylogenetic reconstructions of dN/dS or Kr/Kc might reveal interesting macroevolutionary patterns in the variation in effective population size among lineages. In this work, we further develop an integrative probabilistic framework for phylogenetic covariance analysis introduced previously, so as to estimate the correlation patterns between dN/dS, Kr/Kc, and three LHT, in mitochondrial genomes of birds and mammals. Kr/Kc displays stronger and more stable correlations with LHT than does dN/dS, which we interpret as a greater robustness of Kr/Kc, compared with dN/dS, the latter being confounded by the high saturation of the synonymous substitution rate in mitochondrial genomes. The correlation of Kr/Kc with LHT was robust when controlling for the potentially confounding effects of nucleotide compositional variation between taxa. The positive correlation of the mitochondrial Kr/Kc with LHT is compatible with previous reports, and with a nearly neutral interpretation, although alternative explanations are also possible. The Kr/Kc model was finally used for reconstructing life-history evolution in birds and mammals. This analysis suggests a fairly large-bodied ancestor in both groups. In birds, life-history evolution seems to have occurred mainly through size reduction in Neoavian birds, whereas in placental mammals, body mass evolution shows disparate trends across subclades. Altogether, our work represents a further step toward a more
Gelman, Andrew; Stern, Hal S; Dunson, David B; Vehtari, Aki; Rubin, Donald B
2013-01-01
FUNDAMENTALS OF BAYESIAN INFERENCE: Probability and Inference; Single-Parameter Models; Introduction to Multiparameter Models; Asymptotics and Connections to Non-Bayesian Approaches; Hierarchical Models. FUNDAMENTALS OF BAYESIAN DATA ANALYSIS: Model Checking; Evaluating, Comparing, and Expanding Models; Modeling Accounting for Data Collection; Decision Analysis. ADVANCED COMPUTATION: Introduction to Bayesian Computation; Basics of Markov Chain Simulation; Computationally Efficient Markov Chain Simulation; Modal and Distributional Approximations. REGRESSION MODELS: Introduction to Regression Models; Hierarchical Linear
Yuan, Ying; MacKinnon, David P.
2009-01-01
This article proposes Bayesian analysis of mediation effects. Compared to conventional frequentist mediation analysis, the Bayesian approach has several advantages. First, it allows researchers to incorporate prior information into the mediation analysis, thus potentially improving the efficiency of estimates. Second, under the Bayesian mediation analysis, inference is straightforward and exact, which makes it appealing for studies with small samples. Third, the Bayesian approach is conceptua...
3D Surface Reconstruction and Automatic Camera Calibration
Jalobeanu, Andre
2004-01-01
This view-graph presentation illustrates a Bayesian approach to 3D surface reconstruction and camera calibration. Existing methods, surface analysis and modeling, preliminary surface reconstruction results, and potential applications are addressed.
Bayesian Games with Intentions
Bjorndahl, Adam; Halpern, Joseph Y.; Pass, Rafael
2016-01-01
We show that standard Bayesian games cannot represent the full spectrum of belief-dependent preferences. However, by introducing a fundamental distinction between intended and actual strategies, we remove this limitation. We define Bayesian games with intentions, generalizing both Bayesian games and psychological games, and prove that Nash equilibria in psychological games correspond to a special class of equilibria as defined in our setting.
Sichani, Mahdi Teimouri; Brincker, Rune
2008-01-01
Computing the displacements of a structure from its measured accelerations has been a major concern in some fields of engineering, such as earthquake engineering. In vibration engineering, too, displacements are occasionally preferred to acceleration histories, e.g. in the determination of forces applied on...
Radermacher, Pascal; Schöne, Bernd R.; Gischler, Eberhard; Oschmann, Wolfgang; Thébault, Julien; Fiebig, Jens
2010-05-01
The shell of the queen conch Strombus gigas provides a rapidly growing palaeoenvironmental proxy archive, allowing the detailed reconstruction of important life-history traits such as ontogeny, growth rate and growth seasonality. In this study, modern sclerochronological methods are used to cross-date the palaeotemperatures derived from the shell with local sea surface temperature (SST) records. The growth history of the shell suggests a bimodal seasonality in growth, with the growing season confined to the interval between April and November. In Glover's Reef, offshore Belize, the queen conch accreted shell carbonate at rates of up to 6 mm day-1 during the spring (April-June) and autumn (September-November). However, a reduced period of growth occurred during the mid-summer months (July-August). The shell growth patterns indicate a positive response to annual seasonality with regard to precipitation. It seems likely that when precipitation levels are high, food availability is increased as a result of nutrient input to the ecosystem corresponding with an increase in coastal runoff. Slow growth rates occur when precipitation, and as a consequence riverine runoff, is low. The SST, however, appears to influence growth only on a secondary level. Despite the bimodal growing season and the winter cessation in growth, the growth rates reconstructed here from two S. gigas shells are among the fastest yet reported for this species. The S. gigas specimens from Belize reached their final shell height (of 22.7 and 23.5 cm in distance between the apex and the siphonal notch) at the transition to adulthood in just 2 years. The extremely rapid growth observed in this species permits detailed, high-resolution reconstructions of life-history traits where sub-daily resolutions can be achieved with ease. The potential for future studies has yet to be further explored. Queen conch sclerochronology provides an opportunity to recover extremely high-resolution palaeotemperature
Botzem, S.; S Quack
2009-01-01
The development of the current International Accounting Standards Board (IASB) from the earlier International Accounting Standards Committee (IASC) provides insight into many issues of international financial reporting, among them the characteristics of international accounting standards themselves. This article reviews Camfferman and Zeff’s [Camfferman, K., & Zeff, S. A. (2007). Financial reporting and global capital markets. A history of the international accounting standards committee 1973...
Koblmüller, S; Robert K Wayne; Leonard, Jennifer A.
2012-01-01
Recurrent cycles of climatic change during the Quaternary period have dramatically affected the population genetic structure of many species. We reconstruct the recent demographic history of the coyote (Canis latrans) through the use of Bayesian techniques to examine the effects of Late Quaternary climatic perturbations on the genetic structure of a highly mobile generalist species. Our analysis reveals a lack of phylogeographic structure throughout the range but past population size changes ...
Roopnarine, P. D.; Anderson, L.; Roopnarine, D.; Gillikin, D. P.; Leal, J.
2012-12-01
The Earth's environments are changing more rapidly today than at almost any time in the Phanerozoic. These changes are driven by human activities, and include climate change, landscape alteration, fragmentation and destruction, environmental pollution, species overexploitation, and invasive species. The rapidity of the changes challenges our best efforts to document what is changing, how it has changed, and what has been lost. Central to these efforts, therefore, is the proper documentation, archiving and curation of past environments. Natural history and other research collections form the core of this documentation, and have proven vital to recent studies of environmental change. Those collections are, however, generally under-utilized and under-appreciated by the general research community. Also, their utility is hampered by insufficient availability of the data, and the very nature of what has been collected in the past. Past collections emphasized a typological approach, placing emphasis on individual specimens and diversity, whether geological or biological, while what is needed today is greater emphasis on archiving entire environments. The concept of shifting baselines establishes that even on historical time scales, the notion of what constitutes an unaltered environment is biased by a lack of documentation and understanding of environments in the recent past. Baselines are necessary, however, for the proper implementation of mitigating procedures, for environmental restoration or remediation, and for predicting the near-term future. Here we present results from a study of impacts of the Deepwater Horizon oil spill (DWH) on the American oyster Crassostrea virginica. Natural history collections of specimens from the Gulf and elsewhere have been crucial to this effort, and serve as an example of how important such collections are to current events. We are examining the effects of spill exposure on shell growth and tissue development, as well as the potential
The purpose of this study is to examine the sedimentation history of the central floodplain area of the Bengal Delta Plain in West Bengal, India. Sediments from two boreholes were analyzed regarding lithology, geochemistry and the stable isotopic composition of embedded organic matter. Different lithofacies were distinguished that reflect frequent changes in the prevailing sedimentary depositional environment of the study area. The lowest facies comprises poorly sorted fluvial sediments composed of fine gravel to clay pointing at high transport energy and intense relocation processes. This facies is considered to belong to an early Holocene lowstand systems tract that followed the last glacial maximum. Fine to medium sands above it mark a gradual change towards a transgressive systems tract. Upwards increasing proportions of silt and the stable isotopic composition of embedded organic matter both indicate a gradual change from fluvial channel infill sediments towards more estuarine and marine influenced deposits. Youngest sediments are composed of clayey and silty overbank deposits of the Hooghly River that have formed a vast low-relief delta-floodplain. Close to the surface, small concretions of secondary Mn-oxides and Fe-(oxyhydr)oxides occur and mark the fluctuation range of the unsaturated zone. These concretions are accompanied by relatively high contents of trace elements such as Zn, Ni, Cu, and As. To sum up, the outcomes of this study provide new insights into the complex sedimentation history of the barely investigated central floodplain area of West Bengal
Hepp, Johannes; Tuthorn, Mario; Zech, Roland; Mügler, Ines; Schlütz, Frank; Zech, Wolfgang; Zech, Michael
2015-10-01
Over the past decades, δ18O and δ2H analyses of lacustrine sediments became an invaluable tool in paleohydrology and paleolimnology for reconstructing the isotopic composition of past lake water and precipitation. However, based on δ18O or δ2H records alone, it can be challenging to distinguish between changes of the precipitation signal and changes caused by evaporation. Here we propose a coupled δ18O-δ2H biomarker approach that provides the possibility to disentangle these two factors. The isotopic composition of long-chain n-alkanes (n-C25, n-C27, n-C29, n-C31) was analyzed in order to establish a 16 ka Late Glacial and Holocene δ2H record for the sediment archive of Lake Panch Pokhari in the High Himalaya, Nepal. The δ2Hn-alkane record generally corroborates a previously established δ18Osugar record, reporting high values characterizing the deglaciation and the Older and the Younger Dryas, and low values characterizing the Bølling and the Allerød periods. Since the investigated n-alkane and sugar biomarkers are considered to be primarily of aquatic origin, they were used to reconstruct the isotopic composition of lake water. The reconstructed deuterium excess of lake water ranges from +57‰ to -85‰ and is shown to serve as a proxy for the evaporation history of Lake Panch Pokhari. Lake desiccation during the deglaciation, the Older Dryas and the Younger Dryas is affirmed by a multi-proxy approach using the Hydrogen Index (HI) and the carbon to nitrogen ratio (C/N) as additional proxies for lake sediment organic matter mineralization. Furthermore, the coupled δ18O and δ2H approach allows disentangling the lake water isotopic enrichment from variations of the isotopic composition of precipitation. The reconstructed 16 ka δ18Oprecipitation record of Lake Panch Pokhari is well in agreement with the δ18O records of Chinese speleothems and presumably reflects the Indian Summer Monsoon variability.
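The deuterium excess used in the abstract above as an evaporation proxy is the standard quantity d = δ2H − 8·δ18O, the departure from the global meteoric water line. A minimal sketch, with purely illustrative (invented) isotope values, not data from the study:

```python
def deuterium_excess(delta_2h, delta_18o):
    """d-excess in per mil: departure of a water sample from the
    global meteoric water line (d = delta2H - 8 * delta18O)."""
    return delta_2h - 8.0 * delta_18o

# Invented lake-water values in per mil; strongly negative d-excess
# would point to evaporative enrichment of the lake water.
d = deuterium_excess(-60.0, -8.0)
print(d)  # 4.0
```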
Rubin, Donald B.
1981-01-01
The Bayesian bootstrap is the Bayesian analogue of the bootstrap. Instead of simulating the sampling distribution of a statistic estimating a parameter, the Bayesian bootstrap simulates the posterior distribution of the parameter; operationally and inferentially the methods are quite similar. Because both methods of drawing inferences are based on somewhat peculiar model assumptions and the resulting inferences are generally sensitive to these assumptions, neither method should be applied wit...
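The operational similarity described in the abstract above can be made concrete: where the classical bootstrap resamples observations with replacement, the Bayesian bootstrap draws flat Dirichlet weights over the observations. A minimal sketch with an invented sample:

```python
import numpy as np

rng = np.random.default_rng(0)
data = np.array([2.1, 3.4, 1.9, 4.2, 3.3, 2.8])  # invented sample

# Each posterior draw re-weights the observations with
# Dirichlet(1, ..., 1) weights instead of resampling them.
weights = rng.dirichlet(np.ones(len(data)), size=10_000)
posterior_means = weights @ data  # posterior draws of the mean

print(posterior_means.mean(), posterior_means.std())
```

The spread of `posterior_means` plays the role of the bootstrap sampling distribution, but is interpreted as a posterior over the mean.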
Chang, Jinfeng; Ciais, Philippe; Herrero, Mario; Havlik, Petr; Campioli, Matteo; Zhang, Xianzhou; Bai, Yongfei; Viovy, Nicolas; Joiner, Joanna; Wang, Xuhui; Peng, Shushi; Yue, Chao; Piao, Shilong; Wang, Tao; Hauglustaine, Didier A.; Soussana, Jean-Francois; Peregon, Anna; Kosykh, Natalya; Mironycheva-Tokareva, Nina
2016-06-01
Grassland management type (grazed or mown) and intensity (intensive or extensive) play a crucial role in the greenhouse gas balance and surface energy budget of this biome, both at field scale and at large spatial scale. However, global gridded historical information on grassland management intensity is not available. Combining modelled grass-biomass productivity with statistics of the grass-biomass demand by livestock, we reconstruct gridded maps of grassland management intensity from 1901 to 2012. These maps include the minimum area of managed vs. maximum area of unmanaged grasslands and the fraction of mown vs. grazed area at a resolution of 0.5° by 0.5°. The grass-biomass demand is derived from a livestock dataset for 2000, extended to cover the period 1901-2012. The grass-biomass supply (i.e. forage grass from mown grassland and biomass grazed) is simulated by the process-based model ORCHIDEE-GM driven by historical climate change, rising CO2 concentration, and changes in nitrogen fertilization. The global area of managed grassland obtained in this study increases from 6.1 × 106 km2 in 1901 to 12.3 × 106 km2 in 2000, although the expansion pathway varies between different regions. ORCHIDEE-GM also simulated an increase in global mean productivity and herbage-use efficiency over managed grassland during the 20th century, indicating a general intensification of grassland management at the global scale, but with regional differences. The gridded grassland management intensity maps are model dependent because they depend on modelled productivity. Thus, specific attention was given to the evaluation of modelled productivity against a series of observations, from site-level net primary productivity (NPP) measurements to two global satellite products of gross primary productivity (GPP) (MODIS GPP and SIF data). Generally, ORCHIDEE-GM captures the spatial pattern, seasonal cycle, and interannual variability of grassland productivity at the global scale well and thus is
Faria, Gleikam Lopes de Oliveira; Menezes, Maria Angela de B.C.; Silva, Maria Aparecida, E-mail: menezes@cdtn.br, E-mail: cida@cdtn.br [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil). Servico de Reator e Tecnicas Analiticas. Laboratorio de Ativacao Neutronica; Sabino, Claudia de V.S. [PUC-Minas, Belo Horizonte, MG (Brazil)
2011-07-01
Due to the high importance of material vestiges for the culture of a nation, the Brazilian Council for the Environment determined that the license to establish new enterprises is subject to a technical report concerning environmental impact, including archaeological sites affected by the enterprise. Therefore, answering the report related to the Program for Prospection and Rescue of the Archaeological Patrimony of the Areas Impacted by the Installation of the Second Line of the Samarco Mining Pipeline, archaeological interventions were carried out along the coast of Espirito Santo. Tupi-Guarani Tradition vestiges were found there, the main evidence being interesting ceramics. Archaeology can fill the gap between ancient populations and modern society by elucidating the evidence found in archaeological sites. In this context, several ceramic fragments found in the archaeological sites - Hiuton and Bota-Fora - were analyzed by the neutron activation technique, k0-standardization method, at CDTN using the TRIGA MARK I IPR-R1 reactor, in order to characterize their elemental composition. The elements As, Ba, Br, Ce, Co, Cr, Cs, Eu, Fe, Ga, Hf, K, La, Na, Nd, Rb, Sb, Sc, Sm, Ta, Tb, Th, U, Yb, Zn and Zr were determined. Applying robust multivariate statistical analysis in the R software, the results pointed out that the pottery from the sites was made with clay from different sources. X-ray powder diffraction analyses were carried out to determine the mineral composition, and Moessbauer spectroscopy was applied to provide information on both the degree of burning and the atmosphere, in order to reconstruct the firing temperatures and strategies the indigenous potters used in pottery production. (author)
Bayesian statistics an introduction
Lee, Peter M
2012-01-01
Bayesian Statistics is the school of thought that combines prior beliefs with the likelihood of a hypothesis to arrive at posterior beliefs. The first edition of Peter Lee’s book appeared in 1989, but the subject has moved ever onwards, with increasing emphasis on Monte Carlo based techniques. This new fourth edition looks at recent techniques such as variational methods, Bayesian importance sampling, approximate Bayesian computation and Reversible Jump Markov Chain Monte Carlo (RJMCMC), providing a concise account of the way in which the Bayesian approach to statistics develops as well
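The prior-to-posterior mechanics that this school of thought rests on can be shown in a few lines with a conjugate Beta-Binomial update (the prior and the data below are invented for illustration):

```python
def beta_binomial_update(a, b, successes, failures):
    """Posterior Beta(a', b') after observing binomial data
    under a Beta(a, b) prior (conjugate update)."""
    return a + successes, b + failures

# Weak prior belief Beta(2, 2) about a success rate, then 7/10 successes:
a_post, b_post = beta_binomial_update(2, 2, successes=7, failures=3)
posterior_mean = a_post / (a_post + b_post)
print(a_post, b_post, round(posterior_mean, 3))  # 9 5 0.643
```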
Understanding Computational Bayesian Statistics
Bolstad, William M
2011-01-01
A hands-on introduction to computational statistics from a Bayesian point of view Providing a solid grounding in statistics while uniquely covering the topics from a Bayesian perspective, Understanding Computational Bayesian Statistics successfully guides readers through this new, cutting-edge approach. With its hands-on treatment of the topic, the book shows how samples can be drawn from the posterior distribution when the formula giving its shape is all that is known, and how Bayesian inferences can be based on these samples from the posterior. These ideas are illustrated on common statistic
An introduction to Gaussian Bayesian networks.
Grzegorczyk, Marco
2010-01-01
The extraction of regulatory networks and pathways from postgenomic data is important for drug discovery and development, as the extracted pathways reveal how genes or proteins regulate each other. Following up on the seminal paper of Friedman et al. (J Comput Biol 7:601-620, 2000), Bayesian networks have been widely applied as a popular tool to this end in systems biology research. Their popularity stems from the tractability of the marginal likelihood of the network structure, which is a consistent scoring scheme in the Bayesian context. This score is based on an integration over the entire parameter space, for which highly expensive computational procedures have to be applied when using more complex models based on differential equations; for example, see (Bioinformatics 24:833-839, 2008). This chapter gives an introduction to reverse engineering regulatory networks and pathways with Gaussian Bayesian networks, that is, Bayesian networks with the probabilistic BGe scoring metric [see Geiger and Heckerman (1995)]. In the BGe model, the data are assumed to stem from a Gaussian distribution and a normal-Wishart prior is assigned to the unknown parameters. Gaussian Bayesian network methodology for analysing static observational, static interventional, as well as dynamic (observational) time series data will be described in detail in this chapter. Finally, we apply these Bayesian network inference methods (1) to observational and interventional flow cytometry (protein) data from the well-known RAF pathway, to evaluate the global network reconstruction accuracy of Bayesian network inference, and (2) to dynamic gene expression time series data of nine circadian genes in Arabidopsis thaliana, to reverse engineer the unknown regulatory network topology for this domain. PMID:20824469
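To make the Gaussian assumption in the abstract above concrete, here is a sketch of forward sampling from a toy linear-Gaussian chain X → Y → Z (an invented structure and coefficients, not the RAF pathway or the BGe score itself). In such a network, conditional independence shows up as a vanishing partial correlation of X and Z given Y:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5000

# Forward-sample the toy linear-Gaussian network X -> Y -> Z.
x = rng.normal(0.0, 1.0, n)
y = 0.8 * x + rng.normal(0.0, 0.5, n)
z = -0.6 * y + rng.normal(0.0, 0.5, n)

def residual(a, c):
    """Residual of a after linear regression on c."""
    beta = np.cov(a, c)[0, 1] / np.var(c, ddof=1)
    return a - beta * c

marginal = np.corrcoef(x, z)[0, 1]  # strongly negative, mediated by Y
partial = np.corrcoef(residual(x, y), residual(z, y))[0, 1]  # near zero
print(round(marginal, 2), round(partial, 2))
```

Structure-learning methods such as the BGe score exploit exactly these (partial-)correlation patterns, but score whole graphs via the marginal likelihood rather than testing edges one at a time.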
Blake-Mizen, Keziah; Bailey, Ian; Carlson, Anders; Stoner, Joe; Hatfield, Rob; Xuan, Chuang; Lawrence, Kira
2016-04-01
Should it ever melt entirely, the Greenland Ice Sheet (GIS) would contribute ~7 metres of global sea-level rise. Understanding how the GIS might respond to anthropogenically induced global warming over the coming century is therefore important. Central to this goal is constraining how this ice sheet has responded to radiative forcing during both warmer- and colder-than-present climate states in the geological past. Little is known in detail, however, about the GIS prior to the Late Pleistocene, and large uncertainty exists in our understanding of its history across the last great climate transition of the Cenozoic, the intensification of Northern Hemisphere glaciation (iNHG; ~3.6-2.4 Ma). This time encompasses two intervals of interest: (1) the mid-Piacenzian warm period (mPWP, ~3.3-3 Ma), widely considered an analogue for a future equilibrium climate state, when atmospheric CO2 levels were comparable to modern (~400 ppmv) and sea-level and global temperatures were elevated relative to today (by ~25 metres and ~2-3°C), and (2) a subsequent gradual deterioration in global climate and decline in atmospheric CO2 that led to the development of Quaternary-magnitude glaciations from ~2.5 Ma. Important unresolved questions include: to what extent did the southern GIS retreat during the mPWP, and when did a modern-day sized GIS first develop during iNHG? To tackle these issues our project focuses on the southern GIS history that can be extracted from Eirik Drift IODP Site U1307 between ~3.3 and 2.2 Ma. To achieve this we are developing an independent orbital-resolution age model, one of the first for high-latitude marine sediments deposited during iNHG, by producing a relative paleointensity (RPI) record for Site U1307; and generating multi-proxy geochemical and sedimentological datasets that track the provenance of the sand and bulk terrigenous sediment fraction glacially eroded by the southern GIS and delivered to the study site by both ice-rafting and the Western
Space Shuttle RTOS Bayesian Network
Morris, A. Terry; Beling, Peter A.
2001-01-01
With shrinking budgets and the requirements to increase reliability and operational life of the existing orbiter fleet, NASA has proposed various upgrades for the Space Shuttle that are consistent with national space policy. The cockpit avionics upgrade (CAU), a high priority item, has been selected as the next major upgrade. The primary functions of cockpit avionics include flight control, guidance and navigation, communication, and orbiter landing support. Secondary functions include the provision of operational services for non-avionics systems, such as data handling for the payloads and caution and warning alerts to the crew. Recently, a process to select the optimal commercial-off-the-shelf (COTS) real-time operating system (RTOS) for the CAU was conducted by United Space Alliance (USA) Corporation, a joint venture between Boeing and Lockheed Martin, the prime contractor for space shuttle operations. In order to independently assess the RTOS selection, NASA has used the Bayesian network-based scoring methodology described in this paper. Our two-stage methodology addresses the issue of RTOS acceptability by incorporating functional, performance and non-functional software measures related to reliability, interoperability, certifiability, efficiency, correctness, business, legal, product history, cost and life cycle. The first stage of the methodology involves obtaining scores for the various measures using a Bayesian network. The Bayesian network incorporates the causal relationships between the various and often competing measures of interest, while also assisting the inherently complex decision analysis process with its ability to reason under uncertainty. The structure and selection of prior probabilities for the network is extracted from experts in the field of real-time operating systems. Scores for the various measures are computed using Bayesian probability. In the second stage, multi-criteria trade-off analyses are performed between the scores
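As a toy illustration of the reasoning under uncertainty such a network supports (the nodes and probabilities below are invented, not the actual CAU scoring model):

```python
# Toy two-node network: Reliability -> TestResult.
p_reliable = 0.7                 # invented prior belief in RTOS reliability
p_pass_given_reliable = 0.95     # invented conditional probabilities
p_pass_given_unreliable = 0.40

# Evidence: the test passed. Posterior by direct enumeration (Bayes' rule).
p_pass = (p_reliable * p_pass_given_reliable
          + (1 - p_reliable) * p_pass_given_unreliable)
posterior = p_reliable * p_pass_given_reliable / p_pass
print(round(posterior, 3))  # 0.847
```

A real scoring network chains many such conditional tables over competing measures, but each belief update follows this same enumeration.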
Shamshiri, Sorour; Henriques, Bruno M; Tojeiro, Rita; Lemson, Gerard; Oliver, Seb J; Wilkins, Stephen
2015-01-01
We adapt the L-Galaxies semi-analytic model to follow the star-formation histories (SFH) of galaxies -- by which we mean a record of the formation time and metallicities of the stars that are present in each galaxy at a given time. We use these to construct stellar spectra in post-processing, which offers large efficiency savings and allows user-defined spectral bands and dust models to be applied to data stored in the Millennium data repository. We contrast model SFHs from the Millennium Simulation with observed ones from the VESPA algorithm as applied to the SDSS-7 catalogue. The overall agreement is good, with both simulated and SDSS galaxies showing a steeper SFH with increased stellar mass. The SFHs of blue and red galaxies, however, show poor agreement between data and simulations, which may indicate that the termination of star formation is too abrupt in the models. The mean star-formation rate (SFR) of model galaxies is well-defined and is accurately modelled by a double power law at all redshifts: SF...
Williams, Rebecca M. E.; Weitz, Catherine M.
2014-11-01
New details of the aqueous history in the southwestern Melas Chasma elevated basin have been revealed from analysis of high-resolution image, topographic and spectral datasets. We have identified eleven fan-shaped landforms that reflect various depositional environments. A distinctive marker bed with inferred indurated aeolian bedforms is within the stratigraphic record of presumed lacustrine deposits. This observation, taken together with the stratigraphic succession of fan-shaped deposits indicates fluctuating lake levels with, at a minimum, early and late-stage lake highstands. Tributary drainage pattern in the western valley network changed from a dendritic to a meandering system, recording a shift in fluvial activity that is consistent with fluctuating lake levels. Only a few hydrated minerals have been detected in the study region, the most common being opal which appears to represent younger alteration and deposition within the basin. Landform scale was used to estimate average discharge (∼30 m3/s), formative discharge (200-300 m3/s), and fan formation timescale, which further inform the duration of lacustrine activity within the basin. Warm surface conditions and precipitation recharge of source basins is required to generate and sustain this long-lived lake over periods of at least centuries to millennia during the Late Hesperian to Early Amazonian.
Payen, Thibaut; Murat, Claude; Martin, Francis
2016-08-01
Truffles are ascomycete fungi belonging to genus Tuber, and they form ectomycorrhizal associations with trees and shrubs. Transposable elements, mainly class 1 gypsy retrotransposons, constitute more than 50% of the black Périgord truffle (Tuber melanosporum) genome, but their impact on the genome is unknown. The aims of this study are to investigate the diversity of gypsy retrotransposons in this species and their evolutionary history by analysing the reference genome and six resequenced genomes of different geographic accessions. Using the reverse transcriptase sequences, six different gypsy retrotransposon clades were identified. Tmt1 and Tmt6 are the most abundant transposable elements, representing 14% and 13% of the T. melanosporum genome, respectively. Tmt6 showed a major burst of proliferation between 1 and 4 million years ago, but evidence of more recent transposition was observed. Except for Tmt2, the other clades tend to aggregate, and their mode of transposition excluded the master copy model. This suggests that each new copy has the same probability of transposing as other copies. This study provides a better view of the diversity and dynamic nature of gypsy retrotransposons in T. melanosporum. Even if the major gypsy retrotransposon bursts are old, some elements seem to have transposed recently, suggesting that they may continue to model the truffle genomes. PMID:27025914
Williamson, Fiona; Allan, Rob; Switzer, Adam D.; Chan, Johnny C. L.; Wasson, Robert James; D'Arrigo, Rosanne; Gartner, Richard
2015-12-01
The value of historic observational weather data for reconstructing long-term climate patterns and the detailed analysis of extreme weather events has long been recognized (Le Roy Ladurie, 1972; Lamb, 1977). In some regions, however, observational data have not been kept regularly over time, or their preservation and archiving have not been considered a priority by governmental agencies. This has been a particular problem in Southeast Asia, where there has been no systematic country-by-country method of keeping or preserving such data, where the keeping of data only reaches back a few decades, or where instability has threatened the survival of historic records. As a result, past observational data are fragmentary, scattered, or even absent altogether. The further we go back in time, the more obvious the gaps. Observational data can be complemented, however, by historical documentary or proxy records of extreme events such as floods, droughts and other climatic anomalies. This review article highlights recent initiatives in sourcing, recovering, and preserving historical weather data and the potential for integrating the same with proxy (and other) records. In so doing, it focuses on regional initiatives for data research and recovery - particularly the work of the international Atmospheric Circulation Reconstructions over the Earth's (ACRE) Southeast Asian regional arm (ACRE SEA) - and the latter's role in bringing together disparate, but interrelated, projects working within this region. The overarching goal of the ACRE SEA initiative is to connect regional efforts and to build capacity within Southeast Asian institutions, agencies and National Meteorological and Hydrological Services (NMHS) to improve and extend historical instrumental, documentary and proxy databases of Southeast Asian hydroclimate, in order to contribute to the generation of high-quality, high-resolution historical hydroclimatic reconstructions (reanalyses) and to build linkages with humanities researchers.
Frühwirth-Schnatter, Sylvia
1990-01-01
In the paper at hand we apply fuzzy set theory to Bayesian statistics to obtain "Fuzzy Bayesian Inference". In the subsequent sections we will discuss a fuzzy valued likelihood function, Bayes' theorem for both fuzzy data and fuzzy priors, a fuzzy Bayes' estimator, fuzzy predictive densities and distributions, and fuzzy H.P.D. regions. (author's abstract)
Yuan, Ying; MacKinnon, David P.
2009-01-01
In this article, we propose Bayesian analysis of mediation effects. Compared with conventional frequentist mediation analysis, the Bayesian approach has several advantages. First, it allows researchers to incorporate prior information into the mediation analysis, thus potentially improving the efficiency of estimates. Second, under the Bayesian…
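The abstract above notes that the posterior of a mediated effect is easy to summarize once draws are available. A minimal Monte Carlo sketch of the indirect effect a·b, assuming (purely for illustration) normal posteriors for the two path coefficients with hypothetical means and standard deviations:

```python
import random

random.seed(1)

# Hypothetical posterior summaries for the two mediation paths
# (a: X -> M, b: M -> Y), assumed normal only for illustration.
a_mean, a_sd = 0.40, 0.10
b_mean, b_sd = 0.30, 0.12

# Monte Carlo draws of the indirect effect a*b; its posterior is
# generally non-normal, which the draws capture automatically.
draws = [random.gauss(a_mean, a_sd) * random.gauss(b_mean, b_sd)
         for _ in range(20000)]
draws.sort()
lo = draws[int(0.025 * len(draws))]
hi = draws[int(0.975 * len(draws))]
print(round(lo, 3), round(hi, 3))  # a 95% credible interval for a*b
```

The interval is read directly from the sorted draws, so no normality assumption on the product is needed.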
Inverse problems in the Bayesian framework
The history of Bayesian methods dates back to the original works of Reverend Thomas Bayes and Pierre-Simon Laplace: the former laid down some of the basic principles on inverse probability in his classic article ‘An essay towards solving a problem in the doctrine of chances’ that was read posthumously in the Royal Society in 1763. Laplace, on the other hand, in his ‘Memoirs on inverse probability’ of 1774 developed the idea of updating beliefs and wrote down the celebrated Bayes’ formula in the form we know today. Although not identified yet as a framework for investigating inverse problems, Laplace used the formalism very much in the spirit it is used today in the context of inverse problems, e.g., in his study of the distribution of comets. With the evolution of computational tools, Bayesian methods have become increasingly popular in all fields of human knowledge in which conclusions need to be drawn based on incomplete and noisy data. Needless to say, inverse problems, almost by definition, fall into this category. Systematic work for developing a Bayesian inverse problem framework can arguably be traced back to the 1980s, (the original first edition being published by Elsevier in 1987), although articles on Bayesian methodology applied to inverse problems, in particular in geophysics, had appeared much earlier. Today, as testified by the articles in this special issue, the Bayesian methodology as a framework for considering inverse problems has gained a lot of popularity, and it has integrated very successfully with many traditional inverse problems ideas and techniques, providing novel ways to interpret and implement traditional procedures in numerical analysis, computational statistics, signal analysis and data assimilation. The range of applications where the Bayesian framework has been fundamental goes from geophysics, engineering and imaging to astronomy, life sciences and economy, and continues to grow. There is no question that Bayesian
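Laplace's celebrated formula, posterior ∝ likelihood × prior, is the core of the framework described above. A toy one-parameter inverse problem evaluated on a grid (forward map, noise level, and prior are all illustrative choices, not from any particular paper):

```python
import math

# Toy inverse problem: infer x from a noisy indirect observation
# y = g(x) + noise, with forward map g(x) = x**2 (illustrative).
def g(x):
    return x * x

y_obs, sigma = 4.2, 0.5
grid = [i / 100.0 for i in range(0, 401)]      # candidate x in [0, 4]

# Prior: broad Gaussian centred at 1.5, acting as a regularizer.
prior = [math.exp(-0.5 * ((x - 1.5) / 1.0) ** 2) for x in grid]
like  = [math.exp(-0.5 * ((y_obs - g(x)) / sigma) ** 2) for x in grid]

post = [p * l for p, l in zip(prior, like)]
z = sum(post)
post = [p / z for p in post]                   # normalize: Bayes' formula

x_map = grid[post.index(max(post))]
print(x_map)  # near sqrt(4.2) ~ 2.05, pulled slightly toward the prior
```

The ill-posedness mentioned in the abstract shows up here as a flat likelihood away from the data; the prior is what makes the posterior well behaved.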
Li, Lianlin
2009-01-01
We present a Bayesian framework for reconstruction of subsurface hydraulic properties from nonlinear dynamic flow data by imposing sparsity on the distribution of the solution coefficients in a compression transform domain.
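In MAP form, a sparsity-promoting (Laplace) prior on transform-domain coefficients reduces to soft-thresholding. A generic sketch of that mechanism (identity basis and all sizes illustrative; the paper itself uses a compression transform and a nonlinear flow operator):

```python
import numpy as np

rng = np.random.default_rng(0)

# Sparse ground truth in a transform domain (identity basis here,
# purely for illustration).
n = 64
coef_true = np.zeros(n)
coef_true[[3, 17, 40]] = [2.0, -1.5, 1.0]
y = coef_true + 0.1 * rng.standard_normal(n)   # noisy observation

# MAP estimate under a Laplace (sparsity-promoting) prior:
# argmin 0.5*||y - c||^2 + lam*||c||_1, i.e. soft thresholding.
lam = 0.3
coef_map = np.sign(y) * np.maximum(np.abs(y) - lam, 0.0)

print(int(np.count_nonzero(coef_map)))  # only a few coefficients survive
```

The threshold `lam` plays the role of the prior's sparsity strength: larger values suppress more coefficients.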
SOFOMORE: Combined EEG source and forward model reconstruction
Stahlhut, Carsten; Mørup, Morten; Winther, Ole;
2009-01-01
We propose a new EEG source localization method that simultaneously performs source and forward model reconstruction (SOFOMORE) in a hierarchical Bayesian framework. Reconstruction of the forward model is motivated by the many uncertainties involved in the forward model, including the representation of the cortical surface, conductivity distribution, and electrode positions. We demonstrate in both simulated and real EEG data that reconstruction of the forward model improves localization of the underlying sources.
Bayesian molecular phylogenetics: estimation of divergence dates and hypothesis testing
Aris-Brosou, S.
2002-01-01
With the advent of automated sequencing, sequence data are now available to help us understand the functioning of our genome, as well as its history. To date, powerful methods such as maximum likelihood have been used to estimate its mode and tempo of evolution and its branching pattern. However, these methods appear to have some limitations. The purpose of this thesis is to examine these issues in light of Bayesian modelling, taking advantage of some recent advances in Bayesian compu...
Granade, Christopher; Cory, D G
2015-01-01
In recent years, Bayesian methods have been proposed as a solution to a wide range of issues in quantum state and process tomography. State-of-the-art Bayesian tomography solutions suffer from three problems: numerical intractability, a lack of informative prior distributions, and an inability to track time-dependent processes. Here, we solve all three problems. First, we use modern statistical methods, as pioneered by Huszár and Houlsby and by Ferrie, to make Bayesian tomography numerically tractable. Our approach allows for practical computation of Bayesian point and region estimators for quantum states and channels. Second, we propose the first informative priors on quantum states and channels. Finally, we develop a method that allows online tracking of time-dependent states and estimates the drift and diffusion processes affecting a state. We provide source code and animated visual examples for our methods.
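The online tracking described above rests on sequential Bayesian updating. A grid-based toy stand-in (not the authors' sequential Monte Carlo implementation): estimate the outcome probability p of a two-outcome measurement from a stream of shots, renormalizing after each update.

```python
import random

random.seed(2)

# Online Bayesian update on a grid: estimate the probability p of a
# two-outcome measurement (all values illustrative).
p_true = 0.7
grid = [i / 200.0 for i in range(201)]
weights = [1.0] * len(grid)                 # flat prior

for _ in range(500):                        # stream of measurement shots
    outcome = 1 if random.random() < p_true else 0
    for i, p in enumerate(grid):
        weights[i] *= p if outcome else (1.0 - p)
    z = sum(weights)                        # renormalize at every step
    weights = [w / z for w in weights]

p_mean = sum(p * w for p, w in zip(grid, weights))
print(round(p_mean, 2))  # posterior mean concentrates near p_true
```

Renormalizing at each step keeps the recursion numerically stable, which is the same practical concern that motivates resampling in particle-based implementations.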
Bayesian exploratory factor analysis
Gabriella Conti; Sylvia Frühwirth-Schnatter; James Heckman; Rémi Piatek
2014-01-01
This paper develops and applies a Bayesian approach to Exploratory Factor Analysis that improves on ad hoc classical approaches. Our framework relies on dedicated factor models and simultaneously determines the number of factors, the allocation of each measurement to a unique factor, and the corresponding factor loadings. Classical identifi cation criteria are applied and integrated into our Bayesian procedure to generate models that are stable and clearly interpretable. A Monte Carlo study c...
Bayesian Exploratory Factor Analysis
Conti, Gabriella; Frühwirth-Schnatter, Sylvia; Heckman, James J.; Piatek, Rémi
2014-01-01
This paper develops and applies a Bayesian approach to Exploratory Factor Analysis that improves on ad hoc classical approaches. Our framework relies on dedicated factor models and simultaneously determines the number of factors, the allocation of each measurement to a unique factor, and the corresponding factor loadings. Classical identification criteria are applied and integrated into our Bayesian procedure to generate models that are stable and clearly interpretable. A Monte Carlo study co...
Bayesian Exploratory Factor Analysis
Gabriella Conti; Sylvia Fruehwirth-Schnatter; Heckman, James J.; Remi Piatek
2014-01-01
This paper develops and applies a Bayesian approach to Exploratory Factor Analysis that improves on ad hoc classical approaches. Our framework relies on dedicated factor models and simultaneously determines the number of factors, the allocation of each measurement to a unique factor, and the corresponding factor loadings. Classical identification criteria are applied and integrated into our Bayesian procedure to generate models that are stable and clearly interpretable. A Monte Carlo s...
Bayesian exploratory factor analysis
Conti, Gabriella; Frühwirth-Schnatter, Sylvia; Heckman, James J.; Piatek, Rémi
2014-01-01
This paper develops and applies a Bayesian approach to Exploratory Factor Analysis that improves on ad hoc classical approaches. Our framework relies on dedicated factor models and simultaneously determines the number of factors, the allocation of each measurement to a unique factor, and the corresponding factor loadings. Classical identification criteria are applied and integrated into our Bayesian procedure to generate models that are stable and clearly interpretable. A Monte Carlo st...
Bayesian exploratory factor analysis
Conti, Gabriella; Frühwirth-Schnatter, Sylvia; Heckman, James; Piatek, Rémi
2014-01-01
This paper develops and applies a Bayesian approach to Exploratory Factor Analysis that improves on ad hoc classical approaches. Our framework relies on dedicated factor models and simultaneously determines the number of factors, the allocation of each measurement to a unique factor, and the corresponding factor loadings. Classical identification criteria are applied and integrated into our Bayesian procedure to generate models that are stable and clearly interpretable. A Monte Carlo study co...
Carbonetto, Peter; Kisynski, Jacek; De Freitas, Nando; Poole, David L
2012-01-01
The Bayesian Logic (BLOG) language was recently developed for defining first-order probability models over worlds with unknown numbers of objects. It handles important problems in AI, including data association and population estimation. This paper extends BLOG by adopting generative processes over function spaces - known as nonparametrics in the Bayesian literature. We introduce syntax for reasoning about arbitrary collections of objects, and their properties, in an intuitive manner. By expl...
Bayesian default probability models
Andrlíková, Petra
2014-01-01
This paper proposes a methodology for default probability estimation for low default portfolios, where the statistical inference may become troublesome. The author suggests using logistic regression models with the Bayesian estimation of parameters. The piecewise logistic regression model and Box-Cox transformation of credit risk score is used to derive the estimates of probability of default, which extends the work by Neagu et al. (2009). The paper shows that the Bayesian models are more acc...
Leonid Hanin
2011-09-01
This article brings mathematical modeling to bear on the reconstruction of the natural history of prostate cancer and assessment of the effects of treatment on metastatic progression. We present a comprehensive, entirely mechanistic mathematical model of cancer progression accounting for primary tumor latency, shedding of metastases, their dormancy and growth at secondary sites. Parameters of the model were estimated from the following data collected from 12 prostate cancer patients: (1) age and volume of the primary tumor at presentation; and (2) volumes of detectable bone metastases surveyed at a later time. This allowed us to estimate, for each patient, the age at cancer onset and inception of the first metastasis, the expected metastasis latency time and the rates of growth of the primary tumor and metastases before and after the start of treatment. We found that for all patients: (1) inception of the first metastasis occurred when the primary tumor was undetectable; (2) inception of all or most of the surveyed metastases occurred before the start of treatment; (3) the rate of metastasis shedding is essentially constant in time regardless of the size of the primary tumor and so it is only marginally affected by treatment; and most importantly, (4) surgery, chemotherapy and possibly radiation bring about a dramatic increase (by dozens or hundreds of times for most patients) in the average rate of growth of metastases. Our analysis supports the notion of metastasis dormancy and the existence of prostate cancer stem cells. The model is applicable to all metastatic solid cancers, and our conclusions agree well with the results of a similar analysis based on a simpler model applied to a case of metastatic breast cancer.
Boonyatumanond, Ruchaya [Environmental Research and Training Center, Pathumthani 12120 (Thailand); Wattayakorn, Gullaya [Department of Marine Science, Faculty of Science, Chulalongkorn University, Bangkok 10330 (Thailand); Amano, Atsuko [Graduate School of Science and Engineering, Ehime University, 2-5 Bunkyou-cho, Matsuyama (Japan); Inouchi, Yoshio [Center for Marine Environmental Studies, Ehime University, 2-5 Bunkyou-cho, Matsuyama (Japan); Takada, Hideshige [Laboratory of Organic Geochemistry, Tokyo University of Agriculture and Technology, Fuchu, Tokyo 183-8509 (Japan)]. E-mail: shige@cc.tuat.ac.jp
2007-05-15
This paper reports the first reconstruction of a pollution history in tropical Asia from sediment cores. Four sediment core samples were collected from an offshore transect in the upper Gulf of Thailand and were analyzed for organic micropollutants. The cores were dated by measurement of 137Cs and geochronometric molecular markers (linear alkylbenzenes, LABs; and tetrapropylene-type alkylbenzenes, TABs). Polychlorinated biphenyl (PCB) concentrations showed a subsurface maximum in layers corresponding to the 1970s, indicating the effectiveness of regulation of PCBs in Thailand. LAB concentrations increased over time, indicating the increase in input of sewage into the Gulf during the last 30 years. Hopanes, biomarkers of petroleum pollution, also increased over time, indicating that the inputs of automobile-derived hydrocarbons to the coastal zone have been increasing owing to the increased number of cars in Thailand since the 1950s. Polycyclic aromatic hydrocarbons (PAHs) increased in the layers corresponding to the 1950s and 1960s, probably because of the increased inputs of automobile-derived PAHs. PAH concentrations in the upper layers corresponding to the 1970s and later remained constant or increased. The absence of a subsurface maximum of PAHs contrasts with results observed in industrialized countries. This can be explained by the facts that the Thai economy did not depend on coal as an energy source in the 1960s and that economic growth has continued since the 1970s to the present. The deposition flux of PAHs and hopanes showed a dramatic offshore decrease, whereas that of LABs was uniform.
Bayesian Vision for Shape Recovery
Jalobeanu, Andre
2004-01-01
We present a new Bayesian vision technique that aims at recovering a shape from two or more noisy observations taken under similar lighting conditions. The shape is parametrized by a piecewise linear height field, textured by a piecewise linear irradiance field, and we assume Gaussian Markovian priors for both shape vertices and irradiance variables. The observation process, also known as rendering, is modeled by a non-affine projection (e.g. perspective projection) followed by a convolution with a piecewise linear point spread function, and contamination by additive Gaussian noise. We assume that the observation parameters are calibrated beforehand. The major novelty of the proposed method consists of marginalizing out the irradiances considered as nuisance parameters, which is achieved by Laplace approximations. This reduces the inference to minimizing an energy that only depends on the shape vertices, and therefore allows an efficient Iterated Conditional Mode (ICM) optimization scheme to be implemented. A Gaussian approximation of the posterior shape density is computed, thus providing estimates of both the geometry and its uncertainty. We illustrate the effectiveness of the new method by shape reconstruction results in a 2D case. A 3D version is currently under development and aims at recovering a surface from multiple images, reconstructing the topography by marginalizing out both albedo and shading.
The NIFTY way of Bayesian signal inference
We introduce NIFTY, 'Numerical Information Field Theory', a software package for the development of Bayesian signal inference algorithms that operate independently from any underlying spatial grid and its resolution. A large number of Bayesian and Maximum Entropy methods for 1D signal reconstruction, 2D imaging, as well as 3D tomography, appear formally similar, but one often finds individualized implementations that are neither flexible nor easily transferable. Signal inference in the framework of NIFTY can be done in an abstract way, such that algorithms, prototyped in 1D, can be applied to real world problems in higher-dimensional settings. NIFTY as a versatile library is applicable and already has been applied in 1D, 2D, 3D and spherical settings. A recent application is the D3PO algorithm targeting the non-trivial task of denoising, deconvolving, and decomposing photon observations in high energy astronomy.
2015-01-01
The simple depth imaging device gains more and more attention because of its lower cost and ease of use compared with traditional motion capture systems. However, such devices lack the basic data conditions for 3D motion reconstruction owing to low resolution, occlusions, and mixing up of body parts. In this paper, a Dynamic Bayesian Network (DBN) model is proposed to describe the spatial and temporal characteristics of human body joints, based on fusing the parent-child relations of joints with the multi-order Markov property of joints during motion. A golf swing capture and reconstruction system, DBN-Motion (DBN-based Motion reconstruction system), is presented based on the DBN model and the similarity of swings, with a simple depth imaging device, Kinect, as the capturing device. The proposed system effectively solves the problem of occlusions and mixing up of body parts, and successfully captures and reconstructs golf swings in 3D space. Experimental results show that the system achieves reconstruction accuracy comparable to a commercial optical motion capture system.
Preliminary investigation of a Bayesian network for mammographic diagnosis of breast cancer.
Kahn, C. E.; Roberts, L. M.; K. Wang; Jenks, D.; Haddawy, P.
1995-01-01
Bayesian networks use the techniques of probability theory to reason under conditions of uncertainty. We investigated the use of Bayesian networks for radiological decision support. A Bayesian network for the interpretation of mammograms (MammoNet) was developed based on five patient-history features, two physical findings, and 15 mammographic features extracted by experienced radiologists. Conditional-probability data, such as sensitivity and specificity, were derived from peer-reviewed jour...
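The kind of reasoning a network like MammoNet performs can be reduced, in the two-node case, to Bayes' rule. A sketch with hypothetical probabilities (not MammoNet's actual conditional-probability tables):

```python
# Two-node sketch of MammoNet-style reasoning: Cancer -> Finding.
# All probabilities are hypothetical, chosen only to show the mechanics.
p_cancer = 0.01                 # prior (e.g., from patient history)
p_find_given_cancer = 0.85      # sensitivity of the mammographic feature
p_find_given_healthy = 0.10     # false-positive rate

# Posterior by Bayes' rule, given that the finding is present.
num = p_find_given_cancer * p_cancer
den = num + p_find_given_healthy * (1 - p_cancer)
posterior = num / den
print(round(posterior, 3))  # -> 0.079
```

Even a sensitive finding yields a modest posterior when the prior is low, which is why such networks combine many features and patient-history variables.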
St John, K. K.; Jones, M. H.; Leckie, R. M.; Pound, K. S.; Krissek, L. A.
2013-12-01
develop detailed instructor guides to accompany each module. After careful consideration of dissemination options, we choose to publish the full suite of exercise modules as a commercially-available book, Reconstructing Earth's Climate History, while also providing open online access to a subset of modules. Its current use in undergraduate paleoclimatology courses, and the availability of select modules for use in other courses demonstrate that creative, hybrid options can be found for lasting dissemination, and thus sustainability. In achieving our goal of making science accessible, we believe we have followed a curriculum development process and sustainability path that can be used by others to meet needs in earth, ocean, and atmospheric science education. Next steps for REaCH include exploration of its use in blended learning classrooms, and at minority serving institutions.
Bayesian least squares deconvolution
Asensio Ramos, A.; Petit, P.
2015-11-01
Aims: We develop a fully Bayesian least squares deconvolution (LSD) that can be applied to the reliable detection of magnetic signals in noise-limited stellar spectropolarimetric observations using multiline techniques. Methods: We consider LSD under the Bayesian framework and we introduce a flexible Gaussian process (GP) prior for the LSD profile. This prior allows the result to automatically adapt to the presence of signal. We exploit several linear algebra identities to accelerate the calculations. The final algorithm can deal with thousands of spectral lines in a few seconds. Results: We demonstrate the reliability of the method with synthetic experiments and we apply it to real spectropolarimetric observations of magnetic stars. We are able to recover the magnetic signals using a small number of spectral lines, together with the uncertainty at each velocity bin. This allows the user to consider if the detected signal is reliable. The code to compute the Bayesian LSD profile is freely available.
Bayesian least squares deconvolution
Ramos, A Asensio
2015-01-01
Aims. To develop a fully Bayesian least squares deconvolution (LSD) that can be applied to the reliable detection of magnetic signals in noise-limited stellar spectropolarimetric observations using multiline techniques. Methods. We consider LSD under the Bayesian framework and we introduce a flexible Gaussian Process (GP) prior for the LSD profile. This prior allows the result to automatically adapt to the presence of signal. We exploit several linear algebra identities to accelerate the calculations. The final algorithm can deal with thousands of spectral lines in a few seconds. Results. We demonstrate the reliability of the method with synthetic experiments and we apply it to real spectropolarimetric observations of magnetic stars. We are able to recover the magnetic signals using a small number of spectral lines, together with the uncertainty at each velocity bin. This allows the user to consider if the detected signal is reliable. The code to compute the Bayesian LSD profile is freely available.
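Under a Gaussian noise model and a GP prior, the LSD profile's posterior mean has a closed form, which is what makes the method above fast. A toy sketch with an assumed line mask and kernel (sizes, depths, and lengthscales are all illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy LSD setup: the observed spectrum is a line mask M applied to a
# common profile z, plus noise. Sizes and kernels are illustrative.
n_pix, n_vel = 120, 21
M = np.zeros((n_pix, n_vel))
for center, depth in [(20, 0.9), (55, 0.5), (90, 0.7)]:   # three "lines"
    M[center:center + n_vel, :] += depth * np.eye(n_vel)

v = np.arange(n_vel)
z_true = np.exp(-0.5 * ((v - 10) / 3.0) ** 2)             # common profile
sigma = 0.05
y = M @ z_true + sigma * rng.standard_normal(n_pix)

# GP prior on the profile: squared-exponential kernel K.
K = np.exp(-0.5 * ((v[:, None] - v[None, :]) / 4.0) ** 2)

# Posterior mean of z under z ~ N(0, K), y | z ~ N(M z, sigma^2 I).
S = M @ K @ M.T + sigma ** 2 * np.eye(n_pix)
z_post = K @ M.T @ np.linalg.solve(S, y)

print(float(np.max(z_post)))  # peaks near the true profile maximum
```

The linear-algebra identities mentioned in the abstract serve to avoid forming and inverting the large matrix `S` explicitly when thousands of lines are involved.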
Loredo, T J
2004-01-01
I describe a framework for adaptive scientific exploration based on iterating an Observation--Inference--Design cycle that allows adjustment of hypotheses and observing protocols in response to the results of observation on-the-fly, as data are gathered. The framework uses a unified Bayesian methodology for the inference and design stages: Bayesian inference to quantify what we have learned from the available data and predict future data, and Bayesian decision theory to identify which new observations would teach us the most. When the goal of the experiment is simply to make inferences, the framework identifies a computationally efficient iterative ``maximum entropy sampling'' strategy as the optimal strategy in settings where the noise statistics are independent of signal properties. Results of applying the method to two ``toy'' problems with simulated data--measuring the orbit of an extrasolar planet, and locating a hidden one-dimensional object--show the approach can significantly improve observational eff...
Bayesian and frequentist inequality tests
David M. Kaplan; Zhuo, Longhao
2016-01-01
Bayesian and frequentist criteria are fundamentally different, but often posterior and sampling distributions are asymptotically equivalent (and normal). We compare Bayesian and frequentist hypothesis tests of inequality restrictions in such cases. For finite-dimensional parameters, if the null hypothesis is that the parameter vector lies in a certain half-space, then the Bayesian test has (frequentist) size $\\alpha$; if the null hypothesis is any other convex subspace, then the Bayesian test...
Bayesian multiple target tracking
Streit, Roy L
2013-01-01
This second edition has undergone substantial revision from the 1999 first edition, recognizing that a lot has changed in the multiple target tracking field. One of the most dramatic changes is in the widespread use of particle filters to implement nonlinear, non-Gaussian Bayesian trackers. This book views multiple target tracking as a Bayesian inference problem. Within this framework it develops the theory of single target tracking, multiple target tracking, and likelihood ratio detection and tracking. In addition to providing a detailed description of a basic particle filter that implements
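The particle filters highlighted above can be sketched in their simplest, single-target bootstrap form: predict through the motion model, weight by the measurement likelihood, and resample. All numerical values here are illustrative.

```python
import math
import random

random.seed(4)

# Minimal bootstrap particle filter for one target in 1D, the building
# block generalized to multiple targets (all values illustrative).
n_part, q, r = 500, 0.1, 0.5       # particles, process noise, meas. noise
particles = [random.gauss(0.0, 1.0) for _ in range(n_part)]

x_true = 0.0
for step in range(30):
    x_true += 1.0 + random.gauss(0.0, q)            # target moves
    z = x_true + random.gauss(0.0, r)               # noisy measurement

    # Predict: propagate each particle through the motion model.
    particles = [p + 1.0 + random.gauss(0.0, q) for p in particles]
    # Update: weight each particle by the measurement likelihood.
    w = [math.exp(-0.5 * ((z - p) / r) ** 2) for p in particles]
    tot = sum(w)
    w = [wi / tot for wi in w]
    # Resample (multinomial) to concentrate particles on likely states.
    particles = random.choices(particles, weights=w, k=n_part)

estimate = sum(particles) / n_part
print(round(estimate, 1), round(x_true, 1))  # estimate tracks the truth
```

Nothing here assumes linearity or Gaussian posteriors, which is exactly why particle implementations displaced Kalman-style trackers in the nonlinear, non-Gaussian settings the book covers.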
Bayesian Exploratory Factor Analysis
Conti, Gabriella; Frühwirth-Schnatter, Sylvia; Heckman, James J.;
2014-01-01
This paper develops and applies a Bayesian approach to Exploratory Factor Analysis that improves on ad hoc classical approaches. Our framework relies on dedicated factor models and simultaneously determines the number of factors, the allocation of each measurement to a unique factor, and the corresponding factor loadings. Classical identification criteria are applied and integrated into our Bayesian procedure to generate models that are stable and clearly interpretable. A Monte Carlo study confirms the validity of the approach. The method is used to produce interpretable low dimensional aggregates...
Bayesian Geostatistical Design
Diggle, Peter; Lophaven, Søren Nymand
2006-01-01
locations to, or deletion of locations from, an existing design, and prospective design, which consists of choosing positions for a new set of sampling locations. We propose a Bayesian design criterion which focuses on the goal of efficient spatial prediction whilst allowing for the fact that model...
Krejsa, Jiří; Věchet, S.
Bratislava: Slovak University of Technology in Bratislava, 2010, s. 217-222. ISBN 978-80-227-3353-3. [Robotics in Education, Bratislava (SK), 16.09.2010-17.09.2010] Institutional research plan: CEZ:AV0Z20760514. Keywords: mobile robot localization; bearing-only beacons; Bayesian filters. Subject RIV: JD - Computer Applications, Robotics
Antoniou, Constantinos; Harrison, Glenn W.; Lau, Morten I.;
2015-01-01
A large literature suggests that many individuals do not apply Bayes' Rule when making decisions that depend on their correctly pooling prior information and sample data. We replicate and extend a classic experimental study of Bayesian updating from psychology, employing the methods of experimental...
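The kind of updating task studied in such experiments reduces to a direct application of Bayes' Rule: a prior over two urns of known composition, updated by a sequence of observed draws. The compositions and prior below are illustrative assumptions:

```python
# Posterior probability of urn A after a sequence of 'R'/'B' draws.
# Urn compositions and prior are illustrative, not the study's parameters.
def posterior_urn_a(prior_a, p_red_a, p_red_b, draws):
    like_a = like_b = 1.0
    for d in draws:  # independent draws, so likelihoods multiply
        like_a *= p_red_a if d == "R" else (1.0 - p_red_a)
        like_b *= p_red_b if d == "R" else (1.0 - p_red_b)
    # Bayes' Rule: P(A | data) = P(data | A) P(A) / P(data)
    num = like_a * prior_a
    return num / (num + like_b * (1.0 - prior_a))

# Urn A is 70% red, urn B is 30% red; flat prior; observe red, red, blue.
p = posterior_urn_a(0.5, 0.7, 0.3, "RRB")  # -> 0.7
```

Experimental subjects are typically asked to report a probability after such draws, and deviations from this normative posterior are what the literature measures.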
Bayesian Independent Component Analysis
Winther, Ole; Petersen, Kaare Brandt
2007-01-01
In this paper we present an empirical Bayesian framework for independent component analysis. The framework provides estimates of the sources, the mixing matrix and the noise parameters, and is flexible with respect to choice of source prior and the number of sources and sensors. Inside the engine...
Noncausal Bayesian Vector Autoregression
Lanne, Markku; Luoto, Jani
We propose a Bayesian inferential procedure for the noncausal vector autoregressive (VAR) model that is capable of capturing nonlinearities and incorporating effects of missing variables. In particular, we devise a fast and reliable posterior simulator that yields the predictive distribution as a...
Loredo, Thomas J.
2004-04-01
I describe a framework for adaptive scientific exploration based on iterating an Observation-Inference-Design cycle that allows hypotheses and observing protocols to be adjusted on the fly, in response to the results of observation as data are gathered. The framework uses a unified Bayesian methodology for the inference and design stages: Bayesian inference to quantify what we have learned from the available data and predict future data, and Bayesian decision theory to identify which new observations would teach us the most. When the goal of the experiment is simply to make inferences, the framework identifies a computationally efficient iterative "maximum entropy sampling" strategy as the optimal strategy in settings where the noise statistics are independent of signal properties. Results of applying the method to two "toy" problems with simulated data (measuring the orbit of an extrasolar planet, and locating a hidden one-dimensional object) show that the approach can significantly improve observational efficiency in settings that have well-defined nonlinear models. I conclude with a list of open issues that must be addressed to make Bayesian adaptive exploration a practical and reliable tool for optimizing scientific exploration.
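For a conjugate Gaussian linear model, the maximum entropy sampling rule described above reduces to picking the candidate design point with the largest posterior predictive variance, since the entropy of a Gaussian grows with its variance. The model, candidate grid, prior scale, and noise level in this sketch are illustrative assumptions, not the paper's test problems:

```python
# One Design step of an Observation-Inference-Design cycle via
# maximum entropy sampling for a Bayesian line fit y = w0 + w1*x + noise.
# Grid, prior, and noise variance are illustrative assumptions.
import numpy as np

def next_design_point(xs_candidate, X_obs, y_obs, sigma2=0.25, tau2=4.0):
    Phi = np.column_stack([np.ones_like(X_obs), X_obs])
    # Inference: conjugate posterior over weights, prior w ~ N(0, tau2*I).
    S = np.linalg.inv(np.eye(2) / tau2 + Phi.T @ Phi / sigma2)
    # Design: predictive variance at each candidate x, then take the argmax.
    Phi_c = np.column_stack([np.ones_like(xs_candidate), xs_candidate])
    pred_var = sigma2 + np.einsum("ij,jk,ik->i", Phi_c, S, Phi_c)
    return xs_candidate[np.argmax(pred_var)]

xs = np.linspace(-1.0, 1.0, 21)
# Observations clustered near x = 0, so uncertainty (and hence the
# entropy of the predictive distribution) is largest at the grid edges.
x_next = next_design_point(xs, np.array([-0.1, 0.0, 0.1]),
                           np.array([0.9, 1.0, 1.1]))
```

Iterating this loop (observe at `x_next`, update the posterior, choose again) is the cycle the abstract describes; for nonlinear models the predictive variance is simply replaced by a predictive entropy estimate.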
Bayesian logistic regression analysis
Van Erp, H.R.N.; Van Gelder, P.H.A.J.M.
2012-01-01
In this paper we present a Bayesian logistic regression analysis. It is found that if one wishes to derive the posterior distribution of the probability of some event, then, together with the traditional Bayes' Theorem and the integrating out of nuisance parameters, the Jacobian transformation is an...
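The posterior over logistic regression coefficients has no closed form, so in practice it is explored by sampling. The random-walk Metropolis sketch below illustrates that idea; the simulated data, prior scale, and step size are illustrative assumptions and not the paper's derivation, which centers on the Jacobian transformation:

```python
# Random-walk Metropolis sampler for Bayesian logistic regression.
# Data, Gaussian prior scale, and proposal step are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)

def log_post(beta, X, y, prior_sd=10.0):
    """Log posterior: Bernoulli log-likelihood plus Gaussian prior on beta."""
    z = X @ beta
    loglik = np.sum(y * z - np.log1p(np.exp(z)))
    logprior = -0.5 * np.sum((beta / prior_sd) ** 2)
    return loglik + logprior

def metropolis(X, y, n_iter=4000, step=0.3):
    beta = np.zeros(X.shape[1])
    lp = log_post(beta, X, y)
    samples = []
    for _ in range(n_iter):
        prop = beta + rng.normal(0.0, step, beta.shape)
        lp_prop = log_post(prop, X, y)
        if np.log(rng.uniform()) < lp_prop - lp:   # Metropolis accept/reject
            beta, lp = prop, lp_prop
        samples.append(beta)
    return np.array(samples[n_iter // 2:])         # discard first half as burn-in

# Simulated data with intercept 0.5 and true slope +2.
x = rng.normal(size=200)
X = np.column_stack([np.ones(200), x])
y = (rng.uniform(size=200) < 1.0 / (1.0 + np.exp(-(0.5 + 2.0 * x)))).astype(float)
draws = metropolis(X, y)
```

Posterior summaries (means, credible intervals) then come directly from `draws`; pushing the sampled coefficients through the logistic function gives the posterior for the event probability the abstract refers to.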