Directory of Open Access Journals (Sweden)
Quentin Rougemont
2016-04-01
Inferring the history of isolation and gene flow during species divergence is a central question in evolutionary biology. The European river lamprey (Lampetra fluviatilis) and brook lamprey (L. planeri) show low reproductive isolation but have highly distinct life histories, the former being parasitic and anadromous and the latter non-parasitic and freshwater-resident. Here we used microsatellite data from six replicated population pairs to reconstruct their history of divergence using an approximate Bayesian computation framework combined with a random forest model. In most population pairs, scenarios of divergence with recent isolation were outcompeted by scenarios proposing ongoing gene flow, namely the Secondary Contact (SC) and Isolation with Migration (IM) models. The estimation of demographic parameters under the SC model indicated a time of secondary contact close to the time of speciation, explaining why the SC and IM models could not be discriminated. In the case of an ancient secondary contact, the historical signal of divergence is lost and neutral markers converge to the same equilibrium as under the less parameterized model allowing ongoing gene flow. Our results imply that models of secondary contact should be systematically compared with models of divergence with gene flow; given the difficulty of discriminating among these models, we suggest that genome-wide data are needed to adequately reconstruct divergence history.
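The approximate Bayesian computation (ABC) framework in this record can be illustrated with a minimal rejection sampler: draw parameters from the prior, simulate data, and keep the draws whose summary statistic best matches the observation. This is a hypothetical toy sketch, not the authors' pipeline; the Gaussian simulator, uniform prior, and all numbers are illustrative stand-ins for a coalescent simulator and demographic parameters.

```python
import random
import statistics

def simulate(mean, n=50, rng=None):
    """Toy data generator standing in for a population-genetic simulator."""
    rng = rng or random
    return [rng.gauss(mean, 1.0) for _ in range(n)]

def abc_rejection(observed, prior_draw, n_sims=2000, keep_frac=0.05, seed=1):
    """Keep the prior draws whose simulated summary statistic
    (here, the sample mean) lies closest to the observed one."""
    rng = random.Random(seed)
    obs_stat = statistics.mean(observed)
    trials = []
    for _ in range(n_sims):
        theta = prior_draw(rng)
        stat = statistics.mean(simulate(theta, rng=rng))
        trials.append((abs(stat - obs_stat), theta))
    trials.sort(key=lambda t: t[0])
    # the accepted draws approximate the posterior sample
    return [theta for _, theta in trials[: int(n_sims * keep_frac)]]

observed = simulate(2.0, rng=random.Random(0))   # "true" parameter is 2.0
posterior = abc_rejection(observed, prior_draw=lambda r: r.uniform(-5, 5))
print(round(statistics.mean(posterior), 2))
```

In the paper's setting the kept simulations would additionally be fed to a random forest classifier to choose among the SC, IM, and isolation models rather than to estimate a single parameter.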
Bayesian History Reconstruction of Complex Human Gene Clusters on a Phylogeny
Vinař, Tomáš; Song, Giltae; Siepel, Adam
2009-01-01
Clusters of genes that have evolved by repeated segmental duplication present difficult challenges throughout genomic analysis, from sequence assembly to functional analysis. Improved understanding of these clusters is of utmost importance, since they have been shown to be the source of evolutionary innovation, and have been linked to multiple diseases, including HIV and a variety of cancers. Previously, Zhang et al. (2008) developed an algorithm for reconstructing parsimonious evolutionary histories of such gene clusters, using only human genomic sequence data. In this paper, we propose a probabilistic model for the evolution of gene clusters on a phylogeny, and an MCMC algorithm for reconstruction of duplication histories from genomic sequences in multiple species. Several projects are underway to obtain high quality BAC-based assemblies of duplicated clusters in multiple species, and we anticipate that our method will be useful in analyzing these valuable new data sets.
Bayesian image reconstruction: Application to emission tomography
Energy Technology Data Exchange (ETDEWEB)
Nunez, J.; Llacer, J.
1989-02-01
In this paper we propose a Maximum a Posteriori (MAP) method of image reconstruction in the Bayesian framework for the Poisson noise case. We use entropy to define the prior probability and the likelihood to define the conditional probability. The method uses sharpness parameters which can be theoretically computed or adjusted, allowing us to obtain MAP reconstructions without the problem of the "grey" reconstructions associated with pre-Bayesian reconstructions. We have developed several ways to solve the reconstruction problem and propose a new iterative algorithm which is stable, maintains positivity, and converges to feasible images faster than the Maximum Likelihood Estimate method. We have successfully applied the new method to emission tomography, with both simulated and real data. 41 refs., 4 figs., 1 tab.
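The Maximum Likelihood Estimate baseline that this record compares against is the classical MLEM iteration for Poisson data, whose multiplicative update preserves positivity automatically. A minimal sketch under illustrative assumptions (a tiny 2x2 system matrix; the paper's entropy-prior MAP variant would add a prior term to this update):

```python
def mlem(A, y, n_iter=100):
    """Maximum-likelihood EM updates for Poisson data y ~ Poisson(A x).
    The multiplicative form keeps the estimate nonnegative at every step."""
    n_det, n_pix = len(A), len(A[0])
    x = [1.0] * n_pix                                  # flat initial image
    col_sum = [sum(A[i][j] for i in range(n_det)) for j in range(n_pix)]
    for _ in range(n_iter):
        # forward-project the current image
        proj = [sum(A[i][j] * x[j] for j in range(n_pix)) for i in range(n_det)]
        ratio = [y[i] / proj[i] if proj[i] > 0 else 0.0 for i in range(n_det)]
        # back-project the data/model ratio and rescale
        back = [sum(A[i][j] * ratio[i] for i in range(n_det)) for j in range(n_pix)]
        x = [x[j] * back[j] / col_sum[j] for j in range(n_pix)]
    return x

# 2 detector bins, 2 pixels; y is exact data for the image [3, 1],
# so the iteration should converge toward that image.
A = [[1.0, 0.0], [0.5, 0.5]]
y = [3.0, 2.0]
x = mlem(A, y)
print([round(v, 2) for v in x])
```

With noisy data MLEM eventually amplifies noise, which is exactly the behavior the entropy prior in the MAP formulation is meant to control.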
Bayesian Image Reconstruction Based on Voronoi Diagrams
Cabrera, G F; Hitschfeld, N
2007-01-01
We present a Bayesian Voronoi image reconstruction technique (VIR) for interferometric data. Bayesian analysis applied to the inverse problem allows us to derive the a-posteriori probability of a novel parameterization of interferometric images. We use a variable Voronoi diagram as our model in place of the usual fixed pixel grid. A quantization of the intensity field allows us to calculate the likelihood function and a-priori probabilities. The Voronoi image is optimized including the number of polygons as free parameters. We apply our algorithm to deconvolve simulated interferometric data. Residuals, restored images and chi^2 values are used to compare our reconstructions with fixed grid models. VIR has the advantage of modeling the image with few parameters, obtaining a better image from a Bayesian point of view.
Photogrammetric Reconstruction with Bayesian Information
Masiero, A.; Fissore, F.; Guarnieri, A.; Pirotti, F.; Vettore, A.
2016-06-01
Nowadays photogrammetry and laser scanning are the most widespread surveying techniques. Laser scanning usually yields more accurate results than photogrammetry, but its use has some drawbacks, e.g. the high cost of the instrumentation and the typical need for highly qualified personnel to acquire data in the field. By contrast, photogrammetric reconstruction can be achieved with low-cost devices and by persons without specific training. Furthermore, the recent diffusion of smart devices (e.g. smartphones) embedded with imaging and positioning sensors (i.e. a standard camera, GNSS receiver, and inertial measurement unit) opens the possibility of integrating more information into the photogrammetric reconstruction procedure in order to increase its computational efficiency, robustness, and accuracy. In accordance with these observations, this paper examines and validates new possibilities for integrating information provided by the inertial measurement unit (IMU) into the photogrammetric reconstruction procedure and, more specifically, into the procedures for solving the feature matching and bundle adjustment problems.
Bayesian Cosmic Web Reconstruction: BARCODE for Clusters
Bos, E G Patrick; Kitaura, Francisco; Cautun, Marius
2016-01-01
We describe the Bayesian BARCODE formalism, designed for the reconstruction of the Cosmic Web in a given volume on the basis of the sampled galaxy cluster distribution. It is based on the realization that the massive compact clusters are responsible for the major share of the large-scale tidal force field shaping the anisotropic and, in particular, filamentary features in the Cosmic Web. Given the nonlinearity of the constraints imposed by the cluster configurations, we resort to a state-of-the-art constrained reconstruction technique to find a properly statistically sampled realization of the original initial density and velocity field in the same cosmic region. Ultimately, the subsequent gravitational evolution of these initial conditions towards the implied Cosmic Web configuration can be followed on the basis of a proper analytical model or an N-body computer simulation. The BARCODE formalism includes an implicit treatment of redshift space distortions. This enables a direct reconstruction on the ...
Bayesian Cosmic Web Reconstruction: BARCODE for Clusters
Bos, E. G. Patrick; van de Weygaert, Rien; Kitaura, Francisco; Cautun, Marius
2016-10-01
We describe the Bayesian BARCODE formalism, designed for the reconstruction of the Cosmic Web in a given volume on the basis of the sampled galaxy cluster distribution. It is based on the realization that the massive compact clusters are responsible for the major share of the large-scale tidal force field shaping the anisotropic and, in particular, filamentary features in the Cosmic Web. Given the nonlinearity of the constraints imposed by the cluster configurations, we resort to a state-of-the-art constrained reconstruction technique to find a properly statistically sampled realization of the original initial density and velocity field in the same cosmic region. Ultimately, the subsequent gravitational evolution of these initial conditions towards the implied Cosmic Web configuration can be followed on the basis of a proper analytical model or an N-body computer simulation. The BARCODE formalism includes an implicit treatment of redshift space distortions. This enables a direct reconstruction on the basis of observational data, without the need for a correction of redshift space artifacts. In this contribution we provide a general overview of the Cosmic Web connection with clusters and a description of the Bayesian BARCODE formalism. We conclude with a presentation of its successful workings with respect to test runs based on a simulated large-scale matter distribution, in physical space as well as in redshift space.
Structure-based bayesian sparse reconstruction
Quadeer, Ahmed Abdul
2012-12-01
Sparse signal reconstruction algorithms have attracted research attention due to their wide applications in various fields. In this paper, we present a simple Bayesian approach that utilizes the sparsity constraint and a priori statistical information (Gaussian or otherwise) to obtain near optimal estimates. In addition, we make use of the rich structure of the sensing matrix encountered in many signal processing applications to develop a fast sparse recovery algorithm. The computational complexity of the proposed algorithm is very low compared with the widely used convex relaxation methods as well as greedy matching pursuit techniques, especially at high sparsity. © 1991-2012 IEEE.
Bayesian Probabilities and the Histories Algebra
Marlow, Thomas
2006-01-01
We attempt a justification of a generalisation of the consistent histories programme using a notion of probability that is valid for all complete sets of history propositions. This consists of introducing Cox's axioms of probability theory and showing that our candidate notion of probability obeys them. We also give a generalisation of Bayes' theorem and comment upon how Bayesianism should be useful for the quantum gravity/cosmology programmes.
Photoacoustic image reconstruction based on Bayesian compressive sensing algorithm
Institute of Scientific and Technical Information of China (English)
Mingjian Sun; Naizhang Feng; Yi Shen; Jiangang Li; Liyong Ma; Zhenghua Wu
2011-01-01
The photoacoustic tomography (PAT) method based on compressive sensing (CS) theory requires that, for CS reconstruction, the desired image have a sparse representation in a known transform domain. However, the sparsity of photoacoustic signals is destroyed because noise is always present. Therefore, the original sparse signal cannot be effectively recovered using general reconstruction algorithms. In this study, Bayesian compressive sensing (BCS) is employed to obtain highly sparse representations of photoacoustic images based on a set of noisy CS measurements. Simulation results demonstrate that the BCS-reconstructed image achieves superior performance compared with other state-of-the-art CS reconstruction algorithms.
Hierarchical Bayesian sparse image reconstruction with application to MRFM
Dobigeon, Nicolas; Tourneret, Jean-Yves
2008-01-01
This paper presents a hierarchical Bayesian model to reconstruct sparse images when the observations are obtained from linear transformations and corrupted by additive white Gaussian noise. Our hierarchical Bayes model is well suited to naturally sparse image applications, as it seamlessly accounts for properties such as sparsity and positivity of the image via appropriate Bayes priors. We propose a prior based on a weighted mixture of a positive exponential distribution and a mass at zero. The prior has hyperparameters that are tuned automatically by marginalization over the hierarchical Bayesian model. To overcome the complexity of the posterior distribution, a Gibbs sampling strategy is proposed. The Gibbs samples can be used to estimate the image to be recovered, e.g. by maximizing the estimated posterior distribution. In our fully Bayesian approach the posteriors of all the parameters are available. Thus our algorithm provides more information than other previously proposed sparse reconstr...
PDE regularization for Bayesian reconstruction of emission tomography
Wang, Zhentian; Zhang, Li; Xing, Yuxiang; Zhao, Ziran
2008-03-01
The aim of the present study is to investigate a type of Bayesian reconstruction which uses partial differential equation (PDE) image models as regularization. PDE image models are widely used in image restoration and segmentation. In a PDE model, the image can be viewed as the solution of an evolutionary differential equation, and its variation can be regarded as the descent of an energy function, which allows us to use PDE models in Bayesian reconstruction. In this paper, two PDE models known as anisotropic diffusion are studied. Both are edge-preserving and denoising, like the popular median root prior (MRP). We use PDE regularization with an ordered-subsets-accelerated Bayesian one-step-late (OSL) reconstruction algorithm for emission tomography; the OS-accelerated OSL algorithm is more practical than a non-accelerated one. The proposed algorithm is called OSEM-PDE. We validated OSEM-PDE on a Zubal phantom in numerical experiments with attenuation correction and quantum noise considered, and compared the results with OSEM and an OS version of MRP (OSEM-MRP) reconstruction. OSEM-PDE gives better results in both bias and variance. The reconstructed images are smoother and have sharper edges, and are thus better suited to post-processing such as segmentation, which we validate using a k-means segmentation algorithm. The classic OSEM is not convergent, especially in noisy conditions; in our experiments, however, OSEM-PDE benefits from OS acceleration and remains stable and convergent while OSEM-MRP fails to converge.
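The edge-preserving behavior of anisotropic diffusion described in this record can be seen in a one-dimensional Perona-Malik update: a conductance that decays with the local gradient smooths flat regions while leaving sharp edges almost untouched. This is a generic illustrative sketch (the conductance function, kappa, step size, and signal are all assumptions, not the paper's model):

```python
import math

def perona_malik_step(u, kappa=1.0, dt=0.2):
    """One explicit 1-D anisotropic-diffusion update. The conductance
    g = exp(-(grad/kappa)^2) is ~1 where the signal is flat (smoothing)
    and ~0 across large gradients (edge preservation)."""
    out = u[:]                        # endpoints held fixed
    for i in range(1, len(u) - 1):
        grad_r = u[i + 1] - u[i]
        grad_l = u[i - 1] - u[i]
        g_r = math.exp(-(grad_r / kappa) ** 2)
        g_l = math.exp(-(grad_l / kappa) ** 2)
        out[i] = u[i] + dt * (g_r * grad_r + g_l * grad_l)
    return out

# Noisy step edge: the small wiggles flatten out,
# while the 0 -> 10 jump survives the smoothing.
u = [0.0, 0.3, -0.2, 0.1, 10.0, 9.8, 10.2, 10.0]
for _ in range(20):
    u = perona_malik_step(u)
print([round(v, 2) for v in u])
```

In the OSL reconstruction described above, the negative of such a diffusion step plays the role of the prior's gradient in the one-step-late update.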
Chakraborty, Shubhankar; Roy Chaudhuri, Partha; Das, Prasanta Kr
2016-07-01
In this communication, a novel optical technique has been proposed for the reconstruction of the shape of a Taylor bubble using measurements from multiple arrays of optical sensors. The deviation of an optical beam passing through the bubble depends on the contour of bubble surface. A theoretical model of the deviation of a beam during the traverse of a Taylor bubble through it has been developed. Using this model and the time history of the deviation captured by the sensor array, the bubble shape has been reconstructed. The reconstruction has been performed using an inverse algorithm based on Bayesian inference technique and Markov chain Monte Carlo sampling algorithm. The reconstructed nose shape has been compared with the true shape, extracted through image processing of high speed images. Finally, an error analysis has been performed to pinpoint the sources of the errors.
Bayesian 3d velocity field reconstruction with VIRBIuS
Lavaux, G
2015-01-01
I describe a new Bayesian algorithm to infer the full three-dimensional velocity field from observed distances and spectroscopic galaxy catalogues. In addition to the velocity field itself, the algorithm reconstructs true distances, some cosmological parameters, and specific non-linearities in the velocity field. The algorithm takes care of selection effects and miscalibration issues and can easily be extended to handle direct fitting of, e.g., the inverse Tully-Fisher relation. I first describe the algorithm in detail alongside its performance. This algorithm is implemented in the VIRBIuS (VelocIty Reconstruction using Bayesian Inference Software) software package. I then test it on different mock distance catalogues with varying complexity of observational issues. The model proved to give robust measurements of velocities for mock catalogues of 3,000 galaxies. I expect the core of the algorithm to scale to tens of thousands of galaxies. It holds the promise of giving a better handle on future large and d...
Reconstructing Fire History: An Exercise in Dendrochronology
Lafon, Charles W.
2005-01-01
Dendrochronology is used widely to reconstruct the history of forest disturbances. I created an exercise that introduces the use of dendrochronology to investigate fire history and forest dynamics. The exercise also demonstrates how the dendrochronological technique of crossdating is employed to age dead trees and identify missing rings. I…
Directory of Open Access Journals (Sweden)
Ricardo Fernandes
Human and animal diet reconstruction studies that rely on tissue chemical signatures aim at providing estimates of the relative intake of potential food groups. However, several sources of uncertainty need to be considered when handling data. Bayesian mixing models provide a natural platform to handle diverse sources of uncertainty while allowing the user to contribute prior expert information. The Bayesian mixing model FRUITS (Food Reconstruction Using Isotopic Transferred Signals) was developed for use in diet reconstruction studies. FRUITS incorporates the capability to account for dietary routing, that is, the contribution of different food fractions (e.g. macronutrients) towards a dietary proxy signal measured in the consumer. FRUITS also provides relatively straightforward means for the introduction of prior information on the relative dietary contributions of food groups or food fractions. This type of prior may originate, for instance, from physiological or metabolic studies. FRUITS' performance was tested using simulated data and data from a published controlled animal feeding experiment. The feeding experiment data were selected to exemplify the application of the novel capabilities incorporated into FRUITS, but also to illustrate some of the aspects that need to be considered when handling data within diet reconstruction studies. FRUITS accurately predicted dietary intakes, and more precise estimates were obtained for dietary scenarios in which expert prior information was included. FRUITS represents a useful tool for achieving accurate and precise food intake estimates in diet reconstruction studies within different scientific fields (e.g. ecology, forensics, archaeology, and dietary physiology).
Diffusion archeology for diffusion progression history reconstruction.
Sefer, Emre; Kingsford, Carl
2016-11-01
Diffusion through graphs can be used to model many real-world processes, such as the spread of diseases, social network memes, computer viruses, or water contaminants. Often, a real-world diffusion cannot be directly observed while it is occurring - perhaps it is not noticed until some time has passed, continuous monitoring is too costly, or privacy concerns limit data access. This leads to the need to reconstruct how the present state of the diffusion came to be from partial diffusion data. Here, we tackle the problem of reconstructing a diffusion history from one or more snapshots of the diffusion state. This ability can be invaluable to learn when certain computer nodes are infected or which people are the initial disease spreaders to control future diffusions. We formulate this problem over discrete-time SEIRS-type diffusion models in terms of maximum likelihood. We design methods that are based on submodularity and a novel prize-collecting dominating-set vertex cover (PCDSVC) relaxation that can identify likely diffusion steps with some provable performance guarantees. Our methods are the first to be able to reconstruct complete diffusion histories accurately in real and simulated situations. As a special case, they can also identify the initial spreaders better than the existing methods for that problem. Our results for both meme and contaminant diffusion show that the partial diffusion data problem can be overcome with proper modeling and methods, and that hidden temporal characteristics of diffusion can be predicted from limited data.
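The core inference task in this record, recovering the likely origin of a diffusion from a later snapshot, can be sketched with a deterministic toy version: simulate the spread from every candidate seed and score each by agreement with the observed snapshot. This is a hypothetical maximum-agreement stand-in for the paper's maximum-likelihood SEIRS formulation; the graph, the deterministic SI dynamics, and the scoring rule are all illustrative assumptions.

```python
def spread(graph, seed, steps):
    """Deterministic SI diffusion: at each step every infected node
    infects all of its neighbours."""
    infected = {seed}
    for _ in range(steps):
        infected |= {v for u in infected for v in graph[u]}
    return infected

def best_seed(graph, snapshot, steps):
    """Score every candidate seed by how well its simulated spread
    matches the observed snapshot (overlap minus disagreement)."""
    def score(seed):
        sim = spread(graph, seed, steps)
        return len(sim & snapshot) - len(sim ^ snapshot)
    return max(graph, key=score)

# Path graph 0-1-2-3-4; true seed is node 2, observed after one step.
graph = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
snapshot = spread(graph, 2, 1)          # {1, 2, 3}
print(best_seed(graph, snapshot, 1))
```

The paper's methods replace this brute-force scan with submodular optimization and the PCDSVC relaxation so that full multi-step histories, not just the initial spreaders, can be reconstructed at scale.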
Cao, Liji; Peter, Jörg
2010-05-01
Following the assembly of a triple-modality SPECT-CT-OT small-animal imaging system that provides intrinsically co-registered projection data for all three submodalities, and assuming dual-labeled probes consisting of both fluorophores and radionuclides, a novel multi-modal reconstruction strategy aimed at improving fluorescence-mediated tomography (FMT) is presented in this paper. The following reconstruction procedure is proposed: first, standard x-ray CT image reconstruction is performed using the FDK algorithm. Second, standard SPECT image reconstruction is performed using OSEM. Third, the surface boundary of the imaged object is extracted from the reconstructed CT volume data for finite element definition. Finally, the reconstructed SPECT data are used as a priori information within a Bayesian reconstruction framework for optical (FMT) reconstruction. We provide results of this multi-modal approach using phantom experimental data and illustrate that this strategy suppresses artifacts and facilitates quantitative analysis for optical imaging studies.
A novel reconstruction algorithm for bioluminescent tomography based on Bayesian compressive sensing
Wang, Yaqi; Feng, Jinchao; Jia, Kebin; Sun, Zhonghua; Wei, Huijun
2016-03-01
Bioluminescence tomography (BLT) is becoming a promising tool because it can resolve the biodistribution of bioluminescent reporters, associated with cellular and subcellular function, through several millimeters to centimeters of tissue in vivo. However, BLT reconstruction is an ill-posed problem. By incorporating sparse a priori information about the bioluminescent source, enhanced image quality is obtained with sparsity-based reconstruction algorithms, which therefore have great potential. Here, we propose a novel reconstruction method based on Bayesian compressive sensing and investigate its feasibility and effectiveness with a heterogeneous phantom. The results demonstrate the potential and merits of the proposed algorithm.
Sparse reconstruction using distribution agnostic bayesian matching pursuit
Masood, Mudassir
2013-11-01
A fast matching pursuit method using a Bayesian approach is introduced for sparse signal recovery. This method performs Bayesian estimates of sparse signals even when the signal prior is non-Gaussian or unknown. It is agnostic on signal statistics and utilizes a priori statistics of additive noise and the sparsity rate of the signal, which are shown to be easily estimated from data if not available. The method utilizes a greedy approach and order-recursive updates of its metrics to find the most dominant sparse supports to determine the approximate minimum mean-square error (MMSE) estimate of the sparse signal. Simulation results demonstrate the power and robustness of our proposed estimator. © 2013 IEEE.
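The greedy support search described in this record can be illustrated with plain matching pursuit: repeatedly pick the dictionary column most correlated with the residual and strip its contribution. This is a simplified, hypothetical stand-in for the Bayesian matching pursuit of the paper, which would additionally weight each candidate support by its posterior probability to form an approximate MMSE estimate; the dictionary and signal here are toy values.

```python
def matching_pursuit(D, y, n_iter=10):
    """Greedy sparse recovery with unit-norm dictionary columns:
    at each step select the column most correlated with the residual,
    record its coefficient, and subtract its contribution."""
    n_rows, n_atoms = len(D), len(D[0])
    coef = [0.0] * n_atoms
    r = y[:]                                        # residual
    for _ in range(n_iter):
        corr = [sum(D[i][j] * r[i] for i in range(n_rows))
                for j in range(n_atoms)]
        j = max(range(n_atoms), key=lambda k: abs(corr[k]))
        coef[j] += corr[j]
        r = [r[i] - corr[j] * D[i][j] for i in range(n_rows)]
    return coef

# Orthonormal 3-atom dictionary; y = 2*col0 - 1*col2 (a 2-sparse signal).
D = [[1.0, 0.0, 0.0],
     [0.0, 1.0, 0.0],
     [0.0, 0.0, 1.0]]
y = [2.0, 0.0, -1.0]
print(matching_pursuit(D, y))
```

With an orthonormal dictionary the exact sparse coefficients are recovered in two steps; the structured sensing matrices exploited in the paper make the correlation computations fast in realistic, non-orthogonal settings.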
Reconstruction of the insulin secretion rate by Bayesian deconvolution
DEFF Research Database (Denmark)
Andersen, Kim Emil; Højbjerre, Malene
of the insulin secretion rate (ISR) can be done by solving a highly ill-posed deconvolution problem. We present a Bayesian methodology for the estimation of scaled densities of phase-type distributions via Markov chain Monte Carlo techniques, whereby closed-form evaluation of the ISR is possible. We demonstrate the methodology on simulated data, concluding that the method seems a promising alternative to existing methods where the ISR is considered piecewise constant.
Helical mode lung 4D-CT reconstruction using Bayesian model.
He, Tiancheng; Xue, Zhong; Nitsch, Paige L; Teh, Bin S; Wong, Stephen T
2013-01-01
4D computed tomography (CT) has been widely used for treatment planning of thoracic and abdominal cancer radiotherapy. Current 4D-CT lung image reconstruction methods rely on respiratory gating to rearrange the large number of axial images into different phases, which may be subject to external surrogate errors due to poor reproducibility of breathing cycles. Newer image-matching-based reconstruction works better for the cine mode of 4D-CT acquisition than for the helical mode, because in helical mode the table position of each axial image is different and image matching may suffer from larger errors. In helical mode, not only the phases but also the non-uniform table positions of the images need to be considered. We propose a Bayesian method for automated 4D-CT lung image reconstruction in helical-mode 4D scans. Each axial image is assigned to a respiratory phase within a Bayesian framework that ensures spatial and temporal smoothness of the surfaces of anatomical structures. Iterative optimization is used to reconstruct a series of 3D-CT images for subjects undergoing 4D scans. In experiments, we compared, visually and quantitatively, the results of the proposed Bayesian 4D-CT reconstruction algorithm with the respiratory-surrogate and image-matching-based methods. The results showed that the proposed algorithm yields better 4D-CT for helical scans.
Bayesian reconstruction of gravitational wave bursts using chirplets
Millhouse, Margaret; Cornish, Neil; Littenberg, Tyson
2017-01-01
The BayesWave algorithm has been shown to accurately reconstruct unmodeled short duration gravitational wave bursts and to distinguish between astrophysical signals and transient noise events. BayesWave does this by using a variable number of sine-Gaussian (Morlet) wavelets to reconstruct data in multiple interferometers. While the Morlet wavelets can be summed together to produce any possible waveform, there could be other wavelet functions that improve the performance. Because we expect most astrophysical gravitational wave signals to evolve in frequency, modified Morlet wavelets with linear frequency evolution - called chirplets - may better reconstruct signals with fewer wavelets. We compare the performance of BayesWave using Morlet wavelets and chirplets on a variety of simulated signals.
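The chirplet generalization described in this record is a sine-Gaussian (Morlet) wavelet whose phase is quadratic in time, so its instantaneous frequency drifts linearly. A minimal sketch under illustrative assumptions (this parameterization, with quality factor q setting the envelope width and d the frequency drift, is a plausible form, not necessarily BayesWave's exact convention):

```python
import math

def chirplet(t, t0=0.0, f0=5.0, q=8.0, d=0.0, amp=1.0, phi=0.0):
    """Sine-Gaussian wavelet with an optional linear frequency drift d;
    d = 0 recovers a standard Morlet wavelet."""
    tau = q / (2.0 * math.pi * f0)           # Gaussian envelope width
    env = amp * math.exp(-((t - t0) / tau) ** 2)
    # quadratic phase => instantaneous frequency grows linearly when d > 0
    phase = 2.0 * math.pi * (f0 + d * (t - t0)) * (t - t0) + phi
    return env * math.cos(phase)

# Sample a chirping wavelet on [-1, 1] s; the peak sits at t0 with amp 1.
samples = [chirplet(t / 100.0, d=20.0) for t in range(-100, 101)]
print(round(max(samples), 3))
```

In a reconstruction, a small number of such basis functions, with their amplitudes, times, frequencies, and drifts, are summed and sampled over to fit the data in each interferometer.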
Institute of Scientific and Technical Information of China (English)
Gui-xia Liu; Wei Feng; Han Wang; Lei Liu; Chun-guang Zhou
2009-01-01
In the post-genomic era, the reconstruction of gene regulatory networks from microarray gene expression data is very important for understanding the underlying biological system, and it has been a challenging task in bioinformatics. The Bayesian network model has been used for reconstructing gene regulatory networks because of its advantages, but how to determine the network structure and parameters still needs to be explored. This paper proposes a two-stage structure learning algorithm which integrates an immune evolution algorithm to build a Bayesian network. The new algorithm is evaluated using both simulated and yeast cell cycle data. The experimental results indicate that the proposed algorithm can find many of the known regulatory relationships from the literature and predict unknown ones with high validity and accuracy.
Hierarchical Bayesian Model for Simultaneous EEG Source and Forward Model Reconstruction (SOFOMORE)
DEFF Research Database (Denmark)
Stahlhut, Carsten; Mørup, Morten; Winther, Ole;
2009-01-01
In this paper we propose an approach to handle forward model uncertainty for EEG source reconstruction. A stochastic forward model is motivated by the many uncertain contributions that form the forward propagation model, including the tissue conductivity distribution, the cortical surface, and electrode positions. We first present a hierarchical Bayesian framework for EEG source localization that jointly performs source and forward model reconstruction (SOFOMORE). Secondly, we evaluate the SOFOMORE model by comparison with source reconstruction methods that use fixed forward models. Simulated and real EEG data demonstrate that invoking a stochastic forward model leads to improved source estimates.
Singh, Gurmeet; Raj, Ashish; Kressler, Bryan; Nguyen, Thanh D.; Spincemaille, Pascal; Zabih, Ramin; Wang, Yi
2010-01-01
Among recent parallel MR imaging reconstruction advances, a Bayesian method called Edge-preserving Parallel Imaging with GRAph cut Minimization (EPIGRAM) has been demonstrated to significantly improve signal to noise ratio (SNR) compared to conventional regularized sensitivity encoding (SENSE) method. However, EPIGRAM requires a large number of iterations in proportion to the number of intensity labels in the image, making it computationally expensive for high dynamic range images. The objective of this study is to develop a Fast EPIGRAM reconstruction based on the efficient binary jump move algorithm that provides a logarithmic reduction in reconstruction time while maintaining image quality. Preliminary in vivo validation of the proposed algorithm is presented for 2D cardiac cine MR imaging and 3D coronary MR angiography at acceleration factors of 2-4. Fast EPIGRAM was found to provide similar image quality to EPIGRAM and maintain the previously reported SNR improvement over regularized SENSE, while reducing EPIGRAM reconstruction time by 25-50 times. PMID:20939095
Directory of Open Access Journals (Sweden)
J. P. Werner
2015-03-01
Reconstructions of the late-Holocene climate rely heavily upon proxies that are assumed to be accurately dated by layer counting, such as measurements of tree rings, ice cores, and varved lake sediments. Considerable advances could be achieved if time-uncertain proxies were able to be included within these multiproxy reconstructions, and if time uncertainties were recognized and correctly modeled for proxies commonly treated as free of age model errors. Current approaches for accounting for time uncertainty are generally limited to repeating the reconstruction using each one of an ensemble of age models, thereby inflating the final estimated uncertainty – in effect, each possible age model is given equal weighting. Uncertainties can be reduced by exploiting the inferred space–time covariance structure of the climate to re-weight the possible age models. Here, we demonstrate how Bayesian hierarchical climate reconstruction models can be augmented to account for time-uncertain proxies. Critically, although a priori all age models are given equal probability of being correct, the probabilities associated with the age models are formally updated within the Bayesian framework, thereby reducing uncertainties. Numerical experiments show that updating the age model probabilities decreases uncertainty in the resulting reconstructions, as compared with the current de facto standard of sampling over all age models, provided there is sufficient information from other data sources in the spatial region of the time-uncertain proxy. This approach can readily be generalized to non-layer-counted proxies, such as those derived from marine sediments.
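The age-model re-weighting at the heart of this record is a standard Bayesian update: start from equal prior weights over the candidate age models and multiply each by the likelihood of the independent, well-dated data under that model. A minimal sketch with illustrative numbers (the Gaussian likelihood, the reference series, and the candidate predictions are all toy assumptions, not the paper's data):

```python
import math

def update_age_model_weights(predictions, reference, sigma=1.0):
    """Equal prior weights over candidate age models, each re-weighted
    by its Gaussian log-likelihood against a well-dated reference
    series, then normalized (log-sum-exp for numerical stability)."""
    logliks = [sum(-0.5 * ((p - r) / sigma) ** 2
                   for p, r in zip(pred, reference))
               for pred in predictions]
    m = max(logliks)
    w = [math.exp(ll - m) for ll in logliks]
    total = sum(w)
    return [v / total for v in w]

# Three candidate age models for one proxy; model 1 aligns best
# with the reference, so its posterior weight should dominate.
reference   = [0.0, 1.0, 2.0, 1.0]
predictions = [[2.0, 0.0, 1.0, 2.0],    # time-shifted
               [0.1, 1.1, 1.9, 1.0],    # well aligned
               [1.0, 2.0, 0.0, 0.0]]    # scrambled
weights = update_age_model_weights(predictions, reference)
print([round(w, 3) for w in weights])
```

In the full hierarchical model the "reference" role is played by the inferred space-time climate field rather than a single fixed series, but the update has the same multiplicative structure.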
Directory of Open Access Journals (Sweden)
J. P. Werner
2014-12-01
Reconstructions of late-Holocene climate rely heavily upon proxies that are assumed to be accurately dated by layer counting, such as measurements on tree rings, ice cores, and varved lake sediments. Considerable advances may be achievable if time-uncertain proxies could be included within these multiproxy reconstructions, and if time uncertainties were recognized and correctly modeled for proxies commonly treated as free of age model errors. Current approaches to accounting for time uncertainty are generally limited to repeating the reconstruction using each of an ensemble of age models, thereby inflating the final estimated uncertainty – in effect, each possible age model is given equal weighting. Uncertainties can be reduced by exploiting the inferred space–time covariance structure of the climate to re-weight the possible age models. Here we demonstrate how Bayesian hierarchical climate reconstruction models can be augmented to account for time-uncertain proxies. Critically, while a priori all age models are given equal probability of being correct, the probabilities associated with the age models are formally updated within the Bayesian framework, thereby reducing uncertainties. Numerical experiments show that updating the age-model probabilities decreases uncertainty in the climate reconstruction, as compared with the current de facto standard of sampling over all age models, provided there is sufficient information from other data sources in the region of the time-uncertain proxy. This approach can readily be generalized to non-layer-counted proxies, such as those derived from marine sediments.
A Bayesian hierarchical nonhomogeneous hidden Markov model for multisite streamflow reconstructions
Bracken, C.; Rajagopalan, B.; Woodhouse, C.
2016-10-01
In many complex water supply systems, the next generation of water resources planning models will require simultaneous probabilistic streamflow inputs at multiple locations on an interconnected network. To make use of the valuable multicentury records provided by tree-ring data, reconstruction models must be able to produce appropriate multisite inputs. Existing streamflow reconstruction models typically focus on one site at a time, not addressing intersite dependencies and potentially misrepresenting uncertainty. To this end, we develop a model for multisite streamflow reconstruction with the ability to capture intersite correlations. The proposed model is a hierarchical Bayesian nonhomogeneous hidden Markov model (NHMM). An NHMM is fit to contemporary streamflow at each location using lognormal component distributions. Leading principal components of tree rings are used as covariates to model nonstationary transition probabilities and the parameters of the lognormal component distributions. Spatial dependence between sites is captured with a Gaussian elliptical copula. Parameters of the model are estimated in a fully Bayesian framework, so that marginal posterior distributions of all the parameters are obtained. The model is applied to reconstruct flows at 20 sites in the Upper Colorado River Basin (UCRB) from 1473 to 1906. Many previous reconstructions are available for this basin, making it ideal for testing this new method. The results show some improvements over regression-based methods in terms of validation statistics. Key advantages of the Bayesian NHMM over traditional approaches are a dynamic representation of uncertainty and the ability to make long multisite simulations that capture at-site statistics and spatial correlations between sites.
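The copula coupling in the model above can be illustrated with a minimal sketch (the number of sites, correlation matrix, and lognormal parameters are invented, not the fitted model): correlated standard normals are drawn via a Cholesky factor and pushed through lognormal marginals, giving flows that are positive at each site yet correlated across sites:

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative copula correlation between three sites (invented values).
corr = np.array([[1.0, 0.8, 0.6],
                 [0.8, 1.0, 0.7],
                 [0.6, 0.7, 1.0]])
mu = np.array([5.0, 4.5, 4.0])          # log-space mean of each site's flow
sigma = np.array([0.3, 0.4, 0.5])       # log-space std dev of each site

# Gaussian copula: correlate standard normals, then map each column
# through that site's lognormal marginal.
L = np.linalg.cholesky(corr)
z = rng.normal(size=(10000, 3)) @ L.T   # correlated standard normals
flows = np.exp(mu + sigma * z)          # lognormal marginals per site
```

The same construction generalizes to any marginal with an invertible CDF by mapping z through the normal CDF and then the target quantile function.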
Shan, Chung-Lin
2014-01-01
In this paper, we extend our earlier work on the reconstruction of the (time-averaged) one-dimensional velocity distribution of Galactic Weakly Interacting Massive Particles (WIMPs) and introduce a Bayesian fitting procedure to the theoretically predicted velocity distribution functions. In this reconstruction process, the (rough) velocity distribution reconstructed directly from raw data of direct Dark Matter detection experiments, i.e. measured recoil energies, with one or more different target materials, is used as "reconstructed-input" information. By assuming a fitting velocity distribution function and scanning the parameter space based on Bayesian analysis, the astronomical characteristic parameters, e.g. the Solar and Earth's orbital velocities, are pinned down as the output results. Our Monte Carlo simulations show that this Bayesian scanning procedure can reconstruct the true (input) WIMP velocity distribution function quite precisely, with negligible systematic deviations ...
A Bayesian Hierarchical Model for Reconstructing Sea Levels: From Raw Data to Rates of Change
Cahill, Niamh; Horton, Benjamin P; Parnell, Andrew C
2015-01-01
We present a holistic Bayesian hierarchical model for reconstructing the continuous and dynamic evolution of relative sea-level (RSL) change with fully quantified uncertainty. The reconstruction is produced from biological (foraminifera) and geochemical (δ13C) sea-level indicators preserved in dated cores of salt-marsh sediment. Our model comprises three modules: (1) a Bayesian transfer function for the calibration of foraminifera into tidal elevation, which is flexible enough to formally accommodate additional proxies (in this case bulk-sediment δ13C values); (2) a chronology developed from an existing Bchron age-depth model; and (3) an existing errors-in-variables integrated Gaussian process (EIV-IGP) model for estimating rates of sea-level change. We illustrate our approach using a case study of Common Era sea-level variability from New Jersey, U.S.A. We develop a new Bayesian transfer function (B-TF), with and without the δ13C proxy, and compare our results to those from a widely...
Bayesian Analysis of Inflation III: Slow Roll Reconstruction Using Model Selection
Noreña, Jorge; Verde, Licia; Peiris, Hiranya V; Easther, Richard
2012-01-01
We implement Slow Roll Reconstruction -- an optimal solution to the inverse problem for inflationary cosmology -- within ModeCode, a publicly available solver for the inflationary dynamics. We obtain up-to-date constraints on the reconstructed inflationary potential, derived from the WMAP 7-year dataset and South Pole Telescope observations, combined with large-scale structure data derived from SDSS Data Release 7. Using ModeCode in conjunction with the MultiNest sampler, we compute Bayesian evidence for the reconstructed potential at each order in the truncated slow roll hierarchy. We find that the data are well-described by the first two slow roll parameters, ε and η, and that there is no need to include a nontrivial ξ parameter.
BUMPER v1.0: a Bayesian user-friendly model for palaeo-environmental reconstruction
Holden, Philip B.; Birks, H. John B.; Brooks, Stephen J.; Bush, Mark B.; Hwang, Grace M.; Matthews-Bird, Frazer; Valencia, Bryan G.; van Woesik, Robert
2017-02-01
We describe the Bayesian user-friendly model for palaeo-environmental reconstruction (BUMPER), a Bayesian transfer function for inferring past climate and other environmental variables from microfossil assemblages. BUMPER is fully self-calibrating, straightforward to apply, and computationally fast, requiring ~2 s to build a 100-taxon model from a 100-site training set on a standard personal computer. We apply the model's probabilistic framework to generate thousands of artificial training sets under ideal assumptions. We then use these to demonstrate the sensitivity of reconstructions to the characteristics of the training set, considering assemblage richness, taxon tolerances, and the number of training sites. We find that a useful guideline for the size of a training set is to provide, on average, at least 10 samples of each taxon. We demonstrate general applicability to real data, considering three different organism types (chironomids, diatoms, pollen) and different reconstructed variables. An identically configured model is used in each application, the only change being the input files that provide the training-set environment and taxon-count data. The performance of BUMPER is shown to be comparable with weighted average partial least squares (WAPLS) in each case. Additional artificial datasets are constructed with similar characteristics to the real data, and these are used to explore the reasons for the differing performances of the different training sets.
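The underlying transfer-function idea can be sketched as follows (a simplified stand-in, not BUMPER itself; the taxon optima, tolerances, and counts are invented). Each taxon is given a Gaussian response to the environmental variable, and a flat-prior posterior over that variable is evaluated on a grid from assemblage counts via a multinomial likelihood:

```python
import numpy as np

grid = np.linspace(0.0, 20.0, 401)        # candidate environmental values
optima = np.array([4.0, 9.0, 15.0])       # invented taxon optima
tol = np.array([2.0, 3.0, 2.5])           # invented taxon tolerances

def taxon_probs(x):
    """Expected assemblage composition at environment value x."""
    resp = np.exp(-0.5 * ((x - optima) / tol) ** 2)
    return resp / resp.sum()

counts = np.array([12, 70, 18])           # observed microfossil counts

# Multinomial log-likelihood on the grid; with a flat prior the posterior
# is the normalized likelihood.
loglik = np.array([np.sum(counts * np.log(taxon_probs(x))) for x in grid])
post = np.exp(loglik - loglik.max())
post /= post.sum()
estimate = grid[np.argmax(post)]          # MAP environmental reconstruction
```

Because the assemblage is dominated by the taxon whose optimum is 9, the MAP estimate lands near that value; the full posterior also provides the reconstruction uncertainty directly rather than as an afterthought.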
Dries, M.; Trager, S. C.; Koopmans, L. V. E.
2016-11-01
Recent studies based on the integrated light of distant galaxies suggest that the initial mass function (IMF) might not be universal. Variations of the IMF with galaxy type and/or formation time may have important consequences for our understanding of galaxy evolution. We have developed a new stellar population synthesis (SPS) code specifically designed to reconstruct the IMF. We implement a novel approach combining regularization with hierarchical Bayesian inference. Within this approach, we use a parametrized IMF prior to regulate a direct inference of the IMF. This direct inference gives more freedom to the IMF and allows the model to deviate from parametrized models when demanded by the data. We use Markov chain Monte Carlo sampling techniques to reconstruct the best parameters for the IMF prior, the age and the metallicity of a single stellar population. We present our code and apply our model to a number of mock single stellar populations with different ages, metallicities and IMFs. When systematic uncertainties are not significant, we are able to reconstruct the input parameters that were used to create the mock populations. Our results show that if systematic uncertainties do play a role, this may introduce a bias on the results. Therefore, it is important to objectively compare different ingredients of SPS models. Through its Bayesian framework, our model is well suited for this.
Reconstructing the evolutionary history of natural languages
Energy Technology Data Exchange (ETDEWEB)
Warnow, T.; Ringe, D.; Taylor, A. [Univ. of Pennsylvania, Philadelphia, PA (United States)
1996-12-31
In this paper we present a new methodology for determining the evolutionary history of related languages. Our methodology uses linguistic information encoded as qualitative characters, and provides much greater precision than previous methods. Our analysis of Indo-European (IE) languages resolves questions that have troubled scholars for over a century.
Accurate reconstruction of insertion-deletion histories by statistical phylogenetics.
Directory of Open Access Journals (Sweden)
Oscar Westesson
Full Text Available The Multiple Sequence Alignment (MSA is a computational abstraction that represents a partial summary either of indel history, or of structural similarity. Taking the former view (indel history, it is possible to use formal automata theory to generalize the phylogenetic likelihood framework for finite substitution models (Dayhoff's probability matrices and Felsenstein's pruning algorithm to arbitrary-length sequences. In this paper, we report results of a simulation-based benchmark of several methods for reconstruction of indel history. The methods tested include a relatively new algorithm for statistical marginalization of MSAs that sums over a stochastically-sampled ensemble of the most probable evolutionary histories. For mammalian evolutionary parameters on several different trees, the single most likely history sampled by our algorithm appears less biased than histories reconstructed by other MSA methods. The algorithm can also be used for alignment-free inference, where the MSA is explicitly summed out of the analysis. As an illustration of our method, we discuss reconstruction of the evolutionary histories of human protein-coding genes.
Multi-view TWRI scene reconstruction using a joint Bayesian sparse approximation model
Tang, V. H.; Bouzerdoum, A.; Phung, S. L.; Tivive, F. H. C.
2015-05-01
This paper addresses the problem of scene reconstruction in conjunction with wall-clutter mitigation for compressed multi-view through-the-wall radar imaging (TWRI). We consider the problem where the scene behind the wall is illuminated from different vantage points using a different set of frequencies at each antenna. First, a joint Bayesian sparse recovery model is employed to estimate the antenna signal coefficients simultaneously, by exploiting the sparsity and inter-signal correlations among antenna signals. Then, a subspace-projection technique is applied to suppress the signal coefficients related to the wall returns. Furthermore, a multi-task linear model is developed to relate the target coefficients to the image of the scene. The composite image is reconstructed using a joint Bayesian sparse framework, taking into account the inter-view dependencies. Experimental results are presented which demonstrate the effectiveness of the proposed approach for multi-view imaging of indoor scenes using a reduced set of measurements at each view.
Contaminant source reconstruction by empirical Bayes and Akaike's Bayesian Information Criterion.
Zanini, Andrea; Woodbury, Allan D
2016-01-01
The objective of the paper is to present an empirical Bayesian method combined with Akaike's Bayesian Information Criterion (ABIC) to estimate the contaminant release history of a source in groundwater, starting from few concentration measurements in space and/or in time. From the Bayesian point of view, the ABIC considers prior information on the unknown function, such as the prior distribution (assumed Gaussian) and the covariance function. The unknown statistical quantities, such as the noise variance and the covariance function parameters, are computed through the process; moreover, the method also quantifies the estimation error through confidence intervals. The methodology was successfully tested on three test cases: the classic Skaggs and Kabala release function, three sharp releases (both cases concern transport in a one-dimensional homogeneous medium), and data collected from laboratory equipment that consists of a two-dimensional homogeneous unconfined aquifer. The performance of the method was tested with two different covariance functions (Gaussian and exponential) and also with large measurement error. The obtained results were discussed and compared to the geostatistical approach of Kitanidis (1995).
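The linear-Gaussian core of such a geostatistical inversion can be sketched as follows (the forward model, covariance choices, and all numbers are invented for illustration and are not the paper's setup): with data d = H s + noise and a Gaussian prior covariance Q on the release history s, the posterior mean has a closed form:

```python
import numpy as np

rng = np.random.default_rng(3)

n = 60                                           # release-history time steps
t = np.arange(n, dtype=float)
s_true = np.exp(-0.5 * ((t - 20.0) / 5.0) ** 2)  # "true" smooth release pulse

# Invented forward model: each measurement is a smeared, delayed view
# of the source (rows of H are Gaussian kernels).
H = np.array([np.exp(-0.5 * ((t - c) / 8.0) ** 2) for c in (25, 32, 40, 48)])
d = H @ s_true + 0.01 * rng.normal(size=H.shape[0])

# Exponential prior covariance with correlation length 5 (assumed).
Q = np.exp(-np.abs(t[:, None] - t[None, :]) / 5.0)
R = 0.01 ** 2 * np.eye(H.shape[0])               # measurement-noise covariance

# Posterior mean under linear-Gaussian conjugacy:
#   s_hat = Q H^T (H Q H^T + R)^{-1} d
s_hat = Q @ H.T @ np.linalg.solve(H @ Q @ H.T + R, d)
```

The same algebra yields a posterior covariance, from which confidence intervals follow; the empirical-Bayes step (and ABIC) concerns choosing the noise variance and covariance parameters themselves from the data rather than fixing them as above.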
Lander, Tonya A; Klein, Etienne K; Oddou-Muratorio, Sylvie; Candau, Jean-Noël; Gidoin, Cindy; Chalon, Alain; Roig, Anne; Fallour, Delphine; Auger-Rozenberg, Marie-Anne; Boivin, Thomas
2014-12-01
Understanding how invasive species establish and spread is vital for developing effective management strategies for invaded areas and identifying new areas where the risk of invasion is highest. We investigated the explanatory power of dispersal histories reconstructed based on local-scale wind data and a regional-scale wind-dispersed particle trajectory model for the invasive seed chalcid wasp Megastigmus schimitscheki (Hymenoptera: Torymidae) in France. The explanatory power was tested by: (1) survival analysis of empirical data on M. schimitscheki presence, absence and year of arrival at 52 stands of the wasp's obligate hosts, Cedrus (true cedar trees); and (2) approximate Bayesian analysis of M. schimitscheki genetic data using a coalescence model. The Bayesian demographic modeling and traditional population genetic analysis suggested that initial invasion across the range was the result of long-distance dispersal from the longest established sites. The survival analyses of the windborne expansion patterns derived from a particle dispersal model indicated that there was an informative correlation between the M. schimitscheki presence/absence data from the annual surveys and the scenarios based on regional-scale wind data. These three very different analyses produced highly congruent results supporting our proposal that wind is the most probable vector for passive long-distance dispersal of this invasive seed wasp. This result confirms that long-distance dispersal from introduction areas is a likely driver of secondary expansion of alien invasive species. Based on our results, management programs for this and other windborne invasive species may consider (1) focusing effort at the longest established sites and (2) continuing to monitor outlying populations, which remain critically important due to their influence on rates of spread. We also suggest that there is a distinct need for new analysis methods that have the capacity to combine empirical spatiotemporal field data
Bayesian Methods for Reconstructing Sunspot Numbers Before and During the Maunder Minimum
Travaglini, Guido
2017-01-01
The Maunder Minimum (MM) was an extended period of reduced solar activity in terms of yearly sunspot numbers (SSN) during 1610 - 1715. The reality of this "grand minimum" is generally accepted in the scientific community, but the statistics of the SSN record suggest a need for data reconstruction. The MM data show a nonstandard distribution compared with the entire SSN signal (1610 - 2014). The pattern does not satisfy the weakly stationary solar dynamo approximation, which characterizes many natural events spanning centuries or even millennia, including the Sun and the stars. Over the entire observation period (1610 - 2014), the reported SSN exhibits statistically significant regime switches, departures from autoregressive stationarity, and growing trends. Reconstruction of the SSN during the pre-MM and MM periods is performed using five novel statistical procedures in support of signal analysis. A Bayesian-Monte Carlo backcast technique is found to be most reliable and produces an SSN signal that meets the weak-stationarity requirement. The computed MM signal for this reconstruction does not show a "grand" minimum or even a "semi-grand" minimum.
A Brief History of Anterior Cruciate Ligament Reconstruction
Directory of Open Access Journals (Sweden)
Nikolaos Davarinos
2014-01-01
Full Text Available Reconstructions of the anterior cruciate ligament (ACL are among the most frequently performed procedures in knee surgery nowadays. The history of ACL surgery can be traced as far back as the Egyptian times. The early years reflect the efforts to establish a viable, consistently successful reconstruction technique while, during the early 20th century, we witness an increasing awareness of, and interest in, the ligament and its lesions. Finally, we highlight the most important steps in the evolution of the ACL reconstruction surgery by discussing the various techniques spanning the years using not only autologous grafts (fascia lata, meniscal, hamstring, patellar tendon, bone-patellar tendon-bone, and double bundle grafts but also synthetic ones and allografts.
Energy Technology Data Exchange (ETDEWEB)
Wang, Li; Gac, Nicolas; Mohammad-Djafari, Ali [Laboratoire des Signaux et Systèmes 3, Rue Joliot-Curie 91192 Gif sur Yvette (France)
2015-01-13
In order to improve the quality of 3D X-ray tomography reconstruction for Non Destructive Testing (NDT), we investigate in this paper hierarchical Bayesian methods. In NDT, useful prior information on the volume, like the limited number of materials or the presence of homogeneous areas, can be included in the iterative reconstruction algorithms. In hierarchical Bayesian methods, not only the volume is estimated thanks to the prior model of the volume but also the hyperparameters of this prior. This additional complexity in the reconstruction methods, when applied to large volumes (from 512³ to 8192³ voxels), results in an increasing computational cost. To reduce it, the hierarchical Bayesian methods investigated in this paper lead to an algorithm acceleration by Variational Bayesian Approximation (VBA) [1] and hardware acceleration thanks to projection and back-projection operators parallelized on many-core processors like GPUs [2]. In this paper, we consider a Student-t prior on the gradient of the image, implemented in a hierarchical way [3, 4, 1]. The operators H (forward or projection) and Hᵗ (adjoint or back-projection) implemented on multi-GPU [2] have been used in this study. Different methods are evaluated on the synthetic 'Shepp and Logan' volume in terms of quality and time of reconstruction. We used several simple regularizations of order 1 and order 2. Other prior models also exist [5]. Sometimes, for a discrete image, we can do the segmentation and reconstruction at the same time; then the reconstruction can be done with fewer projections.
Reconstructing medical history: historiographical features, approaches and challenges.
Conti, A A
2011-01-01
Medical historiography deals with the concepts, theories, and approaches adopted in the reconstruction and discussion of the history of medicine. The expression has changed through time and according to different scholars and contexts, and it largely depends on the general standpoint from which the medicine of the past is examined. From an Evidence Based History of Medicine perspective, an accurate and complete examination of all available sources must be carried out to draw a picture of the medical theme examined, and, to reach this aim, the issue of the reliability of sources is a preliminary point to take into account. Different historiographical models adopted in the twentieth century will be discussed in this paper. The current ample discussion on the characterising features, methods and challenges of medical historiography documents the wide extent of the debate on the ways available today for the reconstruction of medical history. It also testifies to the relevance, inter-disciplinarity and remarkable vitality of the topic in current academic, scientific and social contexts. Medical and health history is an essential part of current medicine, and the study of the development of medicine through time is an extremely formative experience, which should not be confined to historians and professionals, but which, in appropriate formats and in correct methodological terms, should have full right of citizenship in current health care initiatives.
Directory of Open Access Journals (Sweden)
Chen Yidong
2011-10-01
Full Text Available Abstract Background Transcriptional regulation by transcription factors (TFs) controls the time and abundance of mRNA transcription. Due to the limitations of current proteomics technologies, large-scale measurement of the protein-level activities of TFs is usually infeasible, making computational reconstruction of transcriptional regulatory networks a difficult task. Results We propose here a novel Bayesian non-negative factor model for TF-mediated regulatory networks. In particular, the non-negative TF activities and the sample clustering effect are modeled as factors from a Dirichlet process mixture of rectified Gaussian distributions, and the sparse regulatory coefficients are modeled as loadings from a sparse distribution whose sparsity is constrained using knowledge from databases; meanwhile, a Gibbs sampling solution was developed to infer the underlying network structure and the unknown TF activities simultaneously. The developed approach has been applied to a simulated system and to breast cancer gene expression data. Results show that the proposed method was able to systematically uncover the TF-mediated transcriptional regulatory network structure, the regulatory coefficients, the TF protein-level activities, and the sample clustering effect. The regulation target predictions are highly consistent with prior knowledge, and the sample clustering results show superior performance over a previous molecular-based clustering method. Conclusions The results demonstrate the validity and effectiveness of the proposed approach in reconstructing transcriptional networks mediated by TFs through simulated systems and real data.
Bayesian network reconstruction using systems genetics data: comparison of MCMC methods.
Tasaki, Shinya; Sauerwine, Ben; Hoff, Bruce; Toyoshiba, Hiroyoshi; Gaiteri, Chris; Chaibub Neto, Elias
2015-04-01
Reconstructing biological networks using high-throughput technologies has the potential to produce condition-specific interactomes. But are these reconstructed networks a reliable source of biological interactions? Do some network inference methods offer dramatically improved performance on certain types of networks? To facilitate the use of network inference methods in systems biology, we report a large-scale simulation study comparing the ability of Markov chain Monte Carlo (MCMC) samplers to reverse engineer Bayesian networks. The MCMC samplers we investigated included foundational and state-of-the-art Metropolis-Hastings and Gibbs sampling approaches, as well as novel samplers we have designed. To enable a comprehensive comparison, we simulated gene expression and genetics data from known network structures under a range of biologically plausible scenarios. We examine the overall quality of network inference via different methods, as well as how their performance is affected by network characteristics. Our simulations reveal that network size, edge density, and strength of gene-to-gene signaling are major parameters that differentiate the performance of various samplers. Specifically, more recent samplers including our novel methods outperform traditional samplers for highly interconnected large networks with strong gene-to-gene signaling. Our newly developed samplers show comparable or superior performance to the top existing methods. Moreover, this performance gain is strongest in networks with biologically oriented topology, which indicates that our novel samplers are suitable for inferring biological networks. The performance of MCMC samplers in this simulation framework can guide the choice of methods for network reconstruction using systems genetics data.
Energy Technology Data Exchange (ETDEWEB)
Irishkin, M. [CEA, IRFM, F-13108 Saint-Paul-Lez-Durance (France); Imbeaux, F., E-mail: frederic.imbeaux@cea.fr [CEA, IRFM, F-13108 Saint-Paul-Lez-Durance (France); Aniel, T.; Artaud, J.F. [CEA, IRFM, F-13108 Saint-Paul-Lez-Durance (France)
2015-11-15
Highlights:
• We developed a method for automated comparison of experimental data with models.
• A unique platform implements Bayesian analysis and integrated modelling tools.
• The method is tokamak-generic and is applied to Tore Supra and JET pulses.
• Validation of a heat transport model is carried out.
• We quantified the uncertainties due to Te profiles in current diffusion simulations.

Abstract: In the context of present and future long pulse tokamak experiments yielding a growing size of measured data per pulse, automating data consistency analysis and comparisons of measurements with models is a critical matter. To address these issues, the present work describes an expert system that carries out, in an integrated and fully automated way, (i) a reconstruction of plasma profiles from the measurements using Bayesian analysis, (ii) a prediction of the reconstructed quantities according to some models, and (iii) a comparison of the first two steps. The first application shown is devoted to the development of an automated comparison method between the experimental plasma profiles reconstructed using Bayesian methods and time dependent solutions of the transport equations. The method was applied to model validation of a simple heat transport model with three radial shape options. It has been tested on a database of 21 Tore Supra and 14 JET shots. The second application aims at quantifying uncertainties due to the electron temperature profile in current diffusion simulations. A systematic reconstruction of the Ne, Te, Ti profiles was first carried out for all time slices of the pulse. The Bayesian 95% highest probability intervals on the Te profile reconstruction were then used for (i) data consistency check of the flux consumption and (ii) defining a confidence interval for the current profile simulation. The method has been applied to one Tore Supra pulse and one JET pulse.
Bai, Ying; Lan, JieQin; Gao, WeiWei
2016-01-01
A toy detector array has been designed to simulate the detection of cosmic rays in Extended Air Shower (EAS) experiments for ground-based TeV astrophysics. The primary energies of protons from the Monte Carlo simulation have been reconstructed by a Bayesian neural network (BNN) algorithm and by a standard method similar to that of the LHAASO experiment, respectively. The result of the energy reconstruction using BNNs has been compared with the one using the standard method. Compared to the standard method, the energy resolutions are significantly improved using BNNs, and the improvement is more pronounced for high-energy protons than for low-energy ones.
Rowley, Lisa M; Bradley, Kevin M; Boardman, Philip; Hallam, Aida; McGowan, Daniel R
2016-09-29
Imaging on a gamma camera with yttrium-90 (⁹⁰Y) following selective internal radiotherapy (SIRT) may allow for verification of treatment delivery but suffers from relatively poor spatial resolution and imprecise dosimetry calculation. ⁹⁰Y Positron Emission Tomography (PET) / Computed Tomography (CT) imaging is possible on 3D, time-of-flight machines; however, images are usually poor due to low count statistics and noise. A new PET reconstruction software using a Bayesian penalized likelihood (BPL) reconstruction algorithm (termed Q.Clear), released by GE, was investigated using phantom and patient scans to optimize the reconstruction for post-SIRT imaging and to clarify whether this leads to an improvement in clinical image quality using ⁹⁰Y.
Van Nguyen, Linh; Chainais, Pierre
2015-01-01
The study of turbulent flows calls for measurements with high resolution both in space and in time. We propose a new approach to reconstruct High-Temporal-High-Spatial resolution velocity fields by combining two sources of information that are well-resolved either in space or in time, the Low-Temporal-High-Spatial (LTHS) and the High-Temporal-Low-Spatial (HTLS) resolution measurements. In the framework of co-conception between sensing and data post-processing, this work extensively investigates a Bayesian reconstruction approach using a simulated database. A Bayesian fusion model is developed to solve the inverse problem of data reconstruction. The model uses a Maximum A Posteriori estimate, which yields the most probable field knowing the measurements. A direct numerical simulation (DNS) of a wall-bounded turbulent flow at moderate Reynolds number is used to validate and assess the performance of the present approach. Low-resolution measurements are subsampled in time and space from the fully resolved data. Reconstructed velocities ar...
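With independent Gaussian error models and a flat prior, the simplest version of such a fusion reduces to a precision-weighted average, which the MAP estimate generalizes. The sketch below (signal, noise levels, and dimensions are all invented) shows that combining two noisy views of the same field beats either view alone:

```python
import numpy as np

rng = np.random.default_rng(4)

n = 2000
x = np.cumsum(rng.normal(size=n))        # stand-in for the true velocity field
x -= x.mean()

sig_s, sig_t = 1.0, 0.3                  # invented error levels of each source
y_s = x + sig_s * rng.normal(size=n)     # e.g. interpolated HTLS measurements
y_t = x + sig_t * rng.normal(size=n)     # e.g. interpolated LTHS measurements

# With independent Gaussian errors and a flat prior, the MAP estimate is
# the precision-weighted mean of the two sources.
w_s, w_t = 1.0 / sig_s ** 2, 1.0 / sig_t ** 2
x_map = (w_s * y_s + w_t * y_t) / (w_s + w_t)

mse = lambda a: float(np.mean((a - x) ** 2))
```

The full Bayesian fusion model replaces the flat prior with a physically informed one, so the MAP field is pulled toward structures consistent with both sources rather than being a pointwise average.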
Lander, Tonya A; Oddou-Muratorio, Sylvie; Prouillet-Leplat, Helene; Klein, Etienne K
2011-12-01
Range expansion and contraction has occurred in the history of most species and can seriously impact patterns of genetic diversity. Historical data about range change are rare and generally appropriate for studies at large scales, whereas the individual pollen and seed dispersal events that form the basis of geneflow and colonization generally occur at a local scale. In this study, we investigated range change in Fagus sylvatica on Mont Ventoux, France, using historical data from 1838 to the present and approximate Bayesian computation (ABC) analyses of genetic data. From the historical data, we identified a population minimum in 1845 and located remnant populations at least 200 years old. The ABC analysis selected a demographic scenario with three populations, corresponding to two remnant populations and one area of recent expansion. It also identified expansion from a smaller ancestral population but did not find that this expansion followed a population bottleneck, as suggested by the historical data. Despite a strong support to the selected scenario for our data set, the ABC approach showed a low power to discriminate among scenarios on average and a low ability to accurately estimate effective population sizes and divergence dates, probably due to the temporal scale of the study. This study provides an unusual opportunity to test ABC analysis in a system with a well-documented demographic history and identify discrepancies between the results of historical, classical population genetic and ABC analyses. The results also provide valuable insights into genetic processes at work at a fine spatial and temporal scale in range change and colonization.
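The ABC machinery used in studies like this one can be illustrated with a minimal rejection sampler (the model, summary statistic, and tolerance are invented; real analyses simulate a coalescent or demographic model instead): parameter draws from the prior are kept only when the simulated summary statistic lands close to the observed one:

```python
import numpy as np

rng = np.random.default_rng(5)

# "Observed" data from an unknown-mean process (truth = 3.0 here).
obs = rng.normal(3.0, 1.0, size=200)
s_obs = obs.mean()                           # observed summary statistic

# ABC rejection: draw from the prior, simulate, keep near-matches.
prior_draws = rng.uniform(0.0, 10.0, size=20000)
accepted = []
for theta in prior_draws:
    sim = rng.normal(theta, 1.0, size=200)   # simulate data under theta
    if abs(sim.mean() - s_obs) < 0.05:       # tolerance on the summary
        accepted.append(theta)
accepted = np.array(accepted)                # approximate posterior sample
```

The accepted draws approximate the posterior of the parameter; ABC scenario choice, as in the study above, repeats this across competing demographic scenarios and compares their relative acceptance (or trains a classifier on the simulated summaries).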
Reconstructing the population genetic history of the Caribbean.
Directory of Open Access Journals (Sweden)
Andrés Moreno-Estrada
2013-11-01
Full Text Available The Caribbean basin is home to some of the most complex interactions in recent history among previously diverged human populations. Here, we investigate the population genetic history of this region by characterizing patterns of genome-wide variation among 330 individuals from three of the Greater Antilles (Cuba, Puerto Rico, and Hispaniola), two mainland (Honduras and Colombia), and three Native South American (Yukpa, Bari, and Warao) populations. We combine these data with a unique database of genomic variation in over 3,000 individuals from diverse European, African, and Native American populations. We use local ancestry inference and tract length distributions to test different demographic scenarios for the pre- and post-colonial history of the region. We develop a novel ancestry-specific PCA (ASPCA) method to reconstruct the sub-continental origin of Native American, European, and African haplotypes from admixed genomes. We find that the most likely source of the indigenous ancestry in Caribbean islanders is a Native South American component shared among inland Amazonian tribes, Central America, and the Yucatan peninsula, suggesting extensive gene flow across the Caribbean in pre-Columbian times. We find evidence of two pulses of African migration. The first pulse--which today is reflected by shorter, older ancestry tracts--consists of a genetic component more similar to coastal West African regions involved in early stages of the trans-Atlantic slave trade. The second pulse--reflected by longer, younger tracts--is more similar to present-day West-Central African populations, supporting historical records of later transatlantic deportation. Surprisingly, we also identify a Latino-specific European component that has significantly diverged from its parental Iberian source populations, presumably as a result of small European founder population size. We demonstrate that the ancestral components in admixed genomes can be traced back to distinct sub
Bai, Y.; Xu, Y.; Pan, J.; Lan, J. Q.; Gao, W. W.
2016-07-01
A toy detector array is designed to detect a shower generated by the interaction between a TeV cosmic ray and the atmosphere. In the present paper, the primary energies of showers detected by the detector array are reconstructed with a Bayesian neural network (BNN) algorithm and with a standard method similar to that of the LHAASO experiment [1]. Compared to the standard method, the BNNs significantly improve the energy resolution, and the improvement is more pronounced for high-energy showers than for low-energy ones.
Shiraki, Yoshifumi; Kabashima, Yoshiyuki
2016-06-01
A signal model called joint sparse model 2 (JSM-2), or the multiple measurement vector problem, in which all sparse signals share their support, is important for dealing with practical signal processing problems. In this paper, we investigate the typical reconstruction performance of noisy measurement JSM-2 problems for ℓ2,1-norm regularized least-squares reconstruction and the Bayesian optimal reconstruction scheme in terms of mean square error. Employing the replica method, we show that these schemes, which exploit the knowledge of the sharing of the signal support, can recover the signals more precisely as the number of channels increases. In addition, we compare the reconstruction performance of two different ensembles of observation matrices: one is composed of independent and identically distributed random Gaussian entries and the other is designed so that row vectors are orthogonal to one another. As reported for the single-channel case in earlier studies, our analysis indicates that the latter ensemble offers better performance than the former for the noisy JSM-2 problem. The results of numerical experiments with a computationally feasible approximation algorithm we developed for this study agree with the theoretical estimation.
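The ℓ2,1-regularized least-squares scheme analyzed above can be sketched with a plain ISTA (proximal gradient) loop, whose proximal step is a row-wise group soft-threshold. The toy dimensions and regularization weight below are illustrative assumptions, not the authors' setup:

```python
import numpy as np

def row_soft_threshold(X, t):
    """Proximal operator of t * sum_i ||X[i, :]||_2 (the l2,1 norm):
    shrinks every row toward zero and zeroes rows with small norm."""
    norms = np.linalg.norm(X, axis=1, keepdims=True)
    scale = np.maximum(1.0 - t / np.maximum(norms, 1e-12), 0.0)
    return scale * X

def l21_ista(A, Y, lam, n_iter=500):
    """ISTA for the JSM-2 / multiple measurement vector problem
        min_X 0.5 * ||Y - A X||_F^2 + lam * ||X||_{2,1},
    where all channels (columns of X) share one support."""
    L = np.linalg.norm(A, 2) ** 2            # Lipschitz constant of the gradient
    X = np.zeros((A.shape[1], Y.shape[1]))
    for _ in range(n_iter):
        grad = A.T @ (A @ X - Y)             # gradient of the quadratic term
        X = row_soft_threshold(X - grad / L, lam / L)
    return X

# Toy example: 3 channels sharing a 2-row support
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 10)) / np.sqrt(20)
X_true = np.zeros((10, 3))
X_true[[2, 7], :] = rng.standard_normal((2, 3))
X_hat = l21_ista(A, A @ X_true, lam=0.01)
```

As the abstract notes, exploiting the shared support is what lets multi-channel recovery succeed more precisely than channel-by-channel reconstruction.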
Chizhova, M.; Korovin, D.; Gurianov, A.; Brodovskii, M.; Brunn, A.; Stilla, U.; Luhmann, T.
2017-02-01
The interpretation of point clouds and the reconstruction of 3D buildings from them have been studied for decades. Many articles consider different methods and workflows for the automatic detection and reconstruction of geometrical objects from point clouds. Each method is suited to a particular object geometry or sensor type; general approaches are rare. In our work we present an algorithm which develops the optimal process sequence for the automatic search, detection and reconstruction of buildings and building components from a point cloud. It can be used for the detection of the set of geometric objects to be reconstructed, independent of their degree of destruction. In a simulated example we reconstruct a complete Russian Orthodox church starting from the set of detected structural components and reconstruct missing components with high probability.
Reconstructing constructivism: Causal models, Bayesian learning mechanisms and the theory theory
Gopnik, Alison; Wellman, Henry M.
2012-01-01
We propose a new version of the “theory theory” grounded in the computational framework of probabilistic causal models and Bayesian learning. Probabilistic models allow a constructivist but rigorous and detailed approach to cognitive development. They also explain the learning of both more specific causal hypotheses and more abstract framework theories. We outline the new theoretical ideas, explain the computational framework in an intuitive and non-technical way, and review an extensive but ...
Burgess, J Michael; Greiner, Jochen; Mortlock, Daniel J
2016-01-01
The accurate spatial location of gamma-ray bursts (GRBs) is crucial for both producing a detector response matrix (DRM) and follow-up observations by other instruments. The Fermi Gamma-ray Burst Monitor (GBM) has the largest field of view (FOV) for detecting GRBs as it views the entire unocculted sky, but as a non-imaging instrument it relies on the relative count rates observed in each of its 14 detectors to localize transients. Improving its ability to accurately locate GRBs and other transients is vital to the paradigm of multi-messenger astronomy, including the electromagnetic follow-up of gravitational wave signals. Here we present the BAyesian Location Reconstruction Of GRBs (BALROG) method for localizing and characterizing GBM transients. Our approach eliminates the systematics of previous approaches by simultaneously fitting for the location and spectrum of a source. It also correctly incorporates the uncertainties in the location of a transient into the spectral parameters and produces reliable...
Soon, Villu; Saarma, Urmas
2011-07-01
The ignita species group within the genus Chrysis includes over 100 cuckoo wasp species, which all lead a parasitic lifestyle and exhibit very similar morphology. The lack of robust, diagnostic morphological characters has hindered phylogenetic reconstructions and contributed to frequent misidentification and inconsistent interpretations of species in this group. Therefore, molecular phylogenetic analysis is the most suitable approach for resolving the phylogeny and taxonomy of this group. We present a well-resolved phylogeny of the Chrysis ignita species group based on mitochondrial sequence data from 41 ingroup and six outgroup taxa. Although our emphasis was on European taxa, we included samples from most of the distribution range of the C. ignita species group to test for monophyly. We used a continuous mitochondrial DNA sequence consisting of 16S rRNA, tRNA(Val), 12S rRNA and ND4. The location of the ND4 gene at the 3' end of this continuous sequence, following 12S rRNA, represents a novel mitochondrial gene arrangement for insects. Due to difficulties in aligning rRNA genes, two different Bayesian approaches were employed to reconstruct phylogeny: (1) using a reduced data matrix including only those positions that could be aligned with confidence; or (2) using the full sequence dataset while estimating alignment and phylogeny simultaneously. In addition maximum-parsimony and maximum-likelihood analyses were performed to test the robustness of the Bayesian approaches. Although all approaches yielded trees with similar topology, considerably more nodes were resolved with analyses using the full data matrix. Phylogenetic analysis supported the monophyly of the C. ignita species group and divided its species into well-supported clades. The resultant phylogeny was only partly in accordance with published subgroupings based on morphology. Our results suggest that several taxa currently treated as subspecies or names treated as synonyms may in fact constitute
Reconstructing the insulin secretion rate by Bayesian deconvolution of phase-type densities
DEFF Research Database (Denmark)
Andersen, Kim Emil; Højbjerre, Malene
2005-01-01
of the insulin secretion rate (ISR) can be done by solving a highly ill-posed deconvolution problem. We represent the ISR, the C-peptide concentration and the convolution kernel as scaled phase-type densities and develop a Bayesian methodology for estimating such densities via Markov chain Monte Carlo techniques. Hereby, closed-form evaluation of the ISR is possible. We demonstrate the methodology on experimental data from healthy subjects and obtain results that are more realistic than recently reported conclusions based upon methods where the ISR is considered piecewise constant.
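The paper's phase-type/MCMC machinery is specific, but the ill-posed deconvolution at its core can be illustrated with a much simpler MAP estimate under a Gaussian smoothness prior. The kernel, sizes, and regularization weight below are hypothetical stand-ins:

```python
import numpy as np

def map_deconvolve(K, c, lam):
    """MAP deconvolution under a Gaussian likelihood c ~ N(K s, sigma^2 I)
    and a Gaussian smoothness prior on s that penalizes first differences.
    A simplified stand-in for the phase-type/MCMC approach: the
    ill-posedness is tamed by the prior."""
    n = K.shape[1]
    D = np.diff(np.eye(n), axis=0)      # first-difference operator
    return np.linalg.solve(K.T @ K + lam * D.T @ D, K.T @ c)

# Toy forward model: causal convolution with a decaying exponential kernel
n = 50
t = np.arange(n)
kernel = np.exp(-0.3 * t)
K = np.array([[kernel[i - j] if i >= j else 0.0 for j in range(n)]
              for i in range(n)])
s_true = np.exp(-0.5 * ((t - 25.0) / 4.0) ** 2)   # a smooth secretion-like pulse
s_hat = map_deconvolve(K, K @ s_true, lam=1e-3)
```

With noise added to the data, the smoothness prior is what keeps the recovered rate from oscillating wildly, which is the role played by the phase-type representation in the paper.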
Meseguer, Andrea Sánchez; Aldasoro, Juan Jose; Sanmartín, Isabel
2013-05-01
The genus Hypericum L. ("St. John's wort", Hypericaceae) comprises nearly 500 species of shrubs, trees and herbs distributed mainly in temperate regions of the Northern Hemisphere, but also in high-altitude tropical and subtropical areas. Until now, molecular phylogenetic hypotheses on infra-generic relationships have been based solely on the nuclear marker ITS. Here, we used a full Bayesian approach to simultaneously reconstruct phylogenetic relationships, divergence times, and patterns of morphological and range evolution in Hypericum, using nuclear (ITS) and plastid DNA sequences (psbA-trnH, trnS-trnG, trnL-trnF) of 186 species representing 33 of the 36 described morphological sections. Consistent with other studies, we found that corrections of the branch length prior helped recover more realistic branch lengths in by-gene partitioned Bayesian analyses, but the effect was also seen within single genes if the overall mutation rate differed considerably among sites or regions. Our study confirms that Hypericum is not monophyletic with the genus Triadenum embedded within, and rejects the traditional infrageneric classification, with many sections being para- or polyphyletic. The small Western Palearctic sections Elodes and Adenotrias are the sister-group of a geographic dichotomy between a mainly New World clade and a large Old World clade. Bayesian reconstruction of morphological character states and range evolution show a complex pattern of morphological plasticity and inter-continental movement within the genus. The ancestors of Hypericum were probably tropical shrubs that migrated from Africa to the Palearctic in the Early Tertiary, concurrent with the expansion of tropical climates in northern latitudes. Global climate cooling from the Mid Tertiary onwards might have promoted adaptation to temperate conditions in some lineages, such as the development of the herbaceous habit or unspecialized corollas.
Reconstruction of large-scale gene regulatory networks using Bayesian model averaging.
Kim, Haseong; Gelenbe, Erol
2012-09-01
Gene regulatory networks provide a systematic view of molecular interactions in a complex living system. However, constructing large-scale gene regulatory networks is one of the most challenging problems in systems biology. Moreover, large sets of biological data require a proper integration technique for reliable gene regulatory network construction. Here we present a new reverse engineering approach based on Bayesian model averaging, which attempts to combine all the appropriate models describing interactions among genes. This Bayesian approach, with a prior based on the Gibbs distribution, provides an efficient means to integrate multiple sources of biological data. In a simulation study with a maximum of 2,000 genes, our method shows better sensitivity than previous elastic-net and Gaussian graphical models, at a fixed specificity of 0.99. The study also shows that the proposed method outperforms the other standard methods on a DREAM dataset generated by nonlinear stochastic models. In brain tumor data analysis, three large-scale networks consisting of 4,422 genes were built using the gene expression of non-tumor, low-grade, and high-grade tumor mRNA expression samples, along with DNA-protein binding affinity information. We found that the genes having a large variation of degree distribution among the three tumor networks are the ones most involved in regulatory and developmental processes, which possibly gives novel insight beyond conventional differentially expressed gene analysis.
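As a minimal sketch of the Bayesian model averaging idea (not the paper's Gibbs-prior formulation), one can score candidate regulator sets for a target gene with BIC and turn the scores into approximate posterior model weights:

```python
import numpy as np
from itertools import combinations

def bic_weights(X, y, candidate_sets):
    """Score linear models y ~ X[:, S] for each candidate regulator set S
    with BIC, then convert the scores into approximate posterior model
    probabilities for Bayesian model averaging."""
    n = len(y)
    bics = []
    for S in candidate_sets:
        Xs = np.column_stack([np.ones(n), X[:, S]])
        beta, *_ = np.linalg.lstsq(Xs, y, rcond=None)
        rss = float(np.sum((y - Xs @ beta) ** 2))
        bics.append(n * np.log(rss / n) + (len(S) + 1) * np.log(n))
    b = np.array(bics)
    w = np.exp(-0.5 * (b - b.min()))    # relative evidence of each model
    return w / w.sum()

# Toy data: target gene y is regulated by genes 0 and 2 only
rng = np.random.default_rng(1)
X = rng.standard_normal((200, 4))
y = 2.0 * X[:, 0] - 1.5 * X[:, 2] + 0.1 * rng.standard_normal(200)
models = [list(S) for k in (1, 2) for S in combinations(range(4), k)]
weights = bic_weights(X, y, models)
best = models[int(np.argmax(weights))]
```

Averaged predictions then weight every candidate model by these probabilities instead of committing to a single network, which is the core of the BMA strategy described above.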
Directory of Open Access Journals (Sweden)
Brandon Lee Drake
Full Text Available Strontium isotope sourcing has become a common and useful method for assigning sources to archaeological artifacts. In Chaco Canyon, an Ancestral Pueblo regional center in New Mexico, previous studies using these methods have suggested that a significant portion of maize and wood originated in the Chuska Mountains region, 75 km to the West [corrected]. In the present manuscript, these results were tested using both frequentist methods (to determine whether geochemical sources can truly be differentiated) and Bayesian methods (to address uncertainty in geochemical source attribution). It was found that Chaco Canyon and the Chuska Mountain region are not easily distinguishable based on radiogenic strontium isotope values. The strontium profiles of many geochemical sources in the region overlap, making it difficult to definitively identify any one particular geochemical source for the canyon's pre-historic maize. Bayesian mixing models support the argument that some spruce and fir wood originated in the San Mateo Mountains, but that this cannot explain all 87Sr/86Sr values in Chaco timber. Overall, radiogenic strontium isotope data do not clearly identify a single major geochemical source for maize, ponderosa, and most spruce/fir timber. As such, the degree to which Chaco Canyon relied upon outside support for both food and construction material is still ambiguous.
Drake, Brandon Lee; Wills, Wirt H.; Hamilton, Marian I.; Dorshow, Wetherbee
2014-01-01
Strontium isotope sourcing has become a common and useful method for assigning sources to archaeological artifacts. In Chaco Canyon, an Ancestral Pueblo regional center in New Mexico, previous studies using these methods have suggested that a significant portion of maize and wood originated in the Chuska Mountains region, 75 km to the East. In the present manuscript, these results were tested using both frequentist methods (to determine whether geochemical sources can truly be differentiated) and Bayesian methods (to address uncertainty in geochemical source attribution). It was found that Chaco Canyon and the Chuska Mountain region are not easily distinguishable based on radiogenic strontium isotope values. The strontium profiles of many geochemical sources in the region overlap, making it difficult to definitively identify any one particular geochemical source for the canyon's pre-historic maize. Bayesian mixing models support the argument that some spruce and fir wood originated in the San Mateo Mountains, but that this cannot explain all 87Sr/86Sr values in Chaco timber. Overall, radiogenic strontium isotope data do not clearly identify a single major geochemical source for maize, ponderosa, and most spruce/fir timber. As such, the degree to which Chaco Canyon relied upon outside support for both food and construction material is still ambiguous. PMID:24854352
DEFF Research Database (Denmark)
Stahlhut, Carsten; Mørup, Morten; Winther, Ole;
2011-01-01
We present an approach to handle forward model uncertainty for EEG source reconstruction. A stochastic forward model representation is motivated by the many random contributions to the path from sources to measurements including the tissue conductivity distribution, the geometry of the cortical s...
WHOOMP! (There It Is) Rapid Bayesian position reconstruction for gravitational-wave transients
Singer, Leo P
2015-01-01
Within the next few years, Advanced LIGO and Virgo should detect gravitational waves (GWs) from binary neutron star and neutron star-black hole mergers. These sources are also predicted to power a broad array of electromagnetic transients. Because the X-ray and optical signatures can be faint and fade rapidly, observing them hinges on rapidly inferring the sky location from the gravitational wave observations. Markov chain Monte Carlo (MCMC) methods for gravitational-wave parameter estimation can take hours or more. We introduce BAYESTAR, a rapid, Bayesian, non-MCMC sky localization algorithm that takes just seconds to produce probability sky maps that are comparable in accuracy to the full analysis. Prompt localizations from BAYESTAR will make it possible to search for electromagnetic counterparts of compact binary mergers.
DEFF Research Database (Denmark)
Heller, Rasmus; Chikhi, Lounes; Siegismund, Hans
2013-01-01
when it is violated. Among the most widely applied demographic inference methods are Bayesian skyline plots (BSPs), which are used across a range of biological fields. Violations of the panmixia assumption are to be expected in many biological systems, but the consequences for skyline plot inferences...
Reconstructing a School's Past Using Oral Histories and GIS Mapping.
Alibrandi, Marsha; Beal, Candy; Thompson, Ann; Wilson, Anna
2000-01-01
Describes an interdisciplinary project that incorporated language arts, social studies, instructional technology, and science where middle school students were involved in oral history, Geographic Information System (GIS) mapping, architectural research, the science of dendrochronology, and the creation of an archival school Web site. (CMK)
Education in Somalia: History, Destruction, and Calls for Reconstruction.
Abdi, Ali A.
1998-01-01
Traces the history of education in Somalia: in precolonial traditional Somalia; during colonial rule by Italy; under civilian rule, 1960-69; and under military rule, 1969-90. Describes the total destruction of the education system since the 1991 collapse of the state, widespread illiteracy and adolescent involvement in thuggery, and the urgent…
Free Radicals in Organic Matter for Thermal History Reconstruction of Carbonate Succession
Institute of Scientific and Technical Information of China (English)
Anonymous
2007-01-01
Geothermometry is one of the most useful methods for reconstructing the thermal history of sedimentary basins. This paper introduces the application of the free radical concentration of organic matter as a thermal indicator in the thermal history reconstruction of carbonate successions, based on anhydrous thermal simulation results for type I and type II₁ kerogen. A series of free radical data are obtained under thermal simulation at different heating temperatures and times, and quantitative models relating the free radical concentration (Ng) of organic matter to the time-temperature index (TTI) are obtained for type I and type II₁ kerogen. This Ng-TTI relation was used to model the Ordovician thermal gradients of Well TZ12 in the Tarim Basin. The modeling result is consistent with results obtained from apatite fission-track data and with published data. This new method of thermal history reconstruction will benefit hydrocarbon generation and accumulation studies and the resource assessment of carbonate successions.
Frey, Jordan D; Alperovich, Michael; Levine, Jamie P; Choi, Mihye; Karp, Nolan S
2017-01-18
A history of smoking has been implicated as a risk factor for reconstructive complications in nipple-sparing mastectomy (NSM); however, there have been no direct analyses of outcomes in smokers and nonsmokers. All patients undergoing NSM at New York University Langone Medical Center from 2006 to 2014 were identified. Outcomes were compared for those with and without a smoking history and stratified by pack-year smoking history and years-to-quitting (YTQ). A total of 543 nipple-sparing mastectomies were performed from 2006 to 2014, 49 of them in patients with a history of smoking. Reconstructive outcomes in NSM between those with and without a smoking history were equivalent. Those with a smoking history were not significantly more likely to have mastectomy flap necrosis (p = 0.6251) or partial (p = 0.8564) or complete (p = 0.3365) nipple-areola complex (NAC) necrosis. Likewise, active smokers alone did not have a higher risk of complications compared to nonsmokers or those with a smoking history. Comparing nonsmokers and those with a less or greater than 10 pack-year smoking history, those with a >10 pack-year history had significantly more complete NAC necrosis (p = 0.0114). Those with ≥5 YTQ prior to NSM were equivalent to those without a smoking history. We demonstrate that NSM may be safely offered to those with a smoking history, although a >10 pack-year smoking history or <5 YTQ prior to NSM may impart a higher risk of reconstructive complications, including complete NAC necrosis.
Lattice NRQCD study on in-medium bottomonium spectra using a novel Bayesian reconstruction approach
Energy Technology Data Exchange (ETDEWEB)
Kim, Seyong [Department of Physics, Sejong University, Seoul 143-747 (Korea, Republic of); Petreczky, Peter [Physics Department, Brookhaven National Laboratory, Upton, NY 11973 (United States); Rothkopf, Alexander [Institute for Theoretical Physics, Heidelberg University, Philosophenweg 16, 69120 Heidelberg (Germany)
2016-01-22
We present recent results on the in-medium modification of S- and P-wave bottomonium states around the deconfinement transition. Our study uses lattice QCD with N_f = 2 + 1 light quark flavors to describe the non-perturbative thermal QCD medium between 140 MeV < T < 249 MeV and deploys lattice-regularized non-relativistic QCD (NRQCD) effective field theory to capture the physics of heavy quark bound states immersed therein. The spectral functions of the ³S₁ (ϒ) and ³P₁ (χ_b1) bottomonium states are extracted from Euclidean time Monte Carlo simulations using a novel Bayesian prescription, which provides higher accuracy than the Maximum Entropy Method. Based on a systematic comparison of interacting and free spectral functions we conclude that the ground states of both the S-wave (ϒ) and P-wave (χ_b1) channels survive up to T = 249 MeV. Stringent upper limits on the size of the in-medium modification of bottomonium masses and widths are provided.
Reconstructing constructivism: causal models, Bayesian learning mechanisms, and the theory theory.
Gopnik, Alison; Wellman, Henry M
2012-11-01
We propose a new version of the "theory theory" grounded in the computational framework of probabilistic causal models and Bayesian learning. Probabilistic models allow a constructivist but rigorous and detailed approach to cognitive development. They also explain the learning of both more specific causal hypotheses and more abstract framework theories. We outline the new theoretical ideas, explain the computational framework in an intuitive and nontechnical way, and review an extensive but relatively recent body of empirical results that supports these ideas. These include new studies of the mechanisms of learning. Children infer causal structure from statistical information, through their own actions on the world and through observations of the actions of others. Studies demonstrate these learning mechanisms in children from 16 months to 4 years old and include research on causal statistical learning, informal experimentation through play, and imitation and informal pedagogy. They also include studies of the variability and progressive character of intuitive theory change, particularly theory of mind. These studies investigate both the physical and the psychological and social domains. We conclude with suggestions for further collaborative projects between developmental and computational cognitive scientists.
Bayesian Framework with Non-local and Low-rank Constraint for Image Reconstruction
Tang, Zhonghe; Wang, Shengzhe; Huo, Jianliang; Guo, Hang; Zhao, Haibo; Mei, Yuan
2017-01-01
Built upon the methodology of 'grouping and collaborative filtering', the proposed algorithm recovers image patches from an array of similar noisy patches, based on the assumption that their noise-free versions (or close approximations) lie in a low-dimensional subspace and have low rank. Based on an analysis of the effect of noise and perturbation on the singular values, a weighted nuclear norm is defined to replace the conventional nuclear norm. The corresponding low-rank decomposition model and singular value shrinkage operator are derived. Taking into account the difference between the distributions of the signal and the noise, the weight depends not only on the standard deviation of the noise, but also on the rank of the noise-free matrix and on the singular value itself. Experimental results on image reconstruction tasks show that, at relatively low computational cost, the performance of the proposed method is very close to that of the state-of-the-art reconstruction methods BM3D and LSSC, and even outperforms them in restoring and preserving structure.
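The core operation, shrinking each singular value by its own weight, can be sketched in a few lines. The uniform threshold used here is an assumption for illustration, whereas the paper derives the weights from the noise level, the estimated rank, and the singular values themselves:

```python
import numpy as np

def weighted_sv_shrinkage(Y, weights):
    """Low-rank estimate of a noisy patch matrix Y: soft-threshold each
    singular value by its own weight, as in weighted-nuclear-norm
    denoising, then reassemble the matrix."""
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    s_shrunk = np.maximum(s - weights, 0.0)   # per-value soft threshold
    return (U * s_shrunk) @ Vt

# Toy example: a rank-1 "clean patch stack" plus Gaussian noise
rng = np.random.default_rng(2)
L = np.outer(rng.standard_normal(30), rng.standard_normal(10))
Y = L + 0.1 * rng.standard_normal((30, 10))
w = np.full(10, 1.2)   # uniform thresholds, for illustration only
X_hat = weighted_sv_shrinkage(Y, w)
```

Because the noise spreads its energy across all singular values while the signal concentrates in a few large ones, thresholding removes most of the noise at little cost to the signal.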
Directory of Open Access Journals (Sweden)
Alexander Tilley
Full Text Available The trophic ecology of epibenthic mesopredators is not well understood in terms of prey partitioning with sympatric elasmobranchs or their effects on prey communities, yet the importance of omnivores in community trophic dynamics is being increasingly realised. This study used stable isotope analysis of ¹⁵N and ¹³C to model the diet composition of wild southern stingrays Dasyatis americana and compare their trophic niche space to nurse sharks Ginglymostoma cirratum and Caribbean reef sharks Carcharhinus perezi on Glovers Reef Atoll, Belize. Bayesian stable isotope mixing models were used to investigate prey choice as well as viable diet-tissue discrimination factors for use with stingrays. Stingray δ¹⁵N values showed the greatest variation and a positive relationship with size, with an isotopic niche width approximately twice that of the sympatric species. Shark species exhibited comparatively restricted δ¹⁵N values and greater δ¹³C variation, with very little overlap of stingray niche space. Mixing models suggest that bivalves and annelids are proportionally more important prey in the stingray diet than crustaceans and teleosts at Glovers Reef, in contrast to all but one published diet study using stomach contents from other locations. Incorporating gut contents information from the literature, we suggest diet-tissue discrimination factor values of Δ¹⁵N ≈ 2.7‰ and Δ¹³C ≈ 0.9‰ for stingrays in the absence of validation experiments. The wide trophic niche and lower trophic level exhibited by stingrays compared to sympatric sharks support their putative role as important base stabilisers in benthic systems, with the potential to absorb trophic perturbations through numerous opportunistic prey interactions.
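A crude sketch of a Bayesian stable-isotope mixing model: diet proportions are drawn uniformly from the simplex and importance-weighted by the likelihood of the consumer's (δ15N, δ13C) signature given the mixture of discrimination-corrected source signatures. The source values, discrimination factors, and noise level below are hypothetical stand-ins, not data from the study:

```python
import numpy as np

rng = np.random.default_rng(3)

def mixing_posterior(consumer, sources, tdf, sigma, n_draw=200_000):
    """Posterior mean diet proportions for one consumer: draw proportions
    uniformly from the simplex (Dirichlet(1,...,1)) and weight each draw
    by the Gaussian likelihood of the consumer's isotope signature given
    the mixture of TDF-corrected source signatures."""
    k = sources.shape[0]
    p = rng.dirichlet(np.ones(k), size=n_draw)     # uniform on the simplex
    mix = p @ (sources + tdf)                      # predicted consumer signatures
    w = np.exp(-0.5 * np.sum(((consumer - mix) / sigma) ** 2, axis=1))
    return (p * w[:, None]).sum(axis=0) / w.sum()

# Hypothetical (d15N, d13C) source signatures: bivalves, annelids, teleosts
sources = np.array([[6.0, -18.0], [8.0, -16.0], [12.0, -14.0]])
tdf = np.array([2.7, 0.9])                         # assumed discrimination factors
true_p = np.array([0.6, 0.3, 0.1])
consumer = true_p @ (sources + tdf)                # noise-free consumer signature
post_mean = mixing_posterior(consumer, sources, tdf, sigma=0.5)
```

The posterior mean recovers the ordering of the diet proportions; dedicated tools such as MixSIAR add source variability, multiple consumers, and proper MCMC on top of this basic likelihood.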
Aguayo, M.; Marshall, H.; McNamara, J. P.; Mead, J.; Flores, A. N.
2013-12-01
Estimation of snowpack parameters such as depth, density and grain structure is a central focus of hydrology in seasonally snow-covered lands. These parameters are directly estimated by field observations, indirectly estimated from other parameters using statistical correlations, or simulated with a model. Difficulty in sampling thin layers and uncertainty in the transition between layers can cause significant uncertainty in measurements of these parameters. Snow density is one of the most important parameters to measure because it is closely related to snow water content, an important component of the global water balance. We develop a mathematical framework to estimate snow density from measurements of temperature and thickness of snowpack layers over a particular time period, in conjunction with a physics-based model of snowpack evolution. We formulate a Bayesian approach to estimate the snowpack density profile, using a full range of possible simulations that incorporate key sources of uncertainty to build in prior snowpack knowledge. The posterior probability density function of the snow density, conditioned on snowpack temperature measurements, is computed by multiplying the likelihoods and the assumed prior distribution function. Random sampling is used to generate a range of densities with the same probability when a uniform prior probability function is assumed. A posterior probability density function calculated directly via Bayes' theorem is used to calculate the probability of every sample generated. The forward model is a 1D, multilayer snow energy and mass balance model, which solves for snow temperature, density, and liquid water content on a finite element mesh. The surface and ground temperature data of the snowpack (boundary conditions) are provided by the Center for Snow and Avalanche Studies (CSAS), Silverton CO, from snow pits made at the Swamp Angel and Senator Beck study plot sites. Standard errors between field observations and results computed denote the
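The grid-based application of Bayes' theorem described above can be sketched as follows, with a deliberately toy linear forward model standing in for the 1D multilayer snow energy and mass balance model; all numbers are hypothetical:

```python
import numpy as np

def grid_posterior(density_grid, forward_model, t_obs, sigma):
    """Posterior over snow density on a grid via Bayes' theorem with a
    uniform prior: p(rho | T) is proportional to p(T | rho).  The forward
    model maps candidate densities to predicted snowpack temperatures."""
    prior = np.ones_like(density_grid) / len(density_grid)
    t_pred = forward_model(density_grid)
    likelihood = np.exp(-0.5 * ((t_obs - t_pred) / sigma) ** 2)
    posterior = prior * likelihood
    return posterior / posterior.sum()     # normalize over the grid

# Hypothetical linear forward model: denser snow -> warmer sensed temperature
rho = np.linspace(100.0, 500.0, 401)            # candidate densities, kg/m^3
fwd = lambda r: -5.0 + 0.01 * (r - 100.0)       # toy stand-in, degrees C
post = grid_posterior(rho, fwd, t_obs=-3.0, sigma=0.5)
rho_map = float(rho[np.argmax(post)])
```

With a uniform prior every grid sample is equally probable a priori, and the temperature likelihood alone reshapes the posterior, exactly the mechanism the abstract describes.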
Chang, Joshua C; Chou, Tom
2015-01-01
Quantifying the forces between and within macromolecules is a necessary first step in understanding the mechanics of molecular structure, protein folding, and enzyme function and performance. In such macromolecular settings, dynamic single-molecule force spectroscopy (DFS) has been used to distort bonds. The resulting responses, in the form of rupture forces, work applied, and trajectories of displacements, have been used to reconstruct bond potentials. Such approaches often rely on simple parameterizations of one-dimensional bond potentials, assumptions on equilibrium starting states, and/or large amounts of trajectory data. Parametric approaches typically fail at inferring complex-shaped bond potentials with multiple minima, while piecewise estimation may not guarantee smooth results with the appropriate behavior at large distances. Existing techniques, particularly those based on work theorems, also do not address spatial variations in the diffusivity that may arise from spatially inhomogeneous coupling to...
North American regional climate reconstruction from ground surface temperature histories
Jaume-Santero, Fernando; Pickler, Carolyne; Beltrami, Hugo; Mareschal, Jean-Claude
2016-12-01
Within the framework of the PAGES NAm2k project, 510 North American borehole temperature-depth profiles were analyzed to infer recent climate changes. To facilitate comparisons and to study the same time period, the profiles were truncated at 300 m. Ground surface temperature histories for the last 500 years were obtained from a model describing temperature changes at the surface for several climate-differentiated regions in North America. The model is evaluated by inversion of temperature perturbations using singular value decomposition, and its solutions are assessed using a Monte Carlo approach. The results, within the 95% confidence interval, suggest a warming between 1.0 and 2.5 K during the last two centuries. A regional analysis, composed of mean temperature changes over the last 500 years and geographical maps of ground surface temperatures, shows that all regions experienced warming, but this warming is not spatially uniform and is more marked in northern regions.
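Inversion of temperature perturbations by singular value decomposition amounts to a truncated-SVD solve of a linear forward model; the kernel shape and step dates below are illustrative assumptions, not the study's model:

```python
import numpy as np

def tsvd_solve(A, d, k):
    """Solve the linear inverse problem d = A m by truncated SVD, keeping
    only the k largest singular values; discarding the small ones
    regularizes the inversion, as in borehole temperature inversions
    for ground surface temperature histories."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    inv_s = np.where(np.arange(len(s)) < k, 1.0 / s, 0.0)
    return Vt.T @ (inv_s * (U.T @ d))

# Toy kernel mapping a 5-step surface temperature history to 40 depths
z = np.linspace(10.0, 300.0, 40)[:, None]                  # depths, m
t_steps = np.array([50.0, 100.0, 200.0, 350.0, 500.0])     # step ages, yr
A = np.exp(-z / (2.0 * np.sqrt(t_steps)))                  # hypothetical kernel
m_true = np.array([1.5, 1.0, 0.5, 0.2, 0.0])               # warming toward present
m_hat = tsvd_solve(A, A @ m_true, k=5)
```

With noisy data one would keep fewer singular values (smaller k), trading resolution of the recovered history for stability; the Monte Carlo assessment in the abstract quantifies exactly that trade-off.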
Niaz, Mansoor; Klassen, Stephen; McMillan, Barbara; Metz, Don
2010-01-01
The photoelectric effect is an important part of general physics textbooks. To study the presentation of this phenomenon, we have reconstructed six essential history and philosophy of science (HPS)-related aspects of the events that culminated in Einstein proposing his hypothesis of light quanta and the ensuing controversy within the scientific…
Paul, Sudeshna; Friedman, Alan M; Bailey-Kellogg, Chris; Craig, Bruce A
2013-04-01
The interatomic distance distribution, P(r), is a valuable tool for evaluating the structure of a molecule in solution and represents the maximum structural information that can be derived from solution scattering data without further assumptions. Most current instrumentation for scattering experiments (typically CCD detectors) generates a finely pixelated two-dimensional image. In continuation of the standard practice with earlier one-dimensional detectors, these images are typically reduced to a one-dimensional profile of scattering intensities, I(q), by circular averaging of the two-dimensional image. Indirect Fourier transformation methods are then used to reconstruct P(r) from I(q). Substantial advantages in data analysis, however, could be achieved by directly estimating the P(r) curve from the two-dimensional images. This article describes a Bayesian framework, using a Markov chain Monte Carlo method, for estimating the parameters of the indirect transform, and thus P(r), directly from the two-dimensional images. Using simulated detector images, it is demonstrated that this method yields P(r) curves nearly identical to the reference P(r). Furthermore, an approach for evaluating spatially correlated errors (such as those that arise from a detector point spread function) is evaluated. Accounting for these errors further improves the precision of the P(r) estimation. Experimental scattering data, where no ground truth reference P(r) is available, are used to demonstrate that this method yields a scattering and detector model that more closely reflects the two-dimensional data, as judged by smaller residuals in cross-validation, than P(r) obtained by indirect transformation of a one-dimensional profile. Finally, the method allows concurrent estimation of the beam center and Dmax, the longest interatomic distance in P(r), as part of the Bayesian Markov chain Monte Carlo method, reducing experimental effort and providing a well-defined protocol for these
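The parameter estimation described above relies on a Markov chain Monte Carlo sampler. As an illustration of the general idea only (not the authors' implementation, which works on full detector images), a minimal random-walk Metropolis sampler recovering a single parameter from noisy data looks like:

```python
import numpy as np

def metropolis(log_post, x0, n_steps=20000, step=0.2, seed=3):
    """Random-walk Metropolis sampler for a one-dimensional posterior."""
    rng = np.random.default_rng(seed)
    x, lp = x0, log_post(x0)
    samples = np.empty(n_steps)
    for i in range(n_steps):
        prop = x + rng.normal(0.0, step)          # symmetric proposal
        lp_prop = log_post(prop)
        if np.log(rng.random()) < lp_prop - lp:   # Metropolis accept/reject
            x, lp = prop, lp_prop
        samples[i] = x
    return samples

# Toy stand-in for the indirect-transform parameter fit: recover the mean of
# noisy "detector" data under a flat prior and a Gaussian likelihood.
rng = np.random.default_rng(4)
data = rng.normal(2.5, 1.0, 200)
log_post = lambda mu: -0.5 * np.sum((data - mu) ** 2)
samples = metropolis(log_post, x0=0.0)
```

In the full problem the state vector would hold the indirect-transform coefficients plus nuisance parameters such as the beam center and Dmax, but the accept/reject mechanics are the same.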
Bayesian reconstruction of disease outbreaks by combining epidemiologic and genomic data.
Directory of Open Access Journals (Sweden)
Thibaut Jombart
2014-01-01
Full Text Available Recent years have seen progress in the development of statistically rigorous frameworks to infer outbreak transmission trees ("who infected whom" from epidemiological and genetic data. Making use of pathogen genome sequences in such analyses remains a challenge, however, with a variety of heuristic approaches having been explored to date. We introduce a statistical method exploiting both pathogen sequences and collection dates to unravel the dynamics of densely sampled outbreaks. Our approach identifies likely transmission events and infers dates of infections, unobserved cases and separate introductions of the disease. It also proves useful for inferring numbers of secondary infections and identifying heterogeneous infectivity and super-spreaders. After testing our approach using simulations, we illustrate the method with the analysis of the beginning of the 2003 Singaporean outbreak of Severe Acute Respiratory Syndrome (SARS), providing new insights into the early stage of this epidemic. Our approach is the first tool for disease outbreak reconstruction from genetic data widely available as free software, the R package outbreaker. It is applicable to various densely sampled epidemics, and improves previous approaches by detecting unobserved and imported cases, as well as allowing multiple introductions of the pathogen. Because of its generality, we believe this method will become a tool of choice for the analysis of densely sampled disease outbreaks, and will form a rigorous framework for subsequent methodological developments.
Directory of Open Access Journals (Sweden)
Stella Xu
2012-05-01
Full Text Available The ancient history of Korea has been one of the most controversial and difficult phases to incorporate into an East Asian history survey class, not only because there are indeed quite a number of contested issues, but also because very few updated materials are available in English. This essay aims to provide a comprehensive and critical overview of research on the topic of Korean ancient history in the past six decades (mainly in South Korea), so that the ancient history of Korea can be understood first within the broader frame of East Asian history, and then in relation to the intellectual and ideological evolution which has significantly impacted historical interpretations in South Korea.
Memory, History and Narrative: Shifts of Meaning when (Re)constructing the Past
Directory of Open Access Journals (Sweden)
Ignacio Brescó de Luna
2012-05-01
Full Text Available This paper is devoted to the examination of some socio-cultural dimensions of memory, focusing on narratives as a mediational tool (Vygotsky, 1978) for the construction of past events and attribution of meaning. The five elements of Kenneth Burke's Grammar of Motives (1969) are taken as a framework for the examination of reconstructions of the past, and particularly of histories, namely: (1) the interpretative and reconstructive action of (2) a positioned agent operating (3) through narrative means (4) addressed to particular purposes (5) within a concrete social and temporal scenery. The reflexive character of such an approach opens the ground for considering remembering as a kind of act performed within the context of a set of on-going actions, so that remembrances play a directive role for action and thus have an unavoidable moral dimension. This is particularly relevant for some kinds of social memory, such as history teaching, and their effects upon identity.
Fortunato, Laura
2011-02-01
Explanations for the emergence of monogamous marriage have focused on the cross-cultural distribution of marriage strategies, thus failing to account for their history. In this paper I reconstruct the pattern of change in marriage strategies in the history of societies speaking Indo-European languages, using cross-cultural data in the systematic and explicitly historical framework afforded by the phylogenetic comparative approach. The analysis provides evidence in support of Proto-Indo-European monogamy, and suggests that this pattern may have extended back to Proto-Indo-Hittite. These reconstructions push the origin of monogamous marriage into prehistory, well beyond the earliest instances documented in the historical record; this, in turn, challenges notions that the cross-cultural distribution of monogamous marriage reflects features of social organization typically associated with Eurasian societies, and with "societal complexity" and "modernization" more generally. I discuss implications of these findings in the context of the archaeological and genetic evidence on prehistoric social organization.
Equivalent thermal history reconstruction from a partially crystallized glass-ceramic sensor array
Heeg, Bauke
2015-11-01
The basic concept of a thermal history sensor is that it records the accumulated exposure to some unknown, typically varying temperature profile for a certain amount of time. Such a sensor is considered to be capable of measuring the duration of several (N) temperature intervals. For this purpose, the sensor deploys multiple (M) sensing elements, each with different temperature sensitivity. At the end of some thermal exposure for a known period of time, the sensor array is read out and an estimate is made of the set of N durations of the different temperature ranges. A potential implementation of such a sensor was pioneered by Fair et al. [Sens. Actuators, A 141, 245 (2008)], based on glass-ceramic materials with different temperature-dependent crystallization dynamics. In their work, it was demonstrated that an array of sensor elements can be made sensitive to slight differences in temperature history. Further, a forward crystallization model was used to simulate the variations in sensor array response to differences in the temperature history. The current paper focuses on the inverse aspect of temperature history reconstruction from a hypothetical sensor array output. The goal of such a reconstruction is to find an equivalent thermal history that is the closest representation of the true thermal history, i.e., the durations of a set of temperature intervals that result in a set of fractional crystallization values which is closest to the one resulting from the true thermal history. One particular useful simplification in both the sensor model as well as in its practical implementation is the omission of nucleation effects. In that case, least squares models can be used to approximate the sensor response and make reconstruction estimates. Even with this simplification, sensor noise can have a destabilizing effect on possible reconstruction solutions, which is evaluated using simulations. Both regularization and non-negativity constrained least squares
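The least-squares reconstruction step can be sketched as follows. Assuming the nucleation-free sensor model is approximately linear, the array response is roughly S t, with S a matrix of per-interval crystallization rates and t the vector of interval durations; non-negativity constrained least squares then keeps the recovered durations physical. All matrices and numbers below are illustrative stand-ins, not the glass-ceramic model of Fair et al.

```python
import numpy as np
from scipy.optimize import nnls

# Hypothetical linearized sensor model: with nucleation omitted, the response
# of M sensing elements is approximately S @ t, where S holds per-interval
# crystallization rates and t the durations spent in each temperature interval.
rng = np.random.default_rng(1)
M, N = 8, 4
S = rng.random((M, N))                     # assumed (illustrative) rate matrix
t_true = np.array([2.0, 0.0, 1.0, 3.0])    # true interval durations (hours)
y = S @ t_true + rng.normal(0.0, 0.01, M)  # noisy sensor array read-out

# Non-negativity keeps the reconstructed durations physical (no negative time)
t_est, residual = nnls(S, y)
```

With larger sensor noise the problem becomes ill-conditioned and, as the abstract notes, a regularization term would be added alongside the non-negativity constraint.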
Reconstructing the photometric light curves of Earth as a planet along its history
Sanromá, Esther; Pallé, Enric
2011-01-01
By utilizing satellite-based estimations of the distribution of clouds, we have studied the Earth's large-scale cloudiness behavior according to latitude and surface types (ice, water, vegetation and desert). These empirical relationships are used here to reconstruct the possible cloud distribution of historical epochs of the Earth's history such as the Late Cretaceous (90 Ma ago), the Late Triassic (230 Ma ago), the Mississippian (340 Ma ago), and the Late Cambrian (500 Ma ago), when the landmass distributions were different from today's. With this information, we have been able to simulate the globally-integrated photometric variability of the planet at these epochs. We find that our simple model reproduces well the observed cloud distribution and albedo variability of the modern Earth. Moreover, the model suggests that the photometric variability of the Earth was probably much larger in past epochs. This large photometric variability could improve the chances for the difficult determination of the rotation...
Directory of Open Access Journals (Sweden)
Kevin McNally
2012-01-01
Full Text Available There are numerous biomonitoring programs, both recent and ongoing, to evaluate environmental exposure of humans to chemicals. Due to the lack of exposure and kinetic data, the correlation of biomarker levels with exposure concentrations leads to difficulty in utilizing biomonitoring data for biological guidance values. Exposure reconstruction or reverse dosimetry is the retrospective interpretation of external exposure consistent with biomonitoring data. We investigated the integration of physiologically based pharmacokinetic modelling, global sensitivity analysis, Bayesian inference, and Markov chain Monte Carlo simulation to obtain a population estimate of inhalation exposure to m-xylene. We used exhaled breath and venous blood m-xylene and urinary 3-methylhippuric acid measurements from a controlled human volunteer study in order to evaluate the ability of our computational framework to predict known inhalation exposures. We also investigated the importance of model structure and dimensionality with respect to its ability to reconstruct exposure.
Suchard, Marc A.
2017-01-01
Ancestral state reconstructions in Bayesian phylogeography of virus pandemics have been improved by utilizing a Bayesian stochastic search variable selection (BSSVS) framework. Recently, this framework has been extended to model the transition rate matrix between discrete states as a generalized linear model (GLM) of genetic, geographic, demographic, and environmental predictors of interest to the virus and incorporating BSSVS to estimate the posterior inclusion probabilities of each predictor. Although the latter appears to enhance the biological validity of ancestral state reconstruction, there has yet to be a comparison of phylogenies created by the two methods. In this paper, we compare these two methods, while also using a primitive method without BSSVS, and highlight the differences in phylogenies created by each. We test six coalescent priors and six random sequence samples of H3N2 influenza during the 2014–15 flu season in the U.S. We show that the GLMs yield significantly greater root state posterior probabilities than the two alternative methods under five of the six priors, and significantly greater Kullback-Leibler divergence values than the two alternative methods under all priors. Furthermore, the GLMs strongly implicate temperature and precipitation as driving forces of this flu season and nearly unanimously identified a single root state, which exhibits the most tropical climate during a typical flu season in the U.S. The GLM, however, appears to be highly susceptible to sampling bias compared with the other methods, which casts doubt on whether its reconstructions should be favored over those created by alternate methods. We report that a BSSVS approach with a Poisson prior demonstrates less bias toward sample size under certain conditions than the GLMs or primitive models, and believe that the connection between reconstruction method and sampling bias warrants further investigation. PMID:28170397
Model-Independent Reconstruction of the Expansion History of the Universe from Type Ia Supernovae
Benitez-Herrera, S; Hillebrandt, W; Mignone, C; Bartelmann, M; Weller, J
2011-01-01
Based on the largest homogeneously reduced set of Type Ia supernova luminosity data currently available -- the Union2 sample -- we reconstruct the expansion history of the Universe in a model-independent approach. Our method tests the geometry of the Universe directly without reverting to any assumptions made on its energy content. This allows us to constrain Dark Energy models and non-standard cosmologies in a straightforward way. The applicability of the presented method is not restricted to testing cosmological models. It can be a valuable tool for pointing out systematic errors hidden in the supernova data and planning future Type Ia supernova cosmology campaigns.
Joint palaeoclimate reconstruction from pollen data via forward models and climate histories
Parnell, Andrew C.; Haslett, John; Sweeney, James; Doan, Thinh K.; Allen, Judy R. M.; Huntley, Brian
2016-11-01
We present a method and software for reconstructing palaeoclimate from pollen data with a focus on accounting for and reducing uncertainty. The tools we use include: forward models, which enable us to account for the data generating process and hence the complex relationship between pollen and climate; joint inference, which reduces uncertainty by borrowing strength between aspects of climate and slices of the core; and dynamic climate histories, which allow for a far richer gamut of inferential possibilities. Through a Monte Carlo approach we generate numerous equally probable joint climate histories, each of which is represented by a sequence of values of three climate dimensions in discrete time, i.e. a multivariate time series. All histories are consistent with the uncertainties in the forward model and the natural temporal variability in climate. Once generated, these histories can provide most probable climate estimates with uncertainty intervals. This is particularly important as attention moves to the dynamics of past climate changes. For example, such methods allow us to identify, with realistic uncertainty, the past century that exhibited the greatest warming. We illustrate our method with two data sets: Laguna de la Roya, with a radiocarbon dated chronology and hence timing uncertainty; and Lago Grande di Monticchio, which contains laminated sediment and extends back to the penultimate glacial stage. The procedure is made available via an open source R package, Bclim, for which we provide code and instructions.
Tsai, Ming-Chi; Blelloch, Guy; Ravi, R.; Schwartz, Russell
The random accumulation of variations in the human genome over time implicitly encodes a history of how human populations have arisen, dispersed, and intermixed since we emerged as a species. Reconstructing that history is a challenging computational and statistical problem but has important applications both to basic research and to the discovery of genotype-phenotype correlations. In this study, we present a novel approach to inferring human evolutionary history from genetic variation data. Our approach uses the idea of consensus trees, a technique generally used to reconcile species trees from divergent gene trees, adapting it to the problem of finding the robust relationships within a set of intraspecies phylogenies derived from local regions of the genome. We assess the quality of the method on two large-scale genetic variation data sets: the HapMap Phase II and the Human Genome Diversity Project. Qualitative comparison to a consensus model of the evolution of modern human population groups shows that our inferences closely match our best current understanding of human evolutionary history. A further comparison with results of a leading method for the simpler problem of population substructure assignment verifies that our method provides comparable accuracy in identifying meaningful population subgroups in addition to inferring the relationships among them.
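The consensus-tree idea at the core of this approach can be illustrated with a minimal majority-rule consensus over clades, each tree represented as a set of frozensets of taxon labels. This is a toy sketch of the general technique, not the authors' adaptation to intraspecies phylogenies:

```python
from collections import Counter

def majority_consensus(trees, threshold=0.5):
    """Majority-rule consensus: keep the clades present in more than
    `threshold` of the input trees; each tree is a set of clades, and
    each clade a frozenset of taxon labels."""
    counts = Counter(clade for tree in trees for clade in tree)
    n = len(trees)
    return {clade for clade, c in counts.items() if c / n > threshold}

# Toy example: three local phylogenies over four hypothetical populations.
t1 = {frozenset({"A", "B"}), frozenset({"A", "B", "C"})}
t2 = {frozenset({"A", "B"}), frozenset({"C", "D"})}
t3 = {frozenset({"A", "B"}), frozenset({"A", "B", "C"})}
consensus = majority_consensus([t1, t2, t3])
```

Here {A, B} and {A, B, C} survive into the consensus (present in 3/3 and 2/3 of the trees), while {C, D} (1/3) is discarded; the robust relationships are exactly those shared across the local genealogies.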
Directory of Open Access Journals (Sweden)
Simon Boitard
2016-03-01
Full Text Available Inferring the ancestral dynamics of effective population size is a long-standing question in population genetics, which can now be tackled much more accurately thanks to the massive genomic data available in many species. Several promising methods that take advantage of whole-genome sequences have been recently developed in this context. However, they can only be applied to rather small samples, which limits their ability to estimate recent population size history. Besides, they can be very sensitive to sequencing or phasing errors. Here we introduce a new approximate Bayesian computation approach named PopSizeABC that allows estimating the evolution of the effective population size through time, using a large sample of complete genomes. This sample is summarized using the folded allele frequency spectrum and the average zygotic linkage disequilibrium at different bins of physical distance, two classes of statistics that are widely used in population genetics and can be easily computed from unphased and unpolarized SNP data. Our approach provides accurate estimations of past population sizes, from the very first generations before present back to the expected time to the most recent common ancestor of the sample, as shown by simulations under a wide range of demographic scenarios. When applied to samples of 15 or 25 complete genomes in four cattle breeds (Angus, Fleckvieh, Holstein and Jersey), PopSizeABC revealed a series of population declines, related to historical events such as domestication or modern breed creation. We further highlight that our approach is robust to sequencing errors, provided summary statistics are computed from SNPs with common alleles.
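The ABC rejection scheme underlying methods like PopSizeABC can be sketched generically: draw parameters from the prior, simulate summary statistics under each draw, and keep the draws whose statistics lie closest to the observed ones. The one-statistic toy model below is purely illustrative; the real method uses the folded allele frequency spectrum and linkage disequilibrium statistics.

```python
import numpy as np

def abc_rejection(observed, simulate, prior_sample, n_sims=5000, accept_frac=0.01):
    """Generic ABC rejection: keep the prior draws whose simulated summary
    statistics lie closest (Euclidean distance) to the observed ones."""
    params = np.array([prior_sample() for _ in range(n_sims)])
    stats = np.array([simulate(p) for p in params])
    dist = np.linalg.norm(stats - observed, axis=1)
    keep = np.argsort(dist)[: max(1, int(accept_frac * n_sims))]
    return params[keep]

# Toy model: infer a constant effective population size N from one made-up
# summary statistic with expectation N / (N + 1000).
rng = np.random.default_rng(2)
simulate = lambda N: np.array([N / (N + 1000.0) + rng.normal(0.0, 0.005)])
prior_sample = lambda: rng.uniform(100.0, 10000.0)
observed = np.array([2000.0 / 3000.0])   # generated with N = 2000
posterior = abc_rejection(observed, simulate, prior_sample)
```

The accepted draws approximate the posterior; in practice a coalescent simulator replaces the toy `simulate`, and the parameter is a vector of population sizes per time window rather than a single constant.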
Fortunato, Laura
2011-02-01
Linguists and archaeologists have used reconstructions of early Indo-European residence strategies to constrain hypotheses about the homeland and trajectory of dispersal of Indo-European languages; however, these reconstructions are largely based on unsystematic and ahistorical use of the linguistic and ethnographic evidence, coupled with substantial bias in interpretation. Here I use cross-cultural data in a phylogenetic comparative framework to reconstruct the pattern of change in residence strategies in the history of societies speaking Indo-European languages. The analysis provides evidence in support of prevailing virilocality with alternative neolocality for Proto-Indo-European, and that this pattern may have extended back to Proto-Indo-Hittite. These findings bolster interpretations of the archaeological evidence that emphasize the "non-matricentric" structure of early Indo-European society; however, they also counter the notion that early Indo-European society was strongly "patricentric." I discuss implications of these findings in the context of the archaeological and genetic evidence on prehistoric social organization.
Directory of Open Access Journals (Sweden)
Thomas Katagis
2014-06-01
Full Text Available In this study, the capability of geographic object-based image analysis (GEOBIA) in the reconstruction of the recent fire history of a typical Mediterranean area was investigated. More specifically, a semi-automated GEOBIA procedure was developed and tested on archived and newly acquired Landsat Multispectral Scanner (MSS), Thematic Mapper (TM), and Operational Land Imager (OLI) images in order to accurately map burned areas in the Mediterranean island of Thasos. The developed GEOBIA ruleset was built with the use of the TM image and then applied to the other two images. This process of transferring the ruleset did not require substantial adjustments or any replacement of the initially selected features used for the classification, thus displaying reduced complexity in processing the images. As a result, burned area maps of very high accuracy (over 94% overall) were produced. In addition to the standard error matrix, the employment of additional measures of agreement between the produced maps and the reference data revealed that "spatial misplacement" was the main source of classification error. It can be concluded that the proposed approach can potentially be used for reconstructing the recent (40-year) fire history in the Mediterranean, based on extended time series of Landsat or similar data.
Xu, Yunfei; Dass, Sarat; Maiti, Tapabrata
2016-01-01
This brief introduces a class of problems and models for the prediction of the scalar field of interest from noisy observations collected by mobile sensor networks. It also introduces the problem of optimal coordination of robotic sensors to maximize the prediction quality subject to communication and mobility constraints either in a centralized or distributed manner. To solve such problems, fully Bayesian approaches are adopted, allowing various sources of uncertainties to be integrated into an inferential framework effectively capturing all aspects of variability involved. The fully Bayesian approach also allows the most appropriate values for additional model parameters to be selected automatically by data, and the optimal inference and prediction for the underlying scalar field to be achieved. In particular, spatio-temporal Gaussian process regression is formulated for robotic sensors to fuse multifactorial effects of observations, measurement noise, and prior distributions for obtaining the predictive di...
Singh, Gurmeet; Nguyen, Thanh; Kressler, Bryan; Spincemaille, Pascal; Raj, Ashish; Zabih, Ramin; Wang, Yi
2006-01-01
High resolution 3D coronary artery MR angiography is time-consuming and can benefit from accelerated data acquisition provided by parallel imaging techniques without sacrificing spatial resolution. Currently, popular maximum likelihood based parallel imaging reconstruction techniques such as the SENSE algorithm offer this advantage at the cost of reduced signal-to-noise ratio (SNR). Maximum a posteriori (MAP) reconstruction techniques that incorporate globally smooth priors have been developed to recover this SNR loss, but they tend to blur sharp edges in the target image. The objective of this study is to demonstrate the feasibility of employing edge-preserving Markov random field priors in a MAP reconstruction framework, which can be solved efficiently using a graph cuts based optimization algorithm. The preliminary human study shows that our reconstruction provides significantly better SNR than the SENSE reconstruction performed by a commercially available scanner for navigator gated steady state free precession 3D coronary magnetic resonance angiography images (n = 4).
Oddi, Facundo; Ghermandi, Luciana; Lasaponara, Rosa
2014-05-01
Fire recurrently affects many terrestrial ecosystems, with major implications for the structure and dynamics of vegetation. In fire-prone regions, it is particularly important to know the fire regime, for which precise fire records are needed. Dendroecology offers the possibility of obtaining fire occurrence data from woody species and has been widely used in forest ecosystems for fire research. Grasslands have no trees, but shrubs could be used to acquire dendroecological information in order to reconstruct fire history at the landscape scale. We studied the dendroecological potential of the shrub F. imbricata to reconstruct fire history at the landscape scale in a fire-prone grassland of northwestern Patagonia. To do this, we combined spatio-temporal information on recorded fires within the study area with the age structure of F. imbricata shrublands derived by dendroecology. Sampling sites were located over 2500 ha in San Ramón ranch, 30 km east of Bariloche, Río Negro province, Argentina (latitude -41° 04'; longitude -70° 51'). Shrubland age structure correctly described how fires occurred in the past. Pulses of individual recruitment were associated with fire in time and space. A bivariate analysis showed that F. imbricata recruits individuals during the two years after fire, and the spatial distribution of pulses coincided with the fire map. In sites without fire data, the age structure allowed the identification of two additional fires. Our results show that the shrub F. imbricata can be employed with other data sources, such as remote sensing and operational databases, to improve knowledge of the fire regime in northwestern Patagonia grasslands. In conclusion, we raise the possibility of utilizing shrubs as a dendroecological data source to study fire history in grasslands where tree cover is absent.
Alperson-Afil, Nira
2012-07-01
Concepts that are common in the reconstruction of fire histories are employed here for the purpose of interpreting fires identified at archaeological sites. When attempting to evaluate the fire history of ancient occupations, we are limited by the amount and quality of the available data. Furthermore, the identification of archaeological burned materials, such as stone, wood, and charcoal, is adequate for the general assumption of a "fire history", but the agent responsible - anthropogenic or natural - cannot be inferred from the mere presence of burned items. The large body of scientific data that has accumulated, primarily through efforts to prevent future fire disasters, enables us to reconstruct scenarios of past natural fires. Adopting this line of thought, this paper attempts to evaluate the circumstances in which a natural fire may have ignited and spread at the 0.79 Ma occupation site of Gesher Benot Ya'aqov (Israel), resulting in burned wood and burned flint within the archaeological layers. At Gesher Benot Ya'aqov, possible remnants of hearths are explored through analyses of the spatial distribution of burned flint-knapping waste products. These occur in dense clusters in each of the archaeological occupations throughout the long stratigraphic sequence. In this study, the combination of the spatial analysis results, paleoenvironmental information, and the various factors involved in the complex process of fire ignition, combustion, and behavior has enabled the firm rejection of recurrent natural fires as the agent responsible for the burned materials. In addition, it suggests that, mainly at early sites where evidence for burning is present yet scarce, data on fire ecology can be particularly useful when considered in relation to paleoenvironmental information.
Directory of Open Access Journals (Sweden)
Wei Song Hwang
Full Text Available Assassin bugs are one of the most successful clades of predatory animals based on their species numbers (∼6,800 spp.) and wide distribution in terrestrial ecosystems. Various novel prey capture strategies and remarkable prey specializations contribute to their appeal as a model to study evolutionary pathways involved in predation. Here, we reconstruct the most comprehensive reduviid phylogeny (178 taxa, 18 subfamilies) to date based on molecular data (5 markers). This phylogeny tests current hypotheses on reduviid relationships, emphasizing the polyphyletic Reduviinae and the blood-feeding, disease-vectoring Triatominae, and allows us, for the first time in assassin bugs, to reconstruct ancestral states of prey associations and microhabitats. Using a fossil-calibrated molecular tree, we estimated divergence times for key events in the evolutionary history of Reduviidae. Our results indicate that the polyphyletic Reduviinae fall into 11-14 separate clades. Triatominae are paraphyletic with respect to the reduviine genus Opisthacidius in the maximum likelihood analyses; this result is in contrast to prior hypotheses that found Triatominae to be monophyletic or polyphyletic, and may be due to the more comprehensive taxon and character sampling in this study. The evolution of blood-feeding may thus have occurred once or twice independently among predatory assassin bugs. All prey specialists evolved from generalist ancestors, with multiple evolutionary origins of termite and ant specializations. A bark-associated life style on tree trunks is ancestral for most of the lineages of Higher Reduviidae; living on foliage has evolved at least six times independently. Reduviidae originated in the Middle Jurassic (178 Ma), but significant lineage diversification only began in the Late Cretaceous (97 Ma). The integration of molecular phylogenetics with fossil and life history data as presented in this paper provides insights into the evolutionary history of
Reconstruction of Galaxy Star Formation Histories through SED Fitting: The Dense Basis Approach
Iyer, Kartheik; Gawiser, Eric J.
2017-01-01
The standard assumption of a simplified parametric form for galaxy Star Formation Histories (SFHs) during Spectral Energy Distribution (SED) fitting biases estimations of physical quantities (Stellar Mass, SFR, age) and underestimates their true uncertainties. Here, we describe the Dense Basis formalism, which uses an atlas of well-motivated basis SFHs to provide robust reconstructions of galaxy SFHs and provides estimates of previously inaccessible quantities like the number of star formation episodes in a galaxy's past. We train and validate the method using a sample of realistic SFHs at z=1 drawn from current Semi Analytic Models and Hydrodynamical simulations, as well as SFHs generated using a stochastic prescription. We then apply the method on ~1100 CANDELS galaxies at 1high S/N SEDs for the N~O(10^8) galaxies from the upcoming generation of surveys including LSST, HETDEX and J-PAS.
Avanzo, Salvatore; Barbera, Roberto; de Mattia, Francesco; Rocca, Giuseppe La; Sorrentino, Mariapaola; Vicinanza, Domenico
ASTRA (Ancient instruments Sound/Timbre Reconstruction Application) is a project coordinated at the Conservatory of Music of Parma which aims to bring history to life. Ancient musical instruments can now be heard for the first time in hundreds of years, thanks to the successful synergy between the arts/humanities and science. The Epigonion, an instrument of the past, has been digitally recreated using gLite, an advanced middleware developed in the context of the EGEE project, and research networks such as GÉANT2 in Europe and EUMEDCONNECT2 in the Mediterranean region. GÉANT2 and EUMEDCONNECT2, by connecting enormous and heterogeneous computing resources, provided the infrastructure needed to speed up the overall computation and enable the compute-intensive modeling of musical sounds. This paper summarizes the most recent outcomes of the project, underlining how the Grid aspect of the computation can support the Cultural Heritage community.
Teo, Thomas
2013-02-01
After suggesting that all psychologies contain indigenous qualities and discussing differences and commonalities between German and North American historiographies of psychology, an indigenous reconstruction of German critical psychology is presented. It is argued that German critical psychology can be understood as a backlash against American psychology, as a response to the Americanization of German psychology after WWII, against the background of the history of German psychology, the academic impact of the Cold War, and the trajectories of personal biographies and institutions. Using an intellectual-historical perspective, it is shown how, and which, indigenous dimensions played a role in the development of German critical psychology, as well as the limitations of such a historical approach. Expanding from German critical psychology, the role of the critique of American psychology in various contexts around the globe is discussed in order to emphasize the relevance of indigenous historical research.
Langmore, Ian; Davis, Anthony B.; Bal, Guillaume; Marzouk, Youssef M.
2012-01-01
We describe a method for accelerating a 3D Monte Carlo forward radiative transfer model to the point where it can be used in a new kind of Bayesian retrieval framework. The remote sensing challenge is to detect and quantify a chemical effluent of a known absorbing gas produced by an industrial facility in a deep valley. The available data are a single low-resolution noisy image of the scene in the near IR at an absorbing wavelength for the gas of interest. The detected sunlight has been multiply reflected by the variable terrain and/or scattered by an aerosol that is assumed partially known and partially unknown. We thus introduce a new class of remote sensing algorithms, best described as "multi-pixel" techniques, that necessarily call for a 3D radiative transfer model (but are demonstrated here in 2D); they can be added to conventional techniques that typically exploit multi- or hyper-spectral data, sometimes with multi-angle capability, with or without information about polarization. The novel Bayesian inference methodology adaptively exploits, with efficiency in mind, the fact that a Monte Carlo forward model has a known and controllable uncertainty depending on the number of sun-to-detector paths used.
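The key property exploited here, that a Monte Carlo forward model's uncertainty is known and shrinks as the inverse square root of the number of sun-to-detector paths, can be illustrated with a toy estimator. This is a hypothetical sketch, not the authors' radiative transfer code; the absorption probability and path model are invented for illustration:

```python
import math
import random

def mc_radiance_estimate(n_paths, absorb_prob=0.3, seed=0):
    """Toy Monte Carlo forward model: each simulated sun-to-detector path
    survives absorption with probability (1 - absorb_prob) and contributes
    its transmitted weight to the pixel radiance estimate."""
    rng = random.Random(seed)
    weights = [1.0 if rng.random() > absorb_prob else 0.0 for _ in range(n_paths)]
    mean = sum(weights) / n_paths
    # Sample standard error: the "known and controllable" uncertainty,
    # shrinking as 1/sqrt(n_paths).
    var = sum((w - mean) ** 2 for w in weights) / (n_paths - 1)
    stderr = math.sqrt(var / n_paths)
    return mean, stderr

m1, se1 = mc_radiance_estimate(1_000)
m2, se2 = mc_radiance_estimate(100_000)
```

Increasing the path budget by a factor of 100 cuts the reported standard error by about a factor of 10, which is what lets an adaptive inference scheme spend paths only where the posterior needs precision.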
A general reconstruction of the recent expansion history of the universe
Vitenti, S D P
2015-01-01
Distance measurements are currently the most powerful tool to study the expansion history of the universe without specifying its matter content nor any theory of gravitation. Assuming only an isotropic, homogeneous and flat universe, in this work we introduce a model-independent method to reconstruct directly the deceleration function via a piecewise function. Including a penalty factor, we are able to vary continuously the complexity of the deceleration function from a linear case to an arbitrary $(n+1)$-knots spline interpolation. We carry out a Monte Carlo analysis to determine the best penalty factor, evaluating the bias-variance trade-off, given the uncertainties of the SDSS-II and SNLS supernova combined sample (JLA), compilations of baryon acoustic oscillation (BAO) and $H(z)$ data. We show that, evaluating a single fiducial model, the conclusions about the bias-variance ratio are misleading. We determine the reconstruction method in which the bias represents at most $10\\%$ of the total uncertainty. In...
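The penalized reconstruction described above can be sketched in miniature: represent the deceleration function by its values at spline knots and add a second-difference roughness penalty, whose weight tunes the bias-variance trade-off. This is a minimal illustration, not the authors' pipeline; the knot placement, penalty form, and test function are assumptions:

```python
import numpy as np

def fit_piecewise_linear(x, y, knots, lam=1.0):
    """Penalized least squares: represent f(x) by values at knots with
    linear interpolation between them; a second-difference (roughness)
    penalty with weight lam shrinks the fit toward a straight line,
    mimicking the penalty factor's control over model complexity."""
    n, k = len(x), len(knots)
    A = np.zeros((n, k))
    for i, xi in enumerate(x):
        j = min(max(np.searchsorted(knots, xi) - 1, 0), k - 2)
        t = (xi - knots[j]) / (knots[j + 1] - knots[j])
        A[i, j], A[i, j + 1] = 1 - t, t
    D = np.diff(np.eye(k), n=2, axis=0)  # second-difference operator
    coef = np.linalg.solve(A.T @ A + lam * D.T @ D, A.T @ y)
    return coef  # fitted values at the knots

rng = np.random.default_rng(1)
x = np.linspace(0, 1, 200)
y = 0.5 - 1.2 * x + rng.normal(0, 0.05, x.size)  # noisy linear "q(z)" stand-in
knots = np.linspace(0, 1, 6)
f = fit_piecewise_linear(x, y, knots, lam=10.0)
```

Because a linear function has zero second differences on equally spaced knots, a large `lam` biases the reconstruction toward the linear case while small `lam` recovers the full (n+1)-knot flexibility, which is the trade-off the paper quantifies with its Monte Carlo analysis.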
Confederate Immigration to Brazil: A Cross-Cultural Approach to Reconstruction and Public History
Directory of Open Access Journals (Sweden)
Karina Esposito
2015-12-01
Full Text Available Given the interconnectedness of the contemporary world, it is imperative that historians place their studies within a global context, connecting domestic and foreign events in order to offer a thorough picture of the past. As historians, we should aim at exploring transnational connections in our published research and incorporating the same methodologies in the classroom, as well as in the field of Public History. Cross-cultural collaboration and transnational studies are challenging, but exceptionally effective approaches to developing a comprehensive understanding of the past and connecting people to their history. Important recent scholarship has placed the American Civil War in a broad international and transnational context. This article argues for the importance of continuing this trend, pointing to a unique case study: the confederate migration to Brazil during and after the Civil War. This episode can help us understand the international impact of the War in the western hemisphere. These confederates attempted to preserve some aspects of their Southern society by migrating to Brazil, one of the remaining slaveholding societies in the hemisphere at the time. Moreover, the descendants that remained in Brazil have engaged in a unique process of remembering and commemorating their heritage over the years. Exploring this migration will enhance Civil War and Reconstruction historiography, as well as commemoration, heritage and memory studies.
Reconstructing a multi-centennial drought history for the British Isles
Macdonald, Neil; Chiverrell, Richard; Todd, Beverley; Bowen, James; Lennard, Amy
2016-04-01
The last two decades have witnessed some of the most severe droughts experienced within living memory in the UK, but have these droughts really been exceptional? Relatively few instrumental river flow, groundwater or reservoir series extend beyond 50 years in length, and few of the precipitation series currently available extend over 100 years. These relatively short series present considerable challenges in determining current and future drought risk, with the results affecting society and the economy. This study uses long instrumental precipitation series coupled with the SPI and scPDSI drought indices to reconstruct drought histories for different parts of the British Isles. Existing long precipitation series have been reassessed and several new precipitation series reconstructed (e.g. Carlisle, 1757-), with eight series now over 200 years in length and a further thirteen over 150 years, and further sites currently being developed (e.g. Norwich, 1749-; Exeter, 1724-). This study focuses on the eight longest series, with shorter series used to help explore spatial and temporal variability across the British Isles. We show how historical series have improved understanding of severe droughts by examining past spatial and temporal variability and exploring the climatic drivers responsible. Recent well-documented droughts (e.g. 1976, 1996 and 2010), which have shaped both public and water resource managers' perceptions of risk, have historically been exceeded in both severity (e.g. 1781) and duration (e.g. 1798-1810), with the largest droughts often transcending single catchments and affecting whole regions. Recent droughts are not exceptional when considered on a multi-centennial timescale, and improved understanding of historical events raises concerns for contemporary water resource management.
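A minimal sketch of a drought index in the spirit of the SPI: operational SPI fits a gamma distribution to precipitation accumulations before mapping them to standard normal scores. The nonparametric, rank-based variant below (with invented rainfall totals) illustrates the transformation using only the standard library:

```python
from statistics import NormalDist

def empirical_spi(precip):
    """Simplified, nonparametric SPI: rank each precipitation accumulation,
    convert the rank to a plotting-position probability, then map it through
    the inverse standard normal CDF. (Operational SPI fits a gamma
    distribution instead; this rank-based variant is an illustrative
    stand-in, not the study's implementation.)"""
    n = len(precip)
    order = sorted(range(n), key=lambda i: precip[i])
    spi = [0.0] * n
    nd = NormalDist()
    for rank, i in enumerate(order, start=1):
        p = (rank - 0.44) / (n + 0.12)  # Gringorten plotting position
        spi[i] = nd.inv_cdf(p)
    return spi

rain = [82, 95, 60, 110, 30, 75, 88, 101, 55, 70, 120, 65]  # invented totals (mm)
scores = empirical_spi(rain)
```

Strongly negative scores flag drought periods; on a 250-year series the same transformation lets events such as 1781 be compared directly against 1976 on a common standardized scale.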
Jelinek, A. R.; Chemale, F., Jr.
2012-12-01
In this work we deal with the Phanerozoic history of the Southern Mantiqueira Province and adjacent areas after the orogenic collapse of the Brasiliano orogenic mountains in southern Brazil and Uruguay, based on thermochronological data (fission track and U-Th/He on apatite) and thermal history modelling. During the Paleozoic, intraplate sedimentary basins formed mainly bordering the orogenic systems, and thus these regions have not been overprinted by younger orogenic processes. In the Meso-Cenozoic this region was affected by later fragmentation and dispersal due to the separation of South America and Africa. The denudation history of both margins, quantified on the basis of thermal history modelling of apatite fission track thermochronology, indicates that the margin of southeastern Brazil and Uruguay experienced a minimum of 3.5 to 4.5 km of denudation, which included the main exposure area of the Brasiliano orogenic belts and adjacent areas. The Phanerozoic evolution of West Gondwana is thus recorded first by the orogenic collapses of the Brasiliano and Pan-African belts, which at that time formed a single mountain system in the Cambrian-Ordovician period. Subsequently, intraplate basins formed, such as the Paraná in southeastern Brazil and the Congo, with some records of the Table Mountain Group and the upper section of Karoo units in southwestern Africa. In the Permo-Triassic period, the collision of the Cape Fold Belt and Sierra de la Ventana Belt at the margins of the West Gondwana supercontinent resulted in elastic deformation of the cratonic areas, where intraplate basin deposition occurred, as well as subsidence and uplift of the already established Pan-African-Brasiliano belts. Younger denudation events, due to continental margin uplift and basin subsidence, occurred during the rifting and dispersal of the South American and African plates, which can be very well defined by the integration of the passive-margin sedimentation of the Pelotas and Santos basins and apatite fission
Olive, Marie-Marie; Grosbois, Vladimir; Tran, Annelise; Nomenjanahary, Lalaina Arivony; Rakotoarinoro, Mihaja; Andriamandimby, Soa-Fy; Rogier, Christophe; Heraud, Jean-Michel; Chevalier, Veronique
2017-01-01
The force of infection (FOI) is one of the key parameters describing the dynamics of transmission of vector-borne diseases. Following the occurrence of two major outbreaks of Rift Valley fever (RVF) in Madagascar in 1990–91 and 2008–09, recent studies suggest that the pattern of RVF virus (RVFV) transmission differed among the four main eco-regions (East, Highlands, North-West and South-West). Using Bayesian hierarchical models fitted to serological data from cattle of known age collected during two surveys (2008 and 2014), we estimated the RVF FOI and described its variations over time and space in Madagascar. We show that the patterns of RVFV transmission differed strongly among the eco-regions. In the North-West and Highlands regions, these patterns were synchronous, with high intensity in mid-2007/mid-2008. In the East and South-West, the peaks of transmission came later, between mid-2008 and mid-2010. In the warm and humid northwestern eco-region, favorable to mosquito populations, RVFV is probably transmitted year-round at a low level during inter-epizootic periods, allowing its maintenance, and is regularly introduced into the Highlands through ruminant trade. RVF surveillance of animals in the northwestern region could therefore be used as an early warning indicator of an increased risk of RVF outbreak in Madagascar. PMID:28051125
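The FOI concept can be illustrated with the simplest age-seroprevalence ("catalytic") model, in which a constant force of infection λ gives P(seropositive at age a) = 1 − exp(−λa). The sketch below fits λ by grid-search maximum likelihood to invented cattle serology counts; it is a didactic stand-in for the paper's time- and space-varying Bayesian hierarchical models:

```python
import math

def fit_constant_foi(ages, n_pos, n_tot):
    """Catalytic model: P(seropositive | age a) = 1 - exp(-lam * a).
    Fit the force of infection lam by maximizing the binomial
    log-likelihood over a grid (data below are illustrative, not the
    paper's survey counts)."""
    def loglik(lam):
        ll = 0.0
        for a, k, n in zip(ages, n_pos, n_tot):
            p = min(max(1 - math.exp(-lam * a), 1e-12), 1 - 1e-12)
            ll += k * math.log(p) + (n - k) * math.log(1 - p)
        return ll
    grid = [i / 1000 for i in range(1, 1001)]  # lam in (0, 1] per year
    return max(grid, key=loglik)

ages  = [1, 2, 4, 6, 8]            # cattle age groups (years)
n_tot = [100, 100, 100, 100, 100]  # animals tested per group
# Seropositive counts consistent with lam = 0.15 (expected values, rounded)
n_pos = [14, 26, 45, 59, 70]
lam_hat = fit_constant_foi(ages, n_pos, n_tot)
```

Older animals have had longer exposure, so seroprevalence rising with age pins down λ; allowing λ to vary by year and eco-region, as the paper does, is what reveals the asynchronous transmission peaks.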
Quantitative study on pollen-based reconstructions of vegetation history from central Canada
Institute of Scientific and Technical Information of China (English)
YU Ge; HART Catherina; VETTER Mary; SAUCHYN David
2008-01-01
Based on high-resolution pollen records from lake cores in central Canada, the present study assigned pollen taxa to ecosystem groups, applied the modern analogue technique, reported the major results of quantitative reconstructions of vegetation history during the last 1000 years, and discussed the validation of the simulated vegetation. The results showed that in central North America (115°-95°W, 40°-60°N) the best analogue of the modern vegetation is 81% for boreal forest, 72% for parkland, and 94% for grassland-parkland, consistent with the vegetation distributions of the North American Ecosystem II. Simulations of past vegetation from the sedimentary pollen reflect climate changes during the past 1000 years: it was warm and dry in the Medieval Warm Period, cold and wet in the earlier part and cold and dry in the later part of the Little Ice Age, and warming and drought increased markedly in the 20th century. The present study provides a scientific basis for understanding vegetation and climate changes during the last 1000 years in a characteristic region and on 10-100 year timescales.
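The modern analogue technique at the core of such reconstructions matches a fossil pollen spectrum to the closest modern spectrum under a dissimilarity metric, commonly the squared-chord distance. A minimal sketch with hypothetical spectra (the taxa and proportions below are invented for illustration):

```python
import math

def squared_chord(p, q):
    """Squared-chord distance between two pollen spectra given as
    proportions; smaller values mean closer analogues."""
    return sum((math.sqrt(a) - math.sqrt(b)) ** 2 for a, b in zip(p, q))

def best_analogue(fossil, modern_samples):
    """Return the label of the modern sample closest to the fossil spectrum."""
    return min(modern_samples,
               key=lambda name: squared_chord(fossil, modern_samples[name]))

# Hypothetical spectra: proportions of (Picea, Pinus, Poaceae, Artemisia)
modern = {
    "boreal forest":      [0.50, 0.35, 0.10, 0.05],
    "parkland":           [0.20, 0.30, 0.35, 0.15],
    "grassland-parkland": [0.05, 0.15, 0.55, 0.25],
}
fossil = [0.45, 0.38, 0.12, 0.05]
match = best_analogue(fossil, modern)  # → "boreal forest"
```

Applying this match level by level down a core yields a vegetation-type time series, which is then read as a climate proxy in the way the abstract describes.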
Effort to reconstruct past population history in the fern Blechnum spicant.
Korpelainen, Helena; Pietiläinen, Maria
2008-05-01
Our aim was to evaluate the potential of existing herbarium collections for reconstructing the past population history of the rediscovered clone of the regionally extinct Blechnum spicant by comparing it with herbarium material from previous sites of occurrence in Finland and elsewhere in northern Europe, as well as to reveal the genetic and geographic relationships among the samples. We detected a total of nine polymorphic sites within the three sequenced regions, two SCAR markers developed here for B. spicant and the chloroplast trnL-trnF spacer, totalling 763 bp. Despite the low variability, the phylogeographic analysis revealed some geographic pattern among the samples, all but one of which were herbarium samples of up to 100 years of age. The rediscovered clone of B. spicant proved to represent the prevalent genotype occurring in northern Europe. The studied sequences were neutral in terms of evolution. It is apparent that existing herbarium collections are useful resources for a range of evolutionary and population studies.
Energy Technology Data Exchange (ETDEWEB)
Burnier, Yannis [Institut de Théorie des Phénomènes Physiques, Ecole Polytechnique Fédérale de Lausanne, CH-1015, Lausanne (Switzerland); Kaczmarek, Olaf [Fakultät für Physik, Universität Bielefeld, D-33615 Bielefeld (Germany); Rothkopf, Alexander [Institute for Theoretical Physics, Heidelberg University, Philosophenweg 16, D-69120 Heidelberg (Germany)
2016-01-22
We report recent results of a non-perturbative determination of the static heavy-quark potential in quenched and dynamical lattice QCD at finite temperature. The real and imaginary part of this complex quantity are extracted from the spectral function of Wilson line correlators in Coulomb gauge. To obtain spectral information from Euclidean time numerical data, our study relies on a novel Bayesian prescription that differs from the Maximum Entropy Method. We perform simulations on quenched 32{sup 3} × N{sub τ} (β = 7.0, ξ = 3.5) lattices with N{sub τ} = 24, …, 96, which cover 839MeV ≥ T ≥ 210MeV. To investigate the potential in a quark-gluon plasma with light u,d and s quarks we utilize N{sub f} = 2 + 1 ASQTAD lattices with m{sub l} = m{sub s}/20 by the HotQCD collaboration, giving access to temperatures between 286MeV ≥ T ≥ 148MeV. The real part of the potential exhibits a clean transition from a linear, confining behavior in the hadronic phase to a Debye screened form above deconfinement. Interestingly its values lie close to the color singlet free energies in Coulomb gauge at all temperatures. We estimate the imaginary part on quenched lattices and find that it is of the same order of magnitude as in hard-thermal loop perturbation theory. From among all the systematic checks carried out in our study, we discuss explicitly the dependence of the result on the default model and the number of datapoints.
Bayesian phylogeography finds its roots.
Directory of Open Access Journals (Sweden)
Philippe Lemey
2009-09-01
Full Text Available As a key factor in endemic and epidemic dynamics, the geographical distribution of viruses has been frequently interpreted in the light of their genetic histories. Unfortunately, inference of historical dispersal or migration patterns of viruses has mainly been restricted to model-free heuristic approaches that provide little insight into the temporal setting of the spatial dynamics. The introduction of probabilistic models of evolution, however, offers unique opportunities to engage in this statistical endeavor. Here we introduce a Bayesian framework for inference, visualization and hypothesis testing of phylogeographic histories. By implementing character mapping in Bayesian software that samples time-scaled phylogenies, we enable the reconstruction of timed viral dispersal patterns while accommodating phylogenetic uncertainty. Standard Markov model inference is extended with a stochastic search variable selection procedure that identifies the most parsimonious descriptions of the diffusion process. In addition, we propose priors that can incorporate geographical sampling distributions or characterize alternative hypotheses about the spatial dynamics. To visualize the spatial and temporal information, we summarize inferences using virtual globe software. We describe how Bayesian phylogeography compares with previous parsimony analyses in the investigation of the influenza A H5N1 origin and of H5N1 epidemiological linkage among sampling localities. Analysis of rabies in West African dog populations reveals how virus diffusion may enable endemic maintenance through continuous epidemic cycles. From these analyses, we conclude that our phylogeographic framework will be an important asset in molecular epidemiology that can easily be generalized to infer biogeography from genetic data for many organisms.
Hall, Matthew D; Woolhouse, Mark E J; Rambaut, Andrew
2016-01-01
The ongoing large-scale increase in the total amount of genetic data for viruses and other pathogens has led to a situation in which it is often not possible to include every available sequence in a phylogenetic analysis and expect the procedure to complete in reasonable computational time. This raises questions about how a set of sequences should be selected for analysis, particularly if the data are used to infer more than just the phylogenetic tree itself. The design of sampling strategies for molecular epidemiology has been a neglected field of research. This article describes a large-scale simulation exercise undertaken to select an appropriate strategy when using the GMRF skygrid, one of the Bayesian skyline family of coalescent methods, to reconstruct past population dynamics. The simulated scenarios were intended to represent sampling of the population of an endemic virus across multiple geographical locations. Large phylogenies were simulated under a coalescent or structured coalescent model and sequences simulated from these trees; the resulting datasets were then downsampled for analysis according to a variety of schemes. Variation in results between different replicates of the same scheme was considerable, and as a result we recommend that, where possible, analyses be repeated with different datasets in order to establish that elements of a reconstruction are not simply the result of the particular set of samples selected. We show that an individual stochastic choice of sequences can introduce spurious behaviour in the median line of the skygrid plot, and that even marginal likelihood estimation can suggest complicated dynamics that were not in fact present. We recommend that the median line should not be used to infer historical events on its own. Sampling sequences with uniform probability with respect to both time and spatial location (deme) never performed worse than sampling with probability proportional to the effective
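The recommended scheme, sampling sequences with uniform probability with respect to both time and deme, can be sketched as stratified downsampling that cycles over (time bin, deme) strata. This is an illustrative implementation with invented data, not the simulation code used in the study:

```python
import random
from collections import defaultdict

def uniform_time_deme_sample(sequences, k, seed=0):
    """Downsample to k sequences with uniform weight across (time bin, deme)
    strata: cycle over the strata, drawing one sequence from each in turn
    until the budget is spent, so no heavily sampled deme dominates."""
    rng = random.Random(seed)
    strata = defaultdict(list)
    for seq_id, time_bin, deme in sequences:
        strata[(time_bin, deme)].append(seq_id)
    for pool in strata.values():
        rng.shuffle(pool)
    chosen, keys = [], sorted(strata)
    while len(chosen) < k and any(strata[s] for s in keys):
        for s in keys:
            if strata[s] and len(chosen) < k:
                chosen.append(strata[s].pop())
    return chosen

# Hypothetical dataset: (id, year bin, deme); deme B is heavily oversampled
data = [(f"A{i}", 2000 + i % 3, "A") for i in range(6)] + \
       [(f"B{i}", 2000 + i % 3, "B") for i in range(60)]
subset = uniform_time_deme_sample(data, k=12)
```

Even though deme B contributes ten times as many candidate sequences, the stratified draw returns equal numbers from each deme and time bin, which is the behaviour the simulations found never performed worse than the alternatives.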
Directory of Open Access Journals (Sweden)
Stefano Zurrida
2011-01-01
Full Text Available Breast cancer is the most common cancer in women. Primary treatment is surgery, with mastectomy as the main treatment for most of the twentieth century. However, over that time, the extent of the procedure varied, and less extensive mastectomies are employed today compared to those used in the past, as excessively mutilating procedures did not improve survival. Today, many women receive breast-conserving surgery, usually with radiotherapy to the residual breast, instead of mastectomy, as it has been shown to be as effective as mastectomy in early disease. The relatively new skin-sparing mastectomy, often with immediate breast reconstruction, improves aesthetic outcomes and is oncologically safe. Nipple-sparing mastectomy is newer and used increasingly, with better acceptance by patients, and again appears to be oncologically safe. Breast reconstruction is an important adjunct to mastectomy, as it has a positive psychological impact on the patient, contributing to improved quality of life.
Directory of Open Access Journals (Sweden)
Giulia Carreras
2012-09-01
Full Text Available
Background: parameter uncertainty in the Markov model description of a disease course was addressed. Probabilistic sensitivity analysis (PSA) is now considered the only tool that properly permits the examination of parameter uncertainty. It consists of sampling values from the parameters' probability distributions.
Methods: Markov models fitted with microsimulation were considered, and methods for carrying out a PSA on transition probabilities were studied. Two Bayesian solutions were developed: for each row of the modeled transition matrix, the prior distribution was assumed to be a product of Betas or a Dirichlet. The two solutions differ in the source of information: several different sources for each transition in the Beta approach, and a single source for each transition from a given health state in the Dirichlet. The two methods were applied to a simple cervical cancer model.
Results: differences between posterior estimates from the two methods were negligible. The results showed that the prior variability highly influences the posterior distribution.
Conclusions: the novelty of this work is the Bayesian approach that integrates the two prior distributions with a product-of-Binomials likelihood. These methods could also be applied to cohort data, and their application to more complex models could be useful and unique in the cervical cancer context, as well as in other disease modeling.
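The row-wise Dirichlet draw used in such a PSA can be sketched with the standard library, using the fact that normalizing independent Gamma draws yields a Dirichlet sample. The transition counts and uniform prior below are illustrative assumptions, not the paper's cervical cancer data:

```python
import random

def sample_dirichlet(alpha, rng):
    """Draw one Dirichlet sample by normalizing independent Gamma draws."""
    draws = [rng.gammavariate(a, 1.0) for a in alpha]
    total = sum(draws)
    return [d / total for d in draws]

def sample_transition_matrix(counts, rng=None):
    """One PSA draw of a Markov transition matrix: each row gets an
    independent Dirichlet posterior, here formed from observed transition
    counts plus a uniform prior (alpha = counts + 1). Repeating this draw
    and re-running the model propagates transition-probability uncertainty
    into the outputs."""
    rng = rng or random.Random(0)
    return [sample_dirichlet([c + 1 for c in row], rng) for row in counts]

# Hypothetical counts of observed transitions between 3 health states
counts = [[80, 15, 5],
          [10, 70, 20],
          [0,   0, 50]]  # near-absorbing final state
P = sample_transition_matrix(counts)
```

Each sampled matrix has rows that are valid probability vectors by construction, which is the practical advantage of the Dirichlet approach over perturbing entries independently.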
Reconstructing the Solar Wind from Its Early History to Current Epoch
Airapetian, Vladimir S.; Usmanov, Arcadi V.
2016-02-01
Stellar winds from active solar-type stars can play a crucial role in the removal of stellar angular momentum and the erosion of planetary atmospheres. However, major wind properties except for mass-loss rates cannot be directly derived from observations. We employed a three-dimensional magnetohydrodynamic Alfvén wave driven solar wind model, ALF3D, to reconstruct the solar wind parameters including the mass-loss rate, terminal velocity, and wind temperature at 0.7, 2, and 4.65 Gyr. Our model treats the wind thermal electrons, protons, and pickup protons as separate fluids and incorporates turbulence transport, eddy viscosity, turbulent resistivity, and turbulent heating to properly describe proton and electron temperatures of the solar wind. To study the evolution of the solar wind, we specified three input model parameters, the plasma density, Alfvén wave amplitude, and the strength of the dipole magnetic field at the wind base, for each of three solar wind evolution models that are consistent with observational constraints. Our model results show that in the Sun's early history at 0.7 Gyr, the solar wind at 1 AU was twice as fast, ∼50 times denser and 2 times hotter than at present. The theoretical calculations of the mass-loss rate appear to be in agreement with the empirically derived values for stars of various ages. These results can provide realistic constraints for wind dynamic pressures on the magnetospheres of (exo)planets around the young Sun and other active stars, which is crucial for a realistic assessment of the Joule heating of their ionospheres and the corresponding effects of atmospheric erosion.
Significance of "stretched" mineral inclusions for reconstructing P-T exhumation history
Ashley, Kyle T.; Darling, Robert S.; Bodnar, Robert J.; Law, Richard D.
2015-06-01
Analysis of mineral inclusions in chemically and physically resistant hosts has proven to be valuable for reconstructing the P-T exhumation history of high-grade metamorphic rocks. The occurrence of cristobalite-bearing inclusions in garnets from Gore Mountain, New York, is unexpected because the peak metamorphic conditions reached are well removed (>600 °C too cold) from the stability field of this low-density silica polymorph, which typically forms in high-temperature volcanic environments. A previous study of samples from this area interpreted polymineralic inclusions consisting of cristobalite, albite and ilmenite as representing crystallized droplets of melt generated during a garnet-in reaction, followed by water loss from the inclusion to explain the reduction in inclusion pressure that drove the transformation of quartz to cristobalite. However, the recent discovery of monomineralic inclusions of cristobalite from the nearby Hooper Mine cannot be explained by this process. For these inclusions, we propose that the volume response to pressure and temperature changes during exhumation to Earth's surface resulted in large tensile stresses within the silica phase, sufficient to cause transformation to the low-density (low-pressure) form. Elastic modeling of other common inclusion-host systems suggests that this quartz-to-cristobalite example may not be a unique case. The aluminosilicate polymorph kyanite also has the capacity to retain tensile stresses if exhumed to Earth's surface after being trapped as an inclusion in plagioclase at P-T conditions within the kyanite stability field, with the stresses developed during exhumation sufficient to produce a transformation to andalusite. These results highlight the elastic environment that may arise during exhumation and provide a potential explanation for observed inclusions whose stability fields are well removed from the P-T paths followed during exhumation.
Gorostiza, Amaya; Acunha-Alonzo, Víctor; Regalado-Liu, Lucía; Tirado, Sergio; Granados, Julio; Sámano, David; Rangel-Villalobos, Héctor; González-Martín, Antonio
2012-01-01
Genetic information can be used to reconstruct the history of human populations. We sequenced the entire mtDNA control region (positions 16,024 to 576 following the Cambridge Reference Sequence, CRS) of 605 individuals from seven previously defined Mesoamerican indigenous groups and one Aridoamerican group from the Greater Southwest, all in present-day Mexico. Samples were collected directly from the indigenous populations; an individual survey made it possible to remove samples that were related or of other origins. Diversity indices and demographic estimates were calculated, and AMOVAs were computed according to different criteria. An MDS plot, based on FST distances, was also built. We constructed individual networks for the four Amerindian haplogroups detected. Finally, barrier software was applied to detect genetic boundaries among populations. The results suggest a common origin of the indigenous groups, a small degree of European admixture, and inter-ethnic gene flow. The human settlement of Mesoamerica took place quickly, influenced by the region's orography, which facilitated the development of genetic and cultural differences. We find that genetic structure is related to the region's geography rather than to cultural parameters such as language. The human population gradually became fragmented, though groups remained relatively isolated, and differentiated due to small population sizes and different survival strategies. Genetic differences were detected between Aridoamerica and Mesoamerica, which can be subdivided into "East", "Center", "West" and "Southeast". The fragmentation process occurred mainly during the Mesoamerican Pre-Classic period, with the Otomí being one of the oldest groups. With an increased number of populations studied adding previously published data, there is no change in the conclusions, although significant genetic heterogeneity can be detected in Pima
Heine, Christian; Müller, R. Dietmar; Gaina, Carmen
Plate tectonic reconstructions for the late Mesozoic-Cenozoic evolution of the eastern Tethyan Ocean Basin, separating eastern Gondwanaland from Proto-Southeast Asia, are usually based on geological data gathered from the different tectonic blocks accreted to Southeast Asia. However, this approach provides only limited constraints on the reconstruction of the eastern Tethys Ocean and the drift path of various terranes. We have used marine magnetic anomalies in the Argo and Gascoyne Abyssal Plains off the Australian Northwest Shelf, jointly with published geological data, to reconstruct the seafloor spreading history and plate tectonic evolution of the eastern Tethys and Proto-Indian Ocean basins for the time between 160 Ma and the present. Based on the assumption of symmetrical seafloor spreading and a hotspot-track-based plate reference frame, we have created a relative and absolute plate motion model and a series of oceanic paleo-age grids that show the evolution of Tethyan mid-ocean ridges and the convergence history along the southeast Asian margin through time. A thermal boundary layer model for oceanic lithosphere is used to compute approximate paleo-depths to oceanic basement to predict the opening and closing of oceanic gateways. The proposed model not only provides improved boundary conditions for paleoclimate reconstructions and modelling of oceanic currents through time, but also for understanding stress changes in the overriding plate and the formation of new accretionary crust along the Southeast Asian margin, driven by changing subduction parameters such as hinge rollback and slab dip.
Das, O.; Wang, Y.; Donoghue, J. F.; Coor, J. L.; Kish, S.; Elsner, J.; Hu, X. B.; Niedoroda, A. W.; Ye, M.; Xu, Y.
2009-12-01
Analysis of geochemical proxies of coastal lake sediments provides a useful tool for reconstructing paleostorm history. Such paleostorm records can help constrain models that are used to predict future storm events. In this study, we collected two sediment cores (60 and 103 cm long, respectively) from the center of Eastern Lake, located on the Gulf coast of NW Florida. These cores, which are mainly composed of organic-rich mud and organic-poor sand, were sub-sampled at 2-3 mm intervals for analyses of their organic carbon and nitrogen concentrations as well as δ13C and δ15N isotopic signatures. Selected samples were submitted for radiocarbon dating in order to establish a chronological framework for the interpretation of the geochemical data. There are significant variations in δ13C, δ15N, C%, N% and C/N with depth. The δ13C and δ15N values vary from -21.8‰ to -26.7‰ and 2.6‰ to 5‰, respectively. The stable isotopic signatures of carbon and nitrogen indicate that the sources of organic matter in the sediments include terrestrial C3-type vegetation, marine input from the Gulf of Mexico, and biological productivity within the lake, such as phytoplankton and zooplankton growing in the lacustrine environment. The δ13C and δ15N values exhibit significant negative excursions by 2‰ in a 30 cm thick sand layer, followed by a rapid return to baseline values. A positive shift in the δ15N record observed in the upper part of the cores likely reflects increased anthropogenic input of N, such as sewage or septic tank effluents, associated with recent development of areas around the lake for human habitation. Organic C% and N% range from 5.8 to 0.4 and 0.4 to 0.1, respectively. A prominent negative shift by 2σ relative to the baseline in C% and N% has been observed at approximately 55 to 58 cm depth, within an organic-poor sand layer. This shift in C% and N% can be correlated with the negative shift in the δ13C and δ15N values, indicating a major storm event
Reconstructing the tectonic history of Fennoscandia from its margins: The past 100 million years
Energy Technology Data Exchange (ETDEWEB)
Muir Wood, R. [EQE International Ltd (United Kingdom)
1995-12-01
In the absence of onland late Mesozoic and Cenozoic geological formations, the tectonic history of the Baltic Shield over the past 100 million years can be reconstructed from the thick sedimentary basins that surround Fennoscandia on three sides. Tectonic activity around Fennoscandia through this period has been diverse but can be divided into four main periods: a. pre North Atlantic spreading ridge (100-60 Ma), when transpressional deformation on the southern margins of Fennoscandia and transtensional activity to the west was associated with a NNE-SSW maximum compressive stress direction; b. the creation of the spreading ridge (60-45 Ma), when there was rifting along the western margin; c. the re-arrangement of spreading axes (45-25 Ma), when there was a radial compression around Fennoscandia; and d. the re-emergence of the Iceland hot-spot (25-0 Ma), when the stress-field has come to accord with ridge or plume `push`. Since 60 Ma the Alpine plate boundary has had little influence on Fennoscandia. The highest levels of deformation on the margins of Fennoscandia were achieved around 85 Ma and 60-55 Ma, with strain-rates around 10{sup -9}/year. Within the Baltic Shield, long term strain rates have been around 10{sup -11}/year, with little evidence for significant deformations passing into the shield from the margins. Fennoscandian Border Zone activity, which was prominent from 90-60 Ma, was largely abandoned following the creation of the Norwegian Sea spreading ridge, and with the exception of the Lofoten margin, there is subsequently little evidence for deformation passing into Fennoscandia. Renewal of modest compressional deformation in the Voering Basin suggests that the `Current Tectonic Regime` is of Quaternary age, although the orientation of the major stress axis has remained consistent since around 10 Ma. The past pattern of changes suggests that in the geological near-future variations are to be anticipated in the magnitude rather than the orientation of stresses.
A unique opportunity to reconstruct the volcanic history of the island of Nevis, Lesser Antilles
Saginor, I.; Gazel, E.
2012-12-01
We report twelve new ICP-MS analyses and two 40Ar/39Ar ages for the Caribbean island of Nevis, located in the Lesser Antilles. These data show a very strong fractionation trend, suggesting that along strike variations may be primarily controlled by the interaction of rising magma with the upper plate. If this fractionation trend is shown to correlate with age, it may suggest that underplating of the crust is responsible for variations in the makeup of erupted lava over time, particularly with respect to silica content. We have recently been given permission to sample a series of cores being drilled by a geothermal company with the goal of reconstructing the volcanic history of the island. Drilling is often cost-prohibitive, making this a truly unique opportunity. Nevis has received little recent attention from researchers due to the fact that it has not been active for at least 100,000 years and also because of its proximity to the highly active Montserrat, which boasts its very own volcano observatory. However, there are a number of good reasons that make this region and Nevis in particular an ideal location for further analysis. First, and most importantly, is the access to thousands of meters of drill cores that is being provided by a local geothermal company. Second, a robust earthquake catalog exists (Bengoubou-Valerius et al., 2008), so the dip and depth to the subducting slab is well known. These are fundamental parameters that influence the mechanics of a subduction zone, therefore it would be difficult to proceed if they were poorly constrained. Third, prior sampling of Nevis has been limited since Hutton and Nockolds (1978) published the only extensive petrologic study ever performed on the island. This paper contained only 43 geochemical analyses and 6 K-Ar ages, which are less reliable than more modern Ar-Ar ages. Subsequent studies tended to focus on water geochemistry (GeothermEx, 2005), geothermal potential (Geotermica Italiana, 1992; Huttrer, 1998
Reconstructing Iconic Experiments in Electrochemistry: Experiences from a History of Science Course
Eggen, Per-Odd; Kvittingen, Lise; Lykknes, Annette; Wittje, Roland
2012-01-01
The decomposition of water by electricity, and the voltaic pile as a means of generating electricity, have both held an iconic status in the history of science as well as in the history of science teaching. These experiments featured in chemistry and physics textbooks, as well as in classroom teaching, throughout the nineteenth and twentieth…
Stapert, Eugénie
2013-01-01
This study explores the role of linguistic data in the reconstruction of Dolgan (pre)history. While most ethno-linguistic groups have a longstanding history and a clear ethnic and linguistic affiliation, the formation of the Dolgans has been a relatively recent development, and their ethnic origins
DEFF Research Database (Denmark)
Brescó, Ignacio
2016-01-01
Innis’ and Brinkmann’s papers (this issue) tackle two key aspects in cultural psychology: the mediating role played by the different systems of meanings throughout history in making sense of the world, and the normative role of those systems, including psychology itself. This paper offers...... a reflection on these two issues. It begins by highlighting the contribution of psychology and history, as emerging disciplines in the 19th Century, to the creation of a normative framework for the subject of modernity according to the needs of modern nation states. It also alludes to both disciplines’ common...... pursuit of a reference point in natural science in order to achieve a status that is on a par with the latter’s. It is argued that this resulted in an objectivist stance that equates the study of memory and history with an accurate reproduction of the past, thus concealing the mediated nature of past...
Abdul Fattah, R.; Verweij, J.M.; Witmans, N.; Veen, J.H. ten
2012-01-01
3D basin modelling is used to investigate the history of maturation and hydrocarbon generation on the main platforms in the northwestern part of the offshore area of the Netherlands. The study area covers the Cleaverbank and Elbow Spit Platforms. Recently compiled maps and data are used to build the
Collin, Ross; Reich, Gabriel A.
2015-01-01
This article presents discourse analyses of two lesson plans designed for secondary school history classes. Although the plans focus on the same topic, they rely on different models of content area literacy: disciplinary literacy, or reading and writing like experts in a given domain, and critical literacy, or reading and writing to address…
Reconstruction of exposure histories of meteorites from Antarctica and the Sahara
Energy Technology Data Exchange (ETDEWEB)
Neupert, U.; Neumann, S.; Leya, I.; Michel, R. [Hannover Univ. (Germany). Zentraleinrichtung fuer Strahlenschutz (ZfS); Kubik, P.W. [Paul Scherrer Inst. (PSI), Villigen (Switzerland); Bonani, G.; Hajdas, I.; Suter, M. [Eidgenoessische Technische Hochschule, Zurich (Switzerland)
1997-09-01
{sup 10}Be, {sup 14}C, and {sup 26}Al were analyzed in H-, L-, and LL-chondrites from the Acfer region in the Algerian Sahara and from the Allan Hills/Antarctica. Exposure histories and terrestrial ages could be determined. (author) 3 figs., 2 refs.
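The exposure ages mentioned above follow from the build-up of a radioactive cosmogenic nuclide during irradiation: in the simplest single-stage model, N(t) = (P/λ)(1 - e^(-λt)), so t = -ln(1 - λN/P)/λ. A minimal sketch of that relation; the production rate and concentration below are illustrative assumptions, not the measured values of this study:

```python
import math

# Simplified single-stage exposure model for a radioactive cosmogenic
# nuclide such as 10Be: during irradiation the concentration grows as
#     N(t) = (P / lam) * (1 - exp(-lam * t)),
# so the exposure age is t = -ln(1 - lam * N / P) / lam.
# P and N below are illustrative numbers, not measured values.
def exposure_age(N, P, half_life_yr):
    lam = math.log(2) / half_life_yr          # decay constant, 1/yr
    return -math.log(1.0 - lam * N / P) / lam

halflife_be10 = 1.387e6   # 10Be half-life in years
P = 20.0                  # production rate, atoms/(g yr), assumed
N = 1.0e7                 # concentration, atoms/g, assumed
age = exposure_age(N, P, halflife_be10)   # roughly 5.8e5 yr
```

Note that without radioactive decay the naive estimate would be N/P = 5e5 yr; the decay correction pushes the age upward, and the model saturates as N approaches P/λ.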
Lesaffre, Emmanuel
2012-01-01
The growth of biostatistics has been phenomenal in recent years and has been marked by considerable technical innovation in both methodology and computational practicality. One area that has experienced significant growth is Bayesian methods. The growing use of Bayesian methodology has taken place partly due to an increasing number of practitioners valuing the Bayesian paradigm as matching that of scientific discovery. In addition, computational advances have allowed for more complex models to be fitted routinely to realistic data sets. Through examples, exercises and a combination of introd
Directory of Open Access Journals (Sweden)
Yann Reynaud
Full Text Available Tuberculosis (TB) remains broadly present in the Americas despite intense global efforts for its control and elimination. Starting from a large dataset comprising spoligotyping (n = 21183 isolates) and 12-loci MIRU-VNTRs data (n = 4022 isolates) from a total of 31 countries of the Americas (data extracted from the SITVIT2 database), this study aimed to get an overview of lineages circulating in the Americas. A total of 17119 (80.8%) strains belonged to the Euro-American lineage 4, among which the most predominant genotypic family belonged to the Latin American and Mediterranean (LAM) lineage (n = 6386, 30.1% of strains). By combining classical phylogenetic analyses and Bayesian approaches, this study revealed for the first time a clear genetic structuring of the LAM9 sublineage into two subpopulations named LAM9C1 and LAM9C2, with distinct genetic characteristics. LAM9C1 was predominant in Chile, Colombia and USA, while LAM9C2 was predominant in Brazil, Dominican Republic, Guadeloupe and French Guiana. Globally, LAM9C2 was characterized by higher allelic richness as compared to LAM9C1 isolates. Moreover, LAM9C2 sublineage appeared to expand close to twenty times more than LAM9C1 and showed older traces of expansion. Interestingly, a significant proportion of LAM9C2 isolates presented a typical signature of the ancestral LAM-RDRio MIRU-VNTR type (224226153321). Further studies based on Whole Genome Sequencing of LAM strains will provide the needed resolution to decipher the biogeographical structure and evolutionary history of this successful family.
Reynaud, Yann; Millet, Julie; Rastogi, Nalin
2015-01-01
Tuberculosis (TB) remains broadly present in the Americas despite intense global efforts for its control and elimination. Starting from a large dataset comprising spoligotyping (n = 21183 isolates) and 12-loci MIRU-VNTRs data (n = 4022 isolates) from a total of 31 countries of the Americas (data extracted from the SITVIT2 database), this study aimed to get an overview of lineages circulating in the Americas. A total of 17119 (80.8%) strains belonged to the Euro-American lineage 4, among which the most predominant genotypic family belonged to the Latin American and Mediterranean (LAM) lineage (n = 6386, 30.1% of strains). By combining classical phylogenetic analyses and Bayesian approaches, this study revealed for the first time a clear genetic structuring of the LAM9 sublineage into two subpopulations named LAM9C1 and LAM9C2, with distinct genetic characteristics. LAM9C1 was predominant in Chile, Colombia and USA, while LAM9C2 was predominant in Brazil, Dominican Republic, Guadeloupe and French Guiana. Globally, LAM9C2 was characterized by higher allelic richness as compared to LAM9C1 isolates. Moreover, LAM9C2 sublineage appeared to expand close to twenty times more than LAM9C1 and showed older traces of expansion. Interestingly, a significant proportion of LAM9C2 isolates presented a typical signature of the ancestral LAM-RDRio MIRU-VNTR type (224226153321). Further studies based on Whole Genome Sequencing of LAM strains will provide the needed resolution to decipher the biogeographical structure and evolutionary history of this successful family.
Energy Technology Data Exchange (ETDEWEB)
Akarsu, Özgür [Department of Physics, Koç University, 34450 Sariyer, İstanbul (Turkey); Kumar, Suresh [Department of Mathematics, BITS Pilani, Pilani Campus, Rajasthan-333031 (India); Myrzakulov, R.; Sami, M. [Centre of Theoretical Physics, Jamia Millia Islamia, New Delhi-110025 (India); Xu, Lixin, E-mail: oakarsu@ku.edu.tr, E-mail: sukuyd@gmail.com, E-mail: rmyrzakulov@gmail.com, E-mail: samijamia@gmail.com, E-mail: lxxu@dlut.edu.cn [Institute of Theoretical Physics, Dalian University of Technology, Dalian, 116024 (China)
2014-01-01
In this paper, we consider a simple form of expansion history of Universe referred to as the hybrid expansion law - a product of power-law and exponential type of functions. The ansatz by construction mimics the power-law and de Sitter cosmologies as special cases but also provides an elegant description of the transition from deceleration to cosmic acceleration. We point out the Brans-Dicke realization of the cosmic history under consideration. We construct potentials for quintessence, phantom and tachyon fields, which can give rise to the hybrid expansion law in general relativity. We investigate observational constraints on the model with hybrid expansion law applied to late time acceleration as well as to early Universe a la nucleosynthesis.
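The transition from deceleration to acceleration that the hybrid expansion law encodes can be made explicit: with a(t) = a0 t^alpha e^(beta t), the Hubble rate is H(t) = alpha/t + beta and the deceleration parameter works out to q(t) = -1 + alpha/(alpha + beta t)^2. A short sketch with illustrative parameters (alpha and beta below are not the paper's fitted values):

```python
import math

# Hybrid expansion law: a(t) = a0 * t**alpha * exp(beta * t).
# Differentiating ln(a) gives H(t) = alpha/t + beta, and
# q = -1 - Hdot/H**2 = -1 + alpha / (alpha + beta*t)**2,
# so q > 0 (deceleration) at early times when alpha < 1,
# and q -> -1 (de Sitter) as t -> infinity.

def hubble(t, alpha, beta):
    return alpha / t + beta

def deceleration(t, alpha, beta):
    return -1.0 + alpha / (alpha + beta * t) ** 2

alpha, beta = 0.5, 0.1                             # hypothetical parameters
t_transition = (math.sqrt(alpha) - alpha) / beta   # solves q(t) = 0

early = deceleration(0.01, alpha, beta)    # positive: decelerating epoch
late = deceleration(1000.0, alpha, beta)   # close to -1: de Sitter-like
```

The closed-form transition time follows from setting (alpha + beta t)^2 = alpha, which illustrates how the ansatz interpolates between the power-law and de Sitter limits mentioned in the abstract.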
Feng, Chao-Jun
2016-01-01
To probe the late evolution history of the Universe, we adopt two kinds of optimal basis systems. One of them is constructed by performing principal component analysis (PCA) and the other is built by taking the multidimensional scaling (MDS) approach. Cosmological observables such as the luminosity distance can be decomposed into these basis systems. These bases are optimized for different kinds of cosmological models that are based on different physical assumptions, and even for a mixture of them. Therefore, the so-called feature space projected from the basis systems is cosmological-model independent, and it provides a parameterization for studying and reconstructing the Hubble expansion rate from the supernova luminosity distance and even gamma-ray burst (GRB) data with self-calibration. The circular problem when using GRBs as cosmological candles is naturally eliminated in this procedure. By using the Levenberg-Marquardt (LM) technique and the Markov Chain Monte Carlo (MCMC) method, we perform an ...
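The PCA step described above can be sketched as follows: build a basis from an ensemble of model distance curves via SVD, then express any observed curve as a few coefficients in that basis. The toy curve family below stands in for the ensemble of cosmological models and is purely illustrative, not the paper's parameterization:

```python
import numpy as np

# Sketch of a PCA basis for distance curves: generate mock curves from
# a family of toy "models", extract the leading principal components,
# and project a new curve onto them.
z = np.linspace(0.05, 1.5, 50)

# Toy distance-like curves with varying shape parameters (illustrative
# stand-ins for the model ensemble, not actual cosmological models).
curves = np.array([z / (1 + a * z) ** b
                   for a in (0.3, 0.5, 0.7)
                   for b in (0.5, 1.0, 1.5)])

mean = curves.mean(axis=0)
_, s, vt = np.linalg.svd(curves - mean, full_matrices=False)
basis = vt[:2]                      # two leading components (orthonormal rows)

# Project a new curve onto the reduced basis and reconstruct it.
target = z / (1 + 0.4 * z) ** 0.8
coeffs = basis @ (target - mean)
recon = mean + coeffs @ basis
rms_err = np.sqrt(np.mean((recon - target) ** 2))
```

Because the basis is built from the model ensemble rather than any single model, the projection coefficients play the role of the model-independent "feature space" coordinates the abstract refers to.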
ORIGIN OF SURGERY: A HISTORY OF EXPLORATION OF PLASTIC AND RECONSTRUCTIVE SURGERY
Directory of Open Access Journals (Sweden)
Panigrahi
2013-10-01
Full Text Available The Sushruta Samhita, Charaka Samhita and Astanga Sangraha are the three compendia of Ayurveda, encompassing all of its aspects. Plastic surgery (Sandhana karma) is a very old branch of surgery, described since the Vedic era. Almost all the Samhitas describe the methods of Sandhana Karma (plastic and reconstructive surgery). Today the world recognizes the pioneering nature of Sushruta's plastic and reconstructive surgery, and Sushruta is regarded as the Father of Plastic Surgery. The plastic operations of the ear (otoplasty) and rhinoplasty (plastic surgery of the nose) are described in the 16th chapter of the first book (Sutrasthan) of the compendium. Methods are first described for piercing the earlobes of an infant, which is still a widespread practice in India. Sushruta described 15 methods of joining cut-up ear lobes; for this plastic operation, called karna bedha, a piece of skin was taken from the cheek, turned back, and suitably stitched to the lobule. Sushruta also described rhinoplasty (Nasa Sandhana). The portion of the nose to be covered should first be measured with a leaf. Then a piece of skin of the required size should be dissected from the living skin of the cheek and turned back to cover the nose, keeping a small pedicle attached to the cheek. The part of the nose to which the skin is to be attached should be made raw by cutting the nasal stump with a knife. The surgeon should then place the skin on the nose and stitch the two parts swiftly, keeping the skin properly elevated by inserting two tubes of eranda (castor oil plant) in each nostril so that the new nose keeps its proper shape. It should then be sprinkled with a powder composed of liquorice, red sandalwood, and barberry plant. Finally, it should be covered with cotton, and clean sesame oil should be constantly applied to it. Sushruta also mentioned the reconstruction of the broken lip and hare-lip (Ostha Sandhana). These descriptions are brief in nature in comparison to
新家, 健精
2013-01-01
© 2012 Springer Science+Business Media, LLC. All rights reserved. Article Outline: Glossary Definition of the Subject and Introduction The Bayesian Statistical Paradigm Three Examples Comparison with the Frequentist Statistical Paradigm Future Directions Bibliography
Reconstructing the star formation history of the Milky Way disc(s) from chemical abundances
Snaith, O; Di Matteo, P; Lehnert, M D; Combes, F; Katz, D; Gómez, A
2014-01-01
We develop a chemical evolution model in order to study the star formation history of the Milky Way. Our model assumes that the Milky Way is formed from a closed box-like system in the inner regions, while the outer parts of the disc experience some accretion. Unlike the usual procedure, we do not fix the star formation prescription (e.g. Kennicutt law) in order to reproduce the chemical abundance trends. Instead, we fit the abundance trends with age in order to recover the star formation history of the Galaxy. Our method enables one to recover with unprecedented accuracy the star formation history of the Milky Way in the first Gyrs, in both the inner (R < 9-10 kpc) and outer (R > 9-10 kpc) discs as sampled in the solar vicinity. We show that, in the inner disc, half of the stellar mass formed during the thick disc phase, in the first 4-5 Gyr. This phase was followed by a significant dip in the star formation activity (at 8-9 Gyr) and a period of roughly constant lower level star formation for the remaining 8 Gyr. The thick disc phase ha...
Witt, Kelsey E; Judd, Kathleen; Kitchen, Andrew; Grier, Colin; Kohler, Timothy A; Ortman, Scott G; Kemp, Brian M; Malhi, Ripan S
2015-02-01
As dogs have traveled with humans to every continent, they can potentially serve as an excellent proxy when studying human migration history. Past genetic studies into the origins of Native American dogs have used portions of the hypervariable region (HVR) of mitochondrial DNA (mtDNA) to indicate that prior to European contact the dogs of Native Americans originated in Eurasia. In this study, we summarize past DNA studies of both humans and dogs to discuss their population histories in the Americas. We then sequenced a portion of the mtDNA HVR of 42 pre-Columbian dogs from three sites located in Illinois, coastal British Columbia, and Colorado, and identify four novel dog mtDNA haplotypes. Next, we analyzed a dataset comprised of all available ancient dog sequences from the Americas to infer the pre-Columbian population history of dogs in the Americas. Interestingly, we found low levels of genetic diversity for some populations consistent with the possibility of deliberate breeding practices. Furthermore, we identified multiple putative founding haplotypes in addition to dog haplotypes that closely resemble those of wolves, suggesting admixture with North American wolves or perhaps a second domestication of canids in the Americas. Notably, initial effective population size estimates suggest at least 1000 female dogs likely existed in the Americas at the time of the first known canid burial, and that population size increased gradually over time before stabilizing roughly 1200 years before present.
Neuberger, Ferdinand M; Jopp, Eilin; Graw, Matthias; Püschel, Klaus; Grupe, Gisela
2013-03-10
The diagnosis of starvation in children or adults is an important topic in paediatric and geriatric medicine, and in legal assessment. To date, few reliable techniques are available to reconstruct the onset and duration of undernourishment, especially in cases of wilful neglect or abuse. The intention of this research project is to introduce a method based on isotopic analysis to reconstruct nutritional life histories and to detect starvation. For this purpose, the specific signature of stable carbon and nitrogen isotopes in human hair samples is investigated and measured in the course of serious nutritional deprivation. A previous study by our research group on anorectic patients showed that incremental hair analyses can monitor the individual nutritional status of each patient. Increasing δ(15)N values indicate the catabolism of bodily protein and are associated with a very low BMI. In contrast, the changes of the δ(13)C values and BMI were in phase, which can be linked to the lack of energy in the consumed diet and the breakdown of body fat deposits. These findings were then applied to various forensic cases in which severe starvation occurred recently prior to death. We aim to establish an unbiased biomarker to identify the individual timeframe of nutritional deprivation in order to detect and prevent starvation.
Reconstructing the Solar Wind From Its Early History To Current Epoch
Airapetian, Vladimir S
2016-01-01
Stellar winds from active solar type stars can play a crucial role in removal of stellar angular momentum and erosion of planetary atmospheres. However, major wind properties except for mass loss rates cannot be directly derived from observations. We employed a three dimensional magnetohydrodynamic Alfven wave driven solar wind model, ALF3D, to reconstruct the solar wind parameters including the mass loss rate, terminal velocity and wind temperature at 0.7, 2 and 4.65 Gyr. Our model treats the wind thermal electrons, protons and pickup protons as separate fluids and incorporates turbulence transport, eddy viscosity, turbulent resistivity, and turbulent heating to properly describe proton and electron temperatures of the solar wind. To study the evolution of the solar wind, we specified three input model parameters, the plasma density, Alfven wave amplitude and the strength of the dipole magnetic field at the wind base for each of three solar wind evolution models that are consistent with observational constra...
DEFF Research Database (Denmark)
Poulsen, Jens Aage
Historie, in this series, deals with curricula and teaching materials and their use in the school subject of history. The book contains useful tools for analysing and evaluating teaching materials.
Distance-Based Phylogeny Reconstruction: Safety and Edge Radius
Gascuel, Olivier; Pardi, Fabio; Truszkowski, Jakub
2015-01-01
A phylogeny is an evolutionary tree tracing the shared history, including common ancestors, of a set of extant species or “taxa”. Phylogenies are increasingly reconstructed on the basis of molecular data (DNA and protein sequences) using statistical techniques such as likelihood and Bayesian methods. Algorithmically, these techniques suffer from the discrete nature of tree topology space. Since the number of tree topologies increases exponentially as a function of the ...
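The point about the discrete, rapidly growing topology space can be made concrete: the number of unrooted binary tree topologies on n labelled taxa is the double factorial (2n - 5)!! = 1 x 3 x 5 x ... x (2n - 5), which already reaches about two million for ten taxa:

```python
# Count unrooted binary tree topologies on n labelled taxa.
# The classic result is (2n - 5)!! for n >= 3: each new taxon can be
# attached to any existing edge, and a tree on k taxa has 2k - 3 edges.
def num_unrooted_topologies(n):
    if n < 3:
        return 1
    count = 1
    for k in range(3, 2 * n - 4, 2):   # product 3 * 5 * ... * (2n - 5)
        count *= k
    return count

# n = 4 -> 3 topologies, n = 5 -> 15, n = 10 -> 2,027,025.
```

This super-exponential growth is why exhaustive search over topologies is infeasible and likelihood/Bayesian methods must explore tree space heuristically.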
Lee, Duane M.; Johnston, Kathryn V.; Sen, Bodhisattva; Jessop, Will
2016-08-01
In this study we tested the prospects of using 2D chemical abundance ratio distributions (CARDs) found in stars of the stellar halo to determine its formation history. First, we used simulated data from eleven "MW-like" halos to generate satellite template sets of 2D CARDs of accreted dwarf satellites which are comprised of accreted dwarfs from various mass regimes and epochs of accretion. Next, we randomly drew samples of ~10^3-10^4 mock observations of stellar chemical abundance ratios ([α/Fe], [Fe/H]) from those eleven halos to generate samples of the underlying densities for our CARDs to be compared to our templates in our analysis. Finally, we used the expectation-maximization algorithm to derive accretion histories in relation to the satellite template set (STS) used and the sample size. For certain STS used we typically can identify the relative mass contributions of all accreted satellites to within a factor of 2. We also find that this method is particularly sensitive to older accretion events involving low-luminous dwarfs e.g. ultra-faint dwarfs - precisely those events that are too ancient to be seen by phase-space studies of stars and too faint to be seen by high-z studies of the early Universe. Since our results only exploit two chemical dimensions and near-future surveys promise to provide ~6-9 dimensions, we conclude that these new high-resolution spectroscopic surveys of the stellar halo will allow us (given the development of new CARD-generating dwarf models) to recover the luminosity function of infalling dwarf galaxies - and the detailed accretion history of the halo - across cosmic time.
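The expectation-maximization step used above can be sketched in one dimension: given a fixed set of template densities, EM alternates between computing per-observation responsibilities and updating the mixing weights. The Gaussian templates and mock sample below are illustrative stand-ins for the 2D CARD satellite templates, not the paper's data:

```python
import numpy as np

# EM for mixture weights with *fixed* template densities p_k(x):
# E-step computes responsibilities, M-step averages them into weights.
rng = np.random.default_rng(1)

def gauss(x, mu, sig):
    return np.exp(-0.5 * ((x - mu) / sig) ** 2) / (sig * np.sqrt(2 * np.pi))

# Mock "observations": a 70/30 mixture of two known templates.
x = np.concatenate([rng.normal(-1.0, 0.5, 700), rng.normal(1.5, 0.4, 300)])
templates = [lambda x: gauss(x, -1.0, 0.5), lambda x: gauss(x, 1.5, 0.4)]

w = np.full(len(templates), 1.0 / len(templates))   # uniform starting weights
dens = np.array([t(x) for t in templates])          # fixed template densities
for _ in range(200):                                # EM iterations
    resp = w[:, None] * dens
    resp /= resp.sum(axis=0)        # E-step: responsibilities per observation
    w = resp.mean(axis=1)           # M-step: update mixing weights
```

In the paper's setting each weight corresponds to the relative mass contribution of one accreted-satellite template; here the recovered weights land close to the 0.7/0.3 generating proportions.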
Directory of Open Access Journals (Sweden)
Z. Y. Wu
2011-09-01
Full Text Available The 1951–2009 drought history of China is reconstructed using daily soil moisture values generated by the Variable Infiltration Capacity (VIC) land surface macroscale hydrology model. VIC is applied over a grid of 10 458 points with a spatial resolution of 30 km × 30 km, and is driven by observed daily maximum and minimum air temperature and precipitation from 624 long-term meteorological stations. The VIC soil moisture is used to calculate the Soil Moisture Anomaly Percentage Index (SMAPI), which can be used as a measure of the severity of agricultural drought on a global basis. We have developed a SMAPI-based drought identification procedure for practical uses in the identification of both grid point and regional drought events. As a result, a total of 325 regional drought events varying in time and strength are identified from China's nine drought study regions. These drought events can thus be assessed quantitatively at different spatial and temporal scales. The result shows that the severe drought events of 1978, 2000 and 2006 are well reconstructed, which indicates that the SMAPI is capable of identifying the onset of a drought event, its progression, as well as its termination. Spatial and temporal variations of droughts in China's nine drought study regions are studied. Our result shows that on average, up to 30% of the total area of China is prone to drought. Regionally, an upward trend in drought-affected areas has been detected in three regions (Inner Mongolia, Northeast and North) from 1951–2009. However, the decadal variability of droughts has been weak in the rest of the five regions (South, Southwest, East, Northwest, and Tibet). Xinjiang has even been growing steadily wetter since the 1950s. Two regional dry centres are discovered in China as the result of a combined analysis of the occurrence of drought events from both grid points and drought study regions. The first centre is located in the area partially covered by the North
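The SMAPI used above is a percentage anomaly of soil moisture relative to its long-term climatology for the same period of the year, SMAPI = (theta - theta_clim)/theta_clim x 100. A minimal sketch; the soil moisture values and the -5% drought threshold are illustrative, not the paper's calibration:

```python
import numpy as np

# Soil Moisture Anomaly Percentage Index:
#     SMAPI = (theta - theta_clim) / theta_clim * 100,
# where theta_clim is the climatological mean soil moisture for the
# same period of the year at the same grid point.
def smapi(theta, theta_clim):
    theta = np.asarray(theta, dtype=float)
    theta_clim = np.asarray(theta_clim, dtype=float)
    return (theta - theta_clim) / theta_clim * 100.0

# Toy example: a grid point's soil moisture for five periods vs. climatology.
theta = np.array([0.30, 0.24, 0.20, 0.19, 0.27])
clim = np.array([0.28, 0.28, 0.27, 0.26, 0.27])
index = smapi(theta, clim)
in_drought = index < -5.0   # flag periods more than 5% below normal (assumed)
```

A drought identification procedure like the one described would then group consecutive flagged periods into events and aggregate grid points into regional events.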
Directory of Open Access Journals (Sweden)
Z. Y. Wu
2011-02-01
Full Text Available The recent fifty-nine year (1951–2009) drought history of China is reconstructed using daily soil moisture values generated by the Variable Infiltration Capacity (VIC) land surface macroscale hydrology model. VIC is applied over a grid of 10 458 points with a spatial resolution of 30 km × 30 km, and is driven by observed daily maximum and minimum air temperature and precipitation from 624 long-term meteorological stations. The VIC soil moisture is used to calculate the Soil Moisture Anomaly Percentage Index (SMAPI), which can be used as a measure of the severity of agricultural drought on a global basis. We develop a SMAPI-based drought identification procedure for practical uses in the identification of both grid point and regional drought events. As a result, a total of 325 regional drought events varying in time and strength are identified from China's nine drought study regions. These drought events can thus be assessed quantitatively at different spatial and temporal scales. The result shows that the severe drought events of 1978, 2000 and 2006 are well reconstructed, indicating that the SMAPI is capable of identifying the onset of a drought event, its progression, as well as its termination. Spatial and temporal variations of droughts in China's nine drought study regions are studied. Our result shows that on average, up to 30% of the total area of China is prone to drought. Regionally, an upward trend in drought-affected areas has been detected in three regions (Inner Mongolia, Northeast and North) during the recent fifty-nine years. However, the decadal variability of droughts has been weak in the remaining five regions (South, Southwest, East, Northwest, and Tibet). Xinjiang has even been getting steadily wetter since the 1950s. Two regional dry centers are discovered in China as the result of a combined analysis of the occurrence of drought events from both grid points and drought study regions. The first center is located in the area partially covered by two
DEFF Research Database (Denmark)
Brescó, Ignacio
2016-01-01
Innis' and Brinkmann's papers (this issue) tackle two key aspects of cultural psychology: the mediating role played by different systems of meaning throughout history in making sense of the world, and the normative role of those systems, including psychology itself. This paper offers … accounts. Drawing on this assumption, it discusses how past events are constructed, bringing mediation and meaning-making to the fore. Special attention is paid to narratives as symbolic meaning-making tools. We conclude by discussing uses of the past and the role that cultural psychology can …
Yan, B.; Han, Y.; Peteet, D. M.
2013-12-01
Biomass burning has become recognized as one of the key elements of climate change. The occurrence of fires is a complex function of climate, moisture, vegetation, and landscape type. Fires impact environments in multiple ways, e.g., increased soil erosion, changes in vegetation type, and increased nutrient levels in soils and lakes that receive runoff from burned areas. Sediment cores that contain an archive of deposited combustion products can help reconstruct the history of past fires. In this study, alkylated PAHs and black carbon (char and soot) were used to explore the paleofire history recorded in a sediment core collected from Linsley Pond, Connecticut (41°18'N, 72°45'W). The biomass type and combustion levels of these fires, and whether they occurred locally or regionally, can be derived from these indicators. Such details, together with other paleoenvironmental indicators recorded in sediment cores (e.g., pollen, macrofossils, and LOI), helped unravel the environmental conditions before and after fires. Alkanes, PAHs, alkylated PAHs, and the ratio of soot to char indicate that in the Younger Dryas, fire occurred at relatively low temperature (i.e., smoldering), followed by an abrupt increase in flaming combustion of softwood (white pine) at the Holocene boundary. Our paleofire data support previous interpretations of a shift towards a warm, dry climate in the southern New England region at this time.
Energy Technology Data Exchange (ETDEWEB)
Stawinski, G
1998-10-26
Bayesian algorithms are developed to solve inverse problems in gamma imaging and photofission tomography. The first part of this work is devoted to the modeling of our measurement systems. Two models have been developed for both applications: the first is a simple conventional model and the second is a cascaded point-process model. EM and MCMC Bayesian algorithms for image restoration and image reconstruction have been developed for these models and compared. The cascaded point-process model does not significantly improve the results previously obtained with the classical model. Two original approaches have been proposed that improve on the results previously obtained. The first approach uses an inhomogeneous Markov random field as a prior law and lets the regularization parameter vary spatially; however, the problem of estimating the hyper-parameters has not been solved. For the deconvolution of point sources, a second approach has been proposed, which introduces a high-level prior model: the picture is modeled as a list of objects, whose parameters and number are unknown. The results obtained with this method are more accurate than those obtained with the conventional Markov random field prior model and require less computation. (author)
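The abstract does not reproduce its EM update, but for Poisson imaging models the classical EM restoration is the Richardson-Lucy iteration. A generic sketch with an invented toy blur matrix (not the thesis's actual measurement model):

```python
import numpy as np

def richardson_lucy(y, A, n_iter=100):
    """EM iterations for Poisson image restoration (Richardson-Lucy):
    x <- x * A^T(y / Ax) / A^T 1, which keeps x nonnegative throughout."""
    x = np.full(A.shape[1], y.mean())  # flat positive initial image
    norm = A.sum(axis=0)               # sensitivity of each image pixel
    for _ in range(n_iter):
        x *= A.T @ (y / (A @ x)) / norm
    return x

# toy detector response: slight blur between neighbouring pixels
A = 0.8 * np.eye(5) + 0.1 * (np.eye(5, k=1) + np.eye(5, k=-1))
x_true = np.array([0.0, 0.0, 10.0, 0.0, 0.0])  # a single point source
y = A @ x_true                                 # noiseless expected counts
x_hat = richardson_lucy(y, A)
print(np.round(x_hat, 2))
```

On noiseless data the iteration converges toward the true point source; on real counts, regularization (e.g., the Markov random field priors discussed above) is what controls noise amplification.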
Reconstructing the evolutionary history of China: a caveat about inferences drawn from ancient DNA.
Yao, Yong-Gang; Kong, Qing-Peng; Man, Xiao-Yong; Bandelt, Hans-Jürgen; Zhang, Ya-Ping
2003-02-01
The decipherment of the meager information provided by short fragments of ancient mitochondrial DNA (mtDNA) is notoriously difficult but is regarded as a most promising way toward reconstructing the past from the genetic perspective. By haplogroup-specific hypervariable segment (HVS) motif search and matching or near-matching with available modern data sets, most of the ancient mtDNAs can be tentatively assigned to haplogroups, which are often subcontinent specific. Further typing for mtDNA haplogroup-diagnostic coding region polymorphisms, however, is indispensable for establishing the geographic/genetic affinities of ancient samples with less ambiguity. In the present study, we sequenced a fragment (approximately 982 bp) of the mtDNA control region in 76 Han individuals from Taian, Shandong, China, and we combined these data with previously reported samples from Zibo and Qingdao, Shandong. The reanalysis of two previously published ancient mtDNA population data sets from Linzi (same province) then indicates that the ancient populations had features in common with the modern populations from south China rather than any specific affinity to the European mtDNA pool. Our results highlight that ancient mtDNA data obtained under different sampling schemes and subject to potential contamination can easily create the impression of drastic spatiotemporal changes in the genetic structure of a regional population during the past few thousand years if inappropriate methods of data analysis are employed.
How to make judicious use of current physics in reconstructing its history
Janssen, Michel
2013-04-01
Using three concrete examples, I illustrate both benefits and pitfalls of approaching the history of relativity and quantum theory with current textbook knowledge of these subjects. First, I show how knowing something about energy-momentum tensors in special relativity makes it easy to see that special relativity did not, as has been suggested, simply kill the program of Abraham and others at the beginning of the 20th century to reduce all of physics to electrodynamics, but co-opted key elements of it. Second, I show how knowing something about coordinate conditions in general relativity can be an obstacle to seeing why Einstein initially rejected field equations based on the Ricci tensor. Third, I show how knowing something about Hilbert space can be an obstacle to seeing the logic behind Jordan's statistical transformation theory. These three examples suggest that knowledge of modern physics is beneficial for historians, but only when used judiciously.
Fiction as Reconstruction of History: Narratives of the Civil War in American Literature
Directory of Open Access Journals (Sweden)
Reinhard Isensee
2009-09-01
Full Text Available Even after more than 140 years, the American Civil War continues to serve as a major source of inspiration for a plethora of literature in various genres. While amounting to only a brief period in American history in terms of years, this war has proved to be one of the central moments in defining the American nation since the second half of the nineteenth century. The facets of the Civil War, its protagonists, places, events, and political, social and cultural underpinnings seem to hold an ongoing fascination for both academic studies and fictional representations. Thus, it has been considered by many to be the most written-about war in the United States.
Pasquato, Mario
2016-01-01
Context. Machine-learning (ML) solves problems by learning patterns from data, with limited or no human guidance. In astronomy, it is mainly applied to large observational datasets, e.g. for morphological galaxy classification. Aims. We apply ML to gravitational N-body simulations of star clusters that are either formed by merging two progenitors or evolved in isolation, planning to later identify globular clusters (GCs) that may have a history of merging from observational data. Methods. We create mock-observations from simulated GCs, from which we measure a set of parameters (also called features in the machine-learning field). After dimensionality reduction on the feature space, the resulting datapoints are fed to various classification algorithms. Using repeated random subsampling validation, we check whether the groups identified by the algorithms correspond to the underlying physical distinction between mergers and monolithically evolved simulations. Results. The three algorithms we considered (C5.0 trees, k-nearest neighbour, and support-vector machines) all achieve a test misclassification rate of about 10% without parameter tuning.
Mahamud, Kira; Martínez Ruiz-Funes, María José
2014-01-01
This paper describes a study dealing with the reconstruction of the lives of two Spanish primary school teachers during the Franco dictatorship (1939-1975), in order to learn to what extent such a field of research can contribute to the history of education. Two family archives provide extraordinary and unique documentation to track down their…
Brito, Angmary; Rodriguez, Maria A.; Niaz, Mansoor
2005-01-01
The objectives of this study are: (a) elaboration of a history and philosophy of science (HPS) framework based on a reconstruction of the development of the periodic table; (b) formulation of seven criteria based on the framework; and (c) evaluation of 57 freshman college-level general chemistry textbooks with respect to the presentation of the…
Research Progress on Reconstruction of Paleofire History
Institute of Scientific and Technical Information of China (English)
占长林; 曹军骥; 韩永明; 安芷生
2011-01-01
Fire is an important component of the Earth system, closely coupled with climate, vegetation, biogeochemical cycles, and human activities. The influence of fire on global climate and ecosystems has become a hot topic in global change research. After a fire, many combustion products remain in the surrounding environment, such as black carbon, charcoal, polycyclic aromatic hydrocarbons (PAHs), and levoglucosan; these are widely found in ocean, lake, river, soil, and terrestrial aeolian sediments. Fires also leave other traces, such as fire scars on trees and changes in soil magnetic parameters. These records reflect not only the history of fire activity but also the climatic conditions and vegetation patterns of past periods, and thus provide valuable information for reconstructing fire history. This review summarizes the proxies commonly used in fire-history reconstruction in China and abroad, together with their outstanding problems, and points out future directions for paleofire reconstruction. Although different proxies have been successfully applied in previous studies, each record is affected by anthropogenic or biological disturbance and differs in the temporal and spatial scales it reflects; every proxy therefore has limitations for reconstructing fire history, which hinders a correct understanding of the relationships among fire, human activity, climate change, and vegetation.
Lee, Duane M; Sen, Bodhisattva; Jessop, Will
2014-01-01
Observational studies of halo stars during the last two decades have placed some limits on the quantity and nature of accreted dwarf galaxy contributions to the Milky Way stellar halo, typically utilizing stellar phase-space information to identify the most recent halo accretion events. In this study we tested the prospects of using 2-D chemical abundance ratio distributions (CARDs) found in stars of the stellar halo to determine its formation history. First, we used simulated data from eleven "MW-like" halos to generate satellite template sets of 2-D CARDs of accreted dwarf satellites drawn from various mass regimes and epochs of accretion. Next, we randomly drew samples of $\sim10^{3-4}$ mock observations of stellar chemical abundance ratios ([$\alpha$/Fe], [Fe/H]) from those eleven halos to generate samples of the underlying densities to be compared to our templates in the analysis. Finally, we used the expectation-maximization algorithm to derive accretion ...
Pasquato, Mario; Chung, Chul
2016-05-01
Context. Machine-learning (ML) solves problems by learning patterns from data with limited or no human guidance. In astronomy, ML is mainly applied to large observational datasets, e.g. for morphological galaxy classification. Aims: We apply ML to gravitational N-body simulations of star clusters that are either formed by merging two progenitors or evolved in isolation, planning to later identify globular clusters (GCs) that may have a history of merging from observational data. Methods: We create mock-observations from simulated GCs, from which we measure a set of parameters (also called features in the machine-learning field). After carrying out dimensionality reduction on the feature space, the resulting datapoints are fed into various classification algorithms. Using repeated random subsampling validation, we check whether the groups identified by the algorithms correspond to the underlying physical distinction between mergers and monolithically evolved simulations. Results: The three algorithms we considered (C5.0 trees, k-nearest neighbour, and support-vector machines) all achieve a test misclassification rate of about 10% without parameter tuning, with support-vector machines slightly outperforming the others. The first principal component of feature space correlates with cluster concentration. If we exclude it from the regression, the performance of the algorithms is only slightly reduced.
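The validation scheme, repeated random subsampling around a classifier in a reduced feature space, can be sketched as follows. The mock feature table, class separation, and choice of k are invented and merely stand in for the paper's simulated GC measurements:

```python
import numpy as np

def knn_predict(Xtr, ytr, Xte, k=5):
    # k-nearest-neighbour majority vote in feature space
    d = np.linalg.norm(Xte[:, None, :] - Xtr[None, :, :], axis=2)
    nn = np.argsort(d, axis=1)[:, :k]
    return (ytr[nn].mean(axis=1) > 0.5).astype(int)

rng = np.random.default_rng(1)
# mock features: two populations ("mergers" vs "monolithic") offset in feature 0
X = rng.normal(size=(300, 6))
y = (rng.random(300) > 0.5).astype(int)
X[:, 0] += 2.0 * y

# repeated random subsampling validation: many random 75/25 train/test splits
rates = []
for _ in range(20):
    idx = rng.permutation(300)
    tr, te = idx[:225], idx[225:]
    pred = knn_predict(X[tr], y[tr], X[te])
    rates.append(float((pred != y[te]).mean()))
print(np.mean(rates))  # mean test misclassification rate over the splits
```

Averaging the misclassification rate over many random splits, rather than a single hold-out, is what gives the quoted rates their stability.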
Bernardo, Jose M
2000-01-01
This highly acclaimed text, now available in paperback, provides a thorough account of key concepts and theoretical results, with particular emphasis on viewing statistical inference as a special case of decision theory. Information-theoretic concepts play a central role in the development of the theory, which provides, in particular, a detailed discussion of the problem of specification of so-called 'prior ignorance'. The work is written from the authors' committed Bayesian perspective, but an overview of non-Bayesian theories is also provided, and each chapter contains a wide-ranging critica
The reconstructive study in archaeology: case histories in communication issues
Directory of Open Access Journals (Sweden)
Francesco Gabellone
2011-09-01
Full Text Available The most significant results obtained by the Information Technologies Lab (IBAM CNR - ITLab) in the construction of VR-based knowledge platforms have been achieved in projects such as ByHeriNet, Archeotour, Interadria, Interreg Greece-Italy, Iraq Virtual Museum, etc. These projects were guided by the belief that, in order to be effective, the process of communicating Cultural Heritage to the wider public should be as free as possible from the sterile old VR interfaces of the 1990s. In operational terms, this translates into solutions that are as lifelike as possible and guarantee the maximum emotional involvement of the viewer, adopting the same techniques as are used in modern cinema. Communication thus becomes entertainment and a vehicle for high-quality content, aimed at the widest possible public and produced with the help of interdisciplinary tools and methods. In this context, high-end technologies are no longer the goal of research; rather, they are the invisible engine of an unstoppable process that is making it harder and harder to distinguish between computer images and real objects. An emblematic case in this regard is the reconstructive study of ancient contexts, where three-dimensional graphics compensate for the limited expressive potential of two-dimensional drawings and allow for interpretative and representative solutions that were unimaginable a few years ago. The virtual space thus becomes an important opportunity for reflection and study, as well as constituting a revolutionary way to learn for the wider public.
From retrodiction to Bayesian quantum imaging
Speirits, Fiona C.; Sonnleitner, Matthias; Barnett, Stephen M.
2017-04-01
We employ quantum retrodiction to develop a robust Bayesian algorithm for reconstructing the intensity values of an image from sparse photocount data, while also accounting for detector noise in the form of dark counts. This method not only yields a reconstructed image but also provides the full probability distribution function for the intensity at each pixel. We use simulated as well as real data to illustrate both the applications of the algorithm and the analysis options that are only available when the full probability distribution functions are known. These include calculating Bayesian credible regions for each pixel intensity, allowing an objective assessment of the reliability of the reconstructed image intensity values.
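The per-pixel computation can be sketched on a grid: for photocounts n ~ Poisson(λ + d) with dark-count rate d, evaluate the posterior over the intensity λ and summarize it with a credible interval. The flat prior, grid, and numbers below are illustrative assumptions, not the authors' retrodictive construction:

```python
import numpy as np

def pixel_posterior(n, dark, grid):
    # normalized posterior p(lambda | n) for n ~ Poisson(lambda + dark),
    # flat prior on lambda >= 0, evaluated on a grid of intensity values
    mu = grid + dark
    logp = n * np.log(mu) - mu
    p = np.exp(logp - logp.max())
    return p / (p.sum() * (grid[1] - grid[0]))

grid = np.linspace(0.0, 40.0, 4001)
post = pixel_posterior(n=7, dark=2.0, grid=grid)
dx = grid[1] - grid[0]
mean = (grid * post).sum() * dx
cdf = np.cumsum(post) * dx
lo, hi = grid[np.searchsorted(cdf, 0.025)], grid[np.searchsorted(cdf, 0.975)]
print(mean, lo, hi)  # posterior mean and a 95% credible interval for the pixel
```

Having the full posterior, rather than a point estimate, is exactly what makes per-pixel credible regions like (lo, hi) available.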
Villalba, Jesús
2015-01-01
In this document we derive the equations needed to implement Variational Bayes estimation of the parameters of the simplified probabilistic linear discriminant analysis (SPLDA) model. This can be used to adapt SPLDA from one database to another with little development data, or to implement the fully Bayesian recipe. Our approach is similar to Bishop's VB PPCA.
Prior approval: the growth of Bayesian methods in psychology.
Andrews, Mark; Baguley, Thom
2013-02-01
Within the last few years, Bayesian methods of data analysis in psychology have proliferated. In this paper, we briefly review the history of the Bayesian approach to statistics and consider the implications that Bayesian methods have for the theory and practice of data analysis in psychology.
Saintot, Aline; Stephenson, Randell; Brem, Arjan; Stovba, Sergiy; Privalov, Vitaliy
2003-10-01
In the WNW-ESE Donbas fold belt (DF), inversion of 3500 microtectonic data collected at 135 sites, in Proterozoic, Devonian, Carboniferous, and Cretaceous competent rocks allowed reconstruction of 123 local stress states. Accordingly, four successive paleostress fields reveal the tectonic evolution of the DF. At the numerous sites that have been affected by polyphase tectonics, the chronology between local paleostress states (also paleostress fields) was established using classical criteria (crosscutting striae, pre- or post-folding stress states, stratigraphic control). The oldest event is an extensional stress field with NNE-SSW σ3. It corresponds to the rifting phases that generated the basin in Devonian times and its early Visean reactivation. Later, the DF was affected by a transtension, with NW-SE σ3 characterizing Early Permian tectonism, including the development of the "Main Anticline" of the DF and the pronounced uplift of its southern margin and Ukrainian Shield. Two paleostress fields characterize the Cretaceous/Paleocene inversion of the DF, which was accompanied by folding and thrusting. Both are compressional in type but differ by the trend of σ1, which was first NW-SE and subsequently N-S. The discrete paleostress history of the DF allows a revised interpretation of its tectonic evolution with significant implications for understanding the geodynamic evolution of the southern margin of the East European Craton.
Feng, Chao-Jun; Li, Xin-Zhou
2016-04-01
To probe the late evolution history of the universe, we adopt two kinds of optimal basis systems. One of them is constructed by performing principal component analysis, and the other is built by taking the multidimensional scaling approach. Cosmological observables such as the luminosity distance can be decomposed into these basis systems. These basis systems are optimized for different kinds of cosmological models that are based on different physical assumptions, even for a mixture model of them. Therefore, the so-called feature space that is projected from the basis systems is cosmological model independent, and it provides a parameterization for studying and reconstructing the Hubble expansion rate from the supernova luminosity distance and even gamma-ray burst (GRB) data with self-calibration. The circular problem when using GRBs as cosmological candles is naturally eliminated in this procedure. By using the Levenberg-Marquardt technique and the Markov Chain Monte Carlo method, we perform an observational constraint on this kind of parameterization. The data we used include the “joint light-curve analysis” data set that consists of 740 Type Ia supernovae and 109 long GRBs with the well-known Amati relation.
Energy Technology Data Exchange (ETDEWEB)
Feng, Chao-Jun; Li, Xin-Zhou, E-mail: fengcj@shnu.edu.cn, E-mail: kychz@shnu.edu.cn [Shanghai United Center for Astrophysics (SUCA), Shanghai Normal University, 100 Guilin Road, Shanghai 200234 (China)
2016-04-10
To probe the late evolution history of the universe, we adopt two kinds of optimal basis systems. One of them is constructed by performing principal component analysis, and the other is built by taking the multidimensional scaling approach. Cosmological observables such as the luminosity distance can be decomposed into these basis systems. These basis systems are optimized for different kinds of cosmological models that are based on different physical assumptions, even for a mixture model of them. Therefore, the so-called feature space that is projected from the basis systems is cosmological model independent, and it provides a parameterization for studying and reconstructing the Hubble expansion rate from the supernova luminosity distance and even gamma-ray burst (GRB) data with self-calibration. The circular problem when using GRBs as cosmological candles is naturally eliminated in this procedure. By using the Levenberg–Marquardt technique and the Markov Chain Monte Carlo method, we perform an observational constraint on this kind of parameterization. The data we used include the “joint light-curve analysis” data set that consists of 740 Type Ia supernovae and 109 long GRBs with the well-known Amati relation.
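A principal-component basis of this kind can be sketched with an SVD over an ensemble of model curves. The toy distance-like model family below is invented purely for illustration and is not the paper's cosmological parameterization:

```python
import numpy as np

rng = np.random.default_rng(2)
z = np.linspace(0.01, 2.0, 50)
# invented family of toy distance-like curves, one per model draw
curves = np.array([(1.0 + w) * np.log1p(z) + 0.01 * rng.normal(size=z.size)
                   for w in rng.uniform(-1.2, -0.8, size=100)])

# principal-component basis: leading right-singular vectors of the
# centred model ensemble
mean = curves.mean(axis=0)
U, s, Vt = np.linalg.svd(curves - mean, full_matrices=False)
basis = Vt[:3]                       # first three eigenmodes
coeffs = (curves - mean) @ basis.T   # model-independent "feature space"
recon = mean + coeffs @ basis        # low-dimensional reconstruction
print(coeffs.shape)
```

Any observable curve can then be represented by a handful of coefficients in this basis, independently of which physical model generated it, which is the sense in which the feature space is model independent.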
Graca, Bożena; Staniszewska, Marta; Zakrzewska, Danuta; Zalewska, Tamara
2016-06-01
This paper reports the reconstruction of the pollution history of 4-tert-octylphenol (OP) and 4-nonylphenol (NP) in the Baltic Sea. Alkylphenols are endocrine-disrupting compounds and are therefore toxic to aquatic organisms. Sediment cores were collected from regions with relatively stable sedimentation conditions and were dated by the ²¹⁰Pb method. OP and NP were determined using HPLC-FL. The highest inventory of these compounds was observed in the Gotland Deep (610 μg m⁻² of NP and 47 μg m⁻² of OP) and the lowest on the slope of the Gdansk Deep (24 μg m⁻² of NP and 16 μg m⁻² of OP). This spatial distribution was probably, among other factors, the result of uplift of the sea floor. The pollution trends of OP and NP in sediments coincided with the following: (1) the beginnings of eutrophication (1960s/1970s of the twentieth century) and (2) the strong increase in the areal extent and volume of hypoxia and anoxia in the Baltic in the present century.
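The ²¹⁰Pb dating step can be illustrated with the simple constant-initial-concentration (CIC) model; the paper does not state which ²¹⁰Pb model was used, and the activity profile below is mock data:

```python
import numpy as np

lam = np.log(2) / 22.3   # 210Pb decay constant (1/yr); half-life 22.3 yr

# mock excess 210Pb activity (Bq/kg) measured down a core; it halves
# every 2 cm, i.e. a perfectly steady sedimentation rate
depth = np.array([1.0, 3.0, 5.0, 7.0, 9.0])          # cm
activity = np.array([120.0, 60.0, 30.0, 15.0, 7.5])  # Bq/kg

# CIC model: constant initial concentration -> age = ln(A0 / A) / lambda
A0 = activity[0]
age = np.log(A0 / activity) / lam
sed_rate = depth[-1] / age[-1]   # mean sedimentation rate, cm/yr
print(age, sed_rate)
```

Each depth then maps to a calendar year, which is what allows the OP/NP concentration profiles to be read as pollution histories.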
On Habermas' Reconstruction of History Impetus
Institute of Scientific and Technical Information of China (English)
孙炳炎
2013-01-01
Habermas' reconstruction of the driving force of history is the most important step in his reconstruction of historical materialism. Whether this reconstruction succeeds must be judged against the original theory being reconstructed. For Marx, the driving force of history is neither class struggle nor the productive forces, but production understood as the totality of historical activity. In Habermas' own terms, reconstruction means taking a theory apart and reassembling it in a new form, and this new form can be regarded as the methodological framework of the reconstruction. Habermas' reconstruction of historical materialism combines several new forms, but the methodological framework for his reconstruction of the driving force of history is Piaget's genetic epistemology. Using this framework, he re-examines the driving force in Marx's historical materialism and concludes that Marx confines it to the sphere of production. He therefore extends it, proposing his own account of the driving force of history: the theory of learning processes.
Hedlund, Jonas
2014-01-01
This paper introduces private sender information into a sender-receiver game of Bayesian persuasion with monotonic sender preferences. I derive properties of increasing differences related to the precision of signals and use these to fully characterize the set of equilibria robust to the intuitive criterion. In particular, all such equilibria are either separating, i.e., the sender's choice of signal reveals his private information to the receiver, or fully disclosing, i.e., the outcome of th...
Kirstein, Roland
2005-01-01
This paper presents a modification of the inspection game: the 'Bayesian Monitoring' model rests on the assumption that judges are interested in enforcing compliant behavior and making correct decisions. They may base their judgements on an informative but imperfect signal which can be generated costlessly. In the original inspection game, monitoring is costly and generates a perfectly informative signal. While the inspection game has only one mixed-strategy equilibrium, three Perfect Bayesia...
Dutrieux, L.; Jakovac, C. C.; Siti, L. H.; Kooistra, L.
2015-12-01
We developed a method to reconstruct land use history from Landsat image time-series. The method uses a breakpoint detection framework derived from the econometrics field and applicable to time-series regression models. The BFAST framework is used to define the time-series regression models, which may contain trend and phenology components, hence appropriately modelling intra- and inter-annual vegetation dynamics. All available Landsat data are used, and the time-series are partitioned into segments delimited by breakpoints. Segments can be associated with land use regimes, while the breakpoints correspond to shifts in regimes. To further characterize these shifts, we classified the unlabelled breakpoints returned by the algorithm into their corresponding processes. We used a Random Forest classifier, trained from a set of visually interpreted time-series profiles, to infer the processes and assign labels to the breakpoints. The whole approach was applied to quantifying the number of cultivation cycles in a swidden agriculture system in Brazil. The number and frequency of cultivation cycles is of particular ecological relevance in these systems, since they largely affect the capacity of the forest to regenerate after abandonment. We applied the method to a Landsat time-series of the Normalized Difference Moisture Index (NDMI) spanning the 1984-2015 period and derived from it the number of cultivation cycles during that period at the individual field scale. Agricultural field boundaries used to apply the method were derived using a multi-temporal segmentation. We validated the number of cultivation cycles predicted against in-situ information collected from farmer interviews, resulting in a normalized RMSE of 0.25. Overall the method performed well, producing maps with coherent patterns. We identified various sources of error in the approach, including low data availability in the 1990s and sub-object mixture of land uses. We conclude that the method holds great promise for
Dutrieux, Loïc P.; Jakovac, Catarina C.; Latifah, Siti H.; Kooistra, Lammert
2016-05-01
We developed a method to reconstruct land use history from Landsat images time-series. The method uses a breakpoint detection framework derived from the econometrics field and applicable to time-series regression models. The Breaks For Additive Season and Trend (BFAST) framework is used for defining the time-series regression models which may contain trend and phenology, hence appropriately modelling vegetation intra and inter-annual dynamics. All available Landsat data are used for a selected study area, and the time-series are partitioned into segments delimited by breakpoints. Segments can be associated to land use regimes, while the breakpoints then correspond to shifts in land use regimes. In order to further characterize these shifts, we classified the unlabelled breakpoints returned by the algorithm into their corresponding processes. We used a Random Forest classifier, trained from a set of visually interpreted time-series profiles to infer the processes and assign labels to the breakpoints. The whole approach was applied to quantifying the number of cultivation cycles in a swidden agriculture system in Brazil (state of Amazonas). Number and frequency of cultivation cycles is of particular ecological relevance in these systems since they largely affect the capacity of the forest to regenerate after land abandonment. We applied the method to a Landsat time-series of Normalized Difference Moisture Index (NDMI) spanning the 1984-2015 period and derived from it the number of cultivation cycles during that period at the individual field scale level. Agricultural fields boundaries used to apply the method were derived using a multi-temporal segmentation approach. We validated the number of cultivation cycles predicted by the method against in-situ information collected from farmers interviews, resulting in a Normalized Residual Mean Squared Error (NRMSE) of 0.25. Overall the method performed well, producing maps with coherent spatial patterns. We identified
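BFAST itself fits harmonic-plus-trend regression models; as a much-simplified stand-in, the segment-and-breakpoint idea can be sketched by greedy binary segmentation of a piecewise-constant series. The function, penalty, and the mock NDMI profile below are all invented for illustration:

```python
import numpy as np

def binary_segmentation(y, min_size=8, penalty=0.5):
    """Greedy breakpoint detection on a piecewise-constant signal:
    recursively split a segment where the drop in residual sum of
    squares exceeds a penalty (a toy stand-in for BFAST)."""
    def best_split(a, b):
        seg = y[a:b]
        base = ((seg - seg.mean()) ** 2).sum()
        best, gain = None, 0.0
        for m in range(a + min_size, b - min_size):
            l, r = y[a:m], y[m:b]
            rss = ((l - l.mean()) ** 2).sum() + ((r - r.mean()) ** 2).sum()
            if base - rss > max(gain, penalty):
                best, gain = m, base - rss
        return best

    stack, breaks = [(0, len(y))], []
    while stack:
        a, b = stack.pop()
        m = best_split(a, b)
        if m is not None:
            breaks.append(m)
            stack += [(a, m), (m, b)]
    return sorted(breaks)

rng = np.random.default_rng(3)
# mock NDMI observations: forest, then clearing/cultivation, then regrowth
y = np.concatenate([np.full(40, 0.6), np.full(30, 0.2), np.full(40, 0.5)])
y += 0.05 * rng.normal(size=y.size)
print(binary_segmentation(y))  # regime shifts near indices 40 and 70
```

Counting the detected clearing/regrowth pairs per field is then a direct proxy for the number of cultivation cycles.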
Bock, L R; Whitledge, G W; Pracheil, B; Bailey, P
2017-02-01
The objectives of this study were to characterize relationships between water and paddlefish Polyodon spathula dentary Sr:Ca, δ¹⁸O and stable hydrogen isotope ratio (δD) to determine the accuracy with which individual P. spathula could be assigned to their collection locations using dentary-edge Sr:Ca, δD and δ¹⁸O. A laboratory experiment was also conducted to determine whether dentary Sr:Ca in age 0 year P. spathula would reflect shifts in the water Sr:Ca to which fish were exposed. Significant linear relationships between water and dentary Sr:Ca, δD and δ¹⁸O were observed, although the relationship between water and dentary δ¹⁸O was weaker than those for Sr:Ca and δD. Classification success for individual fish to collection locations that differed in water Sr:Ca, δD and δ¹⁸O ranged from 86 to 100% based on dentary-edge Sr:Ca, δD and δ¹⁸O. Dentary Sr:Ca increased significantly in laboratory-reared age 0 year P. spathula following 4 weeks of exposure to elevated water Sr:Ca; dentary Sr:Ca of fish held in water with elevated Sr:Ca was also significantly higher than that of control fish reared in ambient laboratory water. Results indicated that P. spathula dentaries reflect water signatures for commonly applied natural chemical markers and strongly suggest that dentary microchemistry and stable-isotopic compositions will be applicable for reconstructing P. spathula environmental history in locations where sufficient spatial differences in water chemistry occur.
Energy Technology Data Exchange (ETDEWEB)
Stasiuk, Lavern D.; Sweet, Art R.; Issler, Dale R. [Natural Resources Canada, Geological Survey of Canada-Calgary, 3303-33rd ST N.W., Calgary AB, Canada (T2L 2A7)
2006-01-03
Reconstruction of the Mesozoic and Cenozoic sedimentary 'cover' on the Precambrian shield in the Lac de Gras diamond field, Northwest Territories, Canada, has been achieved using Cretaceous and early Tertiary sedimentary xenoliths and contemporaneous organic matter preserved in volcaniclastic sediments associated with late Cretaceous to early Tertiary kimberlite pipe intrusions, and in situ Eocene crater-lake lacustrine and peat bog strata. Percent reflectance in oil (%Ro) of vitrinite within shale xenoliths is: (i) Albian to mid-Cenomanian/Turonian, >0.27 to 0.42 %Ro (mean 0.38 %Ro); (ii) Maastrichtian to early Paleocene, 0.24 to <0.30 %Ro; (iii) latest Paleocene to early middle Eocene, 0.15 to <0.23 %Ro (mean 0.18 %Ro). These levels of thermal maturity are corroborated by Rock-Eval pyrolysis Tmax (°C) and VIS-region fluorescence of liptinites, with wavelengths of maximum emission for sporinite, prasinophyte alginite and dinoflagellates consistent with vitrinite reflectance of 0.20 to <0.50 %Ro. Burial-thermal history modeling, constrained by measured vitrinite reflectance and porosity of shale xenoliths, predicts a maximum burial temperature for Mid to Late Albian strata (~115 Ma) of 60 °C with ~1.2 to 1.4 km of Cretaceous strata in the Lac de Gras kimberlite field region prior to major uplift and erosion, which began at 90 Ma. Late Paleocene to middle Eocene volcanic crater-lake lacustrine to peat bog strata were buried only to a few hundred meters and are at a peat to brown-coal stage of thermal maturation. (author)
Neuromagnetic source reconstruction
Energy Technology Data Exchange (ETDEWEB)
Lewis, P.S.; Mosher, J.C. [Los Alamos National Lab., NM (United States); Leahy, R.M. [University of Southern California, Los Angeles, CA (United States)
1994-12-31
In neuromagnetic source reconstruction, a functional map of neural activity is constructed from noninvasive magnetoencephalographic (MEG) measurements. The overall reconstruction problem is under-determined, so some form of source modeling must be applied. We review the two main classes of reconstruction techniques: parametric current dipole models and nonparametric distributed source reconstructions. Current dipole reconstructions use a physically plausible source model, but are limited to cases in which the neural currents are expected to be highly sparse and localized. Distributed source reconstructions can be applied to a wider variety of cases, but must incorporate an implicit source model in order to arrive at a single reconstruction. We examine distributed source reconstruction in a Bayesian framework to highlight the implicit nonphysical Gaussian assumptions of minimum-norm-based reconstruction algorithms. We conclude with a brief discussion of alternative non-Gaussian approaches.
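The Gaussian-prior (minimum-norm) reconstruction discussed in this abstract can be sketched numerically. Everything below is an illustrative assumption (random lead field, source and sensor counts, regularization value), not the authors' setup:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical lead field mapping 100 distributed sources to 16 MEG sensors.
n_sensors, n_sources = 16, 100
L = rng.standard_normal((n_sensors, n_sources))

# Sparse "true" source configuration and noisy sensor measurements.
x_true = np.zeros(n_sources)
x_true[[20, 60]] = [1.0, -0.5]
y = L @ x_true + 0.01 * rng.standard_normal(n_sensors)

# Minimum-norm estimate: the MAP solution under a zero-mean Gaussian
# source prior -- the implicit, physically unmotivated assumption the
# abstract refers to.
lam = 1e-2  # assumed ratio of noise variance to prior source variance
x_mn = L.T @ np.linalg.solve(L @ L.T + lam * np.eye(n_sensors), y)

# The estimate reproduces the data but smears energy across many sources.
residual = np.linalg.norm(y - L @ x_mn) / np.linalg.norm(y)
```

The under-determination (16 equations, 100 unknowns) is resolved only by the prior; replacing the Gaussian prior with a sparsity-promoting one changes the character of the reconstruction entirely, which is the motivation for the non-Gaussian approaches mentioned above.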
Bayesian calibration of a post-LGM model of Laurentide ice-sheet evolution
Tarasov, L.; Peltier, W. R.
2003-04-01
Though numerous inferences have been made with regard to the deglaciation history of the Wisconsin North American ice-sheet complex, no attempt has been made to place objective confidence ranges on these inferences. Furthermore, past efforts to reconstruct the Wisconsin deglaciation history have relied on restricted, discipline-specific constraints. Approaches based on dynamical glacial models have ignored geophysical constraints such as Relative Sea Level histories. Geophysically based reconstructions, on the other hand, have ignored glaciological self-consistency and Marine Limit data. To remedy this situation, we present a Bayesian calibration of a 3D thermo-mechanically coupled ice-sheet systems model using: 1) a large set of Relative Sea Level observations (from 415 sites), 2) Marine Limit observations, 3) a north-south transect of gravity measurements, 4) direct observations of the present-day rate of basal uplift at Yellowknife, and 5) a new high-resolution ice-margin chronology derived from geological and geomorphological observations. Given the large parameter space (O(20) parameters), Bayesian neural networks, trained on a thousand runs of the ice-sheet systems model, are employed to simulate the glacial model within the statistical analyses. The end result is a posterior distribution for model parameters (and thereby modelled glacial histories) given the observational data sets, one that thereby also takes data uncertainty into account. Strong support is provided for a multi-domed Laurentide ice sheet. We also identify key dynamical processes (i.e. the most relevant model parameters) along with critical geographic regions in need of further data.
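The emulator-based calibration strategy described above (a cheap surrogate trained on a limited set of expensive model runs, then used inside the Bayesian update) can be sketched in one dimension. The toy "simulator", the cubic surrogate standing in for the paper's Bayesian neural networks, and all numbers are assumptions for illustration only:

```python
import numpy as np

# Stand-in for an expensive glacial-systems model: one parameter
# mapped to one predicted observable (hypothetical functional form).
def simulator(theta):
    return np.tanh(theta) + 0.3 * theta

# 1) Run the simulator at a modest number of design points.
design = np.linspace(0.0, 3.0, 15)
runs = simulator(design)

# 2) Fit a cheap surrogate to those runs (a cubic here; the paper
#    trains Bayesian neural networks over an O(20)-parameter space).
emulator = np.poly1d(np.polyfit(design, runs, deg=3))

# 3) Bayesian update on a dense grid, evaluating only the emulator.
obs, obs_sigma = simulator(1.3), 0.05   # synthetic observation
grid = np.linspace(0.0, 3.0, 2001)
log_post = -0.5 * ((emulator(grid) - obs) / obs_sigma) ** 2  # flat prior
post = np.exp(log_post - log_post.max())
post /= post.sum()
theta_map = grid[np.argmax(post)]
```

The posterior grid requires thousands of emulator evaluations but only 15 simulator runs, which is the point of the surrogate: the statistical analysis never has to call the expensive model directly.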
Bessiere, Pierre; Ahuactzin, Juan Manuel; Mekhnacha, Kamel
2013-01-01
Probability as an Alternative to Boolean Logic. While logic is the mathematical foundation of rational reasoning and the fundamental principle of computing, it is restricted to problems where information is both complete and certain. However, many real-world problems, from financial investments to email filtering, are incomplete or uncertain in nature. Probability theory and Bayesian computing together provide an alternative framework to deal with incomplete and uncertain data. Decision-Making Tools and Methods for Incomplete and Uncertain Data. Emphasizing probability as an alternative to Boolean
Bayesian Compressed Sensing with Unknown Measurement Noise Level
DEFF Research Database (Denmark)
Hansen, Thomas Lundgaard; Jørgensen, Peter Bjørn; Pedersen, Niels Lovmand
2013-01-01
In sparse Bayesian learning (SBL) approximate Bayesian inference is applied to find sparse estimates from observations corrupted by additive noise. Current literature only vaguely considers the case where the noise level is unknown a priori. We show that for most state-of-the-art reconstruction a...
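The abstract is truncated, but the problem it poses, inference when the noise level is unknown a priori, can be illustrated with a hedged toy version of type-II maximum likelihood: marginalize the weights and maximize the evidence over the noise standard deviation. The model sizes, fixed unit prior, and grid below are invented for the sketch and simpler than the SBL setting of the paper:

```python
import numpy as np

rng = np.random.default_rng(2)

# Bayesian linear model y = A w + noise, w ~ N(0, I), with the noise
# standard deviation sigma treated as unknown.
n, m = 100, 5
A = rng.standard_normal((n, m))
w = rng.standard_normal(m)
true_sigma = 0.3
y = A @ w + true_sigma * rng.standard_normal(n)

# Marginalizing w gives the evidence y ~ N(0, A A^T + sigma^2 I);
# estimate sigma by maximizing the log-evidence over a grid.
def log_evidence(sigma):
    C = A @ A.T + sigma**2 * np.eye(n)
    _, logdet = np.linalg.slogdet(C)
    return -0.5 * (logdet + y @ np.linalg.solve(C, y))

sigmas = np.linspace(0.05, 1.0, 96)
sigma_hat = sigmas[np.argmax([log_evidence(s) for s in sigmas])]
```

With 95 data dimensions orthogonal to the column space of A carrying pure noise, the evidence is sharply peaked near the true noise level; in the underdetermined compressed-sensing setting the same principle applies but is interwoven with the sparsity-inducing prior.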
DEFF Research Database (Denmark)
Alberdi, Antton; Gilbert, M. Thomas P; Razgour, Orly;
2015-01-01
Aim: We used an integrative approach to reconstruct the evolutionary history of the alpine long-eared bat, Plecotus macrobullaris, to test whether the variable effects of Pleistocene climatic oscillations across geographical regions led to contrasting population-level demographic histories within...... a single species. Location: The Western Palaearctic. Methods: We sequenced the complete mitochondrial genomes of 57 individuals from across the distribution of the species. The analysis integrated ecological niche modelling (ENM), approximate Bayesian computation (ABC), measures of genetic diversity...
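Approximate Bayesian computation (ABC), one of the methods named in this abstract, can be sketched in its simplest rejection form. The normal-mean toy problem, prior, tolerance, and sample sizes below are illustrative assumptions, unrelated to the mitogenome analysis itself:

```python
import numpy as np

rng = np.random.default_rng(3)

# Observed summary statistic (here: sample mean of 50 data points).
obs_mean = 1.0
n_draws, n_data, eps = 20000, 50, 0.05

# 1) Draw parameters from the prior, 2) simulate data under each draw,
# 3) keep parameters whose simulated summary lands within eps of the
# observed summary. The kept draws approximate the posterior.
theta = rng.normal(0.0, 2.0, n_draws)
sim_means = rng.normal(theta[:, None], 1.0, (n_draws, n_data)).mean(axis=1)
accepted = theta[np.abs(sim_means - obs_mean) < eps]
```

In this toy the exact posterior is available in closed form (conjugate normal), which makes it useful as a sanity check; real ABC applications, as in the study above, rely on simulators whose likelihoods are intractable.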
Bayesian large-scale structure inference and cosmic web analysis
Leclercq, Florent
2015-01-01
Surveys of the cosmic large-scale structure carry opportunities for building and testing cosmological theories about the origin and evolution of the Universe. This endeavor requires appropriate data assimilation tools, for establishing the contact between survey catalogs and models of structure formation. In this thesis, we present an innovative statistical approach for the ab initio simultaneous analysis of the formation history and morphology of the cosmic web: the BORG algorithm infers the primordial density fluctuations and produces physical reconstructions of the dark matter distribution that underlies observed galaxies, by assimilating the survey data into a cosmological structure formation model. The method, based on Bayesian probability theory, provides accurate means of uncertainty quantification. We demonstrate the application of BORG to the Sloan Digital Sky Survey data and describe the primordial and late-time large-scale structure in the observed volume. We show how the approach has led to the fi...
Introduction to Bayesian statistics
Bolstad, William M
2017-01-01
There is a strong upsurge in the use of Bayesian methods in applied statistical analysis, yet most introductory statistics texts only present frequentist methods. Bayesian statistics has many important advantages that students should learn about if they are going into fields where statistics will be used. In this Third Edition, four newly added chapters address topics that reflect the rapid advances in the field of Bayesian statistics. The author continues to provide a Bayesian treatment of introductory statistical topics, such as scientific data gathering, discrete random variables, robust Bayesian methods, and Bayesian approaches to inference for discrete random variables, binomial proportion, Poisson, normal mean, and simple linear regression. In addition, newly developing topics in the field are presented in four new chapters: Bayesian inference with unknown mean and variance; Bayesian inference for the Multivariate Normal mean vector; Bayesian inference for the Multiple Linear Regression Model; and Computati...
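As a minimal illustration of the conjugate updating this kind of introductory treatment covers for a binomial proportion (the numbers are an invented example):

```python
# Beta-binomial conjugacy: a Beta(a, b) prior for a binomial proportion
# updates to Beta(a + k, b + n - k) after observing k successes in n trials.
def update(a, b, k, n):
    return a + k, b + (n - k)

# Uniform Beta(1, 1) prior, then 7 successes observed in 10 trials.
a_post, b_post = update(1, 1, 7, 10)
post_mean = a_post / (a_post + b_post)  # 8 / 12
```

The posterior mean (8/12) sits between the prior mean (1/2) and the sample proportion (7/10), the shrinkage behaviour such texts use to contrast Bayesian with frequentist point estimates.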
Energy Technology Data Exchange (ETDEWEB)
Holzhauser, H.
2008-07-01
Around the middle of the 19th century, alpine glaciers advanced to their last maximum extension within the Holocene (the last 11'600 years). Some of the glaciers, especially the Great Aletsch and Gorner, penetrated deeply into wooded land and destroyed numerous trees. Not only were trees destroyed, but also valuable arable farmland, alpine farm buildings and dwelling houses. Since the last maximum extension in the 19th century, the retreat of the glaciers has accelerated, revealing, within the glacier forefields, the remains of trees once buried. Some of this fossil wood is found in the place where it grew (in situ). Often the wood dates back to a time before the last glacier advance; most of it is several thousands of years old, because glacial advance and retreat periods occurred repeatedly within the Holocene. This paper shows the characteristics of fossil wood and how it can be analysed to reconstruct glacial history. It will be demonstrated how glacier length variation can be exactly reconstructed with the help of dendrochronology. Thanks to the very exact reconstruction of the glacier length change during the advance periods in the 14th and 16th centuries, the velocities of both the Gorner and Great Aletsch glaciers can be estimated. They range between 7-8 and 20 m per year, in the case of the Gorner glacier, and between 7-8 and 36 m per year, in the case of the Great Aletsch glacier. (author)
Bayesian artificial intelligence
Korb, Kevin B
2003-01-01
As the power of Bayesian techniques has become more fully realized, the field of artificial intelligence has embraced Bayesian methodology and integrated it to the point where an introduction to Bayesian techniques is now a core course in many computer science programs. Unlike other books on the subject, Bayesian Artificial Intelligence keeps mathematical detail to a minimum and covers a broad range of topics. The authors integrate all of Bayesian net technology and learning Bayesian net technology and apply them both to knowledge engineering. They emphasize understanding and intuition but also provide the algorithms and technical background needed for applications. Software, exercises, and solutions are available on the authors' website.
Bayesian artificial intelligence
Korb, Kevin B
2010-01-01
Updated and expanded, Bayesian Artificial Intelligence, Second Edition provides a practical and accessible introduction to the main concepts, foundation, and applications of Bayesian networks. It focuses on both the causal discovery of networks and Bayesian inference procedures. Adopting a causal interpretation of Bayesian networks, the authors discuss the use of Bayesian networks for causal modeling. They also draw on their own applied research to illustrate various applications of the technology.New to the Second EditionNew chapter on Bayesian network classifiersNew section on object-oriente
3D Surface Reconstruction and Automatic Camera Calibration
Jalobeanu, Andre
2004-01-01
This view-graph presentation illustrates a Bayesian approach to 3D surface reconstruction and camera calibration. Existing methods, surface analysis and modeling, preliminary surface reconstruction results, and potential applications are addressed.
Nabholz, Benoit; Lartillot, Nicolas
2013-01-01
The nearly neutral theory, which proposes that most mutations are deleterious or close to neutral, predicts that the ratio of nonsynonymous over synonymous substitution rates (dN/dS), and potentially also the ratio of radical over conservative amino acid replacement rates (Kr/Kc), are negatively correlated with effective population size. Previous empirical tests, using life-history traits (LHT) such as body-size or generation-time as proxies for population size, have been consistent with these predictions. This suggests that large-scale phylogenetic reconstructions of dN/dS or Kr/Kc might reveal interesting macroevolutionary patterns in the variation in effective population size among lineages. In this work, we further develop an integrative probabilistic framework for phylogenetic covariance analysis introduced previously, so as to estimate the correlation patterns between dN/dS, Kr/Kc, and three LHT, in mitochondrial genomes of birds and mammals. Kr/Kc displays stronger and more stable correlations with LHT than does dN/dS, which we interpret as a greater robustness of Kr/Kc, compared with dN/dS, the latter being confounded by the high saturation of the synonymous substitution rate in mitochondrial genomes. The correlation of Kr/Kc with LHT was robust when controlling for the potentially confounding effects of nucleotide compositional variation between taxa. The positive correlation of the mitochondrial Kr/Kc with LHT is compatible with previous reports, and with a nearly neutral interpretation, although alternative explanations are also possible. The Kr/Kc model was finally used for reconstructing life-history evolution in birds and mammals. This analysis suggests a fairly large-bodied ancestor in both groups. In birds, life-history evolution seems to have occurred mainly through size reduction in Neoavian birds, whereas in placental mammals, body mass evolution shows disparate trends across subclades. Altogether, our work represents a further step toward a more
Macho, Gabriele A; Lee-Thorp, Julia A
2014-01-01
Factors influencing the hominoid life histories are poorly understood, and little is known about how ecological conditions modulate the pace of their development. Yet our limited understanding of these interactions underpins life history interpretations in extinct hominins. Here we determined the synchronisation of dental mineralization/eruption with brain size in a 20th century museum collection of sympatric Gorilla gorilla and Pan troglodytes from Central Cameroon. Using δ13C and δ15N of individuals' hair, we assessed whether and how differences in diet and habitat use may have impacted on ape development. The results show that, overall, gorilla hair δ13C and δ15N values are more variable than those of chimpanzees, and that gorillas are consistently lower in δ13C and δ15N compared to chimpanzees. Within a restricted, isotopically-constrained area, gorilla brain development appears delayed relative to dental mineralization/eruption [or dental development is accelerated relative to brains]: only about 87.8% of adult brain size is attained by the time first permanent molars come into occlusion, whereas it is 92.3% in chimpanzees. Even when M1s are already in full functional occlusion, gorilla brains lag behind those of chimpanzee (91% versus 96.4%), relative to tooth development. Both bootstrap analyses and stable isotope results confirm that these results are unlikely due to sampling error. Rather, δ15N values imply that gorillas are not fully weaned (physiologically mature) until well after M1 are in full functional occlusion. In chimpanzees the transition from infant to adult feeding appears (a) more gradual and (b) earlier relative to somatic development. Taken together, the findings are consistent with life history theory that predicts delayed development when non-density dependent mortality is low, i.e. in closed habitats, and with the "risk aversion" hypothesis for frugivorous species as a means to avert starvation. Furthermore, the results highlight
Applied Bayesian Hierarchical Methods
Congdon, Peter D
2010-01-01
Bayesian methods facilitate the analysis of complex models and data structures. Emphasizing data applications, alternative modeling specifications, and computer implementation, this book provides a practical overview of methods for Bayesian analysis of hierarchical models.
Gelman, Andrew; Stern, Hal S; Dunson, David B; Vehtari, Aki; Rubin, Donald B
2013-01-01
FUNDAMENTALS OF BAYESIAN INFERENCE: Probability and Inference; Single-Parameter Models; Introduction to Multiparameter Models; Asymptotics and Connections to Non-Bayesian Approaches; Hierarchical Models. FUNDAMENTALS OF BAYESIAN DATA ANALYSIS: Model Checking; Evaluating, Comparing, and Expanding Models; Modeling Accounting for Data Collection; Decision Analysis. ADVANCED COMPUTATION: Introduction to Bayesian Computation; Basics of Markov Chain Simulation; Computationally Efficient Markov Chain Simulation; Modal and Distributional Approximations. REGRESSION MODELS: Introduction to Regression Models; Hierarchical Linear
DEFF Research Database (Denmark)
Sichani, Mahdi Teimouri; Brincker, Rune
2008-01-01
Computing displacements of a structure from its measured accelerations has been a major concern in some fields of engineering, such as earthquake engineering. In vibration engineering, too, displacements are occasionally preferred to acceleration histories, e.g. in the determination of forces applied...
Radermacher, Pascal; Schöne, Bernd R.; Gischler, Eberhard; Oschmann, Wolfgang; Thébault, Julien; Fiebig, Jens
2010-05-01
The shell of the queen conch Strombus gigas provides a rapidly growing palaeoenvironmental proxy archive, allowing the detailed reconstruction of important life-history traits such as ontogeny, growth rate and growth seasonality. In this study, modern sclerochronological methods are used to cross-date the palaeotemperatures derived from the shell with local sea surface temperature (SST) records. The growth history of the shell suggests a bimodal seasonality in growth, with the growing season confined to the interval between April and November. In Glovers Reef, offshore Belize, the queen conch accreted shell carbonate at rates of up to 6 mm day-1 during the spring (April-June) and autumn (September-November). However, a reduced period of growth occurred during the mid-summer months (July-August). The shell growth patterns indicate a positive response to annual seasonality with regard to precipitation. It seems likely that when precipitation levels are high, food availability is increased as the result of nutrient input to the ecosystem, in correspondence with an increase in coastal runoff. Slow growth rates occur when precipitation, and as a consequence riverine runoff, is low. The SST, however, appears to influence growth only on a secondary level. Despite the bimodal growing season and the winter cessation in growth, the growth rates reconstructed here from two S. gigas shells are among the fastest yet reported for this species. The S. gigas specimens from Belize reached their final shell height (22.7 and 23.5 cm, measured from the apex to the siphonal notch) at the transition to adulthood in just 2 years. The extremely rapid growth observed in this species permits detailed, high-resolution reconstructions of life-history traits, where sub-daily resolutions can be achieved with ease. The potential for future studies has yet to be fully explored. Queen conch sclerochronology provides an opportunity to recover extremely high-resolution palaeotemperature
Náfrádi, Katalin; Bodor, Elvira; Törőcsik, Tünde; Sümegi, Pál
2011-12-01
The significance of geoarchaeological investigations is indisputable in reconstructing the former environment and in studying the relationship between humans and their surroundings. Several disciplines have developed during the last few decades to give insight into earlier time periods and their climatic conditions (e.g. palynology, malacology, archaeobotany, phytology and animal osteology). Charcoal and pollen analytical studies from the rescue excavation of the M0 motorway provide information about the vegetation changes of the past. These methods are used to reconstruct the environment of the former settlements and to detect human impact and natural climatic changes. The sites examined span the periods of the Late Copper Age, Late Bronze Age, Middle Iron Age, Late Iron Age, Sarmatian period, Late Sarmatian period, Migration period, Late Migration period and Middle Ages. The vegetation before the Copper Age is based only on pollen analytical data. Anthracological results show the overall dominance of Quercus and a great number of Ulmus, Fraxinus, Acer, Fagus, Alnus and Populus/Salix tree fossils, as well as the residues of fruit trees present in the charred wood assemblage.
Institute of Scientific and Technical Information of China (English)
Widell, Magnus; IHAC
2002-01-01
I. INTRODUCTION AND BACKGROUND. In the introduction of her recent book on Ur-Namma in the Sumerian literary tradition, E. Flückiger-Hawker offers a comprehensive account and summary of the previous scholarship on the complicated political history between the fall of the Akkadian empire and the beginning of the Ur III period. From this account, it becomes clear that this scholarship primarily relies on various interpretations of
Kading, T J; Mason, R P; Leaner, J J
2009-01-01
Mercury deposition histories have been scarcely documented in the southern hemisphere. A sediment core was collected from the ecologically important estuarine floodplain of the Berg River (South Africa). We establish the concentration of Hg in this (210)Pb-dated sediment core; the record reflects mercury deposition to the site and reveals the onset of enhanced mercury deposition in 1970. The ratio of methylmercury to total mercury is relatively high in these sediments when compared to other wetlands.
Emmel, Benjamin
2004-01-01
Titanite and apatite fission-track (FT) thermochronology on 127 basement and 18 sedimentary rock samples from central and southern Madagascar records a complex cooling and denudation history since the Early Palaeozoic. Titanite FT analyses gave ages ranging between 483 Ma and 266 Ma. Apatite FT ages vary between 460 Ma and 79 Ma. Samples from Late Carboniferous to Jurassic sediments from the Morondava basin gave apatite FT ages ranging between 462 Ma and 184 Ma. The FT data argue for reactivation...
DEFF Research Database (Denmark)
Miraldo, Andreia; Hewitt, Godfrey M; Dear, Paul H
2012-01-01
In northwestern Iberia, two largely allopatric Lacerta lepida mitochondrial lineages occur, L5 occurring to the south of Douro River and L3 to the north, with a zone of putative secondary contact in the region of the Douro River valley. Cytochrome b sequence chromatograms with polymorphisms...... of secondary contact between the lineages. The additional incidence of these numts to the north of the putative contact zone is consistent with an earlier postglacial northward range expansion of L5, preceding that of L3. We show that genetic exchange between the lineages responsible for the origin...... of these numts in L3 after secondary contact occurred prior to, or coincident with, the northward expansion of L3. This study shows that, in the context of phylogeographic analysis, numts can provide evidence for past demographic events and can be useful tools for the reconstruction of complex evolutionary...
Roopnarine, P. D.; Anderson, L.; Roopnarine, D.; Gillikin, D. P.; Leal, J.
2012-12-01
The Earth's environments are changing more rapidly today than at almost any time in the Phanerozoic. These changes are driven by human activities, and include climate change, landscape alteration, fragmentation and destruction, environmental pollution, species overexploitation, and invasive species. The rapidity of the changes challenges our best efforts to document what is changing, how it has changed, and what has been lost. Central to these efforts, therefore, is the proper documentation, archiving and curation of past environments. Natural history and other research collections form the core of this documentation, and have proven vital to recent studies of environmental change. Those collections are, however, generally under-utilized and under-appreciated by the general research community. Also, their utility is hampered by insufficient availability of the data, and the very nature of what has been collected in the past. Past collections emphasized a typological approach, placing emphasis on individual specimens and diversity, whether geological or biological, while what is needed today is greater emphasis on archiving entire environments. The concept of shifting baselines establishes that even on historical time scales, the notion of what constitutes an unaltered environment is biased by a lack of documentation and understanding of environments in the recent past. Baselines are necessary, however, for the proper implementation of mitigating procedures, for environmental restoration or remediation, and for predicting the near-term future. Here we present results from a study of impacts of the Deepwater Horizon oil spill (DWH) on the American oyster Crassostrea virginica. Natural history collections of specimens from the Gulf and elsewhere have been crucial to this effort, and serve as an example of how important such collections are to current events. We are examining the effects of spill exposure on shell growth and tissue development, as well as the potential
Marin-Carbonne, Johanna; Chaussidon, Marc; Robert, François
2012-09-01
Oxygen and silicon isotopes in cherts have been extensively used for the reconstruction of seawater temperature during the Precambrian. These reconstructions have been challenged because cherts can have various origins (hydrothermal, sedimentary, volcanic silicification) and their isotopic compositions might have been reset by metamorphic fluid circulation. Existing criteria used to assess the pristine sedimentary origin of a chert are based on petrography (criterion #1: chert is composed mostly of microquartz); on the bulk oxygen isotopic composition (criterion #2: bulk δ18O has to be close enough to the maximum δ18O value previously measured in other cherts of the same age); and on the presence of a large δ18O range at the micrometer scale (criterion #3: δ18O range of ˜10‰ at ˜2 μm). However, these criteria remain incomplete in determining precisely the origin and degree of preservation of ancient cherts. We report in situ Si and O isotope compositions and trace element concentrations in seven chert samples ranging from 1.88 to 3.5 Ga in age. Correlations between δ30Si and Al2O3 (and K2O, TiO2) reveal that microquartz is of three different origins, i.e. diagenetic, hydrothermal or silicification. Moreover, chert samples composed mostly of diagenetic microquartz show a large range of δ30Si at the micrometer scale (1.7-4.5‰), consistent with the large range of δ18O previously found in the Gunflint diagenetic cherts. We propose two further quantitative criteria to assess the origin, state of preservation and diagenetic history of cherts. Criterion #4 uses trace element concentrations coupled with δ30Si to ascribe the origin of cherts among three possible end-members (diagenetic, hydrothermal, and silicified). Criterion #5 is the presence of a large range of δ30Si in pure diagenetic microquartz. In the seven samples analyzed in this study, only one (from the Gunflint Iron formation at 1.88 Ga) passes all the criteria assessed here and can be used for
Hanin, Leonid
2013-01-01
The hypothesis of early metastasis was debated for several decades. Dormant cancer cells and surgery-induced acceleration of metastatic growth were first observed in clinical studies and animal experiments conducted more than a century ago; later, these findings were confirmed in numerous modern studies. In this primarily methodological work, we discuss critically important, yet largely unobservable, aspects of the natural history of cancer, such as (1) early metastatic dissemination; (2) dormancy of secondary tumors; (3) treatment-related interruption of metastatic dormancy, induction of angiogenesis, and acceleration of the growth of vascular metastases; and (4) the existence of cancer stem cells. We focus on the unique role played by very general mathematical models of the individual natural history of cancer that are entirely mechanistic yet, somewhat paradoxically, essentially free of assumptions about specific nature of the underlying biological processes. These models make it possible to reconstruct in considerable detail the individual natural history of cancer and retrospectively assess the effects of treatment. Thus, the models can be used as a tool for generation and validation of biomedical hypotheses related to carcinogenesis, primary tumor growth, its metastatic dissemination, growth of metastases, and the effects of various treatment modalities. We discuss in detail one such general model and review the conclusions relevant to the aforementioned aspects of cancer progression that were drawn from fitting a parametric version of the model to data on the volumes of bone metastases in one breast cancer patient and 12 prostate cancer patients.
Chang, Jinfeng; Ciais, Philippe; Herrero, Mario; Havlik, Petr; Campioli, Matteo; Zhang, Xianzhou; Bai, Yongfei; Viovy, Nicolas; Joiner, Joanna; Wang, Xuhui; Peng, Shushi; Yue, Chao; Piao, Shilong; Wang, Tao; Hauglustaine, Didier A.; Soussana, Jean-Francois; Peregon, Anna; Kosykh, Natalya; Mironycheva-Tokareva, Nina
2016-06-01
Grassland management type (grazed or mown) and intensity (intensive or extensive) play a crucial role in the greenhouse gas balance and surface energy budget of this biome, both at field scale and at large spatial scale. However, global gridded historical information on grassland management intensity is not available. Combining modelled grass-biomass productivity with statistics of the grass-biomass demand by livestock, we reconstruct gridded maps of grassland management intensity from 1901 to 2012. These maps include the minimum area of managed vs. maximum area of unmanaged grasslands and the fraction of mown vs. grazed area at a resolution of 0.5° by 0.5°. The grass-biomass demand is derived from a livestock dataset for 2000, extended to cover the period 1901-2012. The grass-biomass supply (i.e. forage grass from mown grassland and biomass grazed) is simulated by the process-based model ORCHIDEE-GM driven by historical climate change, rising CO2 concentration, and changes in nitrogen fertilization. The global area of managed grassland obtained in this study increases from 6.1 × 10^6 km^2 in 1901 to 12.3 × 10^6 km^2 in 2000, although the expansion pathway varies between different regions. ORCHIDEE-GM also simulated an increase in global mean productivity and herbage-use efficiency over managed grassland during the 20th century, indicating a general intensification of grassland management at global scale, but with regional differences. The gridded grassland management intensity maps are model dependent because they depend on modelled productivity. Thus, specific attention was given to the evaluation of modelled productivity against a series of observations, from site-level net primary productivity (NPP) measurements to two global satellite products of gross primary productivity (GPP) (MODIS-GPP and SIF data). Generally, ORCHIDEE-GM captures the spatial pattern, seasonal cycle, and interannual variability of grassland productivity at global scale well, and thus is
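The supply-demand logic for deriving a minimum managed area can be sketched per grid cell. The numbers below are invented, and this is a simplification of the paper's actual reconstruction, not its implementation:

```python
import numpy as np

# Toy grid cells: modelled grass-biomass supply (as a model like
# ORCHIDEE-GM would provide) and livestock grass-biomass demand,
# both in the same arbitrary biomass units.
supply = np.array([4.0, 2.0, 0.5, 3.0])
demand = np.array([1.0, 2.5, 0.2, 0.0])

# Minimum managed fraction of each cell: the share of modelled
# productivity needed to meet livestock demand, capped at 1 where
# demand exceeds what the cell can supply.
managed_frac = np.clip(demand / supply, 0.0, 1.0)
unmanaged_frac = 1.0 - managed_frac   # maximum unmanaged area fraction
```

The "minimum managed vs. maximum unmanaged" phrasing in the abstract follows from this direction of inference: demand fixes a lower bound on managed area, leaving the remainder as the largest area that could be unmanaged.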
Energy Technology Data Exchange (ETDEWEB)
Faria, Gleikam Lopes de Oliveira; Menezes, Maria Angela de B.C.; Silva, Maria Aparecida, E-mail: menezes@cdtn.br, E-mail: cida@cdtn.br [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil). Servico de Reator e Tecnicas Analiticas. Laboratorio de Ativacao Neutronica; Sabino, Claudia de V.S. [PUC-Minas, Belo Horizonte, MG (Brazil)
2011-07-01
Due to the high importance of material vestiges for the culture of a nation, the Brazilian Council for the Environment determined that the licensing of new enterprises is subject to a technical report on environmental impact, including archaeological sites affected by the enterprise. Therefore, in response to the report related to the Program for Prospection and Rescue of the Archaeological Patrimony of the Areas Impacted by the Installation of the Second Line of the Samarco Mining Pipeline, archaeological interventions were carried out along the coast of Espirito Santo. Vestiges of the Tupi-Guarani Tradition were found there, the main evidence being ceramics of particular interest. Archaeology can bridge the gap between ancient populations and modern society by elucidating the evidence found in archaeological sites. In this context, several ceramic fragments found in the archaeological sites - Hiuton and Bota-Fora - were analyzed by the neutron activation technique, k0-standardization method, at CDTN using the TRIGA MARK I IPR-R1 reactor, in order to characterize their elemental composition. The elements As, Ba, Br, Ce, Co, Cr, Cs, Eu, Fe, Ga, Hf, K, La, Na, Nd, Rb, Sb, Sc, Sm, Ta, Tb, Th, U, Yb, Zn and Zr were determined. Applying robust multivariate statistical analysis in the R software, the results pointed out that the pottery from the sites was made with clay from different sources. X-ray powder diffraction analyses were carried out to determine the mineral composition, and Mössbauer spectroscopy was applied to provide information on both the degree of burning and the firing atmosphere, in order to reconstruct the firing temperatures and strategies used by the Indigenous potters. (author)
Bayesian Games with Intentions
Directory of Open Access Journals (Sweden)
Adam Bjorndahl
2016-06-01
We show that standard Bayesian games cannot represent the full spectrum of belief-dependent preferences. However, by introducing a fundamental distinction between intended and actual strategies, we remove this limitation. We define Bayesian games with intentions, generalizing both Bayesian games and psychological games, and prove that Nash equilibria in psychological games correspond to a special class of equilibria as defined in our setting.
Bayesian statistics an introduction
Lee, Peter M
2012-01-01
Bayesian Statistics is the school of thought that combines prior beliefs with the likelihood of a hypothesis to arrive at posterior beliefs. The first edition of Peter Lee’s book appeared in 1989, but the subject has moved ever onwards, with increasing emphasis on Monte Carlo based techniques. This new fourth edition looks at recent techniques such as variational methods, Bayesian importance sampling, approximate Bayesian computation and Reversible Jump Markov Chain Monte Carlo (RJMCMC), providing a concise account of the way in which the Bayesian approach to statistics develops as well
Understanding Computational Bayesian Statistics
Bolstad, William M
2011-01-01
A hands-on introduction to computational statistics from a Bayesian point of view Providing a solid grounding in statistics while uniquely covering the topics from a Bayesian perspective, Understanding Computational Bayesian Statistics successfully guides readers through this new, cutting-edge approach. With its hands-on treatment of the topic, the book shows how samples can be drawn from the posterior distribution when the formula giving its shape is all that is known, and how Bayesian inferences can be based on these samples from the posterior. These ideas are illustrated on common statistical models
An introduction to Gaussian Bayesian networks.
Grzegorczyk, Marco
2010-01-01
The extraction of regulatory networks and pathways from postgenomic data is important for drug discovery and development, as the extracted pathways reveal how genes or proteins regulate each other. Following up on the seminal paper of Friedman et al. (J Comput Biol 7:601-620, 2000), Bayesian networks have been widely applied as a popular tool to this end in systems biology research. Their popularity stems from the tractability of the marginal likelihood of the network structure, which is a consistent scoring scheme in the Bayesian context. This score is based on an integration over the entire parameter space, for which highly expensive computational procedures have to be applied when using more complex models based on differential equations; for example, see (Bioinformatics 24:833-839, 2008). This chapter gives an introduction to reverse engineering regulatory networks and pathways with Gaussian Bayesian networks, that is, Bayesian networks with the probabilistic BGe scoring metric [see (Geiger and Heckerman 235-243, 1995)]. In the BGe model, the data are assumed to stem from a Gaussian distribution and a normal-Wishart prior is assigned to the unknown parameters. Gaussian Bayesian network methodology for analysing static observational, static interventional, as well as dynamic (observational) time series data will be described in detail in this chapter. Finally, we apply these Bayesian network inference methods (1) to observational and interventional flow cytometry (protein) data from the well-known RAF pathway, to evaluate the global network reconstruction accuracy of Bayesian network inference, and (2) to dynamic gene expression time series data of nine circadian genes in Arabidopsis thaliana, to reverse engineer the unknown regulatory network topology for this domain.
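The linear-Gaussian assumption behind the BGe model can be illustrated with a toy two-node network. The sketch below (plain Python, standard library only; the network, edge weight, and sample size are invented for illustration and are not from the chapter) shows the core modelling idea: each node is normally distributed around a linear function of its parents, so an edge coefficient is recoverable from the joint Gaussian distribution.

```python
import random

random.seed(0)

# Toy linear-Gaussian network X -> Y, as assumed by Gaussian Bayesian
# networks: X ~ N(0, 1) and Y = 2*X + N(0, 0.5), with an illustrative
# edge weight of 2 (hypothetical, chosen for this sketch).
xs = [random.gauss(0.0, 1.0) for _ in range(10_000)]
ys = [2.0 * x + random.gauss(0.0, 0.5) for x in xs]

# Under the model, the regression slope Cov(X, Y) / Var(X) recovers
# the edge weight from data alone.
mx = sum(xs) / len(xs)
my = sum(ys) / len(ys)
cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / len(xs)
var = sum((x - mx) ** 2 for x in xs) / len(xs)
slope = cov / var  # close to the generating weight of 2.0
```

In the full BGe framework, a normal-Wishart prior over such parameters is integrated out analytically, which is what makes the marginal likelihood of a candidate network structure tractable.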
Directory of Open Access Journals (Sweden)
Vincent Berthon
2013-09-01
Like many lakes worldwide, French sub-alpine lakes (lakes Annecy, Bourget and Geneva) have suffered from eutrophication in the mid-20th century. Although restoration measures have been undertaken and have resulted in significant reductions in nutrient inputs and concentrations over the last 30 years in all three lakes, the limnological monitoring does not extend back far enough to establish the reference conditions, as defined by the European Water Framework Directive. The over-arching aim of this work was to reconstruct, using a paleolimnological approach, the pre-eutrophication levels and subsequent temporal changes in the lakes' trophic status over the last century. The objectives were three-fold: (i) to test whether fossil diatoms archived in deep sediment cores adequately reflect past changes in the planktonic diatom communities of these deep sub-alpine lakes, based on data from lake Geneva; (ii) to investigate changes in the diatom communities over the last 150 years in the three lakes; and (iii) to infer the past total phosphorus (TP) concentrations of the lakes from a diatom-based transfer function. Annual paleolimnological and limnological diatom countings for lake Geneva were strongly correlated over the last 30 years. The most notable differences essentially resulted from both taphonomic and depositional biases, as evidenced by the underestimation of thin-skeleton species such as Asterionella formosa and Diatoma tenuis in the paleolimnological dataset and the presence of many benthic taxa. The fossil diatom records revealed shifts in the communities in the three lakes over time, most of which were changes typically associated with nutrient enrichment. Indeed, in all three lakes, the proportion of Cyclotella spp. was very high before the 1950s, but these species were then replaced by more eutrophic taxa, such as Stephanodiscus spp., by the mid-20th century. From the 1980s, some but not all diatom species typical of re-oligotrophicated conditions (i
Danišík, Martin; McInnes, Brent I. A.; Kirkland, Christopher L.; McDonald, Brad J.; Evans, Noreen J.; Becker, Thomas
2017-01-01
Zircon (U-Th)/He thermochronometry is an established radiometric dating technique used to place temporal constraints on a range of thermally sensitive geological events, such as crustal exhumation, volcanism, meteorite impact, and ore genesis. Isotopic, crystallographic, and/or mineralogical heterogeneities within analyzed grains can result in dispersed or anomalous (U-Th)/He ages. Understanding the effect of these grain-scale phenomena on the distribution of He in analyzed minerals should lead to improvements in data interpretation. We combine laser ablation microsampling and noble gas and trace element mass spectrometry to provide the first two-dimensional, grain-scale zircon He “maps” and quantify intragrain He distribution. These maps illustrate the complexity of intracrystalline He distribution in natural zircon and, combined with a correlated quantification of parent nuclide (U and Th) distribution, provide an opportunity to assess a number of crystal chemistry processes that can generate anomalous zircon (U-Th)/He ages. The technique provides new insights into fluid inclusions as potential traps of radiogenic He and confirms the effect of heterogeneity in parent-daughter isotope abundances and metamictization on (U-Th)/He systematics. Finally, we present a new inversion method where the He, U, and Th mapping data can be used to constrain the high- and low-temperature history of a single zircon crystal. PMID:28246632
Shamshiri, Sorour; Henriques, Bruno M; Tojeiro, Rita; Lemson, Gerard; Oliver, Seb J; Wilkins, Stephen
2015-01-01
We adapt the L-Galaxies semi-analytic model to follow the star-formation histories (SFH) of galaxies -- by which we mean a record of the formation time and metallicities of the stars that are present in each galaxy at a given time. We use these to construct stellar spectra in post-processing, which offers large efficiency savings and allows user-defined spectral bands and dust models to be applied to data stored in the Millennium data repository. We contrast model SFHs from the Millennium Simulation with observed ones from the VESPA algorithm as applied to the SDSS-7 catalogue. The overall agreement is good, with both simulated and SDSS galaxies showing a steeper SFH with increased stellar mass. The SFHs of blue and red galaxies, however, show poor agreement between data and simulations, which may indicate that the termination of star formation is too abrupt in the models. The mean star-formation rate (SFR) of model galaxies is well-defined and is accurately modelled by a double power law at all redshifts: SF...
Dudová, Lydie; Hájková, Petra; Opravilová, Věra; Hájek, Michal
2014-07-01
We discovered the first peat section covering the entire Holocene in the Hrubý Jeseník Mountains, representing an island of unique alpine vegetation whose history may display transitional features between the Hercynian and Carpathian regions. We analysed pollen, plant macrofossils (more abundant in bottom layers), testate amoebae (more abundant in upper layers), peat stratigraphy and chemistry. We found that the landscape development indeed differed from other Hercynian mountains located westward. This is represented by Pinus cembra and Larix during the Pleistocene/Holocene transition, the early expansion of spruce around 10,450 cal yr BP, and survival of Larix during the climatic optimum. The early Holocene climatic fluctuations are traced in our profile by species compositions of both the mire and surrounding forests. The mire started to develop as a calcium-rich percolation fen with some species recently considered to be postglacial relicts (Meesia triquetra, Betula nana), shifted into ombrotrophy around 7450 cal yr BP by autogenic succession and changed into a pauperised, nutrient-enriched spruce woodland due to modern forestry activities. We therefore concluded that its recent vegetation is not a product of natural processes. From a methodological viewpoint we demonstrated how using multiple biotic proxies and extensive training sets in transfer functions may overcome taphonomic problems.
Marcisz, Katarzyna; Tinner, Willy; Colombaroli, Daniele; Kołaczek, Piotr; Słowiński, Michał; Fiałkiewicz-Kozieł, Barbara; Łokas, Edyta; Lamentowicz, Mariusz
2015-03-01
Sphagnum peatlands in the oceanic-continental transition zone of Poland are currently influenced by climatic and anthropogenic factors that lead to peat desiccation and susceptibility to fire. Little is known about the response of Sphagnum peatland testate amoebae (TA) to the combined effects of drought and fire. To understand the relationships between hydrology and fire dynamics, we used high-resolution multi-proxy palaeoecological data to reconstruct 2000 years of mire history in northern Poland. We employed a new approach for Polish peatlands - joint TA-based water table depth and charcoal-inferred fire activity reconstructions. In addition, the response of the most abundant TA hydrological indicators to charcoal-inferred fire activity was assessed. The results show four hydrological stages of peatland development: moderately wet (from ˜35 BC to 800 AD), wet (from ˜800 to 1390 AD), dry (from ˜1390 to 1700 AD) and with an unstable water table (from ˜1700 to 2012 AD). Fire activity has increased in the last millennium after constant human presence in the mire surroundings. Higher fire activity caused a rise in the water table, but later an abrupt drought appeared at the onset of the Little Ice Age. This dry phase is characterized by high ash contents and high charcoal-inferred fire activity. Fires preceded hydrological change and the response of TA to fire was indirect. Peatland drying and hydrological instability were connected with TA community changes from wet (dominance of Archerella flavum, Hyalosphenia papilio, Amphitrema wrightianum) to dry (dominance of Cryptodifflugia oviformis, Euglypha rotunda); however, no clear fire indicator species was found. Anthropogenic activities can increase peat fires and cause substantial hydrology changes. Our data suggest that increased human fire activity was one of the main factors that influenced peatland hydrology, though the mire response through hydrological changes towards drier conditions was delayed in relation to
Space Shuttle RTOS Bayesian Network
Morris, A. Terry; Beling, Peter A.
2001-01-01
With shrinking budgets and the requirements to increase reliability and operational life of the existing orbiter fleet, NASA has proposed various upgrades for the Space Shuttle that are consistent with national space policy. The cockpit avionics upgrade (CAU), a high priority item, has been selected as the next major upgrade. The primary functions of cockpit avionics include flight control, guidance and navigation, communication, and orbiter landing support. Secondary functions include the provision of operational services for non-avionics systems such as data handling for the payloads and caution and warning alerts to the crew. Recently, a process to select the optimal commercial-off-the-shelf (COTS) real-time operating system (RTOS) for the CAU was conducted by United Space Alliance (USA) Corporation, which is a joint venture between Boeing and Lockheed Martin, the prime contractor for space shuttle operations. In order to independently assess the RTOS selection, NASA has used the Bayesian network-based scoring methodology described in this paper. Our two-stage methodology addresses the issue of RTOS acceptability by incorporating functional, performance and non-functional software measures related to reliability, interoperability, certifiability, efficiency, correctness, business, legal, product history, cost and life cycle. The first stage of the methodology involves obtaining scores for the various measures using a Bayesian network. The Bayesian network incorporates the causal relationships between the various and often competing measures of interest while also assisting the inherently complex decision analysis process with its ability to reason under uncertainty. The structure and selection of prior probabilities for the network is extracted from experts in the field of real-time operating systems. Scores for the various measures are computed using Bayesian probability. In the second stage, multi-criteria trade-off analyses are performed between the scores
Hara, Yuichiro; Imanishi, Tadashi; Satta, Yoko
2012-01-01
The demographic history of humans would provide helpful information for identifying the evolutionary events that shaped humanity, but it remains controversial even in the genomic era. To settle the controversies, we inferred the speciation times (T) and ancestral population sizes (N) in the lineage leading to human and great apes based on whole-genome alignment. A coalescence simulation determined the sizes of alignment blocks and the intervals between them required to obtain recombination-free blocks with high frequency. This simulation revealed that the size of the block strongly affects the parameter inference, indicating that recombination is an important factor for achieving optimum parameter inference. From the whole-genome alignments (1.9 giga-bases) of human (H), chimpanzee (C), gorilla (G), and orangutan, 100-bp alignment blocks separated by ≥5-kb intervals were sampled and used to estimate τ = μT and θ = 4μgN with the Markov chain Monte Carlo method, where μ is the mutation rate and g is the generation time. Although the estimated τ(HC) differed across chromosomes, τ(HC) and τ(HCG) were strongly correlated across chromosomes, indicating that variation in τ reflects variation in μ, rather than in T, and thus that all chromosomes share a single speciation time. Subsequently, we estimated the Ts separating the human lineage from chimpanzee, gorilla, and orangutan to be 6.0-7.6, 7.6-9.7, and 15-19 Ma, respectively, assuming variable μ across lineages and chromosomes. These speciation times were consistent with the fossil record. We conclude that the speciation times from our recombination-free analysis are conclusive and that the speciation between human and chimpanzee was a single event.
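The parameterization in this abstract can be inverted directly: T = τ/μ and N = θ/(4μg). A minimal sketch of that arithmetic follows; the values of μ, g, τ, and θ below are illustrative assumptions chosen to give round numbers, not the paper's estimates.

```python
MU = 1.0e-9  # assumed mutation rate per site per year (illustrative)
G = 20.0     # assumed generation time in years (illustrative)

def speciation_time(tau: float, mu: float = MU) -> float:
    """Invert tau = mu * T to recover the speciation time T in years."""
    return tau / mu

def ancestral_pop_size(theta: float, mu: float = MU, g: float = G) -> float:
    """Invert theta = 4 * mu * g * N to recover the ancestral size N."""
    return theta / (4.0 * mu * g)

# With these assumed rates, tau = 6.5e-3 maps to T = 6.5 Myr, which
# happens to fall in the 6.0-7.6 Ma range reported for the
# human-chimpanzee split; theta = 4.0e-3 maps to N = 50,000.
T_hc = speciation_time(6.5e-3)      # 6.5e6 years
N_anc = ancestral_pop_size(4.0e-3)  # 50,000 individuals
```

Because τ and θ are both scaled by μ, any error in the assumed mutation rate propagates linearly into T and N, which is why the abstract's caveat about variable μ across lineages and chromosomes matters.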
Yuan, Ying; MacKinnon, David P.
2009-01-01
In this article, we propose Bayesian analysis of mediation effects. Compared with conventional frequentist mediation analysis, the Bayesian approach has several advantages. First, it allows researchers to incorporate prior information into the mediation analysis, thus potentially improving the efficiency of estimates. Second, under the Bayesian…
Cotler, Jordan; Wilczek, Frank
2016-12-01
We introduce quantum history states and their mathematical framework, thereby reinterpreting and extending the consistent histories approach to quantum theory. Through thought experiments, we demonstrate that our formalism allows us to analyze a quantum version of history in which we reconstruct the past by observations. In particular, we can pass from measurements to inferences about ‘what happened’ in a way that is sensible and free of paradox. Our framework allows for a richer understanding of the temporal structure of quantum theory, and we construct history states that embody peculiar, non-classical correlations in time.
von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo
2014-06-01
Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramer-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.
Konstruksi Bayesian Network Dengan Algoritma Bayesian Association Rule Mining Network
Octavian
2015-01-01
In recent years, the Bayesian Network has become a popular concept used in many areas of life, such as making decisions and determining the probability that an event will occur. Unfortunately, constructing the structure of a Bayesian Network is itself not a simple matter. This study therefore introduces the Bayesian Association Rule Mining Network algorithm to ease the construction of a Bayesian Network based on data ...
Roy, Martin; Veillette, Jean; Daubois, Virginie
2014-05-01
The reconstruction of the history of former glacial lakes is commonly based on the study of strandlines that generally consist of boulder ridges, sandy beaches and other near-shore deposits. This approach, however, is limited in some regions where the surficial geology consists of thick accumulations of fine-grained glaciolacustrine sediments that mask most deglacial landforms. This situation is particularly relevant to the study of Lake Ojibway, a large proglacial lake that developed in northern Ontario and Quebec following the retreat of the southern Laurentide ice sheet margin during the last deglaciation. The history of Ojibway lake levels remains poorly known, mainly due to the fact that this lake occupied a deep and featureless basin that favored the sedimentation of thick sequences of rhythmites and prevented the formation of well-developed strandlines. Nonetheless, detailed mapping revealed a complex sequence of discontinuous small-scale cliffs that are scattered over the flat-lying Ojibway clay plain. These terrace-like features range from 4 to 7 m in height and can be followed for tens to hundreds of meters. These small-scale geomorphic features are interpreted to represent raised shorelines that were cut into glaciolacustrine sediments by lakeshore erosional processes (i.e., wave action). These so-called wave-cut scarps (WCS) occur at elevations ranging from 3 to 30 m above the present level of Lake Abitibi (267 m), one of the lowest landmarks in the area. Here we evaluate the feasibility of using this type of relict shoreline to constrain the evolution of Ojibway lake levels. For this purpose, a series of WCS were measured along four transects of about 40 km in length in the Lake Abitibi region. The absolute elevation of 154 WCS was determined with a Digital Video Plotter software package using 1:15K air photos, coupled with precise measurements of control points, which were obtained with a high-precision Global Navigation Satellite System tied up to
Model Diagnostics for Bayesian Networks
Sinharay, Sandip
2006-01-01
Bayesian networks are frequently used in educational assessments, primarily for learning about students' knowledge and skills. There has been little work, however, on assessing the fit of Bayesian networks. This article employs the posterior predictive model checking method, a popular Bayesian model checking tool, to assess the fit of simple Bayesian networks. A…
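The posterior predictive logic is easy to sketch on a model far simpler than a Bayesian network. The coin-flip example below (plain Python, standard library only; the data and prior are invented for illustration) compares an observed statistic against replicated data drawn from the posterior, which is the same idea the article applies to network models.

```python
import random

random.seed(1)

# Observed data: 30 successes in 100 Bernoulli trials (illustrative).
observed = [1] * 30 + [0] * 70
obs_stat = sum(observed)

# With a Beta(1, 1) prior, the posterior is Beta(31, 71). For each
# posterior draw of theta, simulate a replicated data set of the same
# size and record its test statistic (the success count).
rep_stats = []
for _ in range(2_000):
    theta = random.betavariate(31, 71)
    rep_stats.append(sum(1 for _ in range(100) if random.random() < theta))

# Posterior predictive p-value: the fraction of replicates at least as
# extreme as the observation. Values near 0 or 1 signal misfit; here
# the model generated the data's own structure, so it lands near 0.5.
ppp = sum(1 for s in rep_stats if s >= obs_stat) / len(rep_stats)
```

For a Bayesian network, the same recipe applies with a discrepancy measure suited to the network (for example, fit statistics over the observed response patterns) in place of the simple success count.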
SOFOMORE: Combined EEG source and forward model reconstruction
DEFF Research Database (Denmark)
Stahlhut, Carsten; Mørup, Morten; Winther, Ole;
2009-01-01
We propose a new EEG source localization method that simultaneously performs source and forward model reconstruction (SOFOMORE) in a hierarchical Bayesian framework. Reconstruction of the forward model is motivated by the many uncertainties involved in the forward model, including the representation of the cortical surface, conductivity distribution, and electrode positions. We demonstrate in both simulated and real EEG data that reconstruction of the forward model improves localization of the underlying sources.
Reconstructing Native American Population History
Reich, David; Patterson, Nick; Campbell, Desmond; Tandon, Arti; Mazieres, Stéphane; Ray, Nicolas; Parra, Maria V.; Rojas, Winston; Duque, Constanza; Mesa, Natalia; García, Luis F.; Triana, Omar; Blair, Silvia; Maestre, Amanda; Dib, Juan C.; Bravi, Claudio M.; Bailliet, Graciela; Corach, Daniel; Hünemeier, Tábita; Bortolini, Maria-Cátira; Salzano, Francisco M.; Petzl-Erler, María Luiza; Acuña-Alonzo, Victor; Aguilar-Salinas, Carlos; Canizales-Quinteros, Samuel; Tusié-Luna, Teresa; Riba, Laura; Rodríguez-Cruz, Maricela; Lopez-Alarcón, Mardia; Coral-Vazquez, Ramón; Canto-Cetina, Thelma; Silva-Zolezzi, Irma; Fernandez-Lopez, Juan Carlos; Contreras, Alejandra V.; Jimenez-Sanchez, Gerardo; Gómez-Vázquez, María José; Molina, Julio; Carracedo, Ángel; Salas, Antonio; Gallo, Carla; Poletti, Giovanni; Witonsky, David B.; Alkorta-Aranburu, Gorka; Sukernik, Rem I.; Osipova, Ludmila; Fedorova, Sardana; Vasquez, René; Villena, Mercedes; Moreau, Claudia; Barrantes, Ramiro; Pauls, David; Excoffier, Laurent; Bedoya, Gabriel; Rothhammer, Francisco; Dugoujon, Jean Michel; Larrouy, Georges; Klitz, William; Labuda, Damian; Kidd, Judith; Kidd, Kenneth; Rienzo, Anna Di; Freimer, Nelson B.; Price, Alkes L.; Ruiz-Linares, Andrés
2013-01-01
The peopling of the Americas has been the subject of extensive genetic, archaeological and linguistic research; however, central questions remain unresolved [1-5]. One contentious issue is whether the settlement occurred via a single [6-8] or multiple [9-15] streams of migration from Siberia. The pattern of dispersals within the Americas is also poorly understood. To address these questions at higher resolution than was previously possible, we assembled data from 52 Native American and 17 Siberian groups genotyped at 364,470 single nucleotide polymorphisms. We show that Native Americans descend from at least three streams of Asian gene flow. Most descend entirely from a single ancestral population that we call “First American”. However, speakers of Eskimo-Aleut languages from the Arctic inherit almost half their ancestry from a second stream of Asian gene flow, and the Na-Dene-speaking Chipewyan from Canada inherit roughly one-tenth of their ancestry from a third stream. We show that the initial peopling followed a southward expansion facilitated by the coast, with sequential population splits and little gene flow after divergence, especially in South America. A major exception is in Chibchan-speakers on both sides of the Panama Isthmus, who have ancestry from both North and South America. PMID:22801491
Reconstructing Native American population history.
Reich, David; Patterson, Nick; Campbell, Desmond; Tandon, Arti; Mazieres, Stéphane; Ray, Nicolas; Parra, Maria V; Rojas, Winston; Duque, Constanza; Mesa, Natalia; García, Luis F; Triana, Omar; Blair, Silvia; Maestre, Amanda; Dib, Juan C; Bravi, Claudio M; Bailliet, Graciela; Corach, Daniel; Hünemeier, Tábita; Bortolini, Maria Cátira; Salzano, Francisco M; Petzl-Erler, María Luiza; Acuña-Alonzo, Victor; Aguilar-Salinas, Carlos; Canizales-Quinteros, Samuel; Tusié-Luna, Teresa; Riba, Laura; Rodríguez-Cruz, Maricela; Lopez-Alarcón, Mardia; Coral-Vazquez, Ramón; Canto-Cetina, Thelma; Silva-Zolezzi, Irma; Fernandez-Lopez, Juan Carlos; Contreras, Alejandra V; Jimenez-Sanchez, Gerardo; Gómez-Vázquez, Maria José; Molina, Julio; Carracedo, Angel; Salas, Antonio; Gallo, Carla; Poletti, Giovanni; Witonsky, David B; Alkorta-Aranburu, Gorka; Sukernik, Rem I; Osipova, Ludmila; Fedorova, Sardana A; Vasquez, René; Villena, Mercedes; Moreau, Claudia; Barrantes, Ramiro; Pauls, David; Excoffier, Laurent; Bedoya, Gabriel; Rothhammer, Francisco; Dugoujon, Jean-Michel; Larrouy, Georges; Klitz, William; Labuda, Damian; Kidd, Judith; Kidd, Kenneth; Di Rienzo, Anna; Freimer, Nelson B; Price, Alkes L; Ruiz-Linares, Andrés
2012-08-16
The peopling of the Americas has been the subject of extensive genetic, archaeological and linguistic research; however, central questions remain unresolved. One contentious issue is whether the settlement occurred by means of a single migration or multiple streams of migration from Siberia. The pattern of dispersals within the Americas is also poorly understood. To address these questions at a higher resolution than was previously possible, we assembled data from 52 Native American and 17 Siberian groups genotyped at 364,470 single nucleotide polymorphisms. Here we show that Native Americans descend from at least three streams of Asian gene flow. Most descend entirely from a single ancestral population that we call 'First American'. However, speakers of Eskimo-Aleut languages from the Arctic inherit almost half their ancestry from a second stream of Asian gene flow, and the Na-Dene-speaking Chipewyan from Canada inherit roughly one-tenth of their ancestry from a third stream. We show that the initial peopling followed a southward expansion facilitated by the coast, with sequential population splits and little gene flow after divergence, especially in South America. A major exception is in Chibchan speakers on both sides of the Panama isthmus, who have ancestry from both North and South America.
Directory of Open Access Journals (Sweden)
Leonid Hanin
2011-09-01
This article brings mathematical modeling to bear on the reconstruction of the natural history of prostate cancer and assessment of the effects of treatment on metastatic progression. We present a comprehensive, entirely mechanistic mathematical model of cancer progression accounting for primary tumor latency, shedding of metastases, their dormancy and growth at secondary sites. Parameters of the model were estimated from the following data collected from 12 prostate cancer patients: (1) age and volume of the primary tumor at presentation; and (2) volumes of detectable bone metastases surveyed at a later time. This allowed us to estimate, for each patient, the age at cancer onset and inception of the first metastasis, the expected metastasis latency time and the rates of growth of the primary tumor and metastases before and after the start of treatment. We found that for all patients: (1) inception of the first metastasis occurred when the primary tumor was undetectable; (2) inception of all or most of the surveyed metastases occurred before the start of treatment; (3) the rate of metastasis shedding is essentially constant in time regardless of the size of the primary tumor and so it is only marginally affected by treatment; and most importantly, (4) surgery, chemotherapy and possibly radiation bring about a dramatic increase (by dozens or hundreds of times for most patients) in the average rate of growth of metastases. Our analysis supports the notion of metastasis dormancy and the existence of prostate cancer stem cells. The model is applicable to all metastatic solid cancers, and our conclusions agree well with the results of a similar analysis based on a simpler model applied to a case of metastatic breast cancer.
Hanin, Leonid; Zaider, Marco
2011-09-20
This article brings mathematical modeling to bear on the reconstruction of the natural history of prostate cancer and assessment of the effects of treatment on metastatic progression. We present a comprehensive, entirely mechanistic mathematical model of cancer progression accounting for primary tumor latency, shedding of metastases, their dormancy and growth at secondary sites. Parameters of the model were estimated from the following data collected from 12 prostate cancer patients: (1) age and volume of the primary tumor at presentation; and (2) volumes of detectable bone metastases surveyed at a later time. This allowed us to estimate, for each patient, the age at cancer onset and inception of the first metastasis, the expected metastasis latency time and the rates of growth of the primary tumor and metastases before and after the start of treatment. We found that for all patients: (1) inception of the first metastasis occurred when the primary tumor was undetectable; (2) inception of all or most of the surveyed metastases occurred before the start of treatment; (3) the rate of metastasis shedding is essentially constant in time regardless of the size of the primary tumor and so it is only marginally affected by treatment; and most importantly, (4) surgery, chemotherapy and possibly radiation bring about a dramatic increase (by dozens or hundreds of times for most patients) in the average rate of growth of metastases. Our analysis supports the notion of metastasis dormancy and the existence of prostate cancer stem cells. The model is applicable to all metastatic solid cancers, and our conclusions agree well with the results of a similar analysis based on a simpler model applied to a case of metastatic breast cancer.
Bayesian Lensing Shear Measurement
Bernstein, Gary M
2013-01-01
We derive an estimator of weak gravitational lensing shear from background galaxy images that avoids noise-induced biases through a rigorous Bayesian treatment of the measurement. The Bayesian formalism requires a prior describing the (noiseless) distribution of the target galaxy population over some parameter space; this prior can be constructed from low-noise images of a subsample of the target population, attainable from long integrations of a fraction of the survey field. We find two ways to combine this exact treatment of noise with rigorous treatment of the effects of the instrumental point-spread function and sampling. The Bayesian model fitting (BMF) method assigns a likelihood of the pixel data to galaxy models (e.g. Sersic ellipses), and requires the unlensed distribution of galaxies over the model parameters as a prior. The Bayesian Fourier domain (BFD) method compresses galaxies to a small set of weighted moments calculated after PSF correction in Fourier space. It requires the unlensed distributi...
Fox, G.J.A.; Berg, van den S.M.; Veldkamp, B.P.; Irwing, P.; Booth, T.; Hughes, D.
2015-01-01
In educational and psychological studies, psychometric methods are involved in the measurement of constructs, and in constructing and validating measurement instruments. Assessment results are typically used to measure student proficiency levels and test characteristics. Recently, Bayesian item resp
Noncausal Bayesian Vector Autoregression
DEFF Research Database (Denmark)
Lanne, Markku; Luoto, Jani
We propose a Bayesian inferential procedure for the noncausal vector autoregressive (VAR) model that is capable of capturing nonlinearities and incorporating effects of missing variables. In particular, we devise a fast and reliable posterior simulator that yields the predictive distribution...
Granade, Christopher; Cory, D G
2015-01-01
In recent years, Bayesian methods have been proposed as a solution to a wide range of issues in quantum state and process tomography. State-of-the-art Bayesian tomography solutions suffer from three problems: numerical intractability, a lack of informative prior distributions, and an inability to track time-dependent processes. Here, we solve all three problems. First, we use modern statistical methods, as pioneered by Huszár and Houlsby and by Ferrie, to make Bayesian tomography numerically tractable. Our approach allows for practical computation of Bayesian point and region estimators for quantum states and channels. Second, we propose the first informative priors on quantum states and channels. Finally, we develop a method that allows online tracking of time-dependent states and estimates the drift and diffusion processes affecting a state. We provide source code and animated visual examples for our methods.
Enhancing debris flow modeling parameters integrating Bayesian networks
Graf, C.; Stoffel, M.; Grêt-Regamey, A.
2009-04-01
Applied debris-flow modeling requires suitably constrained input parameter sets. Depending on the model used, a series of parameters must be defined before running the model. Normally, the database describing the event, the initiation conditions, the flow behavior, the deposition process and, above all, the potential range of possible debris-flow events in a certain torrent is limited. There are only a few places in the world where we can find valuable data sets describing the event history of debris-flow channels, delivering information on the spatial and temporal distribution of former flow paths and deposition zones. Tree-ring records in combination with detailed geomorphic mapping, for instance, provide such data sets over a long time span. Considering the significant loss potential associated with debris-flow disasters, it is crucial that decisions made in regard to hazard mitigation are based on a consistent assessment of the risks. This in turn necessitates a proper assessment of the uncertainties involved in the modeling of debris-flow frequencies and intensities, the possible run-out extent, as well as the estimation of the damage potential. In this study, we link a Bayesian network to a Geographic Information System in order to assess debris-flow risk. We identify the major sources of uncertainty and show the potential of Bayesian inference techniques to improve the debris-flow model. We model the flow paths and deposition zones of a highly active debris-flow channel in the Swiss Alps using the numerical 2-D model RAMMS. Because uncertainties in run-out areas cause large changes in risk estimations, we use flow-path and deposition-zone information from reconstructed debris-flow events derived from dendrogeomorphological analysis covering more than 400 years to update the input parameters of the RAMMS model. The probabilistic model, which consistently incorporates this available information, can serve as a basis for spatial risk
The NIFTY way of Bayesian signal inference
Energy Technology Data Exchange (ETDEWEB)
Selig, Marco, E-mail: mselig@mpa-Garching.mpg.de [Max Planck Institut für Astrophysik, Karl-Schwarzschild-Straße 1, D-85748 Garching, Germany, and Ludwig-Maximilians-Universität München, Geschwister-Scholl-Platz 1, D-80539 München (Germany)
2014-12-05
We introduce NIFTY, 'Numerical Information Field Theory', a software package for the development of Bayesian signal inference algorithms that operate independently from any underlying spatial grid and its resolution. A large number of Bayesian and Maximum Entropy methods for 1D signal reconstruction, 2D imaging, as well as 3D tomography, appear formally similar, but one often finds individualized implementations that are neither flexible nor easily transferable. Signal inference in the framework of NIFTY can be done in an abstract way, such that algorithms prototyped in 1D can be applied to real-world problems in higher-dimensional settings. As a versatile library, NIFTY is applicable to, and has already been applied in, 1D, 2D, 3D and spherical settings. A recent application is the D³PO algorithm targeting the non-trivial task of denoising, deconvolving, and decomposing photon observations in high-energy astronomy.
Bayesian Face Sketch Synthesis.
Wang, Nannan; Gao, Xinbo; Sun, Leiyu; Li, Jie
2017-03-01
Exemplar-based face sketch synthesis has been widely applied to both digital entertainment and law enforcement. In this paper, we propose a Bayesian framework for face sketch synthesis, which provides a systematic interpretation for understanding the common properties and intrinsic difference in different methods from the perspective of probabilistic graphical models. The proposed Bayesian framework consists of two parts: the neighbor selection model and the weight computation model. Within the proposed framework, we further propose a Bayesian face sketch synthesis method. The essential rationale behind the proposed Bayesian method is that we take the spatial neighboring constraint between adjacent image patches into consideration for both aforementioned models, while the state-of-the-art methods neglect the constraint either in the neighbor selection model or in the weight computation model. Extensive experiments on the Chinese University of Hong Kong face sketch database demonstrate that the proposed Bayesian method could achieve superior performance compared with the state-of-the-art methods in terms of both subjective perceptions and objective evaluations.
St John, K. K.; Jones, M. H.; Leckie, R. M.; Pound, K. S.; Krissek, L. A.
2013-12-01
develop detailed instructor guides to accompany each module. After careful consideration of dissemination options, we chose to publish the full suite of exercise modules as a commercially available book, Reconstructing Earth's Climate History, while also providing open online access to a subset of modules. Its current use in undergraduate paleoclimatology courses, and the availability of select modules for use in other courses, demonstrate that creative, hybrid options can be found for lasting dissemination, and thus sustainability. In achieving our goal of making science accessible, we believe we have followed a curriculum development process and sustainability path that can be used by others to meet needs in earth, ocean, and atmospheric science education. Next steps for REaCH include exploration of its use in blended learning classrooms and at minority-serving institutions.
Daubois, V.; Roy, M.; Veillette, J. J.
2012-12-01
different phases in the Lake Abitibi region, at elevations of 290 m, 297 m, and 313 m. For comparison, the near-maximum phase of Lake Ojibway lies at 460 m, about 250 km to the NE of the study area. Overall, the elevation and position of these wave-cut terraces suggest they were formed during episodes of long stands associated with late-stage phases of glacial Lake Ojibway. An additional lake level is indicated by the lowest set of WCBs, which lie about 6 m above modern Lake Abitibi. These terraces are also restricted to the area surrounding this lake, likely reflecting the occurrence of a paleolake Abitibi. These preliminary results thus underline the strong potential of using these lakeshore features to reconstruct former lake-level history. However, the data gathered so far do not allow firm conclusions on the number, exact elevation, and regional extent of the lake-level phases documented. Additional data are required at the scale of the area submerged by Lake Ojibway. The continuation of this work should also provide constraints on the origin of these lake-level phases, thereby potentially reinforcing our understanding of the role of meltwater discharges in the climate fluctuations that marked the early Holocene.
Palacios, Julia A; Minin, Vladimir N
2013-03-01
Changes in population size influence genetic diversity of the population and, as a result, leave a signature of these changes in individual genomes in the population. We are interested in the inverse problem of reconstructing past population dynamics from genomic data. We start with a standard framework based on the coalescent, a stochastic process that generates genealogies connecting randomly sampled individuals from the population of interest. These genealogies serve as a glue between the population demographic history and genomic sequences. It turns out that only the times of genealogical lineage coalescences contain information about population size dynamics. Viewing these coalescent times as a point process, estimating population size trajectories is equivalent to estimating a conditional intensity of this point process. Therefore, our inverse problem is similar to estimating an inhomogeneous Poisson process intensity function. We demonstrate how recent advances in Gaussian process-based nonparametric inference for Poisson processes can be extended to Bayesian nonparametric estimation of population size dynamics under the coalescent. We compare our Gaussian process (GP) approach to one of the state-of-the-art Gaussian Markov random field (GMRF) methods for estimating population trajectories. Using simulated data, we demonstrate that our method has better accuracy and precision. Next, we analyze two genealogies reconstructed from real sequences of hepatitis C and human Influenza A viruses. In both cases, we recover more believed aspects of the viral demographic histories than the GMRF approach. We also find that our GP method produces more reasonable uncertainty estimates than the GMRF method.
Bayesian least squares deconvolution
Asensio Ramos, A.; Petit, P.
2015-11-01
Aims: We develop a fully Bayesian least squares deconvolution (LSD) that can be applied to the reliable detection of magnetic signals in noise-limited stellar spectropolarimetric observations using multiline techniques. Methods: We consider LSD under the Bayesian framework and we introduce a flexible Gaussian process (GP) prior for the LSD profile. This prior allows the result to automatically adapt to the presence of signal. We exploit several linear algebra identities to accelerate the calculations. The final algorithm can deal with thousands of spectral lines in a few seconds. Results: We demonstrate the reliability of the method with synthetic experiments and we apply it to real spectropolarimetric observations of magnetic stars. We are able to recover the magnetic signals using a small number of spectral lines, together with the uncertainty at each velocity bin. This allows the user to consider if the detected signal is reliable. The code to compute the Bayesian LSD profile is freely available.
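The closed-form machinery behind such a GP-prior deconvolution can be illustrated with a small sketch. Everything below is hypothetical (line positions, depths, kernel length scale, noise level) and is not the authors' code; it only shows the linear-algebra core: a common profile repeated across many lines, recovered as a Gaussian-process posterior mean.

```python
import numpy as np

rng = np.random.default_rng(3)
nv, nlam, sigma = 30, 300, 0.02          # velocity bins, spectrum pixels, noise level

# hypothetical common line profile (the LSD profile to be recovered)
true_z = -0.5 * np.exp(-0.5 * ((np.arange(nv) - 15) / 3.0) ** 2)

# S stacks shifted, depth-weighted copies of the profile: 10 hypothetical lines
S = np.zeros((nlam, nv))
for pos, depth in zip(rng.choice(nlam - nv, 10, replace=False),
                      rng.uniform(0.5, 1.0, 10)):
    S[pos:pos + nv, :] += depth * np.eye(nv)
y = S @ true_z + sigma * rng.normal(size=nlam)   # observed multiline spectrum

# squared-exponential GP prior on the profile; posterior mean in closed form:
# E[z | y] = K S^T (S K S^T + sigma^2 I)^{-1} y
t = np.arange(nv)
K = np.exp(-0.5 * ((t[:, None] - t[None, :]) / 3.0) ** 2)
post_mean = K @ S.T @ np.linalg.solve(S @ K @ S.T + sigma**2 * np.eye(nlam), y)
print(int(np.argmin(post_mean)))
```

The GP prior plays the role described in the abstract: where the data carry signal, the posterior mean follows it; where they do not, it relaxes toward zero.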
Hybrid Batch Bayesian Optimization
Azimi, Javad; Fern, Xiaoli
2012-01-01
Bayesian Optimization aims at optimizing an unknown non-convex/concave function that is costly to evaluate. We are interested in application scenarios where concurrent function evaluations are possible. Under such a setting, BO could choose to either sequentially evaluate the function, one input at a time and wait for the output of the function before making the next selection, or evaluate the function at a batch of multiple inputs at once. These two different settings are commonly referred to as the sequential and batch settings of Bayesian Optimization. In general, the sequential setting leads to better optimization performance as each function evaluation is selected with more information, whereas the batch setting has an advantage in terms of the total experimental time (the number of iterations). In this work, our goal is to combine the strength of both settings. Specifically, we systematically analyze Bayesian optimization using Gaussian process as the posterior estimator and provide a hybrid algorithm t...
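The sequential setting described above can be sketched in a few lines; this is a generic illustration with a hypothetical 1-D objective, a squared-exponential GP posterior, and an expected-improvement acquisition, not the authors' hybrid batch algorithm.

```python
import numpy as np
from math import erf, sqrt, pi

rng = np.random.default_rng(2)
f = lambda x: -(x - 0.3) ** 2            # hypothetical costly objective (maximize)
X = np.linspace(0.0, 1.0, 201)           # candidate grid

def gp_posterior(xs, ys, ell=0.1, noise=1e-6):
    """GP posterior mean/sd on grid X given observations (xs, ys)."""
    k = lambda a, b: np.exp(-0.5 * ((a[:, None] - b[None, :]) / ell) ** 2)
    Kxx = k(xs, xs) + noise * np.eye(len(xs))
    Ks = k(X, xs)
    mu = Ks @ np.linalg.solve(Kxx, ys)
    var = 1.0 - np.sum(Ks * np.linalg.solve(Kxx, Ks.T).T, axis=1)
    return mu, np.sqrt(np.maximum(var, 1e-12))

def expected_improvement(mu, sd, best):
    z = (mu - best) / sd
    cdf = 0.5 * (1.0 + np.vectorize(erf)(z / sqrt(2.0)))
    pdf = np.exp(-0.5 * z ** 2) / sqrt(2.0 * pi)
    return (mu - best) * cdf + sd * pdf

xs, ys = np.array([0.0, 1.0]), np.array([f(0.0), f(1.0)])
for _ in range(12):                      # sequential setting: one point per iteration
    mu, sd = gp_posterior(xs, ys)
    x_next = X[np.argmax(expected_improvement(mu, sd, ys.max()))]
    xs, ys = np.append(xs, x_next), np.append(ys, f(x_next))
best_x = float(xs[np.argmax(ys)])
```

A batch variant would select several maximizers of the acquisition per iteration before evaluating; the trade-off the abstract analyzes is exactly how much information each selection loses by not waiting for the previous outputs.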
Bayesian least squares deconvolution
Ramos, A Asensio
2015-01-01
Aims. To develop a fully Bayesian least squares deconvolution (LSD) that can be applied to the reliable detection of magnetic signals in noise-limited stellar spectropolarimetric observations using multiline techniques. Methods. We consider LSD under the Bayesian framework and we introduce a flexible Gaussian Process (GP) prior for the LSD profile. This prior allows the result to automatically adapt to the presence of signal. We exploit several linear algebra identities to accelerate the calculations. The final algorithm can deal with thousands of spectral lines in a few seconds. Results. We demonstrate the reliability of the method with synthetic experiments and we apply it to real spectropolarimetric observations of magnetic stars. We are able to recover the magnetic signals using a small number of spectral lines, together with the uncertainty at each velocity bin. This allows the user to consider if the detected signal is reliable. The code to compute the Bayesian LSD profile is freely available.
Bayesian Exploratory Factor Analysis
DEFF Research Database (Denmark)
Conti, Gabriella; Frühwirth-Schnatter, Sylvia; Heckman, James J.;
2014-01-01
This paper develops and applies a Bayesian approach to Exploratory Factor Analysis that improves on ad hoc classical approaches. Our framework relies on dedicated factor models and simultaneously determines the number of factors, the allocation of each measurement to a unique factor, and the corresponding factor loadings. Classical identification criteria are applied and integrated into our Bayesian procedure to generate models that are stable and clearly interpretable. A Monte Carlo study confirms the validity of the approach. The method is used to produce interpretable low dimensional aggregates...
Center, Julian L.; Knuth, Kevin H.
2011-03-01
Visual odometry refers to tracking the motion of a body using an onboard vision system. Practical visual odometry systems combine the complementary accuracy characteristics of vision and inertial measurement units. The Mars Exploration Rovers, Spirit and Opportunity, used this type of visual odometry. The visual odometry algorithms in Spirit and Opportunity were based on Bayesian methods, but a number of simplifying approximations were needed to deal with onboard computer limitations. Furthermore, the allowable motion of the rover had to be severely limited so that computations could keep up. Recent advances in computer technology make it feasible to implement a fully Bayesian approach to visual odometry. This approach combines dense stereo vision, dense optical flow, and inertial measurements. As with all true Bayesian methods, it also determines error bars for all estimates. This approach also offers the possibility of using Micro-Electro Mechanical Systems (MEMS) inertial components, which are more economical, weigh less, and consume less power than conventional inertial components.
Bayesian approach to inverse statistical mechanics.
Habeck, Michael
2014-05-01
Inverse statistical mechanics aims to determine particle interactions from ensemble properties. This article looks at this inverse problem from a Bayesian perspective and discusses several statistical estimators to solve it. In addition, a sequential Monte Carlo algorithm is proposed that draws the interaction parameters from their posterior probability distribution. The posterior probability involves an intractable partition function that is estimated along with the interactions. The method is illustrated for inverse problems of varying complexity, including the estimation of a temperature, the inverse Ising problem, maximum entropy fitting, and the reconstruction of molecular interaction potentials.
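For a system small enough that the partition function can be enumerated exactly, the temperature-estimation case reduces to a one-dimensional posterior. The toy sketch below (an 8-spin Ising chain with hypothetical values and a flat prior on the inverse temperature) illustrates that idea without the sequential Monte Carlo machinery the article develops for intractable partition functions.

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(0)
N, beta_true, M = 8, 0.7, 2000           # spins, true inverse temperature, samples

# enumerate all 2^8 configurations of a 1-D Ising chain and their energies
states = np.array(list(product([-1, 1], repeat=N)))
energies = -np.sum(states[:, :-1] * states[:, 1:], axis=1)

def boltzmann(beta):
    w = np.exp(-beta * energies)
    return w / w.sum()

# draw M configurations from the true Boltzmann distribution
idx = rng.choice(len(states), size=M, p=boltzmann(beta_true))
obs_E = energies[idx]

# grid posterior over beta with a flat prior:
# log L(beta) = -beta * sum(E_obs) - M * log Z(beta)
betas = np.linspace(0.1, 1.5, 141)
logZ = np.array([np.log(np.exp(-b * energies).sum()) for b in betas])
logpost = -betas * obs_E.sum() - M * logZ
beta_hat = betas[np.argmax(logpost)]
```

Here log Z(beta) is computed by brute force; the point of the article's estimators is precisely to avoid this enumeration when it is impossible.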
Probabilistic Inferences in Bayesian Networks
Ding, Jianguo
2010-01-01
This chapter summarizes the popular inference methods in Bayesian networks. The results demonstrate that evidence can be propagated across a Bayesian network along any link, whether in a forward, backward, or intercausal direction. The belief updating of Bayesian networks can be obtained by various available inference techniques. Theoretically, exact inference in Bayesian networks is feasible and manageable. However, the computation is NP-hard, which means that in applications, in ...
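The backward (diagnostic) direction of such propagation can be shown with exact inference by enumeration on the classic three-node rain, sprinkler, wet-grass network; the CPT values below are the standard textbook illustration, not taken from this chapter.

```python
# CPTs for a toy network: Rain -> Sprinkler, (Sprinkler, Rain) -> WetGrass
P_rain = {True: 0.2, False: 0.8}
P_sprinkler = {True: {True: 0.01, False: 0.99},   # P(S | R=True)
               False: {True: 0.4, False: 0.6}}    # P(S | R=False)
P_wet = {(True, True): 0.99, (True, False): 0.90,
         (False, True): 0.80, (False, False): 0.0}  # P(W=True | S, R)

def joint(r, s):
    """Joint probability of (Rain=r, Sprinkler=s, WetGrass=True)."""
    return P_rain[r] * P_sprinkler[r][s] * P_wet[(s, r)]

# P(R=True | W=True): enumerate the joint and normalize (backward inference)
num = sum(joint(True, s) for s in (True, False))
den = sum(joint(r, s) for r in (True, False) for s in (True, False))
p_rain_given_wet = num / den
print(round(p_rain_given_wet, 4))
```

Observing wet grass raises the belief in rain from the 0.2 prior to about 0.36; enumeration like this is the exact method whose cost the chapter notes grows exponentially, motivating the approximate techniques it surveys.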
Bayesian multiple target tracking
Streit, Roy L
2013-01-01
This second edition has undergone substantial revision from the 1999 first edition, recognizing that a lot has changed in the multiple target tracking field. One of the most dramatic changes is in the widespread use of particle filters to implement nonlinear, non-Gaussian Bayesian trackers. This book views multiple target tracking as a Bayesian inference problem. Within this framework it develops the theory of single target tracking, multiple target tracking, and likelihood ratio detection and tracking. In addition to providing a detailed description of a basic particle filter that implements
Bayesian methods for hackers probabilistic programming and Bayesian inference
Davidson-Pilon, Cameron
2016-01-01
Bayesian methods of inference are deeply natural and extremely powerful. However, most discussions of Bayesian inference rely on intensely complex mathematical analyses and artificial examples, making it inaccessible to anyone without a strong mathematical background. Now, though, Cameron Davidson-Pilon introduces Bayesian inference from a computational perspective, bridging theory to practice–freeing you to get results using computing power. Bayesian Methods for Hackers illuminates Bayesian inference through probabilistic programming with the powerful PyMC language and the closely related Python tools NumPy, SciPy, and Matplotlib. Using this approach, you can reach effective solutions in small increments, without extensive mathematical intervention. Davidson-Pilon begins by introducing the concepts underlying Bayesian inference, comparing it with other techniques and guiding you through building and training your first Bayesian model. Next, he introduces PyMC through a series of detailed examples a...
Institute of Scientific and Technical Information of China (English)
郑润良
2011-01-01
Studies on the history of contemporary literature have, since the 1990s, successively witnessed a distinction between the concepts of enlightenment literary history, Neo-Leftist literary history, and relationalist literary history. Particular emphasis is placed on the significance of the relationalist concept of literary history for reconstructing the contemporary literary and cultural landscape, so as to gain a holistic view and grasp of the "rewriting of literary history" movement since the 1990s.
DEFF Research Database (Denmark)
Jensen, Finn Verner; Nielsen, Thomas Dyhre
2016-01-01
and edges. The nodes represent variables, which may be either discrete or continuous. An edge between two nodes A and B indicates a direct influence between the state of A and the state of B, which in some domains can also be interpreted as a causal relation. The widespread use of Bayesian networks...
DEFF Research Database (Denmark)
Antoniou, Constantinos; Harrison, Glenn W.; Lau, Morten I.;
2015-01-01
A large literature suggests that many individuals do not apply Bayes’ Rule when making decisions that depend on them correctly pooling prior information and sample data. We replicate and extend a classic experimental study of Bayesian updating from psychology, employing the methods of experimental...
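The pooling of prior information and sample data that subjects in such experiments often fail to perform is, for a binomial task, a one-line conjugate update; the prior and data below are hypothetical.

```python
# Beta(a, b) prior on a success probability p; observing k successes in n
# trials gives the posterior Beta(a + k, b + n - k) by Bayes' Rule.
a, b = 2, 2          # mildly informative prior centered on 0.5
k, n = 7, 10         # hypothetical sample data
a_post, b_post = a + k, b + (n - k)
posterior_mean = a_post / (a_post + b_post)   # pooled estimate: 9/14
print(posterior_mean)
```

The posterior mean sits between the prior mean (0.5) and the sample proportion (0.7), which is exactly the pooling that biased updaters over- or under-weight.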
Sparse Bayesian learning in ISAR tomography imaging
Institute of Scientific and Technical Information of China (English)
SU Wu-ge; WANG Hong-qiang; DENG Bin; WANG Rui-jun; QIN Yu-liang
2015-01-01
Inverse synthetic aperture radar (ISAR) imaging can be regarded as a narrow-band version of computer aided tomography (CT). The traditional CT imaging algorithms for ISAR, including the polar format algorithm (PFA) and the convolution back projection algorithm (CBP), usually suffer from high sidelobes and low resolution. ISAR tomography image reconstruction within a sparse Bayesian framework is considered here. Firstly, the sparse ISAR tomography imaging model is established in light of CT imaging theory. Then, by using the compressed sensing (CS) principle, a high resolution ISAR image can be achieved with a limited number of pulses. The performance of existing CS-based ISAR imaging algorithms is sensitive to a user-defined parameter, which makes them inconvenient to use in practice. It is well known that the Bayesian recovery formalism named sparse Bayesian learning (SBL) acts as an effective tool in regression and classification: it uses an efficient expectation maximization procedure to estimate the necessary parameters, and retains a preferable property of the l0-norm diversity measure. Motivated by this, a fully automated ISAR tomography imaging algorithm based on SBL is proposed. Experimental results based on simulated and electromagnetic (EM) data illustrate the effectiveness and the superiority of the proposed algorithm over the existing algorithms.
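The parameter-free character of SBL comes from evidence-based re-estimation of per-coefficient precisions. A minimal relevance-vector-style sketch on a hypothetical linear model (generic, not the authors' ISAR implementation) shows the re-estimation loop:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 100, 20
Phi = rng.normal(size=(n, d))                 # hypothetical measurement matrix
w_true = np.zeros(d)
w_true[[2, 7, 15]] = [5.0, -3.0, 2.0]         # sparse underlying scene
y = Phi @ w_true + 0.05 * rng.normal(size=n)  # noisy observations

alpha = np.ones(d)      # per-weight precisions (the automatically tuned part)
beta = 100.0            # noise precision
for _ in range(50):     # SBL / RVM re-estimation loop
    Sigma = np.linalg.inv(beta * Phi.T @ Phi + np.diag(alpha))
    mu = beta * Sigma @ Phi.T @ y
    gamma = 1.0 - alpha * np.diag(Sigma)      # effective degrees of freedom
    alpha = np.minimum(gamma / (mu ** 2 + 1e-12), 1e12)  # prune small weights
    beta = (n - gamma.sum()) / (np.sum((y - Phi @ mu) ** 2) + 1e-12)
support = set(np.argsort(np.abs(mu))[-3:])
print(sorted(support))
```

Coefficients whose evidence does not support them are driven to huge precisions (effectively zero weight), so no sparsity-controlling user parameter is needed, which is the property the abstract exploits for ISAR.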
A Bayesian method for pulsar template generation
Imgrund, M; Kramer, M; Lesch, H
2015-01-01
Extracting Times of Arrival from pulsar radio signals depends on the knowledge of the pulsar's pulse profile and how this template is generated. We examine pulsar template generation with Bayesian methods. We will contrast the classical generation mechanism of averaging intensity profiles with a new approach based on Bayesian inference. We introduce the Bayesian measurement model imposed and derive the algorithm to reconstruct a "statistical template" out of noisy data. The properties of these "statistical templates" are analysed with simulated and real measurement data from PSR B1133+16. We explain how to put this new form of template to use in analysing secondary parameters of interest and give various examples: We implement a nonlinear filter for determining ToAs of pulsars. Applying this method to data from PSR J1713+0747, we derive ToAs self-consistently, meaning all epochs were timed and we used the same epochs for template generation. While the average template contains fluctuations and noise as unavoida...
Bayesian network modelling of upper gastrointestinal bleeding
Aisha, Nazziwa; Shohaimi, Shamarina; Adam, Mohd Bakri
2013-09-01
Bayesian networks are graphical probabilistic models that represent causal and other relationships between domain variables. In the context of medical decision making, these models have been explored to help in medical diagnosis and prognosis. In this paper, we discuss the Bayesian network formalism in building medical support systems and we learn a tree augmented naive Bayes Network (TAN) from gastrointestinal bleeding data. The accuracy of the TAN in classifying the source of gastrointestinal bleeding into upper or lower source is obtained. The TAN achieves a high classification accuracy of 86% and an area under curve of 92%. A sensitivity analysis of the model shows relatively high levels of entropy reduction for color of the stool, history of gastrointestinal bleeding, consistency and the ratio of blood urea nitrogen to creatinine. The TAN facilitates the identification of the source of GIB and requires further validation.
ACL reconstruction (medlineplus.gov/ency/article/007208.htm) is surgery to reconstruct the ligament in ...
Institute of Scientific and Technical Information of China (English)
路利军; 马建华; 黄静; 毕一鸣; 刘楠; 陈武凡
2011-01-01
The incorporation of a registered anatomical image as a prior to guide PET image reconstruction has been reported in many previous studies. Based on the nonlocal means filter and the regional information of the anatomical image, an anatomically adaptive nonlocal prior (AANLP) is proposed. The information in this prior model comes from weighted differences between pixel intensities within a large nonlocal neighborhood. The weight of each pixel depends on its similarity with respect to the other pixels. The regional information of the anatomical image is used to adaptively estimate a smoothing parameter which controls the decay of the similarity function. The AANLP is applied adaptively to each anatomical region of the PET image at each iteration of the reconstruction process. A two-step reconstruction scheme using the AANLP is proposed to update the image and estimate the parameter. Simulation results show that AANLP reconstruction can dramatically preserve edges and robustly yield the highest lesion-to-background contrast.
Rodriguez, Maria A.; Niaz, Mansoor
2004-01-01
Recent research in science education has recognized the importance of history and philosophy of science. The objective of this study is to evaluate the presentation of the Thomson, Rutherford, and Bohr models of the atom in general physics textbooks based on criteria derived from history and philosophy of science. Forty-one general physics…
Bayesian analysis of cosmic structures
Kitaura, Francisco-Shu
2011-01-01
We revise the Bayesian inference steps required to analyse the cosmological large-scale structure. Here we place special emphasis on the complications which arise due to the non-Gaussian character of the galaxy and matter distribution. In particular, we investigate the advantages and limitations of the Poisson-lognormal model and discuss how to extend this work. With the lognormal prior, using the Hamiltonian sampling technique and on scales of about 4 h^{-1} Mpc, we find that the over-dense regions are reconstructed excellently; however, under-dense regions (void statistics) are quantitatively poorly recovered. Contrary to the maximum a posteriori (MAP) solution, which was shown to over-estimate the density in the under-dense regions, we obtain lower densities than in N-body simulations. This is due to the fact that the MAP solution is conservative whereas the full posterior yields samples which are consistent with the prior statistics. The lognormal prior is not able to capture the full non-linear regime at scales ...
Modern methods of image reconstruction.
Puetter, R. C.
The author reviews the image restoration or reconstruction problem in its general setting. He first discusses linear methods for solving the problem of image deconvolution, i.e. the case in which the data are a convolution of a point-spread function and an underlying unblurred image. Next, non-linear methods are introduced in the context of Bayesian estimation, including maximum likelihood and maximum entropy methods. Then, the author discusses the role of language and information theory concepts for data compression and solving the inverse problem. The concept of algorithmic information content (AIC) is introduced and is shown to be crucial to achieving optimal data compression and optimized Bayesian priors for image reconstruction. The dependence of the AIC on the selection of language then suggests how efficient coordinate systems for the inverse problem may be selected. The author also introduces pixon-based image restoration and reconstruction methods. The relation between image AIC and the Bayesian incarnation of Occam's Razor is discussed, as well as the relation of multiresolution pixon languages and image fractal dimension. Also discussed is the relation of pixons to the role played by the Heisenberg uncertainty principle in statistical physics and how pixon-based image reconstruction provides a natural extension to the Akaike information criterion for maximum likelihood. The author presents practical applications of pixon-based Bayesian estimation to the restoration of astronomical images. He discusses the effects of noise, effects of finite sampling on resolution, and special problems associated with spatially correlated noise introduced by mosaicing. Comparisons to other methods demonstrate the significant improvements afforded by pixon-based methods and illustrate the science that such performance improvements allow.
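The maximum likelihood route mentioned above can be illustrated, for Poisson-noise deconvolution, with the Richardson-Lucy iteration; this is a generic 1-D sketch with hypothetical data, not the pixon method itself.

```python
import numpy as np

rng = np.random.default_rng(1)

# hypothetical "true" scene: two point sources of different brightness
x = np.zeros(64)
x[20], x[40] = 5.0, 3.0

# Gaussian point-spread function, normalized to unit sum
psf = np.exp(-0.5 * (np.arange(-5, 6) / 1.5) ** 2)
psf /= psf.sum()

blur = np.convolve(x, psf, mode="same")
data = rng.poisson(blur * 50) / 50.0          # Poisson-noisy observation

# Richardson-Lucy: multiplicative updates that monotonically increase the
# Poisson likelihood of the estimate
est = np.full_like(x, max(data.mean(), 1e-6))
psf_flip = psf[::-1]                          # correlation = convolution w/ flip
for _ in range(200):
    pred = np.convolve(est, psf, mode="same")
    ratio = data / np.maximum(pred, 1e-12)
    est *= np.convolve(ratio, psf_flip, mode="same")
```

After the iterations, the flux re-concentrates near the two source positions; unregularized maximum likelihood like this eventually amplifies noise, which is one motivation for the Bayesian priors the review surveys.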
Reconstructing ancestral ranges in historical biogeography: properties and prospects
Institute of Scientific and Technical Information of China (English)
Kristin S. LAMM; Benjamin D. REDELINGS
2009-01-01
Recent years have witnessed a proliferation of quantitative methods for biogeographic inference. In particular, novel parametric approaches represent exciting new opportunities for the study of range evolution. Here, we review a selection of current methods for biogeographic analysis and discuss their respective properties. These methods include generalized parsimony approaches, weighted ancestral area analysis, dispersal-vicariance analysis, the dispersal-extinction-cladogenesis model and other maximum likelihood approaches, and Bayesian stochastic mapping of ancestral ranges, including a novel approach to inferring range evolution in the context of island biogeography. Some of these methods were developed specifically for problems of ancestral range reconstruction, whereas others were designed for more general problems of character state reconstruction and subsequently applied to the study of ancestral ranges. Methods for reconstructing ancestral history on a phylogenetic tree differ not only in the types of ancestral range states that are allowed, but also in the various historical events that may change the ancestral ranges. We explore how the form of allowed ancestral ranges and allowed transitions can both affect the outcome of ancestral range estimation. Finally, we mention some promising avenues for future work in the development of model-based approaches to biogeographic analysis.
Probability and Bayesian statistics
1987-01-01
This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics", which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985, the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers were included especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability across psychological aspects of formulating subjective probability statements, abstract measure theoretical considerations, contributions to theoretical statistics an...
DEFF Research Database (Denmark)
Mørup, Morten; Schmidt, Mikkel N
2012-01-01
Many networks of scientific interest naturally decompose into clusters or communities with comparatively fewer external than internal links; however, current Bayesian models of network communities do not exert this intuitive notion of communities. We formulate a nonparametric Bayesian model for community detection consistent with an intuitive definition of communities and present a Markov chain Monte Carlo procedure for inferring the community structure. A Matlab toolbox with the proposed inference procedure is available for download. On synthetic and real networks, our model detects communities consistent with ground truth, and on real networks, it outperforms existing approaches in predicting missing links. This suggests that community structure is an important structural property of networks that should be explicitly modeled.
Bayesian Independent Component Analysis
DEFF Research Database (Denmark)
Winther, Ole; Petersen, Kaare Brandt
2007-01-01
In this paper we present an empirical Bayesian framework for independent component analysis. The framework provides estimates of the sources, the mixing matrix and the noise parameters, and is flexible with respect to choice of source prior and the number of sources and sensors. Inside the engine...... in a Matlab toolbox, is demonstrated for non-negative decompositions and compared with non-negative matrix factorization....
Bayesian theory and applications
Dellaportas, Petros; Polson, Nicholas G; Stephens, David A
2013-01-01
The development of hierarchical models and Markov chain Monte Carlo (MCMC) techniques forms one of the most profound advances in Bayesian analysis since the 1970s and provides the basis for advances in virtually all areas of applied and theoretical Bayesian statistics. This volume guides the reader along a statistical journey that begins with the basic structure of Bayesian theory, and then provides details on most of the past and present advances in this field. The book has a unique format. There is an explanatory chapter devoted to each conceptual advance followed by journal-style chapters that provide applications or further advances on the concept. Thus, the volume is both a textbook and a compendium of papers covering a vast range of topics. It is appropriate for a well-informed novice interested in understanding the basic approach, methods and recent applications. Because of its advanced chapters and recent work, it is also appropriate for a more mature reader interested in recent applications and devel...
Directory of Open Access Journals (Sweden)
Stefano Benazzi
2007-07-01
Full Text Available The work consists of the reconstruction of the face of the great poet Dante Alighieri through a multidisciplinary approach that combines traditional manual techniques, usually used in forensic anthropology, with digital methodologies based on technologies originally developed in industrial and military fields but increasingly applied to the cultural heritage sector. Since the original skull of Dante could not be obtained, the work started from the data and elements collected by Fabio Frassetto and Giuseppe Sergi, two important anthropologists at the Universities of Bologna and Rome respectively, in an investigation carried out in 1921, the sixth centenary of the poet's death, on his remains preserved in Ravenna. Thanks to this, we have a very accurate description of Dante's bones, including 297 metric data inherent to the whole skeleton, scale photographs of the skull in the various norms and of many other bones, as well as a model of the skull subsequently realized by Frassetto. From this information, a geometric reconstruction of Dante Alighieri's skull, including the jaw, was carried out through the employment and integration of the instruments and technologies of virtual reality, and from this the corresponding physical model was realized through rapid prototyping. A particularly important aspect of the work concerns the 3D modelling methodology proposed for the new reconstruction of the jaw (not found in the course of the 1921 recognition), starting from a reference model. The prototyped skull model then serves as the basis for the subsequent stage of facial reconstruction through the traditional techniques of forensic art.
Titus, Benjamin M.; Daly, Marymegan
2017-03-01
Specialist and generalist life histories are expected to result in contrasting levels of genetic diversity at the population level, and symbioses are expected to lead to patterns that reflect a shared biogeographic history and co-diversification. We test these assumptions using mtDNA sequencing and a comparative phylogeographic approach for six co-occurring crustacean species that are symbiotic with sea anemones on western Atlantic coral reefs, yet vary in their host specificities: four are host specialists and two are host generalists. We first conducted species discovery analyses to delimit cryptic lineages, followed by classic population genetic diversity analyses for each delimited taxon, and then reconstructed the demographic history for each taxon using traditional summary statistics, Bayesian skyline plots, and approximate Bayesian computation to test for signatures of recent and concerted population expansion. The genetic diversity values recovered here contravene the expectations of the specialist-generalist variation hypothesis and classic population genetics theory; all specialist lineages had greater genetic diversity than generalists. Demography suggests recent population expansions in all taxa, although Bayesian skyline plots and approximate Bayesian computation suggest the timing and magnitude of these events were idiosyncratic. These results do not meet the a priori expectation of concordance among symbiotic taxa and suggest that intrinsic aspects of species biology may contribute more to phylogeographic history than extrinsic forces that shape whole communities. The recovery of two cryptic specialist lineages adds an additional layer of biodiversity to this symbiosis and contributes to an emerging pattern of cryptic speciation in the specialist taxa. Our results underscore the differences in the evolutionary processes acting on marine systems from the terrestrial processes that often drive theory. Finally, we continue to highlight the Florida Reef
Komorowski, Jean-Christophe; Hincks, Thea; Sparks, Steve; Aspinall, Willy; Legendre, Yoann; Boudon, Georges
2013-04-01
Since 1992, mild but persistent seismic and fumarolic unrest at La Soufrière de Guadeloupe volcano has prompted renewed concern about hazards and risks, crisis response planning, and has rejuvenated interest in geological studies. Scientists monitoring active volcanoes frequently have to provide science-based decision support to civil authorities during such periods of unrest. In these circumstances, the Bayesian Belief Network (BBN) offers a formalized evidence analysis tool for making inferences about the state of the volcano from different strands of data, allowing associated uncertainties to be treated in a rational and auditable manner, to the extent warranted by the strength of the evidence. To illustrate the principles of the BBN approach, a retrospective analysis is undertaken of the 1975-77 crisis, providing an inferential assessment of the evolving state of the magmatic system and the probability of subsequent eruption. Conditional dependencies and parameters in the BBN are characterized quantitatively by structured expert elicitation. Revisiting data available in 1976 suggests the probability of magmatic intrusion would have been evaluated as high at the time, in accordance with subsequent thinking about the volcanological nature of the episode. The corresponding probability of a magmatic eruption therefore would have been elevated in July and August 1976; however, collective uncertainty about the future course of the crisis was great at the time, even if some individual opinions were certain. From this BBN analysis, while the more likely appraised outcome - based on observational trends at 31 August 1976 - might have been 'no eruption' (mean probability 0.5; 5-95 percentile range 0.8), an imminent magmatic eruption (or blast) could have had a probability of about 0.4, almost as substantial. Thus, there was no real scientific basis to assert one scenario was more likely than the other. This retrospective evaluation adds objective probabilistic expression to
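The inferential core of a BBN can be illustrated with a deliberately tiny example. The numbers below are hypothetical placeholders, not the elicited values from the study; the sketch only shows how one observation updates the probability of intrusion and, through a conditional table, the probability of eruption:

```python
def posterior(prior, like_given_h, like_given_not_h):
    """Bayes' rule for a binary hypothesis given a single observation."""
    num = like_given_h * prior
    return num / (num + like_given_not_h * (1.0 - prior))

# Hypothetical numbers for illustration only (not the elicited values):
p_intrusion = 0.3   # prior belief in a magmatic intrusion
# An anomalous seismic signal, assumed 4x more likely under intrusion:
p_intrusion = posterior(p_intrusion, like_given_h=0.8, like_given_not_h=0.2)
# Conditional table: P(eruption | intrusion)=0.5, P(eruption | no intrusion)=0.05
p_eruption = 0.5 * p_intrusion + 0.05 * (1.0 - p_intrusion)
print(round(p_intrusion, 3), round(p_eruption, 3))  # → 0.632 0.334
```

A real BBN chains many such conditional tables, with the probabilities themselves supplied by expert elicitation as described above.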
DEFF Research Database (Denmark)
Valdazo-González, Begoña; Polihronova, Lilyana; Alexandrov, Tsviatko
2012-01-01
the origin and transmission history of the FMD outbreaks which occurred during 2011 in Burgas Province, Bulgaria, a country that had been previously FMD-free-without-vaccination since 1996. Nineteen full genome sequences (FGS) of FMD virus (FMDV) were generated and analysed, including eight representative...
DEFF Research Database (Denmark)
Hartelius, Karsten; Carstensen, Jens Michael
2003-01-01
A method for locating distorted grid structures in images is presented. The method is based on the theories of template matching and Bayesian image restoration. The grid is modeled as a deformable template. Prior knowledge of the grid is described through a Markov random field (MRF) model which...... nodes and the arc prior models variations in row and column spacing across the grid. Grid matching is done by placing an initial rough grid over the image and applying an ensemble annealing scheme to maximize the posterior distribution of the grid. The method can be applied to noisy images with missing...
Congdon, Peter
2014-01-01
This book provides an accessible approach to Bayesian computing and data analysis, with an emphasis on the interpretation of real data sets. Following in the tradition of the successful first edition, this book aims to make a wide range of statistical modeling applications accessible using tested code that can be readily adapted to the reader's own applications. The second edition has been thoroughly reworked and updated to take account of advances in the field. A new set of worked examples is included. The novel aspect of the first edition was the coverage of statistical modeling using WinBU
Bayesian nonparametric data analysis
Müller, Peter; Jara, Alejandro; Hanson, Tim
2015-01-01
This book reviews nonparametric Bayesian methods and models that have proven useful in the context of data analysis. Rather than providing an encyclopedic review of probability models, the book’s structure follows a data analysis perspective. As such, the chapters are organized by traditional data analysis problems. In selecting specific nonparametric models, simpler and more traditional models are favored over specialized ones. The discussed methods are illustrated with a wealth of examples, including applications ranging from stylized examples to case studies from recent literature. The book also includes an extensive discussion of computational methods and details on their implementation. R code for many examples is included in on-line software pages.
Novenko, Elena Yu.; Tsyganov, Andrey N.; Volkova, Elena M.; Babeshko, Kirill V.; Lavrentiev, Nikita V.; Payne, Richard J.; Mazei, Yuri A.
2015-05-01
Holocene climatic variability and human impact on vegetation are reconstructed from a region in central European Russia, which lies at an important ecotone between broadleaf forest and steppe. For the first time in this region we adopt a multi-proxy approach that combines analysis of local mire conditions from plant macrofossil and testate amoeba analyses with pollen-based quantitative climate reconstruction. The proxies indicate a long-term warming trend from 9700 to 7500 cal yr BP, interrupted by a series of short-term cold events. From 7500 to 5000 cal yr BP the results imply a relatively stable climate, warmer and drier than present, spanning the Holocene Thermal Maximum. Since 5000 cal yr BP the data suggest a change to cooler climate, but with centennial-scale variability. This shift at around 5000 cal yr BP is supported by extensive evidence from other sites. In the early Holocene, the region was occupied mainly by pine and birch forests. Broad-leafed forests of oak, lime and elm expanded after 7800 cal yr BP and remained dominant until the last few centuries. During the historical period, vegetation changes have been driven mainly by human activities.
Classification using Bayesian neural nets
J.C. Bioch (Cor); O. van der Meer; R. Potharst (Rob)
1995-01-01
Recently, Bayesian methods have been proposed for neural networks to solve regression and classification problems. These methods claim to overcome some difficulties encountered in the standard approach such as overfitting. However, an implementation of the full Bayesian approach to neura
Bayesian Intersubjectivity and Quantum Theory
Pérez-Suárez, Marcos; Santos, David J.
2005-02-01
Two of the major approaches to probability, namely, frequentism and (subjectivistic) Bayesian theory, are discussed, together with the replacement of frequentist objectivity with Bayesian intersubjectivity. This discussion is then expanded to Quantum Theory, as quantum states and operations can be seen as structural elements of a subjective nature.
Bayesian Approach for Inconsistent Information.
Stein, M; Beer, M; Kreinovich, V
2013-10-01
In engineering situations, we usually have a large amount of prior knowledge that needs to be taken into account when processing data. Traditionally, the Bayesian approach is used to process data in the presence of prior knowledge. Sometimes, when we apply the traditional Bayesian techniques to engineering data, we get inconsistencies between the data and prior knowledge. These inconsistencies are usually caused by the fact that in the traditional approach, we assume that we know the exact sample values, that the prior distribution is exactly known, etc. In reality, the data is imprecise due to measurement errors, the prior knowledge is only approximately known, etc. So, a natural way to deal with the seemingly inconsistent information is to take this imprecision into account in the Bayesian approach - e.g., by using fuzzy techniques. In this paper, we describe several possible scenarios for fuzzifying the Bayesian approach. Particular attention is paid to the interaction between the estimated imprecise parameters. In this paper, to implement the corresponding fuzzy versions of the Bayesian formulas, we use straightforward computations of the related expression - which makes our computations reasonably time-consuming. Computations in the traditional (non-fuzzy) Bayesian approach are much faster - because they use algorithmically efficient reformulations of the Bayesian formulas. We expect that similar reformulations of the fuzzy Bayesian formulas will also drastically decrease the computation time and thus, enhance the practical use of the proposed methods.
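The "straightforward computations" strategy the authors describe can be illustrated with a minimal interval-valued (rather than fully fuzzy) example: when the prior mean is only known to lie in an interval, sweeping that interval and recomputing the posterior bounds the posterior mean. All numbers are illustrative assumptions, not from the paper:

```python
import numpy as np

def posterior_mean(data, mu0, tau2=1.0, sigma2=1.0):
    """Conjugate normal-normal update with known data variance sigma2
    and prior N(mu0, tau2); returns the posterior mean."""
    n = len(data)
    w = (n / sigma2) / (n / sigma2 + 1.0 / tau2)  # weight on the sample mean
    return w * np.mean(data) + (1.0 - w) * mu0

data = [4.8, 5.2, 5.1, 4.9]
# Imprecise prior knowledge: the prior mean only lies somewhere in [3.0, 6.0].
mu0_grid = np.linspace(3.0, 6.0, 61)
post = [posterior_mean(data, m) for m in mu0_grid]
lo, hi = min(post), max(post)
print(f"posterior mean lies in [{lo:.3f}, {hi:.3f}]")  # → [4.600, 5.200]
```

A full fuzzy treatment would attach membership degrees to each candidate prior instead of treating them all as equally possible, but the brute-force sweep is the same in spirit.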
Inference in hybrid Bayesian networks
DEFF Research Database (Denmark)
Lanseth, Helge; Nielsen, Thomas Dyhre; Rumí, Rafael
2009-01-01
Since the 1980s, Bayesian Networks (BNs) have become increasingly popular for building statistical models of complex systems. This is particularly true for boolean systems, where BNs often prove to be a more efficient modelling framework than traditional reliability-techniques (like fault trees...... decade's research on inference in hybrid Bayesian networks. The discussions are linked to an example model for estimating human reliability....
Bayesian Inference on Gravitational Waves
Directory of Open Access Journals (Sweden)
Asad Ali
2015-12-01
Full Text Available The Bayesian approach is increasingly becoming popular among the astrophysics data analysis communities. However, the Pakistan statistics communities are unaware of this fertile interaction between the two disciplines. Bayesian methods have been in use to address astronomical problems since the very birth of the Bayes probability in eighteenth century. Today the Bayesian methods for the detection and parameter estimation of gravitational waves have solid theoretical grounds with a strong promise for the realistic applications. This article aims to introduce the Pakistan statistics communities to the applications of Bayesian Monte Carlo methods in the analysis of gravitational wave data with an overview of the Bayesian signal detection and estimation methods and demonstration by a couple of simplified examples.
Approximate Bayesian computation.
Directory of Open Access Journals (Sweden)
Mikael Sunnåker
Full Text Available Approximate Bayesian computation (ABC constitutes a class of computational methods rooted in Bayesian statistics. In all model-based statistical inference, the likelihood function is of central importance, since it expresses the probability of the observed data under a particular statistical model, and thus quantifies the support data lend to particular values of parameters and to choices among different models. For simple models, an analytical formula for the likelihood function can typically be derived. However, for more complex models, an analytical formula might be elusive or the likelihood function might be computationally very costly to evaluate. ABC methods bypass the evaluation of the likelihood function. In this way, ABC methods widen the realm of models for which statistical inference can be considered. ABC methods are mathematically well-founded, but they inevitably make assumptions and approximations whose impact needs to be carefully assessed. Furthermore, the wider application domain of ABC exacerbates the challenges of parameter estimation and model selection. ABC has rapidly gained popularity over the last years and in particular for the analysis of complex problems arising in biological sciences (e.g., in population genetics, ecology, epidemiology, and systems biology.
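The likelihood-free principle described above can be sketched with the simplest ABC variant, rejection sampling. The toy Gaussian model, tolerance, and summary statistic below are illustrative assumptions:

```python
import numpy as np

def abc_rejection(observed, simulate, prior_sample, n_draws=20000, tol=0.1):
    """Rejection ABC: keep parameter draws whose simulated summary
    statistic lands within `tol` of the observed one; the likelihood
    function itself is never evaluated."""
    accepted = []
    for _ in range(n_draws):
        theta = prior_sample()
        if abs(simulate(theta) - observed) < tol:
            accepted.append(theta)
    return np.array(accepted)

rng = np.random.default_rng(1)
# Toy model: data are N(theta, 1); the summary statistic is the sample mean.
observed_mean = rng.normal(2.0, 1.0, size=100).mean()
draws = abc_rejection(
    observed_mean,
    simulate=lambda t: rng.normal(t, 1.0, size=100).mean(),
    prior_sample=lambda: rng.uniform(-5.0, 5.0),
)
print(draws.size, round(float(draws.mean()), 2))
```

The accepted draws approximate the posterior; the quality of the approximation depends on the tolerance and on how informative the summary statistic is, which is exactly the assessment the abstract warns must be made carefully.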
Nahill, N. D.; Giegengack, R.; Lande, K.; Omar, G.
2008-12-01
We plan to measure the inventory of cosmogenically produced 22Ne atoms preserved in the mineral lattice of halite in deposits of rock salt, and to use that inventory to measure variations in the cosmic-ray flux to enable us to reconstruct the history of supernovae. Bedded rock salt consists almost entirely of the mineral halite (NaCl). Any neon trapped in the halite crystals during precipitation is primarily 20Ne, with a 22Ne concentration of 9% or less. Any neon resulting from cosmic-ray interactions with 23Na is solely 22Ne; therefore, 22Ne atoms in excess of 9% of the total neon are cosmogenic in origin. Measurement of the 22Ne inventory in halite from deposits covering a range of geologic ages may enable us to document the systematic growth of 22Ne through geologic time and, thus, establish the cosmic-ray flux and a chronology of supernovae. The cosmic-ray flux is attenuated in direct proportion to the mass of material overlying a halite deposit. To adjust the 22Ne inventory to account for that attenuation, we must reconstruct the post-depositional history of accumulation and removal of superjacent sediment for each halite deposit we study. As an example of our procedure, we reconstruct here the shielding history of the Permian halite deposit, the Salado Formation, Delaware Basin, New Mexico. The stratigraphy of the Delaware Basin has been well documented via exploration and production wells drilled in search of oil and gas, exploration boreholes associated with potash mining, and comprehensive geologic site assessment of the DOE Waste Isolation Pilot Plant (WIPP). WIPP is a subsurface repository for the permanent disposal of transuranic wastes, located in southeastern New Mexico, 42 km east of Carlsbad and approximately 655 m beneath the surface in the Salado Fm. The Salado Fm is part of the Late Permian Ochoan Series, and consists of 1) a lower member, 2) the McNutt Potash Zone, and 3) an upper member. WIPP lies between marker bed (MB)139 and MB136 in the
Kataoka, K.; Nagahashi, Y.; Yoshikawa, S.
2001-06-01
An extremely large magnitude eruption of the Ebisutoge-Fukuda tephra, close to the Plio-Pleistocene boundary, central Japan, spread volcanic materials widely over more than 290,000 km², reaching more than 300 km from the probable source. Characteristics of the distal air-fall ash (>150 km away from the vent) and proximal pyroclastic deposits are clarified to constrain the eruptive style, history, and magnitude of the Ebisutoge-Fukuda eruption. The eruptive history had five phases. Phase 1 was a phreatoplinian eruption producing >105 km³ of volcanic materials. Phases 2 and 3 were a plinian eruption and the transition to pyroclastic flow. Plinian activity also occurred in phase 4, which ejected conspicuous obsidian fragments to the distal locations. In phase 5, collapse of the eruption column triggered by phase 4 generated a large pyroclastic flow in all directions and resulted in more than 250-350 km³ of deposits. Thus, the total volume of this tephra amounts to over 380-490 km³. This indicates that the Volcanic Explosivity Index (VEI) of the Ebisutoge-Fukuda tephra is greater than 7. The huge thickness of reworked volcaniclastic deposits overlying the fall units also attests to the tremendous volume of eruptive materials of this tephra. Numerous ancient tephra layers with large volume have been reported worldwide, but their sources and eruptive histories are often unknown and difficult to determine. Comparison of distal air-fall ashes with proximal pyroclastic deposits revealed the eruption style, history and magnitude of the Ebisutoge-Fukuda tephra. Hence, recognition of the Ebisutoge-Fukuda tephra is useful for understanding volcanic activity during the Pliocene to Pleistocene, is important as a boundary marker bed, and can be used to interpret the global environmental and climatic impact of large magnitude eruptions in the past.
DEFF Research Database (Denmark)
Poulsen, Bo
2012-01-01
This essay provides an overview of recent trends in the historiography of marine environmental history, a sub-field of environmental history which has grown tremendously in scope and size over the last c. 15 years. The object of marine environmental history is the changing relationship between human society and natural marine resources. Within this broad topic, several trends and objectives are discernable. The essay argues that the so-called material marine environmental history has its main focus on trying to reconstruct the presence, development and environmental impact of past fisheries and whaling operations. This ambition often entails a reconstruction also of how marine life has changed over time. The time frame ranges from the Paleolithic to the present era. The field of marine environmental history also includes a more culturally oriented environmental history, which mainly has come...
Borsboom, D.; Haig, B.D.
2013-01-01
Unlike most other statistical frameworks, Bayesian statistical inference is wedded to a particular approach in the philosophy of science (see Howson & Urbach, 2006); this approach is called Bayesianism. Rather than being concerned with model fitting, this position in the philosophy of science primar
Implementing Bayesian Vector Autoregressions
Directory of Open Access Journals (Sweden)
Richard M. Todd
1988-03-01
Full Text Available This paper discusses how the Bayesian approach can be used to construct a type of multivariate forecasting model known as a Bayesian vector autoregression (BVAR). In doing so, we mainly explain the Doan, Litterman, and Sims (1984) propositions on how to estimate a BVAR based on a certain family of prior probability distributions indexed by a fairly small set of hyperparameters. There is also a discussion on how to specify a BVAR and set up a BVAR database. A 4-variable model is used to illustrate the BVAR approach.
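A heavily simplified sketch of the idea, assuming a Minnesota-type prior reduced to a single tightness hyperparameter and a conjugate ridge-style update; this is an illustration, not the Doan, Litterman, and Sims implementation:

```python
import numpy as np

def bvar_posterior_mean(Y, lam=0.5):
    """Posterior mean of VAR(1) coefficients under a simplified
    Minnesota-type prior shrinking each equation toward a random walk
    (own-lag coefficient 1, others 0); `lam` is the single tightness
    hyperparameter."""
    X, y = Y[:-1], Y[1:]                  # lagged regressors and targets
    k = X.shape[1]
    prior_mean = np.eye(k)                # random-walk prior center
    prec = 1.0 / lam**2                   # prior precision
    A = X.T @ X + prec * np.eye(k)
    B = X.T @ y + prec * prior_mean
    return np.linalg.solve(A, B)          # ridge-style conjugate update

# Toy 2-variable system with persistent dynamics.
rng = np.random.default_rng(2)
true_coef = np.array([[0.9, 0.1], [0.0, 0.8]])
Y = np.zeros((400, 2))
for t in range(1, 400):
    Y[t] = Y[t - 1] @ true_coef + rng.normal(0.0, 0.1, size=2)
est = bvar_posterior_mean(Y)
print(np.abs(est - true_coef).max() < 0.2)
```

The full Minnesota prior additionally tightens the prior with lag length and distinguishes own lags from cross-variable lags, but the shrinkage mechanic is the same.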
Evolutionary history of vegetative reproduction in Porpidia s.L. (Lichen-forming ascomycota).
Buschbom, Jutta; Barker, Daniel
2006-06-01
The evolutionary history of gains and losses of vegetative reproductive propagules (soredia) in Porpidia s.l., a group of lichen-forming ascomycetes, was clarified using Bayesian Markov chain Monte Carlo (MCMC) approaches to monophyly tests and a combined MCMC and maximum likelihood approach to ancestral character state reconstructions. The MCMC framework provided confidence estimates for the reconstructions of relationships and ancestral character states, which formed the basis for tests of evolutionary hypotheses. Monophyly tests rejected all hypotheses that predicted any clustering of reproductive modes in extant taxa. In addition, a nearest-neighbor statistic could not reject the hypothesis that the vegetative reproductive mode is randomly distributed throughout the group. These results show that transitions between presence and absence of the vegetative reproductive mode within Porpidia s.l. occurred several times and independently of each other. Likelihood reconstructions of ancestral character states at selected nodes suggest that--contrary to previous thought--the ancestor to Porpidia s.l. already possessed the vegetative reproductive mode. Furthermore, transition rates are reconstructed asymmetrically with the vegetative reproductive mode being gained at a much lower rate than it is lost. A cautious note has to be added, because a simulation study showed that the ancestral character state reconstructions were highly dependent on taxon sampling. However, our central conclusions, particularly the higher rate of change from vegetative reproductive mode present to absent than vice versa within Porpidia s.l., were found to be broadly independent of taxon sampling.
Insights on the Bayesian spectral density method for operational modal analysis
Au, Siu-Kui
2016-01-01
This paper presents a study on the Bayesian spectral density method for operational modal analysis. The method makes Bayesian inference of the modal properties by using the sample power spectral density (PSD) matrix averaged over independent sets of ambient data. In the typical case with a single set of data, it is divided into non-overlapping segments and they are assumed to be independent. This study is motivated by a recent paper that reveals a mathematical equivalence of the method with the Bayesian FFT method. The latter does not require averaging concepts or the independent segment assumption. This study shows that the equivalence does not hold in reality because the theoretical long data asymptotic distribution of the PSD matrix may not be valid. A single time history can be considered long for the Bayesian FFT method but not necessarily for the Bayesian PSD method, depending on the number of segments.
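The segment-averaging step at issue can be sketched as follows. This illustrates only the averaged-PSD construction for a single time history, not the Bayesian inference built on top of it; the signal and segment counts are illustrative:

```python
import numpy as np

def averaged_psd(x, n_segments):
    """Average the periodograms of non-overlapping segments of one
    time history (the segments are treated as independent)."""
    seg_len = len(x) // n_segments
    segs = x[: seg_len * n_segments].reshape(n_segments, seg_len)
    periodograms = np.abs(np.fft.rfft(segs, axis=1)) ** 2 / seg_len
    return periodograms.mean(axis=0)

# White noise: the true PSD is flat, and the scatter of the averaged
# estimate shrinks as the number of (assumed independent) segments grows.
rng = np.random.default_rng(3)
x = rng.normal(0.0, 1.0, 2**14)
psd_few = averaged_psd(x, 2)
psd_many = averaged_psd(x, 64)
print(psd_many[1:-1].std() < psd_few[1:-1].std())
```

The trade-off the paper examines lives in this construction: more segments mean more averaging but shorter segments, so the asymptotic distribution assumed for the averaged PSD matrix may not hold.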
Book review: Bayesian analysis for population ecology
Link, William A.
2011-01-01
Brian Dennis described the field of ecology as “fertile, uncolonized ground for Bayesian ideas.” He continued: “The Bayesian propagule has arrived at the shore. Ecologists need to think long and hard about the consequences of a Bayesian ecology. The Bayesian outlook is a successful competitor, but is it a weed? I think so.” (Dennis 2004)
Ortega, Pedro A
2011-01-01
Discovering causal relationships is a hard task, often hindered by the need for intervention, and often requiring large amounts of data to resolve statistical uncertainty. However, humans quickly arrive at useful causal relationships. One possible reason is that humans use strong prior knowledge; and rather than encoding hard causal relationships, they encode beliefs over causal structures, allowing for sound generalization from the observations they obtain from directly acting in the world. In this work we propose a Bayesian approach to causal induction which allows modeling beliefs over multiple causal hypotheses and predicting the behavior of the world under causal interventions. We then illustrate how this method extracts causal information from data containing interventions and observations.
Blundell, Charles; Heller, Katherine A
2012-01-01
Hierarchical structure is ubiquitous in data across many domains. There are many hierarchical clustering methods, frequently used by domain experts, which strive to discover this structure. However, most of these methods limit discoverable hierarchies to those with binary branching structure. This limitation, while computationally convenient, is often undesirable. In this paper we explore a Bayesian hierarchical clustering algorithm that can produce trees with arbitrary branching structure at each node, known as rose trees. We interpret these trees as mixtures over partitions of a data set, and use a computationally efficient, greedy agglomerative algorithm to find the rose trees which have high marginal likelihood given the data. Lastly, we perform experiments which demonstrate that rose trees are better models of data than the typical binary trees returned by other hierarchical clustering algorithms.
Bayesian inference in geomagnetism
Backus, George E.
1988-01-01
The inverse problem in empirical geomagnetic modeling is investigated, with critical examination of recently published studies. Particular attention is given to the use of Bayesian inference (BI) to select the damping parameter lambda in the uniqueness portion of the inverse problem. The mathematical bases of BI and stochastic inversion are explored, with consideration of bound-softening problems and resolution in linear Gaussian BI. The problem of estimating the radial magnetic field B(r) at the earth core-mantle boundary from surface and satellite measurements is then analyzed in detail, with specific attention to the selection of lambda in the studies of Gubbins (1983) and Gubbins and Bloxham (1985). It is argued that the selection method is inappropriate and leads to lambda values much larger than those that would result if a reasonable bound on the heat flow at the CMB were assumed.
Energy Technology Data Exchange (ETDEWEB)
Arakawa, Jumpei; Sakamoto, Wataru [Division of Applied Biosciences, Graduate School of Agriculture, Kyoto Univ., Kyoto (Japan)
1998-07-01
Strontium (Sr) concentration in the shells of short-necked clams collected at different locations (Shirahama, a warm area, and Maizuru, a cold area, Japan) was analyzed by two methods, PIXE and EPMA. The Sr concentration of the external surface of the shell umbo, which is formed over a short period at the early benthic phase, was analyzed by PIXE and ranged from 1000 to 3500 ppm among individuals. The Sr concentration of clams collected at Shirahama showed a positive correlation with shell length (SL) in individuals with SL < 31 mm, whereas clams collected at Maizuru did not show a significant correlation. This result may be caused by the difference in spawning seasons between the two areas. The Sr concentration of the cross section of the shell umbo, which thickens continuously during the clam's life to form a faint stratum structure, was analyzed by EPMA along a line across the stratum structure. Some surges and long-term waving patterns of the Sr concentration were observed. These results suggest that the life histories of individual clams could be recorded in the shell umbo cross sections as variations in trace elements, and that analyses of trace elements could clarify the histories of individual clams. (author)
Bayesian Calibration of Microsimulation Models.
Rutter, Carolyn M; Miglioretti, Diana L; Savarino, James E
2009-12-01
Microsimulation models that describe disease processes synthesize information from multiple sources and can be used to estimate the effects of screening and treatment on cancer incidence and mortality at a population level. These models are characterized by simulation of individual event histories for an idealized population of interest. Microsimulation models are complex and invariably include parameters that are not well informed by existing data. Therefore, a key component of model development is the choice of parameter values. Microsimulation model parameter values are selected to reproduce expected or known results through the process of model calibration. Calibration may be done by perturbing model parameters one at a time or by using a search algorithm. As an alternative, we propose a Bayesian method to calibrate microsimulation models that uses Markov chain Monte Carlo. We show that this approach converges to the target distribution and use a simulation study to demonstrate its finite-sample performance. Although computationally intensive, this approach has several advantages over previously proposed methods, including the use of statistical criteria to select parameter values, simultaneous calibration of multiple parameters to multiple data sources, incorporation of information via prior distributions, description of parameter identifiability, and the ability to obtain interval estimates of model parameters. We develop a microsimulation model for colorectal cancer and use our proposed method to calibrate model parameters. The microsimulation model provides a good fit to the calibration data. We find evidence that some parameters are identified primarily through prior distributions. Our results underscore the need to incorporate multiple sources of variability (i.e., due to calibration data, unknown parameters, and estimated parameters and predicted values) when calibrating and applying microsimulation models.
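The core idea of MCMC calibration can be illustrated with a minimal sketch: a random-walk Metropolis sampler that calibrates a single hypothetical model parameter (a per-person probability of disease) against an assumed calibration target. The target counts, the uniform prior, and the step size are all illustrative assumptions, not values from the paper.

```python
import math
import random

random.seed(42)

# Hypothetical calibration target: 300 of 1000 screened individuals
# show the condition; the unknown model parameter p is the per-person
# probability of developing it.
OBSERVED_CASES, N_SCREENED = 300, 1000

def log_likelihood(p):
    """Binomial log-likelihood of the calibration data given p."""
    if not 0.0 < p < 1.0:
        return float("-inf")
    return (OBSERVED_CASES * math.log(p)
            + (N_SCREENED - OBSERVED_CASES) * math.log(1.0 - p))

def log_prior(p):
    """Uniform(0, 1) prior: flat log-density inside the support."""
    return 0.0 if 0.0 < p < 1.0 else float("-inf")

def metropolis(n_iter=20000, step=0.05):
    """Random-walk Metropolis: propose, then accept or reject."""
    p = 0.5
    lp = log_likelihood(p) + log_prior(p)
    samples = []
    for _ in range(n_iter):
        prop = p + random.gauss(0.0, step)            # random-walk proposal
        lp_prop = log_likelihood(prop) + log_prior(prop)
        if math.log(random.random()) < lp_prop - lp:  # Metropolis accept rule
            p, lp = prop, lp_prop
        samples.append(p)
    return samples[n_iter // 2:]                      # discard burn-in

draws = metropolis()
post_mean = sum(draws) / len(draws)   # interval estimates come for free
```

With a flat prior the posterior concentrates near the observed proportion (0.3), and the retained draws directly provide the interval estimates the abstract highlights as an advantage over one-at-a-time search.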
Inherently irrational? A computational model of escalation of commitment as Bayesian Updating.
Gilroy, Shawn P; Hantula, Donald A
2016-06-01
Monte Carlo simulations were performed to analyze the degree to which two-, three- and four-step learning histories of losses and gains correlated with escalation and persistence in extended extinction (continuous loss) conditions. Simulated learning histories were randomly generated at varying lengths and compositions, and warranted probabilities were determined using Bayesian Updating methods. Bayesian Updating predicted instances where particular learning sequences were more likely to engender escalation and persistence under extinction conditions. All simulations revealed greater rates of escalation and persistence in the presence of heterogeneous (e.g., both Wins and Losses) lag sequences, with substantially increased rates of escalation when lags composed predominantly of losses were followed by wins. These methods were then applied to human investment choices in earlier experiments. The Bayesian Updating models corresponded with data obtained from these experiments. These findings suggest that Bayesian Updating can be utilized as a model for understanding how and when individual commitment may escalate and persist despite continued failures.
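The kind of updating described above can be sketched with conjugate Beta-Bernoulli updating: each win or loss in a lag sequence updates a Beta prior on the win probability, and the posterior predictive probability of a win is what a Bayesian decision-maker would "warrant" for the next choice. The uniform Beta(1, 1) prior and the W/L encoding are assumptions for illustration, not the paper's exact parameterization.

```python
from fractions import Fraction

def updated_win_probability(history, alpha=1, beta=1):
    """Sequentially update a Beta(alpha, beta) prior on the win
    probability after each outcome in a lag sequence, and return the
    posterior predictive probability that the next trial is a win."""
    for outcome in history:        # 'W' = win, 'L' = loss
        if outcome == 'W':
            alpha += 1
        else:
            beta += 1
    return Fraction(alpha, alpha + beta)

# A loss-dominated lag ending in a win: the warranted win probability
# recovers, the kind of sequence the abstract associates with
# escalation despite prior losses.
p_after_LLW = updated_win_probability("LLW")   # Beta(2, 3) -> 2/5
p_after_WWW = updated_win_probability("WWW")   # Beta(4, 1) -> 4/5
```

Because the update is conjugate, the warranted probability after any sequence depends only on the win and loss counts, which makes the Monte Carlo exploration of many randomly generated histories cheap.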
Current trends in Bayesian methodology with applications
Upadhyay, Satyanshu K; Dey, Dipak K; Loganathan, Appaia
2015-01-01
Collecting Bayesian material scattered throughout the literature, Current Trends in Bayesian Methodology with Applications examines the latest methodological and applied aspects of Bayesian statistics. The book covers biostatistics, econometrics, reliability and risk analysis, spatial statistics, image analysis, shape analysis, Bayesian computation, clustering, uncertainty assessment, high-energy astrophysics, neural networking, fuzzy information, objective Bayesian methodologies, empirical Bayes methods, small area estimation, and many more topics. Each chapter is self-contained and focuses on
Cosmic expansion history from SNe Ia data via information field theory: the charm code
Porqueres, Natàlia; Enßlin, Torsten A.; Greiner, Maksim; Böhm, Vanessa; Dorn, Sebastian; Ruiz-Lapuente, Pilar; Manrique, Alberto
2017-03-01
We present charm (cosmic history agnostic reconstruction method), a novel inference algorithm that reconstructs the cosmic expansion history as encoded in the Hubble parameter H(z) from SNe Ia data. The novelty of the approach lies in the usage of information field theory, a statistical field theory that is very well suited for the construction of optimal signal recovery algorithms. The charm algorithm infers non-parametrically s(a) = ln(ρ(a)/ρ_crit0), the density evolution which determines H(z), without assuming an analytical form of ρ(a) but only its smoothness with the scale factor a = (1 + z)-1. The inference problem of recovering the signal s(a) from the data is formulated in a fully Bayesian way. In detail, we have rewritten the signal as the sum of a background cosmology and a perturbation. This allows us to determine the maximum a posteriori estimate of the signal by an iterative Wiener filter method. Applying charm to the Union2.1 supernova compilation, we have recovered a cosmic expansion history that is fully compatible with the standard ΛCDM cosmological expansion history with parameter values consistent with the results of the Planck mission.
Irregular-Time Bayesian Networks
Ramati, Michael
2012-01-01
In many fields observations are performed irregularly along time, due to either measurement limitations or lack of a constant immanent rate. While discrete-time Markov models (as Dynamic Bayesian Networks) introduce either inefficient computation or an information loss to reasoning about such processes, continuous-time Markov models assume either a discrete state space (as Continuous-Time Bayesian Networks), or a flat continuous state space (as stochastic differential equations). To address these problems, we present a new modeling class called Irregular-Time Bayesian Networks (ITBNs), generalizing Dynamic Bayesian Networks, allowing substantially more compact representations, and increasing the expressivity of the temporal dynamics. In addition, a globally optimal solution is guaranteed when learning temporal systems, provided that they are fully observed at the same irregularly spaced time-points, and a semiparametric subclass of ITBNs is introduced to allow further adaptation to the irregular nature of t...
Słowiński, Michał; Tyszkowski, Sebastian; Ott, Florian; Obremska, Milena; Kaczmarek, Halina; Theuerkauf, Martin; Wulf, Sabine; Brauer, Achim
2016-04-01
The aim of the study was to reconstruct human and landscape development in the Tuchola Pinewoods (Northern Poland) during the last 800 years. We apply an approach that combines historic maps and documents with pollen data. Pollen data were obtained from varved lake sediments at a resolution of 5 years. The chronology of the sediment record is based on varve counting, AMS 14C dating, 137Cs activity concentration measurements and tephrochronology (Askja AD 1875). We applied the REVEALS model to translate pollen percentage data into regional plant abundances. The interpretation of the pollen record is furthermore based on pollen accumulation rate data. The pollen record and historic documents show similar trends in vegetation development. During the first phase (AD 1200-1412), the Lake Czechowskie area was still largely forested with Quercus, Carpinus and Pinus forests. Vegetation was more open during the second phase (AD 1412-1776), and reached maximum openness during the third phase (AD 1776-1905). Furthermore, intensified forest management led to a transformation from mixed to pine-dominated forests during this period. Since the early 20th century, the forest cover increased again with dominance of the Scots pine in the stand. While pollen and historic data show similar trends, they differ substantially in the degree of openness during the four phases, with pollen data commonly suggesting more open conditions. We discuss potential causes for this discrepancy, which include unsuitable parameter settings in REVEALS and unknown changes in forest structure. Using pollen accumulation data as a third proxy record, we aim to identify the most probable causes. Finally, we discuss the observed vegetation change in relation to the socio-economic development of the area. This study is a contribution to the Virtual Institute of Integrated Climate and Landscape Evolution Analysis - ICLEA - of the Helmholtz Association and National Science Centre, Poland (grant No. 2011/01/B/ST10
Bayesian Inference: with ecological applications
Link, William A.; Barker, Richard J.
2010-01-01
This text provides a mathematically rigorous yet accessible and engaging introduction to Bayesian inference with relevant examples that will be of interest to biologists working in the fields of ecology, wildlife management and environmental studies as well as students in advanced undergraduate statistics. This text opens the door to Bayesian inference, taking advantage of modern computational efficiencies and easily accessible software to evaluate complex hierarchical models.
Bayesian Methods for Statistical Analysis
Puza, Borek
2015-01-01
Bayesian methods for statistical analysis is a book on statistical methods for analysing a wide variety of data. The book consists of 12 chapters, starting with basic concepts and covering numerous topics, including Bayesian estimation, decision theory, prediction, hypothesis testing, hierarchical models, Markov chain Monte Carlo methods, finite population inference, biased sampling and nonignorable nonresponse. The book contains many exercises, all with worked solutions, including complete c...
Bayesian Networks and Influence Diagrams
DEFF Research Database (Denmark)
Kjærulff, Uffe Bro; Madsen, Anders Læsø
Probabilistic networks, also known as Bayesian networks and influence diagrams, have become one of the most promising technologies in the area of applied artificial intelligence, offering intuitive, efficient, and reliable methods for diagnosis, prediction, decision making, classification, troubleshooting, and data mining under uncertainty. Bayesian Networks and Influence Diagrams: A Guide to Construction and Analysis provides a comprehensive guide for practitioners who wish to understand, construct, and analyze intelligent systems for decision support based on probabilistic networks. Intended...
Huang, Chao; Oraevsky, Alexander A.; Anastasio, Mark A.
2010-08-01
Optoacoustic tomography (OAT) is an emerging ultrasound-mediated biophotonic imaging modality that has exciting potential for many biomedical imaging applications. There is great interest in conducting B-mode ultrasound and OAT imaging studies for breast cancer detection using a common transducer. In this situation, the range of tomographic view angles is limited, which can result in distortions in the reconstructed OAT image if conventional reconstruction algorithms are applied to limited-view measurement data. In this work, we investigate an image reconstruction method that utilizes information regarding target boundaries to improve the quality of the reconstructed OAT images. This is accomplished by developing a boundary-constrained image reconstruction algorithm for OAT based on Bayesian image reconstruction theory. The computer-simulation studies demonstrate that the Bayesian approach can effectively reduce artifact and noise levels and preserve edges in reconstructed limited-view OAT images compared to those produced by a conventional OAT reconstruction algorithm.
Tian, Hanqin; Banger, Kamaljit; Bo, Tao; Dadhwal, Vinay K.
2014-10-01
In India, the human population has increased six-fold, from 200 million to 1200 million, which, coupled with economic growth, has resulted in significant land use and land cover (LULC) changes during 1880-2010. However, large discrepancies in the existing LULC datasets have hindered our efforts to better understand interactions among human activities, climate systems, and ecosystems in India. In this study, we incorporated high-resolution remote sensing datasets from Resourcesat-1 and historical archives at district (N = 590) and state (N = 30) levels to generate LULC datasets at 5 arc minute resolution during 1880-2010 in India. Results have shown that a significant loss of forests (from 89 million ha to 63 million ha) occurred during the study period. Interestingly, the deforestation rate was relatively greater under British rule (1880-1950s) and in the early decades after independence, and then decreased after the 1980s due to government policies to protect the forests. In contrast to forests, cropland area increased from 92 million ha to 140.1 million ha during 1880-2010. Greater cropland expansion occurred during the 1950-1980s, which coincided with the period of farm mechanization, electrification, and introduction of high-yielding crop varieties as a result of government policies to achieve self-sufficiency in food production. The rate of urbanization was slower during 1880-1940 but increased significantly after the 1950s, probably due to the rapid increase in population and economic growth in India. Our study provides the most reliable estimates of historical LULC at regional scale in India. This is the first attempt to incorporate newly developed high-resolution remote sensing datasets and inventory archives to reconstruct the time series of LULC records for such a long period in India. The spatial and temporal information on LULC derived from this study could be used by ecosystem, hydrological, and climate modeling as well as by policy makers for assessing the
Rodríguez-Martínez, Marta; Reitner, Joachim
2015-12-01
In the Lutetian intraslope Ainsa sub-basin, small, sub-spherical, carbonate mud mounds occur associated with hemipelagic marls and mixed gravity flow deposits. The studied mud mounds consist of a mixture of allochthonous, parautochthonous and autochthonous components that show evidence of reworking, bioerosion, and accretion by different fossil assemblages at different growth stages. The crusts of microbial-lithistid sponges played an important role in stabilizing the rubble of coralgal-coralline sponges and formed low-relief small benthic patches in a dominantly marly soft slope environment. These accidental hard substrates turned into suitable initiation/nucleation sites for automicrite production (dense and peloidal automicrites) on which the small mud mounds, dominated by opportunistic epi- and infaunal heterozoan assemblages, grew. Detailed microfacies mapping and paleoenvironmental analysis reveals a multi-episodic downslope accretion history led by demosponges (coralline and lithistid sponges), agariciid corals, calcareous red algae, putative microbial benthic communities and diverse sclerobionts from the upper slope to the middle slope. The analyzed mud mound microfacies are compared with similar fossil assemblages and growth fabrics described in many fossil mud mounds, and with recent deep-water fore reefs and cave environments.
Reconstruction of the Category System of the History of Chinese Psychological Thought
Institute of Scientific and Technical Information of China (English)
彭彦琴
2001-01-01
The further theoretical elevation of the history of Chinese psychological thought calls for the construction of a new category system. This paper analyzes the shortcomings of existing accounts of the categories, proposes several principles for category construction, and finally explains in detail a category system for the history of Chinese psychological thought built on human nature as its meta-category, set against the background of the oneness of heaven and man; this system displays the field's internal logical thread, profound implications, and distinct character.
Research of Gene Regulatory Network with Multi-Time Delay Based on Bayesian Network
Institute of Scientific and Technical Information of China (English)
LIU Bei; MENG Fanjiang; LI Yong; LIU Liyan
2008-01-01
The gene regulatory network was reconstructed from time-series microarray data, obtained by hybridization of gene chips at different time points, to analyze coordination and restriction between genes. An algorithm for controlling the gene expression regulatory network of the whole cell was designed using a Bayesian network, which provides an effective aid for the analysis of gene regulatory networks.
Reconstructing the Tengger calendar
Directory of Open Access Journals (Sweden)
Ian Proudfoot
2008-12-01
Full Text Available The survival of an Indic calendar among the Tengger people of the Brama highlands in east Java opens a window on Java’s calendar history. Its hybrid form reflects accommodations between this non-Muslim Javanese group and the increasingly dominant Muslim Javanese culture. Reconstruction is challenging because of this hybridity, because of inconsistencies in practice, and because the historical evidence is sketchy and often difficult to interpret.
Sparse Bayesian learning for DOA estimation with mutual coupling.
Dai, Jisheng; Hu, Nan; Xu, Weichao; Chang, Chunqi
2015-10-16
Sparse Bayesian learning (SBL) has given renewed interest to the problem of direction-of-arrival (DOA) estimation. It is generally assumed that the measurement matrix in SBL is precisely known. Unfortunately, this assumption may be invalid in practice due to the imperfect manifold caused by unknown or misspecified mutual coupling. This paper describes a modified SBL method for joint estimation of DOAs and mutual coupling coefficients with uniform linear arrays (ULAs). Unlike the existing method that only uses stationary priors, our new approach utilizes a hierarchical form of the Student t prior to enforce the sparsity of the unknown signal more heavily. We also provide a distinct Bayesian inference for the expectation-maximization (EM) algorithm, which can update the mutual coupling coefficients more efficiently. Another difference is that our method uses an additional singular value decomposition (SVD) to reduce the computational complexity of the signal reconstruction process and the sensitivity to the measurement noise.
Sparse Bayesian Learning for DOA Estimation with Mutual Coupling
Directory of Open Access Journals (Sweden)
Jisheng Dai
2015-10-01
Full Text Available Sparse Bayesian learning (SBL) has given renewed interest to the problem of direction-of-arrival (DOA) estimation. It is generally assumed that the measurement matrix in SBL is precisely known. Unfortunately, this assumption may be invalid in practice due to the imperfect manifold caused by unknown or misspecified mutual coupling. This paper describes a modified SBL method for joint estimation of DOAs and mutual coupling coefficients with uniform linear arrays (ULAs). Unlike the existing method that only uses stationary priors, our new approach utilizes a hierarchical form of the Student t prior to enforce the sparsity of the unknown signal more heavily. We also provide a distinct Bayesian inference for the expectation-maximization (EM) algorithm, which can update the mutual coupling coefficients more efficiently. Another difference is that our method uses an additional singular value decomposition (SVD) to reduce the computational complexity of the signal reconstruction process and the sensitivity to the measurement noise.
Bayesian tomography and integrated data analysis in fusion diagnostics
Li, Dong; Dong, Y. B.; Deng, Wei; Shi, Z. B.; Fu, B. Z.; Gao, J. M.; Wang, T. B.; Zhou, Yan; Liu, Yi; Yang, Q. W.; Duan, X. R.
2016-11-01
In this article, a Bayesian tomography method using a non-stationary Gaussian process as a prior is introduced. The Bayesian formalism allows quantities which bear uncertainty to be expressed in probabilistic form, so that the uncertainty of a final solution can be fully resolved from the confidence interval of a posterior probability. Moreover, a consistency check of that solution can be performed by checking whether the misfits between predicted and measured data are reasonably within an assumed data error. In particular, the accuracy of reconstructions is significantly improved by using the non-stationary Gaussian process, which can adapt to the varying smoothness of the emission distribution. The implementation of this method in a soft X-ray diagnostic on HL-2A has been used to explore relevant physics in equilibrium and MHD instability modes. This project is carried out within a large-scale inference framework, aiming at an integrated analysis of heterogeneous diagnostics.
Dynamic Batch Bayesian Optimization
Azimi, Javad; Fern, Xiaoli
2011-01-01
Bayesian optimization (BO) algorithms try to optimize an unknown function that is expensive to evaluate using a minimum number of evaluations/experiments. Most of the proposed algorithms in BO are sequential, where only one experiment is selected at each iteration. This method can be time inefficient when each experiment takes a long time and more than one experiment can be run concurrently. On the other hand, requesting a fixed-size batch of experiments at each iteration causes performance inefficiency in BO compared to sequential policies. In this paper, we present an algorithm that requests a batch of experiments at each time step t, where the batch size p_t is dynamically determined at each step. Our algorithm is based on the observation that the sequence of experiments selected by the sequential policy can sometimes be almost independent from each other. Our algorithm identifies such scenarios and requests those experiments at the same time without degrading the performance. We evaluate our proposed method us...
Franzo, Giovanni; Cortey, Martí; de Castro, Alessandra Marnie Martins Gomes; Piovezan, Ubiratan; Szabo, Matias Pablo Juan; Drigo, Michele; Segalés, Joaquim; Richtzenhain, Leonardo José
2015-07-09
Since its discovery, Porcine circovirus type 2 (PCV2) has emerged as one of the most relevant swine pathogens, causing significant economic losses for the pig industry. While four genotypes have been identified, only three (PCV2a, PCV2b and PCV2d) are currently circulating and display a worldwide distribution. Another genotype, PCV2c, has been described only once, in Danish archive samples collected between 1980 and 1990. In addition to commercial pigs, PCV2 has been demonstrated to infect wild boars and other wild species, which can potentially serve as a reservoir for domestic populations. In this study, eight sequences obtained from feral pigs in the Pantanal region (Mato Grosso do Sul State, Brazil) were compared with reference sequences and other Brazilian sequences, and the results revealed remarkable genetic diversity, with all four currently recognised genotypes being detected (PCV2a, PCV2b, PCV2c and PCV2d). This finding represents a remarkable discovery, as it is the first detection of PCV2c since 1990 and the first-ever detection of PCV2c in live animals. The peculiar population history and ecological scenario of feral pigs in the Pantanal, coupled with the complex and still only partially known relationship of feral pigs with other PCV2-susceptible species (i.e., domestic pigs, wild boars and peccaries), open exciting questions concerning PCV2 origin and evolution. Overall, the results of the present study led us to form the following hypothesis: the PCV2 strains found in feral pigs may be the last descendants of the strains that circulated among European pigs in the past, or they may have infected these feral pigs more recently through a bridge species.
Institute of Scientific and Technical Information of China (English)
Giulio Garaffa; Salvatore Sansalone; David J Ralph
2013-01-01
In recent years, a variety of new techniques of penile reconstruction have been described in the literature. This paper focuses on the most recent advances in male genital reconstruction after trauma, excision of benign and malignant disease, in gender reassignment surgery and aphallia, with emphasis on surgical technique, cosmetic and functional outcome.
Sahin, Sefa; Yildirim, Cengiz; Akif Sarikaya, Mehmet; Tuysuz, Okan; Genc, S. Can; Ersen Aksoy, Murat; Ertekin Doksanalti, Mustafa
2016-04-01
Cosmogenic surface exposure dating is based on the production of rare nuclides in exposed rocks, which interact with cosmic rays. Through modelling of measured 36Cl concentrations, we can obtain information on the history of earthquake activity. Yet there are several factors which may affect the production of rare nuclides, such as the geometry of the fault, topography, the geographic location of the study area, temporal variations of the Earth's magnetic field, self-cover, and the denudation rate on the scarp. Recently developed models provide a method to infer the timing of earthquakes and slip rates on limited scales by taking these parameters into account. Our study area, the Knidos Fault Zone, is located on the Datça Peninsula in Southwestern Anatolia and contains several normal fault scarps formed within the limestone, which are suitable for cosmogenic chlorine-36 (36Cl) dating models. Since it has a well-preserved scarp, we have focused on the Mezarlık Segment of the fault zone, which has an average length of 300 m and height of 12-15 m. 128 continuous samples from the top to the bottom of the fault scarp were collected for analysis of cosmogenic 36Cl isotope concentrations. The main purpose of this study is to analyze the factors affecting the production rates and the amount of cosmogenic 36Cl nuclide concentration. 36Cl isotope concentrations are measured by AMS laboratories. From the local production rates and the concentrations of the cosmic isotopes, we can calculate exposure ages of the samples. Recent research elucidated each step of the application of this method in the Matlab programming language (e.g. Schlagenhauf et al., 2010). This is extremely helpful for generating models of Quaternary activity of normal faults. We, however, wanted to build a user-friendly program in the open-source programming language "R" (GNU Project) that might help those without knowledge of complex math programming, making calculations as easy and understandable as
Harwood, D. M.; Porter, N.; OConnell, S.
2014-12-01
A new paleobiological proxy for Antarctic paleoclimate history provides insight into past extent of open marine shelves on Wilkes Land margin, and calls for reassessment of IRD interpretations in the deep-sea. Marine, epiphytic benthic diatoms that grow attached to macroalgae (seaweed) are recovered in Miocene sediment from DSDP Site 269. They suggest periodic presence of floating rafts or 'biotic oases' in the Southern Ocean comprising buoyant macroalgae, attached benthic diatoms, and biota associated with this displaced coastal community. Macroalgae attach to the substrate with a holdfast, a multi-fingered structure that serves as an anchor. Uprooted holdfasts attached to buoyant macroalgae can raft sedimentary particles, some large (>50 kg), into the deep-sea. In addition, a rich biota of associated invertebrates live in cavities within the holdfast, the dispersal of which may explain the biogeographic distribution of organisms on Subantarctic islands. The stratigraphic occurrence of large, benthic epiphytic diatoms of genera Arachnoidiscus, Isthmia, Rhabdonema, Gephyra, Trigonium, and smaller Achnanthes, Cocconeis, Grammatophora, and Rhaphoneis in sediment cores from DSDP Site 269 reflect a rich, productive epiphytic diatom flora that maintained its position in the photic zone attached to their buoyant seaweed hosts. Amphipods and other herbivores grazed the benthic diatoms and produced diatom-rich fecal pellets that were delivered to the sea-floor. The discontinuous stratigraphic occurrence of the epiphytic diatoms, amongst the background of planktonic diatoms in Core 9 of DSDP Site 269, suggests environmental changes induced by either warm or cold events may have controlled the production and/or release of the macroalgae into the deep-sea. Warm events led to increased shelf areas, and cold events led to formation of ice on the macroalgae to increase their buoyancy and lift-off. Complicating the distinction between warm and cold events is the potential for the
The subjectivity of scientists and the Bayesian statistical approach
Press, James S
2001-01-01
Comparing and contrasting the reality of subjectivity in the work of history's great scientists and the modern Bayesian approach to statistical analysis. Scientists and researchers are taught to analyze their data from an objective point of view, allowing the data to speak for themselves rather than assigning them meaning based on expectations or opinions. But scientists have never behaved fully objectively. Throughout history, some of our greatest scientific minds have relied on intuition, hunches, and personal beliefs to make sense of empirical data, and these subjective influences have often a
Bayesian seismic AVO inversion
Energy Technology Data Exchange (ETDEWEB)
Buland, Arild
2002-07-01
A new linearized AVO inversion technique is developed in a Bayesian framework. The objective is to obtain posterior distributions for P-wave velocity, S-wave velocity and density. Distributions for other elastic parameters can also be assessed, for example acoustic impedance, shear impedance and P-wave to S-wave velocity ratio. The inversion algorithm is based on the convolutional model and a linearized weak contrast approximation of the Zoeppritz equation. The solution is represented by a Gaussian posterior distribution with explicit expressions for the posterior expectation and covariance, hence exact prediction intervals for the inverted parameters can be computed under the specified model. The explicit analytical form of the posterior distribution provides a computationally fast inversion method. Tests on synthetic data show that all inverted parameters were almost perfectly retrieved when the noise approached zero. With realistic noise levels, acoustic impedance was the best determined parameter, while the inversion provided practically no information about the density. The inversion algorithm has also been tested on a real 3-D dataset from the Sleipner Field. The results show good agreement with well logs but the uncertainty is high. The stochastic model includes uncertainties of both the elastic parameters, the wavelet and the seismic and well log data. The posterior distribution is explored by Markov chain Monte Carlo simulation using the Gibbs sampler algorithm. The inversion algorithm has been tested on a seismic line from the Heidrun Field with two wells located on the line. The uncertainty of the estimated wavelet is low. In the Heidrun examples the effect of including uncertainty of the wavelet and the noise level was marginal with respect to the AVO inversion results. We have developed a 3-D linearized AVO inversion method with spatially coupled model parameters where the objective is to obtain posterior distributions for P-wave velocity, S
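The closed-form Gaussian posterior that the abstract exploits can be sketched directly: for a linearized forward model d = G m + noise with a Gaussian prior on m, the posterior mean and covariance follow from the standard linear-Gaussian update. The 2x2 operator, prior, and noise levels below are illustrative toy numbers, not values from the thesis.

```python
# Minimal sketch of the closed-form Gaussian posterior behind a
# linearized Bayesian inversion: d = G m + noise, Gaussian prior on m.
# Two model parameters and two data points keep the algebra to 2x2.

def mat_inv2(A):
    """Inverse of a 2x2 matrix."""
    (a, b), (c, d) = A
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

def mat_mul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def mat_vec(A, v):
    return [sum(A[i][k] * v[k] for k in range(len(v))) for i in range(len(A))]

G = [[1.0, 0.5], [0.2, 1.0]]          # linearized forward operator (toy)
Sigma_p = [[1.0, 0.0], [0.0, 1.0]]    # prior covariance of the parameters
Sigma_n = [[0.1, 0.0], [0.0, 0.1]]    # noise covariance of the data
m_prior = [0.0, 0.0]                  # prior expectation
d_obs = [1.0, 0.5]                    # observed data (toy)

# Data-space covariance S = G Sigma_p G^T + Sigma_n
Gt = [[G[j][i] for j in range(2)] for i in range(2)]
S = mat_mul(mat_mul(G, Sigma_p), Gt)
S = [[S[i][j] + Sigma_n[i][j] for j in range(2)] for i in range(2)]

# Posterior expectation and covariance (exact, no sampling needed)
K = mat_mul(mat_mul(Sigma_p, Gt), mat_inv2(S))
residual = [d_obs[i] - mat_vec(G, m_prior)[i] for i in range(2)]
m_post = [m_prior[i] + mat_vec(K, residual)[i] for i in range(2)]
KGS = mat_mul(K, mat_mul(G, Sigma_p))
Sigma_post = [[Sigma_p[i][j] - KGS[i][j] for j in range(2)] for i in range(2)]
```

The explicit posterior covariance is what makes exact prediction intervals cheap to compute, which is the key computational advantage claimed for the linearized approach over sampling-based inversion.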
Bayesian microsaccade detection
Mihali, Andra; van Opheusden, Bas; Ma, Wei Ji
2017-01-01
Microsaccades are high-velocity fixational eye movements, with special roles in perception and cognition. The default microsaccade detection method is to determine when the smoothed eye velocity exceeds a threshold. We have developed a new method, Bayesian microsaccade detection (BMD), which performs inference based on a simple statistical model of eye positions. In this model, a hidden state variable changes between drift and microsaccade states at random times. The eye position is a biased random walk with different velocity distributions for each state. BMD generates samples from the posterior probability distribution over the eye state time series given the eye position time series. Applied to simulated data, BMD recovers the “true” microsaccades with fewer errors than alternative algorithms, especially at high noise. Applied to EyeLink eye tracker data, BMD detects almost all the microsaccades detected by the default method, but also apparent microsaccades embedded in high noise—although these can also be interpreted as false positives. Next we apply the algorithms to data collected with a Dual Purkinje Image eye tracker, whose higher precision justifies defining the inferred microsaccades as ground truth. When we add artificial measurement noise, the inferences of all algorithms degrade; however, at noise levels comparable to EyeLink data, BMD recovers the “true” microsaccades with 54% fewer errors than the default algorithm. Though unsuitable for online detection, BMD has other advantages: It returns probabilities rather than binary judgments, and it can be straightforwardly adapted as the generative model is refined. We make our algorithm available as a software package. PMID:28114483
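The generative model described above can be illustrated with a toy two-state hidden Markov model: a hidden state alternates between drift and microsaccade, eye speed is emitted with state-dependent Gaussian noise, and the forward-backward algorithm yields the posterior probability of each state at each time step. The transition probabilities and velocity distributions below are illustrative assumptions, not the paper's fitted values.

```python
import math

# Toy two-state model in the spirit of BMD: hidden state = 'drift'
# (low speed) or 'saccade' (high speed); speeds in deg/s.
STATES = ("drift", "saccade")
TRANS = {"drift": {"drift": 0.95, "saccade": 0.05},
         "saccade": {"drift": 0.30, "saccade": 0.70}}
EMIT = {"drift": (0.5, 0.5), "saccade": (8.0, 2.0)}   # (mean, sd)

def gauss_pdf(x, mu, sd):
    return math.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

def state_posteriors(speeds):
    """Forward-backward: P(state_t | all speeds) for each time step."""
    # Forward pass: prior 0.5/0.5 at t = 0, then propagate and emit.
    prev = {s: 0.5 * gauss_pdf(speeds[0], *EMIT[s]) for s in STATES}
    fwd = [prev]
    for x in speeds[1:]:
        cur = {s: gauss_pdf(x, *EMIT[s]) *
                  sum(prev[r] * TRANS[r][s] for r in STATES)
               for s in STATES}
        fwd.append(cur)
        prev = cur
    # Backward pass.
    nxt = {s: 1.0 for s in STATES}
    bwd = [nxt]
    for x in reversed(speeds[1:]):
        cur = {s: sum(TRANS[s][r] * gauss_pdf(x, *EMIT[r]) * nxt[r]
                      for r in STATES) for s in STATES}
        bwd.insert(0, cur)
        nxt = cur
    # Combine and normalize: a probability, not a binary judgment.
    post = []
    for f, b in zip(fwd, bwd):
        raw = {s: f[s] * b[s] for s in STATES}
        z = sum(raw.values())
        post.append({s: raw[s] / z for s in STATES})
    return post

speeds = [0.4, 0.6, 7.5, 9.0, 0.5]        # a brief high-velocity burst
post = state_posteriors(speeds)
```

Returning posterior probabilities rather than thresholded binary detections is exactly the advantage the abstract highlights over the default velocity-threshold method.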
Maximum margin Bayesian network classifiers.
Pernkopf, Franz; Wohlmayr, Michael; Tschiatschek, Sebastian
2012-03-01
We present a maximum margin parameter learning algorithm for Bayesian network classifiers using a conjugate gradient (CG) method for optimization. In contrast to previous approaches, we maintain the normalization constraints on the parameters of the Bayesian network during optimization, i.e., the probabilistic interpretation of the model is not lost. This enables us to handle missing features in discriminatively optimized Bayesian networks. In experiments, we compare the classification performance of maximum margin parameter learning to conditional likelihood and maximum likelihood learning approaches. Discriminative parameter learning significantly outperforms generative maximum likelihood estimation for naive Bayes and tree augmented naive Bayes structures on all considered data sets. Furthermore, maximizing the margin dominates the conditional likelihood approach in terms of classification performance in most cases. We provide results for a recently proposed maximum margin optimization approach based on convex relaxation. While the classification results are highly similar, our CG-based optimization is computationally up to orders of magnitude faster. Margin-optimized Bayesian network classifiers achieve classification performance comparable to support vector machines (SVMs) using fewer parameters. Moreover, we show that unanticipated missing feature values during classification can be easily processed by discriminatively optimized Bayesian network classifiers, a case where discriminative classifiers usually require mechanisms to complete unknown feature values in the data first.
Glickel, Steven Z; Gupta, Salil
2006-05-01
Volar ligament reconstruction is an effective technique for treating symptomatic laxity of the CMC joint of the thumb. The laxity may be a manifestation of generalized ligament laxity, post-traumatic, or metabolic (Ehlers-Danlos). The reconstruction reduces the shear forces on the joint that contribute to the development and persistence of inflammation. Although there have been only a few reports of the results of volar ligament reconstruction, the use of the procedure to treat Stage I and Stage II disease gives good to excellent results consistently. More advanced stages of disease are best treated by trapeziectomy, with or without ligament reconstruction.
Bayesian modeling using WinBUGS
Ntzoufras, Ioannis
2009-01-01
A hands-on introduction to the principles of Bayesian modeling using WinBUGS Bayesian Modeling Using WinBUGS provides an easily accessible introduction to the use of WinBUGS programming techniques in a variety of Bayesian modeling settings. The author provides an accessible treatment of the topic, offering readers a smooth introduction to the principles of Bayesian modeling with detailed guidance on the practical implementation of key principles. The book begins with a basic introduction to Bayesian inference and the WinBUGS software and goes on to cover key topics, including: Markov Chain Monte Carlo algorithms in Bayesian inference Generalized linear models Bayesian hierarchical models Predictive distribution and model checking Bayesian model and variable evaluation Computational notes and screen captures illustrate the use of both WinBUGS as well as R software to apply the discussed techniques. Exercises at the end of each chapter allow readers to test their understanding of the presented concepts and all ...
Bayesian Methods and Universal Darwinism
Campbell, John
2010-01-01
Bayesian methods since the time of Laplace have been understood by their practitioners as closely aligned to the scientific method. Indeed a recent champion of Bayesian methods, E. T. Jaynes, titled his textbook on the subject Probability Theory: the Logic of Science. Many philosophers of science including Karl Popper and Donald Campbell have interpreted the evolution of Science as a Darwinian process consisting of a 'copy with selective retention' algorithm abstracted from Darwin's theory of Natural Selection. Arguments are presented for an isomorphism between Bayesian Methods and Darwinian processes. Universal Darwinism, as the term has been developed by Richard Dawkins, Daniel Dennett and Susan Blackmore, is the collection of scientific theories which explain the creation and evolution of their subject matter as due to the operation of Darwinian processes. These subject matters span the fields of atomic physics, chemistry, biology and the social sciences. The principle of Maximum Entropy states that system...
Attention in a bayesian framework
DEFF Research Database (Denmark)
Whiteley, Louise Emma; Sahani, Maneesh
2012-01-01
The behavioral phenomena of sensory attention are thought to reflect the allocation of a limited processing resource, but there is little consensus on the nature of the resource or why it should be limited. Here we argue that a fundamental bottleneck emerges naturally within Bayesian models...... of perception, and use this observation to frame a new computational account of the need for, and action of, attention - unifying diverse attentional phenomena in a way that goes beyond previous inferential, probabilistic and Bayesian models. Attentional effects are most evident in cluttered environments......, and include both selective phenomena, where attention is invoked by cues that point to particular stimuli, and integrative phenomena, where attention is invoked dynamically by endogenous processing. However, most previous Bayesian accounts of attention have focused on describing relatively simple experimental...
Probability biases as Bayesian inference
Directory of Open Access Journals (Sweden)
Andre; C. R. Martins
2006-11-01
Full Text Available In this article, I will show how several observed biases in human probabilistic reasoning can be partially explained as good heuristics for making inferences in an environment where probabilities have uncertainties associated to them. Previous results show that the weight functions and the observed violations of coalescing and stochastic dominance can be understood from a Bayesian point of view. We will review those results and see that Bayesian methods should also be used as part of the explanation behind other known biases. That means that, although the observed errors are still errors, they can be understood as adaptations to the solution of real life problems. Heuristics that allow fast evaluations and mimic a Bayesian inference would be an evolutionary advantage, since they would give us an efficient way of making decisions. In that sense, it should be no surprise that humans reason with probability as it has been observed.
Di Nardo, Antonello; Knowles, Nick J; Wadsworth, Jemma; Haydon, Daniel T; King, Donald P
2014-01-01
Reconstructing the evolutionary history, demographic signal and dispersal processes from viral genome sequences contributes to our understanding of the epidemiological dynamics underlying epizootic events. In this study, a Bayesian phylogenetic framework was used to explore the phylodynamics and spatio-temporal dispersion of the O CATHAY topotype of foot-and-mouth disease virus (FMDV) that caused epidemics in the Philippines between 1994 and 2005. Sequences of the FMDV genome encoding the VP1 showed that the O CATHAY FMD epizootic in the Philippines resulted from a single introduction and was characterised by three main transmission hubs in Rizal, Bulacan and Manila Provinces. From a wider regional perspective, phylogenetic reconstruction of all available O CATHAY VP1 nucleotide sequences identified three distinct sub-lineages associated with country-based clusters originating in Hong Kong Special Administrative Region (SAR), the Philippines and Taiwan. The root of this phylogenetic tree was located in Hong Kong SAR, representing the most likely source for the introduction of this lineage into the Philippines and Taiwan. The reconstructed O CATHAY phylodynamics revealed three chronologically distinct evolutionary phases, culminating in a reduction in viral diversity over the final 10 years. The analysis suggests that viruses from the O CATHAY topotype have been continually maintained within swine industries close to Hong Kong SAR, following the extinction of virus lineages from the Philippines and the reduced number of FMD cases in Taiwan.
Progress on Bayesian Inference of the Fast Ion Distribution Function
DEFF Research Database (Denmark)
Stagner, L.; Heidbrink, W.W.; Chen, X.;
2013-01-01
The fast-ion distribution function (DF) has a complicated dependence on several phase-space variables. The standard analysis procedure in energetic particle research is to compute the DF theoretically, use that DF in forward modeling to predict diagnostic signals, then compare with measured data...... sensitivity of the measurements are incorporated into Bayesian likelihood probabilities. Prior probabilities describe physical constraints. This poster will show reconstructions of classically described, low-power, MHD-quiescent distribution functions from actual FIDA measurements. A description of the full...
A Bayesian Network View on Nested Effects Models
Directory of Open Access Journals (Sweden)
Fröhlich Holger
2009-01-01
Full Text Available Nested effects models (NEMs are a class of probabilistic models that were designed to reconstruct a hidden signalling structure from a large set of observable effects caused by active interventions into the signalling pathway. We give a more flexible formulation of NEMs in the language of Bayesian networks. Our framework constitutes a natural generalization of the original NEM model, since it explicitly states the assumptions that are tacitly underlying the original version. Our approach gives rise to new learning methods for NEMs, which have been implemented in the R/Bioconductor package nem. We validate these methods in a simulation study and apply them to a synthetic lethality dataset in yeast.
Miró-Herrans, Aida T; Al-Meeri, Ali; Mulligan, Connie J
2014-01-01
Population migration has played an important role in human evolutionary history and in the patterning of human genetic variation. A deeper and empirically-based understanding of human migration dynamics is needed in order to interpret genetic and archaeological evidence and to accurately reconstruct the prehistoric processes that comprise human evolutionary history. Current empirical estimates of migration include either short time frames (i.e. within one generation) or partial knowledge about migration, such as proportion of migrants or distance of migration. An analysis of migration that includes both proportion of migrants and distance, and direction over multiple generations would better inform prehistoric reconstructions. To evaluate human migration, we use GPS coordinates from the place of residence of the Yemeni individuals sampled in our study, their birthplaces and their parents' and grandparents' birthplaces to calculate the proportion of migrants, as well as the distance and direction of migration events between each generation. We test for differences in these values between the generations and identify factors that influence the probability of migration. Our results show that the proportion and distance of migration between females and males is similar within generations. In contrast, the proportion and distance of migration is significantly lower in the grandparents' generation, most likely reflecting the decreasing effect of technology. Based on our results, we calculate the proportion of migration events (0.102) and mean and median distances of migration (96 km and 26 km) for the grandparent's generation to represent early times in human evolution. These estimates can serve to set parameter values of demographic models in model-based methods of prehistoric reconstruction, such as approximate Bayesian computation. Our study provides the first empirically-based estimates of human migration over multiple generations in a developing country and these
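Distances between birthplace GPS coordinates, as used above to quantify migration between generations, are conventionally computed as great-circle (haversine) distances. A sketch with hypothetical coordinates, not the study's data:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, r_earth=6371.0):
    """Great-circle distance in km between two (lat, lon) points in degrees."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 \
        + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * r_earth * asin(sqrt(a))

# Hypothetical birthplaces: Sana'a and Ta'izz, Yemen (~200 km apart).
d = haversine_km(15.3694, 44.1910, 13.5789, 44.0219)
```

Such per-generation distances, together with the proportion of migration events, are the kind of quantities that can parameterize demographic models in approximate Bayesian computation.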
National Oceanic and Atmospheric Administration, Department of Commerce — The NOAA Paleoclimatology Program archives reconstructions of past climatic conditions derived from paleoclimate proxies, in addition to the Program's large holdings...
... breasts. Questions for the Doctor. Facts for Life. Komen. Caring for your breasts: helpful tips ... that can help. Federal law requires most insurance plans to cover the cost of breast reconstruction. Learn more ...
Bayesian Missile System Reliability from Point Estimates
2014-10-28
Report on the use of the Maximum Entropy Principle (MEP) to convert point estimates into probability distributions to be used as priors for Bayesian reliability analysis of missile data, illustrated by applying the priors to a Bayesian reliability model of a missile system. Subject terms: priors, Bayesian, missile.
Perception, illusions and Bayesian inference.
Nour, Matthew M; Nour, Joseph M
2015-01-01
Descriptive psychopathology makes a distinction between veridical perception and illusory perception. In both cases a perception is tied to a sensory stimulus, but in illusions the perception is of a false object. This article re-examines this distinction in light of new work in theoretical and computational neurobiology, which views all perception as a form of Bayesian statistical inference that combines sensory signals with prior expectations. Bayesian perceptual inference can solve the 'inverse optics' problem of veridical perception and provides a biologically plausible account of a number of illusory phenomena, suggesting that veridical and illusory perceptions are generated by precisely the same inferential mechanisms.
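For Gaussian signals, the prior-plus-likelihood combination underlying Bayesian perceptual inference reduces to a precision-weighted average. A minimal illustration of how a strong prior expectation can bias a percept, one way such inference can produce illusions (the numbers are invented):

```python
def combine_gaussians(mu_prior, var_prior, mu_sense, var_sense):
    """Posterior mean and variance when a Gaussian prior expectation is
    combined with a Gaussian sensory likelihood (precision-weighted rule)."""
    w = var_sense / (var_prior + var_sense)            # weight on the prior
    mu_post = w * mu_prior + (1 - w) * mu_sense
    var_post = (var_prior * var_sense) / (var_prior + var_sense)
    return mu_post, var_post

# A noisy sensory signal (variance 3) is pulled toward the prior (variance 1):
mu, var = combine_gaussians(mu_prior=0.0, var_prior=1.0,
                            mu_sense=4.0, var_sense=3.0)
# The percept (mu = 1.0) lies between stimulus and expectation, and its
# variance (0.75) is smaller than either source alone.
```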
Bayesian test and Kuhn's paradigm
Institute of Scientific and Technical Information of China (English)
Chen Xiaoping
2006-01-01
Kuhn's theory of paradigm reveals a pattern of scientific progress, in which normal science alternates with scientific revolution. But Kuhn greatly underrated the function of scientific test in his pattern, because he focuses all his attention on the hypothetico-deductive schema instead of the Bayesian schema. This paper employs the Bayesian schema to re-examine Kuhn's theory of paradigm, to uncover its logical and rational components, and to illustrate the tensional structure of logic and belief, rationality and irrationality, in the process of scientific revolution.
3D Bayesian contextual classifiers
DEFF Research Database (Denmark)
Larsen, Rasmus
2000-01-01
We extend a series of multivariate Bayesian 2-D contextual classifiers to 3-D by specifying a simultaneous Gaussian distribution for the feature vectors as well as a prior distribution of the class variables of a pixel and its 6 nearest 3-D neighbours.
A Bayesian Nonparametric Approach to Test Equating
Karabatsos, George; Walker, Stephen G.
2009-01-01
A Bayesian nonparametric model is introduced for score equating. It is applicable to all major equating designs, and has advantages over previous equating models. Unlike the previous models, the Bayesian model accounts for positive dependence between distributions of scores from two tests. The Bayesian model and the previous equating models are…
Bayesian Model Averaging for Propensity Score Analysis
Kaplan, David; Chen, Jianshen
2013-01-01
The purpose of this study is to explore Bayesian model averaging in the propensity score context. Previous research on Bayesian propensity score analysis does not take into account model uncertainty. In this regard, an internally consistent Bayesian framework for model building and estimation must also account for model uncertainty. The…
Bayesian networks and food security - An introduction
Stein, A.
2004-01-01
This paper gives an introduction to Bayesian networks. Networks are defined and put into a Bayesian context. Directed acyclical graphs play a crucial role here. Two simple examples from food security are addressed. Possible uses of Bayesian networks for implementation and further use in decision support...
Plug & Play object oriented Bayesian networks
DEFF Research Database (Denmark)
Bangsø, Olav; Flores, J.; Jensen, Finn Verner
2003-01-01
Object oriented Bayesian networks have proven themselves useful in recent years. The idea of applying an object oriented approach to Bayesian networks has extended their scope to larger domains that can be divided into autonomous but interrelated entities. Object oriented Bayesian networks have b...
Quantum System Identification by Bayesian Analysis of Noisy Data: Beyond Hamiltonian Tomography
Schirmer, S G
2009-01-01
We consider how to characterize the dynamics of a quantum system from a restricted set of initial states and measurements using Bayesian analysis. Previous work has shown that Hamiltonian systems can be well estimated from analysis of noisy data. Here we show how to generalize this approach to systems with moderate dephasing in the eigenbasis of the Hamiltonian. We illustrate the process for a range of three-level quantum systems. The results suggest that the Bayesian estimation of the frequencies and dephasing rates is generally highly accurate, and that the main source of error is errors in the reconstructed Hamiltonian basis.
Bayesian stable isotope mixing models
In this paper we review recent advances in Stable Isotope Mixing Models (SIMMs) and place them into an over-arching Bayesian statistical framework which allows for several useful extensions. SIMMs are used to quantify the proportional contributions of various sources to a mixtur...
Naive Bayesian for Email Filtering
Institute of Scientific and Technical Information of China (English)
无
2002-01-01
The paper presents a method of email filtering based on naive Bayesian theory that can effectively filter junk mail and illegal mail. Furthermore, the keys of implementation are discussed in detail. The filtering model is obtained from a training set of emails. The filtering can be done without the user's specification of filtering rules.
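The kind of filter described can be sketched as a multinomial naive Bayes classifier with Laplace smoothing. The tiny training set below is invented for illustration, and the paper's exact formulation may differ:

```python
import math
from collections import Counter

class NaiveBayesFilter:
    """Minimal multinomial naive Bayes filter: 1 = junk, 0 = legitimate."""

    def fit(self, emails, labels):
        self.counts = {0: Counter(), 1: Counter()}   # word counts per class
        self.docs = Counter(labels)                  # class priors from data
        for words, y in zip(emails, labels):
            self.counts[y].update(words)
        self.vocab = set(self.counts[0]) | set(self.counts[1])
        return self

    def predict(self, words):
        scores = {}
        for y in (0, 1):
            total = sum(self.counts[y].values())
            score = math.log(self.docs[y] / sum(self.docs.values()))
            for w in words:                          # Laplace (add-one) smoothing
                score += math.log((self.counts[y][w] + 1)
                                  / (total + len(self.vocab)))
            scores[y] = score
        return max(scores, key=scores.get)

train = [("win money now".split(), 1), ("meeting at noon".split(), 0),
         ("cheap money offer".split(), 1), ("lunch at noon today".split(), 0)]
nb = NaiveBayesFilter().fit([w for w, _ in train], [y for _, y in train])
label = nb.predict("free money".split())   # classified as junk (1)
```

Note that no hand-written filtering rules are needed: the model is learned entirely from the labeled training emails, matching the paper's claim.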
Bayesian analysis of binary sequences
Torney, David C.
2005-03-01
This manuscript details Bayesian methodology for "learning by example", with binary n-sequences encoding the objects under consideration. Priors prove influential; conformable priors are described. Laplace approximation of Bayes integrals yields posterior likelihoods for all n-sequences. This involves the optimization of a definite function over a convex domain--efficiently effectuated by the sequential application of the quadratic program.
Bayesian NL interpretation and learning
Zeevat, H.
2011-01-01
Everyday natural language communication is normally successful, even though contemporary computational linguistics has shown that NL is characterised by very high degree of ambiguity and the results of stochastic methods are not good enough to explain the high success rate. Bayesian natural language
ANALYSIS OF BAYESIAN CLASSIFIER ACCURACY
Directory of Open Access Journals (Sweden)
Felipe Schneider Costa
2013-01-01
Full Text Available The naïve Bayes classifier is considered one of the most effective classification algorithms today, competing with more modern and sophisticated classifiers. Despite being based on the unrealistic (naïve) assumption that all variables are independent given the output class, the classifier provides proper results. However, depending on the scenario utilized (network structure, number of samples or training cases, number of variables), the network may not provide appropriate results. This study uses a process of variable selection, using the chi-squared test to verify the existence of dependence between variables in the data model, in order to identify the reasons which prevent a Bayesian network from providing good performance. A detailed analysis of the data is also proposed, unlike other existing work, as well as adjustments in the case of limit values between two adjacent classes. Furthermore, variable weights are used in the calculation of a posteriori probabilities, calculated with the mutual information function. Tests were applied to both a naïve Bayesian network and a hierarchical Bayesian network. After testing, a significant reduction in error rate was observed. The naïve Bayesian network showed a drop in error rate from twenty-five percent to five percent, considering the initial results of the classification process. In the hierarchical network, there was not only a drop of fifteen percent in the error rate, but the final result also came to zero.
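The chi-squared screening step described above tests each variable's dependence on the class before training. A self-contained sketch with invented contingency tables (a library routine such as a chi-squared contingency test would normally be used instead):

```python
def chi2_statistic(table):
    """Pearson chi-squared statistic for a 2-D contingency table of
    variable value vs. class label, used to test for dependence."""
    rows = [sum(r) for r in table]
    cols = [sum(c) for c in zip(*table)]
    n = sum(rows)
    stat = 0.0
    for i, r in enumerate(rows):
        for j, c in enumerate(cols):
            expected = r * c / n                     # counts under independence
            stat += (table[i][j] - expected) ** 2 / expected
    return stat

# A variable strongly associated with the class (kept) vs. one independent
# of it (a candidate for removal before training the classifier):
dependent = [[30, 10], [10, 30]]
independent = [[20, 20], [20, 20]]
crit_95 = 3.841           # chi-squared critical value at 5%, 1 degree of freedom
keep = chi2_statistic(dependent) > crit_95       # True: dependence detected
drop = chi2_statistic(independent) <= crit_95    # True: no dependence detected
```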
Bayesian inference for Hawkes processes
DEFF Research Database (Denmark)
Rasmussen, Jakob Gulddahl
The Hawkes process is a practically and theoretically important class of point processes, but parameter-estimation for such a process can pose various problems. In this paper we explore and compare two approaches to Bayesian inference. The first approach is based on the so-called conditional...
Bayesian Classification of Image Structures
DEFF Research Database (Denmark)
Goswami, Dibyendu; Kalkan, Sinan; Krüger, Norbert
2009-01-01
In this paper, we describe work on Bayesian classifiers for distinguishing between homogeneous structures, textures, edges and junctions. We build semi-local classifiers from hand-labeled images to distinguish between these four different kinds of structures based on the concept of intrinsic dimensi...
3-D contextual Bayesian classifiers
DEFF Research Database (Denmark)
Larsen, Rasmus
In this paper we will consider extensions of a series of Bayesian 2-D contextual classification procedures proposed by Owen (1984), Hjort & Mohn (1984), Welch & Salter (1971) and Haslett (1985) to 3 spatial dimensions. It is evident that compared to classical pixelwise classification further...
Bayesian Networks and Influence Diagrams
DEFF Research Database (Denmark)
Kjærulff, Uffe Bro; Madsen, Anders Læsø
Bayesian Networks and Influence Diagrams: A Guide to Construction and Analysis, Second Edition, provides a comprehensive guide for practitioners who wish to understand, construct, and analyze intelligent systems for decision support based on probabilistic networks. This new edition contains six new...
Bayesian image restoration, using configurations
DEFF Research Database (Denmark)
Thorarinsdottir, Thordis Linda
2006-01-01
configurations are expressed in terms of the mean normal measure of the random set. These probabilities are used as prior probabilities in a Bayesian image restoration approach. Estimation of the remaining parameters in the model is outlined for the salt and pepper noise. The inference in the model is discussed...
Bayesian Evidence and Model Selection
Knuth, Kevin H; Malakar, Nabin K; Mubeen, Asim M; Placek, Ben
2014-01-01
In this paper we review the concept of the Bayesian evidence and its application to model selection. The theory is presented along with a discussion of analytic, approximate and numerical techniques. Application to several practical examples within the context of signal processing are discussed.
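For simple conjugate models the Bayesian evidence can be computed analytically rather than numerically. A toy illustration (not from the paper) comparing a fair-coin model against a flexible model with a uniform prior on the bias:

```python
from math import comb

def evidence_fixed(k, n, theta=0.5):
    """Marginal likelihood of k heads in n flips under a fixed-bias model."""
    return comb(n, k) * theta**k * (1 - theta)**(n - k)

def evidence_uniform(k, n):
    """Marginal likelihood with a Uniform(0, 1) prior on the bias:
    the integral of C(n,k) * theta^k * (1-theta)^(n-k) equals 1/(n+1)."""
    return 1.0 / (n + 1)

# 9 heads in 10 flips: the evidence favours the flexible model...
k, n = 9, 10
bayes_factor = evidence_uniform(k, n) / evidence_fixed(k, n)   # ~9.3
# ...but for balanced data (5 of 10) the simpler fair-coin model wins,
# illustrating the automatic Occam's razor built into the evidence.
```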
Differentiated Bayesian Conjoint Choice Designs
Z. Sándor (Zsolt); M. Wedel (Michel)
2003-01-01
Previous conjoint choice design construction procedures have produced a single design that is administered to all subjects. This paper proposes to construct a limited set of different designs. The designs are constructed in a Bayesian fashion, taking into account prior uncertainty about...
Bayesian networks for evaluation of evidence from forensic entomology.
Andersson, M Gunnar; Sundström, Anders; Lindström, Anders
2013-09-01
In the aftermath of a CBRN incident, there is an urgent need to reconstruct events in order to bring the perpetrators to court and to take preventive actions for the future. The challenge is to discriminate, based on available information, between alternative scenarios. Forensic interpretation is used to evaluate to what extent results from the forensic investigation favor the prosecutors' or the defendants' arguments, using the framework of Bayesian hypothesis testing. Recently, several new scientific disciplines have been used in a forensic context. In the AniBioThreat project, the framework was applied to veterinary forensic pathology, tracing of pathogenic microorganisms, and forensic entomology. Forensic entomology is an important tool for estimating the postmortem interval in, for example, homicide investigations as a complement to more traditional methods. In this article we demonstrate the applicability of the Bayesian framework for evaluating entomological evidence in a forensic investigation through the analysis of a hypothetical scenario involving suspect movement of carcasses from a clandestine laboratory. Probabilities of different findings under the alternative hypotheses were estimated using a combination of statistical analysis of data, expert knowledge, and simulation, and entomological findings are used to update the beliefs about the prosecutors' and defendants' hypotheses and to calculate the value of evidence. The Bayesian framework proved useful for evaluating complex hypotheses using findings from several insect species, accounting for uncertainty about development rate, temperature, and precolonization. The applicability of the forensic statistic approach to evaluating forensic results from a CBRN incident is discussed.
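The value-of-evidence computation described here is a likelihood ratio that updates the prior odds of the competing hypotheses. A minimal sketch with invented probabilities (the study's actual values were derived from data, expert knowledge and simulation):

```python
def value_of_evidence(p_e_given_hp, p_e_given_hd):
    """Likelihood ratio V = P(E | Hp) / P(E | Hd): the factor by which the
    findings shift the odds of the prosecution vs. defence hypothesis."""
    return p_e_given_hp / p_e_given_hd

def posterior_odds(prior_odds, *evidence_lrs):
    """Bayes' rule in odds form; conditionally independent findings multiply."""
    odds = prior_odds
    for lr in evidence_lrs:
        odds *= lr
    return odds

# Hypothetical entomological findings: probabilities of observing the insect
# species and development stages under the two scenarios.
v_species = value_of_evidence(0.80, 0.10)   # species composition: LR = 8
v_stage = value_of_evidence(0.60, 0.30)     # development stage:   LR = 2
odds = posterior_odds(1.0, v_species, v_stage)   # 1:1 prior -> 16:1 posterior
```

In a full Bayesian-network treatment, dependence between findings and uncertainty about development rate, temperature and precolonization would be modeled explicitly rather than multiplied as independent ratios.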
ACG: rapid inference of population history from recombining nucleotide sequences
Directory of Open Access Journals (Sweden)
O'Fallon Brendan D
2013-02-01
Full Text Available Abstract Background Reconstruction of population history from genetic data often requires Monte Carlo integration over the genealogy of the samples. Among tools that perform such computations, few are able to consider genetic histories including recombination events, precluding their use on most alignments of nuclear DNA. Explicit consideration of recombinations requires modeling the history of the sequences with an Ancestral Recombination Graph (ARG) in place of a simple tree, which presents significant computational challenges. Results ACG is an extensible desktop application that uses a Bayesian Markov chain Monte Carlo procedure to estimate the posterior likelihood of an evolutionary model conditional on an alignment of genetic data. The ancestry of the sequences is represented by an ARG, which is estimated from the data with other model parameters. Importantly, ACG computes the full, Felsenstein likelihood of the ARG, not a pairwise or composite likelihood. Several strategies are used to speed computations, and ACG is roughly 100x faster than a similar, recombination-aware program. Conclusions Modeling the ancestry of the sequences with an ARG allows ACG to estimate the evolutionary history of recombining nucleotide sequences. ACG can accurately estimate the posterior distribution of population parameters such as the (scaled) population size and recombination rate, as well as many aspects of the recombinant history, including the positions of recombination breakpoints, the distribution of time to most recent common ancestor along the sequence, and the non-recombining trees at individual sites. Multiple substitution models and population size models are provided. ACG also provides a richly informative graphical interface that allows users to view the evolution of model parameters and likelihoods in real time.
Bayesian Alternation During Tactile Augmentation
Directory of Open Access Journals (Sweden)
Caspar Mathias Goeke
2016-10-01
Full Text Available A large number of studies suggest that the integration of multisensory signals by humans is well described by Bayesian principles. However, there are very few reports about cue combination between a native and an augmented sense. In particular, we asked whether adult participants are able to integrate an augmented sensory cue with existing native sensory information. For the purpose of this study we built a tactile augmentation device and compared different hypotheses of how untrained adult participants combine information from a native and an augmented sense. In a two-interval forced choice (2-IFC) task, while subjects were blindfolded and seated on a rotating platform, our sensory augmentation device translated information on whole-body yaw rotation into tactile stimulation. Three conditions were realized: tactile stimulation only (augmented condition), rotation only (native condition), and both augmented and native information (bimodal condition). Participants had to choose the one of two consecutive rotations with the higher angular rotation. For the analysis, we fitted the participants' responses with a probit model and calculated the just-noticeable difference (JND). Then we compared several models for predicting bimodal from unimodal responses. An objective Bayesian alternation model yielded a better prediction (reduced χ² = 1.67) than the Bayesian integration model (reduced χ² = 4.34). A non-Bayesian winner-takes-all model, which used either only native or only augmented values per subject for prediction, showed slightly higher accuracy (reduced χ² = 1.64). However, the performance of the Bayesian alternation model could be substantially improved (reduced χ² = 1.09) by utilizing subjective weights obtained from a questionnaire. As a result, the subjective Bayesian alternation model predicted bimodal performance most accurately among all tested models. These results suggest that information from augmented and existing sensory modalities in
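The probit fit and JND computation mentioned in this abstract can be sketched as follows. All trial counts are hypothetical, and a crude grid-search maximum-likelihood fit stands in for a proper optimizer.

```python
import math

def phi(x):
    # standard normal CDF via the error function
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

# hypothetical 2-IFC data: (angular difference in degrees between the two
# rotations, number of trials, number of "second rotation larger" responses)
data = [(-20, 40, 3), (-10, 40, 9), (-5, 40, 14), (0, 40, 20),
        (5, 40, 27), (10, 40, 32), (20, 40, 38)]

def neg_loglik(mu, sigma):
    # binomial negative log-likelihood of the probit psychometric curve
    nll = 0.0
    for delta, n, k in data:
        p = min(max(phi((delta - mu) / sigma), 1e-9), 1 - 1e-9)
        nll -= k * math.log(p) + (n - k) * math.log(1 - p)
    return nll

# crude grid search over (mu, sigma); a real analysis would use an optimizer
mu_hat, sigma_hat = min(
    ((mu / 10.0, s / 10.0) for mu in range(-50, 51) for s in range(10, 301)),
    key=lambda ps: neg_loglik(*ps))

# JND: distance between the 50% and 75% points of the fitted curve,
# i.e. sigma * Phi^{-1}(0.75), with Phi^{-1}(0.75) ≈ 0.6745
jnd = sigma_hat * 0.6745
```

The reduced χ² model comparison in the abstract would then be computed from the residuals of each candidate bimodal prediction against the observed bimodal JNDs.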
Bayesian analysis of rare events
Energy Technology Data Exchange (ETDEWEB)
Straub, Daniel, E-mail: straub@tum.de; Papaioannou, Iason; Betz, Wolfgang
2016-06-01
In many areas of engineering and science there is an interest in predicting the probability of rare events, in particular in applications related to safety and security. Increasingly, such predictions are made through computer models of physical systems in an uncertainty quantification framework. Additionally, with advances in IT, monitoring and sensor technology, an increasing amount of data on the performance of the systems is collected. This data can be used to reduce uncertainty, improve the probability estimates and consequently enhance the management of rare events and associated risks. Bayesian analysis is the ideal method to include the data into the probabilistic model. It ensures a consistent probabilistic treatment of uncertainty, which is central in the prediction of rare events, where extrapolation from the domain of observation is common. We present a framework for performing Bayesian updating of rare event probabilities, termed BUS. It is based on a reinterpretation of the classical rejection-sampling approach to Bayesian analysis, which enables the use of established methods for estimating probabilities of rare events. By drawing upon these methods, the framework makes use of their computational efficiency. These methods include the First-Order Reliability Method (FORM), tailored importance sampling (IS) methods and Subset Simulation (SuS). In this contribution, we briefly review these methods in the context of the BUS framework and investigate their applicability to Bayesian analysis of rare events in different settings. We find that, for some applications, FORM can be highly efficient and is surprisingly accurate, enabling Bayesian analysis of rare events with just a few model evaluations. In a general setting, BUS implemented through IS and SuS is more robust and flexible.
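The BUS idea, reinterpreting Bayesian updating as rejection sampling so that rare-event estimators can be reused, can be sketched in a toy reliability setting. Everything below (prior, likelihood, failure threshold) is hypothetical, and plain rejection sampling stands in for the more efficient FORM/IS/SuS implementations the paper actually advocates.

```python
import math, random

random.seed(2)

# prior: structural capacity R ~ Normal(5, 1); observation: a proof-load test
# measured R_obs = 4.5 with Normal noise (sd 0.5); rare event: R < 3 (failure)
def prior_sample():
    return random.gauss(5.0, 1.0)

def likelihood(r, r_obs=4.5, noise_sd=0.5):
    z = (r - r_obs) / noise_sd
    return math.exp(-0.5 * z * z)            # unnormalised Normal likelihood

c = 1.0  # upper bound on the unnormalised likelihood, required by BUS

accepted, failures = 0, 0
for _ in range(200000):
    r = prior_sample()
    if random.random() * c < likelihood(r):  # BUS acceptance: u <= L(r)/c
        accepted += 1
        if r < 3.0:                          # rare-event indicator
            failures += 1

# posterior probability of the rare event, conditioned on the test data
posterior_pf = failures / accepted
```

The point of BUS is that the acceptance event above defines an ordinary reliability problem in the augmented (u, r) space, so subset simulation or FORM can estimate the same quantity with far fewer model evaluations than brute-force rejection.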
Reconstructing Galaxy Histories from Globular Clusters
West, Michael J.; Cote, Patrick; Marzke, Ronald O.; Jordan, Andres
2004-01-01
Nearly a century after the true nature of galaxies as distant "island universes" was established, their origin and evolution remain great unsolved problems of modern astrophysics. One of the most promising ways to investigate galaxy formation is to study the ubiquitous globular star clusters that surround most galaxies. Recent advances in our understanding of the globular cluster systems of the Milky Way and other galaxies point to a complex picture of galaxy genesis driven by cannibalism, collisions, bursts of star formation and other tumultuous events.
Reconstructing the Limfjord’s history
DEFF Research Database (Denmark)
Philippsen, Bente
The Limfjord is a sound in Northern Jutland, Denmark, connecting the North Sea with the Kattegat. The complex interplay of eustatic sea level changes and isostatic land-rise caused the relative sea level of the region to fluctuate throughout the later part of the Holocene. Consequently, the region experienced periods mostly with freshwater/brackish conditions and others with predominantly marine conditions. The changes in relative sea level resulted in a landscape which at periods was characterised as a fjord (only one connection to the sea) and at others as a sound (more than one connection to the sea). Radiocarbon dating in this setting must account for the marine reservoir effect and the hardwater effect. Upwelling of bottom water and mixing with surface water results in a marine reservoir age of c. 400 years in the seas around Denmark. The hardwater effect occurs in freshwater systems with a high content of dissolved minerals, with "hard water"; these contain considerable amounts of "14C-dead" carbon.
STATISTICAL BAYESIAN ANALYSIS OF EXPERIMENTAL DATA.
Directory of Open Access Journals (Sweden)
AHLAM LABDAOUI
2012-12-01
Full Text Available The Bayesian researcher should know the basic ideas underlying Bayesian methodology and the computational tools used in modern Bayesian econometrics. Some of the most important methods of posterior simulation are Monte Carlo integration, importance sampling, Gibbs sampling and the Metropolis-Hastings algorithm. The Bayesian should also be able to put the theory and computational tools together in the context of substantive empirical problems. We focus primarily on recent developments in Bayesian computation, and then on particular models, inevitably combining theory and computation in the context of those models. Although we have tried to be reasonably complete in covering the basic ideas of Bayesian theory and the computational tools most commonly used by the Bayesian, there is no way we can cover all the classes of models used in econometrics. We therefore focus on applications to the analysis of variance and the linear regression model.
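As a minimal illustration of one posterior-simulation method named above, here is a Gibbs sampler for the conjugate normal linear regression model on simulated data. A flat prior on the coefficients and the Jeffreys prior on the error variance are assumed for simplicity; all data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)

# simulated data for a simple linear regression y = b0 + b1*x + noise
n = 100
x = rng.uniform(0, 10, n)
X = np.column_stack([np.ones(n), x])
beta_true, sigma_true = np.array([1.0, 2.0]), 0.5
y = X @ beta_true + rng.normal(0, sigma_true, n)

# Gibbs sampler with a flat prior on beta and p(sigma^2) proportional to 1/sigma^2
XtX_inv = np.linalg.inv(X.T @ X)
beta_hat = XtX_inv @ X.T @ y        # OLS estimate, the conditional posterior mean
sigma2, draws = 1.0, []
for it in range(3000):
    # beta | sigma^2, y ~ Normal(beta_hat, sigma^2 * (X'X)^{-1})
    beta = rng.multivariate_normal(beta_hat, sigma2 * XtX_inv)
    # sigma^2 | beta, y ~ Inverse-Gamma(n/2, RSS/2)
    rss = np.sum((y - X @ beta) ** 2)
    sigma2 = 1.0 / rng.gamma(n / 2.0, 2.0 / rss)
    if it >= 500:                   # discard burn-in
        draws.append((beta, sigma2))

post_beta = np.mean([b for b, _ in draws], axis=0)
```

The same two-block structure (coefficients given variance, variance given coefficients) underlies Gibbs samplers for ANOVA and many hierarchical regression models.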
Bayesian methods for measures of agreement
Broemeling, Lyle D
2009-01-01
Using WinBUGS to implement Bayesian inferences of estimation and testing hypotheses, Bayesian Methods for Measures of Agreement presents useful methods for the design and analysis of agreement studies. It focuses on agreement among the various players in the diagnostic process.The author employs a Bayesian approach to provide statistical inferences based on various models of intra- and interrater agreement. He presents many examples that illustrate the Bayesian mode of reasoning and explains elements of a Bayesian application, including prior information, experimental information, the likelihood function, posterior distribution, and predictive distribution. The appendices provide the necessary theoretical foundation to understand Bayesian methods as well as introduce the fundamentals of programming and executing the WinBUGS software.Taking a Bayesian approach to inference, this hands-on book explores numerous measures of agreement, including the Kappa coefficient, the G coefficient, and intraclass correlation...
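A Bayesian treatment of one measure named above, the kappa coefficient, can be sketched without WinBUGS: place a Dirichlet posterior on the cells of a 2x2 agreement table and push Monte Carlo draws through the kappa formula. The table counts below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(3)

# hypothetical 2x2 agreement table for two raters (rows: rater A, cols: rater B)
counts = np.array([[40, 5],
                   [8, 47]])

# posterior over the four cell probabilities: Dirichlet(counts + 1), i.e. a
# uniform Dirichlet prior updated with the multinomial table counts
draws = rng.dirichlet(counts.flatten() + 1, size=20000).reshape(-1, 2, 2)

p_agree = draws[:, 0, 0] + draws[:, 1, 1]     # observed agreement
pa = draws.sum(axis=2)                        # rater A marginals
pb = draws.sum(axis=1)                        # rater B marginals
p_chance = (pa * pb).sum(axis=1)              # chance agreement
kappa = (p_agree - p_chance) / (1.0 - p_chance)

kappa_mean = kappa.mean()
kappa_ci = np.percentile(kappa, [2.5, 97.5])  # 95% credible interval
```

The same simulation pattern extends directly to multi-category tables and to weighted kappa.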
Zhu, Hong-Ming; Pen, Ue-Li; Chen, Xuelei; Yu, Hao-Ran
2016-01-01
We present a direct approach to non-parametrically reconstruct the linear density field from an observed non-linear map. We solve for the unique displacement potential consistent with the non-linear density and positive definite coordinate transformation using a multigrid algorithm. We show that we recover the linear initial conditions up to $k\\sim 1\\ h/\\mathrm{Mpc}$ with minimal computational cost. This reconstruction approach generalizes the linear displacement theory to fully non-linear fields, potentially substantially expanding the BAO and RSD information content of dense large scale structure surveys, including for example SDSS main sample and 21cm intensity mapping.
Bayesian versus 'plain-vanilla Bayesian' multitarget statistics
Mahler, Ronald P. S.
2004-08-01
Finite-set statistics (FISST) is a direct generalization of single-sensor, single-target Bayes statistics to the multisensor-multitarget realm, based on random set theory. Various aspects of FISST are being investigated by several research teams around the world. In recent years, however, a few partisans have claimed that a "plain-vanilla Bayesian approach" suffices as down-to-earth, "straightforward," and general "first principles" for multitarget problems, and that FISST is therefore mere mathematical "obfuscation." In this and a companion paper I demonstrate the speciousness of these claims. In this paper I summarize general Bayes statistics, what is required to use it in multisensor-multitarget problems, and why FISST is necessary to make it practical. I then demonstrate that the "plain-vanilla Bayesian approach" is so heedlessly formulated that it is erroneous and not even Bayesian; it denigrates FISST concepts while unwittingly assuming them, and it has resulted in a succession of algorithms afflicted by inherent, but less than candidly acknowledged, computational "logjams."
ACL reconstruction - discharge
Anterior cruciate ligament reconstruction - discharge; ACL reconstruction - discharge ... had surgery to reconstruct your anterior cruciate ligament (ACL). The surgeon drilled holes in the bones of ...
Bayesian priors for transiting planets
Kipping, David M
2016-01-01
As astronomers push towards discovering ever-smaller transiting planets, it is increasingly common to deal with low signal-to-noise ratio (SNR) events, where the choice of priors plays an influential role in Bayesian inference. In the analysis of exoplanet data, the selection of priors is often treated as a nuisance, with observers typically defaulting to uninformative distributions. Such treatments miss a key strength of the Bayesian framework, especially in the low SNR regime, where even weak a priori information is valuable. When estimating the parameters of a low-SNR transit, two key pieces of information are known: (i) the planet has the correct geometric alignment to transit and (ii) the transit event exhibits sufficient signal-to-noise to have been detected. These represent two forms of observational bias. Accordingly, when fitting transits, the model parameter priors should not follow the intrinsic distributions of said terms, but rather those of both the intrinsic distributions and the observational ...
Bayesian approach to rough set
Marwala, Tshilidzi
2007-01-01
This paper proposes an approach to training rough set models using a Bayesian framework trained with the Markov chain Monte Carlo (MCMC) method. The prior probabilities are constructed from the prior knowledge that good rough set models have fewer rules. Markov chain Monte Carlo sampling is conducted in the rough set granule space, with the Metropolis algorithm used as the acceptance criterion. The proposed method is tested on estimating the risk of HIV given demographic data. The results obtained show that the proposed approach is able to achieve an average accuracy of 58%, with the accuracy varying up to 66%. In addition, the Bayesian rough set gives the probabilities of the estimated HIV status as well as the linguistic rules describing how the demographic parameters drive the risk of HIV.
Deep Learning and Bayesian Methods
Directory of Open Access Journals (Sweden)
Prosper Harrison B.
2017-01-01
Full Text Available A revolution is underway in which deep neural networks are routinely used to solve difficult problems such as face recognition and natural language understanding. Particle physicists have taken notice and have started to deploy these methods, achieving results that suggest a potentially significant shift in how data might be analyzed in the not too distant future. We discuss a few recent developments in the application of deep neural networks and then indulge in speculation about how such methods might be used to automate certain aspects of data analysis in particle physics. Next, the connection to Bayesian methods is discussed and the paper ends with thoughts on a significant practical issue, namely, how, from a Bayesian perspective, one might optimize the construction of deep neural networks.
Bayesian Source Separation and Localization
Knuth, K H
1998-01-01
The problem of mixed signals occurs in many different contexts; one of the most familiar being acoustics. The forward problem in acoustics consists of finding the sound pressure levels at various detectors resulting from sound signals emanating from the active acoustic sources. The inverse problem consists of using the sound recorded by the detectors to separate the signals and recover the original source waveforms. In general, the inverse problem is unsolvable without additional information. This general problem is called source separation, and several techniques have been developed that utilize maximum entropy, minimum mutual information, and maximum likelihood. In previous work, it has been demonstrated that these techniques can be recast in a Bayesian framework. This paper demonstrates the power of the Bayesian approach, which provides a natural means for incorporating prior information into a source model. An algorithm is developed that utilizes information regarding both the statistics of the amplitudes...
Bayesian Inference for Radio Observations
Lochner, Michelle; Zwart, Jonathan T L; Smirnov, Oleg; Bassett, Bruce A; Oozeer, Nadeem; Kunz, Martin
2015-01-01
(Abridged) New telescopes like the Square Kilometre Array (SKA) will push into a new sensitivity regime and expose systematics, such as direction-dependent effects, that could previously be ignored. Current methods for handling such systematics rely on alternating best estimates of instrumental calibration and models of the underlying sky, which can lead to inaccurate uncertainty estimates and biased results because such methods ignore any correlations between parameters. These deconvolution algorithms produce a single image that is assumed to be a true representation of the sky, when in fact it is just one realisation of an infinite ensemble of images compatible with the noise in the data. In contrast, here we report a Bayesian formalism that simultaneously infers both systematics and science. Our technique, Bayesian Inference for Radio Observations (BIRO), determines all parameters directly from the raw data, bypassing image-making entirely, by sampling from the joint posterior probability distribution. Thi...
Bayesian inference on proportional elections.
Directory of Open Access Journals (Sweden)
Gabriel Hideki Vatanabe Brunello
Full Text Available Polls for majoritarian voting systems usually show estimates of the percentage of votes for each candidate. However, proportional vote systems do not necessarily guarantee the candidate with the most percentage of votes will be elected. Thus, traditional methods used in majoritarian elections cannot be applied on proportional elections. In this context, the purpose of this paper was to perform a Bayesian inference on proportional elections considering the Brazilian system of seats distribution. More specifically, a methodology to answer the probability that a given party will have representation on the chamber of deputies was developed. Inferences were made on a Bayesian scenario using the Monte Carlo simulation technique, and the developed methodology was applied on data from the Brazilian elections for Members of the Legislative Assembly and Federal Chamber of Deputies in 2010. A performance rate was also presented to evaluate the efficiency of the methodology. Calculations and simulations were carried out using the free R statistical software.
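A sketch of the kind of computation described: draw vote shares from a Dirichlet posterior built on poll counts, allocate seats by a highest-averages rule, and count how often each party is represented. The poll counts and seat total are hypothetical, and D'Hondt is used here as a simplified stand-in for the Brazilian electoral-quotient rules.

```python
import numpy as np

rng = np.random.default_rng(4)

def dhondt(votes, n_seats):
    """Allocate seats by the D'Hondt highest-averages rule."""
    seats = np.zeros(len(votes), dtype=int)
    for _ in range(n_seats):
        seats[np.argmax(votes / (seats + 1))] += 1
    return seats

# hypothetical poll: 1000 respondents across 4 parties, 10 seats at stake
poll_counts = np.array([420, 310, 180, 90])
n_seats = 10

# posterior over vote shares: Dirichlet(poll counts + 1), i.e. a uniform prior
n_sims = 5000
got_seat = np.zeros(4)
for shares in rng.dirichlet(poll_counts + 1, size=n_sims):
    got_seat += dhondt(shares, n_seats) > 0

# estimated probability that each party wins at least one seat
prob_represented = got_seat / n_sims
```

The methodology in the abstract would additionally model the mapping from poll respondents to actual voters, but the Monte Carlo skeleton is the same.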
Denoising Message Passing for X-ray Computed Tomography Reconstruction
Perelli, Alessandro; Can, Ali; Davies, Mike E
2016-01-01
X-ray Computed Tomography (CT) reconstruction from a sparse number of views is becoming a powerful way to reduce either the radiation dose or the acquisition time in CT systems, but it still requires a huge computational time. This paper introduces an approximate Bayesian inference framework for CT reconstruction based on a family of denoising approximate message passing (DCT-AMP) algorithms able to improve both the convergence speed and the reconstruction quality. Approximate Message Passing for Compressed Sensing has been extensively analysed for random linear measurements, but there are still no clear solutions on how AMP should be modified and how it performs on real-world problems. In particular, to overcome the convergence issues of DCT-AMP with structured measurement matrices, we propose a disjoint preconditioned version of the algorithm tailored for both the geometric system model and the noise model. In addition, the Bayesian DCT-AMP formulation allows one to measure how close the current estimate is to the pr...
Bayesian analysis for kaon photoproduction
Energy Technology Data Exchange (ETDEWEB)
Marsainy, T., E-mail: tmart@fisika.ui.ac.id; Mart, T., E-mail: tmart@fisika.ui.ac.id [Department Fisika, FMIPA, Universitas Indonesia, Depok 16424 (Indonesia)
2014-09-25
We have investigated the contribution of the nucleon resonances in the kaon photoproduction process by using an established statistical decision making method, i.e., the Bayesian method. This method not only evaluates the model over its entire parameter space, but also takes the prior information and experimental data into account. The result indicates that certain resonances have larger probabilities to contribute to the process.
Bayesian priors and nuisance parameters
Gupta, Sourendu
2016-01-01
Bayesian techniques are widely used to obtain spectral functions from correlators. We suggest a technique to rid the results of nuisance parameters, i.e., parameters which are needed for the regularization but cannot be determined from data. We give examples where the method works, including a pion mass extraction with two flavours of staggered quarks at a lattice spacing of about 0.07 fm. We also give an example where the method does not work.
Elements of Bayesian experimental design
Energy Technology Data Exchange (ETDEWEB)
Sivia, D.S. [Rutherford Appleton Lab., Oxon (United Kingdom)
1997-09-01
We consider some elements of the Bayesian approach that are important for optimal experimental design. While the underlying principles used are very general, and are explained in detail in a recent tutorial text, they are applied here to the specific case of characterising the inferential value of different resolution peakshapes. This particular issue was considered earlier by Silver, Sivia and Pynn (1989, 1990a, 1990b), and the following presentation confirms and extends the conclusions of their analysis.
Bayesian Sampling using Condition Indicators
DEFF Research Database (Denmark)
Faber, Michael H.; Sørensen, John Dalsgaard
2002-01-01
This allows for a Bayesian formulation of the indicators whereby the experience and expertise of the inspection personnel may be fully utilized and consistently updated as frequentistic information is collected. The approach is illustrated on an example considering a concrete structure subject to corrosion. It is shown how half-cell potential measurements may be utilized to update the probability of excessive repair after 50 years.
Wahl, E. R.; Cook, E.; Diaz, H. F.; Meko, D. M.
2012-12-01
similar reconstructions that are planned using Bayesian hierarchical modeling methods.
Bayesian coestimation of phylogeny and sequence alignment
Directory of Open Access Journals (Sweden)
Jensen Jens
2005-04-01
Full Text Available Abstract Background Two central problems in computational biology are the determination of the alignment and phylogeny of a set of biological sequences. The traditional approach to this problem is to first build a multiple alignment of these sequences, followed by a phylogenetic reconstruction step based on this multiple alignment. However, alignment and phylogenetic inference are fundamentally interdependent, and ignoring this fact leads to biased and overconfident estimations. Whether the main interest is in sequence alignment or phylogeny, a major goal of computational biology is the co-estimation of both. Results We developed a fully Bayesian Markov chain Monte Carlo method for coestimating phylogeny and sequence alignment, under the Thorne-Kishino-Felsenstein model of substitution and single nucleotide insertion-deletion (indel) events. In our earlier work, we introduced a novel and efficient algorithm, termed the "indel peeling algorithm", which includes indels as phylogenetically informative evolutionary events, and resembles Felsenstein's peeling algorithm for substitutions on a phylogenetic tree. For a fixed alignment, our extension analytically integrates out both substitution and indel events within a proper statistical model, without the need for data augmentation at internal tree nodes, allowing for efficient sampling of tree topologies and edge lengths. To additionally sample multiple alignments, we here introduce an efficient partial Metropolized independence sampler for alignments, and combine these two algorithms into a fully Bayesian co-estimation procedure for the alignment and phylogeny problem. Our approach results in estimates for the posterior distribution of evolutionary rate parameters, for the maximum a posteriori (MAP) phylogenetic tree, and for the posterior decoding alignment. Estimates for the evolutionary tree and multiple alignment are augmented with confidence estimates for each node height and alignment column
12th Brazilian Meeting on Bayesian Statistics
Louzada, Francisco; Rifo, Laura; Stern, Julio; Lauretto, Marcelo
2015-01-01
Through refereed papers, this volume focuses on the foundations of the Bayesian paradigm; their comparison to objectivistic or frequentist Statistics counterparts; and the appropriate application of Bayesian foundations. This research in Bayesian Statistics is applicable to data analysis in biostatistics, clinical trials, law, engineering, and the social sciences. EBEB, the Brazilian Meeting on Bayesian Statistics, is held every two years by the ISBrA, the International Society for Bayesian Analysis, one of the most active chapters of the ISBA. The 12th meeting took place March 10-14, 2014 in Atibaia. Interest in foundations of inductive Statistics has grown recently in accordance with the increasing availability of Bayesian methodological alternatives. Scientists need to deal with the ever more difficult choice of the optimal method to apply to their problem. This volume shows how Bayes can be the answer. The examination and discussion on the foundations work towards the goal of proper application of Bayesia...
Breast Reconstruction Alternatives
... Some women who have had a ... chest. What if I choose not to get breast reconstruction? Some women decide not to have any ...
Institute of Scientific and Technical Information of China (English)
2009-01-01
Eighty percent of the reconstruction projects in Sichuan Province will be completed by the end of the year. Despite ruins still seen everywhere in the earthquake-hit areas of Sichuan Province, new buildings have been completed, and many people have moved into new houses. Through the cameras of the media, the faces, once painful and melancholy after last year's earthquake, now look confident and firm, gratifying people all over the
Bayesian Inversion of Seabed Scattering Data
2014-09-30
Bayesian Inversion of Seabed Scattering Data (Special Research Award in Ocean Acoustics). Gavin A.M.W. Steininger, School of Earth & Ocean... The objectives of this project are to carry out joint Bayesian inversion of scattering and reflection data to estimate the in-situ seabed scattering and geoacoustic parameters...
Anomaly Detection and Attribution Using Bayesian Networks
2014-06-01
Anomaly Detection and Attribution Using Bayesian Networks. Andrew Kirk, Jonathan Legg and Edwin El-Mahassni, National Security and... detection in Bayesian networks, enabling both the detection and explanation of anomalous cases in a dataset. By exploiting the structure of a... Bayesian network, our algorithm is able to efficiently search for local maxima of data conflict between closely related variables. Benchmark tests using
Compiling Relational Bayesian Networks for Exact Inference
DEFF Research Database (Denmark)
Jaeger, Manfred; Chavira, Mark; Darwiche, Adnan
2004-01-01
We describe a system for exact inference with relational Bayesian networks as defined in the publicly available \primula\ tool. The system is based on compiling propositional instances of relational Bayesian networks into arithmetic circuits and then performing online inference by evaluating and differentiating these circuits in time linear in their size. We report on experimental results showing the successful compilation, and efficient inference, on relational Bayesian networks whose {\primula}-generated propositional instances have thousands of variables, and whose jointrees have clusters...
Directory of Open Access Journals (Sweden)
Brown James
2007-12-01
Full Text Available This article aims to discuss the various defects that occur with maxillectomy, with a full review of the literature and discussion of the advantages and disadvantages of the various techniques described. Reconstruction of the maxilla can be relatively simple for the standard low maxillectomy that does not involve the orbital floor (Class 2). In this situation the structure of the face is less damaged and there are multiple reconstructive options for the restoration of the maxilla and dental alveolus. If the maxillectomy includes the orbit (Class 4), then problems involving the eye (enophthalmos, orbital dystopia, ectropion and diplopia) are avoided, which simplifies the reconstruction. Most controversy is associated with the maxillectomy that involves the orbital floor and dental alveolus (Class 3). A case is made for the use of the iliac crest with internal oblique as an ideal option, but there are other methods which may provide a similar result. A multidisciplinary approach to these patients is emphasised, which should include a prosthodontist with special expertise for these defects.
SYNTHESIZED EXPECTED BAYESIAN METHOD OF PARAMETRIC ESTIMATE
Institute of Scientific and Technical Information of China (English)
Ming HAN; Yuanyao DING
2004-01-01
This paper develops a new method of parametric estimation, named the "synthesized expected Bayesian method". When samples of products are tested and no failure events occur, the definition of the expected Bayesian estimate is introduced and estimates of the failure probability and failure rate are provided. After some failure information is introduced by making an extra test, a synthesized expected Bayesian method is defined and used to estimate the failure probability, failure rate and some other parameters of the exponential and Weibull distributions of populations. Finally, calculations are performed on practical problems, which show that the synthesized expected Bayesian method is feasible and easy to operate.
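For the zero-failure case, the expected Bayesian estimate admits a simple closed form under one common choice of hyperprior. The sketch below assumes a Beta(1, b) prior on the failure probability with b averaged over a Uniform(1, c) hyperprior; the specific n and c values are hypothetical, and the paper's "synthesized" extra-test step is not modeled.

```python
import math

# zero-failure data: n units tested, no failures observed (hypothetical n)
n = 50

def bayes_estimate(b, n):
    # under a Beta(1, b) prior, the posterior after 0 failures in n trials is
    # Beta(1, b + n), whose mean is 1 / (n + b + 1)
    return 1.0 / (n + b + 1)

# expected Bayesian (E-Bayes) estimate: average the Bayes estimate over a
# Uniform(1, c) hyperprior on b (the choice c = 5 is hypothetical)
c = 5.0
p_eb_closed = math.log((n + c + 1) / (n + 2)) / (c - 1)

# the same average by midpoint-rule quadrature over b in (1, c)
m = 10000
p_eb_numeric = sum(bayes_estimate(1 + (c - 1) * (i + 0.5) / m, n)
                   for i in range(m)) / m
```

The closed form comes from integrating 1/(n + b + 1) in b, which is why the estimate depends on c only logarithmically, a robustness property often cited for E-Bayes estimates.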
Learning dynamic Bayesian networks with mixed variables
DEFF Research Database (Denmark)
Bøttcher, Susanne Gammelgaard
This paper considers dynamic Bayesian networks for discrete and continuous variables. We only treat the case where the distribution of the variables is conditional Gaussian. We show how to learn the parameters and structure of a dynamic Bayesian network and also how the Markov order can be learned. An automated procedure for specifying prior distributions for the parameters in a dynamic Bayesian network is presented. It is a simple extension of the procedure for ordinary Bayesian networks. Finally, Wölfer's sunspot numbers are analyzed.
Variational bayesian method of estimating variance components.
Arakawa, Aisaku; Taniguchi, Masaaki; Hayashi, Takeshi; Mikawa, Satoshi
2016-07-01
We developed a Bayesian analysis approach using a variational inference method, a so-called variational Bayesian method, to determine the posterior distributions of variance components. This variational Bayesian method and an alternative Bayesian method using Gibbs sampling were compared in estimating genetic and residual variance components from both simulated data and publicly available real pig data. In the simulated data set, we observed strong bias toward overestimation of genetic variance for the variational Bayesian method in the case of low heritability and low population size, and less bias was detected with larger population sizes in both methods examined. No differences in the estimates of variance components between the variational Bayesian method and Gibbs sampling were found in the real pig data. However, the posterior distributions of the variance components obtained with the variational Bayesian method had shorter tails than those obtained with Gibbs sampling. Consequently, the posterior standard deviations of the genetic and residual variances from the variational Bayesian method were lower than those from Gibbs sampling. The computing time required was much shorter with the variational Bayesian method than with Gibbs sampling.
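The flavor of the variational approach can be shown on a simpler conjugate model than the animal model above: coordinate-ascent variational inference (CAVI) for the mean and precision of a normal sample, with a factorised q(mu)q(tau). This is a standard textbook stand-in, not the authors' model; all data and hyperparameters are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(5)

# data: x_i ~ Normal(mu, 1/tau) with mu = 2 and tau = 4 (i.e. sd = 0.5)
x = rng.normal(2.0, 0.5, size=200)
n, xbar, sx2 = len(x), x.mean(), np.sum(x ** 2)

# priors: mu | tau ~ Normal(mu0, 1/(lam0*tau)), tau ~ Gamma(a0, rate b0)
mu0, lam0, a0, b0 = 0.0, 1e-3, 1e-3, 1e-3

# CAVI updates for the factorised posterior q(mu) q(tau)
E_tau = 1.0
for _ in range(100):
    # q(mu) = Normal(mu_n, 1/lam_n)
    mu_n = (lam0 * mu0 + n * xbar) / (lam0 + n)
    lam_n = (lam0 + n) * E_tau
    E_mu, E_mu2 = mu_n, mu_n ** 2 + 1.0 / lam_n
    # q(tau) = Gamma(a_n, rate b_n), using E[sum (x_i - mu)^2] under q(mu)
    a_n = a0 + (n + 1) / 2.0
    b_n = b0 + 0.5 * (sx2 - 2 * n * xbar * E_mu + n * E_mu2
                      + lam0 * (E_mu2 - 2 * mu0 * E_mu + mu0 ** 2))
    E_tau = a_n / b_n

post_mean_mu = mu_n
post_mean_tau = E_tau
```

Because q(mu) and q(tau) are forced to be independent, the variational posterior for mu is narrower than the exact marginal posterior, which is consistent with the shorter tails relative to Gibbs sampling reported in the abstract.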
Institute of Scientific and Technical Information of China (English)
李亚军; 李儒峰; 陈莉琼; 宋宁; 方晶
2011-01-01
Based on analysis of vitrinite reflectance and apatite fission track inclusion system testing, we calculated the paleotemperature gradient and reconstructed the thermal history, and then identified the paleotemperature gradients of the west slope and the Bianminyang tectonic zone of the Jinhu depression. From the vitrinite reflectance, we calculated that the paleotemperature ranged from 45.6 to 128.4 ℃ and the paleotemperature gradient was 45.5 ℃/km in the west slope; in the Bianminyang tectonic zone the paleotemperature was 26.4 to 120.3 ℃ and the paleotemperature gradient was 42.7 ℃/km. From the apatite fission tracks, we calculated that the paleotemperature gradient was 40.7 ℃/km in the west slope and 45.8 ℃/km in the Bianminyang tectonic zone. Comparative analysis of the different tectonic zones of the Jinhu depression showed that the paleotemperature gradient was higher than the present-day geothermal gradient: in the west slope it was 10.4 to 15.2 ℃/km higher than the current gradient, and in the Bianminyang tectonic zone 12.4 to 15.3 ℃/km higher. Thermal history modeling of typical wells in the west slope and the Bianminyang tectonic zone showed that the paleo-geothermal gradient became lower toward younger strata: the geothermal gradient during K2t to E1f was higher than during E2d to Ny. Before the uplift and erosion caused by the Sanduo tectonic events, the paleotemperature of the depression had reached its maximum. The maturity history of the depression shows that R0 was 0.4% at a depth of 1,000 m in the Jinhu depression, where the source rock was at the low-maturity stage; R0 was 0.65% at a depth of 1,900 m, where the temperature reached 90 ℃ and the hydrocarbon source rocks entered the peak generation phase. The homogenization temperature of fluid inclusion samples was between 62 and 93 ℃ of Well
Bayesian wavelet PCA methodology for turbomachinery damage diagnosis under uncertainty
Xu, Shengli; Jiang, Xiaomo; Huang, Jinzhi; Yang, Shuhua; Wang, Xiaofang
2016-12-01
Centrifugal compressors often suffer various defects, such as impeller cracking, that result in forced outages of the entire plant. Damage diagnostics and condition monitoring of such turbomachinery systems have become an increasingly important and powerful tool to prevent potential failures in components and reduce unplanned forced outages and maintenance costs, while improving the reliability, availability and maintainability of a turbomachinery system. This paper presents a probabilistic signal processing methodology for damage diagnostics using multiple time history data collected from different locations of a turbomachine, considering data uncertainty and multivariate correlation. The proposed methodology is based on the integration of three state-of-the-art data mining techniques: discrete wavelet packet transform, Bayesian hypothesis testing, and probabilistic principal component analysis. The multiresolution wavelet analysis approach is employed to decompose a time series signal into different levels of wavelet coefficients. These coefficients represent multiple time-frequency resolutions of a signal. Bayesian hypothesis testing is then applied to each level of wavelet coefficients to remove possible imperfections. The ratio-of-posterior-odds Bayesian approach provides a direct means to assess whether there is imperfection in the decomposed coefficients, thus avoiding over-denoising. Power spectral density estimated by the Welch method is utilized to evaluate the effectiveness of the Bayesian wavelet cleansing method. Furthermore, the probabilistic principal component analysis approach is developed to reduce the dimensionality of multiple time series and to address multivariate correlation and data uncertainty for damage diagnostics. The proposed methodology and generalized framework are demonstrated with a set of sensor data collected from a real-world centrifugal compressor with impeller cracks, through both time series and contour analyses of vibration
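The wavelet-coefficient cleansing step can be illustrated with a much-simplified sketch: a one-level Haar transform plus a two-model Bayesian test on each detail coefficient. The Gaussian signal/noise models and all parameters below are illustrative stand-ins, not the paper's wavelet-packet/PPCA pipeline:

```python
import math, random

def haar_level(x):
    """One level of the Haar wavelet transform: approximation and detail
    coefficients (length of x assumed even)."""
    a = [(x[2*i] + x[2*i+1]) / math.sqrt(2.0) for i in range(len(x) // 2)]
    d = [(x[2*i] - x[2*i+1]) / math.sqrt(2.0) for i in range(len(x) // 2)]
    return a, d

def posterior_odds_keep(c, sigma_noise=1.0, sigma_signal=3.0, prior_signal=0.5):
    """Posterior odds that a detail coefficient carries signal rather than
    pure noise, comparing two zero-mean Gaussian models."""
    def norm_pdf(v, s):
        return math.exp(-v * v / (2.0 * s * s)) / (s * math.sqrt(2.0 * math.pi))
    sigma_both = math.hypot(sigma_noise, sigma_signal)  # signal-plus-noise std
    odds = prior_signal * norm_pdf(c, sigma_both)
    odds /= (1.0 - prior_signal) * norm_pdf(c, sigma_noise)
    return odds

random.seed(0)
signal = [5.0 * math.sin(0.3 * i) for i in range(64)]
noisy = [s + random.gauss(0.0, 1.0) for s in signal]
approx, detail = haar_level(noisy)
# keep a coefficient only when the posterior odds favour "signal present"
cleaned = [c if posterior_odds_keep(c) > 1.0 else 0.0 for c in detail]
```

Testing the odds ratio rather than hard-thresholding every small coefficient is what lets this style of cleansing avoid over-denoising.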
Inference of gene pathways using mixture Bayesian networks
Directory of Open Access Journals (Sweden)
Ko Younhee
2009-05-01
Full Text Available Abstract Background Inference of gene networks typically relies on measurements across a wide range of conditions or treatments. Although one network structure is predicted, the relationship between genes could vary across conditions. A comprehensive approach to infer general and condition-dependent gene networks was evaluated. This approach integrated Bayesian network and Gaussian mixture models to describe continuous microarray gene expression measurements, and three gene networks were predicted. Results The first reconstructions of a circadian rhythm pathway in honey bees and an adherens junction pathway in mouse embryos were obtained. In addition, general and condition-specific gene relationships, some unexpected, were detected in these two pathways and in a yeast cell-cycle pathway. The mixture Bayesian network approach identified all (honey bee circadian rhythm and mouse adherens junction pathways or the vast majority (yeast cell-cycle pathway of the gene relationships reported in empirical studies. Findings across the three pathways and data sets indicate that the mixture Bayesian network approach is well-suited to infer gene pathways based on microarray data. Furthermore, the interpretation of model estimates provided a broader understanding of the relationships between genes. The mixture models offered a comprehensive description of the relationships among genes in complex biological processes or across a wide range of conditions. The mixture parameter estimates and corresponding odds that the gene network inferred for a sample pertained to each mixture component allowed the uncovering of both general and condition-dependent gene relationships and patterns of expression. Conclusion This study demonstrated the two main benefits of learning gene pathways using mixture Bayesian networks. First, the identification of the optimal number of mixture components supported by the data offered a robust approach to infer gene relationships and
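The core of the mixture idea, the posterior probability (responsibility) that an observation belongs to each mixture component, can be sketched as follows; the component parameters are illustrative, not estimates from the paper:

```python
import math

def gauss_pdf(x, mu, sigma):
    return math.exp(-(x - mu) ** 2 / (2.0 * sigma ** 2)) / (sigma * math.sqrt(2.0 * math.pi))

def responsibilities(x, components):
    """Posterior probability that observation x was generated by each
    mixture component; components is a list of (weight, mean, std) triples."""
    joint = [w * gauss_pdf(x, mu, s) for (w, mu, s) in components]
    total = sum(joint)
    return [j / total for j in joint]

# Two hypothetical expression regimes, e.g. "condition A" and "condition B"
components = [(0.5, 0.0, 1.0), (0.5, 3.0, 1.0)]
r = responsibilities(2.9, components)
```

In the paper's setting these per-component responsibilities are what allow a gene relationship to be general (similar across components) or condition-dependent (dominated by one component).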
Reconstructing Variations of Global Sea-Surface Temperature during the Last Interglaciation
Hoffman, J. S.; Clark, P. U.; He, F.; Parnell, A. C.
2015-12-01
The last interglaciation (LIG; ~130-116 ka) was the most recent period in Earth history with higher-than-present global sea level (≥6 m) under similar-to-preindustrial concentrations of atmospheric CO2, suggesting additional feedbacks related to albedo, insolation, and ocean circulation in generating the apparent climatic differences between the LIG and present Holocene. However, our understanding of how much warmer the LIG sea surface was relative to the present interglaciation remains uncertain, with current estimates suggesting from 0°C to 2°C warmer than late-20th-century average global temperatures. Moreover, the timing, spatial expression, and amplitude of regional and global sea surface temperature variability related to other climate forcing during the LIG are poorly constrained, largely due to uncertainties in age control and proxy temperature reconstructions. An accurate characterization of global and regional temperature change during the LIG can serve as a benchmark for paleoclimate modeling intercomparison projects and help improve understanding of sea-level sensitivity to temperature change. We will present a global compilation (~100 published records) of sea surface temperature (SST) and other climate reconstructions spanning the LIG. Using a Monte Carlo-enabled cross-correlation maximization algorithm to climatostratigraphically align proxy records and then account for both the resulting chronologic and proxy calibration uncertainties with Bayesian statistical inference, our results quantify the spatial timing, amplitude, and uncertainty in estimates of global and regional sea surface temperature change during the LIG and its relation to potential forcings.
Institute of Scientific and Technical Information of China (English)
Fu Xiaoqiang
2006-01-01
The Karzai regime has made some progress over the past four and a half years of post-war reconstruction. However, Taliban destruction and the drug economy still seriously affect the security and stability of Afghanistan, and settling these two problems has become crucial to the country's future. Moreover, the Karzai regime has yet to handle a series of difficult issues concerning the central government's authority, military and police building, and foreign relations.
Bayesian Methods and Universal Darwinism
Campbell, John
2009-12-01
Bayesian methods since the time of Laplace have been understood by their practitioners as closely aligned with the scientific method. Indeed, a recent champion of Bayesian methods, E. T. Jaynes, titled his textbook on the subject Probability Theory: The Logic of Science. Many philosophers of science, including Karl Popper and Donald Campbell, have interpreted the evolution of science as a Darwinian process consisting of a 'copy with selective retention' algorithm abstracted from Darwin's theory of natural selection. Arguments are presented for an isomorphism between Bayesian methods and Darwinian processes. Universal Darwinism, as the term has been developed by Richard Dawkins, Daniel Dennett and Susan Blackmore, is the collection of scientific theories which explain the creation and evolution of their subject matter as due to the operation of Darwinian processes. These subject matters span the fields of atomic physics, chemistry, biology and the social sciences. The principle of Maximum Entropy states that systems will evolve to states of highest entropy subject to the constraints of scientific law. This principle may be inverted to provide illumination as to the nature of scientific law. Our best cosmological theories suggest the universe contained much less complexity during the period shortly after the Big Bang than it does at present; the scientific subject matter of atomic physics, chemistry, biology and the social sciences has been created since that time. An explanation is proposed for the existence of this subject matter as due to the evolution of constraints, in the form of adaptations, imposed on Maximum Entropy. It is argued that these adaptations were discovered and instantiated through the operation of a succession of Darwinian processes.
Bayesian Query-Focused Summarization
Daumé, Hal
2009-01-01
We present BayeSum (for ``Bayesian summarization''), a model for sentence extraction in query-focused summarization. BayeSum leverages the common case in which multiple documents are relevant to a single query. Using these documents as reinforcement for query terms, BayeSum is not afflicted by the paucity of information in short queries. We show that approximate inference in BayeSum is possible on large data sets and results in a state-of-the-art summarization system. Furthermore, we show how BayeSum can be understood as a justified query expansion technique in the language modeling for IR framework.
Numeracy, frequency, and Bayesian reasoning
Directory of Open Access Journals (Sweden)
Gretchen B. Chapman
2009-02-01
Full Text Available Previous research has demonstrated that Bayesian reasoning performance is improved if uncertainty information is presented as natural frequencies rather than single-event probabilities. A questionnaire study of 342 college students replicated this effect but also found that the performance-boosting benefits of the natural frequency presentation occurred primarily for participants who scored high in numeracy. This finding suggests that even comprehension and manipulation of natural frequencies requires a certain threshold of numeracy abilities, and that the beneficial effects of natural frequency presentation may not be as general as previously believed.
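The equivalence of the two presentation formats, and why natural frequencies are easier to follow, can be illustrated with a standard screening example; the numbers are illustrative, not items from the questionnaire:

```python
from fractions import Fraction

# Hypothetical screening problem: 1% prevalence, 80% sensitivity,
# 9.6% false-positive rate.
prevalence = Fraction(1, 100)
sensitivity = Fraction(80, 100)
false_pos = Fraction(96, 1000)

# Single-event probability format: apply Bayes' rule directly.
p_positive = prevalence * sensitivity + (1 - prevalence) * false_pos
posterior = prevalence * sensitivity / p_positive

# Natural frequency format: imagine 10 000 people.
n = 10000
sick_positive = n * prevalence * sensitivity         # sick people testing positive
healthy_positive = n * (1 - prevalence) * false_pos  # healthy people testing positive
posterior_freq = sick_positive / (sick_positive + healthy_positive)
```

Both routes give the same answer (roughly 8%), but the frequency version makes the dominance of false positives visible at a glance, which is the effect the study probes.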
Bayesian inference for Hawkes processes
DEFF Research Database (Denmark)
Rasmussen, Jakob Gulddahl
2013-01-01
The Hawkes process is a practically and theoretically important class of point processes, but parameter estimation for such a process can pose various problems. In this paper we explore and compare two approaches to Bayesian inference. The first approach is based on the so-called conditional intensity function, while the second approach is based on an underlying clustering and branching structure in the Hawkes process. For practical use, MCMC (Markov chain Monte Carlo) methods are employed. The two approaches are compared numerically using three examples of the Hawkes process.
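The clustering and branching structure underlying the second approach can be sketched with a simulation (a sketch of the construction only, not the paper's MCMC inference; parameter values are illustrative):

```python
import math, random

def poisson_draw(rng, lam):
    """Knuth's method for a Poisson draw; adequate for small lam."""
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= threshold:
            return k
        k += 1

def simulate_hawkes(mu, alpha, beta, t_max, seed=1):
    """Simulate a Hawkes process on [0, t_max) via its cluster/branching
    representation: immigrants arrive as a Poisson(mu) process, and each
    event independently spawns Poisson(alpha/beta) offspring at
    Exponential(beta) delays. Stationarity requires alpha < beta."""
    rng = random.Random(seed)
    queue, t = [], 0.0
    while True:                          # background (immigrant) events
        t += rng.expovariate(mu)
        if t >= t_max:
            break
        queue.append(t)
    events = []
    while queue:                         # expand each cluster
        parent = queue.pop()
        events.append(parent)
        for _ in range(poisson_draw(rng, alpha / beta)):
            child = parent + rng.expovariate(beta)
            if child < t_max:
                queue.append(child)
    return sorted(events)

times = simulate_hawkes(mu=0.5, alpha=0.8, beta=1.5, t_max=100.0)
```

The same latent branching variables (which event is whose parent) are what the cluster-based MCMC approach augments and samples over.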
Bayesian homeopathy: talking normal again.
Rutten, A L B
2007-04-01
Homeopathy has a communication problem: important homeopathic concepts are not understood by conventional colleagues. Homeopathic terminology seems to be comprehensible only after practical experience of homeopathy. The main problem lies in the different handling of diagnosis. In conventional medicine, diagnosis is the starting point for randomised controlled trials to determine the effect of treatment. In homeopathy, diagnosis is combined with other symptoms and personal traits of the patient to guide treatment and predict response. Broadening our scope to include diagnostic as well as treatment research opens the possibility of multifactorial reasoning. Adopting Bayesian methodology opens the possibility of investigating homeopathy in everyday practice and of describing some aspects of homeopathy in conventional terms.
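For a single finding, the Bayesian updating this abstract advocates reduces to Bayes' rule in odds form; a minimal sketch with hypothetical numbers:

```python
def posterior_probability(prior_prob, likelihood_ratio):
    """Bayes' rule in odds form: posterior odds = prior odds x likelihood
    ratio, converted back to a probability. Numbers below are hypothetical."""
    prior_odds = prior_prob / (1.0 - prior_prob)
    post_odds = prior_odds * likelihood_ratio
    return post_odds / (1.0 + post_odds)

# A finding seen in 60% of responders but 20% of non-responders has LR = 3;
# it lifts a 10% prior probability of response to a 25% posterior.
p = posterior_probability(0.10, 3.0)
```

Multifactorial reasoning then amounts to chaining such likelihood-ratio updates over several (assumed independent) findings.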
Bayesian credible interval construction for Poisson statistics
Institute of Scientific and Technical Information of China (English)
ZHU Yong-Sheng
2008-01-01
The construction of the Bayesian credible (confidence) interval for a Poisson observable including both signal and background, with and without systematic uncertainties, is presented. Introducing the conditional probability satisfying the requirement that the background be no larger than the observed events to construct the Bayesian credible interval is also discussed. A Fortran routine, BPOCI, has been developed to implement the calculation.
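A simplified version of the construction (flat prior on the signal, known background, no systematics; the paper's BPOCI routine handles more) can be sketched numerically:

```python
import math

def poisson_credible_interval(n_obs, background, cl=0.90, s_max=50.0, grid=20001):
    """Central Bayesian credible interval for a Poisson signal s with known
    background b and a flat prior on s >= 0; the posterior density is
    p(s | n) proportional to (s + b)^n * exp(-(s + b))."""
    ds = s_max / (grid - 1)
    s_vals = [i * ds for i in range(grid)]
    post = [(s + background) ** n_obs * math.exp(-(s + background)) for s in s_vals]
    norm = sum(post) * ds
    post = [p / norm for p in post]
    lo = hi = None
    cdf = 0.0
    tail = (1.0 - cl) / 2.0
    for s, p in zip(s_vals, post):
        cdf += p * ds
        if lo is None and cdf >= tail:
            lo = s
        if cdf >= 1.0 - tail:
            hi = s
            break
    return lo, hi

# 5 events observed on an expected background of 1
lo, hi = poisson_credible_interval(5, 1.0)
```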
Advances in Bayesian Modeling in Educational Research
Levy, Roy
2016-01-01
In this article, I provide a conceptually oriented overview of Bayesian approaches to statistical inference and contrast them with frequentist approaches that currently dominate conventional practice in educational research. The features and advantages of Bayesian approaches are illustrated with examples spanning several statistical modeling…
Nonparametric Bayesian Modeling of Complex Networks
DEFF Research Database (Denmark)
Schmidt, Mikkel Nørgaard; Mørup, Morten
2013-01-01
Modeling structure in complex networks using Bayesian nonparametrics makes it possible to specify flexible model structures and infer the adequate model complexity from the observed data. This article provides a gentle introduction to nonparametric Bayesian modeling of complex networks: Using...... for complex networks can be derived and point out relevant literature....
Modeling Diagnostic Assessments with Bayesian Networks
Almond, Russell G.; DiBello, Louis V.; Moulder, Brad; Zapata-Rivera, Juan-Diego
2007-01-01
This paper defines Bayesian network models and examines their applications to IRT-based cognitive diagnostic modeling. These models are especially suited to building inference engines designed to be synchronous with the finer grained student models that arise in skills diagnostic assessment. Aspects of the theory and use of Bayesian network models…
Using Bayesian Networks to Improve Knowledge Assessment
Millan, Eva; Descalco, Luis; Castillo, Gladys; Oliveira, Paula; Diogo, Sandra
2013-01-01
In this paper, we describe the integration and evaluation of an existing generic Bayesian student model (GBSM) into an existing computerized testing system within the Mathematics Education Project (PmatE--Projecto Matematica Ensino) of the University of Aveiro. This generic Bayesian student model had been previously evaluated with simulated…
The Bayesian Revolution Approaches Psychological Development
Shultz, Thomas R.
2007-01-01
This commentary reviews five articles that apply Bayesian ideas to psychological development, some with psychology experiments, some with computational modeling, and some with both experiments and modeling. The reviewed work extends the current Bayesian revolution into tasks often studied in children, such as causal learning and word learning, and…
Bayesian analysis of exoplanet and binary orbits
Schulze-Hartung, Tim; Launhardt, Ralf; Henning, Thomas
2012-01-01
We introduce BASE (Bayesian astrometric and spectroscopic exoplanet detection and characterisation tool), a novel program for the combined or separate Bayesian analysis of astrometric and radial-velocity measurements of potential exoplanet hosts and binary stars. The capabilities of BASE are demonstrated using all publicly available data of the binary Mizar A.
Bayesian Network for multiple hypothesis tracking
W.P. Zajdel; B.J.A. Kröse
2002-01-01
For flexible camera-to-camera tracking of multiple objects we model the objects' behavior with a Bayesian network and combine it with the multiple hypothesis framework that associates observations with objects. Bayesian networks offer a possibility to factor complex, joint distributions into a product
Hepatitis disease detection using Bayesian theory
Maseleno, Andino; Hidayati, Rohmah Zahroh
2017-02-01
This paper presents hepatitis disease diagnosis using Bayesian theory, for a better understanding of the theory. In this research, we used Bayesian theory to detect hepatitis disease and display the result of the diagnosis process. Bayesian theory, rediscovered and perfected by Laplace, rests on a basic idea: use the known prior probability and conditional probability density parameters to calculate, via Bayes' theorem, the corresponding posterior probability, and then use the posterior probability for inference and decision making. Bayesian methods combine existing knowledge, the prior probabilities, with additional knowledge derived from new data, the likelihood function. The initial symptoms of hepatitis include malaise, fever and headache; the system evaluates the probability of hepatitis given the presence of these symptoms. The result revealed that Bayesian theory successfully identified the existence of hepatitis disease.
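Assuming conditionally independent symptoms (a naive-Bayes simplification; all probabilities below are illustrative, not clinical data), the posterior computation looks like this:

```python
def naive_bayes_posterior(prior, p_symptoms_disease, p_symptoms_healthy):
    """Posterior probability of disease given symptoms assumed conditionally
    independent (naive Bayes). All probabilities here are illustrative."""
    p_d = prior
    p_h = 1.0 - prior
    for ld, lh in zip(p_symptoms_disease, p_symptoms_healthy):
        p_d *= ld          # multiply in likelihood under "disease"
        p_h *= lh          # multiply in likelihood under "no disease"
    return p_d / (p_d + p_h)

# Hypothetical likelihoods for malaise, fever and headache under hepatitis
# versus no hepatitis, starting from a 5% prior.
p = naive_bayes_posterior(0.05, [0.8, 0.7, 0.6], [0.3, 0.2, 0.4])
```

Even with a low prior, three symptoms that are each more likely under the disease push the posterior up substantially, which is the pattern the paper's diagnosis step exploits.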
2nd Bayesian Young Statisticians Meeting
Bitto, Angela; Kastner, Gregor; Posekany, Alexandra
2015-01-01
The Second Bayesian Young Statisticians Meeting (BAYSM 2014) and the research presented here facilitate connections among researchers using Bayesian Statistics by providing a forum for the development and exchange of ideas. WU Vienna University of Business and Economics hosted BAYSM 2014 from September 18th to 19th. The guidance of renowned plenary lecturers and senior discussants is a critical part of the meeting and this volume, which follows publication of contributions from BAYSM 2013. The meeting's scientific program reflected the variety of fields in which Bayesian methods are currently employed or could be introduced in the future. Three brilliant keynote lectures by Chris Holmes (University of Oxford), Christian Robert (Université Paris-Dauphine), and Mike West (Duke University), were complemented by 24 plenary talks covering the major topics Dynamic Models, Applications, Bayesian Nonparametrics, Biostatistics, Bayesian Methods in Economics, and Models and Methods, as well as a lively poster session ...
BAYESIAN BICLUSTERING FOR PATIENT STRATIFICATION.
Khakabimamaghani, Sahand; Ester, Martin
2016-01-01
The move from Empirical Medicine towards Personalized Medicine has attracted attention to Stratified Medicine (SM). Some methods are provided in the literature for patient stratification, which is the central task of SM; however, there are still significant open issues. First, it is still unclear if integrating different datatypes will help in detecting disease subtypes more accurately, and, if not, which datatype(s) are most useful for this task. Second, it is not clear how we can compare different methods of patient stratification. Third, as most of the proposed stratification methods are deterministic, there is a need for investigating the potential benefits of applying probabilistic methods. To address these issues, we introduce a novel integrative Bayesian biclustering method, called B2PS, for patient stratification and propose methods for evaluating the results. Our experimental results demonstrate the superiority of B2PS over a popular state-of-the-art method and the benefits of Bayesian approaches. Our results agree with the intuition that transcriptomic data forms a better basis for patient stratification than genomic data.
Modelling of JET diagnostics using Bayesian Graphical Models
Energy Technology Data Exchange (ETDEWEB)
Svensson, J. [IPP Greifswald, Greifswald (Germany); Ford, O. [Imperial College, London (United Kingdom); McDonald, D.; Hole, M.; Nessi, G. von; Meakins, A.; Brix, M.; Thomsen, H.; Werner, A.; Sirinelli, A.
2011-07-01
The mapping between physics parameters (such as densities, currents, flows, temperatures etc.) defining the plasma 'state' under a given model and the raw observations of each plasma diagnostic will 1) depend on the particular physics model used, and 2) be inherently probabilistic, owing to uncertainties in both the observations and instrumental aspects of the mapping, such as calibrations, instrument functions etc. A flexible and principled way of modelling such interconnected probabilistic systems is through so-called Bayesian graphical models. Being an amalgam of graph theory and probability theory, Bayesian graphical models can simulate the complex interconnections between physics models and diagnostic observations from multiple heterogeneous diagnostic systems, making it relatively easy to optimally combine the observations from multiple diagnostics for joint inference on parameters of the underlying physics model, which itself can be represented as part of the graph. At JET about 10 diagnostic systems have to date been modelled in this way, and this has led to a number of new results, including: the reconstruction of the flux surface topology and q-profiles without any specific equilibrium assumption, using information from a number of different diagnostic systems; profile inversions taking into account the uncertainties in the flux surface positions; and a substantial increase in the accuracy of JET electron density and temperature profiles, including improved pedestal resolution, through the joint analysis of three diagnostic systems. It is believed that the Bayesian graph approach could potentially be utilised for very large sets of diagnostics, providing a generic data analysis framework for nuclear fusion experiments that would be able to optimally utilize the information from multiple diagnostics simultaneously, and where the explicit graph representation of the connections to underlying physics models could be used for sophisticated model testing. This
Divergence history of the Carpathian and smooth newts modelled in space and time.
Zieliński, P; Nadachowska-Brzyska, K; Dudek, K; Babik, W
2016-08-01
Information about demographic history is essential for understanding the processes of divergence and speciation. Patterns of genetic variation within and between closely related species provide insights into the history of their interactions. Here, we investigated historical demography and genetic exchange between the Carpathian (Lissotriton montandoni, Lm) and smooth (L. vulgaris, Lv) newts. We combine extensive geographical sampling and multilocus nuclear sequence data with the approximate Bayesian computation framework to test alternative scenarios of divergence and reconstruct the temporal and spatial pattern of gene flow between species. A model of recent (last glacial period) interspecific gene flow was favoured over alternative models. Thus, despite the relatively old divergence (4-6 mya) and presumably long periods of isolation, the species have retained the ability to exchange genes. Nevertheless, the low migration rates (ca. 10^-6 per gene copy per generation) are consistent with strong reproductive isolation between the species. Models allowing demographic changes were favoured, suggesting that the effective population sizes of both species at least doubled as divergence progressed, reaching the current ca. 0.2 million in Lm and 1 million in Lv. We found asymmetry in rates of interspecific gene flow between Lm and one evolutionary lineage of Lv. We suggest that intraspecific polymorphism for hybrid incompatibilities segregating within Lv could explain this pattern and propose further tests to distinguish between alternative explanations. Our study highlights the importance of incorporating intraspecific genetic structure into models investigating the history of divergence.
Group Tracking of Space Objects within Bayesian Framework
Directory of Open Access Journals (Sweden)
Huang Jian
2013-03-01
Full Text Available It is imperative to efficiently track and catalogue extensive, densely grouped space objects for space surveillance. As the main instrument for Low Earth Orbit (LEO) space surveillance, ground-based radar systems are usually limited by their resolving power when tracking small space debris of high population density, so much of the detection and observation information about the targets is lost, which makes traditional tracking methods inefficient. We therefore conceived the concept of group tracking: the overall motional tendency of the group objects is the particular focus, while each individual object is still tracked in effect. The tracking procedure is based on the Bayesian framework. By exploiting the constraints among the group centre and the observations of the multiple targets, the reconstruction of the number of targets and the estimation of individual trajectories can be greatly improved in accuracy and robustness in the case of a high missed-detection rate. The Markov Chain Monte Carlo Particle (MCMC-Particle) algorithm is utilized to solve the Bayesian integral problem. Finally, a simulation of group space object tracking is carried out to validate the efficiency of the proposed method.
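The Bayesian filtering recursion at the heart of such tracking can be illustrated with a bootstrap particle filter for a one-dimensional group centre; this is a simplified stand-in for the paper's MCMC-Particle algorithm, with illustrative parameters throughout:

```python
import math, random

def particle_filter_center(observations, n_particles=500, process_std=0.5,
                           obs_std=1.0, seed=2):
    """Track a slowly drifting group centre from noisy scalar observations
    with a bootstrap particle filter: predict with a random-walk motion
    model, weight particles by the Gaussian observation likelihood, then
    resample."""
    rng = random.Random(seed)
    particles = [rng.gauss(observations[0], obs_std) for _ in range(n_particles)]
    estimates = []
    for z in observations:
        # predict: random-walk motion model
        particles = [p + rng.gauss(0.0, process_std) for p in particles]
        # weight: Gaussian observation likelihood
        weights = [math.exp(-(z - p) ** 2 / (2.0 * obs_std ** 2)) for p in particles]
        total = sum(weights)
        weights = [wt / total for wt in weights]
        # posterior-mean estimate, then multinomial resampling
        estimates.append(sum(wt * p for wt, p in zip(weights, particles)))
        particles = rng.choices(particles, weights=weights, k=n_particles)
    return estimates

truth = [0.1 * t for t in range(50)]             # centre drifting slowly
noise = random.Random(3)
obs = [x + noise.gauss(0.0, 1.0) for x in truth]
est = particle_filter_center(obs)
```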
A Bayesian analysis of regularised source inversions in gravitational lensing
Suyu, S H; Hobson, M P; Marshall, P J
2006-01-01
Strong gravitational lens systems with extended sources are of special interest because they provide additional constraints on the models of the lens systems. To use a gravitational lens system for measuring the Hubble constant, one would need to determine the lens potential and the source intensity distribution simultaneously. A linear inversion method to reconstruct a pixellated source distribution of a given lens potential model was introduced by Warren and Dye. In the inversion process, a regularisation on the source intensity is often needed to ensure a successful inversion with a faithful resulting source. In this paper, we use Bayesian analysis to determine the optimal regularisation constant (strength of regularisation) of a given form of regularisation and to objectively choose the optimal form of regularisation given a selection of regularisations. We consider and compare quantitatively three different forms of regularisation previously described in the literature for source inversions in gravitatio...
Bayesian redshift-space distortions correction from galaxy redshift surveys
Kitaura, Francisco-Shu; Angulo, Raul E; Chuang, Chia-Hsun; Rodriguez-Torres, Sergio; Monteagudo, Carlos Hernandez; Prada, Francisco; Yepes, Gustavo
2015-01-01
We present a Bayesian reconstruction method which maps a galaxy distribution from redshift-space to real-space inferring the distances of the individual galaxies. The method is based on sampling density fields assuming a lognormal prior with a likelihood given by the negative binomial distribution function modelling stochastic bias. We assume a deterministic bias given by a power law relating the dark matter density field to the expected halo or galaxy field. Coherent redshift-space distortions are corrected in a Gibbs-sampling procedure by moving the galaxies from redshift-space to real-space according to the peculiar motions derived from the recovered density field using linear theory with the option to include tidal field corrections from second order Lagrangian perturbation theory. The virialised distortions are corrected by sampling candidate real-space positions (being in the neighbourhood of the observations along the line of sight), which are compatible with the bulk flow corrected redshift-space posi...
CURRENT CONCEPTS IN ACL RECONSTRUCTION
Directory of Open Access Journals (Sweden)
Freddie H. Fu
2008-09-01
Full Text Available Current Concepts in ACL Reconstruction is a complete reference text composed of the most thorough collection of topics on the ACL and its surgical reconstruction compiled, with contributions from some of the world's experts and most experienced ACL surgeons. Various procedures mentioned throughout the text are also demonstrated in an accompanying video CD-ROM. PURPOSE Composing a single, comprehensive and complete information source on the ACL, including basic sciences, clinical issues, the latest concepts and surgical techniques, from evaluation to outcome, from history to future, the editors and contributors have aimed to keep the audience abreast of the latest concepts and techniques for the evaluation and treatment of ACL injuries. FEATURES The text is composed of 27 chapters in 6 sections. The first section covers mostly basic sciences, along with the history of the ACL, imaging, and the clinical approach to adolescent and pediatric patients. In the second section, graft choices and arthroscopy portals for ACL reconstruction are discussed. The third section covers the technique and outcome of single-bundle ACL reconstruction. The fourth section covers the techniques and outcome of double-bundle ACL reconstruction. In the fifth section, revision, navigation technology, rehabilitation and the evaluation of the outcome of ACL reconstruction are covered. The sixth and last section looks to future advances: What We Have Learned and the Future of ACL Reconstruction. AUDIENCE Orthopedic residents, sports traumatology and knee surgery fellows, orthopedic surgeons, and also scientists in basic sciences or clinicians who are studying or planning research on the ACL form the audience of this book. ASSESSMENT This is the latest, most complete and comprehensive textbook of ACL reconstruction, produced by the editorial work of two pioneers and masters, "Freddie H. Fu MD and Steven B. Cohen MD", with the contribution of world
Quantum Bayesianism at the Perimeter
Fuchs, Christopher A
2010-01-01
The author summarizes the Quantum Bayesian viewpoint of quantum mechanics, developed originally by C. M. Caves, R. Schack, and himself. It is a view crucially dependent upon the tools of quantum information theory. Work at the Perimeter Institute for Theoretical Physics continues the development and is focused on the hard technical problem of finding a good representation of quantum mechanics purely in terms of probabilities, without amplitudes or Hilbert-space operators. The best candidate representation involves a mysterious entity called a symmetric informationally complete quantum measurement. Contemplation of it gives a way of thinking of the Born Rule as an addition to the rules of probability theory, applicable when one gambles on the consequences of interactions with physical systems. The article ends by outlining some directions for future work.
On Bayesian System Reliability Analysis
Energy Technology Data Exchange (ETDEWEB)
Soerensen Ringi, M.
1995-05-01
The view taken in this thesis is that reliability, the probability that a system will perform a required function for a stated period of time, depends on a person's state of knowledge. Reliability changes as this state of knowledge changes, i.e. when new relevant information becomes available. Most existing models for system reliability prediction are developed in a classical framework of probability theory and they overlook some information that is always present. Probability is just an analytical tool to handle uncertainty, based on judgement and subjective opinions. It is argued that the Bayesian approach gives a much more comprehensive understanding of the foundations of probability than the so-called frequentistic school. A new model for system reliability prediction is given in two papers. The model encloses the fact that component failures are dependent because of a shared operational environment. The suggested model also naturally permits learning from failure data of similar components in non identical environments. 85 refs.
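Learning from the failure data of similar components has a simple conjugate-Bayesian core; a sketch with a Beta-Binomial model (the prior and the counts are illustrative, not the thesis's model, which also treats environmental dependence):

```python
def reliability_update(alpha, beta, successes, failures):
    """Conjugate update of a component's demand-success probability:
    a Beta(alpha, beta) prior combined with binomial test data yields a
    Beta(alpha + successes, beta + failures) posterior."""
    a = alpha + successes
    b = beta + failures
    return a, b, a / (a + b)   # posterior parameters and posterior mean

# Vague Beta(1, 1) prior updated with 48 successes in 50 demands observed
# on similar components operating in a different environment.
a, b, mean = reliability_update(1, 1, 48, 2)
```

The posterior then shifts again as operating experience with the component itself accumulates, which is exactly the "reliability changes with the state of knowledge" point of the thesis.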
Hedging Strategies for Bayesian Optimization
Brochu, Eric; de Freitas, Nando
2010-01-01
Bayesian optimization with Gaussian processes has become an increasingly popular tool in the machine learning community. It is efficient and can be used when very little is known about the objective function, making it popular in expensive black-box optimization scenarios. It is able to do this by sampling the objective using an acquisition function which incorporates the model's estimate of the objective and the uncertainty at any given point. However, there are several different parameterized acquisition functions in the literature, and it is often unclear which one to use. Instead of using a single acquisition function, we adopt a portfolio of acquisition functions governed by an online multi-armed bandit strategy. We describe the method, which we call GP-Hedge, and show that this method almost always outperforms the best individual acquisition function.
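The portfolio idea can be sketched with the underlying multiplicative-weights ('Hedge') update; here the acquisition-function gains are replaced by a toy reward, so this is an illustration of the strategy, not GP-Hedge itself:

```python
import math, random

def hedge_choose(weights, rng):
    """Sample an arm index with probability proportional to its weight."""
    r = rng.random() * sum(weights)
    acc = 0.0
    for i, w in enumerate(weights):
        acc += w
        if r <= acc:
            return i
    return len(weights) - 1

def run_hedge(reward_fn, n_arms, n_rounds, eta=0.5, seed=4):
    """Multiplicative-weights ('Hedge') portfolio over n_arms choices,
    e.g. acquisition functions; reward_fn(arm) returns the observed gain
    for the chosen arm (a toy stand-in for the improvement achieved under
    each acquisition function)."""
    rng = random.Random(seed)
    weights = [1.0] * n_arms
    picks = []
    for _ in range(n_rounds):
        arm = hedge_choose(weights, rng)
        picks.append(arm)
        weights[arm] *= math.exp(eta * reward_fn(arm))
        top = max(weights)               # normalise to avoid overflow
        weights = [w / top for w in weights]
    return picks, weights

# Arm 2 (say, expected improvement) pays off most in this toy setting, so
# the portfolio concentrates on it over time.
mean_reward = [0.1, 0.3, 0.8]
picks, w = run_hedge(lambda arm: mean_reward[arm], n_arms=3, n_rounds=200)
```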
Nonparametric Bayesian inference in biostatistics
Müller, Peter
2015-01-01
Nonparametric Bayesian approaches (BNP) play an ever-expanding role in biostatistical inference, from use in proteomics to clinical trials. Many research problems involve an abundance of data and require flexible and complex probability models beyond the traditional parametric approaches. As this book's expert contributors show, BNP approaches can be the answer: its chapters demonstrate that BNP has important uses in the clinical sciences and in inference for issues like unknown partitions in genomics. Survival analysis, in particular survival regression, has traditionally used BNP, but BNP's potential is now very broad. This applies to important tasks like the arrangement of patients into clinically meaningful subpopulations and the segmentation of the genome into functionally distinct regions. This book is designed both to review and to introduce application areas for BNP. While existing books provide theoretical foundations, this book connects theory to practice through engaging examples and research questions. Chapters c...
Bayesian Inference with Optimal Maps
Moselhy, Tarek A El
2011-01-01
We present a new approach to Bayesian inference that entirely avoids Markov chain simulation, by constructing a map that pushes forward the prior measure to the posterior measure. Existence and uniqueness of a suitable measure-preserving map is established by formulating the problem in the context of optimal transport theory. We discuss various means of explicitly parameterizing the map and computing it efficiently through solution of an optimization problem, exploiting gradient information from the forward model when possible. The resulting algorithm overcomes many of the computational bottlenecks associated with Markov chain Monte Carlo. Advantages of a map-based representation of the posterior include analytical expressions for posterior moments and the ability to generate arbitrary numbers of independent posterior samples without additional likelihood evaluations or forward solves. The optimization approach also provides clear convergence criteria for posterior approximation and facilitates model selectio...
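The core idea, a measure-preserving map that pushes the prior forward to the posterior, is easiest to see in one dimension with Gaussian distributions, where the monotone map is available in closed form. The distributions below are illustrative, not from the paper:

```python
import numpy as np

# 1-D illustration of a transport map pushing a Gaussian prior forward
# to a Gaussian posterior. For Gaussians the monotone map
# T = F_post^{-1} . F_prior is linear, so it can be written explicitly;
# mapping prior samples through T yields i.i.d. posterior samples with
# no MCMC and no extra likelihood evaluations.

mu_pr, s_pr = 0.0, 2.0        # prior N(0, 2^2)
mu_po, s_po = 1.5, 0.5        # posterior N(1.5, 0.5^2)

def T(x):
    """Monotone map with T#prior = posterior."""
    return mu_po + (s_po / s_pr) * (x - mu_pr)

rng = np.random.default_rng(1)
prior_samples = rng.normal(mu_pr, s_pr, size=200_000)
post_samples = T(prior_samples)   # i.i.d. posterior samples, no MCMC

print(round(post_samples.mean(), 2), round(post_samples.std(), 2))
```

In higher dimensions and for non-Gaussian posteriors the map is no longer linear, and the paper's contribution is precisely to parameterize and compute it by solving an optimization problem.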
Multiview Bayesian Correlated Component Analysis
DEFF Research Database (Denmark)
Kamronn, Simon Due; Poulsen, Andreas Trier; Hansen, Lars Kai
2015-01-01
Correlated component analysis as proposed by Dmochowski, Sajda, Dias, and Parra (2012) is a tool for investigating brain process similarity in the responses to multiple views of a given stimulus. Correlated components are identified under the assumption that the involved spatial networks are identical. … The new model, which we denote Bayesian correlated component analysis, evaluates favorably against three relevant algorithms in simulated data. A well-established benchmark EEG data set is used to further validate the new model and infer the variability of spatial representations across multiple subjects.
Bayesian networks in educational assessment
Almond, Russell G; Steinberg, Linda S; Yan, Duanli; Williamson, David M
2015-01-01
Bayesian inference networks, a synthesis of statistics and expert systems, have advanced reasoning under uncertainty in medicine, business, and social sciences. This innovative volume is the first comprehensive treatment exploring how they can be applied to design and analyze innovative educational assessments. Part I develops Bayes nets’ foundations in assessment, statistics, and graph theory, and works through the real-time updating algorithm. Part II addresses parametric forms for use with assessment, model-checking techniques, and estimation with the EM algorithm and Markov chain Monte Carlo (MCMC). A unique feature is the volume’s grounding in Evidence-Centered Design (ECD) framework for assessment design. This “design forward” approach enables designers to take full advantage of Bayes nets’ modularity and ability to model complex evidentiary relationships that arise from performance in interactive, technology-rich assessments such as simulations. Part III describes ECD, situates Bayes nets as ...
Elvira, Clément; Dobigeon, Nicolas
2015-01-01
Sparse representations have proven their efficiency in solving a wide class of inverse problems encountered in signal and image processing. Conversely, enforcing the information to be spread uniformly over representation coefficients exhibits relevant properties in various applications such as digital communications. Anti-sparse regularization can be naturally expressed through an $\\ell_{\\infty}$-norm penalty. This paper derives a probabilistic formulation of such representations. A new probability distribution, referred to as the democratic prior, is first introduced. Its main properties as well as three random variate generators for this distribution are derived. Then this probability distribution is used as a prior to promote anti-sparsity in a Gaussian linear inverse problem, yielding a fully Bayesian formulation of anti-sparse coding. Two Markov chain Monte Carlo (MCMC) algorithms are proposed to generate samples according to the posterior distribution. The first one is a standard Gibbs sampler. The seco...
Bayesian Inference in Queueing Networks
Sutton, Charles
2010-01-01
Modern Web services, such as those at Google, Yahoo!, and Amazon, handle billions of requests per day on clusters of thousands of computers. Because these services operate under strict performance requirements, a statistical understanding of their performance is of great practical interest. Such services are modeled by networks of queues, where one queue models each of the individual computers in the system. A key challenge is that the data is incomplete, because recording detailed information about every request to a heavily used system can require unacceptable overhead. In this paper we develop a Bayesian perspective on queueing models in which the arrival and departure times that are not observed are treated as latent variables. Underlying this viewpoint is the observation that a queueing model defines a deterministic transformation between the data and a set of independent variables called the service times. With this viewpoint in hand, we sample from the posterior distribution over missing data and model...
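The "deterministic transformation" viewpoint can be illustrated with a single FIFO queue: given arrival and departure times, the service times follow mechanically as s_i = d_i - max(a_i, d_{i-1}), after which an exponential service rate has a conjugate Gamma posterior. A minimal sketch with made-up rates and prior hyperparameters (the paper handles the much harder case of networks with missing data):

```python
import numpy as np

# Single-server FIFO queue: simulate, then recover service times
# deterministically from (arrivals, departures) and do a conjugate
# Bayesian update of the exponential service rate.

rng = np.random.default_rng(2)
true_rate = 3.0
n = 2000
arrivals = np.cumsum(rng.exponential(1.0, size=n))   # Poisson arrivals
services = rng.exponential(1.0 / true_rate, size=n)

# forward-simulate departures
departures = np.empty(n)
prev = 0.0
for i in range(n):
    start = max(arrivals[i], prev)       # wait for server to free up
    departures[i] = start + services[i]
    prev = departures[i]

# deterministic inversion: s_i = d_i - max(a_i, d_{i-1})
prev_d = np.concatenate(([0.0], departures[:-1]))
s = departures - np.maximum(arrivals, prev_d)

# conjugate update: rate ~ Gamma(a0, b0) -> Gamma(a0 + n, b0 + sum s)
a0, b0 = 1.0, 1.0
post_mean = (a0 + n) / (b0 + s.sum())
print(round(post_mean, 1))   # close to the true service rate
```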
A Bayesian Reflection on Surfaces
Directory of Open Access Journals (Sweden)
David R. Wolf
1999-10-01
Full Text Available Abstract: The topic of this paper is a novel Bayesian continuous-basis field representation and inference framework. Within this paper several problems are solved: the maximally informative inference of continuous-basis fields, that is, where the basis for the field is itself a continuous object and not representable in a finite manner; the tradeoff between accuracy of representation in terms of information learned, and memory or storage capacity in bits; the approximation of probability distributions so that a maximal amount of information about the object being inferred is preserved; and an information-theoretic justification for multigrid methodology. The maximally informative field inference framework is described in full generality and denoted the Generalized Kalman Filter. The Generalized Kalman Filter allows the update of field knowledge from previous knowledge at any scale, and new data, to new knowledge at any other scale. An example application, the inference of continuous surfaces from measurements (for example, camera image data), is presented.
Computational Imaging for VLBI Image Reconstruction
Bouman, Katherine L; Zoran, Daniel; Fish, Vincent L; Doeleman, Sheperd S; Freeman, William T
2015-01-01
Very long baseline interferometry (VLBI) is a technique for imaging celestial radio emissions by simultaneously observing a source from telescopes distributed across Earth. The challenges in reconstructing images from fine angular resolution VLBI data are immense. The data is extremely sparse and noisy, thus requiring statistical image models such as those designed in the computer vision community. In this paper we present a novel Bayesian approach for VLBI image reconstruction. While other methods require careful tuning and parameter selection for different types of images, our method is robust and produces good results under different settings such as low SNR or extended emissions. The success of our method is demonstrated on realistic synthetic experiments as well as publicly available real data. We present this problem in a way that is accessible to members of the computer vision community, and provide a dataset website (vlbiimaging.csail.mit.edu) to allow for controlled comparisons across algorithms. Thi...
Directory of Open Access Journals (Sweden)
Fikret Fatih Önol
2014-11-01
Full Text Available In the treatment of urethral stricture, buccal mucosa graft (BMG) reconstruction is applied with different patch techniques. In this recently preferred approach, the BMG is anastomosed as a "ventral onlay" in bulbar urethral strictures, and as a "dorsal inlay" in the pendulous urethra, where the thinner corpus spongiosum provides the support and nutrition of the graft. Cordon et al. compared conventional BMG onlay urethroplasty with "pseudo-spongioplasty", which relies on periurethral vascular tissues being closed over the graft to nourish it. In the repair of anterior urethras where supportive spongiosal tissue is insufficient, this method is defined as mobilizing the surrounding dartos and Buck's fascia and joining them over the BMG patch. Between 2007 and 2012, 56 patients treated with conventional ventral-onlay BMG urethroplasty and 46 patients treated with pseudo-spongioplasty were reported to have similar success rates (80% vs. 84%) at 3.5 years of follow-up on average. While 74% of the patients who underwent pseudo-spongioplasty had disease at the distal urethra (pendulous, bulbopendulous), 82% of the patients who underwent conventional onlay urethroplasty had strictures at the proximal (bulbar) urethra. The strictures in the pseudo-spongioplasty group were also significantly longer (5.8 cm vs. 4.7 cm on average, p=0.028). This study by Cordon et al. shows that, in conditions where conventional spongioplasty is not possible, the periurethral vascular tissues are adequate to nourish the BMG. Although it is an important technique that brings a new point of view to today's practice, data on complications that may arise after pseudo-spongioplasty for long distal strictures (e.g. the appearance of urethral diverticula) are not reported. We also think that, providing an opportunity to patch directly
Bayesian models a statistical primer for ecologists
Hobbs, N Thompson
2015-01-01
Bayesian modeling has become an indispensable tool for ecological research because it is uniquely suited to deal with complexity in a statistically coherent way. This textbook provides a comprehensive and accessible introduction to the latest Bayesian methods-in language ecologists can understand. Unlike other books on the subject, this one emphasizes the principles behind the computations, giving ecologists a big-picture understanding of how to implement this powerful statistical approach. Bayesian Models is an essential primer for non-statisticians. It begins with a definition of probabili
Compiling Relational Bayesian Networks for Exact Inference
DEFF Research Database (Denmark)
Jaeger, Manfred; Darwiche, Adnan; Chavira, Mark
2006-01-01
We describe in this paper a system for exact inference with relational Bayesian networks as defined in the publicly available PRIMULA tool. The system is based on compiling propositional instances of relational Bayesian networks into arithmetic circuits and then performing online inference by evaluating and differentiating these circuits in time linear in their size. We report on experimental results showing successful compilation and efficient inference on relational Bayesian networks, whose PRIMULA-generated propositional instances have thousands of variables, and whose jointrees have clusters...
Li, Yifeng; Chen, Haifen; Zheng, Jie; Ngom, Alioune
2016-01-01
Accurately reconstructing a gene regulatory network (GRN) from gene expression data is a challenging task in systems biology. Although some progress has been made, the performance of GRN reconstruction still has much room for improvement. Because many regulatory events are asynchronous, learning gene interactions with multiple time delays is an effective way to improve the accuracy of GRN reconstruction. Here, we propose a new approach, called Max-Min high-order dynamic Bayesian network (MMHO-DBN), by extending the Max-Min hill-climbing Bayesian network technique originally devised for learning a Bayesian network's structure from static data. Our MMHO-DBN can explicitly model the time lags between regulators and targets in an efficient manner. It first uses constraint-based ideas to limit the space of potential structures, and then applies search-and-score ideas to search for an optimal HO-DBN structure. The performance of MMHO-DBN on GRN reconstruction was evaluated using both synthetic and real gene expression time-series data. Results show that MMHO-DBN is more accurate than current time-delayed GRN learning methods, and has intermediate computing performance. Furthermore, it is able to learn long time-delayed relationships between genes. We applied sensitivity analysis to our model to study the performance variation across different parameter settings. The result provides hints on setting the parameters of MMHO-DBN.
The Diagnosis of Reciprocating Machinery by Bayesian Networks
Institute of Scientific and Technical Information of China (English)
无
2003-01-01
A Bayesian network is a reasoning tool based on probability theory and has many advantages that other reasoning tools lack. This paper discusses the basic theory of Bayesian networks and studies the problems in constructing them. The paper also constructs a Bayesian diagnosis network for a reciprocating compressor. The example supports the conclusion that Bayesian diagnosis networks can diagnose reciprocating machinery effectively.
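A minimal instance of such a diagnosis network, with one hidden fault node, two symptom children, and invented probabilities, can be evaluated by direct enumeration:

```python
# Two-symptom diagnosis network in the spirit of the reciprocating-
# compressor example: a hidden Fault node F with two observed Symptom
# children, assumed conditionally independent given F. All probability
# values below are made up for illustration.

p_fault = 0.1                         # P(F = 1)
p_s1 = {1: 0.9, 0: 0.2}               # P(S1 = 1 | F)
p_s2 = {1: 0.7, 0: 0.1}               # P(S2 = 1 | F)

def posterior_fault(s1, s2):
    """P(F = 1 | s1, s2) by enumeration over the hidden node."""
    def lik(f):
        return ((p_s1[f] if s1 else 1 - p_s1[f]) *
                (p_s2[f] if s2 else 1 - p_s2[f]))
    num = p_fault * lik(1)
    den = num + (1 - p_fault) * lik(0)
    return num / den

print(round(posterior_fault(1, 1), 3))   # both symptoms present -> 0.778
print(round(posterior_fault(0, 0), 3))   # both absent -> 0.005
```

A practical diagnosis network would have many more fault and symptom nodes and use an inference engine rather than enumeration, but the update logic is the same.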
Holocene flooding history of the Lower Tagus Valley (Portugal)
Vis, G.-J.; Bohncke, S.J.P.; Schneider, H.; Kasse, C.; Coenraads-Nederveen, S.; Zuurbier, K.; Rozema, J.
2010-01-01
The present paper aims to reconstruct the Lower Tagus Valley flooding history for the last ca. 6500 a, to explore the suitability of pollen-based local vegetation development in supporting the reconstruction of flooding history, and to explain fluvial activity changes in terms of allogenic (climate,
Radiation dose reduction in computed tomography perfusion using spatial-temporal Bayesian methods
Fang, Ruogu; Raj, Ashish; Chen, Tsuhan; Sanelli, Pina C.
2012-03-01
In current computed tomography (CT) examinations, the associated X-ray radiation dose is of significant concern to patients and operators, especially in CT perfusion (CTP) imaging, which has a higher radiation dose due to its cine scanning technique. A simple and cost-effective means to perform the examinations is to set the milliampere-seconds (mAs) parameter as low as reasonably achievable in data acquisition. However, lowering the mAs parameter will unavoidably increase data noise and greatly degrade CT perfusion maps if no adequate noise control is applied during image reconstruction. To capture the essential dynamics of CT perfusion, a simple spatial-temporal Bayesian method that uses a piecewise parametric model of the residual function is used, and the model parameters are estimated from a Bayesian formulation with prior smoothness constraints on the perfusion parameters. From the fitted residual function, reliable CTP parameter maps are obtained from low-dose CT data. The merit of this scheme lies in combining an analytical piecewise residual function with a Bayesian framework that uses a simple spatial prior constraint for the CT perfusion application. On a dataset of 22 patients, this dynamic spatial-temporal Bayesian model yielded an increase in signal-to-noise ratio (SNR) of 78% and a decrease in mean-square error (MSE) of 40% at a low radiation dose of 43 mA.
GPstuff: Bayesian Modeling with Gaussian Processes
Vanhatalo, J.; Riihimaki, J.; Hartikainen, J.; Jylänki, P.P.; Tolvanen, V.; Vehtari, A.
2013-01-01
The GPstuff toolbox is a versatile collection of Gaussian process models and computational tools required for Bayesian inference. The tools include, among others, various inference methods, sparse approximations and model assessment methods.
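As a reminder of the kind of computation GPstuff packages up, here is a bare-bones exact GP regression step (RBF kernel, Gaussian noise, hyperparameters fixed by hand rather than optimized):

```python
import numpy as np

# Exact GP posterior mean and variance at a test point, using an RBF
# kernel with hand-picked hyperparameters. Real toolboxes add sparse
# approximations, non-Gaussian likelihoods, and model assessment.

def rbf(a, b, ell=0.5, sf=1.0):
    """Squared-exponential kernel matrix between 1-D input arrays."""
    d = a[:, None] - b[None, :]
    return sf**2 * np.exp(-0.5 * (d / ell) ** 2)

x = np.array([-1.0, 0.0, 1.0])     # training inputs
y = np.sin(x)                       # noise-free targets
noise = 1e-4                        # tiny jitter / observation noise

K = rbf(x, x) + noise * np.eye(len(x))
xs = np.array([0.5])                # test input
Ks = rbf(xs, x)
Kss = rbf(xs, xs)

alpha = np.linalg.solve(K, y)
mean = Ks @ alpha                            # posterior mean
var = Kss - Ks @ np.linalg.solve(K, Ks.T)    # posterior variance

print(round(float(mean[0]), 2))   # close to sin(0.5) ~ 0.48
```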
Bayesian Uncertainty Analyses Via Deterministic Model
Krzysztofowicz, R.
2001-05-01
Rational decision-making requires that the total uncertainty about a variate of interest (a predictand) be quantified in terms of a probability distribution, conditional on all available information and knowledge. Suppose the state-of-knowledge is embodied in a deterministic model, which is imperfect and outputs only an estimate of the predictand. Fundamentals are presented of three Bayesian approaches to producing a probability distribution of the predictand via any deterministic model. The Bayesian Processor of Output (BPO) quantifies the total uncertainty in terms of a posterior distribution, conditional on model output. The Bayesian Processor of Ensemble (BPE) quantifies the total uncertainty in terms of a posterior distribution, conditional on an ensemble of model output. The Bayesian Forecasting System (BFS) decomposes the total uncertainty into input uncertainty and model uncertainty, which are characterized independently and then integrated into a predictive distribution.
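Under the simplest normal-linear assumptions (which the abstract does not spell out; the numbers below are illustrative), the BPO idea reduces to a conjugate Gaussian calculation: a prior on the predictand combined with a calibrated likelihood for the deterministic model's output.

```python
# Normal-linear sketch of a Bayesian Processor of Output:
# prior W ~ N(m, s^2), and the model output is treated as a noisy,
# biased observation of the predictand, X | W=w ~ N(a*w + b, sigma^2).
# The posterior of W given an observed output x is again Gaussian.

m, s = 10.0, 4.0              # prior on the predictand
a, b, sigma = 0.9, 1.0, 2.0   # calibration of the model output (invented)

def bpo_posterior(x):
    """Posterior mean and std of W given model output x (conjugate update)."""
    prec = 1 / s**2 + a**2 / sigma**2
    mean = (m / s**2 + a * (x - b) / sigma**2) / prec
    return mean, prec ** -0.5

mean, sd = bpo_posterior(x=12.0)
print(round(mean, 2), round(sd, 2))
```

The posterior mean sits between the prior mean and the bias-corrected model output, weighted by their precisions; the BPE and BFS variants generalize this to ensembles and to separated input/model uncertainty.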
Picturing classical and quantum Bayesian inference
Coecke, Bob
2011-01-01
We introduce a graphical framework for Bayesian inference that is sufficiently general to accommodate not just the standard case but also recent proposals for a theory of quantum Bayesian inference wherein one considers density operators rather than probability distributions as representative of degrees of belief. The diagrammatic framework is stated in the graphical language of symmetric monoidal categories and of compact structures and Frobenius structures therein, in which Bayesian inversion boils down to transposition with respect to an appropriate compact structure. We characterize classical Bayesian inference in terms of a graphical property and demonstrate that our approach eliminates some purely conventional elements that appear in common representations thereof, such as whether degrees of belief are represented by probabilities or entropic quantities. We also introduce a quantum-like calculus wherein the Frobenius structure is noncommutative and show that it can accommodate Leifer's calculus of `cond...
Learning Bayesian networks for discrete data
Liang, Faming
2009-02-01
Bayesian networks have received much attention in the recent literature. In this article, we propose an approach to learning Bayesian networks using the stochastic approximation Monte Carlo (SAMC) algorithm. Our approach has two nice features. First, it possesses a self-adjusting mechanism and thus essentially avoids the local-trap problem suffered by conventional MCMC simulation-based approaches to learning Bayesian networks. Second, it falls into the class of dynamic importance sampling algorithms; the network features can be inferred by dynamically weighted averaging of the samples generated in the learning process, and the resulting estimates can have much lower variation than single-model-based estimates. The numerical results indicate that our approach can mix much faster over the space of Bayesian networks than the conventional MCMC simulation-based approaches. © 2008 Elsevier B.V. All rights reserved.
An Intuitive Dashboard for Bayesian Network Inference
Reddy, Vikas; Charisse Farr, Anna; Wu, Paul; Mengersen, Kerrie; Yarlagadda, Prasad K. D. V.
2014-03-01
Current Bayesian network software packages provide good graphical interfaces for users who design and develop Bayesian networks for various applications. However, the intended end-users of these networks may not necessarily find such an interface appealing, and at times it can be overwhelming, particularly when the number of nodes in the network is large. To circumvent this problem, this paper presents an intuitive dashboard, which provides an additional layer of abstraction, enabling end-users to easily perform inferences over Bayesian networks. Unlike most software packages, which display the nodes and arcs of the network, the developed tool organises the nodes based on cause-and-effect relationships, making the user interaction more intuitive and friendly. In addition to performing various types of inferences, users can conveniently use the tool to verify the behaviour of the developed Bayesian network. The tool has been developed using the Qt and SMILE libraries in C++.
Bayesian inference of synaptic quantal parameters from correlated vesicle release
Directory of Open Access Journals (Sweden)
Alexander D Bird
2016-11-01
Full Text Available Synaptic transmission is both history-dependent and stochastic, resulting in varying responses to presentations of the same presynaptic stimulus. This complicates attempts to infer synaptic parameters and has led to the proposal of a number of different strategies for their quantification. Recently Bayesian approaches have been applied to make more efficient use of the data collected in paired intracellular recordings. Methods have been developed that either provide a complete model of the distribution of amplitudes for isolated responses or approximate the amplitude distributions of a train of post-synaptic potentials, with correct short-term synaptic dynamics but neglecting correlations. In both cases the methods provided significantly improved inference of model parameters as compared to existing mean-variance fitting approaches. However, for synapses with high release probability, low vesicle number or relatively low restock rate and for data in which only one or few repeats of the same pattern are available, correlations between serial events can allow for the extraction of significantly more information from experiment: a more complete Bayesian approach would take this into account also. This has not been possible previously because of the technical difficulty in calculating the likelihood of amplitudes seen in correlated post-synaptic potential trains; however, recent theoretical advances have now rendered the likelihood calculation tractable for a broad class of synaptic dynamics models. Here we present a compact mathematical form for the likelihood in terms of a matrix product and demonstrate how marginals of the posterior provide information on covariance of parameter distributions. The associated computer code for Bayesian parameter inference for a variety of models of synaptic dynamics is provided in the supplementary material allowing for quantal and dynamical parameters to be readily inferred from experimental data sets.
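The matrix-product form of the likelihood can be illustrated with a generic hidden-Markov forward recursion over a small discrete state space (e.g. the number of release-ready vesicles). The transition and emission numbers below are toy values, not the synaptic dynamics model of the paper:

```python
import numpy as np

# Likelihood of a correlated event train as a matrix product:
# P(o_1..o_T) = pi^T diag(E[o_1]) A diag(E[o_2]) ... 1,
# computed by the standard forward recursion. Here the hidden state is
# a two-state Markov chain and observations are binary (toy values).

A = np.array([[0.8, 0.2],            # hidden-state transition matrix
              [0.3, 0.7]])
E = {0: np.array([0.9, 0.4]),        # P(observe 0 | state)
     1: np.array([0.1, 0.6])}        # P(observe 1 | state)
pi = np.array([0.5, 0.5])            # initial state distribution

def likelihood(obs):
    """Forward recursion: equivalent to the matrix product above."""
    alpha = pi * E[obs[0]]
    for o in obs[1:]:
        alpha = (alpha @ A) * E[o]
    return alpha.sum()

print(round(likelihood([0, 0, 1, 1]), 4))  # -> 0.0444
```

Because successive events share the hidden state, this likelihood captures exactly the serial correlations that independent-amplitude approximations discard.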
ABCtoolbox: a versatile toolkit for approximate Bayesian computations
Directory of Open Access Journals (Sweden)
Neuenschwander Samuel
2010-03-01
Full Text Available Abstract Background The estimation of demographic parameters from genetic data often requires the computation of likelihoods. However, the likelihood function is computationally intractable for many realistic evolutionary models, and the use of Bayesian inference has therefore been limited to very simple models. The situation changed recently with the advent of Approximate Bayesian Computation (ABC algorithms allowing one to obtain parameter posterior distributions based on simulations not requiring likelihood computations. Results Here we present ABCtoolbox, a series of open source programs to perform Approximate Bayesian Computations (ABC. It implements various ABC algorithms including rejection sampling, MCMC without likelihood, a Particle-based sampler and ABC-GLM. ABCtoolbox is bundled with, but not limited to, a program that allows parameter inference in a population genetics context and the simultaneous use of different types of markers with different ploidy levels. In addition, ABCtoolbox can also interact with most simulation and summary statistics computation programs. The usability of the ABCtoolbox is demonstrated by inferring the evolutionary history of two evolutionary lineages of Microtus arvalis. Using nuclear microsatellites and mitochondrial sequence data in the same estimation procedure enabled us to infer sex-specific population sizes and migration rates and to find that males show smaller population sizes but much higher levels of migration than females. Conclusion ABCtoolbox allows a user to perform all the necessary steps of a full ABC analysis, from parameter sampling from prior distributions, data simulations, computation of summary statistics, estimation of posterior distributions, model choice, validation of the estimation procedure, and visualization of the results.
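The core rejection-sampling loop that ABCtoolbox automates fits in a few lines; here a Gaussian toy model with an unknown mean stands in for a population-genetic simulator, and all tolerances and priors are illustrative:

```python
import numpy as np

# Minimal ABC rejection sampler: draw a parameter from the prior,
# simulate data under it, and keep the draw if a summary statistic
# lands within a tolerance of the observed one.

rng = np.random.default_rng(3)
obs = rng.normal(2.0, 1.0, size=100)   # "observed" data, true mean = 2
s_obs = obs.mean()                     # summary statistic

accepted = []
for _ in range(20_000):
    theta = rng.uniform(-5, 5)               # prior draw
    sim = rng.normal(theta, 1.0, size=100)   # simulate under theta
    if abs(sim.mean() - s_obs) < 0.1:        # rejection step
        accepted.append(theta)

post = np.array(accepted)                    # approximate posterior sample
print(len(post) > 100, round(post.mean(), 1))
```

Real ABC analyses replace the simulator with a full demographic model, use many summary statistics, and typically post-process the accepted draws (e.g. regression adjustment), but the accept/reject core is the same.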
Bayesian Inference of Synaptic Quantal Parameters from Correlated Vesicle Release
Bird, Alex D.; Wall, Mark J.; Richardson, Magnus J. E.
2016-01-01
Synaptic transmission is both history-dependent and stochastic, resulting in varying responses to presentations of the same presynaptic stimulus. This complicates attempts to infer synaptic parameters and has led to the proposal of a number of different strategies for their quantification. Recently Bayesian approaches have been applied to make more efficient use of the data collected in paired intracellular recordings. Methods have been developed that either provide a complete model of the distribution of amplitudes for isolated responses or approximate the amplitude distributions of a train of post-synaptic potentials, with correct short-term synaptic dynamics but neglecting correlations. In both cases the methods provided significantly improved inference of model parameters as compared to existing mean-variance fitting approaches. However, for synapses with high release probability, low vesicle number or relatively low restock rate and for data in which only one or few repeats of the same pattern are available, correlations between serial events can allow for the extraction of significantly more information from experiment: a more complete Bayesian approach would take this into account also. This has not been possible previously because of the technical difficulty in calculating the likelihood of amplitudes seen in correlated post-synaptic potential trains; however, recent theoretical advances have now rendered the likelihood calculation tractable for a broad class of synaptic dynamics models. Here we present a compact mathematical form for the likelihood in terms of a matrix product and demonstrate how marginals of the posterior provide information on covariance of parameter distributions. The associated computer code for Bayesian parameter inference for a variety of models of synaptic dynamics is provided in the Supplementary Material allowing for quantal and dynamical parameters to be readily inferred from experimental data sets. PMID:27932970
Vrancken, Bram; Maletich Junqueira, Dennis; de Medeiros, Rúbia Marília; Suchard, Marc A.; Lemey, Philippe; Esteves de Matos Almeida, Sabrina; Pinto, Aguinaldo Roberto
2015-01-01
ABSTRACT The phylogeographic history of the Brazilian HIV-1 subtype C (HIV-1C) epidemic is still unclear. Previous studies have mainly focused on the capital cities of Brazilian federal states, and the fact that HIV-1C infections increase at a higher rate than subtype B infections in Brazil calls for a better understanding of the process of spatial spread. A comprehensive sequence data set sampled across 22 Brazilian locations was assembled and analyzed. A Bayesian phylogeographic generalized linear model approach was used to reconstruct the spatiotemporal history of HIV-1C in Brazil, considering several potential explanatory predictors of the viral diffusion process. Analyses were performed on several subsampled data sets in order to mitigate potential sample biases. We reveal a central role for the city of Porto Alegre, the capital of the southernmost state, in the Brazilian HIV-1C epidemic (HIV-1C_BR), and the northward expansion of HIV-1C_BR could be linked to source populations with higher HIV-1 burdens and larger proportions of HIV-1C infections. The results presented here bring new insights to the continuing discussion about the HIV-1C epidemic in Brazil and raise an alternative hypothesis for its spatiotemporal history. The current work also highlights how sampling bias can confound phylogeographic analyses and demonstrates the importance of incorporating external information to protect against this. IMPORTANCE Subtype C is responsible for the largest HIV infection burden worldwide, but our understanding of its transmission dynamics remains incomplete. Brazil witnessed a relatively recent introduction of HIV-1C compared to HIV-1B, but it swiftly spread throughout the south, where it now circulates as the dominant variant. The northward spread has been comparatively slow, and HIV-1B still prevails in that region. While epidemiological data and viral genetic analyses have both independently shed light on the dynamics of spread in isolation, their combination
ProFit: Bayesian galaxy fitting tool
Robotham, A. S. G.; Taranu, D.; Tobar, R.
2016-12-01
ProFit is a Bayesian galaxy fitting tool that uses the fast C++ image generation library libprofit (ascl:1612.003) and a flexible R interface to a large number of likelihood samplers. It offers a fully featured Bayesian interface to galaxy model fitting (also called profiling), using mostly the same standard inputs as other popular codes (e.g. GALFIT ascl:1104.010), but it is also able to use complex priors and a number of likelihoods.
Bayesian target tracking based on particle filter
Institute of Scientific and Technical Information of China (English)
无
2005-01-01
Because they can deal with nonlinear or non-Gaussian problems, particle filters have been studied by many researchers. Based on the particle filter, the extended Kalman filter (EKF) proposal function is applied to Bayesian target tracking, and novel techniques such as the Markov chain Monte Carlo (MCMC) method and the resampling step are also introduced. The simulation results confirm that the improved particle filter with these techniques outperforms the basic one.
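A bootstrap particle filter, i.e. the "basic" filter that the abstract's EKF proposal and MCMC moves improve upon, can be sketched for a 1-D random-walk target with illustrative noise levels:

```python
import numpy as np

# Bootstrap particle filter: propagate with the transition prior,
# weight by the measurement likelihood, estimate, then resample.

rng = np.random.default_rng(4)
T, N = 50, 1000
q, r = 0.1, 0.5                    # process / measurement noise std

# simulate a random-walk trajectory and noisy measurements
x = np.cumsum(rng.normal(0, q, size=T))
z = x + rng.normal(0, r, size=T)

particles = rng.normal(0, 1, size=N)
estimates = np.empty(T)
for t in range(T):
    particles += rng.normal(0, q, size=N)             # propagate
    w = np.exp(-0.5 * ((z[t] - particles) / r) ** 2)  # likelihood weights
    w /= w.sum()
    estimates[t] = np.sum(w * particles)              # posterior mean
    idx = rng.choice(N, size=N, p=w)                  # multinomial resampling
    particles = particles[idx]

rmse = np.sqrt(np.mean((estimates - x) ** 2))
print(rmse < r)  # filtered estimate beats the raw measurement noise
```

An EKF proposal would replace the transition-prior propagation step with a per-particle Gaussian that also conditions on the current measurement, reducing weight degeneracy.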
Variational Bayesian Approximation methods for inverse problems
Mohammad-Djafari, Ali
2012-09-01
Variational Bayesian Approximation (VBA) methods are recent tools for effective Bayesian computations. In this paper, these tools are used for inverse problems where the prior models include hidden variables and where the estimation of the hyperparameters also has to be addressed. In particular, two specific prior models (Student-t and mixture-of-Gaussian models) are considered, and details of the algorithms are given.
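A standard textbook instance of VBA, mean-field coordinate ascent for Gaussian data with unknown mean and precision, illustrates the style of update involved; the Student-t and mixture priors of the paper are replaced here by this simpler conjugate model:

```python
import numpy as np

# Mean-field VBA for x_i ~ N(mu, 1/tau) with priors
# mu ~ N(mu0, 1/(lam0*tau)) and tau ~ Gamma(a0, b0).
# Coordinate ascent alternates the closed-form updates of
# q(mu) = N(mu_n, 1/lam_n) and q(tau) = Gamma(a_n, b_n).

rng = np.random.default_rng(5)
x = rng.normal(3.0, 2.0, size=500)
n, xbar = len(x), x.mean()

mu0, lam0, a0, b0 = 0.0, 1e-3, 1e-3, 1e-3   # vague priors (illustrative)

E_tau = 1.0                       # initialize E[tau]
for _ in range(50):               # coordinate ascent on q(mu) q(tau)
    lam_n = (lam0 + n) * E_tau
    mu_n = (lam0 * mu0 + n * xbar) / (lam0 + n)
    E_mu, E_mu2 = mu_n, mu_n**2 + 1 / lam_n
    a_n = a0 + (n + 1) / 2
    b_n = b0 + 0.5 * (np.sum(x**2) - 2 * E_mu * x.sum() + n * E_mu2
                      + lam0 * (E_mu2 - 2 * mu0 * E_mu + mu0**2))
    E_tau = a_n / b_n

print(round(mu_n, 1), round((1 / E_tau) ** 0.5, 1))  # near true mean and std
```

The inverse-problem setting of the paper has the same structure, with the image or signal playing the role of the hidden variable and the prior hyperparameters updated in the same alternating fashion.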
Bayesian Modeling of a Human MMORPG Player
Synnaeve, Gabriel
2010-01-01
This paper describes an application of Bayesian programming to the control of an autonomous avatar in a multiplayer role-playing game (the example is based on World of Warcraft). We model a particular task, which consists of choosing what to do and to select which target in a situation where allies and foes are present. We explain the model in Bayesian programming and show how we could learn the conditional probabilities from data gathered during human-played sessions.
Bayesian Modeling of a Human MMORPG Player
Synnaeve, Gabriel; Bessière, Pierre
2011-03-01
This paper describes an application of Bayesian programming to the control of an autonomous avatar in a multiplayer role-playing game (the example is based on World of Warcraft). We model a particular task, which consists of choosing what to do and which target to select in a situation where allies and foes are present. We explain the model in Bayesian programming and show how we could learn the conditional probabilities from data gathered during human-played sessions.
Fuzzy Functional Dependencies and Bayesian Networks
Institute of Scientific and Technical Information of China (English)
LIU WeiYi(刘惟一); SONG Ning(宋宁)
2003-01-01
Bayesian networks have become a popular technique for representing and reasoning with probabilistic information. The fuzzy functional dependency is an important kind of data dependency in relational databases with fuzzy values. The purpose of this paper is to set up a connection between these data dependencies and Bayesian networks. The connection is made through a set of methods that enable conditional independence information to be extracted from fuzzy functional dependencies.
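To make the Bayesian-network side of this connection concrete, here is a toy three-node network whose joint distribution factorizes according to its structure; all probability tables are hypothetical. The factorization encodes exactly the kind of conditional independence statement (B independent of C given A) that the paper derives from fuzzy functional dependencies.

```python
# Hypothetical CPTs for a three-node network with edges A -> B and A -> C.
P_A = {True: 0.3, False: 0.7}
P_B_given_A = {True: {True: 0.8, False: 0.2}, False: {True: 0.1, False: 0.9}}
P_C_given_A = {True: {True: 0.6, False: 0.4}, False: {True: 0.5, False: 0.5}}

def joint(a, b, c):
    # The network structure licenses the factorization P(A) P(B|A) P(C|A).
    return P_A[a] * P_B_given_A[a][b] * P_C_given_A[a][c]

# Verify the encoded independence: P(b, c | a) == P(b | a) * P(c | a).
for a in (True, False):
    for b in (True, False):
        for c in (True, False):
            lhs = joint(a, b, c) / P_A[a]
            rhs = P_B_given_A[a][b] * P_C_given_A[a][c]
            assert abs(lhs - rhs) < 1e-12
```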
Lewis, Helen M.
1997-01-01
Recounts the experience of researching community history in Ivanhoe, Virginia, between 1987 and 1990. The Ivanhoe History Project involved community members in collecting photographs, memorabilia, and oral histories of their town. Subsequent published volumes won the W. D. Weatherford Award and inspired a quilt exhibit and a theatrical production.…
Philosophy and the practice of Bayesian statistics.
Gelman, Andrew; Shalizi, Cosma Rohilla
2013-02-01
A substantial school in the philosophy of science identifies Bayesian inference with inductive inference and even rationality as such, and seems to be strengthened by the rise and practical success of Bayesian statistics. We argue that the most successful forms of Bayesian statistics do not actually support that particular philosophy but rather accord much better with sophisticated forms of hypothetico-deductivism. We examine the actual role played by prior distributions in Bayesian models, and the crucial aspects of model checking and model revision, which fall outside the scope of Bayesian confirmation theory. We draw on the literature on the consistency of Bayesian updating and also on our experience of applied work in social science. Clarity about these matters should benefit not just philosophy of science, but also statistical practice. At best, the inductivist view has encouraged researchers to fit and compare models without checking them; at worst, theorists have actively discouraged practitioners from performing model checking because it does not fit into their framework.
Analysing uncertainties: Towards comparing Bayesian and interval probabilities
Blockley, David
2013-05-01
Two assumptions, commonly made in risk and reliability studies, have a long history. The first is that uncertainty is either aleatoric or epistemic. The second is that standard probability theory is sufficient to express uncertainty. The purposes of this paper are to provide a conceptual analysis of uncertainty and to compare Bayesian approaches with interval approaches, with an example relevant to research on climate change. The analysis reveals that the categorisation of uncertainty as either aleatoric or epistemic is unsatisfactory for practical decision making. It is argued that uncertainty emerges from three conceptually distinctive and orthogonal attributes, FIR: fuzziness, incompleteness (epistemic) and randomness (aleatoric). Characterisations of uncertainty, such as ambiguity, dubiety and conflict, are complex mixes of interactions in an FIR space. To manage future risks in complex systems it will be important to recognise the extent to which we 'don't know' about possible unintended and unwanted consequences, or unknown-unknowns. In this way we may be more alert to unexpected hazards. The Bayesian approach is compared with an interval probability approach to show one way in which conflict due to incomplete information can be managed.
Bayesian Approach for Reliability Assessment of Sunshield Deployment on JWST
Kaminskiy, Mark P.; Evans, John W.; Gallo, Luis D.
2013-01-01
Deployable subsystems are essential to mission success of most spacecraft. These subsystems enable critical functions including power, communications and thermal control. The loss of any of these functions will generally result in loss of the mission. These subsystems and their components often consist of unique designs and applications, for which various standardized data sources are not applicable for estimating reliability and for assessing risks. In this study, a Bayesian approach for reliability estimation of spacecraft deployment was developed for this purpose. This approach was then applied to the James Webb Space Telescope (JWST) Sunshield subsystem, a unique design intended for thermal control of the observatory's telescope and science instruments. In order to collect the prior information on deployable systems, detailed studies of "heritage information" were conducted, extending over 45 years of spacecraft launches. The NASA Goddard Space Flight Center (GSFC) Spacecraft Operational Anomaly and Reporting System (SOARS) data were then used to estimate the parameters of the conjugate beta prior distribution for anomaly and failure occurrence, as the most consistent set of available data that could be matched to launch histories. This allows for an empirical Bayesian prediction for the risk of an anomaly occurrence during the complex Sunshield deployment, with credibility limits, using prior deployment data and test information.
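The conjugate beta prior mentioned above is what makes the anomaly-rate update analytic. A minimal sketch with invented counts (not SOARS data):

```python
# Conjugate beta-binomial update (illustrative counts, not JWST/SOARS data).
a_prior, b_prior = 2.0, 38.0      # prior encoding roughly a 5% anomaly rate
failures, successes = 1, 19       # a hypothetical new deployment record

# Beta(a, b) prior + binomial data -> Beta(a + failures, b + successes).
a_post = a_prior + failures
b_post = b_prior + successes
post_mean = a_post / (a_post + b_post)
post_var = (a_post * b_post) / ((a_post + b_post) ** 2 * (a_post + b_post + 1))
```

Credible limits, as used in the paper, would come from quantiles of this posterior beta distribution.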
A Bayesian Framework for Reliability Analysis of Spacecraft Deployments
Evans, John W.; Gallo, Luis; Kaminsky, Mark
2012-01-01
Deployable subsystems are essential to mission success of most spacecraft. These subsystems enable critical functions including power, communications and thermal control. The loss of any of these functions will generally result in loss of the mission. These subsystems and their components often consist of unique designs and applications for which various standardized data sources are not applicable for estimating reliability and for assessing risks. In this study, a two stage sequential Bayesian framework for reliability estimation of spacecraft deployment was developed for this purpose. This process was then applied to the James Webb Space Telescope (JWST) Sunshield subsystem, a unique design intended for thermal control of the Optical Telescope Element. Initially, detailed studies of NASA deployment history, "heritage information", were conducted, extending over 45 years of spacecraft launches. This information was then coupled to a non-informative prior and a binomial likelihood function to create a posterior distribution for deployments of various subsystems using Markov chain Monte Carlo sampling. Select distributions were then coupled to a subsequent analysis, using test data and anomaly occurrences on successive ground test deployments of scale model test articles of JWST hardware, to update the NASA heritage data. This allowed for a realistic prediction for the reliability of the complex Sunshield deployment, with credibility limits, within this two stage Bayesian framework.
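The two-stage structure, in which the first posterior becomes the prior for the test-data update, can be illustrated with the conjugate shortcut below (the paper itself samples the posterior by MCMC; every count here is invented):

```python
def beta_update(a, b, failures, successes):
    """Conjugate beta-binomial update; the posterior (a, b) can serve
    directly as the prior for the next stage of data."""
    return a + failures, b + successes

# Stage 1: hypothetical heritage launch records update a flat Beta(1, 1) prior.
a1, b1 = beta_update(1.0, 1.0, failures=3, successes=57)
# Stage 2: hypothetical ground-test deployments update the stage-1 posterior.
a2, b2 = beta_update(a1, b1, failures=0, successes=10)
reliability = b2 / (a2 + b2)   # posterior mean success probability
```

Chaining updates this way is exactly sequential Bayesian learning: the order of the two data sets does not change the final posterior.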
Bayesian demography 250 years after Bayes.
Bijak, Jakub; Bryant, John
2016-01-01
Bayesian statistics offers an alternative to classical (frequentist) statistics. It is distinguished by its use of probability distributions to describe uncertain quantities, which leads to elegant solutions to many difficult statistical problems. Although Bayesian demography, like Bayesian statistics more generally, is around 250 years old, only recently has it begun to flourish. The aim of this paper is to review the achievements of Bayesian demography, address some misconceptions, and make the case for wider use of Bayesian methods in population studies. We focus on three applications: demographic forecasts, limited data, and highly structured or complex models. The key advantages of Bayesian methods are the ability to integrate information from multiple sources and to describe uncertainty coherently. Bayesian methods also allow for including additional (prior) information next to the data sample. As such, Bayesian approaches are complementary to many traditional methods, which can be productively re-expressed in Bayesian terms.
BioEM: GPU-accelerated computing of Bayesian inference of electron microscopy images
Cossio, Pilar; Baruffa, Fabio; Rampp, Markus; Lindenstruth, Volker; Hummer, Gerhard
2016-01-01
In cryo-electron microscopy (EM), molecular structures are determined from large numbers of projection images of individual particles. To harness the full power of this single-molecule information, we use the Bayesian inference of EM (BioEM) formalism. By ranking structural models using posterior probabilities calculated for individual images, BioEM in principle addresses the challenge of working with highly dynamic or heterogeneous systems not easily handled in traditional EM reconstruction. However, the calculation of these posteriors for large numbers of particles and models is computationally demanding. Here we present highly parallelized, GPU-accelerated computer software that performs this task efficiently. Our flexible formulation employs CUDA, OpenMP, and MPI parallelization combined with both CPU and GPU computing. The resulting BioEM software scales nearly ideally both on pure CPU and on CPU+GPU architectures, thus enabling Bayesian analysis of tens of thousands of images in a reasonable time. The g...
Bayesian parameter estimation of core collapse supernovae using gravitational wave simulations
Edwards, Matthew C; Christensen, Nelson
2014-01-01
Using the latest numerical simulations of rotating stellar core collapse, we present a Bayesian framework to extract the physical information encoded in noisy gravitational wave signals. We fit Bayesian principal component regression models with known and unknown signal arrival times to reconstruct gravitational wave signals, and subsequently fit known astrophysical parameters on the posterior means of the principal component coefficients using a linear model. We predict the ratio of rotational kinetic energy to gravitational energy of the inner core at bounce by sampling from the posterior predictive distribution, and find that these predictions are generally very close to the true parameter values, with $90\\%$ credible intervals $\\sim 0.04$ and $\\sim 0.06$ wide for the known and unknown arrival time models respectively. Two supervised machine learning methods are implemented to classify precollapse differential rotation, and we find that these methods discriminate rapidly rotating progenitors particularly w...
Bayesian-based Wavelet Shrinkage for SAR Image Despeckling Using Cycle Spinning
Institute of Scientific and Technical Information of China (English)
ZHANG De-xiang; GAO Qing-wei; CHEN Jun-ning
2006-01-01
A novel and efficient speckle noise reduction algorithm based on Bayesian wavelet shrinkage using cycle spinning is proposed. First, the sub-band decompositions of non-logarithmically transformed SAR images are shown. Then, a Bayesian wavelet shrinkage factor is applied to the decomposed data to estimate noise-free wavelet coefficients. The method is based on Mixture Gaussian Distributed (MGD) modeling of sub-band coefficients. Finally, multi-resolution wavelet coefficients are reconstructed by wavelet thresholding using cycle spinning. Experimental results show that the proposed despeckling algorithm achieves an excellent balance between suppressing speckle effectively and preserving as many image details and as much sharpness as possible. The new method showed higher performance than other speckle noise reduction techniques while minimizing the effect of pseudo-Gibbs phenomena.
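As a much-simplified stand-in for the Bayesian MGD shrinkage factor, here is a one-level Haar transform with soft thresholding of the detail coefficients; the signal and threshold are illustrative, and cycle spinning is omitted.

```python
import math

def haar_shrink(signal, threshold):
    """One-level Haar wavelet transform, soft-threshold the detail
    coefficients, then invert. Requires an even-length signal."""
    s = math.sqrt(2.0)
    approx = [(a + b) / s for a, b in zip(signal[0::2], signal[1::2])]
    detail = [(a - b) / s for a, b in zip(signal[0::2], signal[1::2])]
    # Soft threshold: shrink each detail toward zero, clipping at zero.
    detail = [math.copysign(max(abs(d) - threshold, 0.0), d) for d in detail]
    out = []
    for a, d in zip(approx, detail):
        out.extend([(a + d) / s, (a - d) / s])
    return out

noisy = [1.0, 1.1, 4.0, 3.9, 1.0, 0.9, 4.1, 4.0]
denoised = haar_shrink(noisy, threshold=0.2)
```

Small details (noise) are zeroed while large ones (edges) survive; the Bayesian shrinkage factor in the paper plays the same role as this fixed threshold, but adapts per coefficient.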
Bayesian inference for OPC modeling
Burbine, Andrew; Sturtevant, John; Fryer, David; Smith, Bruce W.
2016-03-01
The use of optical proximity correction (OPC) demands increasingly accurate models of the photolithographic process. Model building and inference techniques in the data science community have seen great strides in the past two decades which make better use of available information. This paper aims to demonstrate the predictive power of Bayesian inference as a method for parameter selection in lithographic models by quantifying the uncertainty associated with model inputs and wafer data. Specifically, the method combines the model builder's prior information about each modelling assumption with the maximization of each observation's likelihood as a Student's t-distributed random variable. Through the use of a Markov chain Monte Carlo (MCMC) algorithm, a model's parameter space is explored to find the most credible parameter values. During parameter exploration, the parameters' posterior distributions are generated by applying Bayes' rule, using a likelihood function and the a priori knowledge supplied. The MCMC algorithm used, an affine invariant ensemble sampler (AIES), is implemented by initializing many walkers which semi-independently explore the space. The convergence of these walkers to global maxima of the likelihood volume determines the parameter values' highest density intervals (HDI) to reveal champion models. We show that this method of parameter selection provides insights into the data that traditional methods do not and outline continued experiments to vet the method.
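The paper's sampler is an affine invariant ensemble sampler; the simpler random-walk Metropolis algorithm below illustrates the same propose-and-accept logic on a standard normal target (all settings are illustrative):

```python
import math, random

def metropolis(log_post, x0, n_steps, step, seed=0):
    """Random-walk Metropolis: a simpler relative of the affine invariant
    ensemble sampler (AIES) named in the abstract."""
    rng = random.Random(seed)
    x, lp = x0, log_post(x0)
    samples = []
    for _ in range(n_steps):
        prop = x + rng.gauss(0.0, step)
        lp_prop = log_post(prop)
        # Accept with probability min(1, posterior ratio).
        if math.log(rng.random()) < lp_prop - lp:
            x, lp = prop, lp_prop
        samples.append(x)
    return samples

# Standard normal target: samples should have mean ~0 and variance ~1.
samples = metropolis(lambda x: -0.5 * x * x, 0.0, 20000, 1.0)
```

AIES improves on this by moving many walkers via stretch moves that adapt automatically to correlated, anisotropic posteriors, which is why it needs far less tuning than a fixed step size.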
Directory of Open Access Journals (Sweden)
J. Zhu
2014-05-01
Full Text Available The Southern Hemisphere westerly winds (SHW) play a crucial role in the large-scale ocean circulation and global carbon cycling. Accordingly, the reconstruction of their latitudinal position and intensity is essential for understanding global climatic fluctuations during the last glacial cycle. The southernmost part of the South American continent is of great importance for paleoclimate studies as the only continental mass intersecting a large part of the SHW belt. However, continuous proxy records back to the last Glacial are rare in southern Patagonia, owing to the Patagonian Ice Sheets expanding from the Andean area and the scarcity of continuous paleoclimate archives in extra-Andean Patagonia. Here, we present an oxygen isotope record from cellulose and purified bulk organic matter of aquatic moss shoots from the last glacial-interglacial transition preserved in the sediments of Laguna Potrok Aike (52° S, 70° W), a deep maar lake located in semi-arid, extra-Andean Patagonia. The highly significant correlation between oxygen isotope values of aquatic mosses and their host waters and the abundant well-preserved moss remains allow a high-resolution oxygen isotope reconstruction of lake water (δ18Olw) for this lake. Long-term δ18Olw variations are mainly determined by δ18O changes of the source water of the lake, surface air temperature and evaporative 18O enrichment. Under permafrost conditions during the Glacial, the groundwater may not have been recharged by regional precipitation. The isolated groundwater could have had much less negative δ18O values than glacial precipitation. The less 18O depleted source water and prolonged lake water residence time caused by reduced interchange between in- and outflows could have resulted in the reconstructed glacial δ18Olw that was only ca. 3‰ lower than modern values. The significant two-step rise in reconstructed δ18Olw during the last deglaciation demonstrated the response of isotope composition of lake
Re-telling, Re-evaluating and Re-constructing
Directory of Open Access Journals (Sweden)
Gorana Tolja
2013-11-01
Full Text Available 'Graphic History: Essays on Graphic Novels and/as History' (2012) is a collection of 14 unique essays, edited by scholar Richard Iadonisi, that explores a variety of complex issues within the graphic novel medium as a means of historical narration. The essays address the issues of accuracy of re-counting history, history as re-constructed, and the ethics surrounding historical narration.
Bayesians versus frequentists a philosophical debate on statistical reasoning
Vallverdú, Jordi
2016-01-01
This book analyzes the origins of statistical thinking as well as its related philosophical questions, such as causality, determinism or chance. Bayesian and frequentist approaches are subjected to a historical, cognitive and epistemological analysis, making it possible to not only compare the two competing theories, but to also find a potential solution. The work pursues a naturalistic approach, proceeding from the existence of numerosity in natural environments to the existence of contemporary formulas and methodologies to heuristic pragmatism, a concept introduced in the book’s final section. This monograph will be of interest to philosophers and historians of science and students in related fields. Despite the mathematical nature of the topic, no statistical background is required, making the book a valuable read for anyone interested in the history of statistics and human cognition.
Reconstructing Contemporary Dance: An Occasion for Reflective Learning
Barr, Sherrie
2005-01-01
Reconstructions are critical in giving body to the history of dance. For student dancers, participating in reconstructions is a participation in both the legacy of dance and in dance as a form of cultural discourse. When choreographers generate movement vocabulary and improvisational parameters together with performers, the resulting collaborative…
Energy Technology Data Exchange (ETDEWEB)
Mueller, Rachel Lockridge; Macey, J. Robert; Jaekel, Martin; Wake, David B.; Boore, Jeffrey L.
2004-08-01
The evolutionary history of the largest salamander family (Plethodontidae) is characterized by extreme morphological homoplasy. Analysis of the mechanisms generating such homoplasy requires an independent, molecular phylogeny. To this end, we sequenced 24 complete mitochondrial genomes (22 plethodontids and two outgroup taxa), added data for three species from GenBank, and performed partitioned and unpartitioned Bayesian, ML, and MP phylogenetic analyses. We explored four dataset partitioning strategies to account for evolutionary process heterogeneity among genes and codon positions, all of which yielded increased model likelihoods and decreased numbers of supported nodes in the topologies (PP > 0.95) relative to the unpartitioned analysis. Our phylogenetic analyses yielded congruent trees that contrast with the traditional morphology-based taxonomy; the monophyly of three out of four major groups is rejected. Reanalysis of current hypotheses in light of these new evolutionary relationships suggests that (1) a larval life history stage re-evolved from a direct-developing ancestor multiple times, (2) there is no phylogenetic support for the "Out of Appalachia" hypothesis of plethodontid origins, and (3) novel scenarios must be reconstructed for the convergent evolution of projectile tongues, reduction in toe number, and specialization for defensive tail loss. Some of these novel scenarios imply morphological transformation series that proceed in the opposite direction than was previously thought. In addition, they suggest surprising evolutionary lability in traits previously interpreted to be conservative.
Cosmic expansion history from SN Ia data via information field theory
Porqueres, Natàlia; Greiner, Maksim; Böhm, Vanessa; Dorn, Sebastian; Ruiz-Lapuente, Pilar; Manrique, Alberto
2016-01-01
We present a novel inference algorithm that reconstructs the cosmic expansion history as encoded in the Hubble parameter $H(z)$ from SNe Ia data. The novelty of the approach lies in the usage of information field theory, a statistical field theory that is very well suited for the construction of optimal signal recovery algorithms. The algorithm infers non-parametrically $s(a)=\\ln(\\rho(a)/\\rho_{\\mathrm{crit}0})$, the density evolution which determines $H(z)$, without assuming an analytical form of $\\rho(a)$ but only its smoothness with the scale factor $a=(1+z)^{-1}$. The inference problem of recovering the signal $s(a)$ from the data is formulated in a fully Bayesian way. In detail, we rewrite the signal as the sum of a background cosmology and a perturbation. This allows us to determine the maximum a posteriori estimate of the signal by an iterative Wiener filter method. Applying this method to the Union2.1 supernova compilation, we recover a cosmic expansion history that is fully compatible with the standard $...
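The Wiener-filter step has a closed scalar form worth keeping in mind: for data d = R·s + n, with signal prior variance S and noise variance N, the MAP estimate is m = (R²/N + 1/S)⁻¹ (R/N) d. A toy numerical check (all values invented, not the Union2.1 setup):

```python
# Scalar Wiener filter / MAP estimate for the linear data model d = R*s + n.
R, S, N_var = 1.0, 4.0, 1.0   # response, signal prior variance, noise variance
d = 2.0                       # a single observed data point

precision = R * R / N_var + 1.0 / S      # posterior precision of s
m = (R / N_var) * d / precision          # Wiener filter estimate of s
```

The estimate is shrunk from the naive inversion d/R toward the prior mean (zero here), with the amount of shrinkage set by the signal-to-noise ratio; the paper applies the same operator iteratively in a field (many-pixel) setting.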
Dimensionality reduction in Bayesian estimation algorithms
Directory of Open Access Journals (Sweden)
G. W. Petty
2013-03-01
Full Text Available An idealized synthetic database loosely resembling 3-channel passive microwave observations of precipitation against a variable background is employed to examine the performance of a conventional Bayesian retrieval algorithm. For this dataset, algorithm performance is found to be poor owing to an irreconcilable conflict between the need to find matches in the dependent database versus the need to exclude inappropriate matches. It is argued that the likelihood of such conflicts increases sharply with the dimensionality of the observation space of real satellite sensors, which may utilize 9 to 13 channels to retrieve precipitation, for example. An objective method is described for distilling the relevant information content from N real channels into a much smaller number (M of pseudochannels while also regularizing the background (geophysical plus instrument noise component. The pseudochannels are linear combinations of the original N channels obtained via a two-stage principal component analysis of the dependent dataset. Bayesian retrievals based on a single pseudochannel applied to the independent dataset yield striking improvements in overall performance. The differences between the conventional Bayesian retrieval and reduced-dimensional Bayesian retrieval suggest that a major potential problem with conventional multichannel retrievals – whether Bayesian or not – lies in the common but often inappropriate assumption of diagonal error covariance. The dimensional reduction technique described herein avoids this problem by, in effect, recasting the retrieval problem in a coordinate system in which the desired covariance is lower-dimensional, diagonal, and unit magnitude.
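The pseudochannel construction described above is, at its core, projection onto leading principal components. A two-channel toy version (synthetic data, and a plain single-stage covariance PCA rather than the paper's two-stage variant):

```python
import math

# Two synthetic, correlated "channels" (hypothetical brightness values).
ch1 = [1.0, 2.0, 3.0, 4.0, 5.0]
ch2 = [1.1, 1.9, 3.2, 3.8, 5.0]
n = len(ch1)
m1, m2 = sum(ch1) / n, sum(ch2) / n
c11 = sum((a - m1) ** 2 for a in ch1) / n
c22 = sum((b - m2) ** 2 for b in ch2) / n
c12 = sum((a - m1) * (b - m2) for a, b in zip(ch1, ch2)) / n

# Leading eigenpair of the 2x2 covariance matrix, in closed form.
tr, det = c11 + c22, c11 * c22 - c12 ** 2
lam = 0.5 * (tr + math.sqrt(tr * tr - 4 * det))   # leading eigenvalue
v = (c12, lam - c11)                              # its (unnormalised) eigenvector
norm = math.hypot(*v)
v = (v[0] / norm, v[1] / norm)

# One pseudochannel: projection of the centered data onto that component.
pseudo = [(a - m1) * v[0] + (b - m2) * v[1] for a, b in zip(ch1, ch2)]
```

The projected variance equals the leading eigenvalue, so most of the two-channel information survives in a single coordinate; with N real channels the same idea compresses to M ≪ N pseudochannels.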
Bayesian modeling of flexible cognitive control.
Jiang, Jiefeng; Heller, Katherine; Egner, Tobias
2014-10-01
"Cognitive control" describes endogenous guidance of behavior in situations where routine stimulus-response associations are suboptimal for achieving a desired goal. The computational and neural mechanisms underlying this capacity remain poorly understood. We examine recent advances stemming from the application of a Bayesian learner perspective that provides optimal prediction for control processes. In reviewing the application of Bayesian models to cognitive control, we note that an important limitation in current models is a lack of a plausible mechanism for the flexible adjustment of control over conflict levels changing at varying temporal scales. We then show that flexible cognitive control can be achieved by a Bayesian model with a volatility-driven learning mechanism that modulates dynamically the relative dependence on recent and remote experiences in its prediction of future control demand. We conclude that the emergent Bayesian perspective on computational mechanisms of cognitive control holds considerable promise, especially if future studies can identify neural substrates of the variables encoded by these models, and determine the nature (Bayesian or otherwise) of their neural implementation.
Multi-Fraction Bayesian Sediment Transport Model
Directory of Open Access Journals (Sweden)
Mark L. Schmelter
2015-09-01
Full Text Available A Bayesian approach to sediment transport modeling can provide a strong basis for evaluating and propagating model uncertainty, which can be useful in transport applications. Previous work in developing and applying Bayesian sediment transport models used a single grain size fraction or characterized the transport of mixed-size sediment with a single characteristic grain size. Although this approach is common in sediment transport modeling, it precludes the possibility of capturing processes that cause mixed-size sediments to sort and, thereby, alter the grain size available for transport and the transport rates themselves. This paper extends development of a Bayesian transport model from one to k fractional dimensions. The model uses an existing transport function as its deterministic core and is applied to the dataset used to originally develop the function. The Bayesian multi-fraction model is able to infer the posterior distributions for essential model parameters and replicates predictive distributions of both bulk and fractional transport. Further, the inferred posterior distributions are used to evaluate parametric and other sources of variability in relations representing mixed-size interactions in the original model. Successful OPEN ACCESS J. Mar. Sci. Eng. 2015, 3 1067 development of the model demonstrates that Bayesian methods can be used to provide a robust and rigorous basis for quantifying uncertainty in mixed-size sediment transport. Such a method has heretofore been unavailable and allows for the propagation of uncertainty in sediment transport applications.
Tactile length contraction as Bayesian inference.
Tong, Jonathan; Ngo, Vy; Goldreich, Daniel
2016-08-01
To perceive, the brain must interpret stimulus-evoked neural activity. This is challenging: The stochastic nature of the neural response renders its interpretation inherently uncertain. Perception would be optimized if the brain used Bayesian inference to interpret inputs in light of expectations derived from experience. Bayesian inference would improve perception on average but cause illusions when stimuli violate expectation. Intriguingly, tactile, auditory, and visual perception are all prone to length contraction illusions, characterized by the dramatic underestimation of the distance between punctate stimuli delivered in rapid succession; the origin of these illusions has been mysterious. We previously proposed that length contraction illusions occur because the brain interprets punctate stimulus sequences using Bayesian inference with a low-velocity expectation. A novel prediction of our Bayesian observer model is that length contraction should intensify if stimuli are made more difficult to localize. Here we report a tactile psychophysical study that tested this prediction. Twenty humans compared two distances on the forearm: a fixed reference distance defined by two taps with 1-s temporal separation and an adjustable comparison distance defined by two taps with temporal separation t ≤ 1 s. We observed significant length contraction: As t was decreased, participants perceived the two distances as equal only when the comparison distance was made progressively greater than the reference distance. Furthermore, the use of weaker taps significantly enhanced participants' length contraction. These findings confirm the model's predictions, supporting the view that the spatiotemporal percept is a best estimate resulting from a Bayesian inference process.
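The model's core prediction follows from precision-weighted Gaussian fusion: the noisier the localization, the more the percept shrinks toward the low-velocity (short-distance) prior. A toy calculation with invented variances:

```python
def fuse(likelihood_mean, likelihood_var, prior_mean, prior_var):
    """Posterior mean of two Gaussians: a precision-weighted average."""
    w = prior_var / (prior_var + likelihood_var)
    return w * likelihood_mean + (1 - w) * prior_mean

# Low-velocity expectation acts like a short-length prior (mean 0 here;
# all units and variances are hypothetical, not fitted to the experiment).
true_length = 10.0
precise = fuse(true_length, 1.0, 0.0, 4.0)   # well-localized taps
noisy   = fuse(true_length, 9.0, 0.0, 4.0)   # hard-to-localize taps
```

Both percepts underestimate the true length (the contraction illusion), and the harder-to-localize stimulus contracts more, which is the prediction the psychophysical study confirms.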
Bayesian modeling of flexible cognitive control
Jiang, Jiefeng; Heller, Katherine; Egner, Tobias
2014-01-01
“Cognitive control” describes endogenous guidance of behavior in situations where routine stimulus-response associations are suboptimal for achieving a desired goal. The computational and neural mechanisms underlying this capacity remain poorly understood. We examine recent advances stemming from the application of a Bayesian learner perspective that provides optimal prediction for control processes. In reviewing the application of Bayesian models to cognitive control, we note that an important limitation in current models is a lack of a plausible mechanism for the flexible adjustment of control over conflict levels changing at varying temporal scales. We then show that flexible cognitive control can be achieved by a Bayesian model with a volatility-driven learning mechanism that modulates dynamically the relative dependence on recent and remote experiences in its prediction of future control demand. We conclude that the emergent Bayesian perspective on computational mechanisms of cognitive control holds considerable promise, especially if future studies can identify neural substrates of the variables encoded by these models, and determine the nature (Bayesian or otherwise) of their neural implementation. PMID:24929218
Computationally efficient Bayesian inference for inverse problems.
Energy Technology Data Exchange (ETDEWEB)
Marzouk, Youssef M.; Najm, Habib N.; Rahn, Larry A.
2007-10-01
Bayesian statistics provides a foundation for inference from noisy and incomplete data, a natural mechanism for regularization in the form of prior information, and a quantitative assessment of uncertainty in the inferred results. Inverse problems - representing indirect estimation of model parameters, inputs, or structural components - can be fruitfully cast in this framework. Complex and computationally intensive forward models arising in physical applications, however, can render a Bayesian approach prohibitive. This difficulty is compounded by high-dimensional model spaces, as when the unknown is a spatiotemporal field. We present new algorithmic developments for Bayesian inference in this context, showing strong connections with the forward propagation of uncertainty. In particular, we introduce a stochastic spectral formulation that dramatically accelerates the Bayesian solution of inverse problems via rapid evaluation of a surrogate posterior. We also explore dimensionality reduction for the inference of spatiotemporal fields, using truncated spectral representations of Gaussian process priors. These new approaches are demonstrated on scalar transport problems arising in contaminant source inversion and in the inference of inhomogeneous material or transport properties. We also present a Bayesian framework for parameter estimation in stochastic models, where intrinsic stochasticity may be intermingled with observational noise. Evaluation of a likelihood function may not be analytically tractable in these cases, and thus several alternative Markov chain Monte Carlo (MCMC) schemes, operating on the product space of the observations and the parameters, are introduced.
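The surrogate idea can be illustrated at toy scale: replace an "expensive" forward model with a cheap interpolant built once, then evaluate the posterior rapidly many times. Everything below (the sine forward model, grid sizes, noise level) is an invented stand-in for the paper's stochastic spectral surrogates:

```python
import math

def expensive_forward(m):
    """Stand-in for a costly forward model (e.g. a PDE solve)."""
    return math.sin(m)

# Build a cheap surrogate once: linear interpolation on a coarse grid.
grid = [i * 0.1 for i in range(-20, 21)]
table = [expensive_forward(g) for g in grid]

def surrogate(m):
    i = min(max(int((m + 2.0) / 0.1), 0), len(grid) - 2)
    t = (m - grid[i]) / 0.1
    return (1 - t) * table[i] + t * table[i + 1]

# Evaluate the surrogate posterior densely: prior N(0, 1), noise sd 0.1.
d, sigma = 0.5, 0.1
ms = [i * 0.01 for i in range(-200, 201)]
logp = [-0.5 * (d - surrogate(m)) ** 2 / sigma ** 2 - 0.5 * m * m for m in ms]
mx = max(logp)
w = [math.exp(l - mx) for l in logp]
Z = sum(w)
post_mean = sum(m * wi for m, wi in zip(ms, w)) / Z
```

The 41 forward-model evaluations are paid once; the 401 posterior evaluations (or, in the paper, every MCMC step) then cost almost nothing.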
Sana, Furrukh
2016-06-01
Subsurface reservoir flow channels are characterized by high-permeability values and serve as preferred pathways for fluid propagation. Accurate estimation of their geophysical structures is thus of great importance for the oil industry. The ensemble Kalman filter (EnKF) is a widely used statistical technique for estimating subsurface reservoir model parameters. However, accurate reconstruction of the subsurface geological features with the EnKF is challenging because of the limited measurements available from the wells and the smoothing effects imposed by the $\\ell_2$-norm nature of its update step. A new EnKF scheme based on sparse domain representation was introduced by Sana et al. (2015) to incorporate useful prior structural information in the estimation process for efficient recovery of subsurface channels. In this paper, we extend this work in two ways: 1) investigate the effects of incorporating time-lapse seismic data on the channel reconstruction; and 2) explore a Bayesian sparse reconstruction algorithm with the potential ability to reduce the computational requirements. Numerical results suggest that the performance of the new sparse Bayesian based EnKF scheme is enhanced with the availability of seismic measurements, leading to further improvement in the recovery of flow channels structures. The sparse Bayesian approach further provides a computationally efficient framework for enforcing a sparse solution, especially with the possibility of using high sparsity rates through the inclusion of seismic data.
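For reference, the EnKF analysis step whose smoothing behaviour the abstract criticizes looks like this in its simplest scalar, deterministic form (invented numbers, not a reservoir model; the standard stochastic variant would additionally perturb the observation for each member):

```python
# Scalar EnKF analysis step on a five-member forecast ensemble.
ensemble = [0.8, 1.0, 1.2, 1.4, 1.6]      # forecast ensemble of one parameter
y, obs_var = 2.0, 0.04                    # observation and its error variance

n = len(ensemble)
mean_f = sum(ensemble) / n
var_f = sum((x - mean_f) ** 2 for x in ensemble) / (n - 1)
K = var_f / (var_f + obs_var)             # Kalman gain from ensemble statistics
analysis = [x + K * (y - x) for x in ensemble]

mean_a = sum(analysis) / n
var_a = sum((x - mean_a) ** 2 for x in analysis) / (n - 1)
```

Each member is pulled linearly toward the observation, which both shifts the mean and contracts the spread by (1 − K); it is exactly this linear, least-squares character of the update that smears out sharp channel boundaries and motivates the sparse reformulation.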
From Silence to Speech: On the Writing and Reconstruction of History in My Son's Story
Institute of Scientific and Technical Information of China (English)
姜梦
2014-01-01
The history of Africa was mostly written by the colonial powers, and censorship during apartheid rendered African history writing even more aphasic. In My Son's Story, Gordimer departs from traditional narrative person and mode: she makes a young black man the main narrator and inserts into the third-person narration the free indirect speech of characters of both races, dismantling the hegemony of imperial history writing. Meanwhile, the black women represented by Aila and Baby overturn the patriarchal and colonized fate assigned to Third World women and write the history of African women. The dissolution of imperial History and the construction of black female subjectivity deconstruct white centrism while rewriting the history of South Africa.
Bayat, Sahar; Cuggia, Marc; Kessler, Michel; Briançon, Serge; Le Beux, Pierre; Frimat, Luc
2008-01-01
Evaluation of adult candidates for kidney transplantation diverges from one centre to another. Our purpose was to assess the suitability of Bayesian methods for describing the factors associated with registration on the waiting list in a French healthcare network. We have found no published paper using Bayesian methods in this domain. Eight hundred and nine patients starting renal replacement therapy were included in the analysis. The data were extracted from the information system of the healthcare network. We performed conventional statistical analysis and data mining analysis using mainly Bayesian networks. The Bayesian model showed that the probability of registration on the waiting list is associated with age, cardiovascular disease, diabetes, serum albumin level, respiratory disease, physical impairment, follow-up in the department performing transplantation and past history of malignancy. These results are similar to those of the conventional statistical method. The comparison between conventional analysis and data mining analysis showed us the contribution of the data mining method for sorting variables and having a global view of the variables' associations. Moreover, these approaches constitute an essential step toward a decisional information system for healthcare networks.
DEFF Research Database (Denmark)
In the 5 Questions book series, this volume presents a range of leading scholars in Intellectual History and the History of Ideas through their answers to a brief questionnaire. Respondents include Michael Friedman, Jacques le Goff, Hans Ulrich Gumbrecht, Jonathan Israel, Philip Pettit, John Pocock...
DEFF Research Database (Denmark)
Christiansen, Erik
A history of the Roman Empire from the legendary founding of Rome in 753 BCE to the accession of Heraclius in 610 CE.
Bayesian Inference in Polling Technique: 1992 Presidential Polls.
Satake, Eiki
1994-01-01
Explores the potential utility of Bayesian statistical methods in determining the predictability of multiple polls. Compares Bayesian techniques to the classical statistical method employed by pollsters. Considers these questions in the context of the 1992 presidential elections. (HB)
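For a single poll question, the Bayesian machinery reduces to conjugate beta-binomial updating, with each new poll folded in by treating the previous posterior as the prior. The poll numbers below are hypothetical, not the 1992 data:

```python
import math

def beta_posterior(a, b, successes, n):
    # conjugate update: Beta(a, b) prior + binomial poll data
    a_post, b_post = a + successes, b + (n - successes)
    mean = a_post / (a_post + b_post)
    var = a_post * b_post / ((a_post + b_post) ** 2 * (a_post + b_post + 1.0))
    return a_post, b_post, mean, math.sqrt(var)

# uniform prior, then a hypothetical poll: 520 of 1000 support the candidate
a1, b1, mean1, sd1 = beta_posterior(1.0, 1.0, 520, 1000)

# a second poll of 1000 (530 support) uses the first posterior as its prior
a2, b2, mean2, sd2 = beta_posterior(a1, b1, 530, 1000)
```

The sequential update is the Bayesian counterpart of pooling the polls, and the shrinking posterior standard deviation quantifies how predictability grows with each additional poll.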
Bayesian modeling of unknown diseases for biosurveillance.
Shen, Yanna; Cooper, Gregory F
2009-11-14
This paper investigates Bayesian modeling of unknown causes of events in the context of disease-outbreak detection. We introduce a Bayesian approach that models and detects both (1) known diseases (e.g., influenza and anthrax) by using informative prior probabilities and (2) unknown diseases (e.g., a new, highly contagious respiratory virus that has never been seen before) by using relatively non-informative prior probabilities. We report the results of simulation experiments which support that this modeling method can improve the detection of new disease outbreaks in a population. A key contribution of this paper is that it introduces a Bayesian approach for jointly modeling both known and unknown causes of events. Such modeling has broad applicability in medical informatics, where the space of known causes of outcomes of interest is seldom complete.
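The known-versus-unknown idea can be caricatured in a few lines: known causes get sharp, informative likelihoods, while the unknown cause gets a deliberately vague one, so posterior mass shifts to "unknown" only when no known disease fits the data well. All probabilities below are invented for illustration:

```python
def normalize(scores):
    total = sum(scores.values())
    return {k: v / total for k, v in scores.items()}

# prior probability that an outbreak signal is due to each cause (invented)
prior = {"influenza": 0.60, "anthrax": 0.05, "unknown": 0.35}

# likelihood of the observed symptom pattern under each cause; the
# "unknown" cause spreads its mass broadly over possible patterns, so
# its likelihood is never very high but also never vanishingly small
likelihood = {"influenza": 0.001, "anthrax": 0.002, "unknown": 0.05}

posterior = normalize({k: prior[k] * likelihood[k] for k in prior})
```

Here the symptom pattern is a poor fit to both known diseases, so the posterior concentrates on the unknown cause, which is precisely the detection signal for a novel outbreak.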
Learning Bayesian Networks from Correlated Data
Bae, Harold; Monti, Stefano; Montano, Monty; Steinberg, Martin H.; Perls, Thomas T.; Sebastiani, Paola
2016-05-01
Bayesian networks are probabilistic models that represent complex distributions in a modular way and have become very popular in many fields. There are many methods to build Bayesian networks from a random sample of independent and identically distributed observations. However, many observational studies are designed using some form of clustered sampling that introduces correlations between observations within the same cluster and ignoring this correlation typically inflates the rate of false positive associations. We describe a novel parameterization of Bayesian networks that uses random effects to model the correlation within sample units and can be used for structure and parameter learning from correlated data without inflating the Type I error rate. We compare different learning metrics using simulations and illustrate the method in two real examples: an analysis of genetic and non-genetic factors associated with human longevity from a family-based study, and an example of risk factors for complications of sickle cell anemia from a longitudinal study with repeated measures.
Bayesian Methods for Radiation Detection and Dosimetry
Groer, Peter G
2002-01-01
We performed work in three areas: radiation detection, and external and internal radiation dosimetry. In radiation detection we developed Bayesian techniques to estimate the net activity of high and low activity radioactive samples. These techniques have the advantage that the remaining uncertainty about the net activity is described by probability densities. Graphs of the densities show the uncertainty in pictorial form. Figure 1 below demonstrates this point. We applied stochastic processes to obtain Bayesian estimates of 222Rn-daughter products from observed counting rates. In external radiation dosimetry we studied and developed Bayesian methods to estimate radiation doses to an individual with radiation-induced chromosome aberrations. We analyzed chromosome aberrations after exposure to gammas and neutrons and developed a method for dose estimation after criticality accidents. The research in internal radiation dosimetry focused on parameter estimation for compartmental models from observed comp...
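A net-activity posterior of the kind described can be sketched with conjugate gamma posteriors for the gross and background counting rates; the counts, live times, and the Jeffreys-style prior shape below are illustrative assumptions, not the authors' exact formulation:

```python
import random

random.seed(2)

def rate_posterior_samples(counts, live_time, n=20000, prior_shape=0.5):
    # Gamma posterior for a Poisson counting rate under a Jeffreys-style prior
    return [random.gammavariate(counts + prior_shape, 1.0 / live_time)
            for _ in range(n)]

gross = rate_posterior_samples(58, 60.0)       # 58 counts in 60 s
background = rate_posterior_samples(30, 60.0)  # 30 counts in 60 s

# the net-rate posterior is the distribution of the rate difference
net = [g - b for g, b in zip(gross, background)]
net_mean = sum(net) / len(net)
prob_positive = sum(1 for x in net if x > 0.0) / len(net)
```

A histogram of `net` is exactly the kind of density graph the abstract refers to: it shows not just a point estimate of the net activity but the full remaining uncertainty, including the probability that the net activity is positive at all.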
Event generator tuning using Bayesian optimization
Ilten, Philip; Yang, Yunjie
2016-01-01
Monte Carlo event generators contain a large number of parameters that must be determined by comparing the output of the generator with experimental data. Generating enough events with a fixed set of parameter values to enable making such a comparison is extremely CPU intensive, which prohibits performing a simple brute-force grid-based tuning of the parameters. Bayesian optimization is a powerful method designed for such black-box tuning applications. In this article, we show that Monte Carlo event generator parameters can be accurately obtained using Bayesian optimization and minimal expert-level physics knowledge. A tune of the PYTHIA 8 event generator using $e^+e^-$ events, where 20 parameters are optimized, can be run on a modern laptop in just two days. Combining the Bayesian optimization approach with expert knowledge should enable producing better tunes in the future, by making it faster and easier to study discrepancies between Monte Carlo and experimental data.
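A minimal Bayesian-optimization loop of the kind described, with a pure-Python Gaussian-process surrogate and a lower-confidence-bound acquisition rule; the one-dimensional objective stands in for an expensive generator-versus-data comparison and is invented (a real tune like the 20-parameter PYTHIA case works the same way in higher dimension):

```python
import math
import random

random.seed(3)

def objective(x):
    # stands in for an expensive generator run compared with data
    return (x - 0.7) ** 2 + 0.05 * math.sin(12.0 * x)

def solve(A, b):
    # Gauss-Jordan elimination with partial pivoting (small systems only)
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(n):
            if r != c:
                f = M[r][c] / M[c][c]
                M[r] = [x - f * y for x, y in zip(M[r], M[c])]
    return [M[i][n] / M[i][i] for i in range(n)]

def gp_predict(xs, ys, x, ell=0.2, noise=1e-6):
    # zero-mean Gaussian-process regression with an RBF kernel
    k = lambda a, b: math.exp(-0.5 * ((a - b) / ell) ** 2)
    K = [[k(a, b) + (noise if i == j else 0.0)
          for j, b in enumerate(xs)] for i, a in enumerate(xs)]
    kx = [k(a, x) for a in xs]
    alpha = solve(K, ys)
    beta = solve(K, kx)
    mean = sum(w * y for w, y in zip(kx, alpha))
    var = max(1e-12, k(x, x) - sum(w * v for w, v in zip(kx, beta)))
    return mean, math.sqrt(var)

xs = [0.0, 0.5, 1.0]                     # initial design
ys = [objective(x) for x in xs]
grid = [i / 200.0 for i in range(201)]

for _ in range(10):                      # each iteration = one "expensive" run
    def lcb(x):                          # lower-confidence-bound acquisition
        m, s = gp_predict(xs, ys, x)
        return m - 2.0 * s
    x_next = min(grid, key=lcb)
    xs.append(x_next)
    ys.append(objective(x_next))

best_x = xs[ys.index(min(ys))]
```

The acquisition rule trades exploration (high surrogate uncertainty) against exploitation (low surrogate mean), which is what lets the method beat a brute-force grid when each objective evaluation is CPU-hours of event generation.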
Bayesian Inference Methods for Sparse Channel Estimation
DEFF Research Database (Denmark)
Pedersen, Niels Lovmand
2013-01-01
This thesis deals with sparse Bayesian learning (SBL) with application to radio channel estimation. As opposed to the classical approach for sparse signal representation, we focus on the problem of inferring complex signals. Our investigations within SBL constitute the basis for the development of Bayesian inference algorithms for sparse channel estimation. Sparse inference methods aim at finding the sparse representation of a signal given in some overcomplete dictionary of basis vectors. Within this context, one of our main contributions to the field of SBL is a hierarchical representation … analysis of the complex prior representation, where we show that the ability to induce sparse estimates of a given prior heavily depends on the inference method used and, interestingly, whether real or complex variables are inferred. We also show that the Bayesian estimators derived from the proposed …
Bayesian Fusion of Multi-Band Images
Wei, Qi; Tourneret, Jean-Yves
2013-01-01
In this paper, a Bayesian fusion technique for remotely sensed multi-band images is presented. The observed images are related to the high spectral and high spatial resolution image to be recovered through physical degradations, e.g., spatial and spectral blurring and/or subsampling defined by the sensor characteristics. The fusion problem is formulated within a Bayesian estimation framework. An appropriate prior distribution exploiting geometrical considerations is introduced. To compute the Bayesian estimator of the scene of interest from its posterior distribution, a Markov chain Monte Carlo algorithm is designed to generate samples asymptotically distributed according to the target distribution. To efficiently sample from this high-dimensional distribution, a Hamiltonian Monte Carlo step is introduced in the Gibbs sampling strategy. The efficiency of the proposed fusion method is evaluated with respect to several state-of-the-art fusion techniques. In particular, low spatial resolution hyperspectral and mult...
Distributed Bayesian Networks for User Modeling
DEFF Research Database (Denmark)
Tedesco, Roberto; Dolog, Peter; Nejdl, Wolfgang
2006-01-01
by such adaptive applications are often partial fragments of an overall user model. The fragments then have to be collected and merged into a global user profile. In this paper we investigate and present algorithms able to cope with distributed, fragmented user models – based on Bayesian Networks – in the context of Web-based eLearning platforms. The scenario we are tackling assumes learners who use several systems over time, which are able to create partial Bayesian Networks for user models based on the local system context. In particular, we focus on how to merge these partial user models. Our merge mechanism efficiently combines distributed learner models without the need to exchange the internal structure of local Bayesian networks, nor local evidence between the involved platforms.
Variational Bayesian Inference of Line Spectra
DEFF Research Database (Denmark)
Badiu, Mihai Alin; Hansen, Thomas Lundgaard; Fleury, Bernard Henri
2016-01-01
In this paper, we address the fundamental problem of line spectral estimation in a Bayesian framework. We target model order and parameter estimation via variational inference in a probabilistic model in which the frequencies are continuous-valued, i.e., not restricted to a grid; and the coefficients are governed by a Bernoulli-Gaussian prior model turning model order selection into binary sequence detection. Unlike earlier works which retain only point estimates of the frequencies, we undertake a more complete Bayesian treatment by estimating the posterior probability density functions (pdfs…
Hessian PDF reweighting meets the Bayesian methods
Paukkunen, Hannu
2014-01-01
We discuss the Hessian PDF reweighting - a technique intended to estimate the effects that new measurements have on a set of PDFs. The method stems straightforwardly from considering new data in a usual $\chi^2$-fit, and it also naturally incorporates non-zero values for the tolerance, $\Delta\chi^2>1$. In comparison to the contemporary Bayesian reweighting techniques, there is no need to generate large ensembles of PDF Monte Carlo replicas, and the observables need to be evaluated only with the central and the error sets of the original PDFs. In spite of the apparently rather different methodologies, we find that the Hessian and the Bayesian techniques are actually equivalent if the $\Delta\chi^2$ criterion is properly included in the Bayesian likelihood function, which is a simple exponential.
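The Bayesian reweighting being compared against assigns each Monte Carlo replica a likelihood weight and then computes observables as weighted averages. A sketch using one common exponential weight choice, with a tolerance parameter playing the role of $\Delta\chi^2$; the replica predictions and measurement are invented numbers:

```python
import math

def weight(chi2, tolerance=1.0):
    # simple exponential likelihood weight; tolerance acts as Delta chi^2
    return math.exp(-0.5 * chi2 / tolerance)

# predictions of one observable from five hypothetical PDF replicas
replica_preds = [0.9, 1.0, 1.1, 1.3, 1.5]
data, sigma = 1.05, 0.1                  # invented measurement

chi2s = [((p - data) / sigma) ** 2 for p in replica_preds]
raw = [weight(c) for c in chi2s]
total = sum(raw)
weights = [w / total for w in raw]

reweighted = sum(w * p for w, p in zip(weights, replica_preds))
```

Replicas that describe the new datum poorly are exponentially suppressed, pulling the reweighted prediction toward the measurement; raising the tolerance flattens the weights, mimicking a $\Delta\chi^2 > 1$ criterion.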
Bayesian analysis of MEG visual evoked responses
Energy Technology Data Exchange (ETDEWEB)
Schmidt, D.M.; George, J.S.; Wood, C.C.
1999-04-01
The authors developed a method for analyzing neural electromagnetic data that allows probabilistic inferences to be drawn about regions of activation. The method involves the generation of a large number of possible solutions which both fit the data and prior expectations about the nature of probable solutions made explicit by a Bayesian formalism. In addition, they have introduced a model for the current distributions that produce MEG and EEG data that allows extended regions of activity, and can easily incorporate prior information such as anatomical constraints from MRI. To evaluate the feasibility and utility of the Bayesian approach with actual data, they analyzed MEG data from a visual evoked response experiment. They compared Bayesian analyses of MEG responses to visual stimuli in the left and right visual fields, in order to examine the sensitivity of the method to detect known features of human visual cortex organization. They also examined the changing pattern of cortical activation as a function of time.
Bayesian Analysis of Perceived Eye Level
Orendorff, Elaine E.; Kalesinskas, Laurynas; Palumbo, Robert T.; Albert, Mark V.
2016-01-01
To accurately perceive the world, people must efficiently combine internal beliefs and external sensory cues. We introduce a Bayesian framework that explains the role of internal balance cues and visual stimuli on perceived eye level (PEL)—a self-reported measure of elevation angle. This framework provides a single, coherent model explaining a set of experimentally observed PEL over a range of experimental conditions. Further, it provides a parsimonious explanation for the additive effect of low fidelity cues as well as the averaging effect of high fidelity cues, as also found in other Bayesian cue combination psychophysical studies. Our model accurately estimates the PEL and explains the form of previous equations used in describing PEL behavior. Most importantly, the proposed Bayesian framework for PEL is more powerful than previous behavioral modeling; it permits behavioral estimation in a wider range of cue combination and perceptual studies than models previously reported. PMID:28018204
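The averaging effect for high-fidelity cues is the textbook Gaussian cue-combination result: the combined estimate is the precision-weighted mean of the cues, and its uncertainty is lower than that of any single cue. A sketch with invented elevation cues (not the paper's fitted parameters):

```python
def combine(cues):
    # cues: list of (mean, sigma); returns the precision-weighted fusion
    precisions = [1.0 / s ** 2 for _, s in cues]
    mean = sum(p * m for p, (m, _) in zip(precisions, cues)) / sum(precisions)
    sigma = (1.0 / sum(precisions)) ** 0.5
    return mean, sigma

# a reliable internal balance cue says eye level is at 0 degrees;
# a pitched visual scene (less reliable) suggests -10 degrees
pel, pel_sd = combine([(0.0, 2.0), (-10.0, 4.0)])
```

The fused perceived eye level sits between the cues, pulled toward the more reliable one, and the fused standard deviation is smaller than either cue's alone; this is the behavior the PEL framework generalizes across cue fidelities.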
Dynamic Bayesian Combination of Multiple Imperfect Classifiers
Simpson, Edwin; Psorakis, Ioannis; Smith, Arfon
2012-01-01
Classifier combination methods need to make best use of the outputs of multiple, imperfect classifiers to enable higher accuracy classifications. In many situations, such as when human decisions need to be combined, the base decisions can vary enormously in reliability. A Bayesian approach to such uncertain combination allows us to infer the differences in performance between individuals and to incorporate any available prior knowledge about their abilities when training data is sparse. In this paper we explore Bayesian classifier combination, using the computationally efficient framework of variational Bayesian inference. We apply the approach to real data from a large citizen science project, Galaxy Zoo Supernovae, and show that our method far outperforms other established approaches to imperfect decision combination. We go on to analyse the putative community structure of the decision makers, based on their inferred decision making strategies, and show that natural groupings are formed. Finally we present ...
Kent, A
1998-01-01
There are good motivations for considering some type of quantum histories formalism. Several possible formalisms are known, defined by different definitions of event and by different selection criteria for sets of histories. These formalisms have a natural interpretation, according to which nature somehow chooses one set of histories from among those allowed, and then randomly chooses to realise one history from that set; other interpretations are possible, but their scientific implications are essentially the same. The selection criteria proposed to date are reasonably natural, and certainly raise new questions. For example, the validity of ordering inferences which we normally take for granted --- such as that a particle in one region is necessarily in a larger region containing it --- depends on whether or not our history respects the criterion of ordered consistency, or merely consistency. However, the known selection criteria, including consistency and medium decoherence, are very weak. It is not possibl...
A Bayesian Super-Resolution Approach to Demosaicing of Blurred Images
Directory of Open Access Journals (Sweden)
Molina Rafael
2006-01-01
Full Text Available Most of the available digital color cameras use a single image sensor with a color filter array (CFA) in acquiring an image. In order to produce a visible color image, a demosaicing process must be applied, which produces undesirable artifacts. An additional problem appears when the observed color image is also blurred. This paper addresses the problem of deconvolving color images observed with a single charge-coupled device (CCD) from the super-resolution point of view. Utilizing the Bayesian paradigm, an estimate of the reconstructed image and the model parameters is generated. The proposed method is tested on real images.
A population-based Bayesian approach to the minimal model of glucose and insulin homeostasis
DEFF Research Database (Denmark)
Andersen, Kim Emil; Højbjerre, Malene
2005-01-01
for a whole population. Traditionally it has been analysed in a deterministic set-up with only error terms on the measurements. In this work we adopt a Bayesian graphical model to describe the coupled minimal model that accounts for both measurement and process variability, and the model is extended … ill-posed estimation problem, where the reconstruction most often has been done by non-linear least squares techniques separately for each entity. The minimal model was originally specified for a single individual and does not combine several individuals with the advantage of estimating the metabolic portrait …
Structure learning for Bayesian networks as models of biological networks.
Larjo, Antti; Shmulevich, Ilya; Lähdesmäki, Harri
2013-01-01
Bayesian networks are probabilistic graphical models suitable for modeling several kinds of biological systems. In many cases, the structure of a Bayesian network represents causal molecular mechanisms or statistical associations of the underlying system. Bayesian networks have been applied, for example, for inferring the structure of many biological networks from experimental data. We present some recent progress in learning the structure of static and dynamic Bayesian networks from data.
Breast reconstruction after mastectomy
Directory of Open Access Journals (Sweden)
Daniel eSchmauss
2016-01-01
Full Text Available Breast cancer is the leading cause of cancer death in women worldwide. Its surgical approach has become less and less mutilating in the last decades. However, the overall number of breast reconstructions has significantly increased lately. Nowadays breast reconstruction should be individualized as far as possible, taking into consideration first of all the oncological aspects of the tumor, neo-/adjuvant treatment and genetic predisposition, but also its timing (immediate versus delayed breast reconstruction), as well as the patient's condition and wishes. This article gives an overview of the various possibilities of breast reconstruction, including implant- and expander-based reconstruction, flap-based reconstruction (vascularized autologous tissue), the combination of implant and flap, reconstruction using non-vascularized autologous fat, as well as refinement surgery after breast reconstruction.
A Bayesian Analysis of Spectral ARMA Model
Directory of Open Access Journals (Sweden)
Manoel I. Silvestre Bezerra
2012-01-01
Full Text Available Bezerra et al. (2008) proposed a new method, based on Yule-Walker equations, to estimate the ARMA spectral model. In this paper, a Bayesian approach is developed for this model by using the noninformative prior proposed by Jeffreys (1967). The Bayesian computations are carried out via Markov chain Monte Carlo (MCMC) simulation, and characteristics of the marginal posterior distributions, such as the Bayes estimator and confidence intervals for the parameters of the ARMA model, are derived. Both methods are also compared with the traditional least squares and maximum likelihood approaches, and a numerical illustration with two examples of the ARMA model is presented to evaluate the performance of the procedures.
Length Scales in Bayesian Automatic Adaptive Quadrature
Directory of Open Access Journals (Sweden)
Adam Gh.
2016-01-01
Full Text Available Two conceptual developments in the Bayesian automatic adaptive quadrature approach to the numerical solution of one-dimensional Riemann integrals [Gh. Adam, S. Adam, Springer LNCS 7125, 1–16 (2012)] are reported. First, it is shown that the numerical quadrature which avoids overcomputing and minimizes the hidden floating point loss of precision asks for the consideration of three classes of integration domain lengths endowed with specific quadrature sums: microscopic (trapezoidal rule), mesoscopic (Simpson rule), and macroscopic (quadrature sums of high algebraic degrees of precision). Second, sensitive diagnostic tools for the Bayesian inference on macroscopic ranges, coming from the use of Clenshaw-Curtis quadrature, are derived.
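The three length classes and their quadrature sums can be sketched as a dispatch on domain length. The thresholds below are invented placeholders, not the paper's actual class boundaries, and a 5-point Gauss-Legendre rule stands in for the high-degree macroscopic quadrature sums:

```python
import math

def trapezoid(f, a, b):
    return 0.5 * (b - a) * (f(a) + f(b))

def simpson(f, a, b):
    return (b - a) / 6.0 * (f(a) + 4.0 * f(0.5 * (a + b)) + f(b))

# 5-point Gauss-Legendre nodes and weights on [-1, 1]
GL5 = [(-0.9061798459386640, 0.2369268850561891),
       (-0.5384693101056831, 0.4786286704993665),
       (0.0, 0.5688888888888889),
       (0.5384693101056831, 0.4786286704993665),
       (0.9061798459386640, 0.2369268850561891)]

def gauss5(f, a, b):
    c, h = 0.5 * (a + b), 0.5 * (b - a)
    return h * sum(w * f(c + h * x) for x, w in GL5)

def quad(f, a, b, micro=1e-4, meso=1e-2):
    # dispatch on domain length: microscopic / mesoscopic / macroscopic
    length = b - a
    if length <= micro:
        return trapezoid(f, a, b)
    if length <= meso:
        return simpson(f, a, b)
    return gauss5(f, a, b)

approx = quad(math.sin, 0.0, 1.0)        # macroscopic range -> Gauss rule
exact = 1.0 - math.cos(1.0)
```

The rationale is the one stated in the abstract: on tiny subranges a high-degree rule wastes function evaluations and loses precision to floating-point cancellation, so cheaper low-degree rules are preferable there.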
Bayesian estimation and tracking a practical guide
Haug, Anton J
2012-01-01
A practical approach to estimating and tracking dynamic systems in real-world applications Much of the literature on performing estimation for non-Gaussian systems is short on practical methodology, while Gaussian methods often lack a cohesive derivation. Bayesian Estimation and Tracking addresses the gap in the field on both accounts, providing readers with a comprehensive overview of methods for estimating both linear and nonlinear dynamic systems driven by Gaussian and non-Gaussian noise. Featuring a unified approach to Bayesian estimation and tracking, the book emphasizes the derivation
Bayesian long branch attraction bias and corrections.
Susko, Edward
2015-03-01
Previous work on the star-tree paradox has shown that Bayesian methods suffer from a long branch attraction bias. That work is extended to settings involving more taxa and partially resolved trees. The long branch attraction bias is confirmed to arise more broadly and an additional source of bias is found. A by-product of the analysis is methods that correct for biases toward particular topologies. The corrections can be easily calculated using existing Bayesian software. Posterior support for a set of two or more trees can thus be supplemented with corrected versions to cross-check or replace results. Simulations show the corrections to be highly effective.
A Bayesian Concept Learning Approach to Crowdsourcing
DEFF Research Database (Denmark)
Viappiani, Paolo Renato; Zilles, Sandra; Hamilton, Howard J.;
2011-01-01
We develop a Bayesian approach to concept learning for crowdsourcing applications. A probabilistic belief over possible concept definitions is maintained and updated according to (noisy) observations from experts, whose behaviors are modeled using discrete types. We propose recommendation techniques, inference methods, and query selection strategies to assist a user charged with choosing a configuration that satisfies some (partially known) concept. Our model is able to simultaneously learn the concept definition and the types of the experts. We evaluate our model with simulations, showing that our Bayesian strategies are effective even in large concept spaces with many uninformative experts.
Bayesian Optimisation Algorithm for Nurse Scheduling
Li, Jingpeng
2008-01-01
Our research has shown that schedules can be built mimicking a human scheduler by using a set of rules that involve domain knowledge. This chapter presents a Bayesian Optimization Algorithm (BOA) for the nurse scheduling problem that chooses suitable scheduling rules from a set for each nurse's assignment. Based on the idea of using probabilistic models, the BOA builds a Bayesian network for the set of promising solutions and samples this network to generate new candidate solutions. Computational results from 52 real data instances demonstrate the success of this approach. It is also suggested that the learning mechanism in the proposed algorithm may be suitable for other scheduling problems.
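The sample-model-refit loop underlying the BOA can be illustrated with a univariate estimation-of-distribution toy; a full BOA would learn a Bayesian network over the rule choices rather than independent per-shift probabilities. The shift/rule numbers and the cost function below are invented:

```python
import random

random.seed(4)

N_SHIFTS, N_RULES = 8, 3
HIDDEN_OPTIMUM = [0, 1, 2, 1, 0, 2, 1, 0]   # invented "ideal" rule per shift

def cost(assignment):
    # toy schedule cost: number of shifts using the wrong rule
    return sum(a != b for a, b in zip(assignment, HIDDEN_OPTIMUM))

# probs[i][r] = probability of choosing rule r for shift i
probs = [[1.0 / N_RULES] * N_RULES for _ in range(N_SHIFTS)]

for _ in range(30):
    # sample a population of candidate rule assignments from the model
    pop = [[random.choices(range(N_RULES), weights=p)[0] for p in probs]
           for _ in range(60)]
    pop.sort(key=cost)
    elite = pop[:15]
    # refit the probability model on the best quarter (with smoothing)
    for i in range(N_SHIFTS):
        counts = [sum(ind[i] == r for ind in elite) for r in range(N_RULES)]
        probs[i] = [(c + 0.1) / (len(elite) + 0.1 * N_RULES) for c in counts]

best = min(pop, key=cost)
```

The probability model progressively concentrates on the rule choices that produce good schedules; the BOA's Bayesian network plays the same role but can also capture dependencies between rule choices at different assignments.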
Maximum-entropy weak lens reconstruction improved methods and application to data
Marshall, P J; Gull, S F; Bridle, S L
2002-01-01
We develop the maximum-entropy weak shear mass reconstruction method presented in earlier papers by taking each background galaxy image shape as an independent estimator of the reduced shear field and incorporating an intrinsic smoothness into the reconstruction. The characteristic length scale of this smoothing is determined by Bayesian methods. Within this algorithm the uncertainties due to the intrinsic distribution of galaxy shapes are carried through to the final mass reconstruction, and the mass within arbitrarily shaped apertures can be calculated with corresponding uncertainties. We apply this method to two clusters taken from N-body simulations using mock observations corresponding to Keck LRIS and mosaiced HST WFPC2 fields. We demonstrate that the Bayesian choice of smoothing length is sensible and that masses within apertures (including one on a filamentary structure) are reliable. We apply the method to data taken on the cluster MS1054-03 using the Keck LRIS (Clowe et al. 2000) and HST (Hoekstra e...
Reoperative midface reconstruction.
Acero, Julio; García, Eloy
2011-02-01
Reoperative reconstruction of the midface is a challenging issue because of the complexity of this region and the severity of the aesthetic and functional sequelae related to the absence or failure of a primary reconstruction. The different situations that can lead to the indication of a reoperative reconstructive procedure after previous oncologic ablative procedures in the midface are reviewed. Surgical techniques, anatomic problems, and limitations affecting reoperative reconstruction in this region of the head and neck are discussed.
Bayesian Just-So Stories in Psychology and Neuroscience
Bowers, Jeffrey S.; Davis, Colin J.
2012-01-01
According to Bayesian theories in psychology and neuroscience, minds and brains are (near) optimal in solving a wide range of tasks. We challenge this view and argue that more traditional, non-Bayesian approaches are more promising. We make three main arguments. First, we show that the empirical evidence for Bayesian theories in psychology is weak…