Bayesian Source Separation and Localization
Knuth, K H
1998-01-01
The problem of mixed signals occurs in many different contexts, one of the most familiar being acoustics. The forward problem in acoustics consists of finding the sound pressure levels at various detectors resulting from sound signals emanating from active acoustic sources. The inverse problem consists of using the sound recorded by the detectors to separate the signals and recover the original source waveforms. In general, the inverse problem is unsolvable without additional information. This general problem is called source separation, and several techniques have been developed that utilize maximum entropy, minimum mutual information, and maximum likelihood. In previous work, it has been demonstrated that these techniques can be recast in a Bayesian framework. This paper demonstrates the power of the Bayesian approach, which provides a natural means for incorporating prior information into a source model. An algorithm is developed that utilizes information regarding both the statistics of the amplitudes...
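The forward/inverse structure described in this abstract can be sketched with a toy linear mixing model. This is an illustrative example only (the mixing matrix, signals, and sizes are invented, not taken from the paper); it shows why the inverse problem is trivial when the mixing is known and invertible, and hence why blind separation needs additional information when it is not:

```python
import numpy as np

# Hypothetical 2-source, 2-detector linear mixing sketch (not the
# paper's algorithm): the forward problem maps sources to detectors;
# the inverse problem is trivial only because A is known and invertible.
t = np.linspace(0, 1, 200)
s = np.vstack([np.sin(2 * np.pi * 5 * t),            # source 1: sinusoid
               np.sign(np.sin(2 * np.pi * 3 * t))])  # source 2: square wave
A = np.array([[1.0, 0.6],
              [0.4, 1.0]])          # assumed (invented) mixing matrix
x = A @ s                           # forward problem: detector recordings
s_hat = np.linalg.inv(A) @ x        # inverse problem, A known
print(np.allclose(s, s_hat))        # → True
```

When `A` is unknown, the same equation is underdetermined, which is exactly where the prior information of the Bayesian source model enters.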
Informed Source Separation: A Bayesian Tutorial
Knuth, Kevin H.
2005-01-01
Source separation problems are ubiquitous in the physical sciences; any situation where signals are superimposed calls for source separation to estimate the original signals. In this tutorial I will discuss the Bayesian approach to the source separation problem. This approach has a specific advantage in that it requires the designer to explicitly describe the signal model in addition to any other information or assumptions that go into the problem description. This leads naturally to the idea of informed source separation, where the algorithm design incorporates relevant information about the specific problem. This approach promises to enable researchers to design their own high-quality algorithms that are specifically tailored to the problem at hand.
Low Complexity Bayesian Single Channel Source Separation
DEFF Research Database (Denmark)
Beierholm, Thomas; Pedersen, Brian Dam; Winther, Ole
2004-01-01
… Simulations with separation of a male and a female speaker, using priors trained on the same speakers, show performance comparable to the blind separation approach of G.-J. Jang and T.-W. Lee (see NIPS, vol. 15, 2003), with an SNR improvement of 4.9 dB for both the male and the female speaker. Mixing coefficients...
Bayesian blind source separation for data with network structure.
Illner, Katrin; Fuchs, Christiane; Theis, Fabian J
2014-11-01
In biology, more and more information about the interactions in regulatory systems is becoming accessible, and this often provides prior knowledge for data interpretation. In this work we focus on multivariate signaling data, where the structure of the data is induced by a known regulatory network. To extract signals of interest we assume a blind source separation (BSS) model, and we capture the structure of the source signals in terms of a Bayesian network. To keep the parameter space small, we consider stationary signals, and we introduce the new algorithm emGrade, where model parameters and source signals are estimated using expectation maximization. For network data, we find improved estimation performance compared to other BSS algorithms, and the flexible Bayesian modeling enables us to deal with repeated and missing observation values. The main advantage of our method is the statistically interpretable likelihood: we can use model selection criteria to determine the (in general unknown) number of source signals or to decide between different given networks. In simulations we demonstrate the recovery of the source signals depending on the graph structure and the dimensionality of the data.
Convergent Bayesian formulations of blind source separation and electromagnetic source estimation
Knuth, Kevin H
2015-01-01
We consider two areas of research that have been developing in parallel over the last decade: blind source separation (BSS) and electromagnetic source estimation (ESE). BSS deals with the recovery of source signals when only mixtures of signals can be obtained from an array of detectors and the only prior knowledge consists of some information about the nature of the source signals. On the other hand, ESE utilizes knowledge of the electromagnetic forward problem to assign source signals to their respective generators, while information about the signals themselves is typically ignored. We demonstrate that these two techniques can be derived from the same starting point using the Bayesian formalism. This suggests a means by which new algorithms can be developed that utilize as much relevant information as possible. We also briefly mention some preliminary work that supports the value of integrating information used by these two techniques and review the kinds of information that may be useful in addressing the...
Bayesian Source Separation Applied to Identifying Complex Organic Molecules in Space
Knuth, Kevin H; Choinsky, Joshua; Maunu, Haley A; Carbon, Duane F
2014-01-01
Emission from a class of benzene-based molecules known as Polycyclic Aromatic Hydrocarbons (PAHs) dominates the infrared spectrum of star-forming regions. The observed emission appears to arise from the combined emission of numerous PAH species, each with its unique spectrum. Linear superposition of the PAH spectra identifies this problem as a source separation problem. It is, however, of a formidable class of source separation problems given that different PAH sources potentially number in the hundreds, even thousands, and there is only one measured spectral signal for a given astrophysical site. Fortunately, the source spectra of the PAHs are known, but the signal is also contaminated by other spectral sources. We describe our ongoing work in developing Bayesian source separation techniques relying on nested sampling in conjunction with an ON/OFF mechanism enabling simultaneous estimation of the probability that a particular PAH species is present and its contribution to the spectrum.
Lucka, Felix; Pursiainen, Sampsa; Burger, Martin; Wolters, Carsten H
2012-07-16
The estimation of the activity-related ion currents by measuring the induced electromagnetic fields at the head surface is a challenging and severely ill-posed inverse problem. This is especially true in the recovery of brain networks involving deep-lying sources by means of EEG/MEG recordings which is still a challenging task for any inverse method. Recently, hierarchical Bayesian modeling (HBM) emerged as a unifying framework for current density reconstruction (CDR) approaches comprising most established methods as well as offering promising new methods. Our work examines the performance of fully-Bayesian inference methods for HBM for source configurations consisting of few, focal sources when used with realistic, high-resolution finite element (FE) head models. The main foci of interest are the correct depth localization, a well-known source of systematic error of many CDR methods, and the separation of single sources in multiple-source scenarios. Both aspects are very important in the analysis of neurophysiological data and in clinical applications. For these tasks, HBM provides a promising framework and is able to improve upon established CDR methods such as minimum norm estimation (MNE) or sLORETA in many aspects. For challenging multiple-source scenarios where the established methods show crucial errors, promising results are attained. Additionally, we introduce Wasserstein distances as performance measures for the validation of inverse methods in complex source scenarios.
Single channel signal component separation using Bayesian estimation
Institute of Scientific and Technical Information of China (English)
Cai Quanwei; Wei Ping; Xiao Xianci
2007-01-01
A Bayesian estimation method to separate multicomponent signals from a single-channel observation is presented in this paper. By using basis-function projection, the component separation becomes a problem of estimating a limited set of parameters. A Bayesian model for estimating the parameters is then set up, and the reversible-jump MCMC (Markov chain Monte Carlo) algorithm is adopted to perform the Bayesian computation. The method can jointly estimate the parameters of each component and the number of components. Simulation results demonstrate that the method has a low SNR threshold and good performance.
A Bayesian method for microseismic source inversion
Pugh, D. J.; White, R. S.; Christie, P. A. F.
2016-08-01
Earthquake source inversion is highly dependent on location determination and velocity models. Uncertainties in both the model parameters and the observations need to be rigorously incorporated into an inversion approach. Here, we show a probabilistic Bayesian method that allows formal inclusion of the uncertainties in the moment tensor inversion. This method allows the combination of different sets of far-field observations, such as P-wave and S-wave polarities and amplitude ratios, into one inversion. Additional observations can be included by deriving a suitable likelihood function from the uncertainties. This inversion produces samples from the source posterior probability distribution, including a best-fitting solution for the source mechanism and associated probability. The inversion can be constrained to the double-couple space or allowed to explore the gamut of moment tensor solutions, allowing volumetric and other non-double-couple components. The posterior probability of the double-couple and full moment tensor source models can be evaluated from the Bayesian evidence, using samples from the likelihood distributions for the two source models, producing an estimate of whether or not a source is double-couple. Such an approach is ideally suited to microseismic studies where there are many sources of uncertainty and it is often difficult to produce reliability estimates of the source mechanism, although this can be true of many other cases. Using full-waveform synthetic seismograms, we also show the effects of noise, location, network distribution and velocity model uncertainty on the source probability density function. The noise has the largest effect on the results, especially as it can affect other parts of the event processing. This uncertainty can lead to erroneous non-double-couple source probability distributions, even when no other uncertainties exist. Although including amplitude ratios can improve the constraint on the source probability
Convolutive Blind Source Separation Methods
DEFF Research Database (Denmark)
Pedersen, Michael Syskind; Larsen, Jan; Kjems, Ulrik
2008-01-01
During the past decades, much attention has been given to the separation of mixed sources, in particular for the blind case where both the sources and the mixing process are unknown and only recordings of the mixtures are available. In several situations it is desirable to recover all sources fro...
Bayesian Separation of Non-Stationary Mixtures of Dependent Gaus
National Aeronautics and Space Administration — In this work, we propose a novel approach to perform Dependent Component Analysis (DCA). DCA can be thought as the separation of latent, dependent sources from their...
A Bayesian analysis of regularised source inversions in gravitational lensing
Suyu, S H; Hobson, M P; Marshall, P J
2006-01-01
Strong gravitational lens systems with extended sources are of special interest because they provide additional constraints on the models of the lens systems. To use a gravitational lens system for measuring the Hubble constant, one would need to determine the lens potential and the source intensity distribution simultaneously. A linear inversion method to reconstruct a pixellated source distribution of a given lens potential model was introduced by Warren and Dye. In the inversion process, a regularisation on the source intensity is often needed to ensure a successful inversion with a faithful resulting source. In this paper, we use Bayesian analysis to determine the optimal regularisation constant (strength of regularisation) of a given form of regularisation and to objectively choose the optimal form of regularisation given a selection of regularisations. We consider and compare quantitatively three different forms of regularisation previously described in the literature for source inversions in gravitatio...
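The evidence-based choice of regularisation strength described above has a simple analogue in a linear-Gaussian toy model. The sketch below is illustrative only (the operator, sizes, and noise level are invented, and this is ridge-style regularisation rather than the paper's lensing inversion): the regularisation constant is chosen by maximizing the Bayesian evidence (marginal likelihood) on a grid, then used for the regularised reconstruction.

```python
import numpy as np

rng = np.random.default_rng(5)
n_data, n_pix = 40, 20
B = rng.normal(size=(n_data, n_pix))      # illustrative linear (lensing-like) operator
s_true = rng.normal(0.0, 1.0, n_pix)      # "true" source intensities
sigma = 0.3                               # noise standard deviation
y = B @ s_true + rng.normal(0, sigma, n_data)

def log_evidence(lam):
    # Marginal likelihood of y under prior s ~ N(0, I / lam):
    # y ~ N(0, sigma^2 I + B B^T / lam), evaluated via logdet + quadratic form.
    C = sigma**2 * np.eye(n_data) + B @ B.T / lam
    _, logdet = np.linalg.slogdet(C)
    return -0.5 * (n_data * np.log(2 * np.pi) + logdet
                   + y @ np.linalg.solve(C, y))

lams = np.logspace(-3, 3, 61)
lam_best = lams[np.argmax([log_evidence(a) for a in lams])]

# MAP reconstruction at the evidence-optimal regularisation strength
s_map = np.linalg.solve(B.T @ B + lam_best * sigma**2 * np.eye(n_pix), B.T @ y)
print(lam_best)
```

Comparing `log_evidence` across different *forms* of regularisation (different prior covariances) is the grid analogue of the model comparison performed in the paper.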
Blind source separation dependent component analysis
Xiang, Yong; Yang, Zuyuan
2015-01-01
This book provides readers with a complete and self-contained treatment of dependent source separation, including the latest developments in the field. It gives an overview of blind source separation and presents three promising blind separation techniques that can tackle mutually correlated sources. The book then focuses on the non-negativity-based methods, the time-frequency-analysis-based methods, and the pre-coding-based methods, respectively.
Bayesian Source Attribution of Salmonellosis in South Australia.
Glass, K; Fearnley, E; Hocking, H; Raupach, J; Veitch, M; Ford, L; Kirk, M D
2016-03-01
Salmonellosis is a significant cause of foodborne gastroenteritis in Australia, and rates of illness have increased over recent years. We adopt a Bayesian source attribution model to estimate the contribution of different animal reservoirs to illness due to Salmonella spp. in South Australia between 2000 and 2010, together with 95% credible intervals (CrI). We excluded known travel associated cases and those of rare subtypes (fewer than 20 human cases or fewer than 10 isolates from included sources over the 11-year period), and the remaining 76% of cases were classified as sporadic or outbreak associated. Source-related parameters were included to allow for different handling and consumption practices. We attributed 35% (95% CrI: 20-49) of sporadic cases to chicken meat and 37% (95% CrI: 23-53) of sporadic cases to eggs. Of outbreak-related cases, 33% (95% CrI: 20-62) were attributed to chicken meat and 59% (95% CrI: 29-75) to eggs. A comparison of alternative model assumptions indicated that biases due to possible clustering of samples from sources had relatively minor effects on these estimates. Analysis of source-related parameters showed higher risk of illness from contaminated eggs than from contaminated chicken meat, suggesting that consumption and handling practices potentially play a bigger role in illness due to eggs, considering low Salmonella prevalence on eggs. Our results strengthen the evidence that eggs and chicken meat are important vehicles for salmonellosis in South Australia.
Fast Bayesian optimal experimental design for seismic source inversion
Long, Quan
2015-07-01
We develop a fast method for optimally designing experiments in the context of statistical seismic source inversion. In particular, we efficiently compute the optimal number and locations of the receivers or seismographs. The seismic source is modeled by a point moment tensor multiplied by a time-dependent function. The parameters include the source location, moment tensor components, and start time and frequency in the time function. The forward problem is modeled by elastodynamic wave equations. We show that the Hessian of the cost functional, which is usually defined as the square of the weighted L
Removal of micropollutants in source separated sanitation
Butkovskyi, A.
2015-01-01
Source separated sanitation is an innovative sanitation method designed for minimizing use of energy and clean drinking water, and maximizing reuse of water, organics and nutrients from waste water. This approach is based on separate collection and treatment of toilet wastewater (black water) and th
Albert, Carlo; Ulzega, Simone; Stoop, Ruedi
2016-04-01
Parameter inference is a fundamental problem in data-driven modeling. Given observed data that is believed to be a realization of some parameterized model, the aim is to find parameter values that are able to explain the observed data. In many situations, the dominant sources of uncertainty must be included into the model for making reliable predictions. This naturally leads to stochastic models. Stochastic models render parameter inference much harder, as the aim then is to find a distribution of likely parameter values. In Bayesian statistics, which is a consistent framework for data-driven learning, this so-called posterior distribution can be used to make probabilistic predictions. We propose a novel, exact, and very efficient approach for generating posterior parameter distributions for stochastic differential equation models calibrated to measured time series. The algorithm is inspired by reinterpreting the posterior distribution as a statistical mechanics partition function of an object akin to a polymer, where the measurements are mapped on heavier beads compared to those of the simulated data. To arrive at distribution samples, we employ a Hamiltonian Monte Carlo approach combined with a multiple time-scale integration. A separation of time scales naturally arises if either the number of measurement points or the number of simulation points becomes large. Furthermore, at least for one-dimensional problems, we can decouple the harmonic modes between measurement points and solve the fastest part of their dynamics analytically. Our approach is applicable to a wide range of inference problems and is highly parallelizable.
Transform domain steganography with blind source separation
Jouny, Ismail
2015-05-01
This paper applies blind source separation or independent component analysis for images that may contain mixtures of text, audio, or other images for steganography purposes. The paper focuses on separating mixtures in the transform domain such as Fourier domain or the Wavelet domain. The study addresses the effectiveness of steganography when using linear mixtures of multimedia components and the ability of standard blind sources separation techniques to discern hidden multimedia messages. Mixing in the space, frequency, and wavelet (scale) domains is compared. Effectiveness is measured using mean square error rate between original and recovered images.
Source number estimation and separation algorithms of underdetermined blind separation
Institute of Scientific and Technical Information of China (English)
YANG ZuYuan; TAN BeiHai; ZHOU GuoXu; ZHANG JinLong
2008-01-01
Recently, sparse component analysis (SCA) has become a hot spot in BSS research. Instead of independent component analysis (ICA), SCA can be used to solve underdetermined mixtures efficiently. The two-step approach (TSA) is one of the typical methods for solving SCA-based BSS problems: it estimates the mixing matrix before separating the sources. K-means clustering is often used to estimate the mixing matrix, but it relies strongly on prior knowledge of the number of sources, and estimating the source number is an obstacle. In this paper, a fuzzy clustering method is proposed to estimate the source number and the mixing matrix simultaneously. After that, the sources are recovered by the shortest path method (SPM). Simulations show the availability and robustness of the proposed method.
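The first step of the two-step idea (estimating the mixing matrix by clustering the observations) can be illustrated for very sparse sources. The sketch below is an invented toy case, and a plain k-means on observation directions stands in for the paper's fuzzy clustering: with one active source per sample, noiseless mixtures lie exactly along the mixing-matrix columns, so clustering their directions recovers those columns even in the underdetermined 2-mixture, 3-source setting.

```python
import numpy as np

rng = np.random.default_rng(1)
A = np.array([[1.0, 0.2, 0.7],
              [0.1, 1.0, 0.7]])          # invented 2x3 mixing matrix
A /= np.linalg.norm(A, axis=0)           # unit-norm mixing columns

n = 300
which = rng.integers(0, 3, n)            # one active source per sample (very sparse)
amps = rng.normal(0, 1, n)
X = A[:, which] * amps                   # noiseless underdetermined mixtures

D = X / np.linalg.norm(X, axis=0)        # observation directions
D = D * np.sign(D[0])                    # fold antipodal directions together

# Farthest-point initialization, then plain k-means on directions
centers = [D[:, 0]]
for _ in range(2):
    d2 = np.min([1 - c @ D for c in centers], axis=0)
    centers.append(D[:, np.argmax(d2)])
centers = np.stack(centers, axis=1)
for _ in range(20):
    labels = np.argmax(centers.T @ D, axis=0)
    for k in range(3):
        sel = D[:, labels == k]
        if sel.shape[1]:
            m = sel.mean(axis=1)
            centers[:, k] = m / np.linalg.norm(m)

# Every true mixing column should match some estimated center (up to sign)
err = [1 - np.max(np.abs(centers.T @ A[:, j])) for j in range(3)]
print(max(err) < 1e-2)
```

The source-recovery step (shortest path method) and the estimation of the source number itself are not shown here; with noise and less sparse sources the clusters blur, which is what motivates the fuzzy clustering of the paper.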
Blind source separation theory and applications
Yu, Xianchuan; Xu, Jindong
2013-01-01
A systematic exploration of both classic and contemporary algorithms in blind source separation, with practical case studies. The book presents an overview of blind source separation, a relatively new signal processing method. Due to the multidisciplinary nature of the subject, the book has been written to appeal to an audience from very different backgrounds. Basic mathematical skills (e.g. matrix algebra and foundations of probability theory) are essential in order to understand the algorithms, although the book is written in an introductory, accessible style. This book offers
Grading learning for blind source separation
Institute of Scientific and Technical Information of China (English)
Zhang, Xianda; Zhu, Xiaolong; Bao, Zheng
2003-01-01
By generalizing the learning rate parameter to a learning rate matrix, this paper proposes a grading learning algorithm for blind source separation. The whole learning process is divided into three stages: initial stage, capturing stage and tracking stage. In different stages, different learning rates are used for each output component, which is determined by its dependency on other output components. It is shown that the grading learning algorithm is equivariant and can keep the separating matrix from becoming singular. Simulations show that the proposed algorithm can achieve faster convergence, better steady-state performance and higher numerical robustness, as compared with the existing algorithms using fixed, time-descending and adaptive learning rates.
Blind source separation problem in GPS time series
Gualandi, A.; Serpelloni, E.; Belardinelli, M. E.
2016-04-01
A critical point in the analysis of ground displacement time series, such as those recorded by space geodetic techniques, is the development of data-driven methods that allow the different sources of deformation to be discerned and characterized in the space and time domains. Multivariate statistics includes several approaches that can be considered part of data-driven methods. A widely used technique is the principal component analysis (PCA), which allows us to reduce the dimensionality of the data space while keeping most of the variance of the dataset explained. However, PCA does not perform well in finding the solution to the so-called blind source separation (BSS) problem, i.e., in recovering and separating the original sources that generate the observed data. This is mainly due to the fact that PCA minimizes the misfit calculated using an L2 norm (χ2), looking for a new Euclidean space where the projected data are uncorrelated. The independent component analysis (ICA) is a popular technique adopted to approach the BSS problem. However, the independence condition is not easy to impose, and it is often necessary to introduce some approximations. To work around this problem, we test the use of a modified variational Bayesian ICA (vbICA) method to recover the multiple sources of ground deformation even in the presence of missing data. The vbICA method models the probability density function (pdf) of each source signal using a mixture of Gaussian distributions, allowing for more flexibility in the description of the pdf of the sources with respect to standard ICA, and giving a more reliable estimate of them. Here we present its application to synthetic global positioning system (GPS) position time series, generated by simulating deformation near an active fault, including inter-seismic, co-seismic, and post-seismic signals, plus seasonal signals and noise, and an additional time-dependent volcanic source. We evaluate the ability of the PCA and ICA decomposition
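As a minimal illustration of the ICA decomposition discussed here (not the vbICA method itself), the following sketch recovers two independent sources from two mixtures using hand-rolled FastICA-style fixed-point updates. The mixing matrix and source distributions are invented for the example:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 5000
S = np.vstack([rng.uniform(-1, 1, n),    # sub-Gaussian source
               rng.laplace(0, 1, n)])    # super-Gaussian source
S -= S.mean(axis=1, keepdims=True)
A = np.array([[1.0, 0.5],
              [0.6, 1.0]])               # invented mixing matrix
X = A @ S                                # observed mixtures

# Whitening: rotate/scale so the mixtures have identity covariance
Xc = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(Xc @ Xc.T / n)
Z = E @ np.diag(d ** -0.5) @ E.T @ Xc

# FastICA fixed-point updates (tanh contrast, symmetric decorrelation)
W = rng.normal(size=(2, 2))
for _ in range(100):
    G = np.tanh(W @ Z)
    W = G @ Z.T / n - np.diag((1 - G ** 2).mean(axis=1)) @ W
    U, _, Vt = np.linalg.svd(W)
    W = U @ Vt                           # re-orthogonalize
S_hat = W @ Z

# Each recovered component should correlate strongly with one source
C = np.abs(np.corrcoef(np.vstack([S, S_hat]))[:2, 2:])
print(C.max(axis=1))
```

Plain ICA of this kind assumes complete data and returns point estimates; the vbICA approach of the abstract instead places flexible mixture-of-Gaussian source models in a variational Bayesian framework, which also handles missing observations.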
Directory of Open Access Journals (Sweden)
Zhujie Chu
2016-02-01
Municipal household solid waste (MHSW) has become a serious problem in China over the course of the last two decades, resulting in significant side effects for the environment. Effective management of MHSW has therefore attracted wide attention from both researchers and practitioners. Separate collection, the first and crucial step in solving the MHSW problem, has however not been thoroughly studied to date. In this study, an empirical survey was conducted among 387 households in Harbin, China. We use a Bayesian belief network model to determine the factors influencing separate collection. Four types of factors are identified, namely political, economic, social-cultural, and technological, based on the PEST (political, economic, social and technological) analytical method. In addition, we analyze the influential power of the different factors, based on the network structure and probability changes obtained with the Netica software. Results indicate that the technological dimension has the greatest impact on MHSW separate collection, followed by the political and economic dimensions; the social-cultural dimension has the least impact.
Energy Technology Data Exchange (ETDEWEB)
Zhang, Le; Timbie, Peter T. [Department of Physics, University of Wisconsin, Madison, WI 53706 (United States); Bunn, Emory F. [Physics Department, University of Richmond, Richmond, VA 23173 (United States); Karakci, Ata; Korotkov, Andrei; Tucker, Gregory S. [Department of Physics, Brown University, 182 Hope Street, Providence, RI 02912 (United States); Sutter, P. M. [Center for Cosmology and Astro-Particle Physics, Ohio State University, Columbus, OH 43210 (United States); Wandelt, Benjamin D., E-mail: lzhang263@wisc.edu [Department of Physics, University of Illinois at Urbana-Champaign, 1110 W Green Street, Urbana, IL 61801 (United States)
2016-01-15
In this paper, we present a new Bayesian semi-blind approach for foreground removal in observations of the 21 cm signal measured by interferometers. The technique, which we call H I Expectation-Maximization Independent Component Analysis (HIEMICA), is an extension of the Independent Component Analysis technique developed for two-dimensional (2D) cosmic microwave background maps to three-dimensional (3D) 21 cm cosmological signals measured by interferometers. This technique provides a fully Bayesian inference of power spectra and maps and separates the foregrounds from the signal based on the diversity of their power spectra. Relying only on the statistical independence of the components, this approach can jointly estimate the 3D power spectrum of the 21 cm signal, as well as the 2D angular power spectrum and the frequency dependence of each foreground component, without any prior assumptions about the foregrounds. This approach has been tested extensively by applying it to mock data from interferometric 21 cm intensity mapping observations under idealized assumptions of instrumental effects. We also discuss the impact when the noise properties are not known completely. As a first step toward solving the 21 cm power spectrum analysis problem, we compare the semi-blind HIEMICA technique to the commonly used Principal Component Analysis. Under the same idealized circumstances, the proposed technique provides significantly improved recovery of the power spectrum. This technique can be applied in a straightforward manner to all 21 cm interferometric observations, including epoch of reionization measurements, and can be extended to single-dish observations as well.
Source detection in astronomical images by Bayesian model comparison
Frean, Marcus; Friedlander, Anna; Johnston-Hollitt, Melanie; Hollitt, Christopher
2014-12-01
The next generation of radio telescopes will generate exabytes of data on hundreds of millions of objects, making automated methods for the detection of astronomical objects ("sources") essential. Of particular importance are faint, diffuse objects embedded in noise. There is a pressing need for source finding software that identifies these sources, involves little manual tuning, yet is tractable to calculate. We first give a novel image discretisation method that incorporates uncertainty about how an image should be discretised. We then propose a hierarchical prior for astronomical images, which leads to a Bayes factor indicating how well a given region conforms to a model of source that is exceptionally unconstrained, compared to a model of background. This enables the efficient localisation of regions that are "suspiciously different" from the background distribution, so our method looks not for brightness but for anomalous distributions of intensity, which is much more general. The model of background can be iteratively improved by removing the influence on it of sources as they are discovered. The approach is evaluated by identifying sources in real and simulated data, and performs well on these measures: the Bayes factor is maximized at most real objects, while returning only a moderate number of false positives. In comparison to a catalogue constructed by widely-used source detection software with manual post-processing by an astronomer, our method found a number of dim sources that were missing from the "ground truth" catalogue.
Gradient Flow Convolutive Blind Source Separation
DEFF Research Database (Denmark)
Pedersen, Michael Syskind; Nielsen, Chinton Møller
2004-01-01
Experiments have shown that the performance of instantaneous gradient flow beamforming by Cauwenberghs et al. is reduced significantly in reverberant conditions. By expanding the gradient flow principle to convolutive mixtures, separation in a reverberant environment is possible. By use of a circ...
A Bayesian Approach for Localization of Acoustic Emission Source in Plate-Like Structures
Directory of Open Access Journals (Sweden)
Gang Yan
2015-01-01
This paper presents a Bayesian approach for localizing an acoustic emission (AE) source in plate-like structures, with consideration of uncertainties from modeling error and measurement noise. A PZT sensor network is deployed to monitor and acquire AE wave signals released by possible damage. Using the continuous wavelet transform (CWT), the time-of-flight (TOF) information of the AE wave signals is extracted and measured. With a theoretical TOF model, a Bayesian parameter identification procedure is developed to obtain the AE source location and the wave velocity at a specific frequency simultaneously, and meanwhile to quantify their uncertainties. Based on Bayes' theorem, the posterior distributions of the AE source location and the wave velocity are obtained by relating their priors to the likelihood of the measured time-difference data. A Markov chain Monte Carlo (MCMC) algorithm is employed to draw samples approximating the posteriors. A data fusion scheme is also performed to fuse results identified at multiple frequencies, increasing the accuracy and reducing the uncertainty of the final localization results. Experimental studies on a stiffened aluminum panel with AE events simulated by pencil lead breaks (PLBs) are conducted to validate the proposed Bayesian AE source localization approach.
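The Bayesian TOF-based localization idea can be sketched with a toy random-walk Metropolis sampler. Everything below is illustrative: the sensor layout, noise level, and known wave speed are invented assumptions (the paper additionally estimates the wave velocity and fuses results across frequencies).

```python
import numpy as np

rng = np.random.default_rng(3)
sensors = np.array([[0., 0.], [1., 0.], [1., 1.], [0., 1.]])  # corners of a unit plate
src_true = np.array([0.62, 0.38])        # hypothetical AE source location
v = 1.0                                  # wave speed, assumed known here
sigma = 0.005                            # TOF-difference noise std

def tdoa(p):
    # Time differences of arrival relative to sensor 0
    d = np.linalg.norm(sensors - p, axis=1)
    return (d[1:] - d[0]) / v

obs = tdoa(src_true) + rng.normal(0, sigma, 3)   # simulated measurements

def log_post(p):
    if np.any(p < 0) or np.any(p > 1):   # uniform prior on the plate
        return -np.inf
    r = obs - tdoa(p)
    return -0.5 * np.sum(r ** 2) / sigma ** 2    # Gaussian likelihood

# Random-walk Metropolis over the two source coordinates
p = np.array([0.5, 0.5])
lp = log_post(p)
samples = []
for i in range(30000):
    q = p + rng.normal(0, 0.05, 2)
    lq = log_post(q)
    if np.log(rng.uniform()) < lq - lp:
        p, lp = q, lq
    if i >= 5000:                        # discard burn-in
        samples.append(p)
samples = np.array(samples)
print(samples.mean(axis=0), samples.std(axis=0))
```

The posterior mean gives the location estimate and the posterior spread quantifies its uncertainty, which is the core benefit of the Bayesian formulation over a single triangulated point.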
Inference of emission rates from multiple sources using Bayesian probability theory.
Yee, Eugene; Flesch, Thomas K
2010-03-01
The determination of atmospheric emission rates from multiple sources using inversion (regularized least-squares or best-fit technique) is known to be very susceptible to measurement and model errors in the problem, rendering the solution unusable. In this paper, a new perspective is offered for this problem: namely, it is argued that the problem should be addressed as one of inference rather than inversion. Towards this objective, Bayesian probability theory is used to estimate the emission rates from multiple sources. The posterior probability distribution for the emission rates is derived, accounting fully for the measurement errors in the concentration data and the model errors in the dispersion model used to interpret the data. The Bayesian inferential methodology for emission rate recovery is validated against real dispersion data, obtained from a field experiment involving various source-sensor geometries (scenarios) consisting of four synthetic area sources and eight concentration sensors. The recovery of discrete emission rates from three different scenarios obtained using Bayesian inference and singular value decomposition inversion are compared and contrasted.
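For a linear dispersion model with Gaussian measurement noise and a Gaussian prior, the posterior over emission rates described above is available in closed form. The sketch below is a minimal illustration with an invented dispersion matrix and rates, not the validated field setup of the paper:

```python
import numpy as np

rng = np.random.default_rng(4)
n_det, n_src = 8, 4
A = rng.uniform(0.1, 1.0, (n_det, n_src))   # illustrative dispersion matrix:
                                            # A[i, j] maps rate j to sensor i
q_true = np.array([2.0, 0.5, 1.5, 1.0])     # "true" emission rates
sigma = 0.05                                # concentration noise std
c = A @ q_true + rng.normal(0, sigma, n_det)  # simulated concentrations

# Gaussian prior q ~ N(0, tau^2 I) gives a Gaussian posterior in closed form
tau = 10.0
P = np.linalg.inv(A.T @ A / sigma**2 + np.eye(n_src) / tau**2)  # posterior cov
q_mean = P @ A.T @ c / sigma**2                                  # posterior mean
q_std = np.sqrt(np.diag(P))                 # per-source uncertainty
print(q_mean, q_std)
```

Unlike a regularized least-squares inversion, the posterior covariance `P` directly expresses how well each source's rate is constrained by the sensor geometry, which is the inference-versus-inversion point the abstract makes.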
Gualandi, Adriano; Serpelloni, Enrico; Elina Belardinelli, Maria; Bonafede, Maurizio; Pezzo, Giuseppe; Tolomei, Cristiano
2015-04-01
A critical point in the analysis of ground displacement time series, such as those measured by modern space geodetic techniques (primarily continuous GPS/GNSS and InSAR), is the development of data-driven methods that allow the different sources generating the observed displacements to be discerned and characterized. A widely used multivariate statistical technique is the Principal Component Analysis (PCA), which allows the dimensionality of the data space to be reduced while maintaining most of the variance of the dataset explained. It reproduces the original data using a limited number of Principal Components, but it also shows some deficiencies, since PCA does not perform well in finding the solution to the so-called Blind Source Separation (BSS) problem. Recovering and separating the different sources that generate the observed ground deformation is a fundamental task in order to provide a physical meaning to the possible different sources. PCA fails in the BSS problem since it looks for a new Euclidean space where the projected data are uncorrelated. Usually, the uncorrelation condition is not strong enough, and it has been proven that the BSS problem can be tackled by imposing independence on the components. The Independent Component Analysis (ICA) is, in fact, another popular technique adopted to approach this problem, and it can be used in all those fields where PCA is also applied. An ICA approach enables us to explain the displacement time series while imposing fewer constraints on the model, and to reveal anomalies in the data such as transient deformation signals. However, the independence condition is not easy to impose, and it is often necessary to introduce some approximations. To work around this problem, we use a variational Bayesian ICA (vbICA) method, which models the probability density function (pdf) of each source signal using a mixture of Gaussian distributions. This technique allows for more flexibility in the description of the pdf of the sources
Hierarchical Bayesian Model for Simultaneous EEG Source and Forward Model Reconstruction (SOFOMORE)
DEFF Research Database (Denmark)
Stahlhut, Carsten; Mørup, Morten; Winther, Ole;
2009-01-01
In this paper we propose an approach to handle forward model uncertainty for EEG source reconstruction. A stochastic forward model is motivated by the many uncertain contributions that form the forward propagation model, including the tissue conductivity distribution, the cortical surface, and electrode positions. We first present a hierarchical Bayesian framework for EEG source localization that jointly performs source and forward model reconstruction (SOFOMORE). Secondly, we evaluate the SOFOMORE model by comparison with source reconstruction methods that use fixed forward models. Simulated and real EEG data demonstrate that invoking a stochastic forward model leads to improved source estimates.
Block-sparse beamforming for spatially extended sources in a Bayesian formulation
DEFF Research Database (Denmark)
Xenaki, Angeliki; Fernandez Grande, Efren; Gerstoft, Peter
2016-01-01
Direction-of-arrival (DOA) estimation refers to the localization of sound sources on an angular grid from noisy measurements of the associated wavefield with an array of sensors. For accurate localization, the number of angular look-directions is much larger than the number of sensors, hence the problem is underdetermined. Sparse reconstruction methods offer high-resolution imaging of point sources, but cannot capture spatially extended sources. The DOA estimation problem is formulated in a Bayesian framework where regularization is imposed through prior information on the source spatial distribution, which is then reconstructed as the maximum a posteriori estimate. A composite prior is introduced, which simultaneously promotes a piecewise constant profile and sparsity in the solution. Simulations and experimental measurements show that this choice of regularization provides high-resolution DOA estimation in a general framework, i.e., in the presence of spatially extended sources.
Blind separation of incoherent and spatially disjoint sound sources
Dong, Bin; Antoni, Jérôme; Pereira, Antonio; Kellermann, Walter
2016-11-01
Blind separation of sound sources aims at reconstructing the individual sources which contribute to the overall radiation of an acoustical field. The challenge is to reach this goal using distant measurements when all sources are operating concurrently. The working assumption is usually that the sources of interest are incoherent - i.e. statistically orthogonal - so that their separation can be approached by decorrelating a set of simultaneous measurements, which amounts to diagonalizing the cross-spectral matrix. Principal Component Analysis (PCA) is traditionally used to this end. This paper reports two new findings in this context. First, a sufficient condition is established under which the "virtual" sources returned by PCA coincide with the true sources; it stipulates that the sources of interest should be not only incoherent but also spatially orthogonal. This condition is met in particular by spatially disjoint sources - i.e. sources with non-overlapping support sets. Second, based on this finding, a criterion that enforces both statistical and spatial orthogonality is proposed to blindly separate incoherent sound sources which radiate from disjoint domains. This criterion can be easily incorporated into acoustic imaging algorithms such as beamforming or acoustical holography to identify sound sources of different origins. The proposed methodology is validated by laboratory experiments. In particular, the separation of aeroacoustic sources is demonstrated in a wind tunnel.
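The decorrelation step the abstract describes (diagonalizing the zero-lag covariance / cross-spectral matrix) can be sketched as follows; this is a generic illustration, not the paper's criterion, and the mixing matrix below is invented:

```python
import numpy as np

def decorrelate(Y):
    """Y: (n_sensors x n_samples). Return mutually uncorrelated components
    by diagonalizing the sample covariance matrix (the PCA rotation)."""
    Yc = Y - Y.mean(axis=1, keepdims=True)
    C = Yc @ Yc.T / Yc.shape[1]      # sample covariance (cross-spectrum at lag 0)
    w, V = np.linalg.eigh(C)         # orthonormal eigenvectors of the covariance
    return V.T @ Yc                  # rotated, decorrelated "virtual" sources

rng = np.random.default_rng(1)
S = rng.normal(size=(2, 5000))           # two incoherent sources
A = np.array([[1.0, 0.5], [0.2, 1.0]])   # illustrative mixing matrix
Z = decorrelate(A @ S)
C_z = np.cov(Z)                          # off-diagonal terms are ~0
```

As the paper's first finding stresses, the decorrelated components coincide with the true sources only under the additional spatial-orthogonality condition.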
Contaminant source reconstruction by empirical Bayes and Akaike's Bayesian Information Criterion.
Zanini, Andrea; Woodbury, Allan D
2016-01-01
The objective of this paper is to present an empirical Bayesian method, combined with Akaike's Bayesian Information Criterion (ABIC), to estimate the contaminant release history of a groundwater source from a few concentration measurements in space and/or in time. From the Bayesian point of view, the ABIC considers prior information on the unknown function, such as the prior distribution (assumed Gaussian) and the covariance function. The unknown statistical quantities, such as the noise variance and the covariance function parameters, are computed within the procedure; moreover, the method also quantifies the estimation error through confidence intervals. The methodology was successfully tested on three cases: the classic Skaggs and Kabala release function, three sharp releases (both cases concern transport in a one-dimensional homogeneous medium), and data collected from laboratory equipment consisting of a two-dimensional homogeneous unconfined aquifer. The performance of the method was assessed with two different covariance functions (Gaussian and exponential) and with large measurement errors. The results are discussed and compared to the geostatistical approach of Kitanidis (1995).
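The linear-Gaussian machinery underlying this kind of release-history estimation can be sketched as below: data d = H s + e, a zero-mean Gaussian prior on the release history s with a Gaussian or exponential covariance, and the standard posterior-mean update. This is only a minimal sketch of the principle; the kernel widths, transfer matrix, and noise level are invented, and the paper's empirical Bayes/ABIC machinery for estimating the hyperparameters is not shown.

```python
import numpy as np

def gaussian_cov(t, sigma2, ell):
    r = np.abs(t[:, None] - t[None, :])
    return sigma2 * np.exp(-(r / ell) ** 2)

def exponential_cov(t, sigma2, ell):
    r = np.abs(t[:, None] - t[None, :])
    return sigma2 * np.exp(-r / ell)

def posterior_mean(H, d, Q, noise_var):
    """Posterior mean of s for d = H s + e, s ~ N(0, Q), e ~ N(0, noise_var I)."""
    R = noise_var * np.eye(len(d))
    return Q @ H.T @ np.linalg.solve(H @ Q @ H.T + R, d)

# Toy release history sampled at 50 times, observed through a smoothing kernel.
t = np.linspace(0, 1, 50)
s_true = np.exp(-((t - 0.4) / 0.08) ** 2)          # one smooth release pulse
H = np.exp(-((t[::5][:, None] - t[None, :]) / 0.1) ** 2)
H /= H.sum(1, keepdims=True)                       # 10 averaged observations
d = H @ s_true
Q = gaussian_cov(t, 1.0, 0.1)
s_hat = posterior_mean(H, d, Q, 1e-6)
```

With a large noise variance the estimate shrinks toward the prior mean (zero), which is the behavior the confidence intervals in the paper quantify.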
Energy Technology Data Exchange (ETDEWEB)
Miller, Erin A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Robinson, Sean M. [Pacific Northwest National Lab. (PNNL), Seattle, WA (United States); Anderson, Kevin K. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); McCall, Jonathon D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Prinke, Amanda M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Webster, Jennifer B. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Seifert, Carolyn E. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)
2015-01-19
Here we present a novel technique for the localization of radiological sources in urban or rural environments from an aerial platform. The technique is based on a Bayesian approach to localization, in which measured count rates in a time series are compared with predicted count rates from a series of pre-calculated test sources to define likelihood. Furthermore, this technique is expanded by using a localized treatment with a limited field of view (FOV), coupled with a likelihood ratio reevaluation, allowing for real-time computation on commodity hardware for arbitrarily complex detector models and terrain. In particular, detectors with inherent asymmetry of response (such as those employing internal collimation or self-shielding for enhanced directional awareness) are leveraged by this approach to provide improved localization. Our results from the localization technique are shown for simulated flight data using monolithic as well as directionally-aware detector models, and the capability of the methodology to locate radioisotopes is estimated for several test cases. This localization technique is shown to facilitate urban search by allowing quick and adaptive estimates of source location, in many cases from a single flyover near a source. In particular, this method represents a significant advancement from earlier methods like full-field Bayesian likelihood, which is not generally fast enough to allow for broad-field search in real time, and highest-net-counts estimation, which has a localization error that depends strongly on flight path and cannot generally operate without exhaustive search.
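The core Bayesian comparison described here - scoring a measured count-rate time series against pre-calculated expected rates for candidate source positions - can be sketched with a Poisson likelihood over a toy grid. The grid, cell names, and count values are invented for illustration:

```python
import math

def poisson_loglik(measured, expected):
    """Poisson log-likelihood of integer counts given expected rates."""
    return sum(-lam + k * math.log(lam) - math.lgamma(k + 1)
               for k, lam in zip(measured, expected))

def localize(measured, predictions):
    """predictions: {cell: expected count series}. Returns the MAP cell
    (flat prior over cells) and the per-cell log-likelihoods."""
    logL = {cell: poisson_loglik(measured, lam)
            for cell, lam in predictions.items()}
    return max(logL, key=logL.get), logL

# Toy grid: three candidate cells; the detector flies past cell "B".
predictions = {
    "A": [2.0, 3.0, 4.0, 3.0, 2.0],
    "B": [2.0, 8.0, 20.0, 8.0, 2.0],
    "C": [2.0, 2.0, 2.0, 2.0, 2.0],
}
measured = [3, 9, 18, 7, 2]   # counts consistent with a source at B
best, logL = localize(measured, predictions)
```

The limited-FOV treatment and likelihood-ratio reevaluation in the paper restrict and re-rank exactly this kind of per-cell score.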
Novel blind source separation algorithm using Gaussian mixture density function
Institute of Scientific and Technical Information of China (English)
孔薇; 杨杰; 周越
2004-01-01
Blind source separation (BSS) is an important task for numerous applications in signal processing, communications and array processing, but for many complex sources blind separation algorithms are not efficient because the probability distributions of the sources cannot be estimated accurately. In this paper, to justify the ME (maximum entropy) approach, the relation between ME and MMI (minimum mutual information) is elucidated first. Then a novel algorithm that uses a Gaussian mixture density to approximate the probability distributions of the sources is presented, based on the ME approach. An experiment on the BSS of ship-radiated noise demonstrates that the proposed algorithm is valid and efficient.
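A minimal sketch of the density-modelling idea: approximate a source pdf by a Gaussian mixture and evaluate the score function psi(x) = -d/dx log p(x), which is the quantity an ME/MMI-style separation update uses in place of a fixed nonlinearity. The mixture parameters are invented, and a numerical derivative stands in for the closed-form score:

```python
import math

def gmm_pdf(x, weights, means, sigmas):
    """Density of a 1-D Gaussian mixture at x."""
    return sum(w * math.exp(-0.5 * ((x - m) / s) ** 2)
               / (s * math.sqrt(2 * math.pi))
               for w, m, s in zip(weights, means, sigmas))

def gmm_score(x, weights, means, sigmas, h=1e-5):
    """psi(x) = -d/dx log p(x), by central difference (adequate for a sketch)."""
    lp = lambda y: math.log(gmm_pdf(y, weights, means, sigmas))
    return -(lp(x + h) - lp(x - h)) / (2 * h)

# Sanity check: a single zero-mean component reduces to the Gaussian
# score x / sigma^2.
score = gmm_score(1.3, [1.0], [0.0], [2.0])
```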
Single channel blind source separation based on ICA feature extraction
Institute of Scientific and Technical Information of China (English)
[Anonymous]
2007-01-01
A new technique is proposed to solve the blind source separation (BSS) problem given only a single-channel observation. The basis functions and the density of the coefficients of the source signals learned by ICA are used as prior knowledge. Based on the learned priors, learning rules for single-channel BSS are derived by maximizing the joint log-likelihood of the mixed sources, in which the posterior density of the given measurements is maximized, to recover the source signals from a single observation. The experimental results show successful separation of mixtures of speech and music signals.
Precise Measurement of Separation Between Two Spherical Source Masses
Institute of Scientific and Technical Information of China (English)
陈德才; 罗俊; 胡忠坤; 赵亮
2004-01-01
A driving gauge method is used to determine the separation between two spherical source masses in the measurement of the Newtonian gravitational constant G. The experimental result shows that the uncertainty in determining the separation is about 0.35 μm, which would contribute an uncertainty of 7.3 ppm to the value of G.
Blind source separation based on generalized gaussian model
Institute of Scientific and Technical Information of China (English)
YANG Bin; KONG Wei; ZHOU Yue
2007-01-01
In most blind source separation (BSS) algorithms, the estimated probability density functions (pdfs) of the sources are fixed, or can only switch between one super-Gaussian and one sub-Gaussian model, so they may not separate sources with different distributions efficiently. To solve the problems of pdf mismatch and the separation of hybrid mixtures in BSS, the generalized Gaussian model (GGM) is introduced to model the pdfs of the sources, since it provides a general structure for univariate distributions. Its great advantage is that only one parameter needs to be determined to model the pdf of each source, so it is less complex than a Gaussian mixture model. By using a maximum likelihood (ML) approach, the convergence of the proposed algorithm is improved. Computer simulations show that it is more efficient and valid than conventional methods with fixed pdf estimation.
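The generalized Gaussian density itself is easy to write down; its single shape parameter beta spans sub-Gaussian (beta > 2), Gaussian (beta = 2) and super-Gaussian (beta < 2) sources, which is what the abstract exploits. A minimal sketch (parameter values invented):

```python
import math

def ggm_pdf(x, alpha, beta):
    """Generalized Gaussian density with scale alpha and shape beta."""
    norm = beta / (2.0 * alpha * math.gamma(1.0 / beta))
    return norm * math.exp(-(abs(x) / alpha) ** beta)

def gauss_pdf(x, sigma):
    return math.exp(-0.5 * (x / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

# beta = 2 with alpha = sigma * sqrt(2) recovers the Gaussian density;
# beta = 1 gives the (super-Gaussian) Laplacian 0.5 * exp(-|x|).
val = ggm_pdf(0.7, math.sqrt(2.0), 2.0)   # sigma = 1 case
```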
Application of evidence theory in information fusion of multiple sources in bayesian analysis
Institute of Scientific and Technical Information of China (English)
周忠宝; 蒋平; 武小悦
2004-01-01
How to obtain a proper prior distribution is one of the most critical problems in Bayesian analysis. In many practical cases, the prior information comes from different sources, and the form of the prior distribution may be known in some way while its parameters are hard to determine. In this paper, based on evidence theory, a new method is presented to fuse the information from multiple sources and determine the parameters of the prior distribution when its form is known. The prior distributions resulting from the information of the individual sources are converted into corresponding mass functions, which are combined by the Dempster-Shafer (D-S) method; from the combined mass function, representative points of the prior distribution are obtained. These points are fitted to the given distribution form to determine the parameters of the prior distribution, yielding the fused prior distribution, with which Bayesian analysis can be performed. How to convert the prior distributions into mass functions properly and how to obtain the representative points of the fused prior distribution are the central questions we address in this paper. A simulation example shows that the proposed method is effective.
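Dempster's rule of combination, the D-S step the abstract relies on, can be sketched with focal elements represented as frozensets over a small frame of discernment. The frame and mass values here are invented for illustration:

```python
def combine(m1, m2):
    """Dempster's rule for two mass functions given as {frozenset: mass}."""
    fused, conflict = {}, 0.0
    for a, wa in m1.items():
        for b, wb in m2.items():
            inter = a & b
            if inter:
                fused[inter] = fused.get(inter, 0.0) + wa * wb
            else:
                conflict += wa * wb          # mass on empty intersections
    if conflict >= 1.0:
        raise ValueError("total conflict; Dempster's rule undefined")
    # renormalize by the non-conflicting mass
    return {k: v / (1.0 - conflict) for k, v in fused.items()}

theta = frozenset({"A", "B"})                # frame of discernment
m1 = {frozenset({"A"}): 0.5, frozenset({"B"}): 0.3, theta: 0.2}
m2 = {frozenset({"A"}): 0.6, frozenset({"B"}): 0.2, theta: 0.2}
m = combine(m1, m2)
```

Here the conflict is 0.5*0.2 + 0.3*0.6 = 0.28, so the combined mass on {A} is (0.30 + 0.10 + 0.12) / 0.72.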
On merging rainfall data from diverse sources using a Bayesian approach
Bhattacharya, Biswa; Tarekegn, Tegegne
2014-05-01
Numerous studies have presented comparisons of satellite rainfall products, such as those from the Tropical Rainfall Measuring Mission (TRMM), with rain gauge data and have concluded, in general, that the two sources of data are comparable at suitable space and time scales. The comparison is not a straightforward one, as they employ different measurement techniques and depend on very different space-time scales of measurement. The number of available gauges in a catchment also influences the comparability and thus adds to the complexity. TRMM rainfall data have also been used directly in hydrological modelling; as the space-time scale reduces, so does the accuracy of these models. It seems that combining the two (or more) sources of rainfall data can enormously benefit hydrological studies. The various rainfall data, due to the differences in their space-time structure, contain information about the spatio-temporal distribution of rainfall that is not available from any single source. In order to harness this benefit, we have developed a method of merging these two (or more) rainfall products under the framework of the Bayesian Data Fusion (BDF) principle, by which the rainfall data from the various sources can be combined into a single time series. The usefulness of the approach has been explored in a case study on the Lake Tana Basin of the Upper Blue Nile Basin in Ethiopia. A 'leave one rain gauge out' cross-validation technique was employed to evaluate the accuracy of the rainfall time series against rainfall interpolated from rain gauge data using Inverse Distance Weighting (referred to as IDW), TRMM, and the fused data (BDF). The results showed that the BDF prediction was better than TRMM and IDW. The three rainfall estimates were further evaluated by their capability to predict observed streamflow using the lumped conceptual rainfall-runoff model NAM. Visual inspection of the
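In the simplest Gaussian case, Bayesian fusion of two rainfall estimates of the same quantity reduces to a precision-weighted mean with reduced variance. This is only an illustration of the principle (the BDF framework used in the study is more general), and the numbers are invented:

```python
def fuse(est_a, var_a, est_b, var_b):
    """Precision-weighted fusion of two independent Gaussian estimates."""
    prec = 1.0 / var_a + 1.0 / var_b
    mean = (est_a / var_a + est_b / var_b) / prec
    return mean, 1.0 / prec

# e.g. a gauge-interpolated value (smaller error variance) and a TRMM
# pixel value (larger error variance), in mm/day.
mean, var = fuse(12.0, 1.0, 18.0, 4.0)
```

The fused variance is always smaller than either input variance, which is why merged products can outperform both the gauge interpolation and the satellite estimate.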
Application of hierarchical Bayesian unmixing models in river sediment source apportionment
Blake, Will; Smith, Hugh; Navas, Ana; Bodé, Samuel; Goddard, Rupert; Kuzyk, Zou Zou; Lennard, Amy; Lobb, David; Owens, Phil; Palazon, Leticia; Petticrew, Ellen; Gaspar, Leticia; Stock, Brian; Boeckx, Pascal; Semmens, Brice
2016-04-01
Fingerprinting and unmixing concepts are used widely across environmental disciplines for forensic evaluation of pollutant sources. In aquatic and marine systems, this includes tracking the source of organic and inorganic pollutants in water and linking problem sediment to soil erosion and land use sources. It is, however, the particular complexity of ecological systems that has driven creation of the most sophisticated mixing models, primarily to (i) evaluate diet composition in complex ecological food webs, (ii) inform population structure and (iii) explore animal movement. In the context of the new hierarchical Bayesian unmixing model MixSIAR, developed to characterise intra-population niche variation in ecological systems, we evaluate the linkage between the ecological 'prey' and 'consumer' concepts and river basin sediment 'sources' and sediment 'mixtures' to exemplify the value of ecological modelling tools for river basin science. Recent studies have outlined the advantages of Bayesian unmixing approaches in handling complex source and mixture datasets while dealing appropriately with uncertainty in parameter probability distributions. MixSIAR is unique in that it allows individual fixed and random effects associated with the mixture hierarchy, i.e. factors that might exert an influence on model outcome for mixture groups, to be explored within the source-receptor framework. This offers new and powerful ways of interpreting river basin apportionment data. In this contribution, key components of the model are evaluated in the context of common experimental designs for sediment fingerprinting studies, namely simple, nested and distributed catchment sampling programmes. Illustrative examples using geochemical and compound-specific stable isotope datasets are presented and used to discuss best practice, with specific attention to (1) the tracer selection process, (2) incorporation of fixed effects relating to sample timeframe and sediment type in the modelling
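Stripped of its hierarchy, the unmixing idea is that a mixture's tracer signature is a proportion-weighted blend of the source signatures. A minimal sketch for two sources and one tracer, with a grid posterior over the source-1 proportion under a flat prior (tracer values and the error scale are hypothetical; MixSIAR itself handles many tracers, sources, and hierarchical effects):

```python
import math

def posterior_p(c_mix, c_src1, c_src2, sigma, n=1001):
    """Grid posterior over the proportion p of source 1, one tracer,
    Gaussian mixing-model likelihood, flat prior on [0, 1]."""
    grid = [i / (n - 1) for i in range(n)]
    w = [math.exp(-0.5 * ((p * c_src1 + (1 - p) * c_src2 - c_mix) / sigma) ** 2)
         for p in grid]
    z = sum(w)
    return grid, [wi / z for wi in w]

# Hypothetical tracer concentrations (e.g. mg/kg) for two erosion sources.
grid, post = posterior_p(c_mix=14.0, c_src1=20.0, c_src2=10.0, sigma=1.0)
p_map = grid[post.index(max(post))]
```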
Mustac, M.; Kim, S.; Tkalcic, H.; Rhie, J.; Chen, Y.; Ford, S. R.; Sebastian, N.
2015-12-01
Conventional approaches to inverse problems suffer from non-linearity and non-uniqueness in estimations of seismic structures and source properties. Estimated results and associated uncertainties are often biased by applied regularizations and additional constraints, which are commonly introduced to solve such problems. Bayesian methods, however, provide statistically meaningful estimations of models and their uncertainties constrained by data information. In addition, hierarchical and trans-dimensional (trans-D) techniques are inherently implemented in the Bayesian framework to account for involved error statistics and model parameterizations, and, in turn, allow more rigorous estimations of the same. Here, we apply Bayesian methods throughout the entire inference process to estimate seismic structures and source properties in Northeast Asia including east China, the Korean peninsula, and the Japanese islands. Ambient noise analysis is first performed to obtain a base three-dimensional (3-D) heterogeneity model using continuous broadband waveforms from more than 300 stations. As for the tomography of surface wave group and phase velocities in the 5-70 s band, we adopt a hierarchical and trans-D Bayesian inversion method using Voronoi partition. The 3-D heterogeneity model is further improved by joint inversions of teleseismic receiver functions and dispersion data using a newly developed high-efficiency Bayesian technique. The obtained model is subsequently used to prepare 3-D structural Green's functions for the source characterization. A hierarchical Bayesian method for point source inversion using regional complete waveform data is applied to selected events from the region. The seismic structure and source characteristics with rigorously estimated uncertainties from the novel Bayesian methods provide enhanced monitoring and discrimination of seismic events in northeast Asia.
Blind separation of image sources via adaptive dictionary learning.
Abolghasemi, Vahid; Ferdowsi, Saideh; Sanei, Saeid
2012-06-01
Sparsity has been shown to be very useful in the source separation of multichannel observations. However, in most cases, the sources of interest are not sparse in their current domain and one needs to sparsify them using a known transform or dictionary. If such a priori knowledge about the underlying sparse domain of the sources is not available, then current algorithms will fail to successfully recover the sources. In this paper, we address this problem and attempt to give a solution by fusing dictionary learning into the source separation. We first define a cost function based on this idea and propose an extension of the denoising method in the work of Elad and Aharon to minimize it. Due to the impracticality of such a direct extension, we then propose a feasible approach. In the proposed hierarchical method, a local dictionary is adaptively learned for each source along with the separation. This process improves the quality of source separation even in noisy situations. In another part of this paper, we explore the possibility of adding global priors to the proposed method. The results of our experiments are promising and confirm the strength of the proposed approach.
Using Bayesian Belief Network (BBN) modelling for Rapid Source Term Prediction. RASTEP Phase 1
Energy Technology Data Exchange (ETDEWEB)
Knochenhauer, M.; Swaling, V.H.; Alfheim, P. [Scandpower AB, Sundbyberg (Sweden)
2012-09-15
The project is connected to the development of RASTEP, a computerized source term prediction tool aimed at providing a basis for improving off-site emergency management. RASTEP uses Bayesian belief networks (BBN) to model severe accident progression in a nuclear power plant in combination with pre-calculated source terms (i.e., amount, timing, and pathway of released radionuclides). The output is a set of possible source terms with associated probabilities. In the NKS project, a number of complex issues associated with the integration of probabilistic and deterministic analyses are addressed. This includes issues related to the method for estimating source terms, signal validation, and sensitivity analysis. One major task within Phase 1 of the project addressed the problem of how to make the source term module flexible enough to give reliable and valid output throughout the accident scenario. Of the alternatives evaluated, it is recommended that RASTEP be connected to a fast-running source term prediction code, e.g., MARS, with the possibility of updating source terms based on real-time observations. (Author)
Common source-multiple load vs. separate source-individual load photovoltaic system
Appelbaum, Joseph
1989-01-01
A comparison of system performance is made for two possible system setups: (1) individual loads powered by separate solar cell sources; and (2) multiple loads powered by a common solar cell source. A proof for resistive loads is given that shows the advantage of a common source over a separate source photovoltaic system for a large range of loads. For identical loads, both systems perform the same.
Extended Nonnegative Tensor Factorisation Models for Musical Sound Source Separation
Directory of Open Access Journals (Sweden)
Derry FitzGerald
2008-01-01
Recently, shift-invariant tensor factorisation algorithms have been proposed for the purpose of sound source separation of pitched musical instruments. However, in practice, existing algorithms require the use of log-frequency spectrograms to allow shift invariance in frequency, which causes problems when attempting to resynthesise the separated sources. Further, it is difficult to impose harmonicity constraints on the recovered basis functions. This paper proposes a new additive-synthesis-based approach which allows the use of linear-frequency spectrograms as well as imposing strict harmonic constraints, resulting in an improved model. These additional constraints also allow the addition of a source-filter model to the factorisation framework, yielding an extended model capable of separating mixtures of pitched and percussive instruments simultaneously.
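The tensor models described here extend plain nonnegative matrix factorisation (NMF) of a magnitude spectrogram, V ~ W @ H, which can be sketched with the classical multiplicative updates. This is the generic baseline, not the paper's shift-invariant or source-filter model, and the toy "spectrogram" is synthetic:

```python
import numpy as np

def nmf(V, rank, iters=200, eps=1e-9, seed=0):
    """Plain NMF by multiplicative updates (Euclidean cost); all factors
    stay nonnegative by construction."""
    rng = np.random.default_rng(seed)
    W = rng.random((V.shape[0], rank)) + eps    # spectral basis functions
    H = rng.random((rank, V.shape[1])) + eps    # time-varying activations
    for _ in range(iters):
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

# Toy nonnegative "spectrogram" of exact rank 2 (8 bins x 30 frames).
rng = np.random.default_rng(1)
V = rng.random((8, 2)) @ rng.random((2, 30))
W, H = nmf(V, 2)
err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
```

The harmonicity and source-filter constraints in the paper amount to structuring W instead of leaving it free.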
Bayesian statistics applied to the location of the source of explosions at Stromboli Volcano, Italy
Saccorotti, G.; Chouet, B.; Martini, M.; Scarpa, R.
1998-01-01
We present a method for determining the location and spatial extent of the source of explosions at Stromboli Volcano, Italy, based on a Bayesian inversion of the slowness vector derived from frequency-slowness analyses of array data. The method searches for source locations that minimize the error between the expected and observed slowness vectors. For a given set of model parameters, the conditional probability density function of slowness vectors is approximated by a Gaussian distribution of expected errors. The method is tested with synthetics using a five-layer velocity model derived for the north flank of Stromboli and a smoothed velocity model derived from a power-law approximation of the layered structure. Application to data from Stromboli allows for a detailed examination of uncertainties in source location due to experimental errors and incomplete knowledge of the Earth model. Although the solutions are not constrained in the radial direction, excellent resolution is achieved in both transverse and depth directions. Under the assumption that the horizontal extent of the source does not exceed the crater dimension, the 90% confidence region in the estimate of the explosive source location corresponds to a small volume extending from a depth of about 100 m to a maximum depth of about 300 m beneath the active vents, with a maximum likelihood source region located in the 120- to 180-m-depth interval.
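The search the abstract describes - scoring candidate source positions by the mismatch between observed and predicted slowness vectors under a Gaussian error model - can be illustrated with a flat toy geometry (the array position, velocity, and grid below are invented). Note how the sketch reproduces the radial ambiguity the authors report: the slowness vector constrains direction but not range.

```python
import math

def predicted_slowness(src, array, speed):
    """Horizontal slowness vector at the array for a source at src,
    assuming a homogeneous medium of the given speed."""
    dx, dy = array[0] - src[0], array[1] - src[1]
    r = math.hypot(dx, dy)
    return (dx / (r * speed), dy / (r * speed))  # points from source to array

def log_likelihood(obs, pred, sigma):
    """Gaussian error model on the two slowness components."""
    return -0.5 * (((obs[0] - pred[0]) / sigma) ** 2 +
                   ((obs[1] - pred[1]) / sigma) ** 2)

array, speed, sigma = (0.0, 0.0), 2.0, 0.05
true_src = (-0.3, -0.4)
obs = predicted_slowness(true_src, array, speed)   # noiseless for the sketch
grid = [(x / 10.0, y / 10.0)
        for x in range(-10, 11) for y in range(-10, 11)]
grid = [g for g in grid if g != array]             # keep r > 0
best = max(grid, key=lambda g: log_likelihood(
    obs, predicted_slowness(g, array, speed), sigma))
```

Every grid point collinear with the true source direction fits the data perfectly, so `best` is only guaranteed to lie on the correct azimuth, mirroring the paper's finding that depth/transverse resolution is good while the radial direction is unconstrained by a single array.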
Denoising Using Blind Source Separation for Pyroelectric Sensors
Directory of Open Access Journals (Sweden)
Regis Huez
2001-01-01
This paper deals with a process of denoising based on a Blind Source Separation (BSS) method. The technique is inserted in an experimental device for nondestructive testing, whose excitation is a laser beam and whose detectors are pyroelectric sensors. The latter are sensitive to temperature; as they are also piezoelectric, they are particularly sensitive to environmental noise, and it is therefore necessary to denoise their output. To this end, a technique of blind source separation is implemented: one source corresponds to the incident beam and the other sources are various noises. A judicious experimental device was designed in the laboratory; it fits the requirements of the BSS technique and indeed allows restoration of the incident signal.
Bayesian CMB foreground separation with a correlated log-normal model
Oppermann, Niels
2014-01-01
The extraction of foreground and CMB maps from multi-frequency observations relies mostly on the different frequency behavior of the different components. Existing Bayesian methods additionally make use of a Gaussian prior for the CMB whose correlation structure is described by an unknown angular power spectrum. We argue for the natural extension of this by using non-trivial priors also for the foreground components. Focusing on diffuse Galactic foregrounds, we propose a log-normal model including unknown spatial correlations within each component and cross-correlations between the different foreground components. We present case studies at low resolution that demonstrate the superior performance of this model when compared to an analysis with flat priors for all components.
Contending non-double-couple source components with hierarchical Bayesian moment tensor inversion
Mustac, M.; Tkalcic, H.
2015-12-01
Seismic moment tensors can aid the discrimination between earthquakes and explosions. However, the isotropic component can be found in a large number of earthquakes simply as a consequence of earthquake location, poorly modeled structure or noise in the data. Recently, we have developed a method for moment tensor inversion, capable of retrieving parameter uncertainties together with their optimal values, using probabilistic Bayesian inference. It has been applied to data from a complex volcanic environment in the Long Valley Caldera (LVC), California, and confirmed a large isotropic source component. We now extend the application to two different environments where the existence of non-double-couple source components is likely. The method includes notable treatment of the noise, utilizing pre-event noise to estimate the noise covariance matrix. This is extended throughout the inversion, where noise variance is a "hyperparameter" that determines the level of data fit. On top of that, different noise parameters for each station are used as weights, naturally penalizing stations with noisy data. In the LVC case, this means increasing the amount of information from two stations at moderate distances, which results in a 1 km deeper estimate for the source location. This causes a change in the non-double-couple components in the inversions assuming a simple diagonal or exponential covariance matrix, but not in the inversion assuming a more complicated, attenuated cosine covariance matrix. Combining a sophisticated noise treatment with a powerful Bayesian inversion technique can give meaningful uncertainty estimates for long-period (20-50 s) data, provided an appropriate regional structure model.
Toward a probabilistic acoustic emission source location algorithm: A Bayesian approach
Schumacher, Thomas; Straub, Daniel; Higgins, Christopher
2012-09-01
Acoustic emissions (AE) are stress waves initiated by sudden strain releases within a solid body. These can be caused by internal mechanisms such as crack opening or propagation, crushing, or rubbing of crack surfaces. One application for the AE technique in the field of Structural Engineering is Structural Health Monitoring (SHM). With piezoelectric sensors mounted to the surface of the structure, stress waves can be detected, recorded, and stored for later analysis. An important step in quantitative AE analysis is the estimation of the stress wave source locations. Commonly, source location results are presented in a rather deterministic manner as spatial and temporal points, excluding information about uncertainties and errors. Due to variability in the material properties and uncertainty in the mathematical model, measures of uncertainty are needed beyond best-fit point solutions for source locations. This paper introduces a novel holistic framework for the development of a probabilistic source location algorithm. Bayesian analysis methods with Markov Chain Monte Carlo (MCMC) simulation are employed where all source location parameters are described with posterior probability density functions (PDFs). The proposed methodology is applied to an example employing data collected from a realistic section of a reinforced concrete bridge column. The selected approach is general and has the advantage that it can be extended and refined efficiently. Results are discussed and future steps to improve the algorithm are suggested.
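The MCMC idea can be illustrated on a deliberately tiny version of the problem: a 1-D source coordinate sampled from the posterior implied by noisy arrival times t_i = |x_i - x| / v with Gaussian errors and a flat prior. This is a sketch of Metropolis sampling only, not the paper's algorithm (which is multi-dimensional and richer); sensor positions, wave speed, and noise level are invented.

```python
import math, random

def log_post(x, sensors, times, v, sigma):
    """Log-posterior (up to a constant) of source coordinate x given
    arrival times at known sensor positions, flat prior."""
    return -0.5 * sum(((abs(xi - x) / v - t) / sigma) ** 2
                      for xi, t in zip(sensors, times))

def metropolis(sensors, times, v, sigma, n=20000, step=0.3, seed=42):
    random.seed(seed)
    x = 0.0
    lp = log_post(x, sensors, times, v, sigma)
    samples = []
    for _ in range(n):
        xp = x + random.gauss(0.0, step)         # random-walk proposal
        lpp = log_post(xp, sensors, times, v, sigma)
        if math.log(random.random()) < lpp - lp:  # Metropolis acceptance
            x, lp = xp, lpp
        samples.append(x)
    return samples[n // 2:]                       # discard burn-in

sensors = [0.0, 2.0, 5.0]
true_x, v, sigma = 3.2, 4.0, 0.01
times = [abs(xi - true_x) / v for xi in sensors]  # noiseless for the sketch
post = metropolis(sensors, times, v, sigma)
est = sum(post) / len(post)                       # posterior mean
```

The retained samples approximate the full posterior PDF of the source coordinate, which is exactly the uncertainty information the deterministic point-location methods discard.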
Score-Informed Source Separation for Multichannel Orchestral Recordings
Directory of Open Access Journals (Sweden)
Marius Miron
2016-01-01
Full Text Available This paper proposes a system for score-informed audio source separation for multichannel orchestral recordings. The orchestral music repertoire relies on the existence of scores. Thus, a reliable separation requires a good alignment of the score with the audio of the performance. To that end, automatic score alignment methods are reliable when allowing a tolerance window around the actual onset and offset. Moreover, several factors increase the difficulty of our task: high reverberation, large ensembles having rich polyphony, and a large variety of instruments recorded within a distant-microphone setup. To solve these problems, we design context-specific methods such as the refinement of score-following output in order to obtain a more precise alignment. Moreover, we extend a close-microphone separation framework to deal with the distant-microphone orchestral recordings. Then, we propose the first open evaluation dataset in this musical context, including annotations of the notes played by multiple instruments from an orchestral ensemble. The evaluation aims at analyzing the interactions of important parts of the separation framework on the quality of separation. Results show that we are able to align the original score with the audio of the performance and separate the sources corresponding to the instrument sections.
Asymptotics of Bayesian error probability and source super-localization in three dimensions.
Prasad, S
2014-06-30
We present an asymptotic analysis of the minimum probability of error (MPE) in inferring the correct hypothesis in a Bayesian multi-hypothesis testing (MHT) formalism using many pixels of data that are corrupted by signal dependent shot noise, sensor read noise, and background illumination. We perform our analysis for a variety of combined noise and background statistics, including a pseudo-Gaussian distribution that can be employed to treat approximately the photon-counting statistics of signal and background as well as purely Gaussian sensor read-out noise and more general, exponentially peaked distributions. We subsequently evaluate both the exact and asymptotic MPE expressions for the problem of three-dimensional (3D) point source localization. We focus specifically on a recently proposed rotating-PSF imager and compare, using the MPE metric, its 3D localization performance with that of conventional and astigmatic imagers in the presence of background and sensor-noise fluctuations.
DEFF Research Database (Denmark)
Troldborg, Mads; Nowak, Wolfgang; Binning, Philip John
The method is based on conditional geostatistical simulation and accounts for i) heterogeneity of both the flow field and the concentration distribution through Bayesian geostatistics, ii) measurement uncertainty, and iii) uncertain source zone and transport parameters. The method generates multiple equally likely realisations of the spatial flow and concentration distribution, which all honour the measured data at the control plane. The flow realisations are generated by co-simulating the hydraulic conductivity … a multilevel control plane. The method decouples the flow and transport simulation and has the advantage of avoiding the heavy computational burden of three-dimensional numerical flow and transport simulation coupled with geostatistical inversion. It may therefore be of practical relevance to practitioners … compared to existing methods that are either too simple or computationally demanding.
Real-time realizations of the Bayesian Infrasonic Source Localization Method
Pinsky, V.; Arrowsmith, S.; Hofstetter, A.; Nippress, A.
2015-12-01
The Bayesian Infrasonic Source Localization method (BISL), introduced by Mordak et al. (2010) and upgraded by Marcillo et al. (2014), is designed for accurate estimation of the origin of atmospheric events at local, regional and global scales by seismic and infrasonic networks and arrays. BISL is based on probabilistic models of the source-station infrasonic signal propagation time, picking time and azimuth estimate, merged with prior knowledge about the celerity distribution. It requires, at each hypothetical source location, integration of the product of the corresponding source-station likelihood functions, multiplied by a prior probability density function of celerity, over the multivariate parameter space. The present BISL realization is a generally time-consuming procedure based on numerical integration. The computational scheme proposed here simplifies the target function so that the integrals are taken exactly and represented via standard functions. This makes the procedure much faster and realizable in real time without practical loss of accuracy. The procedure, implemented as PYTHON-FORTRAN code, demonstrates high performance on a set of model and real data.
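The closed-form marginalization that makes such a scheme fast can be illustrated on a toy version of the problem: with Gaussian picking errors and a fixed celerity, the unknown origin time profiles out analytically, leaving a likelihood surface over candidate locations that can be evaluated directly on a grid. The sketch below is a minimal illustration of that idea only, not the BISL algorithm itself; all geometry, celerity and noise values are invented.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 2-D geometry (km) and a single fixed celerity (km/s); assumed values.
stations = np.array([[0.0, 0.0], [60.0, 0.0], [0.0, 80.0], [70.0, 70.0]])
true_src = np.array([25.0, 35.0])
celerity = 0.30          # km/s, a typical stratospheric-arrival celerity
sigma_t = 1.0            # s, picking-time standard deviation

d_true = np.linalg.norm(stations - true_src, axis=1)
t_obs = d_true / celerity + rng.normal(0.0, sigma_t, size=len(stations))

# Grid of candidate source locations.
xs = np.linspace(0.0, 100.0, 201)
ys = np.linspace(0.0, 100.0, 201)
X, Y = np.meshgrid(xs, ys)
grid = np.stack([X.ravel(), Y.ravel()], axis=1)

# For Gaussian errors the unknown origin time integrates out in closed form:
# with residuals r_i = t_i - d_i/c, the profiled log-likelihood at each node is
# -sum((r_i - mean(r))^2) / (2 sigma^2), i.e. only differential times matter.
d = np.linalg.norm(grid[:, None, :] - stations[None, :, :], axis=2)
r = t_obs[None, :] - d / celerity
loglik = -np.sum((r - r.mean(axis=1, keepdims=True)) ** 2, axis=1) / (2 * sigma_t**2)

best = grid[np.argmax(loglik)]
print("estimated source:", best)
```

Because every integral collapses to an analytic expression, the whole surface is one vectorized evaluation, which is the kind of speed-up the abstract describes.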
How Many Separable Sources? Model Selection In Independent Components Analysis
DEFF Research Database (Denmark)
Woods, Roger P.; Hansen, Lars Kai; Strother, Stephen
2015-01-01
Unlike mixtures consisting solely of non-Gaussian sources, mixtures including two or more Gaussian components cannot be separated using standard independent components analysis methods that are based on higher order statistics and independent observations. The mixed Independent Components Analysis/Principal Components Analysis (mixed ICA/PCA) model described here accommodates one or more Gaussian components in the independent components analysis model and uses principal components analysis to characterize contributions from this inseparable Gaussian subspace. Information theory can then be used to select from among potential model categories with differing numbers of Gaussian components. Mixed ICA/PCA is of potential interest in any field of scientific investigation where the authenticity of blindly separated non-Gaussian sources might otherwise be questionable. Failure of the Akaike Information Criterion in model selection also has relevance in traditional independent components analysis where all sources are assumed non-Gaussian.
Using Bayesian Belief Network (BBN) modelling for rapid source term prediction. Final report
Energy Technology Data Exchange (ETDEWEB)
Knochenhauer, M.; Swaling, V.H.; Dedda, F.D.; Hansson, F.; Sjoekvist, S.; Sunnegaerd, K. [Lloyd' s Register Consulting AB, Sundbyberg (Sweden)
2013-10-15
The project presented in this report deals with a number of complex issues related to the development of a tool for rapid source term prediction (RASTEP), based on a plant model represented as a Bayesian belief network (BBN) and a source term module which is used for assigning relevant source terms to BBN end states. Thus, RASTEP uses a BBN to model severe accident progression in a nuclear power plant in combination with pre-calculated source terms (i.e., amount, composition, timing, and release path of released radionuclides). The output is a set of possible source terms with associated probabilities. One major issue addressed is the integration of probabilistic and deterministic analyses, dealing with the challenge of making the source term determination flexible enough to give reliable and valid output throughout the accident scenario. The potential for connecting RASTEP to a fast-running source term prediction code has been explored, as well as alternative ways of improving the deterministic connections of the tool. As part of the investigation, a comparison of two deterministic severe accident analysis codes has been performed. A second important task has been to develop a general method for including experts' beliefs in a systematic way when defining the conditional probability tables (CPTs) in the BBN. Using this iterative method results in a reliable BBN even though expert judgements, with their associated uncertainties, have been used. It also simplifies verification and validation of the considerable amounts of quantitative data included in a BBN. (Author)
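The BBN-plus-source-term idea reduces to a few lines: observed plant symptoms update the probabilities of accident end states via Bayes' rule, and each end state maps to a pre-calculated source term. The numbers below (states, CPT entries, release fractions) are purely illustrative and are not taken from RASTEP.

```python
import numpy as np

# Toy plant model: three end states with prior probabilities (illustrative only).
states = ["no_release", "filtered_release", "bypass_release"]
prior = np.array([0.90, 0.08, 0.02])

# CPT entry: P(containment-pressure alarm | state), again invented numbers.
p_alarm = np.array([0.05, 0.70, 0.95])

# Observation "alarm is on" -> Bayes update of the end-state probabilities.
post = prior * p_alarm
post /= post.sum()

# Pre-calculated source terms: released fraction of Cs-137 inventory per end state.
cs137_fraction = np.array([0.0, 1e-4, 5e-2])
expected_release = post @ cs137_fraction
print(dict(zip(states, np.round(post, 3))), f"E[Cs-137 fraction] = {expected_release:.2e}")
```

A real RASTEP-style network chains many such CPTs over dozens of observables; the output here, as in the abstract, is a set of possible source terms with associated probabilities.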
Bayesian Estimation of Fugitive Methane Point Source Emission Rates from a Single Downwind High-Frequency Gas Sensor
With the tremendous advances in onshore oil and gas exploration and production (E&P) capability comes the realization that new tools are needed to support env...
Blind speech source separation via nonlinear time-frequency masking
Institute of Scientific and Technical Information of China (English)
XU Shun; CHEN Shaorong; LIU Yulin
2008-01-01
Aiming at the underdetermined convolutive mixture model, a blind speech source separation method based on nonlinear time-frequency masking is proposed, which exploits the approximate W-disjoint orthogonality (W-DO) property among independent speech signals in the time-frequency domain. In this method, the observed mixture signal from multiple microphones is first normalized to be independent of frequency in the time-frequency domain; then a dynamic clustering algorithm is adopted to obtain the active-source information in each time-frequency slot, and a nonlinear function of the deflection angle from the cluster center is used for time-frequency masking; finally, blind separation of the mixed speech signals is achieved by inverse STFT (short-time Fourier transform). This method not only solves the frequency permutation problem encountered in most classic frequency-domain blind separation techniques, but also suppresses the spatial direction diffusion of the separation matrix. Simulation results demonstrate that the proposed method outperforms the typical BLUES method, improving the signal-to-noise-ratio gain (SNRG) by 1.58 dB on average.
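The W-disjoint orthogonality assumption behind such methods is that, in each time-frequency slot, at most one source carries significant energy, so a (possibly binary) mask applied to the mixture's STFT can isolate one source. The toy sketch below shows only that masking-and-inversion step, using an oracle mask built from the known sources; the blind part of the method (normalization and clustering of spatial cues) is omitted, and the narrowband "sources" are stand-ins for speech.

```python
import numpy as np
from scipy.signal import stft, istft

fs = 8000
t = np.arange(fs) / fs
s1 = np.sin(2 * np.pi * 440 * t)    # stand-ins for speech: two narrowband sources
s2 = np.sin(2 * np.pi * 1250 * t)
mix = s1 + s2

f, tt, Z = stft(mix, fs=fs, nperseg=512)

# W-DO: in each time-frequency slot at most one source dominates. Here the
# binary mask comes from oracle STFTs of the sources; a blind system would
# instead cluster spatial cues (level/phase ratios) across microphones.
_, _, Z1 = stft(s1, fs=fs, nperseg=512)
_, _, Z2 = stft(s2, fs=fs, nperseg=512)
mask = (np.abs(Z1) > np.abs(Z2)).astype(float)

_, s1_hat = istft(Z * mask, fs=fs, nperseg=512)
s1_hat = s1_hat[:len(s1)]

err = np.mean((s1_hat - s1) ** 2) / np.mean(s1 ** 2)
print(f"relative reconstruction error: {err:.4f}")
```

Since the two sources occupy disjoint frequency bins, masking the mixture's STFT and inverting recovers the first source almost exactly; real speech only satisfies W-DO approximately.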
FREQUENCY OVERLAPPED SIGNAL IDENTIFICATION USING BLIND SOURCE SEPARATION
Institute of Scientific and Technical Information of China (English)
WANG Junfeng; SHI Tielin; HE Lingsong; YANG Shuzi
2006-01-01
The concepts, principles and usage of principal component analysis (PCA) and independent component analysis (ICA) are reviewed. The algorithm and methodology of ICA-based blind source separation (BSS), in which PCA-based pre-whitening of the observed signals is used, are then investigated. For mixture signals whose frequency components overlap one another, a simulation separating this type of mixture with the theory and approach of BSS is carried out. The result shows that BSS has advantages that the traditional methodology of frequency analysis does not.
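A minimal numerical illustration of the pipeline the abstract describes (PCA pre-whitening followed by ICA) is sketched below, assuming two artificial uniformly distributed sources and a made-up mixing matrix; the ICA step is a bare-bones symmetric FastICA with a cube nonlinearity, not any specific algorithm from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Two non-Gaussian (uniform) sources with fully overlapping spectra, linearly mixed.
S = rng.uniform(-1.0, 1.0, size=(2, 5000))
A = np.array([[1.0, 0.6], [0.4, 1.0]])
X = A @ S

# PCA pre-whitening: remove the mean, then decorrelate and scale to unit variance.
X = X - X.mean(axis=1, keepdims=True)
vals, vecs = np.linalg.eigh(X @ X.T / X.shape[1])
Z = np.diag(vals ** -0.5) @ vecs.T @ X   # whitened: covariance becomes identity

# Minimal symmetric FastICA (cube nonlinearity) on the whitened data;
# after whitening, ICA only has to find a rotation.
W = rng.normal(size=(2, 2))
for _ in range(200):
    WZ = W @ Z
    W_new = (WZ ** 3) @ Z.T / Z.shape[1] - 3.0 * W
    u, _, vt = np.linalg.svd(W_new)      # symmetric decorrelation
    W = u @ vt
S_hat = W @ Z

# Each recovered row should match one true source up to sign, scale, permutation.
corr = np.abs(np.corrcoef(np.vstack([S_hat, S]))[:2, 2:])
print(np.round(corr, 3))
```

Because the sources overlap completely in frequency, no band-pass filter could separate them, while the higher-order statistics exploited by ICA can, which is the advantage the abstract points to.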
Directory of Open Access Journals (Sweden)
Rasheda Arman Chowdhury
Full Text Available Localizing the generators of epileptic activity in the brain using Electro-EncephaloGraphy (EEG) or Magneto-EncephaloGraphy (MEG) signals is of particular interest during the pre-surgical investigation of epilepsy. Epileptic discharges can be detectable from background brain activity, provided they are associated with spatially extended generators. Using realistic simulations of epileptic activity, this study evaluates the ability of distributed source localization methods to accurately estimate the location of the generators and their sensitivity to the spatial extent of such generators when using MEG data. Source localization methods based on two types of realistic models have been investigated: (i) brain activity may be modeled using cortical parcels and (ii) brain activity is assumed to be locally smooth within each parcel. A Data Driven Parcellization (DDP) method was used to segment the cortical surface into non-overlapping parcels and diffusion-based spatial priors were used to model local spatial smoothness within parcels. These models were implemented within the Maximum Entropy on the Mean (MEM) and the Hierarchical Bayesian (HB) source localization frameworks. We proposed new methods in this context and compared them with other standard ones using Monte Carlo simulations of realistic MEG data involving sources of several spatial extents and depths. Detection accuracy of each method was quantified using Receiver Operating Characteristic (ROC) analysis and localization error metrics. Our results showed that methods implemented within the MEM framework were sensitive to all spatial extents of the sources ranging from 3 cm² to 30 cm², whatever the number and size of the parcels defining the model. To reach a similar level of accuracy within the HB framework, a model using parcels larger than the size of the sources should be considered.
Abban, B.; (Thanos) Papanicolaou, A. N.; Cowles, M. K.; Wilson, C. G.; Abaci, O.; Wacha, K.; Schilling, K.; Schnoebelen, D.
2016-06-01
An enhanced revision of the Fox and Papanicolaou (hereafter referred to as "F-P") (2008a) Bayesian, Markov Chain Monte Carlo fingerprinting framework for estimating sediment source contributions and their associated uncertainties is presented. The F-P framework included two key deterministic parameters, α and β, that, respectively, reflected the spatial origin attributes of sources and the time history of eroded material delivered to and collected at the watershed outlet. However, the deterministic treatment of α and β is limited to cases with well-defined spatial partitioning of sources, high sediment delivery, and relatively short travel times with little variability in transport within the watershed. For event-based studies in intensively managed landscapes, this may be inadequate since landscape heterogeneity results in variabilities in source contributions, their pathways, delivery times, and storage within the watershed. Thus, probabilistic treatments of α and β are implemented in the enhanced framework to account for these variabilities. To evaluate the effects of the treatments of α and β on source partitioning, both frameworks are applied to the South Amana Subwatershed (SASW) in the U.S. midwest. The enhanced framework is found to estimate mean source contributions that are in good agreement with estimates from other studies in SASW. The enhanced framework is also able to produce expected trends in uncertainty during the study period, unlike the F-P framework, which does not perform as expected. Overall, the enhanced framework is found to be less sensitive to changes in α and β than the F-P framework, and, therefore, is more robust and desirable from a management standpoint.
Prasad, Sudhakar
2014-01-01
We present an asymptotic analysis of the minimum probability of error (MPE) in inferring the correct hypothesis in a Bayesian multi-hypothesis testing (MHT) formalism using many pixels of data that are corrupted by signal dependent shot noise, sensor read noise, and background illumination. We perform this error analysis for a variety of combined noise and background statistics, including a pseudo-Gaussian distribution that can be employed to treat approximately the photon-counting statistics of signal and background as well as purely Gaussian sensor read-out noise and more general, exponentially peaked distributions. We subsequently apply the MPE asymptotics to characterize the minimum conditions needed to localize a point source in three dimensions by means of a rotating-PSF imager and compare its performance with that of a conventional imager in the presence of background and sensor-noise fluctuations. In a separate paper, we apply the formalism to the related but qualitatively different problem of 2D supe...
Source Separation with One Ear: Proposition for an Anthropomorphic Approach
Directory of Open Access Journals (Sweden)
Ramin Pichevar
2005-06-01
Full Text Available We present an example of an anthropomorphic approach, in which auditory-based cues are combined with temporal correlation to implement a source separation system. The auditory features are based on spectral amplitude modulation and energy information obtained through 256 cochlear filters. Segmentation and binding of auditory objects are performed with a two-layered spiking neural network. The first layer performs the segmentation of the auditory images into objects, while the second layer binds the auditory objects belonging to the same source. The binding is further used to generate a mask (binary gain) to suppress the undesired sources from the original signal. Results are presented for a double-voiced (two-speaker) speech segment and for sentences corrupted with different noise sources. Comparative results are also given using PESQ (perceptual evaluation of speech quality) scores. The spiking neural network is fully adaptive and unsupervised.
An autonomous surveillance system for blind sources localization and separation
Wu, Sean; Kulkarni, Raghavendra; Duraiswamy, Srikanth
2013-05-01
This paper aims at developing a new technology that enables one to conduct autonomous and silent surveillance to monitor sound sources stationary or moving in 3D space and to perform blind separation of target acoustic signals. The underlying principle of this technology is a hybrid approach that uses: 1) a passive sonic detection and ranging method consisting of iterative triangulation and redundant checking to locate the Cartesian coordinates of arbitrary sound sources in 3D space, 2) advanced signal processing to sanitize the measured data and enhance the signal-to-noise ratio, and 3) short-time source localization and separation to extract the target acoustic signals from the directly measured mixed ones. A prototype based on this technology has been developed, and its hardware includes six B&K 1/4-in condenser microphones, Type 4935, two 4-channel data acquisition units, Type NI-9234, with a maximum sampling rate of 51.2 kS/s per channel, one NI cDAQ-9174 chassis, a thermometer to measure the air temperature, a camera to view the relative positions of located sources, and a laptop to control data acquisition and post-processing. Test results for locating arbitrary sound sources emitting continuous, random, impulsive, and transient signals, and for blind separation of signals in various non-ideal environments, are presented. This system is invisible to any anti-surveillance device since it uses the acoustic signal emitted by a target source. It can be mounted on a robot or an unmanned vehicle to perform various covert operations, including intelligence gathering in an open or a confined field, or to carry out rescue missions searching for people trapped inside ruins or buried under wreckage.
Blind source separation advances in theory, algorithms and applications
Wang, Wenwu
2014-01-01
Blind Source Separation reports new results from efforts on the study of Blind Source Separation (BSS). The book collects novel research ideas and tutorial material on BSS, independent component analysis (ICA), artificial intelligence and signal processing applications. Furthermore, research results previously scattered in many journals and conferences worldwide are methodically edited and presented in a unified form. The book is likely to be of interest to university researchers, R&D engineers and graduate students in computer science and electronics who wish to learn the core principles, methods, algorithms, and applications of BSS. Dr. Ganesh R. Naik works at the University of Technology, Sydney, Australia; Dr. Wenwu Wang works at the University of Surrey, UK.
Blind Source Separation with Compressively Sensed Linear Mixtures
Kleinsteuber, Martin
2011-01-01
This work studies the problem of simultaneously separating and reconstructing signals from compressively sensed linear mixtures. We assume that all source signals share a common sparse representation basis. The approach combines classical Compressive Sensing (CS) theory with a linear mixing model. It allows the mixtures to be sampled independently of each other. If samples are acquired in the time domain, this means that the sensors need not be synchronized. Since Blind Source Separation (BSS) from a linear mixture is only possible up to permutation and scaling, factoring out these ambiguities leads to a minimization problem on the so-called oblique manifold. We develop a geometric conjugate subgradient method that scales to large systems for solving the problem. Numerical results demonstrate the promising performance of the proposed algorithm compared to several state-of-the-art methods.
Immobilized Microalgae for Nutrient Recovery from Source Separated Urine
Piltz, Bastian
2016-01-01
Shortages in supply of nutrients and freshwater for a growing human population are critical global issues. Traditional centralized sewage treatment can prevent eutrophication and provide sanitation, but is neither efficient nor sustainable in terms of water and resources. Source separation of household wastes, combined with decentralized resource recovery, presents a novel approach to solve these issues. Urine contains within 1 % of household waste water up to 80 % of the nitrogen (N) and 50 ...
Online blind source separation based on joint diagonalization
Institute of Scientific and Technical Information of China (English)
Li Ronghua; Zhou Guoxu; Fang Zuyuan; Xie Shengli
2009-01-01
A new algorithm is proposed for joint diagonalization. With a modified objective function, the new algorithm not only successfully excludes trivial and unbalanced solutions, but is also easily optimized. In addition, with the new objective function, the proposed algorithm can work well in online blind source separation (BSS) for the first time, although this family of algorithms has so far been thought to be valid only in batch-mode BSS. Simulations show that it is a very competitive joint diagonalization algorithm.
Bayesian estimation of a source term of radiation release with approximately known nuclide ratios
Tichý, Ondřej; Šmídl, Václav; Hofman, Radek
2016-04-01
We are concerned with estimation of a source term in the case of an accidental release from a known location, e.g. a power plant. Usually, the source term of an accidental release of radiation comprises a mixture of nuclides. The gamma dose rate measurements do not provide direct information on the source term composition. However, physical properties of the respective nuclides (deposition properties, decay half-life) can be used when uncertain information on nuclide ratios is available, e.g. from a known reactor inventory. The proposed method is based on a linear inverse model where the observation vector y arises as a linear combination y = Mx of a source-receptor-sensitivity (SRS) matrix M and the source term x. The task is to estimate the unknown source term x. The problem is ill-conditioned and further regularization is needed to obtain a reasonable solution. In this contribution, we assume that the nuclide ratios of the release are known with some degree of uncertainty. This knowledge is used to form the prior covariance matrix of the source term x. Due to the uncertainty in the ratios, the diagonal elements of the covariance matrix are considered to be unknown. Positivity of the source term estimate is guaranteed by using a multivariate truncated Gaussian distribution. Following the Bayesian approach, we estimate all parameters of the model from the data, so that y, M, and the known ratios are the only inputs of the method. Since the inference of the model is intractable, we follow the Variational Bayes method, yielding an iterative algorithm for estimation of all model parameters. Performance of the method is studied on a simulated 6-hour power plant release where 3 nuclides are released and 2 nuclide ratios are approximately known. A comparison with a method with unknown nuclide ratios is given to demonstrate the usefulness of the proposed approach. This research is supported by the EEA/Norwegian Financial Mechanism under project MSMT-28477/2014 Source-Term Determination of Radionuclide Releases.
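The ratio-informed regularization can be mimicked with ordinary tools: treat the approximately known ratios as soft equality constraints appended to the least-squares system y = Mx, and enforce positivity with non-negative least squares instead of the paper's truncated-Gaussian Variational Bayes machinery. Everything below (dimensions, ratios, the random stand-in for the SRS matrix M, the penalty weight) is invented for illustration.

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(2)

# Hypothetical setup: 3 nuclides released over 6 one-hour segments (18 unknowns),
# observed through a random SRS-like matrix at 40 gamma-dose-rate sensors.
n_nuc, n_seg, n_obs = 3, 6, 40
profile = rng.uniform(0.5, 1.5, n_seg)
x_true = np.kron(np.array([1.0, 0.5, 0.1]), profile)   # nuclide ratios ~ 1 : 0.5 : 0.1
M = rng.uniform(0.0, 1.0, size=(n_obs, n_nuc * n_seg))
y = M @ x_true + rng.normal(0.0, 0.01, n_obs)

# Ratio-informed regularization: penalty rows enforce x_k ~ r_k * x_0 segment-wise,
# with nuclide 0 as the reference release profile. (The paper instead builds a prior
# covariance from the ratios and estimates its variances by Variational Bayes.)
lam = 0.1
r = np.array([1.0, 0.5, 0.1])
penalties = []
for k in (1, 2):
    row = np.zeros((n_seg, n_nuc * n_seg))
    row[:, k * n_seg:(k + 1) * n_seg] = np.eye(n_seg)
    row[:, 0:n_seg] = -r[k] * np.eye(n_seg)
    penalties.append(row)
A = np.vstack([M] + [lam * p for p in penalties])
b = np.concatenate([y, np.zeros(2 * n_seg)])

# Non-negative least squares stands in for the truncated-Gaussian positivity prior.
x_hat, _ = nnls(A, b)

rel_err = np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true)
print(f"relative error: {rel_err:.3f}")
```

The ratio penalty plays the role of the prior covariance in the abstract: it pulls the per-nuclide release profiles toward the known proportions without fixing them exactly.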
Energy Technology Data Exchange (ETDEWEB)
George, J.S.; Schmidt, D.M.; Wood, C.C.
1999-02-01
We have developed a Bayesian approach to the analysis of neural electromagnetic (MEG/EEG) data that can incorporate or fuse information from other imaging modalities and addresses the ill-posed inverse problem by sampling the many different solutions which could have produced the given data. From these samples one can draw probabilistic inferences about regions of activation. Our source model assumes a variable number of variable-size cortical regions of stimulus-correlated activity. An active region consists of locations on the cortical surface, within a sphere centered on some location in cortex. The number and radii of active regions can vary up to defined maximum values. The goal of the analysis is to determine the posterior probability distribution for the set of parameters that govern the number, location, and extent of active regions. Markov Chain Monte Carlo is used to generate a large sample of sets of parameters distributed according to the posterior distribution. This sample is representative of the many different source distributions that could account for the given data, and allows identification of probable (i.e. consistent) features across solutions. Examples of the use of this analysis technique with both simulated and empirical MEG data are presented.
How many separable sources? Model selection in independent components analysis.
Woods, Roger P; Hansen, Lars Kai; Strother, Stephen
2015-01-01
Unlike mixtures consisting solely of non-Gaussian sources, mixtures including two or more Gaussian components cannot be separated using standard independent components analysis methods that are based on higher order statistics and independent observations. The mixed Independent Components Analysis/Principal Components Analysis (mixed ICA/PCA) model described here accommodates one or more Gaussian components in the independent components analysis model and uses principal components analysis to characterize contributions from this inseparable Gaussian subspace. Information theory can then be used to select from among potential model categories with differing numbers of Gaussian components. Based on simulation studies, the assumptions and approximations underlying the Akaike Information Criterion do not hold in this setting, even with a very large number of observations. Cross-validation is a suitable, though computationally intensive alternative for model selection. Application of the algorithm is illustrated using Fisher's iris data set and Howells' craniometric data set. Mixed ICA/PCA is of potential interest in any field of scientific investigation where the authenticity of blindly separated non-Gaussian sources might otherwise be questionable. Failure of the Akaike Information Criterion in model selection also has relevance in traditional independent components analysis where all sources are assumed non-Gaussian.
A preliminary study on spatial unmasking of virtual separated sources
Institute of Scientific and Technical Information of China (English)
XIE ZhiWen; JIN Jing
2008-01-01
An experimental method with headphone virtual reproduction is proposed, and a series of experiments is conducted to study the forward masking effect when the masker and the masked signal are spatially separated in azimuth. The masking thresholds are then compared with those obtained when the masker and the masked signal source are at the same place. The results show that, although the thresholds of both the 0° and ±30° sound images increase with the sound pressure level (SPL) of the masker, spatial unmasking can indeed be observed. The maximum unmasking is as large as 15 dB. This spatial unmasking effect is mainly attributed to the better-ear contribution.
Pirone, Jason R; Smith, Marjolein; Kleinstreuer, Nicole C; Burns, Thomas A; Strickland, Judy; Dancik, Yuri; Morris, Richard; Rinckel, Lori A; Casey, Warren; Jaworska, Joanna S
2014-01-01
An open-source implementation of a previously published integrated testing strategy (ITS) for skin sensitization using a Bayesian network has been developed using R, a free and open-source statistical computing language. The ITS model provides probabilistic predictions of skin sensitization potency based on in silico and in vitro information as well as skin penetration characteristics from a published bioavailability model (Kasting et al., 2008). The structure of the Bayesian network was designed to be consistent with the adverse outcome pathway published by the OECD (Jaworska et al., 2011, 2013). In this paper, the previously published data set (Jaworska et al., 2013) is improved by two data corrections and a modified application of the Kasting model. The new data set implemented in the original commercial software package and the new R version produced consistent results. The data and a fully documented version of the code are publicly available (http://ntp.niehs.nih.gov/go/its).
Ransom, Katherine M.; Grote, Mark N.; Deinhart, Amanda; Eppich, Gary; Kendall, Carol; Sanborn, Matthew E.; Souders, A. Kate; Wimpenny, Joshua; Yin, Qing-zhu; Young, Megan; Harter, Thomas
2016-07-01
Groundwater quality is a concern in alluvial aquifers that underlie agricultural areas, such as in the San Joaquin Valley of California. Shallow domestic wells (less than 150 m deep) in agricultural areas are often contaminated by nitrate. Agricultural and rural nitrate sources include dairy manure, synthetic fertilizers, and septic waste. Knowledge of the relative proportion that each of these sources contributes to nitrate concentration in individual wells can aid future regulatory and land management decisions. We show that nitrogen and oxygen isotopes of nitrate, boron isotopes, and iodine concentrations are a useful, novel combination of groundwater tracers to differentiate between manure, fertilizers, septic waste, and natural sources of nitrate. Furthermore, in this work, we develop a new Bayesian mixing model in which these isotopic and elemental tracers were used to estimate the probability distribution of the fractional contributions of manure, fertilizers, septic waste, and natural sources to the nitrate concentration found in an individual well. The approach was applied to 56 nitrate-impacted private domestic wells located in the San Joaquin Valley. Model analysis found that some domestic wells were clearly dominated by the manure source and suggests evidence for majority contributions from either the septic or fertilizer source for other wells. But, predictions of fractional contributions for septic and fertilizer sources were often of similar magnitude, perhaps because modeled uncertainty about the fraction of each was large. For validation of the Bayesian model, fractional estimates were compared to surrounding land use and estimated source contributions were broadly consistent with nearby land use types.
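The kind of Bayesian mixing model described above can be sketched in a few dozen lines: given tracer signatures for each source and a tracer measurement from one well, a random-walk Metropolis sampler explores the posterior over source fractions constrained to the simplex. All signatures, fractions and noise levels below are fabricated for illustration and bear no relation to the study's data; a real model would also handle fractionation and concentration-dependence of the tracers.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical tracer signatures (rows: four tracers, loosely in the spirit of
# d15N, d18O, d11B and iodine; columns: manure, fertilizer, septic, natural).
F = np.array([[12.0,  2.0,  9.0,  4.0],
              [-2.0,  8.0,  3.0,  6.0],
              [20.0, 10.0,  2.0, 15.0],
              [ 0.5,  2.0,  5.0,  1.0]])
f_true = np.array([0.6, 0.2, 0.1, 0.1])   # "manure-dominated" well
sigma = 0.3
d_obs = F @ f_true + rng.normal(0.0, sigma, 4)

def log_post(z):
    f = np.exp(z) / np.exp(z).sum()       # softmax keeps fractions on the simplex
    resid = d_obs - F @ f
    return -0.5 * np.sum(resid ** 2) / sigma ** 2

# Random-walk Metropolis on the unconstrained parameters (z[0] pinned at 0
# to remove the softmax shift degeneracy).
z, lp = np.zeros(4), log_post(np.zeros(4))
samples = []
for i in range(20000):
    z_prop = z.copy()
    z_prop[1:] += rng.normal(0.0, 0.3, 3)
    lp_prop = log_post(z_prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        z, lp = z_prop, lp_prop
    if i >= 5000:                          # discard burn-in
        samples.append(np.exp(z) / np.exp(z).sum())

f_mean = np.mean(samples, axis=0)
print("posterior mean fractions:", np.round(f_mean, 2))
```

The output is a full posterior over fractional contributions rather than a single best fit, which is what lets the study report when septic and fertilizer contributions are of similar magnitude with large uncertainty.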
The Leuven isotope separator on-line laser ion source
Kudryavtsev, Y; Franchoo, S; Huyse, M; Gentens, J; Kruglov, K; Müller, W F; Prasad, N V S; Raabe, R; Reusen, I; Van den Bergh, P; Van Duppen, P; Van Roosbroeck, J; Vermeeren, L; Weissman, L
2002-01-01
An element-selective laser ion source has been used to produce beams of exotic radioactive nuclei and to study their decay properties. The operational principle of the ion source is based on selective resonant laser ionization of nuclear reaction products thermalized and neutralized in a noble gas at high pressure. The ion source has been installed at the Leuven Isotope Separator On-Line (LISOL), which is coupled on-line to the cyclotron accelerator at Louvain-la-Neuve. ⁵⁴,⁵⁵Ni and ⁵⁴,⁵⁵Co isotopes were produced in light-ion-induced fusion reactions. Exotic nickel, cobalt and copper nuclei were produced in proton-induced fission of ²³⁸U. The β decay of the ⁶⁸⁻⁷⁴Ni, ⁶⁷⁻⁷⁰Co, ⁷⁰⁻⁷⁵Cu and ¹¹⁰⁻¹¹⁴Rh isotopes has been studied by means of beta-gamma and gamma-gamma spectroscopy. Recently, the laser ion source has been used to produce neutron-d...
Event Source Identification of Water Pollution Based on Bayesian-MCMC
Institute of Scientific and Technical Information of China (English)
陈海洋; 滕彦国; 王金生; 宋柳霆; 周振瑶
2012-01-01
For the ill-posed inverse problem of environmental hydraulics, an inversion model was constructed based on Bayesian inference and a two-dimensional water quality model. Markov chain Monte Carlo (MCMC) simulation was applied to obtain the posterior probability distributions and statistics of the source's intensity, position, and release time. The results of a case study show that Bayesian inference with MCMC sampling is well suited to inverse problems such as contamination event source identification, achieving high accuracy and small error. Compared with the identification results of a hybrid genetic algorithm with pattern search, the presented approach showed higher stability and robustness on the same inverse problem.
Agarwal, Mukul; Mitter, Sanjoy
2009-01-01
An operational perspective is used to understand the relationship between source and channel coding. This is based on a direct reduction of one problem to another that uses random coding (and hence common randomness) but, unlike all prior work, does not involve any functional computations, in particular, no mutual-information computations. This result is then used to prove a universal source-channel separation theorem in the rate-distortion context, where universality is in the sense of a compound "general channel".
Blind Source Separation Algorithms for PSF Subtraction from Direct Imaging
Shapiro, Jacob; Ranganathan, Nikhil; Savransky, Dmitry; Ruffio, Jean-Baptise; Macintosh, Bruce; GPIES Team
2017-01-01
The principal difficulty with detecting planets via direct imaging is that the target signal is similar in magnitude to, or fainter than, the noise sources in the image. To compensate, several methods exist to subtract the PSF of the host star and other confounding noise sources. One of the most effective methods is Karhunen-Loève Image Processing (KLIP). The core algorithm within KLIP is Principal Component Analysis, which is a member of a class of algorithms called Blind Source Separation (BSS). We examine three other BSS algorithms that may potentially also be used for PSF subtraction: Independent Component Analysis, Stationary Subspace Analysis, and Common Spatial Pattern Filtering. The underlying principles of each algorithm are discussed, as well as the processing steps needed to achieve PSF subtraction. The algorithms are examined both as primary PSF subtraction techniques and as additional postprocessing steps used with KLIP. These algorithms have been used on data from the Gemini Planet Imager, analyzing images of β Pic b. To build a reference library, both Angular Differential Imaging and Spectral Differential Imaging were used. To compare to KLIP, three major metrics are examined: computation time, signal-to-noise ratio, and astrometric and photometric biases in different image regimes (e.g., speckle-dominated compared to Poisson-noise dominated). Preliminary results indicate that these BSS algorithms improve performance when used as an enhancement for KLIP, and that they can achieve similar SNR when used as the primary method of PSF subtraction.
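Since KLIP's core is Principal Component Analysis, the flavor of PSF subtraction it performs can be sketched in a few lines: build a Karhunen-Loève basis from a reference library, project the science frame onto the top components, and subtract. The library, frame size, and injected "planet" below are synthetic stand-ins, not GPI data.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic reference library: 50 stellar PSF frames (20x20 px, flattened)
# sharing a fixed gradient "PSF" plus frame-to-frame noise.
n_ref, n_pix = 50, 400
psf = 10.0 * np.linspace(0.0, 1.0, n_pix)
refs = psf + rng.standard_normal((n_ref, n_pix))
science = psf + np.zeros(n_pix)
science[210] += 5.0               # faint injected "planet" at one pixel

def klip_subtract(frame, refs, k=5):
    """Subtract a PSF model built from the top-k Karhunen-Loeve components
    (principal components) of the mean-subtracted reference library."""
    mean_ref = refs.mean(axis=0)
    _, _, Vt = np.linalg.svd(refs - mean_ref, full_matrices=False)
    Z = Vt[:k]                    # KL basis, shape (k, n_pix)
    resid = frame - mean_ref
    return resid - Z.T @ (Z @ resid)   # remove the component in the KL subspace

resid = klip_subtract(science, refs)
print(int(np.argmax(np.abs(resid))))   # planet pixel dominates the residual
```

The choice of k trades off how much starlight is removed against how much planet signal is absorbed into the PSF model, which is the source of the photometric biases the abstract mentions.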
Meillier, Céline; Chatelain, Florent; Michel, Olivier; Bacon, Roland; Piqueras, Laure; Bacher, Raphael; Ayasso, Hacheme
2016-04-01
We present SELFI, the Source Emission Line FInder, a new Bayesian method optimized for detection of faint galaxies in Multi Unit Spectroscopic Explorer (MUSE) deep fields. MUSE is the new panoramic integral field spectrograph at the Very Large Telescope (VLT) that has unique capabilities for spectroscopic investigation of the deep sky. It has provided data cubes with 324 million voxels over a single 1 arcmin² field of view. To address the challenge of faint-galaxy detection in these large data cubes, we developed a new method that processes 3D data either for modeling or for estimation and extraction of source configurations. This object-based approach yields a natural sparse representation of the sources in massive data fields, such as MUSE data cubes. In the Bayesian framework, the parameters that describe the observed sources are considered random variables. The Bayesian model leads to a general and robust algorithm where the parameters are estimated in a fully data-driven way. This detection algorithm was applied to the MUSE observation of Hubble Deep Field-South. With 27 h total integration time, these observations provide a catalog of 189 sources of various categories, each with a secure redshift. The algorithm retrieved 91% of the galaxies with only 9% false detections. This method also allowed the discovery of three new Lyα emitters and one [OII] emitter, all without any Hubble Space Telescope counterpart. We analyzed the reasons for failure for some targets, and found that the most important limitation of the method arises when faint sources are located in the vicinity of bright spatially resolved galaxies that cannot be approximated by the Sérsic elliptical profile. The software and its documentation are available on the MUSE science web service (muse-vlt.eu/science).
Exploiting Narrowband Efficiency for Broadband Convolutive Blind Source Separation
Directory of Open Access Journals (Sweden)
Aichner Robert
2007-01-01
Based on a recently presented generic broadband blind source separation (BSS) algorithm for convolutive mixtures, we propose in this paper a novel algorithm combining advantages of broadband algorithms with the computational efficiency of narrowband techniques. By selective application of the Szegö theorem which relates properties of Toeplitz and circulant matrices, a new normalization is derived as a special case of the generic broadband algorithm. This results in a computationally efficient and fast converging algorithm without introducing typical narrowband problems such as the internal permutation problem or circularity effects. Moreover, a novel regularization method for the generic broadband algorithm is proposed and subsequently also derived for the proposed algorithm. Experimental results in realistic acoustic environments show improved performance of the novel algorithm compared to previous approximations.
Sources of resonant sound in separated duct flows
Hourigan, K.; Welsh, M. C.; Thompson, M. C.; Stokes, A. N.
An overview of studies at the Commonwealth Scientific and Industrial Research Organization of Australia involving the generation of sound in separating flows observed in wind tunnels is presented. Numerical experiments to pinpoint the sources of sound in these flows are reported. It is found that resonant sound can be sustained by the correct phasing of sound power generation by a vortex with the resonant acoustic field. A feedback loop is established whereby the vortex shedding itself is locked to the sound field. For a given acoustic mode and frequency, the existence of vortex shedding correctly phased to generate net acoustic power is a function of the flow velocity; acoustic resonance then appears periodically as the flow velocity is increased.
Asynchronously sampled blind source separation for coherent optical links
Detwiler, Thomas F.; Searcy, Steven M.; Stark, Andrew J.; Ralph, Stephen E.; Basch, Bert E.
2011-01-01
Polarization multiplexing is an integral technique for generating spectrally efficient 100 Gb/s and higher optical links. Post-coherent-detection, DSP-based polarization demultiplexing of QPSK links is commonly performed after timing recovery. We propose and demonstrate a method of asynchronous blind source separation using the constant modulus algorithm (CMA) on the asynchronously sampled signal to initially separate energy from arbitrarily aligned polarization states. This method lends itself well to implementation as it allows for an open-loop sampling frequency for analog-to-digital conversion at less than twice the symbol rate. We show that the performance of subsequent receiver functions is enhanced by the initial polarization demultiplexing operation. CMA singularity behavior is avoided through tap settling constraints. The method is applicable to QPSK transmissions and many other modulation formats as well, including general QAM signals, offset-QPSK, and CPM, or a combination thereof. We present the architecture and its performance under several different formats and link conditions. Comparisons of complexity and performance are drawn between the proposed architecture and conventional receivers.
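The CMA update at the heart of this approach can be illustrated on a toy single-polarization QPSK signal with mild inter-symbol interference. The channel, tap count, and step size below are invented for illustration; a real polarization demultiplexer would run a 2x2 butterfly of such filters on the two received polarizations.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy single-polarization QPSK stream through a mild 2-tap channel
# (a stand-in for one arm of a 2x2 polarization-demux butterfly filter).
n = 5000
sym = (rng.choice([-1.0, 1.0], n) + 1j * rng.choice([-1.0, 1.0], n)) / np.sqrt(2)
x = sym + 0.3 * np.roll(sym, 1)          # inter-symbol interference

# CMA drives |y|^2 toward the constant modulus R2 = E|s|^4 / E|s|^2 (1 here),
# requiring no carrier or timing knowledge -- hence "blind".
ntaps, mu, R2 = 5, 1e-3, 1.0
w = np.zeros(ntaps, dtype=complex)
w[ntaps // 2] = 1.0                      # center-spike initialization
err = []
for k in range(ntaps, n):
    u = x[k - ntaps:k][::-1]             # tap-delay-line contents
    y = w @ u                            # equalizer output
    e = y * (np.abs(y) ** 2 - R2)        # CMA error term
    w -= mu * e * np.conj(u)             # stochastic-gradient update
    err.append(abs(np.abs(y) ** 2 - R2))

print(np.mean(err[:500]), np.mean(err[-500:]))   # modulus error decreases
```

The center-spike initialization plays the role of the abstract's "tap settling constraints" in a loose sense: it biases the filter away from the degenerate (singular) solutions CMA can otherwise fall into.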
Kopka, Piotr; Wawrzynczak, Anna; Borysiewicz, Mieczyslaw
2016-11-01
In this paper the Bayesian methodology known as Approximate Bayesian Computation (ABC) is applied to the problem of atmospheric contamination source identification. The algorithm's input data are the concentrations of the released substance registered by the distributed sensor network as they arrive on-line. This paper presents the Sequential ABC algorithm in detail and tests its efficiency in estimating the probability distributions of the atmospheric release parameters of a mobile contamination source. The developed algorithms are tested using data from the Over-Land Atmospheric Diffusion (OLAD) field tracer experiment. The paper demonstrates estimation of seven parameters characterizing the contamination source, i.e.: the source's starting position (x,y), its direction of motion (d), its velocity (v), the release rate (q), the start time of the release (ts) and its duration (td). Newly arriving concentrations dynamically update the probability distributions of the search parameters. The atmospheric dispersion Second-order Closure Integrated PUFF (SCIPUFF) model is used as the forward model to predict the concentrations at the sensor locations.
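The simplest member of the ABC family, rejection ABC, conveys the core idea: sample source parameters from the prior, run the forward model, and keep the samples whose predicted concentrations land close to the observations. The toy forward model below (a distance-decay formula standing in for SCIPUFF), the priors, and the tolerance are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy stand-in for the forward dispersion model: concentration at sensor
# positions xs decays with distance from a source at x0 releasing at rate q.
xs = np.array([1.0, 3.0, 6.0, 9.0])

def forward(x0, q):
    return q / (1.0 + (xs - x0) ** 2)

obs = forward(4.0, 2.0) + 0.05 * rng.standard_normal(xs.size)  # noisy readings

# Rejection ABC: draw parameters from the prior, simulate, and keep draws
# whose simulated concentrations fall within a tolerance eps of the data.
n_draws, eps = 200_000, 0.3
x0 = rng.uniform(0.0, 10.0, n_draws)       # prior on source position
q = rng.uniform(0.0, 5.0, n_draws)         # prior on release rate
sims = q[:, None] / (1.0 + (xs[None, :] - x0[:, None]) ** 2)
keep = np.linalg.norm(sims - obs, axis=1) < eps
post = np.column_stack([x0[keep], q[keep]])  # approximate posterior samples

print(post.shape[0], post.mean(axis=0))      # posterior mean near (4.0, 2.0)
```

Sequential ABC improves on this by shrinking eps over rounds and reusing accepted samples as the next proposal population, which is what makes on-line updating with newly arriving concentrations practical.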
Blind source separation with unknown and dynamically changing number of source signals
Institute of Scientific and Technical Information of China (English)
YE Jimin; ZHANG Xianda; ZHU Xiaolong
2006-01-01
The contrast function remains an open problem in blind source separation (BSS) when the number of source signals is unknown and/or changes dynamically. This paper studies the problem and proves that mutual information is still a contrast function for BSS if the mixing matrix is of full column rank. The mutual information reaches its minimum at the separation points, where the random outputs of the BSS system are the scaled and permuted source signals, while the other outputs are zero. Using the property that the transpose of the mixing matrix and a matrix composed of m observed signals have identical null spaces with probability one, a practical method is proposed that can detect the unknown number of source signals n and further track dynamical changes in the number of sources from a small amount of data. The effectiveness of the proposed theory and the developed novel algorithm is verified by adaptive BSS simulations with an unknown and dynamically changing number of source signals.
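A minimal sketch of why the number of sources is detectable from the observations alone: when the mixing matrix has full column rank and m > n, directions in the null space of its transpose carry (almost) no signal energy, so the observation covariance has exactly n dominant eigenvalues. The mixing matrix, source statistics, and threshold below are assumptions for illustration, not the paper's actual detection statistic.

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy scene: n = 3 sources mixed into m = 6 observations through a mixing
# matrix of full column rank, plus weak sensor noise.
n_src, m_obs, T = 3, 6, 5000
S = rng.laplace(size=(n_src, T))            # source signals
A = rng.standard_normal((m_obs, n_src))     # mixing matrix (full column rank)
X = A @ S + 1e-3 * rng.standard_normal((m_obs, T))

# The observation covariance splits into n dominant and m - n tiny
# eigenvalues; counting the dominant ones estimates the number of sources.
eigvals = np.linalg.eigvalsh(np.cov(X))[::-1]      # descending order
n_est = int(np.sum(eigvals > 1e-3 * eigvals[0]))   # simple threshold rule
print(n_est)
```

Re-running this count on a sliding window of recent data gives a crude version of tracking a dynamically changing source number.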
Xia, Yongqiu; Li, Yuefei; Zhang, Xinyu; Yan, Xiaoyuan
2017-01-01
Nitrate (NO3-) pollution is a serious problem worldwide, particularly in countries with intensive agricultural and population activities. Previous studies have used δ15N-NO3- and δ18O-NO3- to determine the NO3- sources in rivers. However, this approach is subject to substantial uncertainties and limitations because of the numerous NO3- sources, the wide isotopic ranges, and the existing isotopic fractionations. In this study, we outline a combined procedure for improving the determination of NO3- sources in a paddy agriculture-urban gradient watershed in eastern China. First, the main sources of NO3- in the Qinhuai River were examined by the dual-isotope biplot approach, in which we narrowed the isotope ranges using site-specific isotopic results. Next, the bacterial groups and chemical properties of the river water were analyzed to verify these sources. Finally, we introduced a Bayesian model to apportion the spatiotemporal variations of the NO3- sources. Denitrification was first incorporated into the Bayesian model because denitrification plays an important role in the nitrogen pathway. The results showed that fertilizer contributed large amounts of NO3- to the surface water in traditional agricultural regions, whereas manure effluents were the dominant NO3- source in intensified agricultural regions, especially during the wet seasons. Sewage effluents were important in all three land uses and exhibited great differences between the dry season and the wet season. This combined analysis quantitatively delineates the proportion of NO3- sources from paddy agriculture to urban river water for both dry and wet seasons and incorporates isotopic fractionation and uncertainties in the source compositions.
A new geometric approach to blind source separation of bounded sources
Institute of Scientific and Technical Information of China (English)
Jinlong Zhang; Guoxu Zhou; Zuyuan Yang; Xiaoxin Liao
2009-01-01
Based on the minimum-range approach, a new geometric approach is proposed to deal with blind source separation in this paper. The new approach is the batch mode of the original minimum-range approach. Compared with the original approach, the optimization algorithm of the proposed approach needs no parameters and is more efficient and reliable. In addition, the extension of minimum-range-based approaches is discussed. The simulations show the efficiency of the proposed approach.
Hedlund, Jonas
2014-01-01
This paper introduces private sender information into a sender-receiver game of Bayesian persuasion with monotonic sender preferences. I derive properties of increasing differences related to the precision of signals and use these to fully characterize the set of equilibria robust to the intuitive criterion. In particular, all such equilibria are either separating, i.e., the sender's choice of signal reveals his private information to the receiver, or fully disclosing, i.e., the outcome of th...
Ash in composting of source-separated catering waste.
Koivula, Niina; Räikkönen, Tarja; Urpilainen, Sari; Ranta, Jussi; Hänninen, Kari
2004-07-01
Our earlier experiments in small composters (220 l) indicated the favourable effect of ash from co-incineration of sorted dry waste on the composting of catering waste. The aim of this new study was to clarify further, at a scale of 10 m3, the feasibility of using similar ash as an additive in composting. Source-separated catering waste was mixed with bulking agent (peat and wood chips) and fuel ash from a small (4 MW) district heating power plant. Three compost mixes (CM) were obtained: CM I with 0%, CM II with 10% and CM III with 20 wt.% of fuel ash. These three different mixes were composted in a 10-m3 drum composter as three parallel experiments for 2 weeks each, from January to April 2000. After drum composting, masses were placed according to mixing proportions in separate curing piles. The catering waste fed to the drum was cold, sometimes icy. Even then the temperature rapidly increased to over 50 degrees C. In CM III, the temperature rose as high as 80 degrees C, and after the first week of composting the temperature was about 20 degrees C higher in the CMs II and III than in the CM I. It also improved the oxygen concentrations at the feeding end of the drum and obviously prevented the formation of H2S. No odour problems arose during the composting. Addition of ash increased the heavy metal contents of the composting masses, but the compost was suitable for cultivation or green area construction. Ash clearly decreased the loss of total nitrogen in a time span of 2 years. The lower amounts of nitrogen mean that the amounts applied per hectare can be greater than for normal composts. Measured by mineralization, the breaking down of the organic matter was more rapid in the CM III than in the CM I. Humic acid increased steadily during first 12 months composting, from the initial 39 mg/g organic matter to 115 and 137 mg/g in CMs II and III. Measured by temperature, mineralization and humification the addition of ash appeared to boost the composting. Ash had
Directory of Open Access Journals (Sweden)
Meng Wang
2016-08-01
A high concentration of nitrate (NO3−) in surface water threatens aquatic systems and human health. Revealing nitrate characteristics and identifying its sources are fundamental to making effective water management strategies. However, nitrate sources in multi-tributary and mixed-land-use watersheds remain unclear. In this study, based on 20 surface water sampling sites monitored for more than two years, from April 2012 to December 2014, water chemical and dual isotopic approaches (δ15N-NO3− and δ18O-NO3−) were integrated for the first time to evaluate nitrate characteristics and sources in the Huashan watershed, Jianghuai hilly region, China. Nitrate-nitrogen concentrations (ranging from 0.02 to 8.57 mg/L) were spatially heterogeneous, influenced by hydrogeological and land use conditions. Proportional contributions of five potential nitrate sources (i.e., precipitation; manure and sewage, M & S; soil nitrogen, NS; nitrate fertilizer; and nitrate derived from ammonia fertilizer and rainfall) were estimated by using a Bayesian isotope mixing model. The results showed that nitrate source contributions varied significantly among different rainfall conditions and land use types. For the whole watershed, M & S (manure and sewage) and NS (soil nitrogen) were the major nitrate sources in both wet and dry seasons (from 28% to 36% for manure and sewage and from 24% to 27% for soil nitrogen, respectively). Overall, combining a dual isotope method with a Bayesian isotope mixing model offered a useful and practical way to qualitatively analyze nitrate sources and transformations as well as quantitatively estimate the contributions of potential nitrate sources in drinking water source watersheds of the Jianghuai hilly region, eastern China.
Towards Enhanced Underwater Lidar Detection via Source Separation
Illig, David W.
Interest in underwater optical sensors has grown as technologies enabling autonomous underwater vehicles have been developed. Propagation of light through water is complicated by the dual challenges of absorption and scattering. While absorption can be reduced by operating in the blue-green region of the visible spectrum, reducing scattering is a more significant challenge. Collection of scattered light negatively impacts underwater optical ranging, imaging, and communications applications. This thesis concentrates on the ranging application, where scattering reduces operating range as well as range accuracy. The focus of this thesis is on the problem of backscatter, which can create a "clutter" return that may obscure submerged target(s) of interest. The main contributions of this thesis are explorations of signal processing approaches to increase the separation between the target and backscatter returns. Increasing this separation allows detection of weak targets in the presence of strong scatter, increasing both operating range and range accuracy. Simulation and experimental results will be presented for a variety of approaches as functions of water clarity and target position. This work provides several novel contributions to the underwater lidar field: 1. Quantification of temporal separation approaches: While temporal separation has been studied extensively, this work provides a quantitative assessment of the extent to which both high frequency modulation and spatial filter approaches improve the separation between target and backscatter. 2. Development and assessment of frequency separation: This work includes the first frequency-based separation approach for underwater lidar, in which the channel frequency response is measured with a wideband waveform. Transforming to the time-domain gives a channel impulse response, in which target and backscatter returns may appear in unique range bins and thus be separated. 3. Development and assessment of statistical
Residents’ Household Solid Waste (HSW Source Separation Activity: A Case Study of Suzhou, China
Directory of Open Access Journals (Sweden)
Hua Zhang
2014-09-01
Though the Suzhou government has provided household solid waste (HSW) source separation since 2000, the program remains largely ineffective. Between January and March 2014, the authors conducted an intercept survey in five different community groups in Suzhou, and 505 valid surveys were completed. Based on the survey, the authors used an ordered probit regression to study residents' HSW source separation activities, both for Suzhou as a whole and for the five community groups. Results showed that 43% of the respondents in Suzhou thought they knew how to source-separate HSW, and 29% of them had source-separated HSW accurately. The results also found that the current HSW source separation pilot program in Suzhou is valid, as both HSW source separation facilities and residents' separation behavior improved steadily as the program was implemented. The main determinants of residents' HSW source separation behavior are residents' age, HSW source separation facilities and government preferential policies. Accessibility to waste management services is particularly important. Attitudes and willingness do not have significant impacts on residents' HSW source separation behavior.
Bayesian Integration of Isotope Ratios for Geographic Sourcing of Castor Beans
Energy Technology Data Exchange (ETDEWEB)
Webb-Robertson, Bobbie-Jo M.; Kreuzer, Helen W.; Hart, Garret L.; Ehleringer, James; West, Jason B.; Gill, Gary A.; Duckworth, Douglas C.
2012-08-15
Recent years have seen an increase in the forensic interest associated with the poison ricin, which is extracted from the seeds of the Ricinus communis plant. Both light element (C, N, O, and H) and strontium (Sr) isotope ratios have previously been used to associate organic material with geographic regions of origin. We present a Bayesian integration methodology that can more accurately predict the region of origin for a castor bean than individual models developed independently for light element stable isotopes or Sr isotope ratios. Our results demonstrate a clear improvement in the ability to correctly classify regions based on the integrated model, with a class accuracy of 60.9 ± 2.1% versus 55.9 ± 2.1% and 40.2 ± 1.8% for the light element and strontium (Sr) isotope ratios, respectively. In addition, we show graphically the strengths and weaknesses of each dataset with respect to class prediction and how the integration of these datasets strengthens the overall model.
Directory of Open Access Journals (Sweden)
Lin Wang
2010-01-01
Frequency-domain blind source separation (BSS) performs poorly under high reverberation because the independence assumption collapses in each frequency bin as the number of bins increases. To improve the separation result, this paper proposes a method which combines two techniques by using beamforming as a preprocessor for blind source separation. With the sound source locations assumed to be known, the mixed signals are dereverberated and enhanced by beamforming; the beamformed signals are then further separated by blind source separation. To implement the proposed method, a superdirective fixed beamformer is designed for beamforming, and an interfrequency dependence-based permutation alignment scheme is presented for frequency-domain blind source separation. With beamforming shortening the mixing filters and reducing noise before blind source separation, the combined method works better in reverberation. The performance of the proposed method is investigated by separating up to 4 sources in different environments with reverberation times from 100 ms to 700 ms. Simulation results verify that the proposed method outperforms beamforming or blind source separation used alone. Analysis demonstrates that the proposed method is computationally efficient and appropriate for real-time processing.
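The benefit of beamforming as a preprocessor can be illustrated with the simplest fixed beamformer, delay-and-sum (a crude stand-in for the paper's superdirective design): aligning and averaging the microphone signals reinforces the coherent target while incoherent noise averages down. The array geometry, integer delays, and noise level below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy narrowband scene: a 4-microphone array records a delayed copy of one
# target signal plus uncorrelated sensor noise; per-mic delays assumed known.
n = 4000
s = rng.standard_normal(n)                    # target source signal
delays = [0, 2, 4, 6]                         # integer sample delays per mic
mics = np.stack([np.roll(s, d) for d in delays])
mics += 0.5 * rng.standard_normal(mics.shape)

# Delay-and-sum: undo each delay and average. The target adds coherently
# while the noise averages down, improving SNR by ~10*log10(M) dB.
aligned = np.stack([np.roll(x, -d) for x, d in zip(mics, delays)])
beam = aligned.mean(axis=0)

snr_single = 10 * np.log10(np.var(s) / np.var(mics[0] - s))   # one mic
snr_beam = 10 * np.log10(np.var(s) / np.var(beam - s))        # beamformed
print(round(snr_single, 1), round(snr_beam, 1))
```

In the combined scheme, one such beam is steered at each known source location, and the (shorter, cleaner) beam outputs are what the frequency-domain BSS stage then separates.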
Single-channel source separation using non-negative matrix factorization
DEFF Research Database (Denmark)
Schmidt, Mikkel Nørgaard
Single-channel source separation problems occur when a number of sources emit signals that are mixed and recorded by a single sensor, and we are interested in estimating the original source signals based on the recorded mixture. This problem, which occurs in many sciences, is inherently under-determined and its solution relies on making appropriate assumptions concerning the sources. This dissertation is concerned with model-based probabilistic single-channel source separation based on non-negative matrix factorization, and consists of two parts: i) three introductory chapters and ii) five published papers, in which a number of methods for single-channel source separation based on non-negative matrix factorization are presented. In the papers, the methods are applied to separating audio signals such as speech and musical instruments and separating different types of tissue in chemical shift imaging.
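A minimal sketch of non-negative matrix factorization with the classic Lee-Seung multiplicative updates, the building block such separation methods rely on: a non-negative "spectrogram" V is factored as V ≈ WH, and each (column of W, row of H) pair models one source's spectral profile and activation. The data here are synthetic, and the Euclidean cost is one of several the dissertation's methods might use.

```python
import numpy as np

rng = np.random.default_rng(6)

# Synthetic non-negative "spectrogram": 30 frequency bins x 100 frames,
# generated by two sources with distinct spectral profiles, mixed additively.
W_true = rng.uniform(0.0, 1.0, (30, 2))
H_true = rng.uniform(0.0, 1.0, (2, 100))
V = W_true @ H_true + 1e-6

# Lee-Seung multiplicative updates for the Euclidean cost ||V - WH||^2;
# starting from positive values, W and H stay non-negative by construction.
r = 2
W = rng.uniform(0.1, 1.0, (30, r))
H = rng.uniform(0.1, 1.0, (r, 100))
for _ in range(500):
    H *= (W.T @ V) / (W.T @ W @ H + 1e-12)
    W *= (V @ H.T) / (W @ H @ H.T + 1e-12)

err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
V1_hat = W[:, :1] @ H[:1, :]    # reconstruction attributed to one "source"
print(round(err, 4))
```

In audio use, W would be learned per speaker or instrument from training data, and the per-source reconstructions like V1_hat would be turned back into waveforms by spectrogram masking.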
Model of municipal solid waste source separation activity: a case study of Beijing.
Yang, Lei; Li, Zhen-Shan; Fu, Hui-Zhen
2011-02-01
One major challenge faced by Beijing is dealing with the enormous amount of municipal solid waste (MSW) generated, which contains a high percentage of food waste. Source separation is considered an effective means of reducing waste and enhancing recycling. However, few studies have focused on quantification of the mechanism of source separation activity. Therefore, this study was conducted to establish a mathematical model of source separation activity (MSSA) that correlates the source separation ratio with the following parameters: separation facilities, awareness, separation transportation, participation atmosphere, environmental profit, sense of honor, and economic profit. The MSSA consisted of two equations, one related to the behavior generation stage and one related to the behavior stability stage. The source separation ratios of the residential community, office building, and primary and middle school were calculated using the MSSA. Data for analysis were obtained from a 1-yr investigation and a questionnaire conducted at 128 MSW clusters around Beijing. The results revealed that office buildings had an initial separation ratio of 80% and a stable separation ratio of 65.86%, whereas residential communities and primary and middle schools did not have a stable separation ratio. The MSSA curve took on two shapes. In addition, internal motivations and the separation transportation ratio were found to be key parameters of the MSSA. This model can be utilized for other cities and countries.
Dettmer, J.; Hossen, M. J.; Cummins, P. R.
2014-12-01
This paper develops a Bayesian inversion to infer spatio-temporal parameters of the tsunami source (sea surface) due to megathrust earthquakes. To date, tsunami-source parameter uncertainties are poorly studied. In particular, the effects of parametrization choices (e.g., discretisation, finite rupture velocity, dispersion) on uncertainties have not been quantified. This approach is based on a trans-dimensional self-parametrization of the sea surface, avoids regularization, and provides rigorous uncertainty estimation that accounts for model-selection ambiguity associated with the source discretisation. The sea surface is parametrized using self-adapting irregular grids which match the local resolving power of the data and provide parsimonious solutions for complex source characteristics. Finite and spatially variable rupture velocity fields are addressed by obtaining causal delay times from the Eikonal equation. Data are considered from ocean-bottom pressure and coastal wave gauges. Data predictions are based on Green-function libraries computed from ocean-basin scale tsunami models for cases that include/exclude dispersion effects. Green functions are computed for elementary waves of Gaussian shape and grid spacing which is below the resolution of the data. The inversion is applied to tsunami waveforms from the great Mw=9.0 2011 Tohoku-Oki (Japan) earthquake. Posterior results show a strongly elongated tsunami source along the Japan trench, as obtained in previous studies. However, we find that the tsunami data is fit with a source that is generally simpler than obtained in other studies, with a maximum amplitude less than 5 m. In addition, the data are sensitive to the spatial variability of rupture velocity and require a kinematic source model to obtain satisfactory fits which is consistent with other work employing linear multiple time-window parametrizations.
Energy Technology Data Exchange (ETDEWEB)
Keats, A.; Lien, F.S. [Waterloo Univ., ON (Canada). Dept. of Mechanical Engineering; Yee, E. [Defence Research and Development Canada, Medicine Hat, AB (Canada)
2006-07-01
A Bayesian probabilistic inferential framework capable of incorporating errors and prior information was presented. Bayesian inference was used to find the posterior probability density function of the source parameters in a set of concentration measurements. A method of calculating the source-receptor relationship required for the determination of direct probability was provided which used the adjoint of the transport equation for the scalar concentration. The posterior distribution of the source parameters was sampled using a Markov chain Monte Carlo method. The inverse source determination method was validated against real data sets obtained from a highly disturbed, complex flow field in an urban environment. Data sets included a water-channel simulation of near-field dispersion of contaminant plumes in a large array of building-like obstacles, and a full-scale experiment in Oklahoma City. It was concluded that the 2 examples validated the proposed approach for inverse source determination.
30 CFR 57.6404 - Separation of blasting circuits from power source.
2010-07-01
... 30 Mineral Resources 1 (2010-07-01) Separation of blasting circuits from power... NONMETAL MINES Explosives Electric Blasting-Surface and Underground § 57.6404 Separation of blasting circuits from power source. (a) Switches used to connect the power source to a blasting circuit shall...
30 CFR 56.6404 - Separation of blasting circuits from power source.
2010-07-01
... 30 Mineral Resources 1 (2010-07-01) Separation of blasting circuits from power... MINES Explosives Electric Blasting § 56.6404 Separation of blasting circuits from power source. (a) Switches used to connect the power source to a blasting circuit shall be locked in the open position...
A LASER ION-SOURCE FOR ONLINE MASS SEPARATION
VANDUPPEN, P; DENDOOVEN, P; HUYSE, M; VERMEEREN, L; QAMHIEH, ZN; SILVERANS, RE; VANDEWEERT, E
1992-01-01
A laser ion source based on resonance photo ionization in a gas cell is proposed. The gas cell, filled with helium, consists of a target chamber in which the recoil products are stopped and neutralized, and an ionization chamber where the atoms of interest are selectively ionized by the laser light.
Fixed-point blind source separation algorithm based on ICA
Institute of Scientific and Technical Information of China (English)
Hongyan LI; Jianfen MA; Deng'ao LI; Huakui WANG
2008-01-01
This paper introduces a fixed-point learning algorithm based on independent component analysis (ICA); the model and process of this algorithm and simulation results are presented. Kurtosis was adopted as the criterion of independence. The experimental results show that, compared with the traditional ICA algorithm based on stochastic gradients, this algorithm has advantages such as fast convergence and no need for any dynamically tuned parameters, such as a step size. The algorithm is a highly efficient and reliable method for blind signal separation.
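A minimal sketch of a kurtosis-based fixed-point ICA iteration of the kind the paper describes (the FastICA-style update w ← E[z(wᵀz)³] − 3w on whitened data, with no step size to tune): the mixing matrix and source distributions below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy mixture: two sub-Gaussian (uniform) sources through a 2x2 mixing matrix.
n = 20000
S = rng.uniform(-1.0, 1.0, (2, n))
A = np.array([[1.0, 0.6], [0.4, 1.0]])
X = A @ S

# Whiten the observations: zero mean, identity covariance.
Xc = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(Xc))
Z = E @ np.diag(d ** -0.5) @ E.T @ Xc

# Kurtosis-based fixed-point iteration: no learning rate is needed, which is
# the advantage over stochastic-gradient ICA noted in the abstract.
w = rng.standard_normal(2)
w /= np.linalg.norm(w)
for _ in range(50):
    y = w @ Z
    w = (Z * y ** 3).mean(axis=1) - 3.0 * w   # w <- E[z (w.z)^3] - 3w
    w /= np.linalg.norm(w)

y = w @ Z                       # one recovered source (up to sign and scale)
kurt = (y ** 4).mean() - 3.0 * (y ** 2).mean() ** 2
print(round(kurt, 2))
```

The recovered output's kurtosis approaches that of a unit-variance uniform source (about −1.2) rather than the less extreme kurtosis of the mixtures, which is exactly the independence criterion the iteration maximizes in magnitude.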
A Bayesian formulation of seismic fragility analysis of safety related equipment
Energy Technology Data Exchange (ETDEWEB)
Wang, Z-L.; Pandey, M.; Xie, W-C., E-mail: z268wang@uwaterloo.ca, E-mail: mdpandey@uwaterloo.ca, E-mail: xie@uwaterloo.ca [Univ. of Waterloo, Ontario (Canada)
2013-07-01
A Bayesian approach to seismic fragility analysis of safety-related equipment is formulated. Rather than treating the two sources of uncertainty in the parameter estimation in two separate steps using classical statistics, a Bayesian hierarchical model is advocated in this article for interpreting and combining the various uncertainties more clearly. In addition, with the availability of additional earthquake experience data and shaking-table test results, a Bayesian approach to updating the fragility model of safety-related equipment is formulated by incorporating acquired failure and survivor evidence. Numerical results show the significance of using the Bayesian approach in fragility analysis. (author)
Gaspar, Leticia; Owens, Philip; Petticrew, Ellen; Lobb, David; Koiter, Alexander; Reiffarth, Dominic; Barthod, Louise; Liu, Kui; Martinez-Carreras, Nuria
2015-04-01
An understanding of sediment redistribution processes and the main sediment sources within a watershed is needed to support catchment management strategies, to control soil erosion processes, and to preserve water quality and ecological status. The fingerprinting technique is increasingly recognised as a method for establishing the source of the sediment transported within a catchment. However, the different behaviour of the various fingerprinting properties has been recognised as a major limitation of the technique, and the uncertainty associated with tracer selection has to be addressed. Do the different properties give similar results? Can we combine different groups of tracers? This study aims to compare and evaluate the differences between fingerprinting predictions provided by a Bayesian mixing model using different groups of tracer properties for use in sediment source identification. We are employing fallout radionuclides (137Cs, 210Pbex) and geochemical elements as conventional fingerprinting properties, and colour parameters and compound-specific stable isotopes (CSSIs) as emerging properties; both alone and in combination. These fingerprinting properties are being used to determine the proportional contributions of fine sediment in the South Tobacco Creek Watershed, an agricultural catchment located in south-central Manitoba in Canada. We present preliminary results to evaluate the use of different statistical procedures to increase the accuracy of fingerprinting outputs and establish protocols for the selection of appropriate fingerprint properties.
Liu, Xianhua; Randall, R. B.
2005-11-01
Internal combustion engines have several vibration sources, such as combustion, fuel injection, piston slap and valve operation. For machine condition monitoring or design improvement purposes, it is necessary to separate the vibration signals caused by different sources and then analyse each of them individually. However, traditional frequency analysis techniques are not very useful due to the overlap of the different sources over a wide frequency range. This paper attempts to separate the vibration sources, especially piston slap, by using blind source separation techniques, with the intention of revealing the potential of the new technique for solving mechanical vibration problems. The BSS method and the blind least mean square algorithm, using Gray's variable norm as a measure of non-Gaussianity of the sources, are briefly described, and separation results for both simulated and measured data are presented and discussed.
A Bayesian Approach to Discovering Truth from Conflicting Sources for Data Integration
Zhao, Bo; Gemmell, Jim; Han, Jiawei
2012-01-01
In practical data integration systems, it is common for the data sources being integrated to provide conflicting information about the same entity. Consequently, a major challenge for data integration is to derive the most complete and accurate integrated records from diverse and sometimes conflicting sources. We term this challenge the truth finding problem. We observe that some sources are generally more reliable than others, and therefore a good model of source quality is the key to solving the truth finding problem. In this work, we propose a probabilistic graphical model that can automatically infer true records and source quality without any supervision. In contrast to previous methods, our principled approach leverages a generative process of two types of errors (false positive and false negative) by modeling two different aspects of source quality. In so doing, ours is also the first approach designed to merge multi-valued attribute types. Our method is scalable, due to an efficient sampling-based inf...
Jakkareddy, Pradeep S.; Balaji, C.
2016-09-01
This paper employs the Bayesian Metropolis-Hastings Markov chain Monte Carlo (MH-MCMC) algorithm to solve the inverse heat transfer problem of determining the spatially varying heat transfer coefficient on a flat plate with flush-mounted discrete heat sources, from temperatures measured at the bottom of the plate. The Nusselt number is assumed to be of the form Nu = aRe^b(x/l)^c. To supply reasonable values of 'a' and 'b' to the inverse problem, limited two-dimensional conjugate convection simulations were first performed with Comsol. Guided by these, different values of 'a' and 'b' were input to a computationally simpler problem of conjugate conduction in the flat plate (15 mm thick), yielding temperature distributions at the bottom of the plate, which is a more convenient location for measuring temperatures without disturbing the flow. Since the goal of this work is to demonstrate the efficacy of the Bayesian approach in accurately retrieving 'a' and 'b', numerically generated temperatures with known values of 'a' and 'b' are treated as 'surrogate' experimental data. The inverse problem is then solved by repeatedly using the forward solutions together with the MH-MCMC approach. To speed up the estimation, the forward model is replaced by an artificial neural network. The mean, maximum a posteriori, and standard deviation of the estimated parameters 'a' and 'b' are reported. The robustness of the proposed method is examined by synthetically adding noise to the temperatures.
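The estimation procedure described above can be sketched with a toy random-walk Metropolis-Hastings chain. Everything here is a stand-in: the Reynolds numbers, noise level, prior box, proposal scales, and the initial guess (playing the role of the Comsol-guided starting values) are invented, and a bare power law replaces the full conjugate-conduction forward solver.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy forward model standing in for the conduction solver:
# Nu = a * Re**b, evaluated at a few Reynolds numbers.
Re = np.array([1e3, 2e3, 5e3, 1e4, 2e4])

def forward(a, b):
    return a * Re ** b

# "Surrogate" experimental data: generated with known a, b plus noise,
# mirroring the paper's use of numerically generated temperatures.
a_true, b_true, sigma = 0.023, 0.8, 0.5
data = forward(a_true, b_true) + rng.normal(0, sigma, Re.size)

def log_posterior(a, b):
    if not (0.0 < a < 1.0 and 0.0 < b < 2.0):   # flat prior on a box
        return -np.inf
    r = data - forward(a, b)
    return -0.5 * np.sum(r ** 2) / sigma ** 2   # Gaussian likelihood

# Random-walk Metropolis-Hastings, started from a coarse initial guess
# (playing the role of the CFD-guided values of 'a' and 'b').
theta = np.array([0.03, 0.78])
logp = log_posterior(*theta)
samples = []
for _ in range(20000):
    prop = theta + rng.normal(0, [5e-4, 2e-3])   # proposal step
    logp_prop = log_posterior(*prop)
    if np.log(rng.uniform()) < logp_prop - logp:  # accept/reject
        theta, logp = prop, logp_prop
    samples.append(theta)

samples = np.array(samples)[5000:]                # discard burn-in
a_mean, b_mean = samples.mean(axis=0)
a_std, b_std = samples.std(axis=0)
print(f"a = {a_mean:.4f} +/- {a_std:.4f}, b = {b_mean:.3f} +/- {b_std:.3f}")
```

The chain reports the posterior mean and standard deviation of 'a' and 'b', the same summaries the paper tabulates; in the paper the expensive forward call is further replaced by a neural-network surrogate.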
Student support and perceptions of urine source separation in a university community.
Ishii, Stephanie K L; Boyer, Treavor H
2016-09-01
Urine source separation, i.e., the collection and treatment of human urine as a separate waste stream, has the potential to improve many aspects of water resource management and wastewater treatment. However, social considerations must be taken into account for successful implementation of this alternative wastewater system. This work evaluated the perceptions of urine source separation held by students living on campus at a major university in the Southeastern United States. Perceptions were evaluated in the context of the Theory of Planned Behavior. The survey population represents one group within a community type (universities) that is expected to be an excellent testbed for urine source separation. Overall, respondents reported high levels of support for urine source separation after watching a video on expected benefits and risks; e.g., 84% indicated that they would vote in favor of urine source separation in residence halls. Support was less apparent when measured by willingness to pay, as 33% of respondents were unwilling to pay for the implementation of urine source separation and 40% were only willing to pay $1 to $10 per semester. Water conservation was largely identified as the most important benefit of urine source separation, and there was little concern reported about the use of urine-based fertilizers. Statistical analyses showed that one's environmental attitude, environmental behavior, perceptions of support within the university community, and belief that student opinions have an impact on university decision makers were significantly correlated with one's support for urine source separation. This work helps identify community characteristics that lend themselves to acceptance of urine source separation, such as those related to environmental attitudes/behaviors and perceptions of behavioral control and subjective norm. Critical aspects of these alternative wastewater systems that require attention in order to foster public
DEFF Research Database (Denmark)
Stahlhut, Carsten; Mørup, Morten; Winther, Ole;
2011-01-01
We present an approach to handle forward model uncertainty for EEG source reconstruction. A stochastic forward model representation is motivated by the many random contributions to the path from sources to measurements including the tissue conductivity distribution, the geometry of the cortical s...
Blind Source Separation Based on Covariance Ratio and Artificial Bee Colony Algorithm
Directory of Open Access Journals (Sweden)
Lei Chen
2014-01-01
Full Text Available The computational cost of blind source separation based on bio-inspired intelligence optimization is high. To solve this problem, we propose an effective blind source separation algorithm based on the artificial bee colony algorithm. In the proposed algorithm, the covariance ratio of the signals is used as the objective function, and the artificial bee colony algorithm is used to optimize it. Each source signal component that is separated out is then removed from the mixtures using the deflation method. All the source signals can be recovered by repeating the separation process. Simulation experiments demonstrate that the proposed algorithm achieves significant improvements in computational cost and separation quality compared with previous algorithms.
Empirical Study on Factors Influencing Residents' Behavior of Separating Household Wastes at Source
Institute of Scientific and Technical Information of China (English)
Qu Ying; Zhu Qinghua; Murray Haight
2007-01-01
Source separation is the basic premise for making effective use of household wastes. In eight cities of China, however, several pilot projects of source separation ultimately failed because of poor participation by residents. To solve this problem, identifying the factors that influence residents' source-separation behavior becomes crucial. By means of a questionnaire survey, we conducted descriptive analysis and exploratory factor analysis. The results show that trouble-feeling, moral notion, environmental protection, public education, environmental value, and knowledge deficiency are the main factors influencing residents' decisions to separate their household wastes. According to the contribution of these six main factors to overall source-separation behavior, their influencing power is analyzed, which provides suggestions on household waste management for policy makers and decision makers in China.
Source Separation and Higher-Order Causal Analysis of MEG and EEG
Zhang, Kun
2012-01-01
Separation of the sources and analysis of their connectivity have been important topics in EEG/MEG analysis. To solve this problem in an automatic manner, we propose a two-layer model in which the sources are conditionally uncorrelated from each other, but not independent; the dependence is caused by the causality in their time-varying variances (envelopes). The model is identified in two steps. We first propose a new source separation technique which takes into account the autocorrelations (which may be time-varying) and time-varying variances of the sources. The causality in the envelopes is then discovered by exploiting a special kind of multivariate GARCH (generalized autoregressive conditional heteroscedasticity) model. The resulting causal diagram gives the effective connectivity between the separated sources; in our experimental results on MEG data, sources with similar functions are grouped together, with negative influences between groups, and the groups are connected via some interesting sources.
Separation of radiation from two sources from their known radiated sum field
DEFF Research Database (Denmark)
Laitinen, Tommi; Pivnenko, Sergey
2011-01-01
This paper presents a technique for complete and exact separation of the radiated fields of two sources (at the same frequency) from the knowledge of their radiated sum field. The two sources can be arbitrary but it must be possible to enclose the sources inside their own non-intersecting minimum...
Semi-blind Source Separation Using Head-Related Transfer Functions
DEFF Research Database (Denmark)
Pedersen, Michael Syskind; Hansen, Lars Kai; Kjems, Ulrik;
2004-01-01
An online blind source separation algorithm which is a special case of the geometric algorithm by Parra and Fancourt has been implemented for the purpose of separating sounds recorded at microphones placed at each side of the head. By using the assumption that the position of the two sounds...... are known, the source separation algorithm has been geometrically constrained. Since the separation takes place in a non free-field, a head-related transfer function (HRTF) is used to simulate the response between microphones placed at the two ears. The use of a HRTF instead of assuming free-field improves...
A Bayesian approach to quantify the contribution of animal-food sources to human salmonellosis
DEFF Research Database (Denmark)
Hald, Tine; Vose, D.; Wegener, Henrik Caspar
2004-01-01
Based on the data from the integrated Danish Salmonella surveillance in 1999, we developed a mathematical model for quantifying the contribution of each of the major animal-food sources to human salmonellosis. The model was set up to calculate the number of domestic and sporadic cases caused by different Salmonella sero- and phage types as a function of the prevalence of these Salmonella types in the animal-food sources and the amount of food source consumed. A multiparameter prior accounts for the presumed but unknown differences between serotypes and food sources with respect to causing human... The quantification of the contribution of animal-food sources to human salmonellosis has proved to be a valuable tool in risk management in Denmark and provides an example of how to integrate quantitative risk assessment and zoonotic disease surveillance.
Xu, Wanying; Zhou, Chuanbin; Lan, Yajun; Jin, Jiasheng; Cao, Aixin
2015-05-01
Municipal solid waste (MSW) management (MSWM) is most important and challenging in large urban communities. Sound community-based waste management systems normally include waste reduction and material recycling elements, often entailing the separation of recyclable materials by the residents. To increase the efficiency of source separation and recycling, an incentive-based source separation model was designed and this model was tested in 76 households in Guiyang, a city of almost three million people in southwest China. This model embraced the concepts of rewarding households for sorting organic waste, government funds for waste reduction, and introducing small recycling enterprises for promoting source separation. Results show that after one year of operation, the waste reduction rate was 87.3%, and the comprehensive net benefit under the incentive-based source separation model increased by 18.3 CNY tonne(-1) (2.4 Euros tonne(-1)), compared to that under the normal model. The stakeholder analysis (SA) shows that the centralized MSW disposal enterprises had minimum interest and may oppose the start-up of a new recycling system, while small recycling enterprises had a primary interest in promoting the incentive-based source separation model, but they had the least ability to make any change to the current recycling system. The strategies for promoting this incentive-based source separation model are also discussed in this study.
The Research of Blind Source Separation (BSS)in Machinery Fault Diagnosis
Institute of Scientific and Technical Information of China (English)
Anonymous
2001-01-01
Blind source separation (BSS) technology is very useful in many fields, such as communications and radar. Because BSS can separate multiple sources even without knowing the mixing coefficients or the probability distributions, it can also be used in fault diagnosis. In this paper, we first use BSS to process machinery sound for fault diagnosis. We run a simulation with two sound sources and four sensors to test the result. Each source is a narrow-band source composed of several sine waves. The result shows that the two sources can be well separated from the mixed signals. We conclude that BSS can improve sound-based fault diagnosis, especially for rotating machinery.
Overcomplete Blind Source Separation by Combining ICA and Binary Time-Frequency Masking
DEFF Research Database (Denmark)
Pedersen, Michael Syskind; Wang, DeLiang; Larsen, Jan
2005-01-01
A limitation in many source separation tasks is that the number of source signals has to be known in advance. Further, in order to achieve good performance, the number of sources cannot exceed the number of sensors. In many real-world applications these limitations are too strict. We propose...... can separate up to six mixed speech signals under anechoic conditions. The number of source signals is not assumed to be known in advance. It is also possible to maintain the extracted signals as stereo signals...
Directory of Open Access Journals (Sweden)
Wang Pidong
2016-01-01
Blind source separation is a hot topic in signal processing. Most existing work focuses on linearly combined signals, while in practice nonlinearly mixed signals are often encountered. To address the problem of nonlinear source separation, this paper proposes a novel algorithm using a radial basis function neural network, optimized by a multi-universe parallel quantum genetic algorithm. Experiments show the efficiency of the proposed method.
Bayesian Inversion of Concentration Data for an Unknown Number of Contaminant Sources
2007-06-01
...receptor), which relates the distribution of the multiple sources to the concentration data measured by the sensor network, is... increasingly effective detection for measuring the concentration of chemical, biological, and radiological (CBR) agents released into the atmosphere... of disseminated agent (the source estimation problem) following the detection of an event by means of a CBR sensor network
A cost evaluation method for transferring municipalities to solid waste source-separated system.
Lavee, Doron; Nardiya, Shlomit
2013-05-01
Most of Israel's waste is disposed of in landfills, threatening scarce land resources and posing environmental and health risks. The aim of this study is to estimate the expected costs of transferring municipalities to solid waste source separation in Israel, with the goal of reducing the amount of waste directed to landfills and increasing the efficiency and amount of recycled waste. Information on the expected costs of operating a solid waste source separation system was gathered from 47 municipalities and compiled into a database, taking into consideration factors such as costs of equipment, construction adjustments, and waste collection and disposal. This database may serve as a model for estimating the costs of entering the waste source separation system for any municipality in Israel, while taking into consideration its specific characteristics, such as size and region. The model was used in Israel to determine municipalities' eligibility to receive a governmental grant for entering an accelerated process of solid waste source separation. This study presents a simple, user-friendly operational tool for assessing municipalities' costs of entering a process of waste source separation, giving policy makers a powerful instrument for diverting funds effectively to promote solid waste source separation.
Separation of Correlated Astrophysical Sources Using Multiple-Lag Data Covariance Matrices
Directory of Open Access Journals (Sweden)
Baccigalupi C
2005-01-01
This paper proposes a new strategy to separate astrophysical sources that are mutually correlated. The strategy is based on second-order statistics and exploits prior information about the possible structure of the mixing matrix. Unlike ICA blind separation approaches, where the sources are assumed to be mutually independent and no prior knowledge about the mixing matrix is assumed, our strategy relaxes the independence assumption and separates even significantly correlated sources. Besides the mixing matrix, our strategy can also evaluate the source covariance functions at several lags. Moreover, once the mixing parameters have been identified, a simple deconvolution can be used to estimate the probability density functions of the source processes. To benchmark our algorithm, we used a database simulating the observations expected from the instruments that will operate onboard ESA's Planck Surveyor satellite to measure the CMB anisotropies over the whole celestial sphere.
Zhou, Guoxu; Yang, Zuyuan; Xie, Shengli; Yang, Jun-Mei
2011-04-01
Online blind source separation (BSS) is proposed to overcome the high computational cost problem, which limits the practical applications of traditional batch BSS algorithms. However, the existing online BSS methods are mainly used to separate independent or uncorrelated sources. Recently, nonnegative matrix factorization (NMF) shows great potential to separate the correlative sources, where some constraints are often imposed to overcome the non-uniqueness of the factorization. In this paper, an incremental NMF with volume constraint is derived and utilized for solving online BSS. The volume constraint to the mixing matrix enhances the identifiability of the sources, while the incremental learning mode reduces the computational cost. The proposed method takes advantage of the natural gradient based multiplication updating rule, and it performs especially well in the recovery of dependent sources. Simulations in BSS for dual-energy X-ray images, online encrypted speech signals, and high correlative face images show the validity of the proposed method.
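A minimal sketch of the NMF building block mentioned above, using the classic Lee-Seung multiplicative updates for the Frobenius cost. It is batch rather than incremental and omits the volume constraint (which is precisely what the paper adds to resolve the non-uniqueness of plain NMF); the nonnegative sources and mixing matrix are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

# Nonnegative sources (e.g. spectra) and a nonnegative mixing matrix.
n = 500
t = np.arange(n)
S_true = np.vstack([np.abs(np.sin(0.05 * t)), (t % 100) / 100.0])
A_true = np.array([[1.0, 0.3], [0.4, 1.0], [0.7, 0.7]])
X = A_true @ S_true                     # 3 observed nonnegative mixtures

# Standard Lee-Seung multiplicative updates for X ~ A S (Frobenius cost).
# The updates preserve nonnegativity because every factor is nonnegative.
k = 2
A = rng.uniform(0.1, 1.0, (3, k))
S = rng.uniform(0.1, 1.0, (k, n))
eps = 1e-9                              # guards against division by zero
for _ in range(2000):
    S *= (A.T @ X) / (A.T @ A @ S + eps)
    A *= (X @ S.T) / (A @ S @ S.T + eps)

err = np.linalg.norm(X - A @ S) / np.linalg.norm(X)
print(f"relative reconstruction error: {err:.4f}")
```

Plain NMF like this reconstructs X well but the factors A and S are not unique; the volume constraint on the mixing matrix in the paper is what makes the recovered (possibly dependent) sources identifiable.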
Servière, C.; Lacoume, J.-L.; El Badaoui, M.
2005-11-01
This paper is devoted to the blind separation of combustion noise and piston slap in diesel engines. The two phenomena are recovered only from signals issued from accelerometers placed on one of the cylinders. A blind source separation (BSS) method is developed, based on a convolutive model of non-stationary mixtures. We introduce a new method based on the joint diagonalisation of the time-varying spectral matrices of the observation records, together with a new technique to handle the problem of permutation ambiguity in the frequency domain. This method is then applied to real data, and the estimated sources are validated by several physical arguments. The contributions of piston slap and combustion noise can thus be recovered for all the sensors, and the energy of the two phenomena can be given with regard to the position of the accelerometers.
DEFF Research Database (Denmark)
Oh, Geok Lian
This PhD study examines the use of seismic technology for the problem of detecting underground facilities, whereby a seismic source such as a sledgehammer is used to generate seismic waves through the ground, sensed by an array of seismic sensors on the ground surface, and recorded by the digital...... device. The concept is similar to the techniques used in exploration seismology, in which explosions (that occur at or below the surface) or vibration wave-fronts generated at the surface reflect and refract off structures at the ground depth, so as to generate the ground profile of the elastic material...
On-line blind source separation algorithm based on second order statistics
Institute of Scientific and Technical Information of China (English)
Anonymous
2005-01-01
An on-line blind source separation (BSS) algorithm is presented in this paper under the assumption that the sources are temporally correlated signals. By using only some of the observed samples in a recursive calculation, the whitening matrix and the rotation matrix can be approximately obtained through the measurement of only one cost function. Simulations show good performance of the algorithm.
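The whitening-plus-rotation structure described above can be illustrated with a batch second-order method in the style of AMUSE: whiten using the zero-lag covariance, then obtain the rotation from an eigendecomposition of a lagged covariance matrix. The sinusoidal sources, mixing matrix, and lag are invented for illustration; the paper's algorithm is recursive/on-line rather than batch.

```python
import numpy as np

# Temporally correlated sources with distinct spectra.
n = 5000
t = np.arange(n)
S = np.vstack([np.sin(0.02 * t), np.sin(0.11 * t + 1.0)])
A = np.array([[1.0, 0.8], [0.3, 1.0]])   # unknown mixing matrix
X = A @ S

# Step 1: whitening matrix from the zero-lag covariance.
X = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(X @ X.T / n)
V = E @ np.diag(d ** -0.5) @ E.T
Z = V @ X

# Step 2: rotation matrix from an eigendecomposition of a symmetrized
# lagged covariance (AMUSE): sources with distinct autocorrelations
# at the chosen lag are separated.
tau = 5
C = Z[:, :-tau] @ Z[:, tau:].T / (n - tau)
C = (C + C.T) / 2
_, W = np.linalg.eigh(C)
Y = W.T @ Z                               # recovered sources

corr = np.abs(np.corrcoef(np.vstack([Y, S]))[:2, 2:])
print(corr.round(2))
```

Only second-order statistics (two covariance matrices) are used, which is why temporal correlation of the sources, not non-Gaussianity, is the working assumption.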
A Computational Auditory Scene Analysis-Enhanced Beamforming Approach for Sound Source Separation
Directory of Open Access Journals (Sweden)
L. A. Drake
2009-01-01
Hearing aid users have difficulty hearing target signals, such as speech, in the presence of competing signals or noise. Most solutions proposed to date enhance or extract target signals from background noise and interference based on either location attributes or source attributes. Location attributes typically involve arrival angles at a microphone array. Source attributes include characteristics that are specific to a signal, such as fundamental frequency, or statistical properties that differentiate signals. This paper describes a novel approach to sound source separation, called computational auditory scene analysis-enhanced beamforming (CASA-EB), that achieves increased separation performance by combining the complementary techniques of CASA (a source attribute technique) with beamforming (a location attribute technique), complementary in the sense that they use independent attributes for signal separation. CASA-EB performs sound source separation by temporally and spatially filtering a multichannel input signal, and then grouping the resulting signal components into separated signals, based on source and location attributes. Experimental results show increased signal-to-interference ratio with CASA-EB over beamforming or CASA alone.
Applying the Background-Source separation algorithm to Chandra Deep Field South data
Guglielmetti, F; Fischer, R; Rosati, P; Tozzi, P
2012-01-01
A probabilistic two-component mixture model allows one to separate the diffuse background from the celestial sources within a one-step algorithm without data censoring. The background is modeled with a thin-plate spline combined with the satellite's exposure time. Source probability maps are created in a multi-resolution analysis to reveal faint and extended sources. All detected sources are automatically parametrized to produce a list of source positions, fluxes and morphological parameters. The present analysis is applied to the publicly released Chandra Deep Field South 2 Ms data. With its 1.884 Ms of exposure time and its angular resolution (0.984 arcsec), the Chandra Deep Field South data are particularly suited for testing the Background-Source separation algorithm.
Wang, Yan; Huang, Hong; Huang, Lida; Ristic, Branko
2017-03-01
Source term estimation for atmospheric dispersion deals with estimating the emission strength and location of an emitting source using all available information, including site description, meteorological data, concentration observations, and prior information. In this paper, Bayesian methods for source term estimation are evaluated using the Prairie Grass field observations. The methods include those that require the specification of the likelihood function and those that are likelihood-free, also known as approximate Bayesian computation (ABC) methods. The performances of five different likelihood functions in the former case and six different distance measures in the latter are compared for each component of the source parameter vector, based on the Nemenyi test over all 68 data sets available in the Prairie Grass field experiment. Several likelihood functions and distance measures are introduced to source term estimation for the first time, and the ABC method is improved in several respects. Results show that the discrepancy measures (likelihood functions and distance measures collectively) have a significant influence on source estimation. There is no single winning algorithm, but these methods can be used collectively to provide more robust estimates.
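A toy rejection-ABC sketch of the likelihood-free approach mentioned above, with a one-parameter emission rate and a linear stand-in for the dispersion model; the transport factors, prior, noise level, and acceptance quantile are all invented and do not come from the Prairie Grass setting.

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy dispersion model: concentration at each sensor is the emission
# rate q times a known transport factor (stand-in for a plume model).
factors = np.array([0.9, 0.5, 0.3, 0.15, 0.05])
q_true, noise = 4.0, 0.05
obs = q_true * factors + rng.normal(0, noise, factors.size)

def simulate(q):
    # Forward simulation with observation noise; no likelihood needed.
    return q * factors + rng.normal(0, noise, factors.size)

# Rejection ABC: draw q from the prior, simulate, and keep the draws
# whose simulated concentrations are closest to the observations
# under the chosen distance measure (Euclidean here).
draws = rng.uniform(0.0, 10.0, 200_000)          # flat prior on q
dist = np.array([np.linalg.norm(simulate(q) - obs) for q in draws])
eps = np.quantile(dist, 0.001)                    # keep closest 0.1%
posterior = draws[dist <= eps]

print(f"posterior mean q = {posterior.mean():.2f} (n = {posterior.size})")
```

The distance measure and acceptance threshold play the role that the likelihood function plays in standard Bayesian estimation, which is why the paper compares several of each.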
Institute of Scientific and Technical Information of China (English)
Yunhan Luo; Houxin Cui; Xiaoyu Gu; Rong Liu; Kexin Xu
2005-01-01
Based on an analysis of the relation between mean penetration depth and source-detector separation in a three-layer model using Monte Carlo simulation, an optimal source-detector separation is derived from the mean penetration depth for monitoring changes in the chromophore concentration of the sandwiched layer. To verify the separation, we perform Monte Carlo simulations with varied absorption coefficients of the sandwiched layer. All these diffuse reflectances are used to construct a calibration model using partial least squares (PLS). High correlation coefficients and low root mean square error of prediction (RMSEP) at the optimal separation confirm the correctness of the selection. This technique is expected to shed light on noninvasive diagnosis by near-infrared spectroscopy.
Blind source separation of ship-radiated noise based on generalized Gaussian model
Institute of Scientific and Technical Information of China (English)
Kong Wei; Yang Bin
2006-01-01
When the distribution of the sources cannot be estimated accurately, ICA algorithms fail to separate the mixtures blindly. The generalized Gaussian model (GGM) is introduced into the ICA algorithm because it can easily model the non-Gaussian statistical structure of different source signals. By inferring only one parameter, a wide class of statistical distributions can be characterized. Using the maximum likelihood (ML) approach and natural gradient descent, the learning rules of blind source separation (BSS) based on the GGM are presented. Experiments with ship-radiated noise demonstrate that the GGM can model the distributions of ship-radiated noise and sea noise efficiently, and that the GGM-based learning rules give more successful separation results than several conventional methods such as higher-order cumulants and Gaussian mixture density functions.
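A minimal sketch of the ML/natural-gradient learning rule built on a generalized Gaussian score function, as described above. With exponent beta = 1 the GGM reduces to a Laplacian model, so Laplacian noise stands in for the super-Gaussian ship-radiated noise; the mixing matrix, step size, and iteration count are invented.

```python
import numpy as np

rng = np.random.default_rng(5)

# Two super-Gaussian sources (Laplacian noise) as stand-ins for
# ship-radiated noise and sea noise.
n = 10000
S = rng.laplace(size=(2, n))
A = np.array([[1.0, 0.5], [0.6, 1.0]])   # unknown mixing matrix
X = A @ S

beta = 1.0  # GGM exponent; beta = 1 gives the Laplacian score sign(y)

def score(y):
    # Generalized-Gaussian score function: proportional to sign(y)|y|^(beta-1)
    return np.sign(y) * np.abs(y) ** (beta - 1.0)

# Natural-gradient ML learning rule: W += mu * (I - E[score(y) y^T]) W
W = np.eye(2)
mu = 0.05
for _ in range(300):
    Y = W @ X
    G = np.eye(2) - (score(Y) @ Y.T) / n
    W = W + mu * G @ W

Y = W @ X
corr = np.abs(np.corrcoef(np.vstack([Y, S]))[:2, 2:])
print(corr.round(2))
```

The rule converges when E[score(y) y^T] = I, i.e. when the outputs match the assumed GGM statistics; inferring beta from the data is what lets the paper's method adapt to different noise distributions.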
DEFF Research Database (Denmark)
Larsen, Anna Warberg; Astrup, Thomas
2011-01-01
CO2-loads from combustible waste are important inputs for national CO2 inventories and life-cycle assessments (LCA). CO2 emissions from waste incinerators are often expressed by emission factors in kg fossil CO2 emitted per GJ energy content of the waste. Various studies have shown considerable...... variations between emission factors for different incinerators, but the background for these variations has not been thoroughly examined. One important reason may be variations in collection of recyclable materials as source separation alters the composition of the residual waste incinerated. The objective...... of this study was to quantify the importance of source separation for determination of emission factors for incineration of residual household waste. This was done by mimicking various source separation scenarios and based on waste composition data calculating resulting emission factors for residual waste...
Institute of Scientific and Technical Information of China (English)
黄晋英; 潘宏侠; 毕世华; 杨喜旺
2008-01-01
Blind source separation (BSS) technology was applied to the vibration signals of a gearbox to separate different fault vibration sources and enhance fault information. An improved BSS algorithm based on particle swarm optimization (PSO) is proposed. It departs from the traditional denoising-based approach to fault enhancement, and it addresses the practical problems of fault location and low fault diagnosis rates at an early stage. It was applied to the vibration signal of a gearbox under three working states. The results prove that BSS greatly enhances fault information and provides a technical method for diagnosing weak faults.
Gearbox Fault Diagnosis in a Wind Turbine Using Single Sensor Based Blind Source Separation
Directory of Open Access Journals (Sweden)
Yuning Qian
2016-01-01
Full Text Available This paper presents a single sensor based blind source separation approach, namely, the wavelet-assisted stationary subspace analysis (WSSA), for gearbox fault diagnosis in a wind turbine. Continuous wavelet transform (CWT) is used as a preprocessing tool to decompose a single sensor measurement data into a set of wavelet coefficients to meet the multidimensional requirement of the stationary subspace analysis (SSA). The SSA is a blind source separation technique that can separate the multidimensional signals into stationary and nonstationary source components without the need for independency and prior information of the source signals. After that, the separated nonstationary source component with the maximum kurtosis value is analyzed by the enveloping spectral analysis to identify potential fault-related characteristic frequencies. Case studies performed on a wind turbine gearbox test system verify the effectiveness of the WSSA approach and indicate that it outperforms independent component analysis (ICA) and empirical mode decomposition (EMD), as well as the spectral-kurtosis-based enveloping, for wind turbine gearbox fault diagnosis.
Granade, Christopher; Cory, D G
2015-01-01
In recent years, Bayesian methods have been proposed as a solution to a wide range of issues in quantum state and process tomography. State-of-the-art Bayesian tomography solutions suffer from three problems: numerical intractability, a lack of informative prior distributions, and an inability to track time-dependent processes. Here, we solve all three problems. First, we use modern statistical methods, as pioneered by Huszár and Houlsby and by Ferrie, to make Bayesian tomography numerically tractable. Our approach allows for practical computation of Bayesian point and region estimators for quantum states and channels. Second, we propose the first informative priors on quantum states and channels. Finally, we develop a method that allows online tracking of time-dependent states and estimates the drift and diffusion processes affecting a state. We provide source code and animated visual examples for our methods.
Alvarez, R; Van Saarloos, W; Alvarez, Roberto; Hecke, Martin van; Saarloos, Wim van
1996-01-01
In many pattern forming systems that exhibit traveling waves, sources and sinks occur which separate patches of oppositely traveling waves. We show that simple qualitative features of their dynamics can be compared to predictions from coupled amplitude equations. In heated wire convection experiments, we find a discrepancy between the observed multiplicity of sources and theoretical predictions. The expression for the observed motion of sinks is incompatible with any amplitude equation description.
Yuan, Yalin; Yabe, Mitsuyasu
2014-12-23
A source separation program for household kitchen waste has been in place in Beijing since 2010. However, the participation rate of residents is far from satisfactory. This study was carried out to identify residents' preferences under an improved management strategy for household kitchen waste source separation. We determine the preferences of residents in an ad hoc sample, stratified by age level, for source separation services and their marginal willingness to accept compensation for the service attributes. We used a multinomial logit model to analyze data collected from 394 residents in the Haidian and Dongcheng districts of Beijing through a choice experiment. The results show differences in preferences for the service attributes between young, middle-aged, and older residents. Low compensation is not a major factor in encouraging young and middle-aged residents to accept the proposed separation services. On average, however, most of them prefer services with frequent, evening, and plastic-bag attributes and without an instructor. This study indicates that there is potential for the local government to improve the current separation services accordingly.
Energy Technology Data Exchange (ETDEWEB)
Zhang, Jiangjiang [College of Environmental and Resource Sciences, Zhejiang University, Hangzhou China; Li, Weixuan [Pacific Northwest National Laboratory, Richland Washington USA; Zeng, Lingzao [College of Environmental and Resource Sciences, Zhejiang University, Hangzhou China; Wu, Laosheng [Department of Environmental Sciences, University of California, Riverside California USA
2016-08-01
Surrogate models are commonly used in Bayesian approaches such as Markov chain Monte Carlo (MCMC) to avoid repetitive CPU-demanding model evaluations. However, the approximation error of a surrogate may lead to biased estimation of the posterior distribution. This bias can be corrected by constructing a very accurate surrogate or by implementing MCMC in a two-stage manner. Since two-stage MCMC requires extra original model evaluations, the computational cost is still high. If the measurement information is incorporated, a locally accurate approximation of the original model can be adaptively constructed at low computational cost. Based on this idea, we propose a Gaussian process (GP) surrogate-based Bayesian experimental design and parameter estimation approach for groundwater contaminant source identification problems. A major advantage of the GP surrogate is that it provides a convenient estimate of the approximation error, which can be incorporated in the Bayesian formula to avoid over-confident estimation of the posterior distribution. The proposed approach is tested with a numerical case study. Without sacrificing estimation accuracy, the new approach achieves about a 200-fold speed-up compared to our previous work using two-stage MCMC.
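The error-aware use of a GP surrogate can be illustrated with a toy 1-D sketch (the simulator, prior, and sampler settings below are assumptions, not the paper's groundwater model): the GP predictive variance is added to the observation variance inside a Gaussian likelihood, so parameter regions where the surrogate is inaccurate are automatically down-weighted during Metropolis sampling.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(1)

def expensive_model(theta):
    """Stand-in for a CPU-demanding simulator (hypothetical forward model)."""
    return np.sin(theta) + 0.5 * theta

theta_true, sigma_obs = 1.2, 0.05
y_obs = expensive_model(theta_true) + rng.normal(0.0, sigma_obs)

# GP surrogate trained on a handful of "expensive" forward runs
design = np.linspace(0.0, 3.0, 12).reshape(-1, 1)
gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), alpha=1e-8)
gp.fit(design, expensive_model(design).ravel())

def log_post(theta):
    """Gaussian likelihood whose variance is inflated by the GP's own error."""
    if not 0.0 <= theta <= 3.0:                 # uniform prior on [0, 3]
        return -np.inf
    mu, sd = gp.predict(np.array([[theta]]), return_std=True)
    var = sigma_obs ** 2 + sd[0] ** 2           # surrogate error term
    return -0.5 * (y_obs - mu[0]) ** 2 / var - 0.5 * np.log(var)

# random-walk Metropolis using only cheap surrogate evaluations
theta = 1.5
lp = log_post(theta)
samples = []
for _ in range(4000):
    prop = theta + 0.2 * rng.normal()
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    samples.append(theta)
post = np.array(samples[1000:])                 # discard burn-in
```

Every posterior evaluation here costs only a GP prediction, which is the source of the speed-up the abstract reports; the two-stage correction step is omitted in this sketch.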
Blind source separation of fMRI data by means of factor analytic transformations
Langers, Dave R. M.
2009-01-01
In this study, the application of factor analytic (FA) rotation methods in the context of neuroimaging data analysis was explored. Three FA algorithms (ProMax, QuartiMax, and VariMax) were employed to carry out blind source separation in a functional magnetic resonance imaging (fMRI) experiment that
RESEARCH OF QUANTUM GENETIC ALGORITHM AND ITS APPLICATION IN BLIND SOURCE SEPARATION
Institute of Scientific and Technical Information of China (English)
Yang Junan; Li Bin; Zhuang Zhenquan
2003-01-01
This letter proposes two algorithms: a novel Quantum Genetic Algorithm (QGA) based on an improvement of Han's Genetic Quantum Algorithm (GQA), and a new Blind Source Separation (BSS) method based on the QGA and Independent Component Analysis (ICA). Simulation results show that the efficiency of the new BSS method is markedly higher than that of a Conventional Genetic Algorithm (CGA).
Micropollutant removal in an algal treatment system fed with source separated wastewater streams
de Wilt, Arnoud; Butkovskyi, Andrii; Tuantet, Kanjana; Hernandez Leal, Lucia; Fernandes, T.V.; Langenhoff, Alette; Zeeman, Grietje
2016-01-01
Micropollutant removal in an algal treatment system fed with source separated wastewater streams was studied. Batch experiments with the microalgae Chlorella sorokiniana grown on urine, anaerobically treated black water and synthetic urine were performed to assess the removal of six spiked pharmaceuticals...
Fate of personal care and household products in source separated sanitation
Butkovskyi, A.; Rijnaarts, H.H.M.; Zeeman, G.; Hernandez Leal, L.
2016-01-01
Removal of twelve micropollutants, namely biocides, fragrances, ultraviolet (UV) filters and preservatives, in source-separated grey and black water treatment systems was studied. All compounds were present in influent grey water in the μg/L range. Seven compounds were found in influent black water...
DEFF Research Database (Denmark)
Naroznova, Irina; Møller, Jacob; Scheutz, Charlotte
2013-01-01
The environmental performance of two pretreatment technologies for source-separated organic waste was compared using life cycle assessment (LCA). An innovative pulping process, in which source-separated organic waste is pulped with cold water to form a volatile-solid-rich biopulp, was compared to a more traditional pretreatment method using a screw press. The inventory of the technologies was constructed, including the mass balance, amount of biogas produced, nutrient recovery rates, and produced biomass quality. The technologies were modelled in the LCA model EASETECH and the potential environmental impacts, including a number of non-toxic and toxic impact categories, were assessed. No big difference in the overall performance of the two technologies was observed. The difference for the separate life cycle steps was, however, more pronounced. More efficient material transfer in the scenario with waste pulping...
Blind separation of sources in nonlinear convolved mixture based on a novel network
Institute of Scientific and Technical Information of China (English)
胡英; 杨杰; 沈利
2004-01-01
Blind separation of independent sources from their nonlinear convolved mixtures is a more realistic problem than separation from linear ones. A solution to this problem based on the entropy maximization principle is presented. First, we propose a novel two-layer network as the de-mixing system to separate sources in a nonlinear convolved mixture. In the output layer of the network we use a feedback architecture to cope with convolved mixtures. We then derive learning algorithms for the two-layer network by maximizing the information entropy. Comparison of computer simulation results shows that the proposed algorithm achieves better nonlinear convolved blind signal separation than H.H. Yang's algorithm.
Difficulties applying recent blind source separation techniques to EEG and MEG
Knuth, Kevin H
2015-01-01
High temporal resolution measurements of human brain activity can be performed by recording the electric potentials on the scalp surface (electroencephalography, EEG), or by recording the magnetic fields near the surface of the head (magnetoencephalography, MEG). The analysis of the data is problematic due to the fact that multiple neural generators may be simultaneously active and the potentials and magnetic fields from these sources are superimposed on the detectors. It is highly desirable to un-mix the data into signals representing the behaviors of the original individual generators. This general problem is called blind source separation and several recent techniques utilizing maximum entropy, minimum mutual information, and maximum likelihood estimation have been applied. These techniques have had much success in separating signals such as natural sounds or speech, but appear to be ineffective when applied to EEG or MEG signals. Many of these techniques implicitly assume that the source distributions have...
Directory of Open Access Journals (Sweden)
Gang Tang
2016-06-01
Full Text Available In the condition monitoring of roller bearings, the measured signals are often compounded due to the unknown multi-vibration sources and complex transfer paths. Moreover, the sensors are limited in particular locations and numbers. Thus, this is a problem of underdetermined blind source separation for the vibration sources estimation, which makes it difficult to extract fault features exactly by ordinary methods in running tests. To improve the effectiveness of compound fault diagnosis in roller bearings, the present paper proposes a new method to solve the underdetermined problem and to extract fault features based on variational mode decomposition. In order to surmount the shortcomings of inadequate signals collected through limited sensors, a vibration signal is firstly decomposed into a number of band-limited intrinsic mode functions by variational mode decomposition. Then, the demodulated signal with the Hilbert transform of these multi-channel functions is used as the input matrix for independent component analysis. Finally, the compound faults are separated effectively by carrying out independent component analysis, which enables the fault features to be extracted more easily and identified more clearly. Experimental results validate the effectiveness of the proposed method in compound fault separation, and a comparison experiment shows that the proposed method has higher adaptability and practicability in separating strong noise signals than the commonly-used ensemble empirical mode decomposition method.
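The demodulation-plus-ICA stage of the approach can be sketched as follows. Since variational mode decomposition is not available in SciPy, simple band-pass filters stand in here for the band-limited intrinsic mode functions; the Hilbert transform then yields the envelope matrix that feeds independent component analysis. All signal parameters below are invented for illustration:

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert
from sklearn.decomposition import FastICA

fs = 2000
t = np.arange(0, 4, 1 / fs)
rng = np.random.default_rng(2)

# two "fault" modulations riding on distinct resonance carriers, ONE sensor
m1 = 1.0 + 0.8 * np.sign(np.sin(2 * np.pi * 7 * t))   # 7 Hz square modulation
m2 = 1.0 + 0.8 * np.sin(2 * np.pi * 13 * t)           # 13 Hz sine modulation
x = m1 * np.sin(2 * np.pi * 300 * t) + m2 * np.sin(2 * np.pi * 620 * t)
x = x + 0.05 * rng.normal(size=t.size)

def bandpass(sig, lo, hi):
    b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, sig)

# band-limited modes (a simple stand-in for VMD's intrinsic mode functions)
modes = np.vstack([bandpass(x, 250, 350), bandpass(x, 560, 680)])
env = np.abs(hilbert(modes))     # Hilbert demodulation -> envelope matrix
# the multichannel envelope matrix is the input to independent component analysis
Y = FastICA(n_components=2, random_state=0).fit_transform(env.T).T
C = np.abs(np.corrcoef(np.vstack([Y, m1, m2]))[:2, 2:])
```

The recovered components correspond (up to sign and scale) to the two modulation patterns that would carry the fault signatures.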
Bayesian Independent Component Analysis
DEFF Research Database (Denmark)
Winther, Ole; Petersen, Kaare Brandt
2007-01-01
In this paper we present an empirical Bayesian framework for independent component analysis. The framework provides estimates of the sources, the mixing matrix and the noise parameters, and is flexible with respect to choice of source prior and the number of sources and sensors. Inside the engine ... in a Matlab toolbox, is demonstrated for non-negative decompositions and compared with non-negative matrix factorization...
Yao, Yijun; Shen, Rui; Pennell, Kelly G; Suuberg, Eric M
2013-08-01
Most current vapor-intrusion screening models employ the assumption of a subsurface homogenous source distribution, and groundwater data obtained from nearby monitoring wells are usually taken to reflect the source concentration for several nearby buildings. This practice makes it necessary to consider the possible influence of lateral source-building separation. In this study, a new way to estimate subslab (nonbiodegradable) contaminant concentration is introduced that includes the influence of source offset with the help of a conformal transform technique. Results from this method are compared with those from a three-dimensional numerical model. Based on this newly developed method, a possible explanation is provided here for the great variation in the attenuation factors of the soil vapor concentrations of groundwater-to-subslab contaminants found in the EPA vapor-intrusion database.
Lesaffre, Emmanuel
2012-01-01
The growth of biostatistics has been phenomenal in recent years and has been marked by considerable technical innovation in both methodology and computational practicality. One area that has experienced significant growth is Bayesian methods. The growing use of Bayesian methodology has taken place partly due to an increasing number of practitioners valuing the Bayesian paradigm as matching that of scientific discovery. In addition, computational advances have allowed for more complex models to be fitted routinely to realistic data sets. Through examples, exercises and a combination of introductory...
Directory of Open Access Journals (Sweden)
Y. Yokoo
2014-09-01
Full Text Available This study compared a time source hydrograph separation method to a geographic source separation method, to assess if the two methods produced similar results. The time source separation of a hydrograph was performed using a numerical filter method and the geographic source separation was performed using an end-member mixing analysis employing hourly discharge, electric conductivity, and turbidity data. These data were collected in 2006 at the Kuroiwa monitoring station on the Abukuma River, Japan. The results of the methods corresponded well in terms of both surface flow components and inter-flow components. In terms of the baseflow component, the result of the time source separation method corresponded with the moving average of the baseflow calculated by the geographic source separation method. These results suggest that the time source separation method is not only able to estimate numerical values for the discharge components, but that the estimates are also reasonable from a geographical viewpoint in the 3000 km2 watershed discussed in this study. The consistent results obtained using the time source and geographic source separation methods demonstrate that it is possible to characterize dominant runoff processes using hourly discharge data, thereby enhancing our capability to interpret the dominant runoff processes of a watershed using observed discharge data alone.
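The "numerical filter" time-source separation referred to above is commonly implemented as a one-parameter recursive digital filter of the Lyne-Hollick form; the sketch below is a generic single-pass version under assumed parameter values, not necessarily the exact filter used in the study:

```python
import numpy as np

def baseflow_filter(q, alpha=0.925):
    """One-parameter recursive digital filter (Lyne-Hollick form).

    q     : streamflow series
    alpha : filter parameter (typically 0.9-0.95)
    Returns (baseflow, quickflow) with 0 <= baseflow <= q enforced.
    """
    q = np.asarray(q, dtype=float)
    f = np.zeros_like(q)                          # filtered quickflow
    for k in range(1, q.size):
        f[k] = alpha * f[k - 1] + 0.5 * (1 + alpha) * (q[k] - q[k - 1])
        f[k] = min(max(f[k], 0.0), q[k])          # keep the split physical
    return q - f, f

# synthetic hourly hydrograph: slow recession plus two storm peaks
k = np.arange(200.0)
q = (5 + 4 * np.exp(-k / 80)
     + 30 * np.exp(-0.5 * ((k - 40) / 4) ** 2)
     + 20 * np.exp(-0.5 * ((k - 120) / 5) ** 2))
base, quick = baseflow_filter(q)
```

With real data the filter is usually applied in several forward/backward passes to smooth the baseflow estimate; a single pass suffices to illustrate the idea.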
Source-separated urine opens golden opportunities for microbial electrochemical technologies.
Ledezma, Pablo; Kuntke, Philipp; Buisman, Cees J N; Keller, Jürg; Freguia, Stefano
2015-04-01
The food security of a booming global population demands a continuous and sustainable supply of fertilisers. Their current once-through use [especially of the macronutrients nitrogen (N), phosphorus (P), and potassium (K)] requires a paradigm shift towards recovery and reuse. In the case of source-separated urine, efficient recovery could supply 20% of current macronutrient usage and remove 50-80% of nutrients present in wastewater. However, suitable technology options are needed to allow nutrients to be separated from urine close to the source. Thus far none of the proposed solutions has been widely implemented due to intrinsic limitations. Microbial electrochemical technologies (METs) have proved to be technically and economically viable for N recovery from urine, opening the path for novel decentralised systems focused on nutrient recovery and reuse.
Iterative algorithm for joint zero diagonalization with application in blind source separation.
Zhang, Wei-Tao; Lou, Shun-Tian
2011-07-01
A new iterative algorithm for the nonunitary joint zero diagonalization of a set of matrices is proposed for blind source separation applications. On one hand, since the zero diagonalizer of the proposed algorithm is constructed iteratively by successive multiplications of an invertible matrix, the singular solutions that occur in the existing nonunitary iterative algorithms are naturally avoided. On the other hand, compared to the algebraic method for joint zero diagonalization, the proposed algorithm requires fewer matrices to be zero diagonalized to yield even better performance. The extension of the algorithm to the complex and nonsquare mixing cases is also addressed. Numerical simulations on both synthetic data and blind source separation using time-frequency distributions illustrate the performance of the algorithm and provide a comparison to the leading joint zero diagonalization schemes.
Blind source separation of multichannel electroencephalogram based on wavelet transform and ICA
You, Rong-Yi; Chen, Zhong
2005-11-01
Combination of the wavelet transform and independent component analysis (ICA) was employed for blind source separation (BSS) of multichannel electroencephalogram (EEG). After denoising the original signals by discrete wavelet transform, high-frequency components of some noises and artifacts were removed from the original signals. The denoised signals were then reconstructed for the purpose of ICA, so that the drawback that ICA cannot distinguish noises from source signals was overcome effectively. The practical processing results showed that this method is an effective way to perform BSS of multichannel EEG. The method is in effect a combination of the wavelet transform with an adaptive neural network, so it is also useful for BSS of other complex signals.
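A minimal sketch of the wavelet-denoise-then-ICA pipeline is given below; a self-contained one-level Haar transform with hard thresholding stands in for the multilevel discrete wavelet denoising, and the sources, mixing matrix, and threshold are invented for illustration:

```python
import numpy as np
from sklearn.decomposition import FastICA

def haar_denoise(sig, thresh):
    """One-level Haar DWT, hard-threshold the detail coefficients, reconstruct.

    A minimal stand-in for the multilevel discrete wavelet denoising step."""
    sig = np.asarray(sig, dtype=float)
    n = sig.size - sig.size % 2
    a = (sig[0:n:2] + sig[1:n:2]) / np.sqrt(2)   # approximation coefficients
    d = (sig[0:n:2] - sig[1:n:2]) / np.sqrt(2)   # detail (high-frequency)
    d[np.abs(d) < thresh] = 0.0                  # hard threshold kills noise
    out = np.empty(n)
    out[0::2] = (a + d) / np.sqrt(2)             # inverse Haar transform
    out[1::2] = (a - d) / np.sqrt(2)
    return out

rng = np.random.default_rng(3)
t = np.linspace(0.0, 1.0, 4096, endpoint=False)
S = np.vstack([np.sin(2 * np.pi * 5 * t),              # slow rhythm
               np.sign(np.sin(2 * np.pi * 3 * t))])    # square "artifact"
A = np.array([[1.0, 0.6], [0.4, 1.0]])                 # invented mixing matrix
X = A @ S + 0.3 * rng.normal(size=(2, t.size))         # noisy 2-channel record

Xd = np.vstack([haar_denoise(x, thresh=0.6) for x in X])           # denoise
Y = FastICA(n_components=2, random_state=0).fit_transform(Xd.T).T  # then ICA
C = np.abs(np.corrcoef(np.vstack([Y, S]))[:2, 2:])
```

Thresholding the detail coefficients removes most of the broadband noise before ICA sees the data, which is exactly the drawback-compensation the abstract describes.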
Scale-Adaptive filters for the detection/separation of compact sources
Herranz, D; Barreiro, R B; Martínez-González, E
2002-01-01
This paper presents scale-adaptive filters that optimize the detection/separation of compact sources on a background. We assume that the sources have a multiquadric profile, i.e. $\tau(x) = [1 + (x/r_c)^2]^{-\lambda}$, $\lambda \geq 1/2$, $x \equiv |\vec{x}|$, and a background modeled by a homogeneous and isotropic random field, characterized by a power spectrum $P(q) \propto q^{-\gamma}$, $\gamma \geq 0$, $q \equiv |\vec{q}|$. We give an n-dimensional treatment and consider two interesting astrophysical applications related to clusters of galaxies (the Sunyaev-Zel'dovich effect and X-ray emission).
Estimating International Tourism Demand to Spain Separately by the Major Source Markets
Marcos Alvarez-Díaz; Manuel González-Gómez; Mª Soledad Otero-Giraldez
2012-01-01
The objective of this paper is to estimate international tourism demand to Spain separately by major source markets (Germany, United Kingdom, France, Italy and The Netherlands) that represent 67% of the international tourism to Spain. In order to investigate how the tourism demand reacts to price and income changes, we apply the bounds testing approach to cointegration and construct confidence intervals using the bootstrap technique. The results show differences in tourism behavior depending ...
[Experimental study on methane potentials of source-separated BMW and individual waste materials].
Feng, Lei; Li, Run-dong; Li, Yan-ji; Ke, Xin; Wei, Li-hong; Luo, Xiao-song
2008-08-01
A laboratory procedure is described for measuring methane potentials of source-separated bio-organic municipal waste (BMW). Triplicate reactors with about 20 grams of fresh material were incubated at 37 degrees C with 300 mL of inoculum from the Shenyang wastewater treatment plant, and methane production was followed over a 50 d period by regular measurement of methane on a gas chromatograph. At 37 degrees C, the methane production efficiency of source-separated BMW and individual waste materials was: starch > BMW > protein > food oil > fat > paper. For the source-separated BMW, starch, protein, food oil, fat and paper, methane potentials (CH4/VS) of 218.15, 209.11, 194.20, 238.86, 257.82 and 131.41 mL/g were found, and the ultimate biodegradability of the six different materials was 67.73%, 72.88%, 65.84%, 78.38%, 74.11% and 47.98%, respectively.
Cohen, Michael X; Gulbinaite, Rasa
2017-02-15
Steady-state evoked potentials (SSEPs) are rhythmic brain responses to rhythmic sensory stimulation, and are often used to study perceptual and attentional processes. We present a data analysis method for maximizing the signal-to-noise ratio of the narrow-band steady-state response in the frequency and time-frequency domains. The method, termed rhythmic entrainment source separation (RESS), is based on denoising source separation approaches that take advantage of the simultaneous but differential projection of neural activity to multiple electrodes or sensors. Our approach is a combination and extension of existing multivariate source separation methods. We demonstrate that RESS performs well on both simulated and empirical data, and outperforms conventional SSEP analysis methods based on selecting electrodes with the strongest SSEP response, as well as several other linear spatial filters. We also discuss the potential confound of overfitting, whereby the filter captures noise in absence of a signal. Matlab scripts are available to replicate and extend our simulations and methods. We conclude with some practical advice for optimizing SSEP data analyses and interpreting the results.
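At its core, this family of denoising source separation methods reduces to a generalized eigendecomposition of a narrowband signal covariance S against a broadband reference covariance R. The sketch below illustrates that core step on simulated sensor data; the Gaussian frequency-domain filter and all simulation parameters are assumptions, not the authors' exact recipe:

```python
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(4)
fs, n_chan = 250, 8
t = np.arange(0, 20, 1 / fs)

# simulated SSEP: one 10 Hz source projected to 8 sensors, buried in 1/f noise
topo = rng.normal(size=n_chan)                  # invented sensor topography
ssep = np.sin(2 * np.pi * 10 * t)
noise = np.cumsum(rng.normal(size=(n_chan, t.size)), axis=1)
noise -= noise.mean(axis=1, keepdims=True)
noise /= noise.std(axis=1, keepdims=True)
X = 0.5 * np.outer(topo, ssep) + noise

def narrowband(data, f0, fwhm):
    """Gaussian filter in the frequency domain centred on f0 (assumed design)."""
    freqs = np.fft.rfftfreq(data.shape[1], 1 / fs)
    g = np.exp(-0.5 * ((freqs - f0) / (fwhm / 2.355)) ** 2)
    return np.fft.irfft(np.fft.rfft(data, axis=1) * g, n=data.shape[1], axis=1)

S = np.cov(narrowband(X, 10.0, 1.0))            # narrowband covariance
R = np.cov(X) + 1e-6 * np.eye(n_chan)           # broadband reference covariance
evals, evecs = eigh(S, R)                       # generalized eigendecomposition
w = evecs[:, -1]                                # largest S-to-R power ratio
component = w @ X                               # spatially filtered component
r10 = abs(np.corrcoef(narrowband(component[None, :], 10.0, 1.0)[0],
                      ssep)[0, 1])
```

The spatial filter w maximizes narrowband power relative to broadband power, so the extracted component tracks the steady-state response better than any single simulated sensor.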
新家, 健精
2013-01-01
© 2012 Springer Science+Business Media, LLC. All rights reserved. Article Outline: Glossary Definition of the Subject and Introduction The Bayesian Statistical Paradigm Three Examples Comparison with the Frequentist Statistical Paradigm Future Directions Bibliography
Role of the source to building lateral separation distance in petroleum vapor intrusion
Verginelli, Iason; Capobianco, Oriana; Baciocchi, Renato
2016-06-01
The adoption of source to building separation distances to screen sites that need further field investigation is becoming common practice for evaluating the vapor intrusion pathway at sites contaminated by petroleum hydrocarbons. For the source to building vertical distance, screening criteria for petroleum vapor intrusion have been deeply investigated in the recent literature and fully addressed in the recent guidelines issued by ITRC and U.S. EPA. Conversely, due to the lack of field and modeling studies, the source to building lateral distance has received relatively little attention. To address this issue, in this work we present a steady-state vapor intrusion analytical model incorporating piecewise first-order aerobic biodegradation limited by oxygen availability that accounts for lateral source to building separation. The developed model can be used to evaluate the role and relevance of lateral vapor attenuation as well as to provide a site-specific assessment of the lateral screening distances needed to attenuate vapor concentrations to risk-based values. The simulation outcomes were consistent with field data and 3-D numerical modeling results reported in previous studies and, for shallow sources, with the screening criteria recommended by U.S. EPA for the vertical separation distance. Indeed, although petroleum vapors can cover maximum lateral distances of up to 25-30 m, as highlighted by the comparison of model outputs with field evidence of vapor migration in the subsurface, simulation results from this new model indicated that, regardless of the source concentration and depth, lateral distances of 6 m and 7 m are sufficient to attenuate petroleum vapors below risk-based values for groundwater and soil sources, respectively. However, for deep sources (> 5 m) and for low to moderate source concentrations (benzene concentrations lower than 5 mg/L in groundwater and 0.5 mg/kg in soil) the above criteria were found to be extremely conservative...
Directory of Open Access Journals (Sweden)
Duofang Chen
2015-01-01
Full Text Available By recording a time series of tomographic images, dynamic fluorescence molecular tomography (FMT) allows exploring perfusion, biodistribution, and pharmacokinetics of labeled substances in vivo. Usually, dynamic tomographic images are first reconstructed frame by frame, and then unmixing based on principal component analysis (PCA) or independent component analysis (ICA) is performed to detect and visualize functional structures with different kinetic patterns. PCA and ICA assume that sources are statistically uncorrelated or independent and do not perform well when correlated sources are present. In this paper, we deduce the relationship between the measured imaging data and the kinetic patterns and present a temporal unmixing approach based on a nonnegative blind source separation (BSS) method with a convex analysis framework to separate the measured data. The presented method requires no assumption of source independence or zero correlation. Several numerical simulations and phantom experiments are conducted to investigate the performance of the proposed temporal unmixing method. The results indicate that it is feasible to unmix the measured data before the tomographic reconstruction, and that the BSS-based method provides better unmixing quality than PCA and ICA.
Energy Technology Data Exchange (ETDEWEB)
Biollaz, S.; Ludwig, Ch.; Stucki, S. [Paul Scherrer Inst. (PSI), Villigen (Switzerland)
1999-08-01
A literature search was carried out to determine the sources and speciation of heavy metals in municipal solid waste (MSW). A combination of thermal and mechanical separation techniques is necessary to achieve the required high degrees of metal separation. Metallic goods should be separated mechanically; chemically bound heavy metals, by a thermal process. (author) 1 fig., 1 tab., 6 refs.
Energy Technology Data Exchange (ETDEWEB)
Pennline, H.W.; Luebke, D.R.; Jones, K.L.; Morsi, B.I. (Univ. of Pittsburgh, PA); Heintz, Y.J. (Univ. of Pittsburgh, PA); Ilconich, J.B. (Parsons)
2007-06-01
The capture/separation step for carbon dioxide (CO2) from large-point sources is a critical one with respect to the technical feasibility and cost of the overall carbon sequestration scenario. For large-point sources, such as those found in power generation, the carbon dioxide capture techniques being investigated by the in-house research area of the National Energy Technology Laboratory possess the potential for improved efficiency and reduced costs as compared to more conventional technologies. The investigated techniques can have wide applications, but the research has focused on capture/separation of carbon dioxide from flue gas (post-combustion from fossil fuel-fired combustors) and from fuel gas (precombustion, such as integrated gasification combined cycle or IGCC). With respect to fuel gas applications, novel concepts are being developed in wet scrubbing with physical absorption; chemical absorption with solid sorbents; and separation by membranes. In one concept, a wet scrubbing technique is being investigated that uses a physical solvent process to remove CO2 from fuel gas of an IGCC system at elevated temperature and pressure. The need to define an ideal solvent has led to the study of the solubility and mass transfer properties of various solvents. Pertaining to another separation technology, fabrication techniques and mechanistic studies for membranes separating CO2 from the fuel gas produced by coal gasification are also being performed. Membranes that consist of CO2-philic ionic liquids encapsulated into a polymeric substrate have been investigated for permeability and selectivity. Finally, dry, regenerable processes based on sorbents are additional techniques for CO2 capture from fuel gas. An overview of these novel techniques is presented along with a research progress status of technologies related to membranes and physical solvents.
Blind source separation for groundwater pressure analysis based on nonnegative matrix factorization
Alexandrov, Boian S.; Vesselinov, Velimir V.
2014-09-01
The identification of the physical sources causing spatial and temporal fluctuations of aquifer water levels is a challenging yet very important hydrogeological task. The fluctuations can be caused by variations in natural and anthropogenic sources such as pumping, recharge, barometric pressures, etc. The source identification can be crucial for conceptualization of the hydrogeological conditions and characterization of aquifer properties. We propose a new computational framework for model-free inverse analysis of pressure transients based on the Nonnegative Matrix Factorization (NMF) method for Blind Source Separation (BSS) coupled with the k-means clustering algorithm, which we call NMFk. NMFk is capable of identifying a set of unique sources from a set of experimentally measured mixed signals, without any information about the sources, their transients, and the physical mechanisms and properties controlling the signal propagation through the subsurface flow medium. Our analysis only requires information about pressure transients at a number of observation points, m, where m≥r, and r is the number of unknown unique sources causing the observed fluctuations. We apply this new analysis on a data set from the Los Alamos National Laboratory site. We demonstrate that the sources identified by NMFk have real physical origins: barometric pressure and water-supply pumping effects. We also estimate the barometric pressure efficiency of the monitoring wells. The possible applications of the NMFk algorithm are not limited to hydrogeology problems; NMFk can be applied to any problem where temporal system behavior is observed at multiple locations and an unknown number of physical sources are causing these fluctuations.
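The NMF-plus-clustering idea in the abstract above can be illustrated with a minimal numpy sketch. The synthetic "well" signals, the rank, and the use of plain Lee-Seung multiplicative updates are illustrative assumptions, not the authors' NMFk implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic example: r = 2 nonnegative source transients observed at m = 5 wells.
t = np.linspace(0, 10, 400)
S_true = np.vstack([1.0 + np.sin(t) ** 2,      # e.g. a periodic barometric signal
                    np.exp(-0.3 * t)])         # e.g. a decaying pumping transient
A_true = rng.uniform(0.2, 1.0, size=(5, 2))    # unknown nonnegative mixing
V = A_true @ S_true                            # observed mixed signals, shape (5, 400)

def nmf(V, r, iters=1000, seed=0):
    """Basic NMF, V ~ W @ H, via Lee-Seung multiplicative updates."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.uniform(0.1, 1.0, (m, r))
    H = rng.uniform(0.1, 1.0, (r, n))
    eps = 1e-12
    for _ in range(iters):
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

# NMFk-style robustness check (sketched): rerun from random starts; solutions
# that cluster tightly at rank r indicate r physically meaningful sources.
errors = []
for seed in range(3):
    W, H = nmf(V, r=2, seed=seed)
    errors.append(np.linalg.norm(V - W @ H) / np.linalg.norm(V))
```

In the full NMFk framework the restarts are clustered with k-means and the cluster "silhouettes" select the number of sources r; here the restart loop only checks that the rank-2 factorization is consistently recoverable.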
Resonance ionization laser ion sources for on-line isotope separators (invited).
Marsh, B A
2014-02-01
A Resonance Ionization Laser Ion Source (RILIS) is today considered an essential component of the majority of Isotope Separator On Line (ISOL) facilities; there are seven laser ion sources currently operational at ISOL facilities worldwide and several more are under development. The ionization mechanism is a highly element selective multi-step resonance photo-absorption process that requires a specifically tailored laser configuration for each chemical element. For some isotopes, isomer selective ionization may even be achieved by exploiting the differences in hyperfine structures of an atomic transition for different nuclear spin states. For many radioactive ion beam experiments, laser resonance ionization is the only means of achieving an acceptable level of beam purity without compromising isotope yield. Furthermore, by performing element selection at the location of the ion source, the propagation of unwanted radioactivity downstream of the target assembly is reduced. Whilst advances in laser technology have improved the performance and reliability of laser ion sources and broadened the range of suitable commercially available laser systems, many recent developments have focused rather on the laser/atom interaction region in the quest for increased selectivity and/or improved spectral resolution. Much of the progress in this area has been achieved by decoupling the laser ionization from competing ionization processes through the use of a laser/atom interaction region that is physically separated from the target chamber. A new application of gas catcher laser ion source technology promises to expand the capabilities of projectile fragmentation facilities through the conversion of otherwise discarded reaction fragments into high-purity low-energy ion beams. A summary of recent RILIS developments and the current status of laser ion sources worldwide is presented.
Non-Cancellation Multistage Kurtosis Maximization with Prewhitening for Blind Source Separation
Directory of Open Access Journals (Sweden)
Xiang Chen
2009-01-01
Chi et al. recently proposed two effective non-cancellation multistage (NCMS) blind source separation algorithms, one using the turbo source extraction algorithm (TSEA), called the NCMS-TSEA, and the other using the fast kurtosis maximization algorithm (FKMA), called the NCMS-FKMA. Their computational complexity and performance heavily depend on the dimension of the multisensor data, that is, the number of sensors. This paper proposes the inclusion of prewhitening processing in the NCMS-TSEA and NCMS-FKMA prior to source extraction. We come up with four improved algorithms, referred to as the PNCMS-TSEA, the PNCMS-FKMA, the PNCMS-TSEA(p), and the PNCMS-FKMA(p). Compared with the existing NCMS-TSEA and NCMS-FKMA, the former two algorithms achieve a significant reduction in computational complexity together with some performance improvements. The latter two algorithms are generalized counterparts of the former two, with the single source extraction module replaced by a bank of source extraction modules in parallel at each stage. Although the PNCMS-TSEA and PNCMS-TSEA(p) (likewise the PNCMS-FKMA and PNCMS-FKMA(p)) perform identically, the merit of this parallel source extraction structure lies in its much shorter processing latency, making the PNCMS-TSEA(p) and PNCMS-FKMA(p) well suited for software and hardware implementations. Some simulation results are presented to verify the efficacy and computational efficiency of the proposed algorithms.
Płuska, Mariusz; Czerwinski, Andrzej; Ratajczak, Jacek; Katcki, Jerzy; Oskwarek, Lukasz; Rak, Remigiusz
2009-01-01
The electron-microscope image distortion generated by electromagnetic interference (EMI) is an important problem for accurate imaging in scanning electron microscopy (SEM). Available commercial solutions to this problem utilize sophisticated hardware for EMI detection and compensation. Their efficiency depends on how the distortions influence the SEM system, so selecting a proper method for reducing the distortions is crucial. The current investigations allowed the distortions' impact on several components of the SEM system to be separated. A sum of signals from distortion sources causes wavy deformations of specimen shapes in SEM images. The separation of the various causes of the distortion is based on measurements of the periodic deformations of the images for different electron beam energies and working distances between the microscope final aperture and the specimen. Using the SEM images, a direct influence of the alternating magnetic field on the electron beam was distinguished. Distortions of electric signals in the scanning block of the SEM were also separated. The presented method separates the direct magnetic field influence on the electron beam below the SEM final aperture (in the chamber) from its influence above this aperture (in the electron column). It also allows for the measurement of the magnetic field present inside the SEM chamber. The current investigations gave practical guidelines for selecting the most efficient solution for reduction of the distortions.
PWC-ICA: A Method for Stationary Ordered Blind Source Separation with Application to EEG
Bigdely-Shamlo, Nima; Mullen, Tim; Robbins, Kay
2016-01-01
Independent component analysis (ICA) is a class of algorithms widely applied to separate sources in EEG data. Most ICA approaches use optimization criteria derived from temporal statistical independence and are invariant with respect to the actual ordering of individual observations. We propose a method of mapping real signals into a complex vector space that takes into account the temporal order of signals and enforces certain mixing stationarity constraints. The resulting procedure, which we call Pairwise Complex Independent Component Analysis (PWC-ICA), performs the ICA in a complex setting and then reinterprets the results in the original observation space. We examine the performance of our candidate approach relative to several existing ICA algorithms for the blind source separation (BSS) problem on both real and simulated EEG data. On simulated data, PWC-ICA is often capable of achieving a better solution to the BSS problem than AMICA, Extended Infomax, or FastICA. On real data, the dipole interpretations of the BSS solutions discovered by PWC-ICA are physically plausible, are competitive with existing ICA approaches, and may represent sources undiscovered by other ICA methods. In conjunction with this paper, the authors have released a MATLAB toolbox that performs PWC-ICA on real, vector-valued signals. PMID:27340397
Blind source separation of ex-vivo aorta tissue multispectral images.
Galeano, July; Perez, Sandra; Montoya, Yonatan; Botina, Deivid; Garzón, Johnson
2015-05-01
Blind Source Separation (BSS) methods aim at the decomposition of a given signal into its main components or source signals. Those techniques have been widely used in the literature for the analysis of biomedical images, in order to extract the main components of an organ or tissue under study. The analysis of skin images for the extraction of melanin and hemoglobin is an example of the use of BSS. This paper presents a proof of concept of the use of source separation on ex-vivo aorta tissue multispectral images. The images are acquired with an interference filter-based imaging system. The images are processed by means of two algorithms: Independent Component Analysis and Non-negative Matrix Factorization. In both cases, it is possible to obtain maps that quantify the concentration of the main chromophores present in aortic tissue. The algorithms also allow estimation of the spectral absorbance of the main tissue components. Those spectral signatures were compared against the theoretical ones by using correlation coefficients. Those coefficients report values close to 0.9, which is a good estimator of the method's performance. Also, correlation coefficients lead to the identification of the concentration maps according to the evaluated chromophore. The results suggest that multi/hyperspectral systems together with image processing techniques are a potential tool for the analysis of cardiovascular tissue.
Ayllón, David; Gil-Pita, Roberto; Rosa-Zurera, Manuel
2013-12-01
A recent trend in hearing aids is to connect the left and right devices so that they can collaborate. Binaural systems can provide natural binaural hearing and support the improvement of speech intelligibility in noise, but they require data transmission between both devices, which increases the power consumption. This paper presents a novel sound source separation algorithm for binaural speech enhancement based on supervised machine learning and time-frequency masking. The system is designed considering the power restrictions in hearing aids, constraining both the computational cost of the algorithm and the transmission bit rate. The transmission scheme is optimized using a tailored evolutionary algorithm that assigns a different number of bits to each frequency band. The proposed algorithm requires less than 10% of the available computational resources for signal processing and obtains good separation performance using bit rates lower than 64 kbps.
A SIGNAL-ADAPTIVE ALGORITHM FOR BLIND SEPARATION OF SOURCES WITH MIXED KURTOSIS SIGNS
Institute of Scientific and Technical Information of China (English)
[No author listed]
2006-01-01
This paper addresses the problem of Blind Source Separation (BSS) and presents a new BSS algorithm with a Signal-Adaptive Activation (SAA) function (SAA-BSS). By taking the sum of absolute values of the normalized kurtoses as a contrast function, the obtained signal-adaptive activation function automatically satisfies the local stability and robustness conditions. The SAA-BSS exploits the natural gradient learning on the Stiefel manifold, and it is an equivariant algorithm with a moderate computational load. Computer simulations show that the SAA-BSS can perform blind separation of mixed sub-Gaussian and super-Gaussian signals and it works more efficiently than the existing algorithms in convergence speed and robustness against outliers.
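A toy version of this idea, with the nonlinearity switched per output by the sign of the estimated normalized kurtosis inside a natural-gradient update, might look as follows. The specific activation pair y ± tanh(y) is borrowed from extended Infomax as a stand-in for the paper's SAA function, and the sources and mixing matrix are invented:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 50000
S = np.vstack([rng.laplace(size=n),                            # super-Gaussian
               rng.uniform(-np.sqrt(3), np.sqrt(3), size=n)])  # sub-Gaussian
A = np.array([[1.0, 0.5], [0.7, 1.0]])
X = A @ S

W = np.eye(2)
mu, batch = 0.01, 200
for epoch in range(10):
    for i in range(0, n - batch + 1, batch):
        Y = W @ X[:, i:i + batch]
        # Signal-adaptive activation: choose the nonlinearity per output
        # from the sign of the estimated normalized kurtosis.
        k = (Y ** 4).mean(axis=1) / (Y ** 2).mean(axis=1) ** 2 - 3
        phi = np.where(k[:, None] > 0, Y + np.tanh(Y), Y - np.tanh(Y))
        # Natural-gradient (equivariant) update of the unmixing matrix.
        W += mu * (np.eye(2) - phi @ Y.T / batch) @ W

Y = W @ X
# Absolute correlations between recovered outputs and the true sources.
C = np.abs(np.corrcoef(np.vstack([Y, S]))[:2, 2:])
```

Because the update multiplies the gradient by W on the right, its behavior is invariant to the conditioning of the mixing matrix, which is the "equivariant" property the abstract refers to.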
Rey, Valentine; Rey, Christian
2016-01-01
This article deals with the computation of guaranteed lower bounds of the error in the framework of finite element (FE) and domain decomposition (DD) methods. In addition to a fully parallel computation, the proposed lower bounds separate the algebraic error (due to the use of a DD iterative solver) from the discretization error (due to the FE), which enables the steering of the iterative solver by the discretization error. These lower bounds are also used to improve the goal-oriented error estimation in a substructured context. Assessments on 2D static linear mechanics problems illustrate the relevance of the separation of sources of error and the lower bounds' independence from the substructuring. We also steer the iterative solver by an objective of precision on a quantity of interest. This strategy consists of a sequence of solves and takes advantage of adaptive remeshing and recycling of search directions.
Natural gradient-based recursive least-squares algorithm for adaptive blind source separation
Institute of Scientific and Technical Information of China (English)
ZHU Xiaolong; ZHANG Xianda; YE Jimin
2004-01-01
This paper focuses on the problem of adaptive blind source separation (BSS). First, a recursive least-squares (RLS) whitening algorithm is proposed. By combining it with a natural-gradient-based RLS algorithm for nonlinear principal component analysis (PCA), and using reasonable approximations, a novel RLS algorithm which can achieve BSS without additional pre-whitening of the observed mixtures is obtained. Analyses of the equilibrium points show that both the RLS whitening algorithm and the natural-gradient-based RLS algorithm for BSS have the desired convergence properties. It is also proved that the combined new RLS algorithm for BSS is equivariant and keeps the separating matrix from becoming singular. Finally, the effectiveness of the proposed algorithm is verified by extensive simulation results.
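The adaptive whitening step can be sketched with the simpler stochastic-gradient rule that the RLS derivation accelerates; the rule below drives cov(v) toward the identity one sample at a time (the data, step size, and mixing matrix are invented, and this is not the paper's RLS recursion itself):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 30000
S = np.vstack([rng.laplace(size=n), rng.uniform(-1, 1, size=n)])
A = np.array([[1.0, 0.8], [0.3, 1.0]])
X = A @ S
X -= X.mean(axis=1, keepdims=True)

# Online whitening: the stochastic rule W <- W - mu * (v v^T - I) W has its
# equilibrium exactly where the outputs v = W x are white (cov = I).
W = np.eye(2)
mu = 0.002
for k in range(n):
    v = W @ X[:, k]
    W -= mu * (np.outer(v, v) - np.eye(2)) @ W

V = W @ X
C = np.cov(V)
```

The RLS version replaces the fixed step `mu` with a data-dependent gain derived from a forgetting factor, which converges much faster on the same equilibrium.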
AN EME BLIND SOURCE SEPARATION ALGORITHM BASED ON GENERALIZED EXPONENTIAL FUNCTION
Institute of Scientific and Technical Information of China (English)
Miao Hao; Li Xiaodong; Tian Jing
2008-01-01
This letter investigates an improved blind source separation algorithm based on the Maximum Entropy (ME) criterion. The original ME algorithm chooses a fixed exponential or sigmoid function as the nonlinear mapping function, which cannot match the original signal very well. A parameter estimation method is employed in this letter to approximate the probability density function of any signal with a parameter-steered generalized exponential function. An improved learning rule and a natural gradient update formula for the unmixing matrix are also presented. The proposed algorithm can separate mixtures of super-Gaussian signals as well as mixtures of sub-Gaussian signals. The simulation experiment demonstrates the efficiency of the algorithm.
Russell, Kellie R; Tedgren, Asa K Carlsson; Ahnesjö, Anders
2005-09-01
In brachytherapy, tissue heterogeneities, source shielding, and finite patient/phantom extensions affect both the primary and scatter dose distributions. The primary dose is, due to the short range of secondary electrons, dependent only on the distribution of material located on the ray line between the source and dose deposition site. The scatter dose depends on both the direct irradiation pattern and the distribution of material in a large volume surrounding the point of interest, i.e., a much larger volume must be included in calculations to integrate many small dose contributions. It is therefore of interest to consider different methods for the primary and the scatter dose calculation to improve calculation accuracy with limited computer resources. The algorithms in present clinical use ignore these effects, causing systematic dose errors in brachytherapy treatment planning. In this work we review a primary and scatter dose separation formalism (PSS) for brachytherapy source characterization to support separate calculation of the primary and scatter dose contributions. We show how the resulting source characterization data can be used to drive more accurate dose calculations using collapsed cone superposition for scatter dose calculations. Two types of source characterization data paths are used: a direct Monte Carlo simulation in water phantoms with subsequent parameterization of the results, and an alternative data path built on processing of AAPM TG43 formatted data to provide similar parameter sets. The latter path is motivated by the large amounts of data already existing in the TG43 format. We demonstrate the PSS methods using both data paths for a clinical 192Ir source. Results are shown for two geometries: a finite but homogeneous water phantom, and a half-slab consisting of water and air. The dose distributions are compared to results from full Monte Carlo simulations and we show significant improvement in scatter dose calculations when the collapsed cone method is used.
Blind Source Separation in Farsi Language by Using Hermitian Angle in Convolutive Environment
Directory of Open Access Journals (Sweden)
Atefeh Soltani
2013-04-01
This paper presents a T-F masking method for convolutive blind source separation based on the Hermitian angle concept. The Hermitian angle is calculated between the T-F domain mixture vector and a reference vector. Two different reference vectors are assumed for calculating two different Hermitian angles, and then these angles are clustered with the k-means or FCM method to estimate unmixing masks. The well-known permutation problem is solved based on k-means clustering of the estimated masks, which are partitioned into small groups. The experimental results show an improvement in performance when using two different reference vectors compared to only one.
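The angle-clustering mechanism can be sketched on a toy T-F plane with a single reference vector (the paper uses two); the mixing vectors, the sparsity model, and the tiny 1-D k-means are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(4)
F, T = 64, 200
# Toy "T-F plane" of an STFT: each cell is dominated by exactly one source
# (the usual W-disjoint-orthogonality assumption behind T-F masking).
owner = rng.integers(0, 2, size=(F, T))
mag = rng.rayleigh(1.0, size=(F, T))
phase = np.exp(2j * np.pi * rng.uniform(size=(F, T)))
S1 = mag * phase * (owner == 0)
S2 = mag * phase * (owner == 1)
a1 = np.array([1.0, 0.9 * np.exp(1j * 0.4)])     # complex mixing vectors
a2 = np.array([1.0, 0.3 * np.exp(-1j * 1.1)])
X = np.stack([a1[0] * S1 + a2[0] * S2, a1[1] * S1 + a2[1] * S2])

# Hermitian angle of each 2-channel mixture vector against one reference:
# cos(theta) = |x^H r| / (||x|| ||r||), which is invariant to the source value.
ref = np.array([1.0 + 0j, 0j])
num = np.abs(np.conj(ref[0]) * X[0] + np.conj(ref[1]) * X[1])
den = np.sqrt(np.abs(X[0]) ** 2 + np.abs(X[1]) ** 2) * np.linalg.norm(ref) + 1e-12
theta = np.arccos(np.clip(num / den, 0.0, 1.0))

# Tiny 1-D k-means (k = 2) on the angles -> binary T-F masks.
c = np.array([theta.min(), theta.max()])
for _ in range(20):
    lab = (np.abs(theta - c[0]) > np.abs(theta - c[1])).astype(int)
    c = np.array([theta[lab == 0].mean(), theta[lab == 1].mean()])

# Fraction of cells assigned to the right source (up to label permutation).
acc = max((lab == owner).mean(), (lab != owner).mean())
```

Because the angle depends only on the mixing vector and not on the source value, all cells dominated by the same source fall at (nearly) the same angle, which is what makes the clustering work.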
DEFF Research Database (Denmark)
Oh, Geok Lian; Brunskog, Jonas
2014-01-01
…elastic wave model to represent the received seismic signal. Two localization algorithms, beamforming and Bayesian inversion, are developed for each physical model. The beamforming algorithms implemented are the modified time-and-delay beamformer and the F-K beamformer. Inversion is posed as an optimization problem to estimate the unknown position variable using the described physical forward models. The proposed four methodologies are demonstrated and compared using seismic signals recorded by geophones set up on the ground surface, generated by a surface seismic excitation. The examples show that for field data, inversion for localization is most advantageous when the forward model completely describes all the elastic wave components, as is the case of the FDTD 3D elastic model.
Generic Uniqueness of a Structured Matrix Factorization and Applications in Blind Source Separation
Domanov, Ignat; Lathauwer, Lieven De
2016-06-01
Algebraic geometry, although little explored in signal processing, provides tools that are very convenient for investigating generic properties in a wide range of applications. Generic properties are properties that hold "almost everywhere". We present a set of conditions that are sufficient for demonstrating the generic uniqueness of a certain structured matrix factorization. This set of conditions may be used as a checklist for generic uniqueness in different settings. We discuss two particular applications in detail. We provide a relaxed generic uniqueness condition for joint matrix diagonalization that is relevant for independent component analysis in the underdetermined case. We present generic uniqueness conditions for a recently proposed class of deterministic blind source separation methods that rely on mild source models. For the interested reader we provide some intuition on how the results are connected to their algebraic geometric roots.
Bernardo, Jose M
2000-01-01
This highly acclaimed text, now available in paperback, provides a thorough account of key concepts and theoretical results, with particular emphasis on viewing statistical inference as a special case of decision theory. Information-theoretic concepts play a central role in the development of the theory, which provides, in particular, a detailed discussion of the problem of specification of so-called prior ignorance. The work is written from the authors' committed Bayesian perspective, but an overview of non-Bayesian theories is also provided, and each chapter contains a wide-ranging critical…
A study and extension of second-order blind source separation to operational modal analysis
Antoni, J.; Chauhan, S.
2013-02-01
Second-order blind source separation (SOBSS) has gained recent interest in operational modal analysis (OMA), since it is able to separate a set of system responses into modal coordinates from which the system poles can be extracted by single-degree-of-freedom techniques. In addition, SOBSS returns a mixing matrix whose columns are the estimates of the system mode shapes. The objective of this paper is threefold. First, a theoretical analysis of current SOBSS methods is conducted within the OMA framework and its precise conditions of applicability are established. Second, a new separation method is proposed that fixes current limitations of SOBSS: it returns estimates of complex mode shapes, it can deal with more active modes than the number of available sensors, and it shows superior performance in the case of heavily damped and/or strongly coupled modes. Third, a theoretical connection is drawn between SOBSS and stochastic subspace identification (SSI), which stands as one of the points of reference in OMA. All approaches are finally compared by means of numerical simulations.
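The core second-order mechanism can be sketched with the classical AMUSE algorithm, a single-lag relative of the SOBSS family; the two-mode synthetic system below (frequencies, damping-free sinusoids plus noise, mixing matrix) is an invented illustration, not the paper's method:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 20000
t = np.arange(n)
# Two "modal" responses with distinct frequencies, hence distinct autocorrelation.
s1 = np.sin(2 * np.pi * 0.010 * t) + 0.1 * rng.normal(size=n)
s2 = np.sin(2 * np.pi * 0.031 * t) + 0.1 * rng.normal(size=n)
S = np.vstack([s1, s2])
A = np.array([[1.0, 0.6], [0.5, 1.0]])           # stand-in "mode shape" matrix
X = A @ S

# AMUSE: whiten, then diagonalize one symmetrized time-lagged covariance
# (full SOBI/SOBSS jointly diagonalizes several lags for robustness).
X = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(X))
Z = E @ np.diag(d ** -0.5) @ E.T @ X
tau = 5
R = Z[:, :-tau] @ Z[:, tau:].T / (n - tau)
R = 0.5 * (R + R.T)                               # symmetrize the lagged covariance
_, U = np.linalg.eigh(R)
Y = U.T @ Z                                       # recovered modal coordinates
C = np.abs(np.corrcoef(np.vstack([Y, S]))[:2, 2:])
```

Separation succeeds because the two modes have different autocorrelation at the chosen lag, which is exactly the second-order structure SOBSS exploits.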
Separation of Radio-Frequency Sources and Localization of Partial Discharges in Noisy Environments
Directory of Open Access Journals (Sweden)
Guillermo Robles
2015-04-01
The detection of partial discharges (PD) can help in early-warning detection systems to protect critical assets in power systems. The radio-frequency emission of these events can be measured with antennas even when the equipment is in service, which reduces maintenance costs dramatically and favours the implementation of condition-based monitoring systems. The drawback of this type of measurement is the difficulty of having a reference signal with which to study the events in a classical phase-resolved partial discharge pattern (PRPD). Therefore, in open-air substations and overhead lines, where interference from radio and TV broadcasting and mobile communications is an important source of noise and other pulsed interference from rectifiers or inverters can be present, it is difficult to identify whether there is partial discharge activity or not. This paper proposes a robust method to separate the events captured with the antennas, identify which of them are partial discharges, and localize the piece of equipment that is having problems. The separation is done with power ratio (PR) maps based on the spectral characteristics of the signal, and the identification of the type of event is done by localizing the source with an array of four antennas. Several classical methods to calculate the time differences of arrival (TDOA) of the emission at the antennas have been tested, and the localization is done using particle swarm optimization (PSO) to minimize a distance function.
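The TDOA-plus-PSO localization step can be sketched as follows; the antenna geometry, the emitter position, the noiseless TDOAs, and the minimal PSO settings are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(6)
c = 3e8                                           # propagation speed (m/s)
ants = np.array([[0.0, 0.0], [5.0, 0.0], [0.0, 5.0], [5.0, 5.0]])
src = np.array([3.2, 1.7])                        # ground-truth emitter position

dist = np.linalg.norm(ants - src, axis=1)
tdoa = (dist - dist[0]) / c                       # measured TDOAs vs. antenna 0

def cost(p):
    """Squared mismatch between measured and candidate TDOAs."""
    dd = np.linalg.norm(ants - p, axis=1)
    return np.sum(((dd - dd[0]) / c - tdoa) ** 2)

# Minimal particle swarm optimization over the 2-D search area.
n_part, iters = 30, 200
pos = rng.uniform(0.0, 5.0, size=(n_part, 2))
vel = np.zeros_like(pos)
pbest, pbest_f = pos.copy(), np.array([cost(p) for p in pos])
gbest = pbest[pbest_f.argmin()].copy()
for _ in range(iters):
    r1, r2 = rng.uniform(size=(2, n_part, 1))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = pos + vel
    f = np.array([cost(p) for p in pos])
    better = f < pbest_f
    pbest[better], pbest_f[better] = pos[better], f[better]
    gbest = pbest[pbest_f.argmin()].copy()
```

With four antennas (three independent TDOAs) the 2-D cost surface generically has a single global minimum, so the swarm's best position `gbest` converges to the emitter.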
Chu, T W; Heaven, S; Gredmaier, L
2015-01-01
Source separated food waste is a valuable feedstock for renewable energy production through anaerobic digestion, and a variety of collection schemes for this material have recently been introduced. The aim of this study was to identify options that maximize collection efficiency and reduce fuel consumption as part of the overall energy balance. A mechanistic model was developed to calculate the fuel consumption of kerbside collection of source segregated food waste, co-mingled dry recyclables and residual waste. A hypothetical city of 20,000 households was considered and nine scenarios were tested with different combinations of collection frequencies, vehicle types and waste types. The results showed that the potential fuel savings from weekly and fortnightly co-collection of household waste range from 7.4% to 22.4% and 1.8% to 26.6%, respectively, when compared to separate collection. A compartmentalized vehicle split 30:70 always performed better than one with two compartments of equal size. Weekly food waste collection with alternate weekly collection of the recyclables and residual waste by two-compartment collection vehicles was the best option to reduce the overall fuel consumption.
A Novel Damage Detection Algorithm using Time-Series Analysis-Based Blind Source Separation
Directory of Open Access Journals (Sweden)
A. Sadhu
2013-01-01
In this paper, a novel damage detection algorithm is developed based on blind source separation in conjunction with time-series analysis. Blind source separation (BSS) is a powerful signal processing tool that is used to identify the modal responses and mode shapes of a vibrating structure using only the knowledge of the responses. In the proposed method, BSS is first employed to estimate the modal response using the vibration measurements. Time-series analysis is then performed to characterize the mono-component modal responses, and the resulting time-series models are subsequently utilized for one-step-ahead prediction of the modal response. With the occurrence of newer measurements containing the signature of the damaged system, a variance-based damage index is used to identify the damage instant. Once the damage instant is identified, the damaged and undamaged modal parameters of the system are estimated in an adaptive fashion. The proposed method solves classical damage detection issues, including the identification of the damage instant, location, and severity of damage. The proposed damage detection algorithm is verified using extensive numerical simulations followed by a full-scale study of the UCLA Factor building using the measured responses under the Parkfield earthquake.
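The prediction-error damage index can be sketched on a single simulated modal response; the AR(2) modal model, the frequency drop at the "damage instant", and the window length are invented for illustration (the paper applies this after a BSS stage on multi-channel data):

```python
import numpy as np

rng = np.random.default_rng(7)
n, fs = 4000, 100.0

def ar2(f, zeta=0.02):
    """AR(2) coefficients for a lightly damped mode at frequency f (Hz)."""
    w = 2 * np.pi * f / fs
    r = np.exp(-zeta * w)
    return 2 * r * np.cos(w), -r ** 2

# Simulated modal response: the mode softens (5.0 Hz -> 3.5 Hz) at sample n//2.
y = np.zeros(n)
e = 0.1 * rng.normal(size=n)
for k in range(2, n):
    a1, a2 = ar2(5.0) if k < n // 2 else ar2(3.5)
    y[k] = a1 * y[k - 1] + a2 * y[k - 2] + e[k]

# Fit an AR(2) one-step predictor on the healthy head of the record.
ks = np.arange(2, n // 4)
Phi = np.column_stack([y[ks - 1], y[ks - 2]])
coef, *_ = np.linalg.lstsq(Phi, y[ks], rcond=None)

# Variance-based damage index: one-step prediction-error variance per window.
ka = np.arange(2, n)
res = y[ka] - (coef[0] * y[ka - 1] + coef[1] * y[ka - 2])
win = 200
index = np.array([res[i:i + win].var()
                  for i in range(0, len(res) - win + 1, win)])
```

The index stays near the innovation variance while the healthy model holds and jumps once the newer measurements carry the damaged system's signature, which pinpoints the damage instant.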
Mining nutrients (N, K, P) from urban source-separated urine by forward osmosis dewatering.
Zhang, Jiefeng; She, Qianhong; Chang, Victor W C; Tang, Chuyang Y; Webster, Richard D
2014-03-18
Separating urine from domestic wastewater promotes a more sustainable municipal wastewater treatment system. This study investigated the feasibility of applying a forward osmosis (FO) dewatering process for nutrient recovery from source-separated urine under different conditions, using seawater or desalination brine as a low-cost draw solution. The filtration process with the active layer facing the feed solution exhibited relatively high water fluxes of up to 20 L/m²·h. The process also showed relatively low rejection of neutral organic nitrogen (urea-N) in fresh urine but improved rejection of ammonium (50-80%) in hydrolyzed urine and high rejection (>90%) of phosphate and potassium in most cases. Compared to simulation based on the solution-diffusion mechanism, higher water flux and solute flux were obtained using fresh or hydrolyzed urine as the feed, which was attributed to the intensive forward nutrient permeation (i.e., of urea, ammonium, and potassium). Membrane fouling could be avoided by prior removal of the spontaneously precipitated crystals in urine. Compared to other urine treatment options, the current process was cost-effective and environmentally friendly for nutrient recovery from urban wastewater at source, yet a comprehensive life-cycle impact assessment might be needed to evaluate and optimize the overall system performance at pilot and full-scale operation.
Villalba, Jesús
2015-01-01
In this document we derive the equations needed to implement a Variational Bayes estimation of the parameters of the simplified probabilistic linear discriminant analysis (SPLDA) model. This can be used to adapt SPLDA from one database to another with little development data, or to implement the fully Bayesian recipe. Our approach is similar to Bishop's VB PPCA.
The Doppler Effect based acoustic source separation for a wayside train bearing monitoring system
Zhang, Haibin; Zhang, Shangbin; He, Qingbo; Kong, Fanrang
2016-01-01
Wayside acoustic condition monitoring and fault diagnosis for train bearings depend on acquired acoustic signals, which consist of mixed signals from different train bearings with obvious Doppler distortion as well as background noises. This study proposes a novel scheme to overcome these difficulties, especially the multi-source problem in a wayside acoustic diagnosis system. In the method, a time-frequency data fusion (TFDF) strategy is applied to weaken Heisenberg's uncertainty limit and obtain a high-resolution time-frequency distribution (TFD) of the signal. Due to the Doppler Effect, the signals from different bearings have different time centers even at the same frequency. A Doppler feature matching search (DFMS) algorithm is then put forward to locate the time centers of different bearings in the TFD spectrogram. With the determined time centers, time-frequency filters (TFF) are designed with thresholds to separate the acoustic signals in the time-frequency domain. Then the inverse STFT (ISTFT) is taken and the signals for each sound source are recovered and filtered. Subsequently, a dynamical resampling method is utilized to remove the Doppler Effect. Finally, accurate diagnosis of train bearing faults can be achieved by applying conventional spectrum analysis techniques to the resampled data. The performance of the proposed method is verified by both simulated and experimental cases. It shows that it is effective to detect and diagnose multiple defective bearings even though they produce multi-source acoustic signals.
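The Doppler-removal-by-resampling step can be sketched for a single moving tone with known geometry; the speeds, distances, tone frequency, and the use of linear interpolation are invented illustration, not the paper's DFMS pipeline:

```python
import numpy as np

fs = 8000.0
c = 343.0              # speed of sound (m/s)
v = 30.0               # source (train) speed (m/s)
d = 3.0                # closest track-to-microphone distance (m)
f0 = 500.0             # tone emitted by a hypothetical faulty bearing (Hz)

t_e = np.arange(-1.0, 1.0, 1 / fs)                  # emission time grid
delay = np.sqrt(d ** 2 + (v * t_e) ** 2) / c        # propagation delay
t_r = t_e + delay                                   # reception times (nonuniform)
emitted = np.sin(2 * np.pi * f0 * t_e)

# The microphone samples uniformly in reception time -> Doppler-distorted record.
t_grid = np.arange(t_r[0], t_r[-1], 1 / fs)
received = np.interp(t_grid, t_r, emitted)

# Dynamic resampling: read the microphone record at the reception time of each
# uniform emission instant, which undoes the Doppler distortion.
restored = np.interp(t_e + delay, t_grid, received)

# Dominant frequency of the restored signal (edges trimmed).
seg = restored[1000:-1000]
spec = np.abs(np.fft.rfft(seg * np.hanning(len(seg))))
peak = np.fft.rfftfreq(len(seg), 1 / fs)[spec.argmax()]
```

After resampling, the spectrum collapses back to a sharp line at the emitted frequency, so ordinary spectrum analysis can be applied to the restored record.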
Zwart, Jonathan T L; Jarvis, Matt J
2015-01-01
Measuring radio source counts is critical for characterizing new extragalactic populations; it brings a wealth of science within reach and will inform forecasts for the SKA and its pathfinders. Yet there is currently great debate (and few measurements) about the behaviour of the 1.4-GHz counts in the microJy regime. One way to push the counts to these levels is via 'stacking', the covariance of a map with a catalogue at higher resolution and (often) a different wavelength. For the first time, we cast stacking in a fully Bayesian framework, applying it to (i) the SKADS simulation and (ii) VLA data stacked at the positions of sources from the VIDEO survey. In the former case, the algorithm recovers the counts correctly when applied to the catalogue, but is biased high when confusion comes into play. This needs to be accounted for in the analysis of data from any relatively low-resolution SKA pathfinders. For the latter case, the observed radio source counts remain flat below the 5-sigma level of 85 microJy as far as 4...
Instantaneous and Frequency-Warped Signal Processing Techniques for Auditory Source Separation.
Wang, Avery Li-Chun
This thesis summarizes several contributions to the areas of signal processing and auditory source separation. The philosophy of Frequency-Warped Signal Processing is introduced as a means for separating the AM and FM contributions to the bandwidth of a complex-valued, frequency-varying sinusoid p(n), transforming it into a signal with slowly-varying parameters. This transformation facilitates the removal of p(n) from an additive mixture while minimizing the amount of damage done to other signal components. The average winding rate of a complex-valued phasor is explored as an estimate of the instantaneous frequency. Theorems are provided showing the robustness of this measure. To implement frequency tracking, a Frequency-Locked Loop algorithm is introduced which uses the complex winding error to update its frequency estimate. The input signal is dynamically demodulated and filtered to extract the envelope. This envelope may then be remodulated to reconstruct the target partial, which may be subtracted from the original signal mixture to yield a new, quickly-adapting form of notch filtering. Enhancements to the basic tracker are made which, under certain conditions, attain the Cramér-Rao bound for the instantaneous frequency estimate. To improve tracking, the novel idea of Harmonic-Locked Loop tracking, using N harmonically constrained trackers, is introduced for tracking signals, such as voices and certain musical instruments. The estimated fundamental frequency is computed from a maximum-likelihood weighting of the N tracking estimates, making it highly robust. The result is that harmonic signals, such as voices, can be isolated from complex mixtures in the presence of other spectrally overlapping signals. Additionally, since phase information is preserved, the resynthesized harmonic signals may be removed from the original mixtures with relatively little damage to the residual signal. Finally, a new methodology is given for designing linear-phase FIR filters
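The average-winding-rate idea can be sketched in a few lines: the mean per-sample phase increment of a complex phasor estimates its instantaneous frequency, and is unaffected by (real) amplitude modulation. The signal parameters and noise level below are invented:

```python
import numpy as np

rng = np.random.default_rng(8)
fs = 8000.0
n = 4000
t = np.arange(n) / fs
f_true = 440.0
# Complex phasor with slow amplitude modulation plus a little complex noise.
p = (1 + 0.3 * np.sin(2 * np.pi * 2 * t)) * np.exp(2j * np.pi * f_true * t)
p += 0.01 * (rng.normal(size=n) + 1j * rng.normal(size=n))

# Average winding rate: mean per-sample phase increment of the phasor.
dphi = np.angle(p[1:] * np.conj(p[:-1]))
f_est = dphi.mean() * fs / (2 * np.pi)
```

Using `angle(p[k+1] * conj(p[k]))` rather than differencing unwrapped phases avoids wrapping artifacts as long as the per-sample increment stays below π.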
Pires, Carlos A. L.; Ribeiro, Andreia F. S.
2017-02-01
We develop an expansion of space-distributed time series into statistically independent uncorrelated subspaces (statistical sources) of low dimension, exhibiting enhanced non-Gaussian probability distributions with geometrically simple chosen shapes (projection pursuit rationale). The method relies upon a generalization of principal component analysis, which is optimal for Gaussian mixed signals, and of independent component analysis (ICA), which is optimized to split non-Gaussian scalar sources. The proposed method, supported by information theory concepts and methods, is independent subspace analysis (ISA), which looks for multi-dimensional, intrinsically synergetic subspaces such as dyads (2D) and triads (3D) that are not separable by ICA. Basically, we optimize rotated variables maximizing certain nonlinear correlations (contrast functions) coming from the non-Gaussianity of the joint distribution. As a by-product, it provides nonlinear variable changes 'unfolding' the subspaces into nearly Gaussian scalars for easier post-processing. Moreover, the new variables still work as nonlinear data exploratory indices of the non-Gaussian variability of the analysed climatic and geophysical fields. The method (ISA, followed by nonlinear unfolding) is tested on three datasets. The first one comes from the Lorenz'63 three-dimensional chaotic model, showing a clear separation into a non-Gaussian dyad plus an independent scalar. The second one is a mixture of propagating waves of random correlated phases in which the emergence of triadic wave resonances imprints a statistical signature in terms of a non-Gaussian non-separable triad. Finally the method is applied to the monthly variability of a high-dimensional quasi-geostrophic (QG) atmospheric model of the Northern Hemispheric winter. We find that the enhanced non-Gaussian dyads of parabolic shape perform much better than the unrotated variables as concerns the separation of the four model's centroid regimes.
Sukholthaman, Pitchayanin; Sharp, Alice
2016-06-01
Municipal solid waste has been considered one of the most immediate and serious problems confronting urban government in most developing and transitional economies. Solid waste service performance depends highly on the effectiveness of the waste collection and transportation process. Generally, this process involves a large amount of expenditure and has very complex and dynamic operational problems. Source separation has a major impact on the effectiveness of a waste management system, as it causes significant changes in the quantity and quality of waste reaching final disposal. To evaluate the impact of effective source separation on waste collection and transportation, this study adopts a decision support tool to comprehend cause-and-effect interactions of different variables in the waste management system. A system dynamics model that envisages the relationships between source separation and the effectiveness of waste management in Bangkok, Thailand is presented. Influential factors that affect waste separation attitudes are addressed, and the result of a change in perception on waste separation is explained. The impacts of different separation rates on the effectiveness of the provided collection service are compared in six scenarios. 'Scenario 5' gives the most promising opportunities, as 40% of residents are willing to conduct organic and recyclable waste separation. The results show that better service of waste collection and transportation, lower monthly expense, extended landfill life, and satisfactory efficiency of the provided service at 60.48% will be achieved at the end of the simulation period. Implications of how to get the public involved in conducting source separation are proposed.
Bayesian analysis of exoplanet and binary orbits
Schulze-Hartung, Tim; Launhardt, Ralf; Henning, Thomas
2012-01-01
We introduce BASE (Bayesian astrometric and spectroscopic exoplanet detection and characterisation tool), a novel program for the combined or separate Bayesian analysis of astrometric and radial-velocity measurements of potential exoplanet hosts and binary stars. The capabilities of BASE are demonstrated using all publicly available data of the binary Mizar A.
Time-Domain Convolutive Blind Source Separation Employing Selective-Tap Adaptive Algorithms
Directory of Open Access Journals (Sweden)
Pan Qiongfeng
2007-01-01
Full Text Available We investigate novel algorithms to improve the convergence and reduce the complexity of time-domain convolutive blind source separation (BSS) algorithms. First, we propose the MMax partial update time-domain convolutive BSS (MMax BSS) algorithm. We demonstrate that the partial update scheme applied in the MMax LMS algorithm for a single channel can be extended to multichannel time-domain convolutive BSS with little deterioration in performance and possible computational complexity savings. Next, we propose an exclusive maximum selective-tap time-domain convolutive BSS algorithm (XM BSS) that reduces the interchannel coherence of the tap-input vectors and improves the conditioning of the autocorrelation matrix, resulting in an improved convergence rate and reduced misalignment. Moreover, the computational complexity is reduced since only half of the tap inputs are selected for updating. Simulation results have shown a significant improvement in convergence rate compared to existing techniques.
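The MMax partial-update idea described in this abstract is easiest to see in the single-channel LMS setting it builds on: at each iteration only the M taps whose inputs have the largest magnitude are adapted. The sketch below is illustrative only (the function name and parameters are assumptions, and the paper's multichannel convolutive extension is not shown):

```python
import numpy as np

def mmax_lms(x, d, L=8, M=4, mu=0.05):
    """Single-channel MMax partial-update LMS sketch.

    At each step only the M taps with the largest-magnitude inputs
    are updated, saving computation relative to full-update LMS.
    """
    w = np.zeros(L)                         # adaptive filter weights
    y = np.zeros(len(x))
    for n in range(L - 1, len(x)):
        u = x[n - L + 1:n + 1][::-1]        # tap-input vector [x[n], ..., x[n-L+1]]
        y[n] = w @ u                        # filter output
        e = d[n] - y[n]                     # error against desired signal
        idx = np.argsort(np.abs(u))[-M:]    # indices of M largest-magnitude taps
        w[idx] += mu * e * u[idx]           # partial (MMax) update
    return w, y
```

With persistently exciting input, the partial update converges to the same solution as full LMS, only somewhat more slowly, which is the performance/complexity trade-off the abstract refers to.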
Saline sewage treatment and source separation of urine for more sustainable urban water management.
Ekama, G A; Wilsenach, J A; Chen, G H
2011-01-01
While energy consumption and its associated carbon emission should be minimized in wastewater treatment, it has a much lower priority than human and environmental health, which are both closely related to efficient water quality management. So conservation of surface water quality and quantity are more important for sustainable development than greenhouse gas (GHG) emissions per se. In this paper, two urban water management strategies to conserve fresh water quality and quantity are considered: (1) source separation of urine for improved water quality and (2) saline (e.g. sea) water toilet flushing for reduced fresh water consumption in coastal and mining cities. The former holds promise for simpler and shorter sludge age activated sludge wastewater treatment plants (no nitrification and denitrification), nutrient (Mg, K, P) recovery and improved effluent quality (reduced endocrine disruptor and environmental oestrogen concentrations) and the latter for significantly reduced fresh water consumption, sludge production and oxygen demand (through using anaerobic bioprocesses) and hence energy consumption. Combining source separation of urine and saline water toilet flushing can reduce sewer crown corrosion and reduce effluent P concentrations. To realize the advantages of these two approaches will require significant urban water management changes in that both need dual (fresh and saline) water distribution and (yellow and grey/brown) wastewater collection systems. While considerable work is still required to evaluate these new approaches and quantify their advantages and disadvantages, it would appear that the investment for dual water distribution and wastewater collection systems may be worth making to unlock their benefits for more sustainable urban development.
Lucci, Gina M; Nash, David; McDowell, Richard W; Condron, Leo M
2014-07-01
Many factors affect the magnitude of nutrient losses from dairy farm systems. Bayesian Networks (BNs) are an alternative to conventional modeling that can evaluate complex multifactor problems using forward and backward reasoning. A BN of annual total phosphorus (TP) exports was developed for a hypothetical dairy farm in the south Otago region of New Zealand and was used to investigate and integrate the effects of different management options under contrasting rainfall and drainage regimes. Published literature was consulted to quantify the relationships that underpin the BN, with preference given to data and relationships derived from the Otago region. In its default state, the BN estimated loads of 0.34 ± 0.42 kg TP ha(-1) for overland flow and 0.30 ± 0.19 kg TP ha(-1) for subsurface flow, which are in line with reported TP losses in overland flow (0-1.1 kg TP ha(-1)) and in drainage (0.15-2.2 kg TP ha(-1)). Site attributes that cannot be managed, like annual rainfall and the average slope of the farm, were found to affect the loads of TP lost from dairy farms. The greatest loads (13.4 kg TP ha(-1)) were predicted to occur with above-average annual rainfall (970 mm), where irrigation of farm dairy effluent was managed poorly, and where Olsen P concentrations were above pasture requirements (60 mg kg(-1)). Most of this loading was attributed to contributions from overland flow. This study demonstrates the value of using a BN to understand the complex interactions between site variables affecting P loss and their relative importance.
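The forward and backward reasoning that a Bayesian Network supports can be illustrated with a deliberately tiny two-node example; all probabilities below are invented for illustration and are not taken from the study's BN:

```python
# Toy two-node Bayesian network: Rainfall -> TP loss (hypothetical numbers).
p_high_rain = 0.3                      # P(rainfall = high)
p_loss = {True: 0.6, False: 0.1}       # P(large TP loss | rainfall high / low)

# Forward reasoning: marginal probability of a large TP loss.
p_large_loss = (p_loss[True] * p_high_rain
                + p_loss[False] * (1 - p_high_rain))   # ≈ 0.25

# Backward reasoning (Bayes' rule): given that a large loss was
# observed, how likely is it that rainfall was high?
p_rain_given_loss = p_loss[True] * p_high_rain / p_large_loss  # ≈ 0.72
```

The same mechanics, with many more nodes and literature-derived conditional probabilities, underlie the farm-scale BN described above.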
Institute of Scientific and Technical Information of China (English)
无
2008-01-01
This letter deals with the frequency domain Blind Source Separation of Convolutive Mixtures (CMBSS). From the frequency representation of the "overlap and save", a Weighted General Discrete Fourier Transform (WGDFT) is derived to replace the traditional Discrete Fourier Transform (DFT). The mixing matrix on each frequency bin could be estimated more precisely from WGDFT coefficients than from DFT coefficients, which improves separation performance. Simulation results verify the validity of WGDFT for frequency domain blind source separation of convolutive mixtures.
Kirstein, Roland
2005-01-01
This paper presents a modification of the inspection game: the "Bayesian Monitoring" model rests on the assumption that judges are interested in enforcing compliant behavior and making correct decisions. They may base their judgements on an informative but imperfect signal which can be generated costlessly. In the original inspection game, monitoring is costly and generates a perfectly informative signal. While the inspection game has only one mixed strategy equilibrium, three Perfect Bayesia...
Jaatinen, Sanna; Kivistö, Anniina; Palmroth, Marja R T; Karp, Matti
2016-09-01
The objective was to demonstrate that a microbial whole cell biosensor, bioluminescent yeast, Saccharomyces cerevisiae (BMAEREluc/ERα), can be applied to detect overall estrogenic activity from fresh and stored human urine. The use of source-separated urine in agriculture removes a human originated estrogen source from wastewater influents, subsequently enabling nutrient recycling. Estrogenic activity in urine should be diminished prior to urine usage in agriculture in order to prevent its migration to soil. A storage period of 6 months is required for hygienic reasons; therefore, estrogenic activity monitoring is of interest. The method measured cumulative female hormone-like activity. Calibration curves were prepared for estrone, 17β-estradiol, 17α-ethinylestradiol and estriol. Estrogen concentrations of 0.29-29,640 μg L(-1) were detectable while limit of detection corresponded to 0.28-35 μg L(-1) of estrogens. The yeast sensor responded well to fresh and stored urine and gave high signals corresponding to 0.38-3,804 μg L(-1) of estrogens in different urine samples. Estrogenic activity decreased during storage, but was still higher than in fresh urine implying insufficient storage length. The biosensor was suitable for monitoring hormonal activity in urine and can be used in screening anthropogenic estrogen-like compounds interacting with the receptor.
Detection of sudden structural damage using blind source separation and time-frequency approaches
Morovati, V.; Kazemi, M. T.
2016-05-01
Seismic signal processing is one of the most reliable methods of detecting structural damage during earthquakes. In this paper, the use of a hybrid method of blind source separation (BSS) and time-frequency analysis (TFA) is explored to detect changes in structural response data. The combination of BSS and TFA is applied to seismic signals because of their non-stationary nature. Firstly, the second-order blind identification technique is used to decompose the structural vibration response signal into modal coordinate signals, which are mono-components suitable for TFA. Then each mono-component signal is analyzed to extract the instantaneous frequency of the structure. Numerical simulations and a real-world seismic-excited structure with time-varying frequencies show the accuracy and robustness of the developed algorithm. TFA of the extracted sources shows that the method can be successfully applied to structural damage detection. The results also demonstrate that the combined method can identify the time instant of structural damage occurrence more sharply and effectively than TFA alone.
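Second-order blind identification of the kind used above can be sketched via its simplest relative, AMUSE, which whitens the mixtures and then diagonalises a single time-lagged covariance (SOBI jointly diagonalises several lags). This simplified numpy version is illustrative, not the authors' implementation:

```python
import numpy as np

def amuse(X, lag=1):
    """AMUSE-style second-order blind identification sketch.

    X : (n_sources, n_samples) array of zero-mean observed mixtures.
    Returns the estimated sources and the unmixing matrix.
    """
    X = X - X.mean(axis=1, keepdims=True)
    # 1) Whiten the observations using the zero-lag covariance.
    d, E = np.linalg.eigh(np.cov(X))
    W_white = E @ np.diag(1.0 / np.sqrt(d)) @ E.T
    Z = W_white @ X
    # 2) Symmetrised time-lagged covariance of the whitened data.
    C_tau = Z[:, :-lag] @ Z[:, lag:].T / (Z.shape[1] - lag)
    C_tau = 0.5 * (C_tau + C_tau.T)
    # 3) Its eigenvectors give the rotation that separates sources
    #    with distinct lagged autocorrelations.
    _, V = np.linalg.eigh(C_tau)
    W = V.T @ W_white
    return W @ X, W
```

Sources with distinct temporal structure (e.g. modal responses at different frequencies) have distinct lag-tau autocorrelations, which is what makes this second-order separation possible without higher-order statistics.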
System identification through nonstationary data using Time-Frequency Blind Source Separation
Guo, Yanlin; Kareem, Ahsan
2016-06-01
Classical output-only system identification (SI) methods are based on the assumption of stationarity of the system response. However, measured response of buildings and bridges is usually non-stationary due to strong winds (e.g. typhoon, and thunder storm etc.), earthquakes and time-varying vehicle motions. Accordingly, the response data may have time-varying frequency contents and/or overlapping of modal frequencies due to non-stationary colored excitation. This renders traditional methods problematic for modal separation and identification. To address these challenges, a new SI technique based on Time-Frequency Blind Source Separation (TFBSS) is proposed. By selectively utilizing "effective" information in local regions of the time-frequency plane, where only one mode contributes to energy, the proposed technique can successfully identify mode shapes and recover modal responses from the non-stationary response where the traditional SI methods often encounter difficulties. This technique can also handle response with closely spaced modes which is a well-known challenge for the identification of large-scale structures. Based on the separated modal responses, frequency and damping can be easily identified using SI methods based on a single degree of freedom (SDOF) system. In addition to the exclusive advantage of handling non-stationary data and closely spaced modes, the proposed technique also benefits from the absence of the end effects and low sensitivity to noise in modal separation. The efficacy of the proposed technique is demonstrated using several simulation based studies, and compared to the popular Second-Order Blind Identification (SOBI) scheme. It is also noted that even some non-stationary response data can be analyzed by the stationary method SOBI. This paper also delineates non-stationary cases where SOBI and the proposed scheme perform comparably and highlights cases where the proposed approach is more advantageous. Finally, the performance of the
Directory of Open Access Journals (Sweden)
Hongyun Han
2016-07-01
Full Text Available This paper examines how and to what degree government policies of garbage fees and voluntary source separation programs, with free indoor containers and garbage bags, can affect the effectiveness of municipal solid waste (MSW) management, in the sense of achieving a desirable reduction of per capita MSW generation. Based on city-level panel data for the years 1998–2012 in China, our empirical analysis indicates that per capita MSW generated is increasing with per capita disposable income, average household size, education levels of households, and the lagged per capita MSW. While both garbage fees and source separation programs have separately led to reductions in per capita waste generation, the interaction of the two policies has resulted in an increase in per capita waste generation due to the following crowding-out effects: Firstly, the positive effect of income dominates the negative effect of the garbage fee. Secondly, there are crowding-out effects of the mandatory charging system and the subsidized voluntary source separation on per capita MSW generation. Thirdly, small subsidies and tax punishments have reduced the intrinsic motivation for voluntary source separation of MSW. Thus, a compatible fee-charging system, higher levels of subsidies, and well-designed public information and education campaigns are required to promote household waste source separation and reduction.
Bench-scale composting of source-separated human faeces for sanitation.
Niwagaba, C; Nalubega, M; Vinnerås, B; Sundberg, C; Jönsson, H
2009-02-01
In urine-diverting toilets, urine and faeces are collected separately so that nutrient content can be recycled unmixed. Faeces should be sanitized before use in agricultural fields due to the presence of possible enteric pathogens. Composting of human faeces with food waste was evaluated as a possible method for this treatment. Temperatures were monitored in three 78-L wooden compost reactors fed with faeces-to-food waste substrates (F:FW) in wet weight ratios of 1:0, 3:1 and 1:1, which were observed for approximately 20 days. To achieve temperatures higher than 15 °C above ambient, insulation was required for the reactors. Use of 25-mm thick styrofoam insulation around the entire exterior of the compost reactors and turning of the compost twice a week resulted in sanitizing temperatures (≥50 °C) being maintained for 8 days in the F:FW=1:1 compost and for 4 days in the F:FW=3:1 compost. In these composts, a reduction of >3 log(10) for E. coli and >4 log(10) for Enterococcus spp. was achieved. The F:FW=1:0 compost, which did not maintain ≥50 °C for a sufficiently long period, was not sanitized, as the counts of E. coli and Enterococcus spp. increased between days 11 and 15. This research provides useful information on the design and operation of family-size compost units for the treatment of source-separated faeces and starchy food residues, most likely available amongst the less affluent rural/urban society in Uganda.
Brewick, P. T.; Smyth, A. W.
2014-12-01
The accurate and reliable estimation of modal damping from output-only vibration measurements of structural systems is a continuing challenge in the fields of operational modal analysis (OMA) and system identification. In this paper a modified version of the blind source separation (BSS)-based Second-Order Blind Identification (SOBI) method was used to perform modal damping identification on a model bridge structure under varying loading conditions. The bridge model was created with finite elements and consisted of a series of stringer beams supported by a larger girder. The excitation was separated into two categories: ambient noise and traffic loads, with noise modeled with random forcing vectors and traffic simulated with moving loads for cars and partially distributed moving masses for trains. The acceleration responses were treated as the mixed output signals for the BSS algorithm. The modified SOBI method used a windowing technique to maximize the amount of information used for blind identification from the responses. The modified SOBI method successfully found the mode shapes for both types of excitation with strong accuracy, but power spectral densities (PSDs) of the recovered modal responses showed signs of distortion for the traffic simulations. The distortion had an adverse effect on the damping ratio estimates for some of the modes, but no correlation could be found between the accuracy of the damping estimates and the accuracy of the recovered mode shapes. The responses and their PSDs were compared to real-world collected data and patterns similar to distortion were observed, implying that this issue likely affects real-world estimates.
Wright, L.; Coddington, O.; Pilewskie, P.
2015-12-01
Current challenges in Earth remote sensing require improved instrument spectral resolution, spectral coverage, and radiometric accuracy. Hyperspectral instruments, deployed on both aircraft and spacecraft, are a growing class of Earth observing sensors designed to meet these challenges. They collect large amounts of spectral data, allowing thorough characterization of both atmospheric and surface properties. The higher accuracy and increased spectral and spatial resolutions of new imagers require new numerical approaches for processing imagery and separating surface and atmospheric signals. One potential approach is source separation, which allows us to determine the underlying physical causes of observed changes. Improved signal separation will allow hyperspectral instruments to better address key science questions relevant to climate change, including land-use changes, trends in clouds and atmospheric water vapor, and aerosol characteristics. In this work, we investigate a Non-negative Matrix Factorization (NMF) method for the separation of atmospheric and land surface signal sources. NMF offers marked benefits over other commonly employed techniques, including non-negativity, which avoids physically impossible results, and adaptability, which allows the method to be tailored to hyperspectral source separation. We adapt our NMF algorithm to distinguish between contributions from different physically distinct sources by introducing constraints on spectral and spatial variability and by using library spectra to inform separation. We evaluate our NMF algorithm with simulated hyperspectral images as well as hyperspectral imagery from several instruments, including the NASA Airborne Visible/Infrared Imaging Spectrometer (AVIRIS), NASA Hyperspectral Imager for the Coastal Ocean (HICO) and National Ecological Observatory Network (NEON) Imaging Spectrometer.
DEFF Research Database (Denmark)
Pires, Sara Monteiro; Hald, Tine
2010-01-01
Salmonella is a major cause of human gastroenteritis worldwide. To prioritize interventions and assess the effectiveness of efforts to reduce illness, it is important to attribute salmonellosis to the responsible sources. Studies have suggested that some Salmonella subtypes have a higher health...
Bessiere, Pierre; Ahuactzin, Juan Manuel; Mekhnacha, Kamel
2013-01-01
Probability as an Alternative to Boolean Logic: While logic is the mathematical foundation of rational reasoning and the fundamental principle of computing, it is restricted to problems where information is both complete and certain. However, many real-world problems, from financial investments to email filtering, are incomplete or uncertain in nature. Probability theory and Bayesian computing together provide an alternative framework to deal with incomplete and uncertain data. Decision-Making Tools and Methods for Incomplete and Uncertain Data: Emphasizing probability as an alternative to Boolean
Haraldsen, Trond Knapp; Andersen, Uno; Krogstad, Tore; Sørheim, Roald
2011-12-01
This study examined the efficiency of different organic waste materials as NPK fertilizer, in addition to the risk of leaching losses related to shower precipitation in the first part of the growing season. The fertilizers were tested in a pot trial on a sandy soil in a greenhouse. Six organic fertilizers were evaluated: liquid anaerobic digestate (LAD) sourced from separated household waste, nitrified liquid anaerobic digestate (NLAD) of the same origin as LAD, meat and bone meal (MBM), hydrolysed salmon protein (HSP), reactor-composted catering waste (CW) and cattle manure (CM). An unfertilized control, calcium nitrate (CN) and Fullgjødsel® 21-4-10 were used as reference fertilizers. At equal amounts of mineral nitrogen both LAD and Fullgjødsel® gave equal yield of barley in addition to equal uptake of N, P, and K in barley grain. NLAD gave significantly lower barley yield than the original LAD due to leaching of nitrate-N after a simulated surplus of precipitation (28 mm) at Zadoks 14. There was significantly increased leaching of nitrate N from the treatments receiving 160 kg N ha(-1) of CN and NLAD in comparison with all the other organic fertilizers. In this study, LAD performed as well as Fullgjødsel® NPK fertilizer and it was concluded that LAD can be recommended as fertilizer for cereals. Nitrification of the ammonium N in the digestate caused significantly increased nitrate leaching, and cannot be recommended.
Efficient source separation algorithms for acoustic fall detection using a microsoft kinect.
Li, Yun; Ho, K C; Popescu, Mihail
2014-03-01
Falls have become a common health problem among older adults. In a previous study, we proposed an acoustic fall detection system (acoustic FADE) that employed a microphone array and beamforming to provide automatic fall detection. However, the previous acoustic FADE had difficulties in detecting the fall signal in environments where interference comes from the fall direction, the number of interferences exceeds FADE's ability to handle, or a fall is occluded. To address these issues, in this paper we propose two blind source separation (BSS) methods for extracting the fall signal out of the interferences to improve the fall classification task. We first propose single-channel BSS by using nonnegative matrix factorization (NMF) to automatically decompose the mixture into a linear combination of several basis components. Based on the distinct patterns of the bases of falls, we identify them efficiently and then construct the interference-free fall signal. Next, we extend the single-channel BSS to the multichannel case through a joint NMF over all channels followed by a delay-and-sum beamformer for additional ambient noise reduction. In our experiments, we used the Microsoft Kinect to collect the acoustic data in real-home environments. The results show that in environments with high interference and background noise levels, the fall detection performance is significantly improved using the proposed BSS approaches.
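The NMF decomposition at the core of the single-channel approach can be sketched with the classic Lee-Seung multiplicative updates (Euclidean cost). In the paper V would be a magnitude spectrogram of the microphone signal; the routine below is a generic illustration, not the authors' code:

```python
import numpy as np

def nmf(V, r, n_iter=500, seed=0):
    """Multiplicative-update NMF sketch (Lee-Seung, Euclidean cost).

    Factors a non-negative matrix V ≈ W @ H with r basis components.
    In single-channel BSS, columns of W act as spectral bases and
    rows of H as their time-varying activations.
    """
    rng = np.random.default_rng(seed)
    n, m = V.shape
    W = rng.random((n, r)) + 1e-3      # non-negative random init
    H = rng.random((r, m)) + 1e-3
    eps = 1e-9                         # guards against division by zero
    for _ in range(n_iter):
        H *= (W.T @ V) / (W.T @ W @ H + eps)   # update activations
        W *= (V @ H.T) / (W @ H @ H.T + eps)   # update bases
    return W, H
```

After factorization, components whose bases match fall-like patterns can be retained and the rest discarded, then the "interference-free" signal is reconstructed from the kept components, which is the selection step the abstract describes.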
Edwards, Joel; Othman, Maazuza; Burn, Stewart; Crossin, Enda
2016-10-01
The collection of source separated kerbside municipal food waste (SSFW) is being incentivised in Australia; however, such a collection is likely to increase the fuel and time a collection truck fleet requires. Therefore, waste managers need to determine whether the incentives outweigh the cost. With literature scarcely describing the magnitude of the increase, and local parameters playing a crucial role in accurately modelling kerbside collection, this paper develops a new general mathematical model that predicts the energy and time requirements of a collection regime whilst incorporating the unique variables of different jurisdictions. The model, Municipal solid waste collect (MSW-Collect), is validated and shown to be more accurate at predicting fuel consumption and trucks required than other common collection models. When predicting changes incurred for five different SSFW collection scenarios, results show that SSFW scenarios require an increase in fuel ranging from 1.38% to 57.59%. There is also a need for additional trucks across most SSFW scenarios tested. All SSFW scenarios are ranked and analysed with regard to fuel consumption; sensitivity analysis is conducted to test key assumptions.
Feature Selection and Blind Source Separation in an EEG-Based Brain-Computer Interface
Directory of Open Access Journals (Sweden)
Michael H. Thaut
2005-11-01
Full Text Available Most EEG-based BCI systems make use of well-studied patterns of brain activity. However, those systems involve tasks that indirectly map to simple binary commands such as "yes" or "no" or require many weeks of biofeedback training. We hypothesized that signal processing and machine learning methods can be used to discriminate EEG in a direct "yes"/"no" BCI from a single session. Blind source separation (BSS) and spectral transformations of the EEG produced a 180-dimensional feature space. We used a modified genetic algorithm (GA) wrapped around a support vector machine (SVM) classifier to search the space of feature subsets. The GA-based search found feature subsets that outperform full feature sets and random feature subsets. Also, BSS transformations of the EEG outperformed the original time series, particularly in conjunction with a subset search of both spaces. The results suggest that BSS and feature selection can be used to improve the performance of even a "direct," single-session BCI.
Insects associated with the composting process of solid urban waste separated at the source
Directory of Open Access Journals (Sweden)
Gladis Estela Morales
2010-01-01
Full Text Available Sarcosaprophagous macroinvertebrates (earthworms, termites and a number of Diptera larvae) enhance changes in the physical and chemical properties of organic matter during degradation and stabilization processes in composting, causing a decrease in the molecular weights of compounds. This activity makes these organisms excellent recyclers of organic matter. This article evaluates the succession of insects associated with the decomposition of solid urban waste separated at the source. The study was carried out in the city of Medellin, Colombia. A total of 11,732 individuals were determined, belonging to the classes Insecta and Arachnida. Species of three orders of Insecta were identified: Diptera, Coleoptera and Hymenoptera. Diptera, corresponding to 98.5% of the total, was the most abundant and diverse group, with 16 families (Calliphoridae, Drosophilidae, Psychodidae, Fanniidae, Muscidae, Milichiidae, Ulidiidae, Scatopsidae, Sepsidae, Sphaeroceridae, Heleomyzidae, Stratiomyidae, Syrphidae, Phoridae, Tephritidae and Curtonotidae), followed by Coleoptera with five families (Carabidae, Staphylinidae, Ptiliidae, Hydrophilidae and Phalacridae). Three stages were observed during the composting process, allowing species associated with each stage to be identified. Other species were also present throughout the whole process. In terms of number of species, Diptera was the most important group observed, particularly Ornidia obesa, considered a highly invasive species, and Hermetia illucens, both reported as beneficial for the decomposition of organic matter.
Bayesian demography 250 years after Bayes.
Bijak, Jakub; Bryant, John
2016-01-01
Bayesian statistics offers an alternative to classical (frequentist) statistics. It is distinguished by its use of probability distributions to describe uncertain quantities, which leads to elegant solutions to many difficult statistical problems. Although Bayesian demography, like Bayesian statistics more generally, is around 250 years old, only recently has it begun to flourish. The aim of this paper is to review the achievements of Bayesian demography, address some misconceptions, and make the case for wider use of Bayesian methods in population studies. We focus on three applications: demographic forecasts, limited data, and highly structured or complex models. The key advantages of Bayesian methods are the ability to integrate information from multiple sources and to describe uncertainty coherently. Bayesian methods also allow for including additional (prior) information next to the data sample. As such, Bayesian approaches are complementary to many traditional methods, which can be productively re-expressed in Bayesian terms.
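The "prior plus data" mechanics the authors highlight can be shown with the smallest possible example, a conjugate Beta-Binomial model; the numbers below are illustrative, not demographic data:

```python
def beta_binomial_update(a, b, k, n):
    """Posterior of a proportion under a Beta(a, b) prior after
    observing k successes in n trials: Beta(a + k, b + n - k)."""
    return a + k, b + n - k

# Weak prior belief that the proportion is around 0.5.
a0, b0 = 2.0, 2.0
# Observe 7 successes in 10 trials.
a1, b1 = beta_binomial_update(a0, b0, 7, 10)

prior_mean = a0 / (a0 + b0)                              # 0.5
post_mean = a1 / (a1 + b1)                               # 9/14 ≈ 0.643
post_var = a1 * b1 / ((a1 + b1) ** 2 * (a1 + b1 + 1))    # posterior uncertainty
```

The posterior mean is a weighted compromise between the prior mean and the sample proportion, and the posterior variance shrinks as more data arrive, which is exactly the coherent handling of uncertainty and prior information the abstract argues for.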
Fehr, M
2014-09-01
Business opportunities in the household waste sector in emerging economies still revolve around the activities of bulk collection and tipping with an open material balance. This research, conducted in Brazil, pursued the objective of shifting opportunities from tipping to reverse logistics in order to close the balance. To do this, it illustrated how specific knowledge of sorted waste composition and reverse logistics operations can be used to determine realistic temporal and quantitative landfill diversion targets in an emerging economy context. Experimentation constructed and confirmed the recycling trilogy that consists of source separation, collection infrastructure and reverse logistics. The study on source separation demonstrated the vital difference between raw and sorted waste compositions. Raw waste contained 70% biodegradable and 30% inert matter. Source separation produced 47% biodegradable, 20% inert and 33% mixed material. The study on collection infrastructure developed the necessary receiving facilities. The study on reverse logistics identified private operators capable of collecting and processing all separated inert items. Recycling activities for biodegradable material were scarce and erratic. Only farmers would take the material as animal feed. No composting initiatives existed. The management challenge was identified as stimulating these activities in order to complete the trilogy and divert the 47% source-separated biodegradable discards from the landfills.
DEFF Research Database (Denmark)
Hansen, Trine Lund; Svärd, Å; Angelidaki, Irini
2003-01-01
A research project has investigated the biogas potential of pre-screened source-separated organic waste. Wastes from five Danish cities have been pre-treated by three methods: screw press; disc screen; and shredder and magnet. This paper outlines the sampling procedure used, the chemical composition...
Arendt, C. A.; Aciego, S.; Hetland, E.
2015-12-01
Processes that drive glacial ablation directly impact surrounding ecosystems and communities that are dependent on glacial meltwater as a freshwater reservoir: crucially, freshwater runoff from alpine and Arctic glaciers has large implications for watershed ecosystems and contingent economies. Furthermore, glacial hydrology processes are a complex and fundamental part of understanding high-latitude environments in the modern era and predicting how they might change in the future. Specifically, developing better estimates of the origin of freshwater discharge, as well as the duration and amplitude of extreme melting and precipitation events, could provide crucial constraints on these processes and allow for glacial watershed systems to be modeled more effectively. In order to investigate the universality of the temporal and spatial melt relationships that exist in glacial systems, I investigate the isotopic composition of glacial meltwater and proximal seawater, including the stable isotopes δ18O and δD, which have been measured in glacial water samples I collected from the alpine Athabasca Glacier in the Canadian Rockies. This abstract is focused on extrapolating the relative contributions of meltwater sources - snowmelt, ice melt, and summer precipitation - using a coupled statistical-chemical model (Arendt et al., 2015). I apply δ18O and δD measurements of Athabasca Glacier subglacial water samples to a Bayesian Monte Carlo (BMC) estimation scheme. Importantly, this BMC model also assesses the uncertainties associated with these meltwater fractional contribution estimates, which provides an assessment of how well the system is constrained. By defining the proportion of overall melt that is coming from snow versus ice using stable isotopes, the volume of water generated by ablation can be calculated. This water volume has two important implications. First, communities that depend on glacial water for aquifer recharge can start assessing future water resources, as
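The BMC mixing approach described above can be sketched as a rejection-sampling scheme. The end-member and observed isotope values below are hypothetical placeholders, not the study's measurements:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical end-member isotope values (per mil) for the three sources;
# rows: snowmelt, ice melt, summer precipitation; columns: d18O, dD.
endmembers = np.array([[-22.0, -168.0],
                       [-19.0, -145.0],
                       [-14.0, -105.0]])

observed = np.array([-19.6, -149.8])  # hypothetical mixed subglacial sample
sigma = np.array([0.3, 2.0])          # assumed analytical uncertainties

# Bayesian Monte Carlo: draw candidate fraction vectors from a flat
# Dirichlet prior and keep those whose predicted mixture fits the data.
candidates = rng.dirichlet(np.ones(3), size=200_000)
predicted = candidates @ endmembers
misfit = (((predicted - observed) / sigma) ** 2).sum(axis=1)
accepted = candidates[misfit < 1.0]     # crude acceptance region

posterior_mean = accepted.mean(axis=0)  # fractional contributions
posterior_sd = accepted.std(axis=0)     # how well each fraction is constrained
```

The spread of the accepted fraction vectors is what provides the uncertainty assessment the abstract refers to: a large posterior standard deviation means the two tracers constrain that source poorly.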
Institute of Scientific and Technical Information of China (English)
Kazi Takpaya; Wei Gang
2003-01-01
Blind identification-blind equalization for Finite Impulse Response (FIR) Multiple Input-Multiple Output (MIMO) channels can be reformulated as the problem of blind sources separation. It has been shown that blind identification via decorrelating sub-channels method could recover the input sources. The Blind Identification via Decorrelating Sub-channels(BIDS)algorithm first constructs a set of decorrelators, which decorrelate the output signals of subchannels, and then estimates the channel matrix using the transfer functions of the decorrelators and finally recovers the input signal using the estimated channel matrix. In this paper, a new approximation of the input source for FIR-MIMO channels based on the maximum likelihood source separation method is proposed. The proposed method outperforms BIDS in the presence of additive white Gaussian noise.
Source separation for materials recovery in Norway; Kildesortering gir oss god samvittighet
Energy Technology Data Exchange (ETDEWEB)
Bjerk, Jan
2003-07-01
In 1995, the Norwegian Ministry of the Environment entered into an agreement with the industries about collection of packaging waste. Since then almost a "waste revolution" has occurred in Norway. The fraction of household waste that was sorted out prior to deposition increased from 9 per cent in 1992 to 44 per cent in 2001. But this has been achieved only by extensive use of fees and of information campaigns towards municipalities, industry and consumers. Much of the information has come from the individual materials companies, but more general information about waste and recycling has been channelled through a foundation called Loop, which was established in 2000. For example, in cooperation with NRF, Norsk Renholdsverks Forening (The Norwegian Association of Solid Waste Management), Loop has established the Loop Miljoeskole (Loop Environmental School). This "school" consists of booklets aimed at school children. In addition, primary schools using the booklets are invited to visit local firms. The "school" became a formidable success less than half a year after its establishment. Essentially the goal set for source separation has been reached, and the extra costs of raising the collection percentage by one point are high. The industries are probably not willing to spend much more than the half-billion NOK they have already spent in recent years. Not all is well: waste is left everywhere, and more than 400 mill NOK is spent every year on tidying up. The Public Roads Administration alone spends over 100 mill NOK per year tidying up at stopping places and roadsides. In 2004, Loop and NRF will revive a campaign from the 1960s: "Keep Norway Clean".
Complete nutrient recovery from source-separated urine by nitrification and distillation.
Udert, K M; Wächter, M
2012-02-01
In this study we present a method to recover all nutrients from source-separated urine in a dry solid by combining biological nitrification with distillation. In a first process step, a membrane-aerated biofilm reactor was operated stably for more than 12 months, producing a nutrient solution with a pH between 6.2 and 7.0 (depending on the pH set-point) and an ammonium to nitrate ratio between 0.87 and 1.15 gN/gN. The maximum nitrification rate was 1.8 ± 0.3 gN m^-2 d^-1. Process stability was achieved by controlling the pH via the influent. In the second process step, real nitrified urine and synthetic solutions were concentrated in lab-scale distillation reactors. All nutrients were recovered in a dry powder except for some ammonia (less than 3% of total nitrogen). We estimate that the primary energy demand of a simple nitrification/distillation process is four to five times higher than that of removing nitrogen and phosphorus in a conventional wastewater treatment plant and producing the equivalent amount of phosphorus and nitrogen fertilizers. However, the primary energy demand can be reduced to values very close to conventional treatment if 80% of the water is removed with reverse osmosis and distillation is operated with vapor compression. The ammonium nitrate content of the solid residue is below the limit at which stringent EU safety regulations for fertilizers come into effect; nevertheless, we propose some additional process steps that will increase the thermal stability of the solid product.
Estimation of nitrite in source-separated nitrified urine with UV spectrophotometry.
Mašić, Alma; Santos, Ana T L; Etter, Bastian; Udert, Kai M; Villez, Kris
2015-11-15
Monitoring of nitrite is essential for an immediate response and prevention of irreversible failure of decentralized biological urine nitrification reactors. Although a few sensors are available for nitrite measurement, none of them are suitable for applications in which both nitrite and nitrate are present in very high concentrations. Such is the case in collected source-separated urine, stabilized by nitrification for long-term storage. Ultraviolet (UV) spectrophotometry in combination with chemometrics is a promising option for monitoring of nitrite. In this study, an immersible in situ UV sensor is investigated for the first time, so as to establish a relationship between UV absorbance spectra and nitrite concentrations in nitrified urine. The study focuses on the effects of suspended particles and saturation on the absorbance spectra and the chemometric model performance. Detailed analysis indicates that suspended particles in nitrified urine have a negligible effect on nitrite estimation, implying that sample filtration is not necessary as a pretreatment. In contrast, saturation due to very high concentrations affects the model performance severely, suggesting dilution as an essential sample preparation step. However, this can also be mitigated by simply removing the saturated, lower end of the UV absorbance spectra and extracting information from the secondary, weaker nitrite absorbance peak. This approach allows for estimation of nitrite with a simple chemometric model and without sample dilution. These results are promising for a practical application of the UV sensor for in situ nitrite measurement in a urine nitrification reactor, given the exceptional quality of the nitrite estimates in comparison to previous studies.
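The saturation workaround described above (discard the saturated lower end of the spectrum and calibrate on the weaker secondary peak) can be sketched with synthetic spectra. The peak positions, widths, detector limit, and concentrations below are assumed for illustration, not taken from the study:

```python
import numpy as np

rng = np.random.default_rng(1)

wavelengths = np.arange(200, 400)                    # nm, hypothetical grid

def nitrite_spectrum(conc):
    """Synthetic absorbance: a strong primary band near 210 nm that
    saturates the detector, and a weaker secondary band near 355 nm.
    Both bands are illustrative Gaussians."""
    primary = 50.0 * conc * np.exp(-((wavelengths - 210) / 10.0) ** 2)
    secondary = 1.5 * conc * np.exp(-((wavelengths - 355) / 20.0) ** 2)
    a = primary + secondary + rng.normal(0, 0.002, wavelengths.size)
    return np.minimum(a, 2.5)                        # detector saturates at A = 2.5

train_conc = np.linspace(0.1, 1.0, 10)               # arbitrary concentration units
train_spectra = np.array([nitrite_spectrum(c) for c in train_conc])

# Discard the saturated lower end of the spectrum and calibrate a linear
# chemometric model on the weak secondary peak only.
keep = wavelengths > 300
X = train_spectra[:, keep]
coef, *_ = np.linalg.lstsq(
    np.column_stack([X, np.ones(len(train_conc))]), train_conc, rcond=None)

test_c = 0.55
est = nitrite_spectrum(test_c)[keep] @ coef[:-1] + coef[-1]
```

Because the secondary band never clips, the linear model calibrated on the unsaturated channels recovers the concentration without diluting the sample, which is the essence of the strategy reported above.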
Source-separated municipal solid waste compost application to Swiss chard and basil.
Zheljazkov, Valtcho D; Warman, Philip R
2004-01-01
A growth room experiment was conducted to evaluate the bioavailability of Cu, Mn, Zn, Ca, Fe, K, Mg, P, S, As, B, Cd, Co, Cr, Hg, Mo, Na, Ni, Pb, and Se from a sandy loam soil amended with source-separated municipal solid waste (SSMSW) compost. Basil (Ocimum basilicum L.) and Swiss chard (Beta vulgaris L.) were grown in mixtures of 0, 20, 40, and 60% SSMSW compost to soil (by volume). Soils and compost were sequentially extracted to fractionate Cu, Pb, and Zn into exchangeable (EXCH), iron- and manganese-oxide-bound (FeMnOX), organic-matter (OM), and structurally bound (SB) forms. Overall, in both species, the proportion of Cu, Pb, and Zn levels in different fractions followed the sequence: SB > OM > FeMnOX > EXCH for Cu; FeMnOX = SB > OM > EXCH for Pb; and FeMnOX > SB = EXCH > OM for Zn. Application of SSMSW compost increased soil pH and electrical conductivity (EC), and increased the concentration of Cu, Pb, and Zn in all fractions except EXCH Pb. Basil yields were greatest in the 20% treatment, but Swiss chard yields were greater in all compost-amended soils relative to the unamended soil. Basil plants in the 20 or 40% compost treatments reached flowering earlier than plants from other treatments. Additions of SSMSW compost to soil altered the basil essential oil, but the oil was free of metals. The results from this study suggest that mature SSMSW compost with concentrations of Cu, Pb, Mo, and Zn of 311, 223, 17, and 767 mg/kg, respectively, could be used as a soil conditioner without phytotoxic effects on agricultural crops and without increasing the normal range of Cu, Pb, and Zn in crop tissue. However, the long-term effect of the accumulation of heavy metals in soils needs to be carefully considered.
Zamora, Patricia; Georgieva, Tanya; Salcedo, Inmaculada; Elzinga, Nico; Kuntke, Philipp; Buisman, Cees J.N.
2016-01-01
BACKGROUND: The treatment of source-separated urine allows for a more sustainable approach to nutrient recovery in current wastewater treatment plants. Struvite precipitation from urine yields a slow-release fertilizer (struvite) with a high marketable value for agricultural use. Extensive resear
Zeeman, G.; Kujawa, K.; Mes, de T.Z.D.; Graaff, de M.S.; Abu-Ghunmi, L.N.A.H.; Mels, A.R.; Meulman, B.; Temmink, B.G.; Buisman, C.J.N.; Lier, van J.B.; Lettinga, G.
2008-01-01
Based on results of pilot scale research with source-separated black water (BW) and grey water (GW), a new sanitation concept is proposed. BW and GW are both treated in a UASB (-septic tank) for recovery of CH4 gas. Kitchen waste is added to the anaerobic BW treatment for doubling the biogas product
Bernstad, Anna; la Cour Jansen, Jes; Aspegren, Henrik
2011-03-01
Through an agreement with EEE producers, Swedish municipalities are responsible for collection of hazardous waste and waste electrical and electronic equipment (WEEE). In most Swedish municipalities, collection of these waste fractions is concentrated to waste recycling centres where households can source-separate and deposit hazardous waste and WEEE free of charge. However, the centres are often located on the outskirts of city centres and cars are needed in order to use the facilities in most cases. A full-scale experiment was performed in a residential area in southern Sweden to evaluate effects of a system for property-close source separation of hazardous waste and WEEE. After the system was introduced, results show a clear reduction in the amount of hazardous waste and WEEE disposed of incorrectly amongst residual waste or dry recyclables. The systems resulted in a source separation ratio of 70 wt% for hazardous waste and 76 wt% in the case of WEEE. Results show that households in the study area were willing to increase source separation of hazardous waste and WEEE when accessibility was improved and that this and similar collection systems can play an important role in building up increasingly sustainable solid waste management systems.
Directory of Open Access Journals (Sweden)
Saswati Swapna Dash
2014-07-01
This paper presents an overall study of feedback control of a Z-source converter fed separately excited DC motor with a centrifugal pump set. A Z-source converter can operate in both voltage buck and boost modes using an LC impedance network. In this paper the dynamic modeling of the Z-source converter with motor load and centrifugal pump set is carried out, with new findings. The compensators for the speed feedback loop are designed using average state-space analysis and a small-signal model of the system. The feedback loop is designed by classical control methods. The experiment is carried out in the MATLAB environment and the result is verified by simulation.
Bayesian stable isotope mixing models
In this paper we review recent advances in Stable Isotope Mixing Models (SIMMs) and place them into an over-arching Bayesian statistical framework which allows for several useful extensions. SIMMs are used to quantify the proportional contributions of various sources to a mixtur...
Single channel source separation of radar fuze mixed signal based on phase difference analysis
Institute of Scientific and Technical Information of China (English)
Hang ZHU; Shu-ning ZHANG; Hui-chang ZHAO
2014-01-01
A new method based on phase difference analysis is proposed for separating the single-channel mixed signal of a radar fuze. The method estimates the mixing coefficients of the de-noised signals through the cumulants of the mixed signals, solves for the candidate data set from the mixing coefficients and the analytical form of the signals, and resolves the problem of vector ambiguity by analyzing the phase differences. Signal separation is realized by exchanging data between the solutions. The waveform similarity coefficients are calculated, and the time-frequency distributions of the separated signals are analyzed. The results show that the proposed method is effective.
Anaerobic treatment in decentralised and source-separation-based sanitation concepts
Kujawa-Roeleveld, K.; Zeeman, G.
2006-01-01
Anaerobic digestion of wastewater should be a core technology employed in decentralised sanitation systems especially when their objective is also resource conservation and reuse. The most efficient system involves separate collection and anaerobic digestion of the most concentrated domestic wastewa
Xu, Kangning; Wang, Chengwen; Zheng, Min; Yuan, Xin
2010-11-01
This study aimed to construct an on-site eco-sewerage system for modern office buildings in urban areas, based on combined innovative technologies of vacuum and source separation. Results showed that source-separated grey water had low concentrations of pollutants, which facilitated its reuse. However, the system had a low separation efficiency between the yellow water and the brown water, caused by clogging in the urine collection from the urine-diverting toilets. During the storage of yellow water for liquid fertilizer production, nearly all urea nitrogen was transformed to ammonium nitrogen and about 2/3 of the phosphorus was lost to struvite precipitation. Total bacteria and coliforms increased at first during storage, but then decreased to low concentrations. The anaerobic/anoxic/aerobic MBR had high elimination rates for COD, ammonium nitrogen and total nitrogen of the brown water: 94.2%, 98.1% and 95.1%, respectively. However, the effluent still had high colority, nitrate and phosphorus contents, which limited its application as flushing water. Even so, the effluent might be used as dilution water for the yellow-water fertilizer. Based on the results, and assuming ideal operation of the vacuum source-separation system, a future plan for an on-site eco-sewerage system for modern office buildings was constructed. Its sustainability was validated by analysis of the substance flows of water and nutrients.
Introduction to Bayesian statistics
Bolstad, William M
2017-01-01
There is a strong upsurge in the use of Bayesian methods in applied statistical analysis, yet most introductory statistics texts only present frequentist methods. Bayesian statistics has many important advantages that students should learn about if they are going into fields where statistics will be used. In this Third Edition, four newly-added chapters address topics that reflect the rapid advances in the field of Bayesian statistics. The author continues to provide a Bayesian treatment of introductory statistical topics, such as scientific data gathering, discrete random variables, robust Bayesian methods, and Bayesian approaches to inference for discrete random variables, binomial proportion, Poisson, normal mean, and simple linear regression. In addition, newly-developing topics in the field are presented in four new chapters: Bayesian inference with unknown mean and variance; Bayesian inference for the Multivariate Normal mean vector; Bayesian inference for the Multiple Linear Regression Model; and Computati...
Bayesian artificial intelligence
Korb, Kevin B
2003-01-01
As the power of Bayesian techniques has become more fully realized, the field of artificial intelligence has embraced Bayesian methodology and integrated it to the point where an introduction to Bayesian techniques is now a core course in many computer science programs. Unlike other books on the subject, Bayesian Artificial Intelligence keeps mathematical detail to a minimum and covers a broad range of topics. The authors integrate all of Bayesian net technology and learning Bayesian net technology and apply them both to knowledge engineering. They emphasize understanding and intuition but also provide the algorithms and technical background needed for applications. Software, exercises, and solutions are available on the authors' website.
Bayesian artificial intelligence
Korb, Kevin B
2010-01-01
Updated and expanded, Bayesian Artificial Intelligence, Second Edition provides a practical and accessible introduction to the main concepts, foundation, and applications of Bayesian networks. It focuses on both the causal discovery of networks and Bayesian inference procedures. Adopting a causal interpretation of Bayesian networks, the authors discuss the use of Bayesian networks for causal modeling. They also draw on their own applied research to illustrate various applications of the technology.New to the Second EditionNew chapter on Bayesian network classifiersNew section on object-oriente
DEFF Research Database (Denmark)
Fernandez Grande, Efren; Jacobsen, Finn
2010-01-01
A method of estimating the sound field radiated by a source under non-anechoic conditions has been examined. The method uses near field acoustic holography based on a combination of pressure and particle velocity measurements in a plane near the source for separating outgoing and ingoing wave components. The outgoing part of the sound field is composed of both radiated and scattered waves. The method compensates for the scattered components of the outgoing field on the basis of the boundary condition of the problem, exploiting the fact that the sound field is reconstructed very close to the source. Thus the radiated free-field component is estimated simultaneously with solving the inverse problem of reconstructing the sound field near the source. The method is particularly suited to cases in which the overall contribution of reflected sound in the measurement plane is significant.
DEFF Research Database (Denmark)
Naroznova, Irina; Møller, Jacob; Larsen, Bjarne
2016-01-01
A new technology for pre-treating source-separated organic household waste prior to anaerobic digestion was assessed, and its performance was compared to existing alternative pre-treatment technologies. This pre-treatment technology is based on waste pulping with water, using a specially developed screw mechanism. It rejects more than 95% (wet weight) of non-biodegradable impurities in waste collected from households and generates biopulp ready for anaerobic digestion. Overall, 84-99% of biodegradable material (on a dry weight basis) in the waste was recovered to the produced biomass. The data generated in this study could be used for the environmental assessment of the technology and thus help in selecting the best pre-treatment technology for source-separated organic household waste.
Prospects of Source-Separation-Based Sanitation Concepts: A Model-Based Study
Tervahauta, T.H.; Trang Hoang,; Hernández, L.; Zeeman, G.; Buisman, C.J.N.
2013-01-01
Separation of different domestic wastewater streams and targeted on-site treatment for resource recovery has been recognized as one of the most promising sanitation concepts to re-establish the balance in carbon, nutrient and water cycles. In this study a model was developed based on literature data
Vrins, Frédéric
2007-01-01
In recent years, Independent Component Analysis (ICA) has become a fundamental tool in adaptive signal and data processing, especially in the field of Blind Source Separation (BSS). Even though algebraic solutions to the ICA problem can be found in some cases, iterative methods are very popular. Among them is the class of information-theoretic approaches, relying on entropies. The associated objective functions are maximized based on optimization schemes, and o...
Blind Source Separation Algorithm Based on Correntropy
Institute of Scientific and Technical Information of China (English)
成昊; 唐斌
2013-01-01
A blind source separation algorithm based on correntropy is presented. Unlike traditional independent component analysis (ICA) methods, which utilize fourth-order statistics or temporal structure to achieve blind source separation, this algorithm is motivated by the notion of correntropy in information-theoretic learning, utilizing the even-order statistics implied in correntropy. The cost function is established according to the relationship between the parametric centered correntropy and the independence measure, and is then minimized using an optimization algorithm to obtain the demixing matrix and separate the signals. Simulations show that the separation performance is better than that of traditional ICA methods when separating mixtures of super-Gaussian and sub-Gaussian sources.
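A minimal sketch of the centered-correntropy independence measure that the cost function above builds on, using a Gaussian kernel; the test signals and the kernel width are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(2)

def centered_correntropy(x, y, sigma=1.0):
    """Sample estimator of centered correntropy with a Gaussian kernel.
    It vanishes when x and y are independent, which is the property a
    correntropy-based separation criterion exploits."""
    k = lambda d: np.exp(-d ** 2 / (2 * sigma ** 2))
    cross = np.mean(k(x - y))                    # E[k(X - Y)], paired samples
    marg = np.mean(k(x[:, None] - y[None, :]))   # E_X E_Y [k(X - Y)], all pairs
    return cross - marg

n = 2000
s = rng.uniform(-1, 1, n)          # sub-Gaussian source
noise = rng.normal(0, 1, n)        # independent Gaussian signal

indep = centered_correntropy(s, noise)                # near zero
dep = centered_correntropy(s, 0.8 * s + 0.2 * noise)  # clearly positive
```

Because the Gaussian kernel expands into all even moments of the difference, this single scalar captures higher-order dependence that a plain correlation coefficient would miss.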
Directory of Open Access Journals (Sweden)
Huaizhi Su
2015-01-01
Distributed temperature sensing (DTS) provides important technology support for the earth-rock junctions of dike projects (ERJD), the binding sites between culverts, gates and pipes and the dike body and foundation. In this study, a blind source separation model is used to identify leakages from the DTS temperature data collected in leakage monitoring of ERJD. First, a denoising method based on wavelet packet signal decomposition is applied to the distributed optical fiber temperature monitoring data. The fiber temperature records are a combined response to leakages and other factors, and the underlying response mechanism is not well understood, so a blind source separation technique is adopted. The pattern of the fiber temperature measurement data is then analyzed, and its temporal and spatial evolution is discussed. The blind source separation model is realized by combining independent component analysis (ICA) with principal component analysis (PCA). A practical test on an example shows that the method can efficiently locate and identify the leakage location of an ERJD. This work is expected to be useful for further scientific research and efficient application of distributed optical fiber sensing technology.
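The ICA-plus-PCA combination described above can be sketched on synthetic fiber-temperature data. The latent components (a smooth ambient trend and a step-like leakage anomaly), the mixing matrix, and the noise level are all hypothetical:

```python
import numpy as np
from sklearn.decomposition import PCA, FastICA

rng = np.random.default_rng(3)
t = np.linspace(0, 10, 1000)

# Hypothetical latent components of a fiber temperature record:
# a smooth ambient trend and a localized leakage-induced anomaly.
ambient = np.sin(2 * np.pi * 0.2 * t)
leakage = (t > 6).astype(float)          # step-like change at a leak

S = np.c_[ambient, leakage]
A = np.array([[1.0, 0.4], [0.6, 1.0], [0.8, 0.7]])  # mixing to 3 fiber sections
X = S @ A.T + 0.01 * rng.normal(size=(1000, 3))

# PCA whitens and reduces the multi-channel record, then ICA unmixes it.
X_white = PCA(n_components=2, whiten=True).fit_transform(X)
recovered = FastICA(n_components=2, random_state=0).fit_transform(X_white)

# Match each recovered component to the closest true source by |correlation|
# (ICA leaves sign and ordering undetermined).
corr = np.abs(np.corrcoef(recovered.T, S.T)[:2, 2:])
best = corr.max(axis=1)
```

The step-shaped component recovered this way is the leakage signature; in the monitoring application its spatial channel weights point to the leaking fiber section.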
Bai, Mingsian R; Yao, Yueh Hua; Lai, Chang-Sheng; Lo, Yi-Yang
2016-03-01
In this paper, four delay-and-sum (DAS) beamformers formulated in the modal domain and the space domain for open and solid spherical apertures are examined through numerical simulations. The resulting beampatterns reveal that the mainlobe of the solid spherical DAS array is only slightly narrower than that of the open array, whereas the sidelobes of the modal domain array are more significant than those of the space domain array due to the discrete approximation of continuous spherical Fourier transformation. To verify the theory experimentally, a three-dimensionally printed spherical array on which 32 micro-electro-mechanical system microphones are mounted is utilized for localization and separation of sound sources. To overcome the basis mismatch problem in signal separation, source localization is first carried out using minimum variance distortionless response beamformer. Next, Tikhonov regularization (TIKR) and compressive sensing (CS) are employed to extract the source signal amplitudes. Simulations and experiments are conducted to validate the proposed spherical array system. Objective perceptual evaluation of speech quality test and a subjective listening test are undertaken in performance evaluation. The experimental results demonstrate better separation quality achieved by the CS approach than by the TIKR approach at the cost of computational complexity.
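The space-domain delay-and-sum principle used above can be illustrated with a simple uniform linear array rather than a spherical aperture; the array geometry, tone frequency, and source bearing below are assumed values, not the paper's setup:

```python
import numpy as np

rng = np.random.default_rng(4)

c, fs, f0 = 343.0, 16000, 4000.0     # speed of sound, sample rate, tone (assumed)
mics = np.arange(8) * 0.04           # 8-mic uniform linear array, 4 cm pitch
theta_true = np.deg2rad(35.0)        # hypothetical source bearing

t = np.arange(2048) / fs
delays = mics * np.sin(theta_true) / c       # far-field arrival delay per mic
X = np.array([np.sin(2 * np.pi * f0 * (t - d)) for d in delays])
X += 0.1 * rng.normal(size=X.shape)          # sensor noise

# Frequency-domain delay-and-sum: apply steering phases for each candidate
# bearing and pick the angle that maximizes the summed output power.
Xf = np.fft.rfft(X, axis=1)
bin0 = int(round(f0 * t.size / fs))          # FFT bin of the tone
angles = np.deg2rad(np.arange(-90, 91))
steer = np.exp(-2j * np.pi * f0 * mics[:, None] * np.sin(angles)[None, :] / c)
power = np.abs(steer.conj().T @ Xf[:, bin0]) ** 2
theta_hat = float(np.degrees(angles[np.argmax(power)]))
```

A spherical array applies the same idea with three-dimensional steering vectors (or spherical-harmonic weights in the modal domain), which is what shapes the mainlobe and sidelobe differences reported above.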
Applied Bayesian Hierarchical Methods
Congdon, Peter D
2010-01-01
Bayesian methods facilitate the analysis of complex models and data structures. Emphasizing data applications, alternative modeling specifications, and computer implementation, this book provides a practical overview of methods for Bayesian analysis of hierarchical models.
Kahlon, Arshdeep S; Periyalwar, Shalini; Yanikomeroglu, Halim
2012-01-01
We show that, for independent interfering sources and a signal link with exponentially distributed received power, the total probability of outage can be decomposed as a simple expression of the outages from the individual interfering sources. We give a mathematical proof of this result, and discuss some immediate implications, showing how it results in important simplifications to statistical outage analysis. We also discuss its application to two active topics of study: spectrum sharing, and sum of interference powers (e.g., lognormal) analysis.
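The decomposition result can be checked numerically: for an exponentially distributed signal power, the total success probability is the product of the per-source success probabilities. The threshold value and the interference distributions below are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(5)

n = 1_000_000
theta = 1.0                           # SIR outage threshold (assumed)
signal = rng.exponential(1.0, n)      # exponentially distributed signal power

# Independent interferers with arbitrary (here lognormal) power distributions.
interferers = [rng.lognormal(m, 0.5, n) for m in (-1.0, -1.5, -2.0)]

# Total outage: signal power falls below theta times the summed interference.
total_out = np.mean(signal < theta * sum(interferers))

# Outage from each interferer computed as if it acted alone.
single_out = [np.mean(signal < theta * I) for I in interferers]

# Decomposition: success probabilities multiply when the signal power is
# exponential, so the total outage follows from the individual outages.
predicted_out = 1.0 - np.prod([1.0 - p for p in single_out])
```

The identity holds because the exponential survival function turns the probability of exceeding a sum into a product of expectations, one per independent interferer, regardless of the interferers' own distributions.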
Gelman, Andrew; Stern, Hal S; Dunson, David B; Vehtari, Aki; Rubin, Donald B
2013-01-01
FUNDAMENTALS OF BAYESIAN INFERENCE: Probability and Inference; Single-Parameter Models; Introduction to Multiparameter Models; Asymptotics and Connections to Non-Bayesian Approaches; Hierarchical Models. FUNDAMENTALS OF BAYESIAN DATA ANALYSIS: Model Checking; Evaluating, Comparing, and Expanding Models; Modeling Accounting for Data Collection; Decision Analysis. ADVANCED COMPUTATION: Introduction to Bayesian Computation; Basics of Markov Chain Simulation; Computationally Efficient Markov Chain Simulation; Modal and Distributional Approximations. REGRESSION MODELS: Introduction to Regression Models; Hierarchical Linear
Isotope separation of the Yb-168 stable isotope for low energy gamma ray sources
Energy Technology Data Exchange (ETDEWEB)
Park, Hyun Min; Kwon, Duck Hee; Cha, Yong Ho; Lee, Ki Tae; Nam, Sung Mo; Yoo, Jaek Won; Han, Jae Min; Rhee, Yong Joo [Lab. of Quantum Optics, Korea Atomic Energy Research Institute, Taejeon (Korea, Republic of)
2003-07-01
We developed laser isotope separation technology for stable isotopes of low-melting-point metals. Yb-168 can be used effectively in non-destructive testing (NDT) after it is transformed to Yb-169 by neutron irradiation in a nuclear reactor. For this application, the isotopic purity of Yb-168 should be enhanced to more than 15% from its natural abundance of 0.135%. Our isotope separation system consists of a laser system, a Yb vapor generating system, and a photoionized-particle extraction system. For the system, we developed a high-repetition-rate diode-pumped solid-state laser and 3-color dye lasers. Yb vapor was generated by resistively heating a solid Yb sample. The photo-ions produced by resonance ionization were extracted by a purpose-built extractor. We produced more than 20 mg of enriched Yb metal with a Yb-168 abundance of 25.8%.
Quednau, Philipp; Trommer, Ralph; Schmidt, Lorenz-Peter
2016-03-01
Wireless transmission systems in smart metering networks have the advantage of lower installation costs, since no separate wired infrastructure needs to be expanded, but they suffer from transmission problems. In this paper the issue of interference of wirelessly transmitted smart meter data with third-party systems and with data from other meters is investigated, and an approach for solving the problem is presented. A multi-channel wireless M-Bus receiver was developed to separate the desired data from unwanted interferers by spatial filtering. The corresponding algorithms are presented and the influence of different antenna types on the spatial filtering is investigated. The performance of the spatial filtering is evaluated by extensive measurements in a realistic environment with several hundred active wireless M-Bus transponders. These measurements correspond to the future environment for data collectors, as they took place in rural and urban areas with smart gas meters equipped with wireless M-Bus transponders installed in almost all surrounding buildings.
Hajipour Sardouie, Sepideh; Bagher Shamsollahi, Mohammad; Albera, Laurent; Merlet, Isabelle
2015-05-01
Removing muscle activity from ictal ElectroEncephaloGram (EEG) data is an essential preprocessing step in diagnosis and study of epileptic disorders. Indeed, at the very beginning of seizures, ictal EEG has a low amplitude and its morphology in the time domain is quite similar to muscular activity. Contrary to the time domain, ictal signals have specific characteristics in the time-frequency domain. In this paper, we use the time-frequency signature of ictal discharges as a priori information on the sources of interest. To extract the time-frequency signature of ictal sources, we use the Canonical Correlation Analysis (CCA) method. Then, we propose two time-frequency based semi-blind source separation approaches, namely the Time-Frequency-Generalized EigenValue Decomposition (TF-GEVD) and the Time-Frequency-Denoising Source Separation (TF-DSS), for the denoising of ictal signals based on these time-frequency signatures. The performance of the proposed methods is compared with that of CCA and Independent Component Analysis (ICA) approaches for the denoising of simulated ictal EEGs and of real ictal data. The results show the superiority of the proposed methods in comparison with CCA and ICA.
Gaussian Process Based Independent Analysis for Temporal Source Separation in fMRI.
Hald, Ditte Høvenhoff; Henao, Ricardo; Winther, Ole
2017-02-26
Functional Magnetic Resonance Imaging (fMRI) gives us a unique insight into the processes of the brain and opens up the possibility of analyzing the functional activation patterns of the underlying sources. Task-inferred supervised learning with restrictive assumptions in the regression set-up restricts the exploratory nature of the analysis. Fully unsupervised independent component analysis (ICA) algorithms, on the other hand, can struggle to detect clearly classifiable components on single-subject data. We attribute this shortcoming to inadequate modeling of the fMRI source signals, which fails to incorporate their temporal nature. fMRI source signals, biological stimuli and non-stimuli-related artifacts are all smooth over a time-scale compatible with the sampling time (TR). We therefore propose Gaussian process ICA (GPICA), which accommodates temporal dependency through Gaussian process source priors. On two fMRI data sets with different sampling frequencies, we show that the GPICA-inferred temporal components and associated spatial maps allow for a more definite interpretation than standard temporal ICA methods. The temporal structure of the sources is controlled by the covariance of the Gaussian process, specified by a kernel function with an interpretable and controllable temporal length-scale parameter. We propose a hierarchical model specification, considering both instantaneous and convolutive mixing, and we infer source spatial maps, temporal patterns and temporal length-scale parameters by Markov chain Monte Carlo. A companion implementation made as a plug-in for SPM can be downloaded from https://github.com/dittehald/GPICA.
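A minimal sketch of the kind of Gaussian process source prior this abstract describes, with the temporal length scale controlling smoothness; the kernel choice, time grid and length-scale values are illustrative assumptions, not the paper's settings.

```python
import numpy as np

def se_kernel(t, length_scale, var=1.0):
    """Squared-exponential (RBF) covariance over the time points t."""
    d = t[:, None] - t[None, :]
    return var * np.exp(-0.5 * (d / length_scale) ** 2)

rng = np.random.default_rng(1)
t = np.linspace(0.0, 100.0, 200)          # e.g. seconds; grid is illustrative

# Draw one source from each of two GP priors with different length scales.
jitter = 1e-6 * np.eye(t.size)            # numerical stabilizer for sampling
s_slow = rng.multivariate_normal(np.zeros(t.size), se_kernel(t, 10.0) + jitter)
s_fast = rng.multivariate_normal(np.zeros(t.size), se_kernel(t, 1.0) + jitter)

def roughness(s):
    """Mean squared increment: smaller means a smoother sample path."""
    return float(np.mean(np.diff(s) ** 2))

print(f"mean squared increment, length scale 10: {roughness(s_slow):.4f}")
print(f"mean squared increment, length scale  1: {roughness(s_fast):.4f}")
```

A larger length scale yields smoother sample paths, which is the mechanism the abstract uses to encode the temporal smoothness of fMRI sources.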
Source-separated urine opens golden opportunities for microbial electrochemical technologies
Ledezma, Pablo; Kuntke, Philipp; Buisman, Cees J.N.; Keller, Jürg; Freguia, Stefano
2015-01-01
The food security of a booming global population demands a continuous and sustainable supply of fertilisers. Their current once-through use [especially of the macronutrients nitrogen (N), phosphorus (P), and potassium (K)] requires a paradigm shift towards recovery and reuse. In the case of sourc
Separation of blended impulsive sources using an iterative estimation-subtraction algorithm
Doulgeris, P.; Mahdad, A.; Blacquière, G.
2010-01-01
Traditional data acquisition practice dictates the existence of sufficient time intervals between the firing of sequential impulsive sources in the field. However, much attention has been drawn recently to the possibility of shooting in an overlapping fashion. Numerous publications have addressed th
Time-domain beamforming and blind source separation speech input in the car environment
Bourgeois, Julien
2009-01-01
The development of computer and telecommunication technologies has led to a revolution in the way that people work and communicate with each other. One of the results is that large amounts of information will increasingly be held in a form that is natural for users, as speech in natural language. In the presented work, we investigate the speech signal capture problem, which includes the separation of multiple interfering speakers using microphone arrays. Adaptive beamforming is a classical approach which has been developed since the seventies. However, it requires a double-talk detector (DTD) that interrupts th
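A narrowband delay-and-sum beamformer is the simplest instance of the classical beamforming approach mentioned above; the array geometry, frequency and angles in this sketch are invented for illustration.

```python
import numpy as np

c, f = 343.0, 1000.0                      # speed of sound (m/s), frequency (Hz)
M, d = 8, 0.04                            # mics and spacing (m): hypothetical array

def steering(theta_deg):
    """Narrowband far-field steering vector for a uniform linear array."""
    theta = np.deg2rad(theta_deg)
    delays = np.arange(M) * d * np.sin(theta) / c
    return np.exp(-2j * np.pi * f * delays)

target, interferer = 0.0, 60.0
w = steering(target) / M                  # delay-and-sum weights, unit gain at target

gain_t = abs(np.conj(w) @ steering(target))
gain_i = abs(np.conj(w) @ steering(interferer))
print(f"gain toward target:     {gain_t:.2f}")
print(f"gain toward interferer: {gain_i:.2f}")
```

Aligning the per-microphone delays to the target direction sums the target coherently while partially canceling sources from other directions.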
Directory of Open Access Journals (Sweden)
Kellermann Walter
2007-01-01
Full Text Available We address the problem of underdetermined blind source separation (BSS). While most previous approaches are designed for instantaneous mixtures, we propose a time-frequency-domain algorithm for convolutive mixtures. We adopt a two-step method based on a general maximum a posteriori (MAP) approach. In the first step, we estimate the mixing matrix based on hierarchical clustering, assuming that the source signals are sufficiently sparse. The algorithm works directly on the complex-valued data in the time-frequency domain and shows better convergence than algorithms based on self-organizing maps. The assumption of Laplacian priors for the source signals in the second step leads to an algorithm for estimating the source signals. It involves the ℓ1-norm minimization of complex numbers because of the use of the time-frequency-domain approach. We compare a combinatorial approach initially designed for real numbers with a second-order cone programming (SOCP) approach designed for complex numbers. We found that although the former approach is not theoretically justified for complex numbers, its results are comparable to, or even better than, the SOCP solution. The advantage is a lower computational cost for problems with low input/output dimensions.
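The first step (mixing-matrix estimation by hierarchical clustering of sparse observation vectors) can be illustrated with a small sketch. This assumes instantaneous real-valued mixtures, a simplification of the paper's complex-valued time-frequency setting.

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

rng = np.random.default_rng(2)
K, T = 3, 3000                            # 3 sources, 2 mixtures: underdetermined

# Sparse sources: each source is active at only a few points, so most
# high-energy observation vectors point along one mixing-matrix column.
S = rng.laplace(size=(K, T)) * (rng.random((K, T)) < 0.08)
A = np.array([[1.0, 0.6, -0.4],
              [0.3, 1.0, 0.9]])
A /= np.linalg.norm(A, axis=0)            # unit-norm columns
X = A @ S

# Keep high-energy points, normalize onto the unit half-circle, cluster.
idx = np.linalg.norm(X, axis=0) > 0.5
V = X[:, idx] / np.linalg.norm(X[:, idx], axis=0)
V = V * np.where(V[0] < 0, -1.0, 1.0)     # resolve the sign ambiguity
labels = fcluster(linkage(V.T, method="ward"), K, criterion="maxclust")

# Cluster centroids approximate the mixing-matrix columns.
A_hat = np.stack([V[:, labels == k].mean(axis=1) for k in range(1, K + 1)], axis=1)
A_hat /= np.linalg.norm(A_hat, axis=0)
print("estimated mixing columns (up to order and sign):")
print(A_hat.round(2))
```

Ward linkage is used here as a stand-in for whatever hierarchical criterion the paper adopts; the point is that sparse sources make the mixture directions cluster around the columns of A.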
Directory of Open Access Journals (Sweden)
Yen-Chun Chou
2010-01-01
Full Text Available Perfusion magnetic resonance brain imaging induces temporal signal changes on brain tissues, manifesting distinct blood-supply patterns for the profound analysis of cerebral hemodynamics. We employed independent factor analysis to blindly separate such dynamic images into different maps, that is, artery, gray matter, white matter, vein and sinus, and choroid plexus, in conjunction with corresponding signal-time curves. The averaged signal-time curve on the segmented arterial area was further used to calculate the relative cerebral blood volume (rCBV), relative cerebral blood flow (rCBF), and mean transit time (MTT). The averaged ratios for rCBV, rCBF, and MTT between gray and white matter for normal subjects were congruent with those in the literature.
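The perfusion summary measures mentioned above can be illustrated on a synthetic gamma-variate bolus curve. This is a simplified recipe based on the central volume principle (CBF = CBV/MTT), not necessarily the authors' exact computation; all parameter values are illustrative.

```python
import numpy as np

def auc(y, x):
    """Trapezoidal area under the curve."""
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

# Gamma-variate model of a first-pass bolus concentration-time curve,
# a standard idealization of perfusion signal-time data.
t = np.linspace(0.0, 60.0, 6001)          # seconds
t0, alpha, beta = 8.0, 3.0, 1.5           # arrival time, shape, scale (invented)
x = np.clip(t - t0, 0.0, None)
C = x ** alpha * np.exp(-x / beta)

rCBV = auc(C, t)                          # relative blood volume ~ area under curve
MTT = auc(t * C, t) / rCBV - t0           # mean transit time: first moment past arrival
rCBF = rCBV / MTT                         # central volume principle: CBF = CBV / MTT

print(f"rCBV (a.u.): {rCBV:.2f}  MTT (s): {MTT:.2f}  rCBF (a.u./s): {rCBF:.2f}")
```

For this gamma-variate curve the first moment past arrival equals (alpha + 1) * beta, which gives a quick sanity check on the numerics.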
This document may be of assistance in applying the New Source Review (NSR) air permitting regulations including the Prevention of Significant Deterioration (PSD) requirements. This document is part of the NSR Policy and Guidance Database. Some documents in the database are a scanned or retyped version of a paper photocopy of the original. Although we have taken considerable effort to quality assure the documents, some may contain typographical errors. Contact the office that issued the document if you need a copy of the original.
Nilsson, M L; Waldebäck, M; Liljegren, G; Kulin, H; Markides, K E
2001-08-01
A method is presented in which pressurized-fluid extraction (PFE) is used for the extraction of chlorinated paraffins (CP) from the biodegradable fraction of source-separated household waste. The conditions optimized for high recovery in the extraction procedure were extraction time, temperature, and the use of different solvents and different sample particle sizes. Recoveries of CP from fortified household waste material were over 90%, with only a few interferences when cyclohexane was used as solvent. Extraction yields from contaminated samples containing CP were further compared with recoveries obtained by use of Soxtec extraction. The results showed that PFE is a rapid, low-solvent-consumption technique giving high yields.
Zeeman, Grietje; Kujawa, Katarzyna; de Mes, Titia; Hernandez, Lucia; de Graaff, Marthe; Abu-Ghunmi, Lina; Mels, Adriaan; Meulman, Brendo; Temmink, Hardy; Buisman, Cees; van Lier, Jules; Lettinga, Gatze
2008-01-01
Based on the results of pilot-scale research with source-separated black water (BW) and grey water (GW), a new sanitation concept is proposed. BW and GW are both treated in a UASB (-septic tank) for recovery of CH4 gas. Kitchen waste is added to the anaerobic BW treatment, doubling the biogas production. Post-treatment of the effluent provides recovery of phosphorus and removal of remaining COD and nitrogen. The total energy saving of the new sanitation concept amounts to 200 MJ/year compared with conventional sanitation; moreover, 0.14 kg P/p/year and 90 litres of potentially reusable water are produced.
Energy Technology Data Exchange (ETDEWEB)
Bruegger, M.; Hildebrand, N.; Karlewski, T.; Trautmann, N. (Mainz Univ. (Germany, F.R.). Inst. fuer Kernchemie); Mazumdar, A.K. (Marburg Univ. (Germany, F.R.). Fachbereich Physik); Herrmann, G. (Mainz Univ. (Germany, F.R.). Inst. fuer Kernchemie; Gesellschaft fuer Schwerionenforschung m.b.H., Darmstadt (Germany, F.R.))
1985-02-01
The performance of a high-temperature ion source coupled to a helium gas-jet transport system for efficient mass separation of neutron-rich alkaline earth and lanthanide isotopes is reported, and the results of overall efficiency measurements using different cluster materials in the gas-jet are given. A fast, microprocessor-controlled tape transport system for γ-spectroscopic studies on short-lived isotopes is described. Some results on the decay of 3.8-s ¹⁵²Pr are presented.
Hydrocyclonic separation of invasive New Zealand mudsnails from an aquaculture water source
Nielson, R. Jordan; Moffitt, Christine M.; Watten, Barnaby J.
2012-01-01
Invasive New Zealand mudsnails (Potamopyrgus antipodarum, NZMS) have infested freshwater aquaculture facilities in the western United States and disrupted stocking or fish transportation activities because of the risk of transporting NZMS to naive locations. We tested the efficacy of a gravity-fed, hydrocyclonic separation system to remove NZMS from an aquaculture water source at two design flows: 367 L/min and 257 L/min. The hydrocyclone effectively filtered all sizes of snails (including newly emerged neonates) from inflows. We modeled cumulative recovery of three sizes of snails and determined that juvenile- and adult-sized snails were transported similarly through the filtration system, but the transit of neonates was faster and similar to the transport of water particles. We found that transit times through the filtration system differed between the two flows regardless of snail size, and the hydrocyclone filter operated more as a plug-flow system with dispersion, especially when transporting and removing the larger adult- and juvenile-sized snails. Our study supports hydrocyclonic filtration as an important tool to provide snail-free water for aquaculture operations that require uninfested water sources.
Comparing Ion Exchange Adsorbents for Nitrogen Recovery from Source-Separated Urine.
Tarpeh, William A; Udert, Kai M; Nelson, Kara L
2017-02-21
Separate collection of urine, which is only 1% of wastewater volume but contains the majority of nitrogen humans excrete, can potentially reduce the costs and energy input of wastewater treatment and facilitate recovery of nitrogen for beneficial use. Ion exchange was investigated for recovery of nitrogen as ammonium from urine for use as a fertilizer or disinfectant. Cation adsorption curves for four adsorbents (clinoptilolite, biochar, Dowex 50, and Dowex Mac 3) were compared in pure salt solutions, synthetic urine, and real stored urine. Competition from sodium and potassium present in synthetic and real urine did not significantly decrease ammonium adsorption for any of the adsorbents. Dowex 50 and Dowex Mac 3 showed nearly 100% regeneration efficiencies. Estimated ion exchange reactor volumes to capture the nitrogen for 1 week from a four-person household were lowest for Dowex Mac 3 (5 L) and highest for biochar (19 L). Although Dowex Mac 3 had the highest adsorption capacity, material costs ($/g N removed) were lower for clinoptilolite and biochar because of their substantially lower unit cost.
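The reactor-volume estimate in this abstract is simple arithmetic: weekly nitrogen load divided by volumetric adsorption capacity. The excretion rate and the capacities below are illustrative assumptions, not values taken from the paper.

```python
# Back-of-the-envelope reactor sizing: volume needed to capture one week of
# nitrogen from a four-person household. All numbers are assumptions.
N_RATE_G_PER_PERSON_DAY = 10.0            # ~10 g N excreted per person per day
PEOPLE, DAYS = 4, 7

capacities_g_n_per_l = {                  # hypothetical volumetric capacities
    "Dowex Mac 3": 56.0,
    "clinoptilolite": 30.0,
    "biochar": 15.0,
}

n_load = N_RATE_G_PER_PERSON_DAY * PEOPLE * DAYS   # g N per week
for name, q in capacities_g_n_per_l.items():
    print(f"{name:>14}: {n_load / q:5.1f} L")
```

The ordering matches the abstract's qualitative finding: the highest-capacity adsorbent needs the smallest reactor, even if its material cost per gram of N removed is not the lowest.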
Bayesian astrostatistics: a backward look to the future
Loredo, Thomas J
2012-01-01
This perspective chapter briefly surveys: (1) past growth in the use of Bayesian methods in astrophysics; (2) current misconceptions about both frequentist and Bayesian statistical inference that hinder wider adoption of Bayesian methods by astronomers; and (3) multilevel (hierarchical) Bayesian modeling as a major future direction for research in Bayesian astrostatistics, exemplified in part by presentations at the first ISI invited session on astrostatistics, commemorated in this volume. It closes with an intentionally provocative recommendation for astronomical survey data reporting, motivated by the multilevel Bayesian perspective on modeling cosmic populations: that astronomers cease producing catalogs of estimated fluxes and other source properties from surveys. Instead, summaries of likelihood functions (or marginal likelihood functions) for source properties should be reported (not posterior probability density functions), including nontrivial summaries (not simply upper limits) for candidate objects ...
Korucu, M Kemal; Kaplan, Özgür; Büyük, Osman; Güllü, M Kemal
2016-10-01
In this study, we investigate the usability of sound recognition for source separation of packaging wastes in reverse vending machines (RVMs). For this purpose, an experimental setup equipped with a sound recording mechanism was prepared. Packaging waste sounds generated by three physical impacts such as free falling, pneumatic hitting and hydraulic crushing were separately recorded using two different microphones. To classify the waste types and sizes based on sound features of the wastes, a support vector machine (SVM) and a hidden Markov model (HMM) based sound classification systems were developed. In the basic experimental setup in which only free falling impact type was considered, SVM and HMM systems provided 100% classification accuracy for both microphones. In the expanded experimental setup which includes all three impact types, material type classification accuracies were 96.5% for dynamic microphone and 97.7% for condenser microphone. When both the material type and the size of the wastes were classified, the accuracy was 88.6% for the microphones. The modeling studies indicated that hydraulic crushing impact type recordings were very noisy for an effective sound recognition application. In the detailed analysis of the recognition errors, it was observed that most of the errors occurred in the hitting impact type. According to the experimental results, it can be said that the proposed novel approach for the separation of packaging wastes could provide a high classification performance for RVMs.
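The SVM branch of such a sound classifier can be sketched with synthetic stand-in features; a real system would extract MFCC-like features from the recorded impact sounds, and the class centers below are invented.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(3)

# Stand-in acoustic features (e.g. band energies of the impact sound).
def make_class(center, n=60):
    return center + 0.6 * rng.standard_normal((n, 4))

centers = {"glass": np.array([5.0, 1.0, 0.0, 2.0]),
           "metal": np.array([1.0, 4.0, 3.0, 0.0]),
           "plastic": np.array([0.0, 1.0, 1.0, 4.0])}
X = np.vstack([make_class(c) for c in centers.values()])
y = np.repeat(list(centers), 60)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0))
clf.fit(X_tr, y_tr)
acc = clf.score(X_te, y_te)
print(f"material classification accuracy: {acc:.2f}")
```

With well-separated feature clusters the RBF SVM reaches high accuracy, mirroring the near-perfect results the study reports for the clean free-falling recordings.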
Hirayama, Y.; Watanabe, Y. X.; Imai, N.; Ishiyama, H.; Jeong, S. C.; Jung, H. S.; Miyatake, H.; Oyaizu, M.; Kimura, S.; Mukai, M.; Kim, Y. H.; Sonoda, T.; Wada, M.; Huyse, M.; Kudryavtsev, Yu.; Van Duppen, P.
2016-06-01
The KEK Isotope Separation System (KISS) has been developed at RIKEN to produce neutron-rich isotopes with N = 126 in order to study their β-decay properties for application to astrophysics. KISS is an element-selective mass-separation system which consists of an argon-gas-cell-based laser ion source for atomic number selection and an ISOL mass-separation system. The argon gas cell of KISS is a key component to stop and collect the unstable nuclei produced in a multi-nucleon transfer reaction, where the isotopes of interest are selectively ionized using laser resonance ionization. We have performed off- and on-line experiments to study the basic properties of the gas cell as well as of the KISS. We successfully extracted laser-ionized stable 56Fe (direct implantation of a 56Fe beam into the gas cell) atoms and 198Pt (emitted from the 198Pt target by elastic scattering with a 136Xe beam) atoms from the KISS during the commissioning on-line experiments. We furthermore extracted laser-ionized unstable 199Pt atoms and confirmed that the measured half-life was in good agreement with the reported value.
Kilohertz Quasi-Periodic Oscillation Peak Separation is not Constant in the Atoll Source 4U 1608-52
Méndez, M; Wijnands, R; Ford, E C; Van Paradijs, J; Vaughan, B A; Méndez, Mariano; Van der Klis, Michiel; Wijnands, Rudy; Ford, Eric C.; Van Paradijs, Jan; Vaughan, Brian A.
1998-01-01
We present new Rossi X-ray Timing Explorer observations of the low-mass X-ray binary 4U 1608-52 during the decay of its 1998 outburst. We detect by a direct FFT method the existence of a second kilohertz quasi-periodic oscillation (kHz QPO) in its power density spectrum, previously only seen by means of the sensitivity-enhancing `shift and add' technique. This result confirms that 4U 1608-52 is a twin kHz QPO source. The frequency separation between these two QPO decreased significantly, from 325.5 +/- 3.4 Hz to 225.3 +/- 12.0 Hz, as the frequency of the lower kHz QPO increased from 470 Hz to 865 Hz, in contradiction with a simple beat-frequency interpretation. This change in the peak separation of the kHz QPOs is closely similar to that previously seen in Sco X-1, but takes place at a ten times lower average luminosity. We discuss this result within the framework of models that have been proposed for kHz QPO. Beat frequency models where the peak separation is identified with the neutron star spin rate, as we...
A method for extracting fetal ECG based on EMD-NMF single channel blind source separation algorithm.
He, Pengju; Chen, Xiaomeng
2015-01-01
Nowadays, detecting the fetal ECG from abdominal signals is a commonly used method, but the fetal ECG is contaminated by the maternal ECG. Current FECG extraction algorithms mainly target multichannel signals; they often assume there is only one fetus and do not consider multiple births. This paper proposes a single-channel blind source separation algorithm to process a single abdominally acquired signal. The algorithm decomposes the single abdominal signal into multiple intrinsic mode functions (IMFs) using empirical mode decomposition (EMD). The correlation matrix of the IMFs is calculated and the number of independent ECG signals is estimated using an eigenvalue method. A nonnegative matrix is constructed according to the determined number and the decomposed IMFs, and separation of the MECG and FECG is achieved using nonnegative matrix factorization (NMF). Experiments used four channels of synthetic signals and two channels of ECG to verify the correctness and feasibility of the proposed algorithm. The results showed that the proposed algorithm can determine the number of independent signals in a single acquired signal and that the FECG can be extracted from a single-channel observed signal, so the algorithm can be used to separate the MECG and FECG.
Rousta, Kamran; Bolton, Kim; Lundin, Magnus; Dahlén, Lisa
2015-06-01
The present study measures the participation of households in a source separation scheme and, in particular, whether the households' application of the scheme improved after two interventions: (a) a shorter distance to the drop-off point and (b) easy access to correct sorting information. The effect of these interventions was quantified and, as far as possible, isolated from other factors that can influence recycling behaviour. The study was based on households located in an urban residential area in Sweden, where waste composition studies were performed before and after the interventions by manual sorting (pick analysis). Statistical analyses of the results indicated a significant decrease (28%) in packaging and newsprint in the residual waste after establishing a property-close collection system (intervention (a)), as well as a significant decrease (70%) in the mis-sorted fraction in bags intended for food waste after new information stickers were introduced (intervention (b)). Providing a property-close collection system to collect more waste fractions, as well as finding new communication channels for information about sorting, can be used as tools to increase the source separation ratio. This contribution also highlights the need to evaluate the effects of different types of information and communication concerning sorting instructions in a property-close collection system.
Energy Technology Data Exchange (ETDEWEB)
Schumann, D.; Stowasser, T. [Paul Scherrer Inst. (PSI), Villigen (Switzerland); Ayranov, M. [Commission of the European Communities, Luxembourg (Luxembourg). Directorate General Energy; Gialanella, L. [Seconda Univ. di Napoli, Caserta (Italy). Dipt. di Matematica e Fisica; Istituto Nazionale di Fisica Nucleare, Sezione di Napoli (Italy); Di Leva, A.; Romano, M.; Schuermann, D. [Istituto Nazionale di Fisica Nucleare, Sezione di Napoli (Italy); Universita di Napoli Frederico II (Italy). Dipt. di Fisica
2013-10-01
⁷Be is a key radionuclide for the investigation of several astrophysical processes and phenomena. In addition, it is used as a tracer in wear measurements. It is produced in considerable amounts in the cooling water (D₂O) of the Spallation Induced Neutron Source (SINQ) facility at PSI by spallation reactions on ¹⁶O with the generated fast neutrons. A shielded ion-exchange filter containing 100 mL of the mixed-bed ion exchanger LEWATIT was installed for three months as a bypass for the cooling water in the cooling loop of SINQ. The collected activity of ⁷Be was in the range of several hundred GBq. The ⁷Be was then separated and purified, remotely controlled in a hot cell, using an installed separation system. With the exception of ¹⁰Be, radioactive by-products can be neglected, so this cooling water could serve as an ideal source for highly active ⁷Be samples. The facility is capable of producing ⁷Be with activities up to 1 TBq per year. The ⁷Be sample preparation is described in detail and the possible uses are discussed. In particular, some preliminary results of ⁷Be ion-beam production are presented. (orig.)
Bayesian Games with Intentions
Directory of Open Access Journals (Sweden)
Adam Bjorndahl
2016-06-01
Full Text Available We show that standard Bayesian games cannot represent the full spectrum of belief-dependent preferences. However, by introducing a fundamental distinction between intended and actual strategies, we remove this limitation. We define Bayesian games with intentions, generalizing both Bayesian games and psychological games, and prove that Nash equilibria in psychological games correspond to a special class of equilibria as defined in our setting.
Multisnapshot Sparse Bayesian Learning for DOA
DEFF Research Database (Denmark)
Gerstoft, Peter; Mecklenbrauker, Christoph F.; Xenaki, Angeliki
2016-01-01
The directions of arrival (DOA) of plane waves are estimated from multisnapshot sensor array data using sparse Bayesian learning (SBL). The prior for the source amplitudes is assumed independent zero-mean complex Gaussian distributed with hyperparameters, the unknown variances (i.e., the source p...
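A compact sketch of multisnapshot SBL for DOA estimation using EM updates of the per-direction source-power hyperparameters. The array, grid, source placement and noise level are illustrative, and the update rule is the standard EM form for SBL rather than necessarily the iteration used in the paper.

```python
import numpy as np

rng = np.random.default_rng(5)
M, L = 12, 50                             # sensors, snapshots
grid = np.arange(-90, 91, 2)              # DOA search grid (degrees)
# Steering matrix for a half-wavelength-spaced uniform linear array.
A = np.exp(1j * np.pi * np.outer(np.arange(M), np.sin(np.deg2rad(grid))))

true_doas = (-20, 34)                     # both on the grid, for simplicity
true_idx = [int(np.argmin(np.abs(grid - d))) for d in true_doas]
X = np.zeros((grid.size, L), complex)
X[true_idx] = rng.standard_normal((2, L)) + 1j * rng.standard_normal((2, L))
sigma2 = 0.05                             # noise power (assumed known here)
noise = np.sqrt(sigma2 / 2) * (rng.standard_normal((M, L))
                               + 1j * rng.standard_normal((M, L)))
Y = A @ X + noise

# EM updates of the hyperparameters gamma (prior source variances).
gamma = np.ones(grid.size)
for _ in range(150):
    Sy = sigma2 * np.eye(M) + (A * gamma) @ A.conj().T
    K = np.linalg.solve(Sy, A).conj().T   # equals A^H Sy^{-1} (Sy is Hermitian)
    mu = (gamma[:, None] * K) @ Y         # posterior source means per snapshot
    post_var = gamma - gamma**2 * np.real(np.sum(K * A.T, axis=1))
    gamma = np.mean(np.abs(mu) ** 2, axis=1) + post_var

peaks = np.argsort(gamma)[-2:]
print("estimated DOAs (deg):", sorted(int(g) for g in grid[peaks]))
```

Sparsity emerges because the evidence maximization drives most gamma entries toward zero, leaving peaks only at directions supported by the data.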
A Fast Algorithm for Blind Sparse Source Separation
Institute of Scientific and Technical Information of China (English)
董天宝; 杨景曙
2012-01-01
In this paper, a fast algorithm for blind sparse source separation is proposed. The sources are estimated by minimizing the ℓ0-norm, which is approximated by a predefined continuous and differentiable function. The proposed algorithm is easy to implement and runs fast. The algorithm is compared with several fast sparse reconstruction algorithms, such as fast ℓ1-norm minimization and OMP, using synthetic data. Finally, we apply the proposed algorithm to underdetermined blind source separation using real-world data. Experiments show that the proposed algorithm runs substantially faster than the other algorithms while achieving almost the same (or better) separation quality.
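The smoothed-ℓ0 idea, approximating the ℓ0-norm with a continuous, differentiable Gaussian surrogate and projecting back onto the constraint set, can be sketched as follows. Parameter values follow common SL0 practice and are not taken from the paper.

```python
import numpy as np

def sl0(A, x, sigma_min=1e-3, decay=0.5, inner=3, mu=2.0):
    """Smoothed-l0 sparse recovery: the l0-norm is approximated via
    1 - exp(-s^2 / (2 sigma^2)), minimized for a decreasing sigma
    sequence while staying on the constraint set {s : A s = x}."""
    pinv = A.T @ np.linalg.inv(A @ A.T)   # right pseudo-inverse of the fat A
    s = pinv @ x                          # minimum-l2 feasible initialization
    sigma = 2.0 * np.max(np.abs(s))
    while sigma > sigma_min:
        for _ in range(inner):
            s = s - mu * s * np.exp(-s**2 / (2 * sigma**2))  # surrogate gradient step
            s = s - pinv @ (A @ s - x)    # project back onto A s = x
        sigma *= decay
    return s

rng = np.random.default_rng(6)
n, m, k = 20, 40, 4                       # mixtures, sources, active sources
A = rng.standard_normal((n, m)) / np.sqrt(n)
s_true = np.zeros(m)
support = rng.choice(m, k, replace=False)
s_true[support] = rng.uniform(1.0, 2.0, k) * rng.choice([-1.0, 1.0], k)

s_hat = sl0(A, A @ s_true)
print(f"max recovery error: {np.max(np.abs(s_hat - s_true)):.4f}")
```

A large sigma makes the surrogate smooth and easy to descend; gradually shrinking sigma tightens it toward the true ℓ0 objective, which is what makes the method both fast and accurate on sufficiently sparse problems.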
A New Algorithm for Underdetermined Blind Source Separation
Institute of Scientific and Technical Information of China (English)
董天宝; 杨景曙
2012-01-01
A new two-step algorithm for underdetermined source separation is proposed. The mixing matrix is estimated using a potential-function-based clustering method, and the sources are then estimated using a fast sparse reconstruction algorithm that defines a continuous and differentiable function to approximate the ℓ0-norm, making the ℓ0-minimization tractable. The new algorithm runs fast and is easily implemented. Experiments show that, compared with existing underdetermined source separation methods based on fast ℓ1-norm minimization and OMP, the proposed algorithm runs considerably faster while achieving almost the same separation quality.
Bayesian statistics an introduction
Lee, Peter M
2012-01-01
Bayesian Statistics is the school of thought that combines prior beliefs with the likelihood of a hypothesis to arrive at posterior beliefs. The first edition of Peter Lee’s book appeared in 1989, but the subject has moved ever onwards, with increasing emphasis on Monte Carlo based techniques. This new fourth edition looks at recent techniques such as variational methods, Bayesian importance sampling, approximate Bayesian computation and Reversible Jump Markov Chain Monte Carlo (RJMCMC), providing a concise account of the way in which the Bayesian approach to statistics develops as wel
Understanding Computational Bayesian Statistics
Bolstad, William M
2011-01-01
A hands-on introduction to computational statistics from a Bayesian point of view Providing a solid grounding in statistics while uniquely covering the topics from a Bayesian perspective, Understanding Computational Bayesian Statistics successfully guides readers through this new, cutting-edge approach. With its hands-on treatment of the topic, the book shows how samples can be drawn from the posterior distribution when the formula giving its shape is all that is known, and how Bayesian inferences can be based on these samples from the posterior. These ideas are illustrated on common statistic
DEFF Research Database (Denmark)
Naroznova, Irina; Møller, Jacob; Scheutz, Charlotte
2016-01-01
This study compared the environmental profiles of anaerobic digestion (AD) and incineration, in relation to global warming potential (GWP), for treating individual material fractions that may occur in source-separated organic household waste (SSOHW). Different framework conditions representative...
Tactile length contraction as Bayesian inference.
Tong, Jonathan; Ngo, Vy; Goldreich, Daniel
2016-08-01
To perceive, the brain must interpret stimulus-evoked neural activity. This is challenging: The stochastic nature of the neural response renders its interpretation inherently uncertain. Perception would be optimized if the brain used Bayesian inference to interpret inputs in light of expectations derived from experience. Bayesian inference would improve perception on average but cause illusions when stimuli violate expectation. Intriguingly, tactile, auditory, and visual perception are all prone to length contraction illusions, characterized by the dramatic underestimation of the distance between punctate stimuli delivered in rapid succession; the origin of these illusions has been mysterious. We previously proposed that length contraction illusions occur because the brain interprets punctate stimulus sequences using Bayesian inference with a low-velocity expectation. A novel prediction of our Bayesian observer model is that length contraction should intensify if stimuli are made more difficult to localize. Here we report a tactile psychophysical study that tested this prediction. Twenty humans compared two distances on the forearm: a fixed reference distance defined by two taps with 1-s temporal separation and an adjustable comparison distance defined by two taps with temporal separation t ≤ 1 s. We observed significant length contraction: As t was decreased, participants perceived the two distances as equal only when the comparison distance was made progressively greater than the reference distance. Furthermore, the use of weaker taps significantly enhanced participants' length contraction. These findings confirm the model's predictions, supporting the view that the spatiotemporal percept is a best estimate resulting from a Bayesian inference process.
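A Gaussian toy version of the Bayesian observer model conveys both predictions tested here: contraction grows as the temporal separation t shrinks, and intensifies when localization is noisier. The model form and all numbers are illustrative simplifications, not the authors' fitted model.

```python
def perceived_length(l_measured, t, sigma_x, sigma_v):
    """MAP length estimate combining a noisy length measurement
    (localization noise sigma_x) with a zero-mean low-speed prior
    (scale sigma_v). Implied speed is l/t, so the induced prior on
    length has standard deviation sigma_v * t."""
    w = (sigma_v * t) ** 2 / ((sigma_v * t) ** 2 + sigma_x ** 2)
    return w * l_measured

l_m = 10.0                                # cm between the two taps (illustrative)
sigma_v = 10.0                            # cm/s prior speed scale (assumption)

for sigma_x in (0.5, 2.0):                # easier vs harder localization
    fast = perceived_length(l_m, t=0.1, sigma_x=sigma_x, sigma_v=sigma_v)
    slow = perceived_length(l_m, t=1.0, sigma_x=sigma_x, sigma_v=sigma_v)
    print(f"sigma_x={sigma_x}: t=0.1 s -> {fast:.1f} cm, t=1.0 s -> {slow:.1f} cm")
```

The shrinkage weight w shows the mechanism directly: a short t makes the low-speed prior tight relative to the measurement, and a large sigma_x makes the measurement weak relative to the prior; either way the estimate contracts toward zero.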
Improved γ/hadron separation for the detection of faint γ-ray sources using boosted decision trees
Krause, Maria; Pueschel, Elisa; Maier, Gernot
2017-03-01
Imaging atmospheric Cherenkov telescopes record an enormous number of cosmic-ray background events. Suppressing these background events while retaining γ-rays is key to achieving good sensitivity to faint γ-ray sources. The differentiation between signal and background events can be accomplished using machine learning algorithms, which are already used in various fields of physics. Multivariate analyses combine several variables into a single variable that indicates the degree to which an event is γ-ray-like or cosmic-ray-like. In this paper we will focus on the use of "boosted decision trees" for γ/hadron separation. We apply the method to data from the Very Energetic Radiation Imaging Telescope Array System (VERITAS), and demonstrate an improved sensitivity compared to the VERITAS standard analysis.
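A hedged sketch of boosted-decision-tree γ/hadron separation on invented Hillas-style features; the real analysis uses VERITAS image parameters and the distributions below are synthetic.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)
n = 4000

# Toy shower-image parameters: gamma showers tend to be narrower and more
# concentrated than hadronic ones. All distributions are invented.
def sample(is_gamma, n):
    width = rng.normal(0.9 if is_gamma else 1.3, 0.25, n)
    length = rng.normal(1.0 if is_gamma else 1.5, 0.3, n)
    concentration = rng.normal(0.6 if is_gamma else 0.4, 0.1, n)
    return np.column_stack([width, length, concentration])

X = np.vstack([sample(True, n), sample(False, n)])
y = np.r_[np.ones(n), np.zeros(n)]        # 1 = gamma, 0 = hadron

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
bdt = GradientBoostingClassifier(n_estimators=200, max_depth=3)
bdt.fit(X_tr, y_tr)
acc = bdt.score(X_te, y_te)
print(f"gamma/hadron test accuracy: {acc:.2f}")
```

The boosted ensemble combines several weakly separating variables into one discriminant, which is exactly the multivariate-analysis role described in the abstract.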
DEFF Research Database (Denmark)
Müller, L.; Schultz, Anna Charlotte; Fonager, J.
2015-01-01
Norovirus outbreaks occur frequently in Denmark and it can be difficult to establish whether apparently independent outbreaks have the same origin. Here we report on six outbreaks linked to frozen raspberries, investigated separately over a period of 3 months. Norovirus from stools were sequence...... capsid P2 region. In one outbreak at a hospital canteen, frozen raspberries were associated with illness by cohort investigation (relative risk 6·1, 95% confidence interval 3·2–11). Bags of raspberries suspected to be the source were positive for genogroup I and II noroviruses, one typable virus...... was genotype GI.6 (capsid). These molecular investigations showed that the apparently independent outbreaks were the result of one contamination event of frozen raspberries. The contaminated raspberries originated from a single producer in Serbia and were originally not considered to belong to the same batch...
DEFF Research Database (Denmark)
Naroznova, Irina; Møller, Jacob; Scheutz, Charlotte
2016-01-01
This study is dedicated to characterising the chemical composition and biochemical methane potential (BMP) of individual material fractions in untreated Danish source-separated organic household waste (SSOHW). First, data on SSOHW in different countries, available in the literature, were evaluated...... and then, secondly, laboratory analyses for eight organic material fractions comprising Danish SSOHW were conducted. No data were found in the literature that fully covered the objectives of the present study. Based on laboratory analyses, all fractions were assigned according to their specific properties...... in Denmark (untreated) was calculated, and the BMP contribution of the individual material fractions was then evaluated. Material fractions of the two general waste types, defined as "food waste" and "fibre-rich waste," were found to be anaerobically degradable with considerable BMP. Material degradability...
Chung, Wen-Yan; Cruz, Febus Reidj G.; Szu, Harold; Pijanowska, Dorota G.; Dawgul, Marek; Torbicz, Wladyslaw; Grabiec, Piotr B.; Jarosewicz, Bohdan; Chiang, Jung-Lung; Cheng, Cheanyeh; Chang, Kuo-Chung; Truc, Le Thanh; Lin, Wei-Chiang
2010-04-01
This paper presents an electronic tongue system with blind source separation (BSS) and wireless sensor network (WSN) for remote multi-ion sensing applications. Electrochemical sensors, such as ion-sensitive field-effect transistor (ISFET) and extended-gate field-effect transistor (EGFET), only provide the combined concentrations of all ions in aqueous solutions. Mixed hydrogen and sodium ions in chemical solutions are observed by means of H+ ISFET and H+ EGFET sensor array. The BSS extracts the concentration of individual ions using independent component analysis (ICA). The parameters of ISFET and EGFET sensors serve as a priori knowledge that helps solve the BSS problem. Using wireless transceivers, the ISFET/EGFET modules are realized as wireless sensor nodes. The integration of WSN technology into our electronic tongue system with BSS capability makes distant multi-ion measurement viable for environment and water quality monitoring.
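The ICA step of such an electronic tongue can be sketched with two hypothetical ion-activity signals and an invented sensor selectivity matrix; a real system would use the ISFET/EGFET sensor parameters as the prior knowledge mentioned in the abstract.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(8)
t = np.linspace(0, 10, 2000)

# Hypothetical time-varying ion activities seen by the sensor array.
h_ion = 1.0 + 0.5 * np.sin(2 * np.pi * 0.3 * t)
na_ion = 1.0 + 0.5 * np.sign(np.sin(2 * np.pi * 0.11 * t))
S = np.c_[h_ion, na_ion]

# Each sensor responds to a weighted combination of the two ions
# (the selectivity coefficients are invented for this sketch).
A = np.array([[1.0, 0.3],
              [0.4, 1.0]])
X = S @ A.T + 0.01 * rng.standard_normal(S.shape)

ica = FastICA(n_components=2, random_state=0)
S_hat = ica.fit_transform(X)

# Match recovered components to the true ions by absolute correlation
# (ICA leaves order and sign undetermined).
corr = np.abs(np.corrcoef(S_hat.T, S.T)[:2, 2:])
print("max |correlation| per ion:", corr.max(axis=0).round(2))
```

ICA can unmix the sensor outputs because the two ion activities are statistically independent and non-Gaussian; the sensor parameters then let the recovered components be rescaled back to concentrations.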
Jaatinen, Sanna T; Palmroth, Marja R T; Rintala, Jukka A; Tuhkanen, Tuula A
2016-09-01
The behaviour of pharmaceuticals related to human immunodeficiency virus treatment was studied in the liquid phase of source-separated urine during six months of storage at 20°C. Six months is the recommended time for hygienisation before use of urine as fertilizer. Compounds were spiked into urine at concentrations calculated to occur in urine. Assays were performed with separate compounds and with the therapeutic groups of antivirals, antibiotics and anti-tuberculotics. In addition, urine was amended either with faeces or a urease inhibitor. Pharmaceutical concentrations were monitored in filtered samples by solid-phase extraction and liquid chromatography. The concentration reductions of the studied compounds, with or without amendments, ranged from less than 1% to more than 99% after six months of storage. Without amendments, reductions were 41.9-99% for anti-tuberculotics, <52% for antivirals (except 3TC, at 75.6%) and <50% for antibiotics. In assays with amendments, reductions were all <50%. Faeces amendment resulted in similar or lower reduction than without it, even though bacterial activity should have increased. The urease inhibitor prevented ureolysis and the pH rise but did not affect pharmaceutical removal. In conclusion, removal during storage may not be sufficient to reduce the risks associated with the studied pharmaceuticals, in which case other treatment practices or urine utilisation options should be considered.
Directory of Open Access Journals (Sweden)
Giovanni De Feo
2016-10-01
Full Text Available This study performed a life cycle assessment of the collection, transport, treatment and disposal of source-separated municipal waste (MW) in Baronissi, a town of 17,000 inhabitants in the Campania region of Italy. Baronissi is a high-performing town in a region with a scarcity of MW facilities. The environmental impacts were assessed with three different methods (IPCC 2007, Ecological Footprint and ReCiPe 2008) in order to evaluate how the choice of method influences the results, and in particular how global warming affects them, since it is one of the public's major environmental concerns. The results showed how fundamental the presence of facilities in the area is: their absence means high environmental loads due to the transportation of materials over long distances, particularly for the organic fraction. A composting plant 10 km from the municipality would decrease the impacts of external transport by 65%, regardless of the evaluation method. The results obtained with ReCiPe 2008 and the Ecological Footprint agreed, while those obtained with IPCC 2007 were very different, since global warming is strongly affected by the transport phase. IPCC 2007 does not take into account the advantages obtainable with a good level of separate collection. With a single impact evaluation method, there is a high risk of reaching misleading conclusions.
Yuan, Ying; MacKinnon, David P.
2009-01-01
In this article, we propose Bayesian analysis of mediation effects. Compared with conventional frequentist mediation analysis, the Bayesian approach has several advantages. First, it allows researchers to incorporate prior information into the mediation analysis, thus potentially improving the efficiency of estimates. Second, under the Bayesian…
Institute of Scientific and Technical Information of China (English)
苏宏升
2008-01-01
To give the conventional Bayesian optimal classifier the ability to handle fuzzy information and to automate its reasoning process, a new Bayesian optimal classifier with embedded fuzzy information is proposed. It not only handles fuzzy information effectively but also retains the learning properties of the Bayesian optimal classifier. In addition, following the evolution of fuzzy set theory, vague sets are embedded to produce a vague Bayesian optimal classifier, which simultaneously simulates the twofold characteristics of fuzzy information from the positive and reverse directions. Further, a set pair Bayesian optimal classifier is proposed to account for the threefold characteristics of fuzzy information from the positive, reverse, and indeterminate sides. Finally, a knowledge-based artificial neural network (KBANN) is presented to realize automatic reasoning of the Bayesian optimal classifier. It not only reduces the computational cost of the Bayesian optimal classifier but also improves its classification learning quality.
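As a baseline for the crisp (non-fuzzy) case that this work extends, the Bayes-optimal classification rule can be sketched in a few lines: assign each observation to the class maximizing prior times class-conditional likelihood. The Gaussian class models and equal priors below are illustrative assumptions, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(4)

# Two classes with known Gaussian class-conditional densities.
mu = {0: -1.0, 1: 1.0}
sigma = 1.0
prior = {0: 0.5, 1: 0.5}

def log_gauss(x, m, s):
    return -0.5 * ((x - m) / s) ** 2 - np.log(s * np.sqrt(2 * np.pi))

def classify(x):
    # Bayes-optimal rule: largest posterior = largest log prior + log likelihood.
    scores = {c: np.log(prior[c]) + log_gauss(x, mu[c], sigma) for c in (0, 1)}
    return max(scores, key=scores.get)

# On synthetic draws, the empirical error rate approaches the Bayes error,
# which for these two unit-variance Gaussians at +/-1 is Phi(-1) ~ 0.159.
xs = np.concatenate([rng.normal(mu[0], sigma, 5000),
                     rng.normal(mu[1], sigma, 5000)])
labels = np.array([0] * 5000 + [1] * 5000)
preds = np.array([classify(x) for x in xs])
err = (preds != labels).mean()
```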
von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo
2014-06-01
Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramer-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.
Konstruksi Bayesian Network Dengan Algoritma Bayesian Association Rule Mining Network
Octavian
2015-01-01
In recent years, the Bayesian network has become a popular concept used in many areas of life, such as making decisions and determining the probability that an event will occur. Unfortunately, constructing the structure of a Bayesian network is not a simple task. This study therefore introduces the Bayesian Association Rule Mining Network algorithm to make it easier to construct a Bayesian network from data ...
Institute of Scientific and Technical Information of China (English)
Xiang Wang; Zhitao Huang; Yiyu Zhou
2014-01-01
This paper deals with the blind separation of nonstationary sources and direction-of-arrival (DOA) estimation in the underdetermined case, when there are more sources than sensors. We assume the sources to be time-frequency (TF) disjoint to a certain extent. In particular, the number of sources present in any TF neighborhood is strictly less than the number of sensors. We can identify the real number of active sources and achieve separation in any TF neighborhood by the sparse representation method. Compared with the subspace-based algorithm under the same sparseness assumption, which suffers from an extra noise effect since it cannot estimate the true number of active sources, the proposed algorithm can estimate the number of active sources and their corresponding TF values in any TF neighborhood simultaneously. Another contribution of this paper is a new estimation procedure for the DOA of sources in the underdetermined case, which combines the TF sparseness of sources with a clustering technique. Simulation results demonstrate the validity and high performance of the proposed algorithm in both blind source separation (BSS) and DOA estimation.
Model Diagnostics for Bayesian Networks
Sinharay, Sandip
2006-01-01
Bayesian networks are frequently used in educational assessments, primarily for learning about students' knowledge and skills. There is a lack of work on assessing the fit of Bayesian networks. This article employs the posterior predictive model checking method, a popular Bayesian model checking tool, to assess the fit of simple Bayesian networks. A…
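Posterior predictive model checking, the tool employed in this article, can be illustrated on a model far simpler than a Bayesian network: draw parameters from the posterior, simulate replicated data, and compare a test statistic on the replicates against its observed value. The beta-binomial setup below is a generic textbook stand-in, not the article's assessment model.

```python
import numpy as np

rng = np.random.default_rng(3)

# Coin-flip model: Beta(1, 1) prior, binomial likelihood.
n, y = 50, 31                      # observed trials and successes

# Conjugate posterior is Beta(1 + y, 1 + n - y).
# Draw posterior parameters, then replicated datasets.
theta_draws = rng.beta(1 + y, 1 + n - y, size=4000)
y_rep = rng.binomial(n, theta_draws)

# Posterior predictive p-value for the test statistic T(y) = y itself.
# Values near 0 or 1 would indicate misfit; here the model fits by design.
ppp = (y_rep >= y).mean()
```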
Institute of Scientific and Technical Information of China (English)
孟宗; 蔡龙
2014-01-01
Traditional independent component analysis struggles with the problems of underdetermined blind source separation (BSS) and separation of statistically correlated sources that arise in mechanical fault diagnosis. Under the assumption that some sub-band components of the correlated machine vibration sources are statistically independent, a novel single-channel blind source separation method based on sub-band extraction by ensemble empirical mode decomposition (EEMD) is proposed. In this method, the single-channel observed signal is decomposed by EEMD into a series of sub-band observed signals, and the single-channel signal and the sub-band signals together form a new multi-dimensional signal. The number of source signals is estimated by singular value decomposition and the Bayesian information criterion. New observed signals are then reconstructed from sub-band observed signals selected for high independence according to the mutual information criterion and the estimated number of sources, increasing the dimension of the observations. The source signals are finally estimated from the reconstructed observations by whitening preprocessing and joint approximate diagonalization. Simulations and experiments verify the effectiveness of the method in mechanical fault diagnosis.
Zöllig, Hanspeter; Fritzsche, Cristina; Morgenroth, Eberhard; Udert, Kai M
2015-02-01
Electrolysis can be a viable technology for ammonia removal from source-separated urine. Compared to biological nitrogen removal, electrolysis is more robust and is highly amenable to automation, which makes it especially attractive for on-site reactors. In electrolytic wastewater treatment, ammonia is usually removed by indirect oxidation through active chlorine which is produced in-situ at elevated anode potentials. However, the evolution of chlorine can lead to the formation of chlorate, perchlorate, chlorinated organic by-products and chloramines that are toxic. This study focuses on using direct ammonia oxidation on graphite at low anode potentials in order to overcome the formation of toxic by-products. With the aid of cyclic voltammetry, we demonstrated that graphite is active for direct ammonia oxidation without concomitant chlorine formation if the anode potential is between 1.1 and 1.6 V vs. SHE (standard hydrogen electrode). A comparison of potentiostatic bulk electrolysis experiments in synthetic stored urine with and without chloride confirmed that ammonia was removed exclusively by continuous direct oxidation. Direct oxidation required high pH values (pH > 9) because free ammonia was the actual reactant. In real stored urine (pH = 9.0), an ammonia removal rate of 2.9 ± 0.3 g N·m⁻²·d⁻¹ was achieved and the specific energy demand was 42 Wh·g N⁻¹ at an anode potential of 1.31 V vs. SHE. The measurements of chlorate and perchlorate as well as selected chlorinated organic by-products confirmed that no chlorinated by-products were formed in real urine. Electrode corrosion through graphite exfoliation was prevented and the surface was not poisoned by intermediate oxidation products. We conclude that direct ammonia oxidation on graphite electrodes is a treatment option for source-separated urine with three major advantages: the formation of chlorinated by-products is prevented, less energy is consumed than in indirect ammonia oxidation and
Dong, Jun; Ni, Mingjiang; Chi, Yong; Zou, Daoan; Fu, Chao
2013-08-01
In China, the continuously increasing amount of municipal solid waste (MSW) has created an urgent need to change the current municipal solid waste management (MSWM) system based on mixed collection. A pilot program focusing on source-separated MSW collection was therefore launched in 2010 in Hangzhou, China, to lessen the associated environmental loads, with greenhouse gas (GHG) emissions (Kyoto Protocol) singled out in particular. This paper uses life cycle assessment modeling to evaluate the potential environmental improvement with regard to GHG emissions. The pre-existing MSWM system is assessed as the baseline, against which the source separation scenario is compared. Results show that source-separated collection reduces GHG emissions by 23% compared with the base scenario. In addition, composting and anaerobic digestion (AD) are suggested for further optimizing the management of food waste: landfilling, composting and AD of food waste emit 260.79, 82.21, and -86.21 thousand tonnes of GHG, respectively, demonstrating the emission reduction potential of advanced food waste treatment technologies. Accordingly, a modified MSWM system with AD as the food waste treatment option is proposed, saving an additional 44% of GHG emissions over the current source separation scenario. Moreover, a preliminary economic assessment demonstrates that both source separation scenarios have good cost reduction potential compared with mixed collection, with the proposed new system being the most cost-effective.
Bayesian Lensing Shear Measurement
Bernstein, Gary M
2013-01-01
We derive an estimator of weak gravitational lensing shear from background galaxy images that avoids noise-induced biases through a rigorous Bayesian treatment of the measurement. The Bayesian formalism requires a prior describing the (noiseless) distribution of the target galaxy population over some parameter space; this prior can be constructed from low-noise images of a subsample of the target population, attainable from long integrations of a fraction of the survey field. We find two ways to combine this exact treatment of noise with rigorous treatment of the effects of the instrumental point-spread function and sampling. The Bayesian model fitting (BMF) method assigns a likelihood of the pixel data to galaxy models (e.g. Sersic ellipses), and requires the unlensed distribution of galaxies over the model parameters as a prior. The Bayesian Fourier domain (BFD) method compresses galaxies to a small set of weighted moments calculated after PSF correction in Fourier space. It requires the unlensed distributi...
Fox, G.J.A.; Berg, van den S.M.; Veldkamp, B.P.; Irwing, P.; Booth, T.; Hughes, D.
2015-01-01
In educational and psychological studies, psychometric methods are involved in the measurement of constructs, and in constructing and validating measurement instruments. Assessment results are typically used to measure student proficiency levels and test characteristics. Recently, Bayesian item resp
Noncausal Bayesian Vector Autoregression
DEFF Research Database (Denmark)
Lanne, Markku; Luoto, Jani
We propose a Bayesian inferential procedure for the noncausal vector autoregressive (VAR) model that is capable of capturing nonlinearities and incorporating effects of missing variables. In particular, we devise a fast and reliable posterior simulator that yields the predictive distribution...
Beler-Baykal, B; Allar, A D; Bayram, S
2011-01-01
The use of source-separated human urine as fertilizer is one of the major suggestions of the new sanitation concept ECOSAN. Urine is rich in nitrogen, phosphorus and potassium, which act as plant nutrients; however, its salinity is too high for agricultural and landscape purposes. Moreover, its characteristics change significantly during storage, when salinity increases further as the predominant form of nitrogen shifts from urea to ammonium. Transferring the nitrogen in human urine onto the natural zeolite clinoptilolite, and using the ammonium subsequently recovered from the exhausted clinoptilolite for agricultural/landscape purposes, is suggested in this work as an indirect route of using urine. Results reporting the outcome of the proposed process, together with characterization of fresh and stored urine and preliminary work on the application of the product to the landscape plant Ficus elastica, are presented. Up to 97% of the ammonium in stored urine could be transferred onto clinoptilolite through ion exchange and about 88% could be recovered subsequently from the exhausted clinoptilolite, giving an overall recovery of 86%. Another important merit of the suggested process was the successful elimination of salinity. Preliminary experiments with Ficus elastica showed that the product, i.e. clinoptilolite exhausted with ammonium, was compatible with the synthetic fertilizer tested.
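The reported stage efficiencies compose multiplicatively. Multiplying the rounded figures (97% uptake, 88% desorption) gives about 85%, consistent with the reported overall recovery of 86% once rounding of the underlying measurements is accounted for:

```python
# Overall recovery = uptake efficiency x subsequent desorption efficiency.
uptake = 0.97      # ammonium transferred onto clinoptilolite
desorption = 0.88  # ammonium recovered from the exhausted clinoptilolite
overall = uptake * desorption
print(f"{overall:.0%}")  # → 85%
```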
Isomer separation of $^{70g}Cu$ and $^{70m}Cu$ with a resonance ionization laser ion source
Köster, U; Mishin, V I; Weissman, L; Huyse, M; Kruglov, K; Müller, W F; Van Duppen, P; Van Roosbroeck, J; Thirolf, P G; Thomas, H C; Weisshaar, D W; Schulze, W; Borcea, R; La Commara, M; Schatz, H; Schmidt, K; Röttger, S; Huber, G; Sebastian, V; Kratz, K L; Catherall, R; Georg, U; Lettry, Jacques; Oinonen, M; Ravn, H L; Simon, H
2000-01-01
Radioactive copper isotopes were ionized with the resonance ionization laser ion source at the on-line isotope separator ISOLDE (CERN). Using the different hyperfine structure in the 3d¹⁰4s ²S₁/₂ → 3d¹⁰4p ²P°₁/₂ transition, the low- and high-spin isomers of ⁷⁰Cu were selectively enhanced by tuning the laser wavelength. The light was provided by a narrow-bandwidth dye laser pumped by copper vapor lasers and frequency-doubled in a BBO crystal. The ground-state to isomeric-state intensity ratio could be varied by a factor of 30, allowing gamma transitions to be assigned unambiguously to the decay of the individual isomers. It is shown that the method can also be used to determine magnetic moments. In a first experiment, a magnetic moment of (+)1.8(3) μN was deduced for the 1⁺ ground state of ⁷⁰Cu and of (±)1.2(3) μN for the high-spin isomer.
Greening, Gage J.; Rajaram, Narasimhan; Muldoon, Timothy J.
2016-03-01
In non-keratinized epithelia, dysplasia typically arises near the basement membrane and proliferates into the upper epithelial layers over time. We present a non-invasive, multimodal technique combining high-resolution fluorescence imaging and broadband sub-diffuse reflectance spectroscopy (sDRS) to monitor health at various tissue layers. This manuscript focuses on characterization of the sDRS modality, which contains two source-detector separations (SDSs) of 374 μm and 730 μm, so that it can be used to extract in vivo optical parameters from human oral mucosa at two tissue thicknesses. First, we present empirical lookup tables (LUTs) describing the relationship between the reduced scattering coefficient (μs'), the absorption coefficient (μa) and absolute reflectance. The LUTs were shown to extract μs' and μa with accuracies of approximately 4% and 8%, respectively. We then present LUTs describing the relationship between μs', μa and sampling depth. Sampling depths range from 210 to 480 μm and from 260 to 620 μm for the 374 μm and 730 μm SDSs, respectively. We then demonstrate the ability to extract in vivo μs', μa, hemoglobin concentration, bulk tissue oxygen saturation, scattering exponent, and sampling depth from the inner lip of thirteen healthy volunteers to elucidate the differences in the extracted optical parameters from each SDS (374 and 730 μm) within non-keratinized squamous epithelia.
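The lookup-table idea, tabulating reflectance over a grid of (μs', μa) and inverting a measurement by nearest match, can be sketched with a toy reflectance model. The model form and grid ranges below are illustrative placeholders; the paper's LUTs are empirical and its inversion fits both parameters jointly.

```python
import numpy as np

# Toy diffuse-reflectance model (NOT the paper's empirically derived one):
# reflectance rises with scattering and falls with absorption.
def reflectance(mu_s, mu_a):
    return mu_s / (mu_s + 8.0 * mu_a)

# Build the lookup table on a grid of optical properties.
mu_s_grid = np.linspace(5.0, 30.0, 60)    # reduced scattering, 1/cm
mu_a_grid = np.linspace(0.1, 3.0, 60)     # absorption, 1/cm
MS, MA = np.meshgrid(mu_s_grid, mu_a_grid, indexing="ij")
LUT = reflectance(MS, MA)

def invert(measured, mu_s_known):
    """Given a measured reflectance and a known mu_s', look up mu_a by
    nearest match along the corresponding LUT row."""
    i = np.argmin(np.abs(mu_s_grid - mu_s_known))
    j = np.argmin(np.abs(LUT[i] - measured))
    return mu_a_grid[j]

# Round-trip check: simulate a measurement, then recover mu_a from the LUT.
mu_a_est = invert(reflectance(20.0, 1.2), 20.0)
```

Accuracy is limited by the grid spacing; in practice interpolation between grid nodes tightens the recovered values.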
Directory of Open Access Journals (Sweden)
Lin Cheng
2016-01-01
Full Text Available In this work, using AVT data, a health monitoring method for concrete dams based on two different blind source separation (BSS methods, that is, second-order blind identification (SOBI and independent component analysis (ICA, is proposed. A modal identification procedure, which integrates the SOBI algorithm and modal contribution, is first adopted to extract structural modal features using AVT data. The method to calculate the modal contribution index for SOBI-based modal identification methods is studied, and the calculated modal contribution index is used to determine the system order. The selected modes are then used to calculate modal features and are analysed using ICA to extract some independent components (ICs. The square prediction error (SPE index and its control limits are then calculated to construct a control chart for the structural dynamic features. For new AVT data of a dam in an unknown health state, the newly calculated SPE is compared with the control limits to judge whether the dam is normal. With the simulated AVT data of the numerical model for a concrete gravity dam and the measured AVT data of a practical engineering project, the performance of the dam health monitoring method proposed in this paper is validated.
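The SPE control chart used above for anomaly detection can be sketched independently of the modal-identification details: learn a retained subspace from healthy-state features, score new samples by the squared residual (SPE), and flag samples exceeding an empirical control limit. The synthetic data model, retained dimension, and quantile-based limit below are all assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical feature vectors from a structure in a healthy state:
# n samples of p correlated features (stand-ins for identified modal data).
n, p = 200, 6
latent = rng.standard_normal((n, 2))
mix = rng.standard_normal((2, p))
X = latent @ mix + 0.05 * rng.standard_normal((n, p))

mean = X.mean(axis=0)
Xc = X - mean

# Retain the dominant components (here: 2) as the "normal" subspace.
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
P = Vt[:2].T                      # loadings of retained components

def spe(x):
    """Squared prediction error: residual after projecting onto the
    retained subspace."""
    r = (x - mean) - P @ (P.T @ (x - mean))
    return float(r @ r)

# Empirical control limit: 99th percentile of SPE on healthy data.
limit = np.quantile([spe(x) for x in X], 0.99)

# Score a new healthy sample and a "damaged" one (shifted feature).
healthy_new = latent[0] @ mix + 0.05 * rng.standard_normal(p)
damaged_new = healthy_new + np.array([0, 0, 1.0, 0, 0, 0])

in_control = spe(healthy_new) <= limit
out_of_control = spe(damaged_new) > limit
```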
Bayesian long branch attraction bias and corrections.
Susko, Edward
2015-03-01
Previous work on the star-tree paradox has shown that Bayesian methods suffer from a long branch attraction bias. That work is extended to settings involving more taxa and partially resolved trees. The long branch attraction bias is confirmed to arise more broadly and an additional source of bias is found. A by-product of the analysis is methods that correct for biases toward particular topologies. The corrections can be easily calculated using existing Bayesian software. Posterior support for a set of two or more trees can thus be supplemented with corrected versions to cross-check or replace results. Simulations show the corrections to be highly effective.
Institute of Scientific and Technical Information of China (English)
方勇; 张烨
2008-01-01
In underdetermined blind source separation, more sources are to be estimated from fewer observed mixtures without knowing the source signals or the mixing matrix. This paper presents a robust clustering algorithm for underdetermined blind separation of sparse sources with an unknown number of sources in the presence of noise. It uses the robust competitive agglomeration (RCA) algorithm to estimate the source number and the mixing matrix, and the source signals are then recovered by interior-point linear programming. Simulation results show good performance of the proposed algorithm for underdetermined blind source separation (UBSS).
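The recovery stage described above, estimating sparse sources by linear programming once the mixing matrix is known, can be sketched as basis pursuit. The 2-sensor/3-source mixing matrix and source vector below are made-up examples, and SciPy's `linprog` stands in for the paper's interior-point solver.

```python
import numpy as np
from scipy.optimize import linprog

# Underdetermined mixing: 2 sensors, 3 sources (more sources than sensors).
# The mixing matrix is assumed known here (in the paper it is estimated by
# robust competitive agglomeration); only the recovery step is shown.
A = np.array([[0.9, 0.5, 0.1],
              [0.2, 0.6, 0.95]])
m, k = A.shape

s_true = np.array([0.0, 1.3, 0.0])   # sparse source vector at one instant
x = A @ s_true                       # observed mixture

# Basis pursuit: min ||s||_1  s.t.  A s = x, via the standard LP split
# s = s_plus - s_minus with s_plus, s_minus >= 0.
c = np.ones(2 * k)
A_eq = np.hstack([A, -A])
res = linprog(c, A_eq=A_eq, b_eq=x, bounds=[(0, None)] * (2 * k))
s_hat = res.x[:k] - res.x[k:]
```

Exact recovery of the sparse vector relies on the sources being sparse enough relative to the mixing matrix; it is not guaranteed for arbitrary mixtures.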
Multi-Fraction Bayesian Sediment Transport Model
Directory of Open Access Journals (Sweden)
Mark L. Schmelter
2015-09-01
Full Text Available A Bayesian approach to sediment transport modeling can provide a strong basis for evaluating and propagating model uncertainty, which can be useful in transport applications. Previous work in developing and applying Bayesian sediment transport models used a single grain size fraction or characterized the transport of mixed-size sediment with a single characteristic grain size. Although this approach is common in sediment transport modeling, it precludes the possibility of capturing processes that cause mixed-size sediments to sort and, thereby, alter the grain size available for transport and the transport rates themselves. This paper extends development of a Bayesian transport model from one to k fractional dimensions. The model uses an existing transport function as its deterministic core and is applied to the dataset used to originally develop the function. The Bayesian multi-fraction model is able to infer the posterior distributions for essential model parameters and replicates predictive distributions of both bulk and fractional transport. Further, the inferred posterior distributions are used to evaluate parametric and other sources of variability in relations representing mixed-size interactions in the original model. Successful development of the model demonstrates that Bayesian methods can be used to provide a robust and rigorous basis for quantifying uncertainty in mixed-size sediment transport. Such a method has heretofore been unavailable and allows for the propagation of uncertainty in sediment transport applications.
Computationally efficient Bayesian inference for inverse problems.
Energy Technology Data Exchange (ETDEWEB)
Marzouk, Youssef M.; Najm, Habib N.; Rahn, Larry A.
2007-10-01
Bayesian statistics provides a foundation for inference from noisy and incomplete data, a natural mechanism for regularization in the form of prior information, and a quantitative assessment of uncertainty in the inferred results. Inverse problems - representing indirect estimation of model parameters, inputs, or structural components - can be fruitfully cast in this framework. Complex and computationally intensive forward models arising in physical applications, however, can render a Bayesian approach prohibitive. This difficulty is compounded by high-dimensional model spaces, as when the unknown is a spatiotemporal field. We present new algorithmic developments for Bayesian inference in this context, showing strong connections with the forward propagation of uncertainty. In particular, we introduce a stochastic spectral formulation that dramatically accelerates the Bayesian solution of inverse problems via rapid evaluation of a surrogate posterior. We also explore dimensionality reduction for the inference of spatiotemporal fields, using truncated spectral representations of Gaussian process priors. These new approaches are demonstrated on scalar transport problems arising in contaminant source inversion and in the inference of inhomogeneous material or transport properties. We also present a Bayesian framework for parameter estimation in stochastic models, where intrinsic stochasticity may be intermingled with observational noise. Evaluation of a likelihood function may not be analytically tractable in these cases, and thus several alternative Markov chain Monte Carlo (MCMC) schemes, operating on the product space of the observations and the parameters, are introduced.
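The surrogate-accelerated inference described here can be miniaturized: replace an "expensive" forward model with a cheap polynomial fit, then run random-walk Metropolis against the surrogate posterior. The one-parameter forward model, prior, and noise level below are illustrative, and ordinary polynomial least squares stands in for the stochastic spectral expansion.

```python
import numpy as np

rng = np.random.default_rng(2)

# "Expensive" forward model (a stand-in): maps a source parameter theta
# to two observables.
def forward(theta):
    return np.array([np.exp(-theta), theta ** 2])

# Cheap surrogate: fit low-order polynomials to forward-model outputs at
# a handful of design points (a crude stand-in for a spectral surrogate).
design = np.linspace(-2, 2, 15)
outputs = np.array([forward(t) for t in design])
coeffs = [np.polyfit(design, outputs[:, j], deg=6) for j in range(2)]

def surrogate(theta):
    return np.array([np.polyval(c, theta) for c in coeffs])

# Synthetic data from theta_true with Gaussian noise.
theta_true, sigma = 0.8, 0.05
data = forward(theta_true) + sigma * rng.standard_normal(2)

def log_post(theta):
    if not -2 <= theta <= 2:          # uniform prior on [-2, 2]
        return -np.inf
    r = data - surrogate(theta)
    return -0.5 * np.sum(r ** 2) / sigma ** 2

# Random-walk Metropolis on the surrogate posterior: each step costs a
# polynomial evaluation instead of a full forward solve.
theta, lp = 0.0, log_post(0.0)
samples = []
for _ in range(20000):
    prop = theta + 0.2 * rng.standard_normal()
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    samples.append(theta)
post = np.array(samples[5000:])    # discard burn-in
```

The posterior mean should land near the true parameter, with any surrogate bias controlled by the polynomial's approximation error relative to the noise level.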
Directory of Open Access Journals (Sweden)
Vincent Zoes
2011-02-01
Full Text Available A greenhouse experiment was conducted to evaluate the use of growth substrates, made with duck-excreta-enriched wood shaving compost (DMC) and the organic fraction of source-separated municipal solid waste (MSW) compost, on the growth and yield of tomato (Lycopersicum esculentum Mill. cv. Campbell 1327). Substrate A consisted of a 3:2 (W/W) proportion of DMC and MSW composts. Substrates B and C were the same as A but contained 15% (W/W) of brick dust and shredded plastic, respectively. Three control substrates consisted of the commercially available peat-based substrate (Pr), an in-house sphagnum peat-based substrate (Gs), and black earth mixed with sandy loam soil (BE/S) in a 1:4 (W/W) ratio. Substrates (A, B, C) and controls received nitrogen (N), phosphate (P) and potassium (K) at equivalent rates of 780 mg/pot, 625 mg/pot, and 625 mg/pot, respectively, or were used without mineral fertilizers. Compared to the controls (Pr, Gs and BE/S), tomato plants grown on A, B, and C produced a greater total number and dry mass of fruits, with no significant differences among them. On average, total plant dry-matter biomass in substrates A, B, and C was 19% lower than that produced on Pr, but 28% greater than the biomass obtained for plants grown on Gs and BE/S. Plant height, stem diameter and chlorophyll concentrations indicate that substrates A, B, and C were particularly suitable for plant growth. Although the presence of excess N in the composted substrates favoured vegetative rather than reproductive growth, the continuous supply of nutrients throughout the growing cycle, as well as the high water retention capacity that allowed watering to be reduced by 50%, suggest that substrates A, B, and C were suitable growing mixes, offering environmental and agronomic advantages.
Chen, Liping; Yang, Xiaoxiao; Tian, Xiujun; Yao, Song; Li, Jiuyi; Wang, Aimin; Yao, Qian; Peng, Dangcong
2017-12-01
The combination of partial nitritation (PN) and anaerobic ammonium oxidation (anammox) has been proposed as an ideal process for nitrogen removal from source-separated urine, while the high organic matter content of urine causes instability in a single-stage PN-anammox process. This study aims to remove the organic matter and partially nitrify the nitrogen in urine, producing an ammonium/nitrite solution suitable for anammox. The organic matter in stored urine was used as the electron donor to achieve 40% total nitrogen removal in a nitritation-denitrification process in a sequencing batch reactor (SBR). Granular aggregates were observed and high mixed liquor suspended solids (9.5 g/L) were maintained in the SBR. Around 70-75% of the ammonium was oxidized to nitrite under volumetric loading rates of 3.23 kg chemical oxygen demand (COD)/(m³·d) and 1.86 kg N/(m³·d), respectively. The SBR produced an ammonium/nitrite solution free of biodegradable organic matter, with a NO₂⁻-N:NH₄⁺-N ratio of 1.24 ± 0.13. Fluorescence in situ hybridization images showed that Nitrosomonas-like ammonium-oxidizing bacteria, accounting for 7.2% of total bacteria, were located in the outer layer (25 μm), while heterotrophs were distributed homogeneously throughout the granular aggregates. High concentrations of free ammonia and nitrous acid in the reactor severely inhibited the growth of nitrite-oxidizing bacteria, resulting in their absence from the granular sludge. The microbial diversity analysis indicated that Proteobacteria was the predominant phylum, in which Pseudomonas was the most abundant genus.
Naroznova, Irina; Møller, Jacob; Scheutz, Charlotte
2016-04-01
This study is dedicated to characterising the chemical composition and biochemical methane potential (BMP) of individual material fractions in untreated Danish source-separated organic household waste (SSOHW). First, data on SSOHW in different countries, available in the literature, were evaluated and then, secondly, laboratory analyses for eight organic material fractions comprising Danish SSOHW were conducted. No data were found in the literature that fully covered the objectives of the present study. Based on laboratory analyses, all fractions were assigned according to their specific properties in relation to BMP, protein content, lipids, lignocellulose biofibres and easily degradable carbohydrates (carbohydrates other than lignocellulose biofibres). The three components in lignocellulose biofibres, i.e. lignin, cellulose and hemicellulose, were differentiated, and theoretical BMP (TBMP) and material degradability (BMP from laboratory incubation tests divided by TBMP) were expressed. Moreover, the degradability of lignocellulose biofibres (the share of volatile lignocellulose biofibre solids degraded in laboratory incubation tests) was calculated. Finally, BMP for average SSOHW composition in Denmark (untreated) was calculated, and the BMP contribution of the individual material fractions was then evaluated. Material fractions of the two general waste types, defined as "food waste" and "fibre-rich waste," were found to be anaerobically degradable with considerable BMP. Material degradability of material fractions such as vegetation waste, moulded fibres, animal straw, dirty paper and dirty cardboard, however, was constrained by lignin content. BMP for overall SSOHW (untreated) was 404 mL CH4 per g VS, which might increase if the relative content of material fractions, such as animal and vegetable food waste, kitchen tissue and dirty paper in the waste, becomes larger.
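The final step described above, aggregating fraction-level BMPs into a BMP for the overall waste stream, is a VS-weighted average. The fraction shares and BMP values below are invented for illustration; only the 404 mL CH4 per g VS overall figure comes from the study.

```python
# VS-weighted average BMP of a waste mixture from its fractions.
# Shares of volatile solids (VS) and per-fraction BMPs are illustrative,
# not the paper's measured data.
fractions = {
    "vegetable food waste": (0.35, 420),  # (share of VS, mL CH4 / g VS)
    "animal food waste":    (0.25, 480),
    "kitchen tissue":       (0.15, 350),
    "dirty paper":          (0.15, 300),
    "vegetation waste":     (0.10, 250),
}
total_bmp = sum(share * bmp for share, bmp in fractions.values())
```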
Nara, T.; Koiwa, K.; Takagi, S.; Oyama, D.; Uehara, G.
2014-05-01
This paper presents an algebraic reconstruction method for dipole-quadrupole sources using magnetoencephalography data. Compared to conventional methods based on the equivalent current dipole source model, our method can more accurately reconstruct two close, oppositely directed sources. Numerical simulations show that two sources on either side of the longitudinal fissure of the cerebrum are stably estimated. The method is verified using a quadrupolar source phantom composed of two isosceles-triangle coils with parallel bases.
Bayesian Face Sketch Synthesis.
Wang, Nannan; Gao, Xinbo; Sun, Leiyu; Li, Jie
2017-03-01
Exemplar-based face sketch synthesis has been widely applied to both digital entertainment and law enforcement. In this paper, we propose a Bayesian framework for face sketch synthesis, which provides a systematic interpretation for understanding the common properties and intrinsic differences among different methods from the perspective of probabilistic graphical models. The proposed Bayesian framework consists of two parts: the neighbor selection model and the weight computation model. Within the proposed framework, we further propose a Bayesian face sketch synthesis method. The essential rationale behind the proposed Bayesian method is that we take the spatial neighboring constraint between adjacent image patches into consideration for both aforementioned models, while the state-of-the-art methods neglect the constraint either in the neighbor selection model or in the weight computation model. Extensive experiments on the Chinese University of Hong Kong face sketch database demonstrate that the proposed Bayesian method could achieve superior performance compared with the state-of-the-art methods in terms of both subjective perceptions and objective evaluations.
Energy Technology Data Exchange (ETDEWEB)
Karim Ghani, Wan Azlina Wan Ab., E-mail: wanaz@eng.upm.edu.my [Department of Chemical and Environmental Engineering, Faculty of Engineering, University Putra Malaysia, 43400 Serdang, Selangor Darul Ehsan (Malaysia); Rusli, Iffah Farizan, E-mail: iffahrusli@yahoo.com [Department of Chemical and Environmental Engineering, Faculty of Engineering, University Putra Malaysia, 43400 Serdang, Selangor Darul Ehsan (Malaysia); Biak, Dayang Radiah Awang, E-mail: dayang@eng.upm.edu.my [Department of Chemical and Environmental Engineering, Faculty of Engineering, University Putra Malaysia, 43400 Serdang, Selangor Darul Ehsan (Malaysia); Idris, Azni, E-mail: azni@eng.upm.edu.my [Department of Chemical and Environmental Engineering, Faculty of Engineering, University Putra Malaysia, 43400 Serdang, Selangor Darul Ehsan (Malaysia)
2013-05-15
Highlights: ► A theory of planned behaviour (TPB) study was conducted to identify the factors influencing participation in source separation of food waste, using self-administered questionnaires. ► The findings suggest several implications for the development and implementation of a waste-separation-at-home programme. ► The analysis indicates that attitude towards waste separation is the main predictor, which in turn could be a significant predictor of the respondent's actual food waste separation behaviour. ► To date, no similar studies have been reported elsewhere, and these findings will be beneficial to local authorities as indicators for designing campaigns that promote waste separation programmes and reinforce positive attitudes. - Abstract: Tremendous increases in biodegradable (food) waste generation significantly impact the local authorities, who are responsible for managing, treating and disposing of this waste. Separation of food waste at its generation source is identified as an effective means of reducing the amount of food waste sent to landfill; the separated waste can instead be reused as feedstock for downstream treatment processes, namely composting or anaerobic digestion. However, these efforts will only succeed with positive attitudes and high participation rates among the public. Thus, a social survey (using questionnaires) analysing the public's views and the factors influencing participation in source separation of food waste in households, based on the theory of planned behaviour (TPB), was performed in June and July 2011 among selected staff in Universiti Putra Malaysia, Serdang, Selangor. The survey demonstrates that the public has positive intentions to participate provided that the opportunities, facilities and knowledge on waste separation at source are adequately prepared by the respective local authorities. Furthermore, good moral values and situational factors such as storage convenience and
Karim Ghani, Wan Azlina Wan Ab; Rusli, Iffah Farizan; Biak, Dayang Radiah Awang; Idris, Azni
2013-05-01
Tremendous increases in biodegradable (food) waste generation significantly impact the local authorities, who are responsible for managing, treating and disposing of this waste. Separation of food waste at its generation source is identified as an effective means of reducing the amount of food waste sent to landfill; the separated waste can instead be reused as feedstock for downstream treatment processes, namely composting or anaerobic digestion. However, these efforts will only succeed with positive attitudes and high participation rates among the public. Thus, a social survey (using questionnaires) analysing the public's views and the factors influencing participation in source separation of food waste in households, based on the theory of planned behaviour (TPB), was performed in June and July 2011 among selected staff in Universiti Putra Malaysia, Serdang, Selangor. The survey demonstrates that the public has positive intentions to participate provided that the opportunities, facilities and knowledge on waste separation at source are adequately prepared by the respective local authorities. Furthermore, good moral values and situational factors such as storage convenience and collection times also encourage the public's involvement and, consequently, the participation rate. The findings from this study may provide useful indicators to the waste management authorities in Malaysia in identifying mechanisms for the future development and implementation of food waste source separation activities in household programmes and communication campaigns that advocate the use of these programmes.
Ri, Yong-Wu; Im, Song-Jin
2014-01-01
The modified Beer-Lambert law (MBL) and spatially resolved spectroscopy are used to measure tissue oxygenation in muscles and brains by continuous-wave near-infrared spectroscopy. Spatially resolved spectroscopy predicts the change in the concentration of the absorber by measuring the slope of the attenuation data as a function of separation and calculating the absorption coefficients of the tissue on the basis of the slope in attenuation at separation distances satisfying the linearity of this slope. This study analysed the appropriate source-detector separation distance by using the diffusion approximation solution for photon migration when predicting the absorption coefficient by spatially resolved spectroscopy on the basis of the reflective image of the tissue. We constructed a three-dimensional attenuation image with the absorption coefficient, reduced scattering coefficient and separation distance as its axes and obtained the attenuation data cube by calculating the attenuation on a certain interva...
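The MBL relation this abstract builds on can be sketched at two wavelengths; the extinction matrix below is an illustrative placeholder, not tabulated coefficient values.

```python
import numpy as np

# Illustrative sketch of the modified Beer-Lambert law (MBL):
# dA(lambda) = eps(lambda) . dC . d . DPF, where d is the source-detector
# separation and DPF the differential pathlength factor.

eps = np.array([[1.30, 0.78],        # rows: wavelengths; cols: [HbO2, Hb]
                [0.90, 1.84]])       # made-up units of 1/(mM*cm)
d, dpf = 3.0, 6.0                    # separation (cm), pathlength factor

true_dc = np.array([0.010, -0.004])            # "true" concentration changes (mM)
dA = eps @ true_dc * d * dpf                   # simulated attenuation changes

est_dc = np.linalg.solve(eps * d * dpf, dA)    # invert the MBL at two wavelengths
print(np.allclose(est_dc, true_dc))            # → True
```

With more wavelengths than chromophores, the same inversion would be done by least squares rather than an exact solve.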
Bayesian least squares deconvolution
Asensio Ramos, A.; Petit, P.
2015-11-01
Aims: We develop a fully Bayesian least squares deconvolution (LSD) that can be applied to the reliable detection of magnetic signals in noise-limited stellar spectropolarimetric observations using multiline techniques. Methods: We consider LSD under the Bayesian framework and we introduce a flexible Gaussian process (GP) prior for the LSD profile. This prior allows the result to automatically adapt to the presence of signal. We exploit several linear algebra identities to accelerate the calculations. The final algorithm can deal with thousands of spectral lines in a few seconds. Results: We demonstrate the reliability of the method with synthetic experiments and we apply it to real spectropolarimetric observations of magnetic stars. We are able to recover the magnetic signals using a small number of spectral lines, together with the uncertainty at each velocity bin. This allows the user to consider if the detected signal is reliable. The code to compute the Bayesian LSD profile is freely available.
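The posterior-mean computation this abstract describes can be sketched as a regularized least squares problem; the sparse "line mask" operator and squared-exponential kernel below are assumptions for illustration, not the authors' released code.

```python
import numpy as np

# Toy Bayesian LSD: the model is V = M Z + noise, with a GP prior on the
# common profile Z; the posterior mean is
# Z_hat = (M^T M / s2 + K^{-1})^{-1} M^T V / s2.

rng = np.random.default_rng(0)
n_pix, n_vel = 200, 30
M = rng.random((n_pix, n_vel)) * (rng.random((n_pix, n_vel)) < 0.1)

v = np.linspace(-1.0, 1.0, n_vel)              # velocity bins
Z_true = np.exp(-v ** 2 / 0.1)                 # smooth common line profile
s2 = 1e-4                                      # noise variance
V = M @ Z_true + rng.normal(0.0, np.sqrt(s2), n_pix)

# GP prior covariance over velocity bins (lengthscale 0.2, small jitter)
K = np.exp(-(v[:, None] - v[None, :]) ** 2 / (2 * 0.2 ** 2)) + 1e-6 * np.eye(n_vel)

A = M.T @ M / s2 + np.linalg.inv(K)
Z_hat = np.linalg.solve(A, M.T @ V / s2)
print(float(np.max(np.abs(Z_hat - Z_true))))   # small reconstruction error
```

The linear algebra identities the abstract mentions would avoid forming the inverse of K explicitly; the direct form is kept here for readability.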
Hybrid Batch Bayesian Optimization
Azimi, Javad; Fern, Xiaoli
2012-01-01
Bayesian Optimization aims at optimizing an unknown non-convex/concave function that is costly to evaluate. We are interested in application scenarios where concurrent function evaluations are possible. Under such a setting, BO could choose to either sequentially evaluate the function, one input at a time and wait for the output of the function before making the next selection, or evaluate the function at a batch of multiple inputs at once. These two different settings are commonly referred to as the sequential and batch settings of Bayesian Optimization. In general, the sequential setting leads to better optimization performance as each function evaluation is selected with more information, whereas the batch setting has an advantage in terms of the total experimental time (the number of iterations). In this work, our goal is to combine the strength of both settings. Specifically, we systematically analyze Bayesian optimization using Gaussian process as the posterior estimator and provide a hybrid algorithm t...
Bayesian least squares deconvolution
Ramos, A Asensio
2015-01-01
Aims. To develop a fully Bayesian least squares deconvolution (LSD) that can be applied to the reliable detection of magnetic signals in noise-limited stellar spectropolarimetric observations using multiline techniques. Methods. We consider LSD under the Bayesian framework and we introduce a flexible Gaussian Process (GP) prior for the LSD profile. This prior allows the result to automatically adapt to the presence of signal. We exploit several linear algebra identities to accelerate the calculations. The final algorithm can deal with thousands of spectral lines in a few seconds. Results. We demonstrate the reliability of the method with synthetic experiments and we apply it to real spectropolarimetric observations of magnetic stars. We are able to recover the magnetic signals using a small number of spectral lines, together with the uncertainty at each velocity bin. This allows the user to consider if the detected signal is reliable. The code to compute the Bayesian LSD profile is freely available.
Bayesian Exploratory Factor Analysis
DEFF Research Database (Denmark)
Conti, Gabriella; Frühwirth-Schnatter, Sylvia; Heckman, James J.;
2014-01-01
This paper develops and applies a Bayesian approach to Exploratory Factor Analysis that improves on ad hoc classical approaches. Our framework relies on dedicated factor models and simultaneously determines the number of factors, the allocation of each measurement to a unique factor, and the corresponding factor loadings. Classical identification criteria are applied and integrated into our Bayesian procedure to generate models that are stable and clearly interpretable. A Monte Carlo study confirms the validity of the approach. The method is used to produce interpretable low dimensional aggregates...
Center, Julian L.; Knuth, Kevin H.
2011-03-01
Visual odometry refers to tracking the motion of a body using an onboard vision system. Practical visual odometry systems combine the complementary accuracy characteristics of vision and inertial measurement units. The Mars Exploration Rovers, Spirit and Opportunity, used this type of visual odometry. The visual odometry algorithms in Spirit and Opportunity were based on Bayesian methods, but a number of simplifying approximations were needed to deal with onboard computer limitations. Furthermore, the allowable motion of the rover had to be severely limited so that computations could keep up. Recent advances in computer technology make it feasible to implement a fully Bayesian approach to visual odometry. This approach combines dense stereo vision, dense optical flow, and inertial measurements. As with all true Bayesian methods, it also determines error bars for all estimates. This approach also offers the possibility of using Micro-Electro Mechanical Systems (MEMS) inertial components, which are more economical, weigh less, and consume less power than conventional inertial components.
Bayesian network modelling of upper gastrointestinal bleeding
Aisha, Nazziwa; Shohaimi, Shamarina; Adam, Mohd Bakri
2013-09-01
Bayesian networks are graphical probabilistic models that represent causal and other relationships between domain variables. In the context of medical decision making, these models have been explored to help in medical diagnosis and prognosis. In this paper, we discuss the Bayesian network formalism in building medical support systems and we learn a tree augmented naive Bayes Network (TAN) from gastrointestinal bleeding data. The accuracy of the TAN in classifying the source of gastrointestinal bleeding into upper or lower source is obtained. The TAN achieves a high classification accuracy of 86% and an area under curve of 92%. A sensitivity analysis of the model shows relatively high levels of entropy reduction for color of the stool, history of gastrointestinal bleeding, consistency and the ratio of blood urea nitrogen to creatinine. The TAN facilitates the identification of the source of GIB and requires further validation.
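For orientation, the model class behind the TAN can be sketched with a plain naive Bayes classifier over made-up categorical bleeding indicators; a TAN additionally links the features in a tree, which is omitted here, and the data rows are fabricated, not from the study.

```python
from collections import Counter

# Fabricated toy records: (stool_color, prior_history, bleeding_source)
data = [
    ("black", "yes", "upper"), ("black", "no", "upper"),
    ("red", "no", "lower"), ("red", "yes", "lower"),
    ("black", "yes", "upper"), ("red", "no", "lower"),
]

def predict(color, history):
    scores = {}
    classes = Counter(source for _, _, source in data)
    for c, n_c in classes.items():
        p = n_c / len(data)                       # class prior
        rows = [r for r in data if r[2] == c]
        # Laplace-smoothed conditional probabilities (binary features)
        p *= (sum(r[0] == color for r in rows) + 1) / (len(rows) + 2)
        p *= (sum(r[1] == history for r in rows) + 1) / (len(rows) + 2)
        scores[c] = p
    return max(scores, key=scores.get)

print(predict("black", "yes"))                    # → upper
```

On these toy counts, black stool plus a positive history favors an upper source, mirroring the kind of evidence propagation the abstract describes.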
Probabilistic Inferences in Bayesian Networks
Ding, Jianguo
2010-01-01
This chapter summarizes the popular inference methods in Bayesian networks. The results demonstrate that evidence can be propagated across a Bayesian network along any links, whether in a forward, backward, or intercausal style. The belief updating of Bayesian networks can be obtained by various available inference techniques. Theoretically, exact inference in Bayesian networks is feasible and manageable. However, the computation and inference are NP-hard. That means, in applications, in ...
Bayesian multiple target tracking
Streit, Roy L
2013-01-01
This second edition has undergone substantial revision from the 1999 first edition, recognizing that a lot has changed in the multiple target tracking field. One of the most dramatic changes is in the widespread use of particle filters to implement nonlinear, non-Gaussian Bayesian trackers. This book views multiple target tracking as a Bayesian inference problem. Within this framework it develops the theory of single target tracking, multiple target tracking, and likelihood ratio detection and tracking. In addition to providing a detailed description of a basic particle filter that implements
Institute of Scientific and Technical Information of China (English)
刘海林; 谢胜利; 章晋龙
2003-01-01
This paper investigates the separability of blind sources in convolutive mixtures on the assumption that the number of sensors is less than the number of sources. When the time delay is small, a necessary condition for the separability of the ill-conditioned convolutive mixtures is given by transforming the blind source separation of convolutive mixtures into that of linear mixtures.
Bayesian methods for hackers probabilistic programming and Bayesian inference
Davidson-Pilon, Cameron
2016-01-01
Bayesian methods of inference are deeply natural and extremely powerful. However, most discussions of Bayesian inference rely on intensely complex mathematical analyses and artificial examples, making it inaccessible to anyone without a strong mathematical background. Now, though, Cameron Davidson-Pilon introduces Bayesian inference from a computational perspective, bridging theory to practice–freeing you to get results using computing power. Bayesian Methods for Hackers illuminates Bayesian inference through probabilistic programming with the powerful PyMC language and the closely related Python tools NumPy, SciPy, and Matplotlib. Using this approach, you can reach effective solutions in small increments, without extensive mathematical intervention. Davidson-Pilon begins by introducing the concepts underlying Bayesian inference, comparing it with other techniques and guiding you through building and training your first Bayesian model. Next, he introduces PyMC through a series of detailed examples a...
DEFF Research Database (Denmark)
Jensen, Finn Verner; Nielsen, Thomas Dyhre
2016-01-01
and edges. The nodes represent variables, which may be either discrete or continuous. An edge between two nodes A and B indicates a direct influence between the state of A and the state of B, which in some domains can also be interpreted as a causal relation. The widespread use of Bayesian networks...
DEFF Research Database (Denmark)
Antoniou, Constantinos; Harrison, Glenn W.; Lau, Morten I.;
2015-01-01
A large literature suggests that many individuals do not apply Bayes’ Rule when making decisions that depend on them correctly pooling prior information and sample data. We replicate and extend a classic experimental study of Bayesian updating from psychology, employing the methods of experimental...
A Tutorial Introduction to Bayesian Models of Cognitive Development
2011-01-01
Bayesian reasoner in the long run (de Finetti, 1937). Even if the Bayesian framework captures optimal inductive inference, does that mean it is an... [Fragmentary references: de Finetti, B. (1937). Prevision, its logical laws, its subjective sources. In H. Kyburg & H. Smokler (Eds.), Studies in Subjective Probability (2nd ed.). New York: J. Wiley and Sons. de Finetti, B. (1974). Theory of Probability (2nd ed.). New York: J. Wiley and Sons.]
Zhang, Dongliang; Huang, Guangqing; Yin, Xiaoling; Gong, Qinghua
2015-08-12
Understanding the factors that affect residents' waste separation behaviors helps in constructing effective environmental campaigns for a community. Using the theory of planned behavior (TPB), this study examines factors associated with waste separation behaviors by analyzing responses to questionnaires distributed in Guangzhou, China. Data drawn from 208 of the 1,000 field questionnaires were used to assess socio-demographic factors and the TPB constructs (i.e., attitudes, subjective norms, perceived behavioral control, intentions, and situational factors). The questionnaire data revealed that attitudes, subjective norms, perceived behavioral control, intentions, and situational factors significantly predicted household waste separation behaviors in Guangzhou, China. Through a structural equation modeling analysis, we concluded that campaigns targeting moral obligations may be particularly effective for increasing the participation rate in waste separation behaviors.
Bayesian information fusion networks for biosurveillance applications.
Mnatsakanyan, Zaruhi R; Burkom, Howard S; Coberly, Jacqueline S; Lombardo, Joseph S
2009-01-01
This study introduces new information fusion algorithms to enhance disease surveillance systems with Bayesian decision support capabilities. A detection system was built and tested using chief complaints from emergency department visits, International Classification of Diseases Revision 9 (ICD-9) codes from records of outpatient visits to civilian and military facilities, and influenza surveillance data from health departments in the National Capital Region (NCR). Data anomalies were identified and distribution of time offsets between events in the multiple data streams were established. The Bayesian Network was built to fuse data from multiple sources and identify influenza-like epidemiologically relevant events. Results showed increased specificity compared with the alerts generated by temporal anomaly detection algorithms currently deployed by NCR health departments. Further research should be done to investigate correlations between data sources for efficient fusion of the collected data.
Fedosseev, V; Marsh, B A; CERN. Geneva. AB Department
2006-01-01
At the ISOLDE on-line isotope separation facility, the resonance ionization laser ion source (RILIS) can be used to ionize reaction products as they effuse from the target. The RILIS process of laser step-wise resonance ionization of atoms in a hot metal cavity provides a highly element selective stage in the preparation of the radioactive ion beam. As a result, the ISOLDE mass separators can provide beams of a chosen isotope with greatly reduced isobaric contamination. The number of elements available at RILIS has been extended to 26, with the addition of a new three-step ionization scheme for gold. The optimal ionization scheme was determined during an extensive study of the atomic energy levels and auto-ionizing states of gold, carried out by means of in-source resonance ionization spectroscopy. Details of the ionization scheme and a summary of the spectroscopy study are presented.
Directory of Open Access Journals (Sweden)
W. Ouerghemmi
2015-08-01
The applicability of Visible, Near-Infrared and Short Wave Infrared (VNIR/SWIR) hyperspectral imagery for soil property mapping decreases when surfaces are partially covered by vegetation. The objective of this research was to develop and evaluate a methodology based on the "double-extraction" technique for clay content estimation over semi-vegetated surfaces using VNIR/SWIR hyperspectral airborne data. The "double-extraction" technique, initially proposed by Ouerghemmi et al. (2011), consists of (1) the extraction of a soil reflectance spectrum s_soil from semi-vegetated spectra using a blind source separation technique, and (2) the extraction of clay content from the soil reflectance spectrum s_soil using a multivariate regression method. In this paper, the source separation approach is semi-blind thanks to the integration of field knowledge into the source separation model, and the multivariate regression method is a partial least squares regression (PLSR) model. This study employed VNIR/SWIR HyMap airborne data acquired in a French Mediterranean region over an area of 24 km2. Our results showed that our methodology based on the "double-extraction" technique is accurate for clay content estimation when applied to pixels under a specific Cellulose Absorption Index threshold. Finally, the clay content can be estimated over around 70% of the semi-vegetated pixels of our study area, which may offer an extension of soil property mapping, at the moment restricted to bare soils.
The Application of Compressed Sensing in Blind Source Separation
Institute of Scientific and Technical Information of China (English)
王涛文
2012-01-01
Compressed sensing is a new signal sampling theory developed in recent years that overcomes the high sampling-rate requirement of the traditional Nyquist sampling theory. This paper presents the basic principles of compressed sensing, introduces its three fundamental questions, namely the sparse representation of signals, the incoherence between the sparsifying basis and the measurement matrix, and the reconstruction of signals, and analyzes the connection between compressed sensing and blind source separation, offering a new way to solve the blind source separation problem. Finally, an experiment demonstrates its application to blind source separation.
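The three ingredients the abstract lists (a sparse signal, incoherent measurements, and a reconstruction step) can be sketched with a random Gaussian sensing matrix and orthogonal matching pursuit; sizes, support indices, and coefficient values below are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
n, m, k = 64, 40, 3                      # ambient dim, measurements, sparsity
A = rng.normal(size=(m, n)) / np.sqrt(m) # incoherent Gaussian sensing matrix
x = np.zeros(n)
x[[5, 20, 41]] = [1.0, -2.0, 1.5]        # k-sparse signal (arbitrary support)
y = A @ x                                # compressed measurements

# Orthogonal matching pursuit: greedily grow the support, refit by least squares
support, r = [], y.copy()
for _ in range(k):
    support.append(int(np.argmax(np.abs(A.T @ r))))
    coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
    r = y - A[:, support] @ coef

x_hat = np.zeros(n)
x_hat[support] = coef
print(np.allclose(x_hat, x, atol=1e-8))
```

In the noiseless case, once the correct support is identified the least squares refit recovers the coefficients exactly, which is the reconstruction guarantee compressed sensing rests on.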
Subgroup finding via Bayesian additive regression trees.
Sivaganesan, Siva; Müller, Peter; Huang, Bin
2017-03-09
We provide a Bayesian decision theoretic approach to finding subgroups that have elevated treatment effects. Our approach separates the modeling of the response variable from the task of subgroup finding and allows a flexible modeling of the response variable irrespective of potential subgroups of interest. We use Bayesian additive regression trees to model the response variable and use a utility function defined in terms of a candidate subgroup and the predicted response for that subgroup. Subgroups are identified by maximizing the expected utility, where the expectation is taken with respect to the posterior predictive distribution of the response, and the maximization is carried out over an a priori specified set of candidate subgroups. Our approach allows subgroups based on both quantitative and categorical covariates. We illustrate the approach using a simulated data set and a real data set. Copyright © 2017 John Wiley & Sons, Ltd.
Bloomfield, John; Ward, Rob; Garcia-Bajo, Marieta; Hart, Alwyn
2014-05-01
A number of potential pathways can be identified for the migration of methane and contaminants associated with the shale gas extraction process to aquifers. These include the possible movement of contaminants from shale gas reservoirs that have been hydraulically fractured to overlying aquifers. The risk of contamination of an overlying aquifer is a function of (i) the separation between the potential shale gas source rock and the aquifer, (ii) the hydraulic characteristics (e.g. hydraulic conductivity, storage and hydrogeochemistry) of the rocks in the intervening interval, and (iii) regional and local physico-chemical gradients. Here we report on a national-scale study from the UK to assess the first of these, i.e. the vertical separation between potential shale gas source rocks and major aquifers, as a contribution to more informed management of the risks associated with shale gas development if and when it takes place in the UK. Eleven aquifers are considered in the study. These are aquifers that have been designated by the environment agencies of England (Environment Agency) and Wales (Natural Resources Wales) under the EU Water Framework Directive as being nationally important (Principal Aquifers). The shale gas source rocks have been defined on the best publicly available evidence for potential gas productivity and include both shales and clay formations. Based on a national geological fence diagram consisting of ~80 geological sections, totalling ~12,000 km in length, extending to >5 km in depth, and with a typical spacing of 30 km, the lower surface of each aquifer unit and the upper surface of each shale/clay unit have been estimated at a spatial resolution of 3x3 km. These surfaces have then been used to estimate vertical separations between pairs of shale/clay and aquifer units. The modelling process will be described and the aquifer, shale and separation maps presented and discussed. The aquifers are defined by geological units and since these geological units may be found at
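The separation-mapping step the abstract describes reduces to an elementwise difference between gridded depth surfaces; the sketch below uses invented depths, not the study's values.

```python
import numpy as np

# Invented depths (m below datum) for one aquifer base and one shale top
# on a small grid; NaN marks cells where a unit is absent.
aquifer_base = np.array([[120.0, 150.0],
                         [200.0, np.nan]])
shale_top = np.array([[900.0, 700.0],
                      [np.nan, 650.0]])

# Vertical separation per grid cell; NaN propagates where either is absent
separation = shale_top - aquifer_base
print(separation[0, 0])                   # → 780.0
```

Repeating this for every shale/aquifer pair yields the separation maps the study discusses.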
Directory of Open Access Journals (Sweden)
Chengjie Li
2016-01-01
In passive radar systems, recovering a weak object signal mixed with a much stronger signal (jamming) is still a challenging task. In this paper, a novel framework based on a passive radar system is designed for weak object signal separation. First, we propose an interference cancellation algorithm (IC-algorithm) to extract the mixed weak object signals from the strong jamming. Then, an improved FastICA algorithm with K-means clustering is designed to separate each weak signal from the mixed weak object signals. Finally, we discuss the performance of the proposed method and verify it with several simulations. The experimental results demonstrate the effectiveness of the proposed method.
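The separation stage can be sketched with plain symmetric FastICA on two synthetic mixtures; the paper's IC-algorithm and K-means refinement are not reproduced here, and the mixing matrix and signals are invented.

```python
import numpy as np

rng = np.random.default_rng(2)
t = np.linspace(0, 8, 2000)
S = np.vstack([np.sign(np.sin(3 * t)),   # square wave
               np.sin(5 * t)])           # sine; both non-Gaussian sources
A = np.array([[1.0, 0.6], [0.5, 1.0]])   # invented mixing matrix
X = A @ S                                # two observed mixtures

# Whiten the observations
Xc = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(Xc))
Z = E @ np.diag(d ** -0.5) @ E.T @ Xc

# Symmetric FastICA with the tanh nonlinearity
W = rng.normal(size=(2, 2))
for _ in range(200):
    G = np.tanh(W @ Z)
    W = G @ Z.T / Z.shape[1] - np.diag((1 - G ** 2).mean(axis=1)) @ W
    u, _, vt = np.linalg.svd(W)
    W = u @ vt                           # symmetric decorrelation
Y = W @ Z                                # recovered sources (up to sign/order)

# Each recovered component should match one source up to sign and order
corr = np.abs(np.corrcoef(np.vstack([S, Y]))[:2, 2:])
print((corr.max(axis=1) > 0.95).all())
```

The sign and ordering ambiguity is inherent to ICA, which is why the check uses absolute correlations and a maximum over components.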
Probability and Bayesian statistics
1987-01-01
This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics", which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985, the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are included especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability across psychological aspects of formulating subjective probability statements, abstract measure-theoretical considerations, contributions to theoretical statistics an...
DEFF Research Database (Denmark)
Mørup, Morten; Schmidt, Mikkel N
2012-01-01
Many networks of scientific interest naturally decompose into clusters or communities with comparatively fewer external than internal links; however, current Bayesian models of network communities do not exert this intuitive notion of communities. We formulate a nonparametric Bayesian model for community detection consistent with an intuitive definition of communities and present a Markov chain Monte Carlo procedure for inferring the community structure. A Matlab toolbox with the proposed inference procedure is available for download. On synthetic and real networks, our model detects communities consistent with ground truth, and on real networks, it outperforms existing approaches in predicting missing links. This suggests that community structure is an important structural property of networks that should be explicitly modeled.
Bayesian theory and applications
Dellaportas, Petros; Polson, Nicholas G; Stephens, David A
2013-01-01
The development of hierarchical models and Markov chain Monte Carlo (MCMC) techniques forms one of the most profound advances in Bayesian analysis since the 1970s and provides the basis for advances in virtually all areas of applied and theoretical Bayesian statistics. This volume guides the reader along a statistical journey that begins with the basic structure of Bayesian theory, and then provides details on most of the past and present advances in this field. The book has a unique format. There is an explanatory chapter devoted to each conceptual advance followed by journal-style chapters that provide applications or further advances on the concept. Thus, the volume is both a textbook and a compendium of papers covering a vast range of topics. It is appropriate for a well-informed novice interested in understanding the basic approach, methods and recent applications. Because of its advanced chapters and recent work, it is also appropriate for a more mature reader interested in recent applications and devel...
Energy Technology Data Exchange (ETDEWEB)
Baba, Justin S [ORNL; Akl, Tony [Texas A& M University; Cote, Gerard L. [Texas A& M University; Wilson, Mark A. [University of Pittsburgh School of Medicine, Pittsburgh PA; Ericson, Milton Nance [ORNL
2011-01-01
An implanted system is being developed to monitor transplanted liver health during the critical 7-10 day period posttransplantation. The unit will monitor organ perfusion and oxygen consumption using optically-based probes placed on both the inflow and outflow blood vessels, and on the liver parenchymal surface. Sensing probes are based on a 3-wavelength LED source and a photodiode detector. Sample diffuse reflectance is measured at 735, 805, and 940 nm. To ascertain optimal source-to-photodetector spacing for perfusion measurement in blood vessels, an ex vivo study was conducted. In this work, a dye mixture simulating 80% blood oxygen saturation was developed and perfused through excised porcine arteries while collecting data for various preset probe source-to-photodetector spacings. The results from this study demonstrate a decrease in the optical signal with decreasing LED drive current and a reduction in perfusion index signal with increasing probe spacing. They also reveal a 2- to 4-mm optimal range for blood vessel perfusion probe source-to-photodetector spacing that allows for sufficient perfusion signal modulation depth with maximized signal to noise ratio (SNR). These findings are currently being applied to guide electronic configuration and probe placement for in vivo liver perfusion porcine model studies.
Separation and quantification of frequency coupled noise sources of submarine cabin
Institute of Scientific and Technical Information of China (English)
李思纯; 宫元彬; 时胜国; 于树华; 韩闯
2016-01-01
Traditional methods do not effectively handle the separation and quantification of coupled vibration noise sources in submarines. A multivariate statistical analysis method, partial least squares regression (PLS), is therefore presented to separate and quantify frequency-coupled noise sources. PLS simultaneously extracts the principal input/output components that carry maximum information and maximum input-output correlation, and supports regression modeling even when the variables are multiply correlated. Simulation and cabin-model experiments show that, when frequency coupling exists between multiple excitation sources, PLS can rank the energy contributions of internal noise sources to the submarine hull, of the hull to the underwater acoustic field, and of the noise sources to the underwater acoustic field. The feasibility of PLS for frequency-coupled source separation and quantification is thus demonstrated, providing a basis for the control of the main noise sources.
Energy Technology Data Exchange (ETDEWEB)
Chen, Yi-Ren; Chou, Li-Chang; Yang, Ying-Jay [Graduate Institute of Electronics Engineering, National Taiwan University, Taipei 10617, Taiwan (China); Lin, Hao-Hsiung, E-mail: hhlin@ntu.edu.tw [Graduate Institute of Electronics Engineering, National Taiwan University, Taipei 10617, Taiwan (China); Department of Electrical Engineering and Graduate Institute of Photonics and Optoelectronics, National Taiwan University, Taipei 10617, Taiwan (China)
2012-04-30
This work describes a regular solution model that considers the free energy of the surface monolayer to explain the orientation-dependent phase separation in GaAsSb. In the proposed model, only the interaction between the second-nearest-neighboring atoms sitting on the same monolayer contributes to the interaction parameter. Consequently, the parameter reduces to Ω/2 and Ω/3 for (111)B GaAsSb and (100) GaAsSb, respectively, where Ω denotes the parameter of bulk GaAsSb. By including the strain effect, the proposed model thoroughly elucidates the immiscibility behavior of (111)B GaAsSb and (100) GaAsSb. Highlights: (111)B GaAsSb exhibits more severe phase separation than (100) GaAsSb. We propose a model to calculate the monolayer free energy of different planes. The monolayer model suggests that (111)B GaAsSb has a larger interaction parameter. The monolayer model including strain explains the immiscibility of GaAsSb well.
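The orientation dependence in this abstract follows directly from regular solution theory: for a mixing free energy G_mix = Ωx(1-x) + RT[x ln x + (1-x) ln(1-x)], the critical (miscibility) temperature is T_c = Ω/(2R), so a smaller effective interaction parameter means a lower T_c and weaker phase separation. The sketch below uses a made-up round number for the bulk Ω, purely to show the scaling between the (111)B and (100) monolayer parameters.

```python
# Regular-solution scaling: T_c = Omega / (2R), so the (111)B monolayer
# (Omega/2) has a higher critical temperature than the (100) monolayer
# (Omega/3), i.e. stronger phase separation. omega_bulk is illustrative.
R = 8.314             # J/(mol K)
omega_bulk = 30000.0  # J/mol, made-up bulk interaction parameter

def critical_temperature(omega):
    # For G_mix = omega*x*(1-x) + R*T*(x*ln x + (1-x)*ln(1-x)),
    # d2G/dx2 = 0 at x = 1/2 gives T_c = omega / (2R).
    return omega / (2 * R)

tc_111B = critical_temperature(omega_bulk / 2)   # (111)B monolayer
tc_100 = critical_temperature(omega_bulk / 3)    # (100) monolayer
print(tc_111B > tc_100)
```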
Vesselinov, V. V.; Alexandrov, B.
2014-12-01
The identification of the physical sources causing spatial and temporal fluctuations of state variables such as river stage levels and aquifer hydraulic heads is challenging. The fluctuations can be caused by variations in natural and anthropogenic sources such as precipitation events, infiltration, groundwater pumping, barometric pressures, etc. Source identification and separation can be crucial for conceptualization of the hydrological conditions and characterization of system properties. If the original signals that cause the observed state-variable transients can be successfully "unmixed", decoupled physics models may then be applied to analyze the propagation of each signal independently. We propose a new model-free inverse analysis of transient data based on the Non-negative Matrix Factorization (NMF) method for Blind Source Separation (BSS), coupled with a k-means clustering algorithm, which we call NMFk. NMFk is capable of identifying a set of unique sources from a set of experimentally measured mixed signals, without any information about the sources, their transients, or the physical mechanisms and properties controlling the signal propagation through the system. A classical BSS conundrum is the so-called "cocktail-party" problem, where several microphones record the sounds in a ballroom (music, conversations, noise, etc.). Each microphone records a mixture of the sounds, and the goal of BSS is to "unmix" and reconstruct the original sounds from the microphone records. Similarly to the "cocktail-party" problem, our model-free analysis only requires information about state-variable transients at a number of observation points, m, where m > r, and r is the number of unknown unique sources causing the observed fluctuations. We apply the analysis to a dataset from the Los Alamos National Laboratory (LANL) site and identify the sources as barometric-pressure and water-supply pumping effects. We also estimate the
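A minimal sketch of the NMFk idea described above: factor mixed nonnegative signals with NMF over several random restarts, then cluster the recovered source signatures with k-means to check that a stable set of r sources emerges. The synthetic sources, mixing matrix, and number of restarts are all illustrative assumptions, not the authors' implementation.

```python
# Sketch of NMFk: NMF restarts + k-means clustering of recovered sources.
import numpy as np
from sklearn.decomposition import NMF
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
t = np.linspace(0, 10, 300)
S = np.vstack([np.abs(np.sin(t)), np.exp(-0.3 * t)])   # r = 2 true sources
A = rng.uniform(0.5, 1.5, size=(5, 2))                 # mixing at m = 5 sensors
V = A @ S                                              # observed mixtures

H_all = []
for seed in range(4):                                  # random restarts
    model = NMF(n_components=2, init="random", random_state=seed, max_iter=2000)
    W = model.fit_transform(V.T)                       # time x sources
    H_all.append((W / np.linalg.norm(W, axis=0)).T)    # normalized source shapes

stacked = np.vstack(H_all)                             # 8 candidate sources
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(stacked)
print(np.bincount(labels))                             # cluster sizes; ideally 4 and 4
```

Tight, equally sized clusters across restarts are the signal that r unique sources have been robustly identified.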
Samadi, Samareh; Amini, Ladan; Cosandier-Rimélé, Delphine; Soltanian-Zadeh, Hamid; Jutten, Christian
2013-01-01
In this paper, we present a fast method to extract the sources related to interictal epileptiform state. The method is based on general eigenvalue decomposition using two correlation matrices during: 1) periods including interictal epileptiform discharges (IED) as a reference activation model and 2) periods excluding IEDs or abnormal physiological signals as background activity. After extracting the most similar sources to the reference or IED state, IED regions are estimated by using multiobjective optimization. The method is evaluated using both realistic simulated data and actual intracerebral electroencephalography recordings of patients suffering from focal epilepsy. These patients are seizure-free after the resective surgery. Quantitative comparisons of the proposed IED regions with the visually inspected ictal onset zones by the epileptologist and another method of identification of IED regions reveal good performance. PMID:23428609
Samadi, Samareh; Amini, Ladan; Cosandier-Rimélé, Delphine; Soltanian-Zadeh, Hamid; Jutten, Christian
2013-07-01
In this paper, we present a fast method to extract the sources related to interictal epileptiform state. The method is based on general eigenvalue decomposition using two correlation matrices during: 1) periods including interictal epileptiform discharges (IED) as a reference activation model and 2) periods excluding IEDs or abnormal physiological signals as background activity. After extracting the most similar sources to the reference or IED state, IED regions are estimated by using multiobjective optimization. The method is evaluated using both realistic simulated data and actual intracerebral electroencephalography recordings of patients suffering from focal epilepsy. These patients are seizure-free after the resective surgery. Quantitative comparisons of the proposed IED regions with the visually inspected ictal onset zones by the epileptologist and another method of identification of IED regions reveal good performance.
Multisnapshot Sparse Bayesian Learning for DOA
Gerstoft, Peter; Mecklenbrauker, Christoph F.; Xenaki, Angeliki; Nannuru, Santosh
2016-10-01
The directions of arrival (DOA) of plane waves are estimated from multi-snapshot sensor array data using Sparse Bayesian Learning (SBL). The prior on the source amplitudes is assumed independent zero-mean complex Gaussian, with the unknown variances (i.e., the source powers) as hyperparameters. For a complex Gaussian likelihood with the unknown noise variance as hyperparameter, the corresponding Gaussian posterior distribution is derived. For a given number of DOAs, the hyperparameters are automatically selected by maximizing the evidence, which promotes sparse DOA estimates. The SBL scheme for DOA estimation is discussed and evaluated against LASSO ($\ell_1$-regularization), conventional beamforming, and MUSIC.
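A compressed sketch of the evidence-maximization loop for multi-snapshot SBL, using the standard fixed-point update for the source-power hyperparameters. The array geometry, angle grid, snapshot count, and noise level are illustrative choices, not the paper's configuration.

```python
# Sketch of multi-snapshot SBL for DOA on a half-wavelength-spaced ULA:
# Y = A X + N, zero-mean complex Gaussian priors on X with unknown variances
# gamma, updated by a fixed-point evidence-maximization rule.
import numpy as np

rng = np.random.default_rng(0)
n_sensors, spacing = 10, 0.5                      # half-wavelength ULA spacing
grid = np.deg2rad(np.arange(-90, 91, 2))          # candidate DOA grid
pos = np.arange(n_sensors)
A = np.exp(2j * np.pi * spacing * np.outer(pos, np.sin(grid)))

true_idx = [30, 60]                               # two active grid points
L = 50                                            # snapshots
X = np.zeros((len(grid), L), complex)
X[true_idx] = rng.standard_normal((2, L)) + 1j * rng.standard_normal((2, L))
Y = A @ X + 0.1 * (rng.standard_normal((n_sensors, L))
                   + 1j * rng.standard_normal((n_sensors, L)))

gamma = np.ones(len(grid))                        # source-power hyperparameters
sigma2 = 0.02                                     # noise variance (assumed known here)
for _ in range(100):                              # evidence-maximization updates
    Sigma_y = sigma2 * np.eye(n_sensors) + (A * gamma) @ A.conj().T
    Sinv_Y = np.linalg.solve(Sigma_y, Y)
    Sinv_A = np.linalg.solve(Sigma_y, A)
    mu_power = gamma**2 * np.mean(np.abs(A.conj().T @ Sinv_Y) ** 2, axis=1)
    denom = np.real(np.einsum('ij,ij->j', A.conj(), Sinv_A))
    gamma = mu_power / np.maximum(gamma * denom, 1e-12)

top2 = np.argsort(gamma)[-2:]                     # expected near the true grid indices
print(sorted(int(i) for i in top2))
```

After convergence, most entries of `gamma` are driven to (near) zero, and the surviving peaks give sparse DOA estimates.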
Bayesian parameter estimation for effective field theories
Wesolowski, S.; Klco, N.; Furnstahl, R. J.; Phillips, D. R.; Thapaliya, A.
2016-07-01
We present procedures based on Bayesian statistics for estimating, from data, the parameters of effective field theories (EFTs). The extraction of low-energy constants (LECs) is guided by theoretical expectations in a quantifiable way through the specification of Bayesian priors. A prior for natural-sized LECs reduces the possibility of overfitting, and leads to a consistent accounting of different sources of uncertainty. A set of diagnostic tools is developed that analyzes the fit and ensures that the priors do not bias the EFT parameter estimation. The procedures are illustrated using representative model problems, including the extraction of LECs for the nucleon-mass expansion in SU(2) chiral perturbation theory from synthetic lattice data.
Bayesian inference for pulsar timing models
Vigeland, Sarah J
2013-01-01
The extremely regular, periodic radio emission from millisecond pulsars makes them useful tools for studying neutron star astrophysics, general relativity, and low-frequency gravitational waves. These studies require that the observed pulse times of arrival be fit to complicated timing models that describe numerous effects, such as the astrometry of the source, the evolution of the pulsar's spin, the presence of a binary companion, and the propagation of the pulses through the interstellar medium. In this paper, we discuss the benefits of using Bayesian inference to obtain these timing solutions. These include the validation of linearized least-squares model fits when they are correct, and the proper characterization of parameter uncertainties when they are not; the incorporation of prior parameter information and of models of correlated noise; and the Bayesian comparison of alternative timing models. We describe our computational setup, which combines the timing models of tempo2 with the nested-sampling integ...
Bayesian parameter estimation for effective field theories
Wesolowski, S; Furnstahl, R J; Phillips, D R; Thapaliya, A
2015-01-01
We present procedures based on Bayesian statistics for effective field theory (EFT) parameter estimation from data. The extraction of low-energy constants (LECs) is guided by theoretical expectations that supplement such information in a quantifiable way through the specification of Bayesian priors. A prior for natural-sized LECs reduces the possibility of overfitting, and leads to a consistent accounting of different sources of uncertainty. A set of diagnostic tools are developed that analyze the fit and ensure that the priors do not bias the EFT parameter estimation. The procedures are illustrated using representative model problems and the extraction of LECs for the nucleon mass expansion in SU(2) chiral perturbation theory from synthetic lattice data.
Directory of Open Access Journals (Sweden)
C. O. Torres-Cortés
2016-08-01
The aim of this work is twofold: to optimize the radiochemical separation of plutonium (Pu) from soil samples, and to measure the Pu concentration. Soil samples were prepared using microwave-assisted acid digestion; then, Pu purification was carried out with AG1X8 resin. Pu isotopes were measured using inductively coupled plasma sector field mass spectrometry (ICP-SFMS). In order to reduce the interference due to the presence of 238UH+ in the samples, a desolvation system (Apex) was used. The limit of detection (LOD) of Pu was determined. The efficiency of Pu recovery from soil samples varies from 70 to 93%.
DEFF Research Database (Denmark)
Hartelius, Karsten; Carstensen, Jens Michael
2003-01-01
A method for locating distorted grid structures in images is presented. The method is based on the theories of template matching and Bayesian image restoration. The grid is modeled as a deformable template. Prior knowledge of the grid is described through a Markov random field (MRF) model which...... nodes and the arc prior models variations in row and column spacing across the grid. Grid matching is done by placing an initial rough grid over the image and applying an ensemble annealing scheme to maximize the posterior distribution of the grid. The method can be applied to noisy images with missing...
Congdon, Peter
2014-01-01
This book provides an accessible approach to Bayesian computing and data analysis, with an emphasis on the interpretation of real data sets. Following in the tradition of the successful first edition, this book aims to make a wide range of statistical modeling applications accessible using tested code that can be readily adapted to the reader's own applications. The second edition has been thoroughly reworked and updated to take account of advances in the field. A new set of worked examples is included. The novel aspect of the first edition was the coverage of statistical modeling using WinBU
Bayesian nonparametric data analysis
Müller, Peter; Jara, Alejandro; Hanson, Tim
2015-01-01
This book reviews nonparametric Bayesian methods and models that have proven useful in the context of data analysis. Rather than providing an encyclopedic review of probability models, the book’s structure follows a data analysis perspective. As such, the chapters are organized by traditional data analysis problems. In selecting specific nonparametric models, simpler and more traditional models are favored over specialized ones. The discussed methods are illustrated with a wealth of examples, including applications ranging from stylized examples to case studies from recent literature. The book also includes an extensive discussion of computational methods and details on their implementation. R code for many examples is included in on-line software pages.
DEFF Research Database (Denmark)
Thomsen, Nanna Isbak; Binning, Philip John; McKnight, Ursula S.;
2016-01-01
to be a major source of model error and it should therefore be accounted for when evaluating uncertainties in risk assessments. We present a Bayesian belief network (BBN) approach for constructing CSMs and assessing their uncertainty at contaminated sites. BBNs are graphical probabilistic models...... that are effective for integrating quantitative and qualitative information, and thus can strengthen decisions when empirical data are lacking. The proposed BBN approach facilitates a systematic construction of multiple CSMs, and then determines the belief in each CSM using a variety of data types and/or expert...... with chlorinated ethenes. Four different CSMs are developed by combining two contaminant source zone interpretations (presence or absence of a separate phase contamination) and two geological interpretations (fractured or unfractured clay till). The beliefs in each of the CSMs are assessed sequentially based...
A New Method of Blind Source Separation Using Single-Channel ICA Based on Higher-Order Statistics
Directory of Open Access Journals (Sweden)
Guangkuo Lu
2015-01-01
Methods utilizing independent component analysis (ICA) give little guidance about practical considerations for separating single-channel real-world data, most of which are nonlinear, nonstationary, and even chaotic in many fields. To solve this problem, a three-step method is provided in this paper. In the first step, the measured signal, assumed to be a piecewise higher-order stationary time series, is divided into a series of higher-order stationary segments by applying a modified segmentation algorithm. Then the state space is reconstructed and the single-channel signal is transformed into a pseudo multiple-input multiple-output (MIMO) mode using a method of nonlinear analysis based on higher-order statistics (HOS). In the last step, ICA is performed on the pseudo-MIMO data to decompose the single-channel recording into its underlying independent components (ICs), and the ICs of interest are then extracted. Finally, the effectiveness of the higher-order single-channel ICA (SCICA) method is validated with measured data through experiments. The proposed method is also shown to be more robust under different SNRs and/or embedding dimensions via explicit formulae and simulations.
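The middle step of the pipeline described above, turning a single channel into a pseudo-MIMO observation and running ICA on it, can be sketched with a simple delay embedding. The segmentation and HOS-based steps are omitted, and the test signal, embedding dimension, and lag are arbitrary illustrative choices.

```python
# Sketch: single-channel -> pseudo-MIMO via delay embedding, then ICA.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 4000)
# One channel containing two overlapping components (sine + square wave):
mixture = np.sin(2 * np.pi * 5 * t) + np.sign(np.sin(2 * np.pi * 13 * t))

dim, lag = 8, 5                                    # embedding dimension and lag
n = len(mixture) - (dim - 1) * lag
embedded = np.column_stack([mixture[i * lag:i * lag + n] for i in range(dim)])

ica = FastICA(n_components=2, random_state=0, max_iter=1000)
components = ica.fit_transform(embedded)           # pseudo-MIMO -> ICs
print(components.shape)
```

Each column of `embedded` acts as one virtual sensor, so the standard multichannel ICA machinery applies to the single recording.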
Classification using Bayesian neural nets
J.C. Bioch (Cor); O. van der Meer; R. Potharst (Rob)
1995-01-01
textabstractRecently, Bayesian methods have been proposed for neural networks to solve regression and classification problems. These methods claim to overcome some difficulties encountered in the standard approach such as overfitting. However, an implementation of the full Bayesian approach to neura
Bayesian Intersubjectivity and Quantum Theory
Pérez-Suárez, Marcos; Santos, David J.
2005-02-01
Two of the major approaches to probability, namely, frequentism and (subjectivistic) Bayesian theory, are discussed, together with the replacement of frequentist objectivity by Bayesian intersubjectivity. This discussion is then extended to Quantum Theory, as quantum states and operations can be seen as structural elements of a subjective nature.
Bayesian Approach for Inconsistent Information.
Stein, M; Beer, M; Kreinovich, V
2013-10-01
In engineering situations, we usually have a large amount of prior knowledge that needs to be taken into account when processing data. Traditionally, the Bayesian approach is used to process data in the presence of prior knowledge. Sometimes, when we apply the traditional Bayesian techniques to engineering data, we get inconsistencies between the data and prior knowledge. These inconsistencies are usually caused by the fact that in the traditional approach, we assume that we know the exact sample values, that the prior distribution is exactly known, etc. In reality, the data is imprecise due to measurement errors, the prior knowledge is only approximately known, etc. So, a natural way to deal with the seemingly inconsistent information is to take this imprecision into account in the Bayesian approach - e.g., by using fuzzy techniques. In this paper, we describe several possible scenarios for fuzzifying the Bayesian approach. Particular attention is paid to the interaction between the estimated imprecise parameters. In this paper, to implement the corresponding fuzzy versions of the Bayesian formulas, we use straightforward computations of the related expression - which makes our computations reasonably time-consuming. Computations in the traditional (non-fuzzy) Bayesian approach are much faster - because they use algorithmically efficient reformulations of the Bayesian formulas. We expect that similar reformulations of the fuzzy Bayesian formulas will also drastically decrease the computation time and thus, enhance the practical use of the proposed methods.
Inference in hybrid Bayesian networks
DEFF Research Database (Denmark)
Lanseth, Helge; Nielsen, Thomas Dyhre; Rumí, Rafael
2009-01-01
Since the 1980s, Bayesian Networks (BNs) have become increasingly popular for building statistical models of complex systems. This is particularly true for boolean systems, where BNs often prove to be a more efficient modelling framework than traditional reliability techniques (like fault trees...... decade's research on inference in hybrid Bayesian networks. The discussions are linked to an example model for estimating human reliability....
Energy Technology Data Exchange (ETDEWEB)
Braendli, Rahel C. [Agroscope Reckenholz-Taenikon Research Station ART, Analytical Chemistry, Reckenholzstrasse 191, CH-8046 Zurich (Switzerland); Ecole Polytechnique Federale de Lausanne (EPFL), Laboratory of Environmental Chemistry and Ecotoxicology (CECOTOX), Faculty of Architecture, Civil and Environmental Engineering, CH-1015 Lausanne (Switzerland); Bucheli, Thomas D. [Agroscope Reckenholz-Taenikon Research Station ART, Analytical Chemistry, Reckenholzstrasse 191, CH-8046 Zurich (Switzerland)]. E-mail: thomas.bucheli@art.admin.ch; Kupper, Thomas [Swiss Federal Institute of Aquatic Science and Technology, EAWAG, CH-8600 Duebendorf (Switzerland); Mayer, Jochen [Agroscope Reckenholz-Taenikon Research Station ART, Analytical Chemistry, Reckenholzstrasse 191, CH-8046 Zurich (Switzerland); Stadelmann, Franz X. [Agroscope Reckenholz-Taenikon Research Station ART, Analytical Chemistry, Reckenholzstrasse 191, CH-8046 Zuerich (Switzerland); Tarradellas, Joseph [Ecole Polytechnique Federale de Lausanne (EPFL), Laboratory of Environmental Chemistry and Ecotoxicology (CECOTOX), Faculty of Architecture, Civil and Environmental Engineering, CH-1015 Lausanne (Switzerland)
2007-07-15
Composting and digestion are important waste management strategies. However, the resulting products can contain significant amounts of organic pollutants such as polychlorinated biphenyls (PCBs) and polycyclic aromatic hydrocarbons (PAHs). In this study we followed the concentration changes of PCBs and PAHs during composting and digestion at field scale for the first time. Concentrations of low-chlorinated PCBs increased during composting (about 30%), whereas a slight decrease was observed for the higher chlorinated congeners (about 10%). Enantiomeric fractions of atropisomeric PCBs were essentially racemic and stable over time. Levels of low-molecular-weight PAHs declined during composting (50-90% reduction), whereas high-molecular-weight compounds were stable. PCB and PAH concentrations did not seem to vary during digestion. Source apportionment by applying characteristic PAH ratios and molecular markers in input material did not give any clear results. Some of these parameters changed considerably during composting. Hence, their diagnostic potential for finished compost must be questioned. - During field-scale composting, low-molecular-weight PCBs and PAHs increased and decreased, respectively, whereas high-molecular-weight compounds remained stable.
Directory of Open Access Journals (Sweden)
Hawkins Steve AC
2006-10-01
Background: Given the theoretical proposal that bovine spongiform encephalopathy (BSE) could have originated from sheep scrapie, this study investigated the pathogenicity for cattle, by intracerebral (i.c.) inoculation, of two pools of scrapie agents sourced in Great Britain before and during the BSE epidemic. Two groups of ten cattle were each inoculated with pools of brain material from sheep scrapie cases collected prior to 1975 and after 1990. Control groups comprised five cattle inoculated with sheep brain free from scrapie, five cattle inoculated with saline, and, for comparison with BSE, naturally infected cattle and cattle i.c. inoculated with BSE brainstem homogenate from a parallel study. Phenotypic characterisation of the disease forms transmitted to cattle was conducted by morphological, immunohistochemical, biochemical and biological methods. Results: Disease occurred in 16 cattle, nine inoculated with the pre-1975 inoculum and seven inoculated with the post-1990 inoculum, with four cattle still alive at 83 months post challenge (as at June 2006). The different inocula produced predominantly two different disease phenotypes as determined by histopathological, immunohistochemical and Western immunoblotting methods and biological characterisation on transmission to mice, neither of which was identical to BSE. Whilst the disease presentation was uniform in all scrapie-affected cattle of the pre-1975 group, the post-1990 inoculum produced a more variable disease, with two animals sharing immunohistochemical and molecular profile characteristics with animals in the pre-1975 group. Conclusion: The study has demonstrated that cattle inoculated with different pooled scrapie sources can develop different prion disease phenotypes, which were not consistent with the phenotype of BSE of cattle and whose isolates did not have the strain typing characteristics of the BSE agent on transmission to mice.
Bayesian Inference on Gravitational Waves
Directory of Open Access Journals (Sweden)
Asad Ali
2015-12-01
The Bayesian approach is becoming increasingly popular among the astrophysics data analysis communities. However, the Pakistan statistics communities are unaware of this fertile interaction between the two disciplines. Bayesian methods have been used to address astronomical problems since the very birth of Bayes' probability in the eighteenth century. Today the Bayesian methods for the detection and parameter estimation of gravitational waves have solid theoretical grounds with a strong promise for realistic applications. This article aims to introduce the Pakistan statistics communities to the applications of Bayesian Monte Carlo methods in the analysis of gravitational wave data, with an overview of the Bayesian signal detection and estimation methods and a demonstration by a couple of simplified examples.
Bayesian networks for maritime traffic accident prevention: benefits and challenges.
Hänninen, Maria
2014-12-01
Bayesian networks are quantitative modeling tools whose applications to the maritime traffic safety context are becoming more popular. This paper discusses the utilization of Bayesian networks in maritime safety modeling. Based on literature and the author's own experiences, the paper studies what Bayesian networks can offer to maritime accident prevention and safety modeling and discusses a few challenges in their application to this context. It is argued that the capability of representing rather complex, not necessarily causal but uncertain relationships makes Bayesian networks an attractive modeling tool for the maritime safety and accidents. Furthermore, as the maritime accident and safety data is still rather scarce and has some quality problems, the possibility to combine data with expert knowledge and the easy way of updating the model after acquiring more evidence further enhance their feasibility. However, eliciting the probabilities from the maritime experts might be challenging and the model validation can be tricky. It is concluded that with the utilization of several data sources, Bayesian updating, dynamic modeling, and hidden nodes for latent variables, Bayesian networks are rather well-suited tools for the maritime safety management and decision-making.
Approximate Bayesian computation.
Directory of Open Access Journals (Sweden)
Mikael Sunnåker
Approximate Bayesian computation (ABC) constitutes a class of computational methods rooted in Bayesian statistics. In all model-based statistical inference, the likelihood function is of central importance, since it expresses the probability of the observed data under a particular statistical model, and thus quantifies the support data lend to particular values of parameters and to choices among different models. For simple models, an analytical formula for the likelihood function can typically be derived. However, for more complex models, an analytical formula might be elusive or the likelihood function might be computationally very costly to evaluate. ABC methods bypass the evaluation of the likelihood function. In this way, ABC methods widen the realm of models for which statistical inference can be considered. ABC methods are mathematically well-founded, but they inevitably make assumptions and approximations whose impact needs to be carefully assessed. Furthermore, the wider application domain of ABC exacerbates the challenges of parameter estimation and model selection. ABC has rapidly gained popularity in recent years, in particular for the analysis of complex problems arising in the biological sciences (e.g., in population genetics, ecology, epidemiology, and systems biology).
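The likelihood-free recipe described above can be illustrated with the simplest ABC variant, rejection sampling: draw a parameter from the prior, simulate data forward, and keep the draw when a summary statistic lands close to the observed one. The Gaussian toy model, tolerance, and sample sizes below are illustrative choices.

```python
# Minimal ABC rejection sampler for the mean of a Gaussian with known sigma,
# pretending the likelihood is unavailable.
import numpy as np

rng = np.random.default_rng(0)
observed = rng.normal(loc=3.0, scale=1.0, size=200)
s_obs = observed.mean()                       # summary statistic

accepted = []
for _ in range(20000):
    theta = rng.uniform(-10, 10)              # draw from the prior
    sim = rng.normal(theta, 1.0, size=200)    # forward simulation
    if abs(sim.mean() - s_obs) < 0.1:         # distance below tolerance
        accepted.append(theta)

posterior = np.array(accepted)
print(len(posterior), round(posterior.mean(), 2))
```

Shrinking the tolerance makes the accepted sample approximate the true posterior more closely, at the cost of a lower acceptance rate; this trade-off is exactly the approximation whose impact the abstract says must be assessed.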
Borsboom, D.; Haig, B.D.
2013-01-01
Unlike most other statistical frameworks, Bayesian statistical inference is wedded to a particular approach in the philosophy of science (see Howson & Urbach, 2006); this approach is called Bayesianism. Rather than being concerned with model fitting, this position in the philosophy of science primar
Directory of Open Access Journals (Sweden)
Shu-Yin Chiang
2002-01-01
In this paper, we study simplified models of the ATM (Asynchronous Transfer Mode) multiplexer network with Bernoulli random traffic sources. Based on the model, the performance measures are analyzed under different output service schemes.
Implementing Bayesian Vector Autoregressions
Directory of Open Access Journals (Sweden)
Richard M. Todd
1988-03-01
This paper discusses how the Bayesian approach can be used to construct a type of multivariate forecasting model known as a Bayesian vector autoregression (BVAR). In doing so, we mainly explain Doan, Litterman, and Sims's (1984) propositions on how to estimate a BVAR based on a certain family of prior probability distributions, indexed by a fairly small set of hyperparameters. There is also a discussion on how to specify a BVAR and set up a BVAR database. A 4-variable model is used to illustrate the BVAR approach.
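A toy sketch of the BVAR idea: a two-variable VAR(1) estimated with a Minnesota-style Gaussian prior that shrinks coefficients toward a random walk, implemented as a ridge-type posterior mean. The system, the single tightness hyperparameter, and the prior mean are illustrative simplifications, not the Doan-Litterman-Sims specification.

```python
# Toy BVAR: posterior mean of VAR(1) coefficients under a Gaussian prior
# centered on a random walk (identity matrix on own first lags).
import numpy as np

rng = np.random.default_rng(0)
T, k = 200, 2
B_true = np.array([[0.9, 0.1], [0.0, 0.8]])
Y = np.zeros((T, k))
for t in range(1, T):
    Y[t] = Y[t - 1] @ B_true.T + 0.1 * rng.standard_normal(k)

X, Z = Y[:-1], Y[1:]                      # lagged regressors and targets
prior_mean = np.eye(k)                    # random-walk prior on own lags
lam = 0.2                                 # tightness hyperparameter

# With b_j ~ N(e_j, lam^2 I) per equation, the posterior mean reduces,
# equation by equation, to ridge regression toward the prior mean:
XtX = X.T @ X
B_post = np.linalg.solve(XtX + np.eye(k) / lam**2,
                         X.T @ Z + prior_mean / lam**2).T
print(np.round(B_post, 2))
```

A smaller `lam` pulls the estimate harder toward the random-walk prior; letting `lam` vary per coefficient recovers the "fairly small set of hyperparameters" the abstract refers to.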
Bayesian Calibration of Microsimulation Models.
Rutter, Carolyn M; Miglioretti, Diana L; Savarino, James E
2009-12-01
Microsimulation models that describe disease processes synthesize information from multiple sources and can be used to estimate the effects of screening and treatment on cancer incidence and mortality at a population level. These models are characterized by simulation of individual event histories for an idealized population of interest. Microsimulation models are complex and invariably include parameters that are not well informed by existing data. Therefore, a key component of model development is the choice of parameter values. Microsimulation model parameter values are selected to reproduce expected or known results through the process of model calibration. Calibration may be done by perturbing model parameters one at a time or by using a search algorithm. As an alternative, we propose a Bayesian method to calibrate microsimulation models that uses Markov chain Monte Carlo. We show that this approach converges to the target distribution and use a simulation study to demonstrate its finite-sample performance. Although computationally intensive, this approach has several advantages over previously proposed methods, including the use of statistical criteria to select parameter values, simultaneous calibration of multiple parameters to multiple data sources, incorporation of information via prior distributions, description of parameter identifiability, and the ability to obtain interval estimates of model parameters. We develop a microsimulation model for colorectal cancer and use our proposed method to calibrate model parameters. The microsimulation model provides a good fit to the calibration data. We find evidence that some parameters are identified primarily through prior distributions. Our results underscore the need to incorporate multiple sources of variability (i.e., due to calibration data, unknown parameters, and estimated parameters and predicted values) when calibrating and applying microsimulation models.
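The MCMC calibration idea can be sketched with a deliberately tiny stand-in for a microsimulation model: one unknown rate parameter calibrated to an observed count via a random-walk Metropolis sampler. The Poisson "model", the Gamma prior, and the proposal scale are illustrative assumptions, not the paper's colorectal-cancer model.

```python
# Toy Bayesian calibration: random-walk Metropolis targeting the posterior
# of a rate parameter given one observed count.
import numpy as np

rng = np.random.default_rng(0)
observed_count = 30                           # calibration target

def log_post(rate):
    if rate <= 0:
        return -np.inf
    loglik = observed_count * np.log(rate) - rate   # Poisson (up to const)
    logprior = np.log(rate) - rate / 10.0           # Gamma(2, scale=10)
    return loglik + logprior

samples, rate = [], 10.0
for _ in range(20000):
    prop = rate + rng.normal(scale=2.0)       # random-walk proposal
    if np.log(rng.uniform()) < log_post(prop) - log_post(rate):
        rate = prop                           # accept
    samples.append(rate)

posterior = np.array(samples[5000:])          # drop burn-in
print(round(posterior.mean(), 1))
```

Unlike one-at-a-time perturbation, the same loop scales to many parameters and many calibration targets by extending `log_post`, which is the advantage the abstract emphasizes.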
Book review: Bayesian analysis for population ecology
Link, William A.
2011-01-01
Brian Dennis described the field of ecology as “fertile, uncolonized ground for Bayesian ideas.” He continued: “The Bayesian propagule has arrived at the shore. Ecologists need to think long and hard about the consequences of a Bayesian ecology. The Bayesian outlook is a successful competitor, but is it a weed? I think so.” (Dennis 2004)
Ortega, Pedro A
2011-01-01
Discovering causal relationships is a hard task, often hindered by the need for intervention, and often requiring large amounts of data to resolve statistical uncertainty. However, humans quickly arrive at useful causal relationships. One possible reason is that humans use strong prior knowledge; and rather than encoding hard causal relationships, they encode beliefs over causal structures, allowing for sound generalization from the observations they obtain from directly acting in the world. In this work we propose a Bayesian approach to causal induction which allows modeling beliefs over multiple causal hypotheses and predicting the behavior of the world under causal interventions. We then illustrate how this method extracts causal information from data containing interventions and observations.
Blundell, Charles; Heller, Katherine A
2012-01-01
Hierarchical structure is ubiquitous in data across many domains. There are many hierarchical clustering methods, frequently used by domain experts, which strive to discover this structure. However, most of these methods limit discoverable hierarchies to those with binary branching structure. This limitation, while computationally convenient, is often undesirable. In this paper we explore a Bayesian hierarchical clustering algorithm that can produce trees with arbitrary branching structure at each node, known as rose trees. We interpret these trees as mixtures over partitions of a data set, and use a computationally efficient, greedy agglomerative algorithm to find the rose trees which have high marginal likelihood given the data. Lastly, we perform experiments which demonstrate that rose trees are better models of data than the typical binary trees returned by other hierarchical clustering algorithms.
Bayesian inference in geomagnetism
Backus, George E.
1988-01-01
The inverse problem in empirical geomagnetic modeling is investigated, with critical examination of recently published studies. Particular attention is given to the use of Bayesian inference (BI) to select the damping parameter lambda in the uniqueness portion of the inverse problem. The mathematical bases of BI and stochastic inversion are explored, with consideration of bound-softening problems and resolution in linear Gaussian BI. The problem of estimating the radial magnetic field B(r) at the earth core-mantle boundary from surface and satellite measurements is then analyzed in detail, with specific attention to the selection of lambda in the studies of Gubbins (1983) and Gubbins and Bloxham (1985). It is argued that the selection method is inappropriate and leads to lambda values much larger than those that would result if a reasonable bound on the heat flow at the CMB were assumed.
Comerford, Julia M; Gerke, Brian F; Madejski, Greg M
2011-01-01
We report Chandra observations of a double X-ray source in the z=0.1569 galaxy SDSS J171544.05+600835.7. The galaxy was initially identified as a dual AGN candidate based on the double-peaked [O III] emission lines, with a line-of-sight velocity separation of 350 km/s, in its Sloan Digital Sky Survey spectrum. We used the Kast Spectrograph at Lick Observatory to obtain two longslit spectra of the galaxy at two different position angles, which reveal that the two AGN emission components have not only a velocity offset, but also a projected spatial offset of 1.9 kpc/h70 on the sky. Chandra/ACIS observations of two X-ray sources with the same spatial offset and orientation as the optical emission suggest the galaxy most likely contains Compton-thick dual AGN, although the observations could also be explained by AGN jets. Deeper X-ray observations that reveal Fe K lines, if present, would distinguish between the two scenarios. The observations of a double X-ray source in SDSS J171544.05+600835.7 are a proof of co...
Current trends in Bayesian methodology with applications
Upadhyay, Satyanshu K; Dey, Dipak K; Loganathan, Appaia
2015-01-01
Collecting Bayesian material scattered throughout the literature, Current Trends in Bayesian Methodology with Applications examines the latest methodological and applied aspects of Bayesian statistics. The book covers biostatistics, econometrics, reliability and risk analysis, spatial statistics, image analysis, shape analysis, Bayesian computation, clustering, uncertainty assessment, high-energy astrophysics, neural networking, fuzzy information, objective Bayesian methodologies, empirical Bayes methods, small area estimation, and many more topics. Each chapter is self-contained and focuses on
Embedding the results of focussed Bayesian fusion into a global context
Sander, Jennifer; Heizmann, Michael
2014-05-01
Bayesian statistics offers a well-founded and powerful fusion methodology, also for the fusion of heterogeneous information sources. However, except in special cases, the needed posterior distribution is not analytically derivable. As a consequence, Bayesian fusion may cause unacceptably high computational and storage costs in practice. Local Bayesian fusion approaches aim at reducing the complexity of the Bayesian fusion methodology significantly. This is done by concentrating the actual Bayesian fusion on the potentially most task-relevant parts of the domain of the Properties of Interest. Our research on these approaches is motivated by an analogy to criminal investigations, where criminalists also pursue clues only locally. This publication follows previous publications on a special local Bayesian fusion technique called focussed Bayesian fusion. Here, the actual calculation of the posterior distribution gets completely restricted to a suitably chosen local context. As a result, the global posterior distribution is not completely determined. Strategies for using the results of a focussed Bayesian analysis appropriately are needed. In this publication, we primarily contrast different ways of embedding the results of focussed Bayesian fusion explicitly into a global context. To obtain a unique global posterior distribution, we analyze the application of the Maximum Entropy Principle, which has been shown to be successfully applicable in metrology and in different other areas. To address the special need for making further decisions subsequently to the actual fusion task, we further analyze criteria for decision making under partial information.
Stan: A Probabilistic Programming Language for Bayesian Inference and Optimization
Gelman, Andrew; Lee, Daniel; Guo, Jiqiang
2015-01-01
Stan is a free and open-source C++ program that performs Bayesian inference or optimization for arbitrary user-specified models. It can be called from the command line, R, Python, Matlab, or Julia, and has great promise for fitting large and complex statistical models in many areas of application. We discuss Stan from users' and developers'…
Advanced REACH tool: A Bayesian model for occupational exposure assessment
McNally, K.; Warren, N.; Fransman, W.; Entink, R.K.; Schinkel, J.; Van Tongeren, M.; Cherrie, J.W.; Kromhout, H.; Schneider, T.; Tielemans, E.
2014-01-01
This paper describes a Bayesian model for the assessment of inhalation exposures in an occupational setting; the methodology underpins a freely available web-based application for exposure assessment, the Advanced REACH Tool (ART). The ART is a higher tier exposure tool that combines disparate sourc
Energy Technology Data Exchange (ETDEWEB)
Amirov, R. Kh., E-mail: ravus46@yandex.ru; Vorona, N. A.; Gavrikov, A. V.; Lizyakin, G. D.; Polishchuk, V. P.; Samoilov, I. S.; Smirnov, V. P.; Usmanov, R. A.; Yartsev, I. M. [Russian Academy of Sciences, Joint Institute for High Temperatures (Russian Federation)
2015-10-15
Results from experimental studies of a vacuum arc with a distributed cathode spot on the heated cathode are presented. Such an arc can be used as a plasma source for plasma separation of spent nuclear fuel and radioactive waste. The experiments were performed with a gadolinium cathode, the properties of which are similar to those of a uranium arc cathode. The heat flux from the plasma to the cathode (and its volt equivalent) at discharge voltages of 4-15 V and discharge currents of 44-81 A, the radial distribution of the emission intensity of gadolinium atoms and singly charged ions in the arc channel at a voltage of 4.3 V, and the plasma electron temperature behind the anode were measured. The average charge of plasma ions at arc voltages of 3.5-8 V and a discharge current of 52 A and the average rate of gadolinium evaporation in the discharge were also determined.
Energy Technology Data Exchange (ETDEWEB)
Mourant, J.R.; Boyer, J.; Hielscher, A.H.; Bigio, I.J. [Bioscience and Biotechnology Group CST-4, MS E535, Los Alamos National Laboratory, Los Alamos, New Mexico 87545 (United States)
1996-04-01
Many methods of optical tissue diagnosis require that measurements be performed with small source–detector separations in a backscatter geometry. Monte Carlo simulations are used to demonstrate that for these situations light transport depends on the exact form of the angular scattering probability distribution, P(θ). Simulations performed with different forms of P(θ) with the same value of ⟨cos θ⟩ result in the collection of significantly different fractions of the incident photons, particularly when small-numerical-aperture delivery and collection fibers are employed. More photons are collected for the distribution that has a higher probability of scattering events with θ ≳ 125°. For the clinically relevant optical parameters employed here, the differences in light collection are ≳ 60%. © 1996 Optical Society of America.
Tsyganov, Y S
2015-01-01
The general philosophy behind the procedure of detecting rare events in recent experiments with a 48Ca projectile at the Dubna Gas-Filled Recoil Separator (DGFRS) aimed at the synthesis of superheavy elements (SHE) is reviewed. Specific instruments and methods are under consideration. Some historical sources of the successful experiments for Z=112-118 are considered too. Special attention is paid to the application of the method of active correlations in heavy-ion-induced complete fusion nuclear reactions. An example of its application in the Z=115 experiment is presented. A brief description of the 243Am + 48Ca -> 291-x115 + xn experiment is presented too. Some attention is paid to the role of chemical experiments in the discoveries of SHEs. The DGFRS detection/monitoring system is presented in full for the first time.
Schmidt, M; Peng, H; Zschornack, G; Sykora, S
2009-06-01
A Wien filter was designed for and tested with a room-temperature electron beam ion source (EBIS). Xenon charge state spectra up to the charge state Xe46+ were resolved, as were the isotopes of krypton, using apertures of different sizes. The complete setup, consisting of an EBIS and a Wien filter, has a length of less than 1 m, substituting for a complete classical beamline setup. The Wien filter is equipped with removable permanent magnets; hence, total beam current measurements are possible by simply removing the permanent magnets. Depending on the required resolution, either a weak (0.2 T) or a strong (0.5 T) magnet setup can be used. In this paper the principle of operation and the design of a Wien filter meeting the requirements of an EBIS are briefly discussed. The first ion beam extraction and separation experiments with a Dresden EBIS are presented.
Yadav, Kunwar D; Tare, Vinod; Ahammed, M Mansoor
2011-06-01
The main objective of the present study was to determine the optimum stocking density for the feed consumption rate, biomass growth, and reproduction of the earthworm Eisenia fetida, as well as to determine the quantity and characterise the quality of the vermicompost produced during vermicomposting of source-separated human faeces. For this, a number of experiments spanning up to 3 months were conducted using soil and vermicompost as support materials. Stocking densities in the range of 0.25-5.00 kg/m² were employed in different tests. The results showed that 0.40-0.45 kg-feed/kg-worm/day was the maximum feed consumption rate of E. fetida in human faeces. The optimum stocking densities were 3.00 kg/m² for bioconversion of human faeces to vermicompost, and 0.50 kg/m² for earthworm biomass growth and reproduction.
Irregular-Time Bayesian Networks
Ramati, Michael
2012-01-01
In many fields observations are performed irregularly along time, due to either measurement limitations or lack of a constant immanent rate. While discrete-time Markov models (as Dynamic Bayesian Networks) introduce either inefficient computation or an information loss to reasoning about such processes, continuous-time Markov models assume either a discrete state space (as Continuous-Time Bayesian Networks), or a flat continuous state space (as stochastic differential equations). To address these problems, we present a new modeling class called Irregular-Time Bayesian Networks (ITBNs), generalizing Dynamic Bayesian Networks, allowing substantially more compact representations, and increasing the expressivity of the temporal dynamics. In addition, a globally optimal solution is guaranteed when learning temporal systems, provided that they are fully observed at the same irregularly spaced time-points, and a semiparametric subclass of ITBNs is introduced to allow further adaptation to the irregular nature of t...
Energy Technology Data Exchange (ETDEWEB)
Amirov, R. Kh.; Vorona, N. A.; Gavrikov, A. V.; Liziakin, G. D.; Polistchook, V. P.; Samoylov, I. S.; Smirnov, V. P.; Usmanov, R. A., E-mail: ravus46@yandex.ru; Yartsev, I. M. [Russian Academy of Sciences, Joint Institute for High Temperatures (Russian Federation)
2015-12-15
One of the key problems in the development of plasma separation technology is designing a plasma source which uses condensed spent nuclear fuel (SNF) or nuclear wastes as a raw material. This paper covers the experimental study of the evaporation and ionization of model materials (gadolinium, niobium oxide, and titanium oxide). For these purposes, a vacuum arc with a heated cathode on the studied material was initiated and its parameters in different regimes were studied. During the experiment, the cathode temperature, arc current, arc voltage, and plasma radiation spectra were measured, and also probe measurements were carried out. It was found that the increase in the cathode heating power leads to the decrease in the arc voltage (to 3 V). This fact makes it possible to reduce the electron energy and achieve singly ionized plasma with a high degree of ionization to fulfill one of the requirements for plasma separation of SNF. This finding is supported by the analysis of the plasma radiation spectrum and the results of the probe diagnostics.
Bayesian Inference: with ecological applications
Link, William A.; Barker, Richard J.
2010-01-01
This text provides a mathematically rigorous yet accessible and engaging introduction to Bayesian inference with relevant examples that will be of interest to biologists working in the fields of ecology, wildlife management and environmental studies as well as students in advanced undergraduate statistics.. This text opens the door to Bayesian inference, taking advantage of modern computational efficiencies and easily accessible software to evaluate complex hierarchical models.
Bayesian Methods for Statistical Analysis
Puza, Borek
2015-01-01
Bayesian methods for statistical analysis is a book on statistical methods for analysing a wide variety of data. The book consists of 12 chapters, starting with basic concepts and covering numerous topics, including Bayesian estimation, decision theory, prediction, hypothesis testing, hierarchical models, Markov chain Monte Carlo methods, finite population inference, biased sampling and nonignorable nonresponse. The book contains many exercises, all with worked solutions, including complete c...
Bayesian Networks and Influence Diagrams
DEFF Research Database (Denmark)
Kjærulff, Uffe Bro; Madsen, Anders Læsø
Probabilistic networks, also known as Bayesian networks and influence diagrams, have become one of the most promising technologies in the area of applied artificial intelligence, offering intuitive, efficient, and reliable methods for diagnosis, prediction, decision making, classification, troubleshooting, and data mining under uncertainty. Bayesian Networks and Influence Diagrams: A Guide to Construction and Analysis provides a comprehensive guide for practitioners who wish to understand, construct, and analyze intelligent systems for decision support based on probabilistic networks. Intended...
Bayesian Inference in the Modern Design of Experiments
DeLoach, Richard
2008-01-01
This paper provides an elementary tutorial overview of Bayesian inference and its potential for application in aerospace experimentation in general and wind tunnel testing in particular. Bayes Theorem is reviewed and examples are provided to illustrate how it can be applied to objectively revise prior knowledge by incorporating insights subsequently obtained from additional observations, resulting in new (posterior) knowledge that combines information from both sources. A logical merger of Bayesian methods and certain aspects of Response Surface Modeling is explored. Specific applications to wind tunnel testing, computational code validation, and instrumentation calibration are discussed.
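The prior-to-posterior revision described above can be illustrated with a conjugate beta-binomial update; the prior and the observation counts below are purely hypothetical:

```python
from fractions import Fraction

# Prior belief about a success probability, encoded as Beta(a, b).
a, b = 2, 2                  # weakly informative prior centered at 0.5

# New observations: 9 successes in 12 trials (hypothetical test runs).
successes, trials = 9, 12

# With a conjugate Beta prior, Bayes' Theorem reduces to simple counting:
# the data counts are added to the prior pseudo-counts.
a_post = a + successes
b_post = b + (trials - successes)

prior_mean = Fraction(a, a + b)                      # prior estimate: 1/2
posterior_mean = Fraction(a_post, a_post + b_post)   # revised estimate: 11/16
```

The posterior mean sits between the prior mean and the raw data frequency 9/12, showing how the prior knowledge and the new observations are objectively combined.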
Uncertainty Modeling Based on Bayesian Network in Ontology Mapping
Institute of Scientific and Technical Information of China (English)
LI Yuhua; LIU Tao; SUN Xiaolin
2006-01-01
How to deal with uncertainty is crucial in exact concept mapping between ontologies. This paper presents a new framework for modeling uncertainty in ontologies based on Bayesian networks (BNs). In our approach, the Web Ontology Language (OWL) is extended to add probabilistic markups for attaching probability information; the source and target ontologies (expressed in the extended OWL) are translated into BNs, and the mapping between the two ontologies can be dug out by constructing the conditional probability tables (CPTs) of the BN using an improved algorithm named I-IPFP based on the iterative proportional fitting procedure (IPFP). The basic ideas of this framework and algorithm are validated by positive results from computer experiments.
Technical note: Bayesian calibration of dynamic ruminant nutrition models.
Reed, K F; Arhonditsis, G B; France, J; Kebreab, E
2016-08-01
Mechanistic models of ruminant digestion and metabolism have advanced our understanding of the processes underlying ruminant animal physiology. Deterministic modeling practices ignore the inherent variation within and among individual animals and thus have no way to assess how sources of error influence model outputs. We introduce Bayesian calibration of mathematical models to address the need for robust mechanistic modeling tools that can accommodate error analysis by remaining within the bounds of data-based parameter estimation. For the purpose of prediction, the Bayesian approach generates a posterior predictive distribution that represents the current estimate of the value of the response variable, taking into account both the uncertainty about the parameters and model residual variability. Predictions are expressed as probability distributions, thereby conveying significantly more information than point estimates in regard to uncertainty. Our study illustrates some of the technical advantages of Bayesian calibration and discusses the future perspectives in the context of animal nutrition modeling.
Dynamic Batch Bayesian Optimization
Azimi, Javad; Fern, Xiaoli
2011-01-01
Bayesian optimization (BO) algorithms try to optimize an unknown function that is expensive to evaluate using a minimum number of evaluations/experiments. Most of the proposed algorithms in BO are sequential, where only one experiment is selected at each iteration. This method can be time-inefficient when each experiment takes a long time and more than one experiment can be run concurrently. On the other hand, requesting a fixed-size batch of experiments at each iteration causes performance inefficiency in BO compared to sequential policies. In this paper, we present an algorithm that requests a batch of experiments at each time step t, where the batch size p_t is dynamically determined at each step. Our algorithm is based on the observation that the experiments selected by the sequential policy can sometimes be almost independent of each other. Our algorithm identifies such scenarios and requests those experiments at the same time without degrading the performance. We evaluate our proposed method us...
Case studies in Bayesian microbial risk assessments
Directory of Open Access Journals (Sweden)
Turner Joanne
2009-12-01
Abstract Background The quantification of uncertainty and variability is a key component of quantitative risk analysis. Recent advances in Bayesian statistics make it ideal for integrating multiple sources of information, of different types and quality, and providing a realistic estimate of the combined uncertainty in the final risk estimates. Methods We present two case studies related to foodborne microbial risks. In the first, we combine models to describe the sequence of events resulting in illness from consumption of milk contaminated with VTEC O157. We used Monte Carlo simulation to propagate uncertainty in some of the inputs to computer models describing the farm and pasteurisation process. Resulting simulated contamination levels were then assigned to consumption events from a dietary survey. Finally we accounted for uncertainty in the dose-response relationship and uncertainty due to limited incidence data to derive uncertainty about yearly incidences of illness in young children. Options for altering the risk were considered by running the model with different hypothetical policy-driven exposure scenarios. In the second case study we illustrate an efficient Bayesian sensitivity analysis for identifying the most important parameters of a complex computer code that simulated VTEC O157 prevalence within a managed dairy herd. This was carried out in 2 stages, first to screen out the unimportant inputs, then to perform a more detailed analysis on the remaining inputs. The method works by building a Bayesian statistical approximation to the computer code using a number of known code input/output pairs (training runs). Results We estimated that the expected total number of children aged 1.5-4.5 who become ill due to VTEC O157 in milk is 8.6 per year, with 95% uncertainty interval (0,11.5). The most extreme policy we considered was banning on-farm pasteurisation of milk, which reduced the estimate to 6.4 with 95% interval (0,11). In the second
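The Monte Carlo propagation of input uncertainty used in the first case study can be sketched generically; the exposure model and the input distributions below are purely illustrative, not those of the VTEC study:

```python
import random
import statistics

random.seed(42)

def dose_model(contamination, consumption):
    # Toy exposure model: dose is contamination level times amount consumed.
    return contamination * consumption

doses = []
for _ in range(10000):
    # Sample the uncertain inputs from assumed distributions.
    contamination = random.lognormvariate(0.0, 0.5)   # organisms per mL (assumed)
    consumption = random.uniform(100, 300)            # mL per serving (assumed)
    doses.append(dose_model(contamination, consumption))

# Summarize the propagated uncertainty in the output.
median_dose = statistics.median(doses)
p95_dose = sorted(doses)[int(0.95 * len(doses))]
```

Each draw pushes one joint sample of the uncertain inputs through the model, so the spread of `doses` reflects the combined input uncertainty rather than a single point estimate.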
Bayesian Analysis of High Dimensional Classification
Mukhopadhyay, Subhadeep; Liang, Faming
2009-12-01
Modern data mining and bioinformatics have presented an important playground for statistical learning techniques, where the number of input variables is possibly much larger than the sample size of the training data. In supervised learning, logistic regression or probit regression can be used to model a binary output and form perceptron classification rules based on Bayesian inference. In these cases, there is much interest in searching for sparse models in the high-dimensional regression/classification setup. We first discuss two common challenges in analyzing high-dimensional data. The first is the curse of dimensionality: the complexity of many existing algorithms scales exponentially with the dimensionality of the space, and by virtue of that the algorithms soon become computationally intractable and therefore inapplicable in many real applications. The second is multicollinearity among the predictors, which severely slows down the algorithms. In order to make Bayesian analysis operational in high dimensions, we propose a novel 'hierarchical stochastic approximation Monte Carlo' algorithm (HSAMC), which overcomes the curse of dimensionality and the multicollinearity of predictors in high dimensions, and also possesses a self-adjusting mechanism to avoid local minima separated by high energy barriers. Models and methods are illustrated by simulations inspired by the field of genomics. Numerical results indicate that HSAMC can work as a general model selection sampler in high-dimensional complex model spaces.
Meeravali, Noorbasha N; Madhavi, K; Manjusha, R; Kumar, Sunil Jai
2014-01-01
A sequential extraction procedure is developed for the separation of trace levels of hexachloroplatinate, cisplatin and carboplatin from soil, which are then pre-concentrated using a vesicular coacervative cloud point extraction method prior to their determination as platinum by continuum source ETAAS. Sequential extraction of carboplatin, cisplatin and hexachloroplatinate from a specific red soil is achieved by using 20% HCl, aqua regia at room temperature, and a combination of aqua regia and HF with microwave digestion, respectively. The pre-concentration of these species from the extracted solutions is based on the formation of extractable hydrophobic complexes of PtCl₆(2-) anionic species with the free cationic head-group solubilizing sites of the Triton X-114 co-surfactant stabilized TOMAC (tri-octyl methyl ammonium chloride) vesicles through electrostatic attraction. This process separates the platinum from the bulk aqueous solution into a small vesicular rich phase. The parameters affecting the extraction procedures are optimized. Under the optimized conditions, the achieved pre-concentration factor is 20 and the detection limit is 0.5 ng g(-1) for soil and 0.02 ng mL(-1) for water samples. The spiked recoveries of hexachloroplatinate, cisplatin and carboplatin in water and soil extracts in the vesicular coacervative extraction are in the range of 96-102% at 0.5-1 ng mL(-1), with relative standard deviations of 1-3%. The accuracy of the method for platinum determination is evaluated by analyzing CCRMP PTC-1a copper-nickel sulfide concentrate and BCR 723 road dust certified reference materials, and the obtained results agreed with the certified values at the 95% confidence level of Student's t-test. The results were also compared to the mixed-micelle (MM)-CPE method reported in the literature.
2015-01-01
MANUSCRIPT SOURCES Archives nationales Taille rolls (rôles de taille) 1768/71 Z1G-344/18 Aulnay Z1G-343a/02 Gennevilliers Z1G-340/01 Ivry Z1G-340/05 Orly Z1G-334c/09 Saint-Remy-lès-Chevreuse Z1G-344/18 Sevran Z1G-340/05 Thiais 1779/80 Z1G-391a/18 Aulnay Z1G-380/02 Gennevilliers Z1G-385/01 Ivry Z1G-387b/05 Orly Z1G-388a/09 Saint-Remy-lès-Chevreuse Z1G-391a/18 Sevran Z1G-387b/05 Thiais 1788/89 Z1G-451/18 Aulnay Z1G-452/21 Chennevières Z1G-443b/02 Gennevilliers Z1G-440a/01 Ivry Z1G-452/17 Noiseau Z1G-445b/05 ...
Bayesian seismic AVO inversion
Energy Technology Data Exchange (ETDEWEB)
Buland, Arild
2002-07-01
A new linearized AVO inversion technique is developed in a Bayesian framework. The objective is to obtain posterior distributions for P-wave velocity, S-wave velocity and density. Distributions for other elastic parameters can also be assessed, for example acoustic impedance, shear impedance and P-wave to S-wave velocity ratio. The inversion algorithm is based on the convolutional model and a linearized weak contrast approximation of the Zoeppritz equation. The solution is represented by a Gaussian posterior distribution with explicit expressions for the posterior expectation and covariance, hence exact prediction intervals for the inverted parameters can be computed under the specified model. The explicit analytical form of the posterior distribution provides a computationally fast inversion method. Tests on synthetic data show that all inverted parameters were almost perfectly retrieved when the noise approached zero. With realistic noise levels, acoustic impedance was the best determined parameter, while the inversion provided practically no information about the density. The inversion algorithm has also been tested on a real 3-D dataset from the Sleipner Field. The results show good agreement with well logs but the uncertainty is high. The stochastic model includes uncertainties of both the elastic parameters, the wavelet and the seismic and well log data. The posterior distribution is explored by Markov chain Monte Carlo simulation using the Gibbs sampler algorithm. The inversion algorithm has been tested on a seismic line from the Heidrun Field with two wells located on the line. The uncertainty of the estimated wavelet is low. In the Heidrun examples the effect of including uncertainty of the wavelet and the noise level was marginal with respect to the AVO inversion results. We have developed a 3-D linearized AVO inversion method with spatially coupled model parameters where the objective is to obtain posterior distributions for P-wave velocity, S
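The explicit Gaussian posterior mentioned above follows the standard linear-Gaussian formulas; below is a generic numerical sketch with a toy forward operator and assumed covariances, not the Zoeppritz-based model of the paper:

```python
import numpy as np

# Linear model d = G m + e, with prior m ~ N(m0, Cm) and noise e ~ N(0, Cd).
G = np.array([[1.0, 0.5],
              [0.2, 1.0],
              [0.7, 0.3]])              # toy forward operator (assumed)
m_true = np.array([2.0, -1.0])          # parameters used to synthesize data
rng = np.random.default_rng(0)
Cd = 0.05 * np.eye(3)                   # assumed noise covariance
d = G @ m_true + rng.multivariate_normal(np.zeros(3), Cd)

m0 = np.zeros(2)                        # prior mean
Cm = 10.0 * np.eye(2)                   # broad prior covariance

# Posterior is Gaussian with explicit mean and covariance (Kalman gain form):
K = Cm @ G.T @ np.linalg.inv(G @ Cm @ G.T + Cd)
m_post = m0 + K @ (d - G @ m0)          # posterior expectation
C_post = Cm - K @ G @ Cm                # posterior covariance
```

Because the posterior is available in closed form, prediction intervals come directly from the diagonal of `C_post`, which is what makes this style of linearized inversion computationally fast.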
Bayesian microsaccade detection
Mihali, Andra; van Opheusden, Bas; Ma, Wei Ji
2017-01-01
Microsaccades are high-velocity fixational eye movements, with special roles in perception and cognition. The default microsaccade detection method is to determine when the smoothed eye velocity exceeds a threshold. We have developed a new method, Bayesian microsaccade detection (BMD), which performs inference based on a simple statistical model of eye positions. In this model, a hidden state variable changes between drift and microsaccade states at random times. The eye position is a biased random walk with different velocity distributions for each state. BMD generates samples from the posterior probability distribution over the eye state time series given the eye position time series. Applied to simulated data, BMD recovers the “true” microsaccades with fewer errors than alternative algorithms, especially at high noise. Applied to EyeLink eye tracker data, BMD detects almost all the microsaccades detected by the default method, but also apparent microsaccades embedded in high noise—although these can also be interpreted as false positives. Next we apply the algorithms to data collected with a Dual Purkinje Image eye tracker, whose higher precision justifies defining the inferred microsaccades as ground truth. When we add artificial measurement noise, the inferences of all algorithms degrade; however, at noise levels comparable to EyeLink data, BMD recovers the “true” microsaccades with 54% fewer errors than the default algorithm. Though unsuitable for online detection, BMD has other advantages: It returns probabilities rather than binary judgments, and it can be straightforwardly adapted as the generative model is refined. We make our algorithm available as a software package. PMID:28114483
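A minimal sketch of this style of inference (not the authors' actual generative model): a two-state hidden Markov model over eye speeds, with assumed drift and microsaccade speed distributions, decoded by the forward-backward algorithm:

```python
import math
import random

random.seed(1)

# Two hidden states: 0 = drift (low eye speed), 1 = microsaccade (high speed).
MEANS = [0.5, 5.0]          # assumed mean speeds, arbitrary units
SD = 1.0                    # shared observation noise
TRANS = [[0.97, 0.03],      # hypothetical state-transition probabilities
         [0.20, 0.80]]

# Simulate a state sequence and noisy speed observations.
true_states, speeds, s = [], [], 0
for _ in range(300):
    s = s if random.random() < TRANS[s][s] else 1 - s
    true_states.append(s)
    speeds.append(random.gauss(MEANS[s], SD))

def lik(x, k):
    # Gaussian likelihood of speed x under state k (normalization cancels).
    return math.exp(-(x - MEANS[k]) ** 2 / (2.0 * SD ** 2))

n = len(speeds)
# Forward pass, normalized at each step for numerical stability.
fwd = [[lik(speeds[0], k) * 0.5 for k in range(2)]]
z = sum(fwd[0]); fwd[0] = [v / z for v in fwd[0]]
for t in range(1, n):
    row = [lik(speeds[t], k) * sum(fwd[t - 1][j] * TRANS[j][k] for j in range(2))
           for k in range(2)]
    z = sum(row); fwd.append([v / z for v in row])
# Backward pass.
bwd = [[1.0, 1.0] for _ in range(n)]
for t in range(n - 2, -1, -1):
    row = [sum(TRANS[k][j] * lik(speeds[t + 1], j) * bwd[t + 1][j] for j in range(2))
           for k in range(2)]
    z = sum(row); bwd[t] = [v / z for v in row]
# Posterior probability of a microsaccade at each time point.
posterior = []
for t in range(n):
    p = [fwd[t][k] * bwd[t][k] for k in range(2)]
    posterior.append(p[1] / sum(p))

decoded = [int(p > 0.5) for p in posterior]
accuracy = sum(d == s for d, s in zip(decoded, true_states)) / n
```

As in BMD, the output is a probability for each time point rather than a binary velocity-threshold judgment, so ambiguous samples near the decision boundary are reported as uncertain instead of being forced into one class.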
Maximum margin Bayesian network classifiers.
Pernkopf, Franz; Wohlmayr, Michael; Tschiatschek, Sebastian
2012-03-01
We present a maximum margin parameter learning algorithm for Bayesian network classifiers using a conjugate gradient (CG) method for optimization. In contrast to previous approaches, we maintain the normalization constraints on the parameters of the Bayesian network during optimization, i.e., the probabilistic interpretation of the model is not lost. This enables us to handle missing features in discriminatively optimized Bayesian networks. In experiments, we compare the classification performance of maximum margin parameter learning to conditional likelihood and maximum likelihood learning approaches. Discriminative parameter learning significantly outperforms generative maximum likelihood estimation for naive Bayes and tree augmented naive Bayes structures on all considered data sets. Furthermore, maximizing the margin dominates the conditional likelihood approach in terms of classification performance in most cases. We provide results for a recently proposed maximum margin optimization approach based on convex relaxation. While the classification results are highly similar, our CG-based optimization is computationally up to orders of magnitude faster. Margin-optimized Bayesian network classifiers achieve classification performance comparable to support vector machines (SVMs) using fewer parameters. Moreover, we show that unanticipated missing feature values during classification can be easily processed by discriminatively optimized Bayesian network classifiers, a case where discriminative classifiers usually require mechanisms to complete unknown feature values in the data first.
A Bayesian estimation of the helioseismic solar age
Bonanno, Alfio
2015-01-01
The helioseismic determination of the solar age has been a subject of several studies because it provides us with an independent estimation of the age of the solar system. We present the Bayesian estimates of the helioseismic age of the Sun, which are determined by means of calibrated solar models that employ different equations of state and nuclear reaction rates. We use 17 frequency separation ratios $r_{02}(n)=(\
Bayesian modeling using WinBUGS
Ntzoufras, Ioannis
2009-01-01
A hands-on introduction to the principles of Bayesian modeling using WinBUGS Bayesian Modeling Using WinBUGS provides an easily accessible introduction to the use of WinBUGS programming techniques in a variety of Bayesian modeling settings. The author provides an accessible treatment of the topic, offering readers a smooth introduction to the principles of Bayesian modeling with detailed guidance on the practical implementation of key principles. The book begins with a basic introduction to Bayesian inference and the WinBUGS software and goes on to cover key topics, including: Markov Chain Monte Carlo algorithms in Bayesian inference Generalized linear models Bayesian hierarchical models Predictive distribution and model checking Bayesian model and variable evaluation Computational notes and screen captures illustrate the use of both WinBUGS and R software to apply the discussed techniques. Exercises at the end of each chapter allow readers to test their understanding of the presented concepts and all ...
Bayesian Methods and Universal Darwinism
Campbell, John
2010-01-01
Bayesian methods since the time of Laplace have been understood by their practitioners as closely aligned to the scientific method. Indeed a recent champion of Bayesian methods, E. T. Jaynes, titled his textbook on the subject Probability Theory: The Logic of Science. Many philosophers of science including Karl Popper and Donald Campbell have interpreted the evolution of Science as a Darwinian process consisting of a 'copy with selective retention' algorithm abstracted from Darwin's theory of Natural Selection. Arguments are presented for an isomorphism between Bayesian Methods and Darwinian processes. Universal Darwinism, as the term has been developed by Richard Dawkins, Daniel Dennett and Susan Blackmore, is the collection of scientific theories which explain the creation and evolution of their subject matter as due to the operation of Darwinian processes. These subject matters span the fields of atomic physics, chemistry, biology and the social sciences. The principle of Maximum Entropy states that system...
Attention in a bayesian framework
DEFF Research Database (Denmark)
Whiteley, Louise Emma; Sahani, Maneesh
2012-01-01
The behavioral phenomena of sensory attention are thought to reflect the allocation of a limited processing resource, but there is little consensus on the nature of the resource or why it should be limited. Here we argue that a fundamental bottleneck emerges naturally within Bayesian models of perception, and use this observation to frame a new computational account of the need for, and action of, attention - unifying diverse attentional phenomena in a way that goes beyond previous inferential, probabilistic and Bayesian models. Attentional effects are most evident in cluttered environments, and include both selective phenomena, where attention is invoked by cues that point to particular stimuli, and integrative phenomena, where attention is invoked dynamically by endogenous processing. However, most previous Bayesian accounts of attention have focused on describing relatively simple experimental...
Probability biases as Bayesian inference
Directory of Open Access Journals (Sweden)
André C. R. Martins
2006-11-01
In this article, I will show how several observed biases in human probabilistic reasoning can be partially explained as good heuristics for making inferences in an environment where probabilities have uncertainties associated to them. Previous results show that the weight functions and the observed violations of coalescing and stochastic dominance can be understood from a Bayesian point of view. We will review those results and see that Bayesian methods should also be used as part of the explanation behind other known biases. That means that, although the observed errors are still errors under the conditions in which they were observed, they can be understood as adaptations to the solution of real life problems. Heuristics that allow fast evaluations and mimic a Bayesian inference would be an evolutionary advantage, since they would give us an efficient way of making decisions. In that sense, it should be no surprise that humans reason with probability as it has been observed.
Isomer separation of {sup 70g}Cu and {sup 70m}Cu with a resonance ionization laser ion source
Energy Technology Data Exchange (ETDEWEB)
Koester, U. E-mail: ulli.koster@cern.ch; Fedoseyev, V.N.; Mishin, V.I.; Weissman, L.; Huyse, M.; Kruglov, K.; Mueller, W.F.; Duppen, P. van; Roosbroeck, J. van; Thirolf, P.; Thomas, H.G.; Weisshaar, D.; Schulze, W.; Borcea, R.; La Commara, M.; Schatz, H.; Schmidt, K.; Roettger, S.; Huber, G.; Sebastian, V.; Kratz, K.L.; Catherall, R.; Georg, U.; Lettry, J.; Oinonen, M.; Ravn, H.L.; Simon, H
2000-04-01
Radioactive copper isotopes were ionized with the resonance ionization laser ion source (RILIS) at the on-line isotope separator ISOLDE (CERN). Using the different hyperfine structure in the 3d{sup 10} 4s {sup 2}S{sub 1/2} - 3d{sup 10} 4p {sup 2}P{sup 0}{sub 1/2} transition, the low- and high-spin isomers of {sup 70}Cu were selectively enhanced by tuning the laser wavelength. The light was provided by a narrow-bandwidth dye laser pumped by copper vapor lasers (CVL) and frequency doubled in a BBO crystal. The ground state to isomeric state intensity ratio could be varied by a factor of 30, allowing gamma transitions to be assigned unambiguously to the decay of the individual isomers. It is shown that the method can also be used to determine magnetic moments. In a first experiment, a magnetic moment of (+)1.8(3) {mu}{sub N} could be deduced for the 1{sup +} ground state of {sup 70}Cu, and ({+-})1.2(3) {mu}{sub N} for the high-spin isomer.
Bayesian Missile System Reliability from Point Estimates
2014-10-28
This report applies the Maximum Entropy Principle (MEP) to convert point estimates to probability distributions to be used as priors for Bayesian reliability analysis of missile data, and illustrates the approach by applying the priors to a Bayesian reliability model of a missile system. Subject terms: priors, Bayesian, missile.
Perception, illusions and Bayesian inference.
Nour, Matthew M; Nour, Joseph M
2015-01-01
Descriptive psychopathology makes a distinction between veridical perception and illusory perception. In both cases a perception is tied to a sensory stimulus, but in illusions the perception is of a false object. This article re-examines this distinction in light of new work in theoretical and computational neurobiology, which views all perception as a form of Bayesian statistical inference that combines sensory signals with prior expectations. Bayesian perceptual inference can solve the 'inverse optics' problem of veridical perception and provides a biologically plausible account of a number of illusory phenomena, suggesting that veridical and illusory perceptions are generated by precisely the same inferential mechanisms.
Bayesian test and Kuhn's paradigm
Institute of Scientific and Technical Information of China (English)
Chen Xiaoping
2006-01-01
Kuhn's theory of paradigm reveals a pattern of scientific progress, in which normal science alternates with scientific revolution. But Kuhn greatly underrated the function of scientific testing in his pattern, because he focuses all his attention on the hypothetico-deductive schema instead of the Bayesian schema. This paper employs the Bayesian schema to re-examine Kuhn's theory of paradigm, to uncover its logical and rational components, and to illustrate the tensional structure of logic and belief, rationality and irrationality, in the process of scientific revolution.
3D Bayesian contextual classifiers
DEFF Research Database (Denmark)
Larsen, Rasmus
2000-01-01
We extend a series of multivariate Bayesian 2-D contextual classifiers to 3-D by specifying a simultaneous Gaussian distribution for the feature vectors as well as a prior distribution of the class variables of a pixel and its 6 nearest 3-D neighbours.
A Bayesian Nonparametric Approach to Test Equating
Karabatsos, George; Walker, Stephen G.
2009-01-01
A Bayesian nonparametric model is introduced for score equating. It is applicable to all major equating designs, and has advantages over previous equating models. Unlike the previous models, the Bayesian model accounts for positive dependence between distributions of scores from two tests. The Bayesian model and the previous equating models are…
Bayesian Model Averaging for Propensity Score Analysis
Kaplan, David; Chen, Jianshen
2013-01-01
The purpose of this study is to explore Bayesian model averaging in the propensity score context. Previous research on Bayesian propensity score analysis does not take into account model uncertainty. In this regard, an internally consistent Bayesian framework for model building and estimation must also account for model uncertainty. The…
Bayesian networks and food security - An introduction
Stein, A.
2004-01-01
This paper gives an introduction to Bayesian networks. Networks are defined and put into a Bayesian context. Directed acyclic graphs play a crucial role here. Two simple examples from food security are addressed. Possible uses of Bayesian networks for implementation and further use in decision support
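The enumeration-style inference over a directed acyclic graph that such an introduction typically covers can be sketched numerically. The toy network and numbers below are the classic textbook "wet grass" example, not taken from the paper above:

```python
from itertools import product

# Toy DAG: Cloudy -> Sprinkler, Cloudy -> Rain, (Sprinkler, Rain) -> WetGrass
# (illustrative textbook probabilities, not from the paper)
P_C = 0.5                                         # P(Cloudy)
P_S = {True: 0.1, False: 0.5}                     # P(Sprinkler | Cloudy)
P_R = {True: 0.8, False: 0.2}                     # P(Rain | Cloudy)
P_W = {(True, True): 0.99, (True, False): 0.90,
       (False, True): 0.90, (False, False): 0.0}  # P(Wet | Sprinkler, Rain)

def joint(c, s, r, w):
    """Joint probability factorizes along the edges of the DAG."""
    p = P_C if c else 1 - P_C
    p *= P_S[c] if s else 1 - P_S[c]
    p *= P_R[c] if r else 1 - P_R[c]
    p *= P_W[(s, r)] if w else 1 - P_W[(s, r)]
    return p

# Inference by enumeration: P(Rain = True | WetGrass = True)
num = sum(joint(c, s, True, True) for c, s in product((True, False), repeat=2))
den = sum(joint(c, s, r, True) for c, s, r in product((True, False), repeat=3))
p_rain_given_wet = num / den
```

Enumeration sums the factored joint over all unobserved variables, so it is exponential in the number of variables; practical systems use junction-tree or sampling methods instead.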
Plug & Play object oriented Bayesian networks
DEFF Research Database (Denmark)
Bangsø, Olav; Flores, J.; Jensen, Finn Verner
2003-01-01
Object oriented Bayesian networks have proven themselves useful in recent years. The idea of applying an object oriented approach to Bayesian networks has extended their scope to larger domains that can be divided into autonomous but interrelated entities. Object oriented Bayesian networks have b...
MERGING DIGITAL SURFACE MODELS IMPLEMENTING BAYESIAN APPROACHES
Directory of Open Access Journals (Sweden)
H. Sadeq
2016-06-01
In this research different DSMs from different sources have been merged. The merging is based on a probabilistic model using a Bayesian Approach. The implemented data have been sourced from very high resolution satellite imagery sensors (e.g. WorldView-1 and Pleiades). It is deemed preferable to use a Bayesian Approach when the data obtained from the sensors are limited and it is difficult or very costly to obtain many measurements; the problem of the lack of data can thus be solved by introducing a priori estimations of data. To infer the prior data, it is assumed that the roofs of the buildings are smooth, and for that purpose local entropy has been implemented. In addition to the a priori estimations, GNSS RTK measurements have been collected in the field, which are used as check points to assess the quality of the DSMs and to validate the merging result. The model has been applied in the West End of Glasgow, containing different kinds of buildings, such as flat-roofed and hipped-roofed buildings. Both quantitative and qualitative methods have been employed to validate the merged DSM. The validation results have shown that the model was able to improve the quality of the DSMs, improving characteristics such as the roof surfaces, which consequently led to better representations. In addition, the developed model has been compared with the well-established Maximum Likelihood model and showed similar quantitative statistical results and better qualitative results. Although the proposed model has been applied on DSMs that were derived from satellite imagery, it can be applied to any other sourced DSMs.
Merging Digital Surface Models Implementing Bayesian Approaches
Sadeq, H.; Drummond, J.; Li, Z.
2016-06-01
In this research different DSMs from different sources have been merged. The merging is based on a probabilistic model using a Bayesian Approach. The implemented data have been sourced from very high resolution satellite imagery sensors (e.g. WorldView-1 and Pleiades). It is deemed preferable to use a Bayesian Approach when the data obtained from the sensors are limited and it is difficult or very costly to obtain many measurements; the problem of the lack of data can thus be solved by introducing a priori estimations of data. To infer the prior data, it is assumed that the roofs of the buildings are smooth, and for that purpose local entropy has been implemented. In addition to the a priori estimations, GNSS RTK measurements have been collected in the field, which are used as check points to assess the quality of the DSMs and to validate the merging result. The model has been applied in the West End of Glasgow, containing different kinds of buildings, such as flat-roofed and hipped-roofed buildings. Both quantitative and qualitative methods have been employed to validate the merged DSM. The validation results have shown that the model was able to improve the quality of the DSMs, improving characteristics such as the roof surfaces, which consequently led to better representations. In addition, the developed model has been compared with the well-established Maximum Likelihood model and showed similar quantitative statistical results and better qualitative results. Although the proposed model has been applied on DSMs that were derived from satellite imagery, it can be applied to any other sourced DSMs.
Application of Wavelet Denoising Algorithm in Noisy Blind Source Separation
Institute of Scientific and Technical Information of China (English)
吴微; 彭华; 王彬
2015-01-01
Blind source separation (BSS) algorithms based on the noise-free model are not applicable when the SNR is low. One way to deal with this issue is first to denoise the mixtures corrupted by white Gaussian noise and then apply the BSS algorithms. Therefore, a Waveshrink algorithm based on translation invariance is proposed to denoise mixtures with strong noise. A sliding-window method on the high-frequency coefficients is used to estimate the noise variance accurately, and the BayesShrink algorithm is used to obtain a more reasonable threshold. Consequently, the range of the translation invariance is narrowed without degrading the denoising performance, thus reducing the computational load. Simulation results indicate that the proposed approach performs better in denoising than the traditional Waveshrink algorithm and can remarkably enhance the separation performance of BSS algorithms, especially at low SNRs.
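As a rough illustration of the denoising step this abstract describes, here is a minimal one-level Haar implementation of BayesShrink soft thresholding. This is a sketch under simplifying assumptions (invented test signal and noise level); the paper's algorithm additionally uses translation invariance and a sliding-window noise estimate:

```python
import numpy as np

def haar_step(x):
    # one-level Haar transform: average (approximation) and difference (detail)
    a = (x[0::2] + x[1::2]) / np.sqrt(2)
    d = (x[0::2] - x[1::2]) / np.sqrt(2)
    return a, d

def inverse_haar_step(a, d):
    x = np.empty(a.size * 2)
    x[0::2] = (a + d) / np.sqrt(2)
    x[1::2] = (a - d) / np.sqrt(2)
    return x

def bayes_shrink_threshold(d):
    # noise std from the median absolute deviation of the detail coefficients
    sigma = np.median(np.abs(d)) / 0.6745
    # signal std estimate; the BayesShrink threshold is sigma^2 / sigma_x
    sigma_x = np.sqrt(max(np.mean(d ** 2) - sigma ** 2, 1e-12))
    return sigma ** 2 / sigma_x

def denoise(x):
    a, d = haar_step(x)
    t = bayes_shrink_threshold(d)
    d = np.sign(d) * np.maximum(np.abs(d) - t, 0.0)  # soft thresholding
    return inverse_haar_step(a, d)

# toy demo: noisy sine wave (signal and noise level are invented)
rng = np.random.default_rng(0)
t_axis = np.linspace(0.0, 1.0, 256, endpoint=False)
clean = np.sin(2 * np.pi * t_axis)
noisy = clean + rng.normal(0.0, 0.5, t_axis.size)
denoised = denoise(noisy)
mse_noisy = float(np.mean((noisy - clean) ** 2))
mse_denoised = float(np.mean((denoised - clean) ** 2))
```

A full implementation would recurse over several wavelet levels (e.g. with PyWavelets) and threshold each detail band separately; one level suffices to show the thresholding rule.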
Naive Bayesian for Email Filtering
Institute of Scientific and Technical Information of China (English)
无
2002-01-01
The paper presents a method of email filtering based on Naive Bayesian theory that can effectively filter junk mail and illegal mail. Furthermore, the keys of implementation are discussed in detail. The filtering model is obtained from a training set of email. The filtering can be done without the user's specification of filtering rules.
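A minimal sketch of the kind of naive Bayes filter the abstract describes, with add-one (Laplace) smoothing; the toy training data are invented for illustration:

```python
import math
from collections import Counter

class NaiveBayesFilter:
    """Multinomial naive Bayes with add-one (Laplace) smoothing."""

    def __init__(self):
        self.word_counts = {"spam": Counter(), "ham": Counter()}
        self.class_counts = Counter()

    def train(self, tokens, label):
        self.class_counts[label] += 1
        self.word_counts[label].update(tokens)

    def score(self, tokens, label):
        # log prior + smoothed log likelihood of each token
        total = sum(self.class_counts.values())
        logp = math.log(self.class_counts[label] / total)
        n_words = sum(self.word_counts[label].values())
        vocab = len(set(self.word_counts["spam"]) | set(self.word_counts["ham"]))
        for w in tokens:
            logp += math.log((self.word_counts[label][w] + 1) / (n_words + vocab))
        return logp

    def classify(self, tokens):
        return max(("spam", "ham"), key=lambda c: self.score(tokens, c))

# invented toy training data
nb = NaiveBayesFilter()
nb.train(["win", "money", "now"], "spam")
nb.train(["free", "money"], "spam")
nb.train(["meeting", "tomorrow"], "ham")
nb.train(["project", "meeting", "notes"], "ham")
pred_spam = nb.classify(["free", "money", "now"])
pred_ham = nb.classify(["meeting", "notes"])
```

Learning word statistics from a labeled training set is exactly what lets such a filter run without user-written rules.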
Bayesian analysis of binary sequences
Torney, David C.
2005-03-01
This manuscript details Bayesian methodology for "learning by example", with binary n-sequences encoding the objects under consideration. Priors prove influential; conformable priors are described. Laplace approximation of Bayes integrals yields posterior likelihoods for all n-sequences. This involves the optimization of a definite function over a convex domain--efficiently effectuated by the sequential application of the quadratic program.
Bayesian NL interpretation and learning
Zeevat, H.
2011-01-01
Everyday natural language communication is normally successful, even though contemporary computational linguistics has shown that NL is characterised by a very high degree of ambiguity and the results of stochastic methods are not good enough to explain the high success rate. Bayesian natural language
ANALYSIS OF BAYESIAN CLASSIFIER ACCURACY
Directory of Open Access Journals (Sweden)
Felipe Schneider Costa
2013-01-01
The naïve Bayes classifier is considered one of the most effective classification algorithms today, competing with more modern and sophisticated classifiers. Despite being based on the unrealistic (naïve) assumption that all variables are independent given the output class, the classifier provides proper results. However, depending on the scenario utilized (network structure, number of samples or training cases, number of variables), the network may not provide appropriate results. This study uses a variable selection process, applying the chi-squared test to verify the existence of dependence between variables in the data model, in order to identify the reasons which prevent a Bayesian network from providing good performance. A detailed analysis of the data is also proposed, unlike other existing work, as well as adjustments in the case of limit values between two adjacent classes. Furthermore, variable weights, calculated with the mutual information function, are used in the calculation of a posteriori probabilities. Tests were applied in both a naïve Bayesian network and a hierarchical Bayesian network. After testing, a significant reduction in error rate was observed: the naïve Bayesian network showed a drop in error rate from twenty-five percent to five percent relative to the initial classification results, while in the hierarchical network the fifteen percent error rate not only dropped but reached zero.
Bayesian inference for Hawkes processes
DEFF Research Database (Denmark)
Rasmussen, Jakob Gulddahl
The Hawkes process is a practically and theoretically important class of point processes, but parameter-estimation for such a process can pose various problems. In this paper we explore and compare two approaches to Bayesian inference. The first approach is based on the so-called conditional...
Bayesian Classification of Image Structures
DEFF Research Database (Denmark)
Goswami, Dibyendu; Kalkan, Sinan; Krüger, Norbert
2009-01-01
In this paper, we describe work on Bayesian classifiers for distinguishing between homogeneous structures, textures, edges and junctions. We build semi-local classifiers from hand-labeled images to distinguish between these four different kinds of structures based on the concept of intrinsic dimensi...
3-D contextual Bayesian classifiers
DEFF Research Database (Denmark)
Larsen, Rasmus
In this paper we will consider extensions of a series of Bayesian 2-D contextual classification procedures proposed by Owen (1984), Hjort & Mohn (1984), Welch & Salter (1971) and Haslett (1985) to 3 spatial dimensions. It is evident that compared to classical pixelwise classification further...
Bayesian Networks and Influence Diagrams
DEFF Research Database (Denmark)
Kjærulff, Uffe Bro; Madsen, Anders Læsø
Bayesian Networks and Influence Diagrams: A Guide to Construction and Analysis, Second Edition, provides a comprehensive guide for practitioners who wish to understand, construct, and analyze intelligent systems for decision support based on probabilistic networks. This new edition contains six new...
Bayesian image restoration, using configurations
DEFF Research Database (Denmark)
Thorarinsdottir, Thordis
configurations are expressed in terms of the mean normal measure of the random set. These probabilities are used as prior probabilities in a Bayesian image restoration approach. Estimation of the remaining parameters in the model is outlined for salt and pepper noise. The inference in the model is discussed...
Bayesian image restoration, using configurations
DEFF Research Database (Denmark)
Thorarinsdottir, Thordis Linda
2006-01-01
configurations are expressed in terms of the mean normal measure of the random set. These probabilities are used as prior probabilities in a Bayesian image restoration approach. Estimation of the remaining parameters in the model is outlined for the salt and pepper noise. The inference in the model is discussed...
Bayesian Evidence and Model Selection
Knuth, Kevin H; Malakar, Nabin K; Mubeen, Asim M; Placek, Ben
2014-01-01
In this paper we review the concept of the Bayesian evidence and its application to model selection. The theory is presented along with a discussion of analytic, approximate and numerical techniques. Applications to several practical examples within the context of signal processing are discussed.
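The Bayesian evidence reviewed in this paper is the marginal likelihood of the data under a model. A minimal numerical sketch (toy coin-flip data, not from the paper) compares a fixed-parameter model against one that integrates the likelihood over a uniform prior:

```python
import numpy as np
from math import factorial

h, n = 9, 12  # toy data: 9 heads in 12 tosses

# Model 1: fair coin (theta fixed at 0.5); the evidence is just the likelihood
ev_fair = 0.5 ** n

# Model 2: unknown bias with a uniform prior; the evidence integrates the
# likelihood over the prior (trapezoidal rule on a fine grid)
theta = np.linspace(0.0, 1.0, 20001)
like = theta ** h * (1.0 - theta) ** (n - h)
ev_uniform = float(((like[:-1] + like[1:]) / 2.0 * np.diff(theta)).sum())

# the exact value of this integral is the Beta function B(h+1, n-h+1)
analytic = factorial(h) * factorial(n - h) / factorial(n + 1)

bayes_factor = ev_uniform / ev_fair  # > 1 favours the unknown-bias model
```

Because the evidence integrates over the prior, it automatically penalizes the more flexible model; here the data are skewed enough that the unknown-bias model still wins, but only by a modest Bayes factor.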
Differentiated Bayesian Conjoint Choice Designs
Z. Sándor (Zsolt); M. Wedel (Michel)
2003-01-01
Previous conjoint choice design construction procedures have produced a single design that is administered to all subjects. This paper proposes to construct a limited set of different designs. The designs are constructed in a Bayesian fashion, taking into account prior uncertainty about
Separating Underdetermined Convolutive Speech Mixtures
DEFF Research Database (Denmark)
Pedersen, Michael Syskind; Wang, DeLiang; Larsen, Jan
2006-01-01
a method for underdetermined blind source separation of convolutive mixtures. The proposed framework is applicable for separation of instantaneous as well as convolutive speech mixtures. It is possible to iteratively extract each speech signal from the mixture by combining blind source separation...
A Comparison of BBN, ADTree and MLP in separating Quasars from Large Survey Catalogues
Institute of Scientific and Technical Information of China (English)
Yan-Xia Zhang; Yong-Heng Zhao
2007-01-01
We compare the performance of Bayesian Belief Networks (BBN), Multilayer Perceptron (MLP) networks and Alternating Decision Trees (ADtree) on separating quasars from stars with the database from the 2MASS and FIRST survey catalogs. Having a training sample of sources of known object types, the classifiers are trained to separate quasars from stars. Based on the statistical properties of the sample, the features important for classification are selected. We compare the classification results with and without feature selection. Experiments show that the results with feature selection are better than those without. From the high accuracy found, it is concluded that these automated methods are robust and effective for classifying point sources. They may all be applied to large survey projects (e.g. selecting input catalogs) and to other astronomical issues, such as the parameter measurement of stars and the redshift estimation of galaxies and quasars.
BBN, ADTree and MLP Comparison in Separating Quasars from Large Survey Catalogues
Zhang, Y
2006-01-01
We compare the performance of Bayesian Belief Networks (BBN), Multilayer Perceptron (MLP) networks and Alternating Decision Trees (ADtree) on separating quasars from stars with the database from the 2MASS and FIRST survey catalogs. Having a training sample of sources of known object types, the classifiers are trained to separate quasars from stars. By the statistical properties of the sample, the features important for classification are selected. We compare the classification results with and without feature selection. Experiments show that the results with feature selection are better than those without feature selection. From the high accuracy, it is concluded that these automated methods are robust and effective for classifying point sources; moreover, they may all be applied to large survey projects (e.g. selecting input catalogs) and to other astronomical issues, such as the parameter measurement of stars and the redshift estimation of galaxies and quasars.
Bayesian Alternation During Tactile Augmentation
Directory of Open Access Journals (Sweden)
Caspar Mathias Goeke
2016-10-01
A large number of studies suggest that the integration of multisensory signals by humans is well described by Bayesian principles. However, there are very few reports about cue combination between a native and an augmented sense. In particular, we asked whether adult participants are able to integrate an augmented sensory cue with existing native sensory information. For the purpose of this study we built a tactile augmentation device and compared different hypotheses of how untrained adult participants combine information from a native and an augmented sense. In a two-interval forced choice (2IFC) task, while subjects were blindfolded and seated on a rotating platform, our sensory augmentation device translated information on whole-body yaw rotation into tactile stimulation. Three conditions were realized: tactile stimulation only (augmented condition), rotation only (native condition), and both augmented and native information (bimodal condition). Participants had to choose the one of two consecutive rotations with the higher angular rotation. For the analysis, we fitted the participants' responses with a probit model and calculated the just noticeable difference (JND). We then compared several models for predicting bimodal from unimodal responses. An objective Bayesian alternation model yielded a better prediction (reduced χ² = 1.67) than the Bayesian integration model (reduced χ² = 4.34). A non-Bayesian winner-takes-all model showed slightly higher accuracy (reduced χ² = 1.64), using either only native or only augmented values per subject for prediction. However, the performance of the Bayesian alternation model could be substantially improved (reduced χ² = 1.09) by utilizing subjective weights obtained from a questionnaire. As a result, the subjective Bayesian alternation model predicted bimodal performance most accurately among all tested models. These results suggest that information from augmented and existing sensory modalities in
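The integration and alternation predictions compared in this study can be illustrated with standard JND formulas. One common formalization is sketched below with invented unimodal JNDs; the exact model details in the paper may differ:

```python
import math

def jnd_integration(j1, j2):
    # optimal Bayesian integration: precisions (inverse variances) add,
    # so the combined JND is below both unimodal JNDs
    return math.sqrt((j1 ** 2 * j2 ** 2) / (j1 ** 2 + j2 ** 2))

def jnd_alternation(j1, j2, w):
    # alternation: each trial uses one cue alone; the response distribution
    # is a mixture, so variances combine with the switching weight w
    return math.sqrt(w * j1 ** 2 + (1.0 - w) * j2 ** 2)

# invented unimodal JNDs (e.g. in degrees of rotation)
jnd_bi_integration = jnd_integration(2.0, 3.0)       # below both unimodal JNDs
jnd_bi_alternation = jnd_alternation(2.0, 3.0, 0.5)  # between the two
```

The qualitative signature the study exploits is visible here: integration predicts bimodal performance better than either cue alone, while alternation predicts performance between the two unimodal levels.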
Discussion on Nonlinear Functions of the Blind Source Separation
Institute of Scientific and Technical Information of China (English)
谢胜利; 何昭水; 章晋龙; 傅予力
2005-01-01
This paper proposes a new algorithm for blind source separation (BSS). The algorithm can overcome the difficulty known as "fewer sensors than source signals" and works effectively in that case. The paper then discusses the nonlinear functions used in the new algorithm. A uniform nonlinear function is proposed and criteria are given for choosing its parameters. Finally, simulations are presented to show the effectiveness of the algorithm and the correctness of the criteria.
Naroznova, Irina; Møller, Jacob; Scheutz, Charlotte
2016-12-01
This study compared the environmental profiles of anaerobic digestion (AD) and incineration, in relation to global warming potential (GWP), for treating individual material fractions that may occur in source-separated organic household waste (SSOHW). Different framework conditions representative for the European Union member countries were considered. For AD, biogas utilisation with a biogas engine was considered and two potential situations investigated - biogas combustion with (1) combined heat and power production (CHP) and (2) electricity production only. For incineration, four technology options currently available in Europe were covered: (1) an average incinerator with CHP production, (2) an average incinerator with mainly electricity production, (3) an average incinerator with mainly heat production and (4) a state-of-the-art incinerator with CHP working at high energy recovery efficiencies. The study was performed using a life cycle assessment in its consequential approach. Furthermore, the role of waste-sorting guidelines (defined by the material fractions allowed for SSOHW) in relation to GWP of treating overall SSOHW with AD was investigated. A case study of treating 1 tonne of SSOHW under framework conditions in Denmark was conducted. Under the given assumptions, vegetable food waste was the only material fraction which was always better for AD compared to incineration. For animal food waste, kitchen tissue, vegetation waste and dirty paper, AD utilisation was better unless it was compared to a highly efficient incinerator. Material fractions such as moulded fibres and dirty cardboard were attractive for AD, albeit only when AD with CHP and incineration with mainly heat production were compared. Animal straw, in contrast, was always better to incinerate. Considering the total amounts of individual material fractions in waste generated within households in Denmark, food waste (both animal and vegetable derived) and kitchen tissue are the main material
Bayesian analysis of rare events
Energy Technology Data Exchange (ETDEWEB)
Straub, Daniel, E-mail: straub@tum.de; Papaioannou, Iason; Betz, Wolfgang
2016-06-01
In many areas of engineering and science there is an interest in predicting the probability of rare events, in particular in applications related to safety and security. Increasingly, such predictions are made through computer models of physical systems in an uncertainty quantification framework. Additionally, with advances in IT, monitoring and sensor technology, an increasing amount of data on the performance of the systems is collected. This data can be used to reduce uncertainty, improve the probability estimates and consequently enhance the management of rare events and associated risks. Bayesian analysis is the ideal method to include the data into the probabilistic model. It ensures a consistent probabilistic treatment of uncertainty, which is central in the prediction of rare events, where extrapolation from the domain of observation is common. We present a framework for performing Bayesian updating of rare event probabilities, termed BUS. It is based on a reinterpretation of the classical rejection-sampling approach to Bayesian analysis, which enables the use of established methods for estimating probabilities of rare events. By drawing upon these methods, the framework makes use of their computational efficiency. These methods include the First-Order Reliability Method (FORM), tailored importance sampling (IS) methods and Subset Simulation (SuS). In this contribution, we briefly review these methods in the context of the BUS framework and investigate their applicability to Bayesian analysis of rare events in different settings. We find that, for some applications, FORM can be highly efficient and is surprisingly accurate, enabling Bayesian analysis of rare events with just a few model evaluations. In a general setting, BUS implemented through IS and SuS is more robust and flexible.
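The core of the BUS framework described above is the classical rejection-sampling view of Bayesian updating: draw from the prior and accept each sample with probability proportional to its likelihood. A minimal sketch on a conjugate toy problem (numbers invented, chosen so the exact posterior is known) shows the idea, including the estimation of a posterior rare-event probability from the accepted samples:

```python
import numpy as np

rng = np.random.default_rng(0)

def bus_update(prior_samples, log_like, log_like_max):
    # classical rejection sampling: accept each prior sample with
    # probability L(sample) / L_max; survivors are posterior samples
    u = rng.uniform(size=prior_samples.size)
    accept = u < np.exp(log_like(prior_samples) - log_like_max)
    return prior_samples[accept]

# toy conjugate problem: prior N(0, 1), one observation x = 1.2 with noise
# std 0.5, so the exact posterior is N(0.96, 0.2)
x, s = 1.2, 0.5
prior = rng.normal(0.0, 1.0, 200_000)
log_like = lambda t: -0.5 * ((x - t) / s) ** 2
post = bus_update(prior, log_like, 0.0)  # log-likelihood peaks at 0 when t = x
rare = float(np.mean(post > 2.0))        # posterior P of the rare event {t > 2}
```

Plain rejection sampling wastes most draws when the likelihood is peaked; the point of BUS is to replace this brute-force step with structural reliability methods (FORM, importance sampling, Subset Simulation) that target the acceptance region efficiently.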
Institute of Scientific and Technical Information of China (English)
王彪; 朱志慧; 戴跃伟
2016-01-01
Most existing DOA estimation algorithms based on the CS-MMV (Compressed Sensing-Multiple Measurement Vectors) model assume independent and identically distributed (i.i.d.) sources and are built on the spatial structure of the signals; when processing source signals with temporal structure they show poor performance and robustness. This paper therefore proposes a DOA estimation algorithm based on sparse Bayesian learning (SBL) with temporally correlated source vectors. The method describes the underwater acoustic signal as a first-order autoregressive (AR) process, so that the temporal structure of the sources is fully incorporated into the DOA estimation model. A multiple-measurement-vector SBL algorithm is then used to reconstruct the signal spatial spectrum, and a CS model for recovering the unknown sparse sources from multiple measurement vectors is established to complete the DOA estimation. Simulation results show that the method offers higher spatial resolution and estimation accuracy than traditional algorithms, together with strong resistance to interference.
STATISTICAL BAYESIAN ANALYSIS OF EXPERIMENTAL DATA.
Directory of Open Access Journals (Sweden)
AHLAM LABDAOUI
2012-12-01
Full Text Available The Bayesian researcher should know the basic ideas underlying Bayesian methodology and the computational tools used in modern Bayesian econometrics. Some of the most important methods of posterior simulation are Monte Carlo integration, importance sampling, Gibbs sampling and the Metropolis-Hastings algorithm. The Bayesian should also be able to put the theory and computational tools together in the context of substantive empirical problems. We focus primarily on recent developments in Bayesian computation, and then on particular models, inevitably combining theory and computation in the context of those models. Although we have tried to be reasonably complete in terms of covering the basic ideas of Bayesian theory and the computational tools most commonly used by the Bayesian, there is no way we can cover all the classes of models used in econometrics. We illustrate the approach with the analysis of variance and the linear regression model.
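As a minimal sketch of one of the posterior simulators named above, here is a random-walk Metropolis-Hastings sampler for a toy one-dimensional posterior (the target density and tuning are illustrative assumptions, not from the text):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy target: log-posterior of a standard normal (up to a constant).
def log_post(x):
    return -0.5 * x * x

# Random-walk Metropolis-Hastings: propose x' = x + step * N(0,1) and
# accept with probability min(1, post(x') / post(x)).
def metropolis(n_iter=50_000, step=1.0):
    x, chain = 0.0, []
    for _ in range(n_iter):
        prop = x + step * rng.standard_normal()
        if np.log(rng.uniform()) < log_post(prop) - log_post(x):
            x = prop
        chain.append(x)  # keep the current state whether or not we moved
    return np.array(chain)

draws = metropolis()
print(round(draws.mean(), 1), round(draws.std(), 1))
```

The chain's sample mean and standard deviation should approach 0 and 1, the moments of the target; Gibbs sampling replaces the accept/reject step with exact draws from full conditionals when those are available.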
Bayesian methods for measures of agreement
Broemeling, Lyle D
2009-01-01
Using WinBUGS to implement Bayesian inferences of estimation and testing hypotheses, Bayesian Methods for Measures of Agreement presents useful methods for the design and analysis of agreement studies. It focuses on agreement among the various players in the diagnostic process. The author employs a Bayesian approach to provide statistical inferences based on various models of intra- and interrater agreement. He presents many examples that illustrate the Bayesian mode of reasoning and explains elements of a Bayesian application, including prior information, experimental information, the likelihood function, posterior distribution, and predictive distribution. The appendices provide the necessary theoretical foundation to understand Bayesian methods as well as introduce the fundamentals of programming and executing the WinBUGS software. Taking a Bayesian approach to inference, this hands-on book explores numerous measures of agreement, including the Kappa coefficient, the G coefficient, and intraclass correlation...
Directory of Open Access Journals (Sweden)
Futoshi Asano
2004-09-01
Full Text Available A method of detecting speech events in a multiple-sound-source condition using audio and video information is proposed. For detecting speech events, sound localization using a microphone array and human tracking by stereo vision are combined by a Bayesian network. From the inference results of the Bayesian network, information on the time and location of speech events can be obtained. The information on the detected speech events is then utilized in the robust speech interface. A maximum likelihood adaptive beamformer is employed as a preprocessor of the speech recognizer to separate the speech signal from environmental noise. The coefficients of the beamformer are kept updated based on the information of the speech events. The information on the speech events is also used by the speech recognizer for extracting the speech segment.
Confirmation via Analogue Simulation: A Bayesian Analysis
Dardashti, Radin; Thebault, Karim P Y; Winsberg, Eric
2016-01-01
Analogue simulation is a novel mode of scientific inference found increasingly within modern physics, and yet all but neglected in the philosophical literature. Experiments conducted upon a table-top 'source system' are taken to provide insight into features of an inaccessible 'target system', based upon a syntactic isomorphism between the relevant modelling frameworks. An important example is the use of acoustic 'dumb hole' systems to simulate gravitational black holes. In a recent paper it was argued that there exist circumstances in which confirmation via analogue simulation can obtain; in particular, when the robustness of the isomorphism is established via universality arguments. The current paper supports these claims via an analysis in terms of Bayesian confirmation theory.
Bayesian calibration of simultaneity in audiovisual temporal order judgments.
Directory of Open Access Journals (Sweden)
Shinya Yamamoto
Full Text Available After repeated exposures to two successive audiovisual stimuli presented in one frequent order, participants eventually perceive a pair separated by some lag time in the same order as occurring simultaneously (lag adaptation. In contrast, we previously found that perceptual changes occurred in the opposite direction in response to tactile stimuli, conforming to Bayesian integration theory (Bayesian calibration. We further showed, in theory, that the effect of Bayesian calibration cannot be observed when the lag adaptation was fully operational. This led to the hypothesis that Bayesian calibration affects judgments regarding the order of audiovisual stimuli, but that this effect is concealed behind the lag adaptation mechanism. In the present study, we showed that lag adaptation is pitch-insensitive using two sounds at 1046 and 1480 Hz. This enabled us to cancel lag adaptation by associating one pitch with sound-first stimuli and the other with light-first stimuli. When we presented each type of stimulus (high- or low-tone in a different block, the point of simultaneity shifted to "sound-first" for the pitch associated with sound-first stimuli, and to "light-first" for the pitch associated with light-first stimuli. These results are consistent with lag adaptation. In contrast, when we delivered each type of stimulus in a randomized order, the point of simultaneity shifted to "light-first" for the pitch associated with sound-first stimuli, and to "sound-first" for the pitch associated with light-first stimuli. The results clearly show that Bayesian calibration is pitch-specific and is at work behind pitch-insensitive lag adaptation during temporal order judgment of audiovisual stimuli.
Constraining duty cycles through a Bayesian technique
Romano, P; Segreto, A; Ducci, L; Vercellone, S
2014-01-01
The duty cycle (DC) of astrophysical sources is generally defined as the fraction of time during which the sources are active. However, DCs are generally not provided with statistical uncertainties, since the standard approach is to perform Monte Carlo bootstrap simulations to evaluate them, which can be quite time consuming for a large sample of sources. As an alternative and considerably less time-consuming approach, we derived the theoretical expectation value for the DC and its error for sources whose state is one of two possible, mutually exclusive states, inactive (off) or flaring (on), based on a finite set of independent observational data points. Following a Bayesian approach, we derived the analytical expression for the posterior, the conjugated distribution adopted as prior, and the expectation value and variance. We applied our method to the specific case of the inactivity duty cycle (IDC) for supergiant fast X-ray transients. We also studied IDC as a function of the number of observations in the ...
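The two-state (on/off) setting described above is the textbook binomial-likelihood case, where a Beta conjugate prior gives a closed-form posterior with analytical mean and variance. A minimal sketch, with hypothetical observation counts (the paper's specific prior choice is not reproduced here; a uniform Beta(1, 1) is assumed):

```python
from fractions import Fraction

# Hypothetical numbers: a source observed n = 10 times, found flaring
# ("on") in k = 2 of them. With a Beta(a, b) prior on the duty cycle,
# the posterior is Beta(a + k, b + n - k).
def dc_posterior_moments(k, n, a=1, b=1):
    alpha, beta = a + k, b + n - k
    mean = Fraction(alpha, alpha + beta)
    var = Fraction(alpha * beta, (alpha + beta) ** 2 * (alpha + beta + 1))
    return mean, var

mean, var = dc_posterior_moments(k=2, n=10)
print(mean, var)  # 1/4 3/208
```

No bootstrap is needed: the expectation value and variance follow directly from the Beta posterior, which is the computational saving the abstract describes.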
A Bayesian Estimator of Protein-Protein Association Probabilities
Energy Technology Data Exchange (ETDEWEB)
Gilmore, Jason M.; Auberry, Deanna L.; Sharp, Julia L.; White, Amanda M.; Anderson, Kevin K.; Daly, Don S.
2008-07-01
The Bayesian Estimator of Protein-Protein Association Probabilities (BEPro3) is a software tool for estimating probabilities of protein-protein association between bait and prey protein pairs using data from multiple-bait, multiple-replicate, protein pull-down LC-MS assay experiments. BEPro3 is open source software that runs on both Windows XP and Mac OS 10.4 or newer versions, and is freely available from http://www.pnl.gov/statistics/BEPro3.
Bayesian decision making in human collectives with binary choices
Eguíluz, Víctor M; Fernández-Gracia, J
2015-01-01
Here we focus on the description of the mechanisms behind the process of information aggregation and decision making, a basic step to understand emergent phenomena in society, such as trends, information spreading or the wisdom of crowds. In many situations, agents choose between discrete options. We analyze experimental data on binary opinion choices in humans. The data consists of two separate experiments in which humans answer questions with a binary response, where one is correct and the other is incorrect. The questions are answered without and with information on the answers of some previous participants. We find that a Bayesian approach captures the probability of choosing one of the answers. The influence of peers is uncorrelated with the difficulty of the question. The data is inconsistent with Weber's law, which states that the probability of choosing an option depends on the proportion of previous answers choosing that option and not on the total number of those answers. Last, the present Bayesian ...
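A minimal version of this kind of Bayesian aggregation (with an assumed peer-reliability parameter q, not taken from the paper) also illustrates the point about total counts: the posterior depends on the difference between the two answer counts, not merely on their proportion.

```python
# Assumed setup: two answers A and B, flat prior P(A correct) = 0.5,
# each previous participant independently correct with probability q.
# If m of n peers picked A, Bayes' rule gives the posterior for A.
def posterior_A(m, n, q=0.6, prior=0.5):
    like_A = q**m * (1 - q)**(n - m)        # peers' choices if A is correct
    like_B = (1 - q)**m * q**(n - m)        # peers' choices if B is correct
    return prior * like_A / (prior * like_A + (1 - prior) * like_B)

# Seeing 7 of 10 peers choose A:
print(round(posterior_A(7, 10), 3))  # 0.835
```

Note that the likelihood ratio reduces to (q/(1-q))**(2m - n), so 7-of-10 and 70-of-100 give very different posteriors even though the proportion is the same; this is the behaviour the abstract contrasts with a proportion-only rule.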
Bayesian Methods for Analysis and Adaptive Scheduling of Exoplanet Observations
Loredo, Thomas J; Chernoff, David F; Clyde, Merlise A; Liu, Bin
2011-01-01
We describe work in progress by a collaboration of astronomers and statisticians developing a suite of Bayesian data analysis tools for extrasolar planet (exoplanet) detection, planetary orbit estimation, and adaptive scheduling of observations. Our work addresses analysis of stellar reflex motion data, where a planet is detected by observing the "wobble" of its host star as it responds to the gravitational tug of the orbiting planet. Newtonian mechanics specifies an analytical model for the resulting time series, but it is strongly nonlinear, yielding complex, multimodal likelihood functions; it is even more complex when multiple planets are present. The parameter spaces range in size from few-dimensional to dozens of dimensions, depending on the number of planets in the system, and the type of motion measured (line-of-sight velocity, or position on the sky). Since orbits are periodic, Bayesian generalizations of periodogram methods facilitate the analysis. This relies on the model being linearly separable, ...
Bayesian versus 'plain-vanilla Bayesian' multitarget statistics
Mahler, Ronald P. S.
2004-08-01
Finite-set statistics (FISST) is a direct generalization of single-sensor, single-target Bayes statistics to the multisensor-multitarget realm, based on random set theory. Various aspects of FISST are being investigated by several research teams around the world. In recent years, however, a few partisans have claimed that a "plain-vanilla Bayesian approach" suffices as down-to-earth, "straightforward," and general "first principles" for multitarget problems, and that FISST is therefore mere mathematical "obfuscation." In this and a companion paper I demonstrate the speciousness of these claims. In this paper I summarize general Bayes statistics, what is required to use it in multisensor-multitarget problems, and why FISST is necessary to make it practical. Then I demonstrate that the "plain-vanilla Bayesian approach" is so heedlessly formulated that it is erroneous (indeed, not even Bayesian), denigrates FISST concepts while unwittingly assuming them, and has resulted in a succession of algorithms afflicted by inherent -- but less than candidly acknowledged -- computational "logjams."
A Bayesian Framework for SNP Identification
Energy Technology Data Exchange (ETDEWEB)
Webb-Robertson, Bobbie-Jo M.; Havre, Susan L.; Payne, Deborah A.
2005-07-01
Current proteomics techniques, such as mass spectrometry, focus on protein identification, usually ignoring most types of modifications beyond post-translational modifications, with the assumption that only a small number of peptides have to be matched to a protein for a positive identification. However, not all proteins are being identified with current techniques, and improved methods to locate points of mutation are becoming a necessity. In the case when single-nucleotide polymorphisms (SNPs) are observed, brute force is the most common method to locate them, quickly becoming computationally unattractive as the size of the database associated with the model organism grows. We have developed a Bayesian model for SNPs, BSNP, incorporating evolutionary information at both the nucleotide and amino acid levels. Formulating SNPs as a Bayesian inference problem allows probabilities of interest to be easily obtained, for example the probability of a specific SNP or specific type of mutation over a gene or entire genome. Three SNP databases were used in the evaluation of the BSNP model: the first is a disease-specific human gene, hemoglobin; the second is also a disease-specific human gene, p53; and the third is a more general SNP database for multiple genes in mouse. We validate that the BSNP model assigns higher posterior probabilities to the SNPs defined in all three separate databases than can be attributed to chance under specific evolutionary information, for example the amino acid model described by Majewski and Ott in conjunction with either the four-parameter nucleotide model by Bulmer or the seven-parameter nucleotide model by Majewski and Ott.
Bayesian priors for transiting planets
Kipping, David M
2016-01-01
As astronomers push towards discovering ever-smaller transiting planets, it is increasingly common to deal with low signal-to-noise ratio (SNR) events, where the choice of priors plays an influential role in Bayesian inference. In the analysis of exoplanet data, the selection of priors is often treated as a nuisance, with observers typically defaulting to uninformative distributions. Such treatments miss a key strength of the Bayesian framework, especially in the low SNR regime, where even weak a priori information is valuable. When estimating the parameters of a low-SNR transit, two key pieces of information are known: (i) the planet has the correct geometric alignment to transit and (ii) the transit event exhibits sufficient signal-to-noise to have been detected. These represent two forms of observational bias. Accordingly, when fitting transits, the model parameter priors should not follow the intrinsic distributions of said terms, but rather those of both the intrinsic distributions and the observational ...
Bayesian approach to rough set
Marwala, Tshilidzi
2007-01-01
This paper proposes an approach to training rough set models within a Bayesian framework, using the Markov Chain Monte Carlo (MCMC) method. The prior probabilities are constructed from the prior knowledge that good rough set models have fewer rules. Markov Chain Monte Carlo sampling is conducted by sampling in the rough set granule space, with the Metropolis algorithm used as the acceptance criterion. The proposed method is tested on estimating the risk of HIV given demographic data. The results obtained show that the proposed approach is able to achieve an average accuracy of 58%, with the accuracy varying up to 66%. In addition, the Bayesian rough set gives the probabilities of the estimated HIV status as well as the linguistic rules describing how the demographic parameters drive the risk of HIV.
Deep Learning and Bayesian Methods
Directory of Open Access Journals (Sweden)
Prosper Harrison B.
2017-01-01
Full Text Available A revolution is underway in which deep neural networks are routinely used to solve difficult problems such as face recognition and natural language understanding. Particle physicists have taken notice and have started to deploy these methods, achieving results that suggest a potentially significant shift in how data might be analyzed in the not too distant future. We discuss a few recent developments in the application of deep neural networks and then indulge in speculation about how such methods might be used to automate certain aspects of data analysis in particle physics. Next, the connection to Bayesian methods is discussed and the paper ends with thoughts on a significant practical issue, namely, how, from a Bayesian perspective, one might optimize the construction of deep neural networks.
Deep Learning and Bayesian Methods
Prosper, Harrison B.
2017-03-01
A revolution is underway in which deep neural networks are routinely used to solve difficult problems such as face recognition and natural language understanding. Particle physicists have taken notice and have started to deploy these methods, achieving results that suggest a potentially significant shift in how data might be analyzed in the not too distant future. We discuss a few recent developments in the application of deep neural networks and then indulge in speculation about how such methods might be used to automate certain aspects of data analysis in particle physics. Next, the connection to Bayesian methods is discussed and the paper ends with thoughts on a significant practical issue, namely, how, from a Bayesian perspective, one might optimize the construction of deep neural networks.
Bayesian Inference for Radio Observations
Lochner, Michelle; Zwart, Jonathan T L; Smirnov, Oleg; Bassett, Bruce A; Oozeer, Nadeem; Kunz, Martin
2015-01-01
(Abridged) New telescopes like the Square Kilometre Array (SKA) will push into a new sensitivity regime and expose systematics, such as direction-dependent effects, that could previously be ignored. Current methods for handling such systematics rely on alternating best estimates of instrumental calibration and models of the underlying sky, which can lead to inaccurate uncertainty estimates and biased results because such methods ignore any correlations between parameters. These deconvolution algorithms produce a single image that is assumed to be a true representation of the sky, when in fact it is just one realisation of an infinite ensemble of images compatible with the noise in the data. In contrast, here we report a Bayesian formalism that simultaneously infers both systematics and science. Our technique, Bayesian Inference for Radio Observations (BIRO), determines all parameters directly from the raw data, bypassing image-making entirely, by sampling from the joint posterior probability distribution. Thi...
Bayesian inference on proportional elections.
Directory of Open Access Journals (Sweden)
Gabriel Hideki Vatanabe Brunello
Full Text Available Polls for majoritarian voting systems usually show estimates of the percentage of votes for each candidate. However, proportional vote systems do not necessarily guarantee that the candidate with the highest percentage of votes will be elected. Thus, traditional methods used in majoritarian elections cannot be applied to proportional elections. In this context, the purpose of this paper was to perform Bayesian inference for proportional elections considering the Brazilian system of seats distribution. More specifically, a methodology was developed to answer the probability that a given party will have representation in the chamber of deputies. Inferences were made in a Bayesian framework using the Monte Carlo simulation technique, and the developed methodology was applied to data from the Brazilian elections for Members of the Legislative Assembly and Federal Chamber of Deputies in 2010. A performance rate was also presented to evaluate the efficiency of the methodology. Calculations and simulations were carried out using the free R statistical software.
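A simplified sketch of such a Monte Carlo seat-probability computation (using the plain D'Hondt highest-averages rule as a stand-in for the full Brazilian seat-distribution procedure, with hypothetical poll counts; the paper itself uses R): draw vote shares from the Dirichlet posterior implied by the poll, allocate seats per draw, and average.

```python
import numpy as np

rng = np.random.default_rng(42)

def dhondt(votes, seats):
    """Allocate seats with the D'Hondt highest-averages rule."""
    alloc = np.zeros(len(votes), dtype=int)
    for _ in range(seats):
        alloc[np.argmax(votes / (alloc + 1))] += 1
    return alloc

# Hypothetical poll: respondent counts for 3 parties, 5 seats at stake.
poll = np.array([50, 30, 20])
seats = 5
print(dhondt(poll, seats))  # [3 1 1]

# Dirichlet posterior over vote shares (uniform prior), then Monte Carlo:
# estimated probability that each party wins at least one seat.
shares = rng.dirichlet(poll + 1, size=5_000)
got_seat = np.array([dhondt(s, seats) >= 1 for s in shares])
print(got_seat.mean(axis=0))
```

The per-party probabilities of representation come out as simple Monte Carlo averages over posterior draws, which is the quantity the abstract's methodology targets.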
Bayesian analysis for kaon photoproduction
Energy Technology Data Exchange (ETDEWEB)
Marsainy, T., E-mail: tmart@fisika.ui.ac.id; Mart, T., E-mail: tmart@fisika.ui.ac.id [Department Fisika, FMIPA, Universitas Indonesia, Depok 16424 (Indonesia)
2014-09-25
We have investigated the contribution of the nucleon resonances in the kaon photoproduction process by using an established statistical decision-making method, i.e., the Bayesian method. This method not only evaluates the model over its entire parameter space, but also takes the prior information and experimental data into account. The result indicates that certain resonances have larger probabilities to contribute to the process.
Bayesian priors and nuisance parameters
Gupta, Sourendu
2016-01-01
Bayesian techniques are widely used to obtain spectral functions from correlators. We suggest a technique to rid the results of nuisance parameters, i.e., parameters which are needed for the regularization but cannot be determined from data. We give examples where the method works, including a pion mass extraction with two flavours of staggered quarks at a lattice spacing of about 0.07 fm. We also give an example where the method does not work.
Space Shuttle RTOS Bayesian Network
Morris, A. Terry; Beling, Peter A.
2001-01-01
With shrinking budgets and the requirements to increase reliability and operational life of the existing orbiter fleet, NASA has proposed various upgrades for the Space Shuttle that are consistent with national space policy. The cockpit avionics upgrade (CAU), a high priority item, has been selected as the next major upgrade. The primary functions of cockpit avionics include flight control, guidance and navigation, communication, and orbiter landing support. Secondary functions include the provision of operational services for non-avionics systems such as data handling for the payloads and caution and warning alerts to the crew. Recently, a process to select the optimal commercial-off-the-shelf (COTS) real-time operating system (RTOS) for the CAU was conducted by United Space Alliance (USA) Corporation, which is a joint venture between Boeing and Lockheed Martin, the prime contractor for space shuttle operations. In order to independently assess the RTOS selection, NASA has used the Bayesian network-based scoring methodology described in this paper. Our two-stage methodology addresses the issue of RTOS acceptability by incorporating functional, performance and non-functional software measures related to reliability, interoperability, certifiability, efficiency, correctness, business, legal, product history, cost and life cycle. The first stage of the methodology involves obtaining scores for the various measures using a Bayesian network. The Bayesian network incorporates the causal relationships between the various and often competing measures of interest while also assisting the inherently complex decision analysis process with its ability to reason under uncertainty. The structure and selection of prior probabilities for the network is extracted from experts in the field of real-time operating systems. Scores for the various measures are computed using Bayesian probability. In the second stage, multi-criteria trade-off analyses are performed between the scores
Elements of Bayesian experimental design
Energy Technology Data Exchange (ETDEWEB)
Sivia, D.S. [Rutherford Appleton Lab., Oxon (United Kingdom)
1997-09-01
We consider some elements of the Bayesian approach that are important for optimal experimental design. While the underlying principles used are very general, and are explained in detail in a recent tutorial text, they are applied here to the specific case of characterising the inferential value of different resolution peakshapes. This particular issue was considered earlier by Silver, Sivia and Pynn (1989, 1990a, 1990b), and the following presentation confirms and extends the conclusions of their analysis.
Bayesian Sampling using Condition Indicators
DEFF Research Database (Denmark)
Faber, Michael H.; Sørensen, John Dalsgaard
2002-01-01
… This allows for a Bayesian formulation of the indicators whereby the experience and expertise of the inspection personnel may be fully utilized and consistently updated as frequentistic information is collected. The approach is illustrated on an example considering a concrete structure subject to corrosion. It is shown how half-cell potential measurements may be utilized to update the probability of excessive repair after 50 years.
White, Curt M; Strazisar, Brian R; Granite, Evan J; Hoffman, James S; Pennline, Henry W
2003-06-01
The topic of global warming as a result of increased atmospheric CO2 concentration is arguably the most important environmental issue that the world faces today. It is a global problem that will need to be solved on a global level. The link between anthropogenic emissions of CO2 with increased atmospheric CO2 levels and, in turn, with increased global temperatures has been well established and accepted by the world. International organizations such as the United Nations Framework Convention on Climate Change (UNFCCC) and the Intergovernmental Panel on Climate Change (IPCC) have been formed to address this issue. Three options are being explored to stabilize atmospheric levels of greenhouse gases (GHGs) and global temperatures without severely and negatively impacting standard of living: (1) increasing energy efficiency, (2) switching to less carbon-intensive sources of energy, and (3) carbon sequestration. To be successful, all three options must be used in concert. The third option is the subject of this review. Specifically, this review will cover the capture and geologic sequestration of CO2 generated from large point sources, namely fossil-fuel-fired power gasification plants. Sequestration of CO2 in geological formations is necessary to meet the President's Global Climate Change Initiative target of an 18% reduction in GHG intensity by 2012. Further, the best strategy to stabilize the atmospheric concentration of CO2 results from a multifaceted approach where sequestration of CO2 into geological formations is combined with increased efficiency in electric power generation and utilization, increased conservation, increased use of lower carbon-intensity fuels, and increased use of nuclear energy and renewables. This review covers the separation and capture of CO2 from both flue gas and fuel gas using wet scrubbing technologies, dry regenerable sorbents, membranes, cryogenics, pressure and temperature swing adsorption, and other advanced concepts. Existing
12th Brazilian Meeting on Bayesian Statistics
Louzada, Francisco; Rifo, Laura; Stern, Julio; Lauretto, Marcelo
2015-01-01
Through refereed papers, this volume focuses on the foundations of the Bayesian paradigm; their comparison to objectivistic or frequentist Statistics counterparts; and the appropriate application of Bayesian foundations. This research in Bayesian Statistics is applicable to data analysis in biostatistics, clinical trials, law, engineering, and the social sciences. EBEB, the Brazilian Meeting on Bayesian Statistics, is held every two years by the ISBrA, the International Society for Bayesian Analysis, one of the most active chapters of the ISBA. The 12th meeting took place March 10-14, 2014 in Atibaia. Interest in foundations of inductive Statistics has grown recently in accordance with the increasing availability of Bayesian methodological alternatives. Scientists need to deal with the ever more difficult choice of the optimal method to apply to their problem. This volume shows how Bayes can be the answer. The examination and discussion on the foundations work towards the goal of proper application of Bayesia...
Using literature and data to learn Bayesian networks as clinical models of ovarian tumors
DEFF Research Database (Denmark)
Antal, P.; Fannes, G.; Timmerman, D.
2004-01-01
Thanks to its increasing availability, electronic literature has become a potential source of information for the development of complex Bayesian networks (BN), when human expertise is missing or data is scarce or contains much noise. This opportunity raises the question of how to integrate information from free-text resources with statistical data in learning Bayesian networks. Firstly, we report on the collection of prior information resources in the ovarian cancer domain, which includes "kernel" annotations of the domain variables. We introduce methods based on the annotations and literature … an expert reference and against data scores (the mutual information (MI) and a Bayesian score). Next, we transform the text-based dependency measures into informative text-based priors for Bayesian network structures. Finally, we report the benefit of such informative text-based priors on the performance…
PedExpert: a computer program for the application of Bayesian networks to human paternity testing.
Gomes, R R; Campos, S V A; Pena, S D J
2009-01-01
PedExpert is a Windows-based Bayesian network software, especially constructed to solve problems in parentage testing that are complex because of missing genetic information on the alleged father and/or because they involve genetic mutations. PedExpert automates the creation and manipulation of Bayesian networks, implementing algorithms that convert pedigrees and sets of indispensable information (genotypes, allele frequencies, mutation rates) into Bayesian networks. This program has a novel feature that can incorporate information about gene mutations into tables of conditional probabilities of transmission of alleles from the alleged father to the child, without adding new nodes to the network. This permits using the same Bayesian network in different modes, for analysis of cases that include mutations or not. PedExpert is user-friendly and greatly reduces the time of analysis for complex cases of paternity testing, eliminating most sources of logical and operational error.
Bayesian Inversion of Seabed Scattering Data
2014-09-30
Bayesian Inversion of Seabed Scattering Data (Special Research Award in Ocean Acoustics). Gavin A.M.W. Steininger, School of Earth & Ocean … The objectives of this project are to carry out joint Bayesian inversion of scattering and reflection data to estimate the in-situ seabed scattering and geoacoustic parameters…
Anomaly Detection and Attribution Using Bayesian Networks
2014-06-01
UNCLASSIFIED. Anomaly Detection and Attribution Using Bayesian Networks. Andrew Kirk, Jonathan Legg and Edwin El-Mahassni, National Security and … detection in Bayesian networks, enabling both the detection and explanation of anomalous cases in a dataset. By exploiting the structure of a Bayesian network, our algorithm is able to efficiently search for local maxima of data conflict between closely related variables. Benchmark tests using…
Compiling Relational Bayesian Networks for Exact Inference
DEFF Research Database (Denmark)
Jaeger, Manfred; Chavira, Mark; Darwiche, Adnan
2004-01-01
We describe a system for exact inference with relational Bayesian networks as defined in the publicly available Primula tool. The system is based on compiling propositional instances of relational Bayesian networks into arithmetic circuits and then performing online inference by evaluating and differentiating these circuits in time linear in their size. We report on experimental results showing the successful compilation, and efficient inference, on relational Bayesian networks whose Primula-generated propositional instances have thousands of variables, and whose jointrees have clusters…
SYNTHESIZED EXPECTED BAYESIAN METHOD OF PARAMETRIC ESTIMATE
Institute of Scientific and Technical Information of China (English)
Ming HAN; Yuanyao DING
2004-01-01
This paper develops a new method of parametric estimation, named the "synthesized expected Bayesian method". When samples of products are tested and no failure events occur, the definition of the expected Bayesian estimate is introduced and the estimates of failure probability and failure rate are provided. After some failure information is introduced by making an extra test, a synthesized expected Bayesian method is defined and used to estimate the failure probability, failure rate and some other parameters in the exponential and Weibull distributions of populations. Finally, calculations are performed on practical problems, which show that the synthesized expected Bayesian method is feasible and easy to operate.
Learning dynamic Bayesian networks with mixed variables
DEFF Research Database (Denmark)
Bøttcher, Susanne Gammelgaard
This paper considers dynamic Bayesian networks for discrete and continuous variables. We only treat the case where the distribution of the variables is conditional Gaussian. We show how to learn the parameters and structure of a dynamic Bayesian network and also how the Markov order can be learned....... An automated procedure for specifying prior distributions for the parameters in a dynamic Bayesian network is presented. It is a simple extension of the procedure for ordinary Bayesian networks. Finally, Wölfer's sunspot numbers are analyzed....
Variational bayesian method of estimating variance components.
Arakawa, Aisaku; Taniguchi, Masaaki; Hayashi, Takeshi; Mikawa, Satoshi
2016-07-01
We developed a Bayesian analysis approach by using a variational inference method, a so-called variational Bayesian method, to determine the posterior distributions of variance components. This variational Bayesian method and an alternative Bayesian method using Gibbs sampling were compared in estimating genetic and residual variance components from both simulated data and publicly available real pig data. In the simulated data set, we observed strong bias toward overestimation of genetic variance for the variational Bayesian method in the case of low heritability and low population size, and less bias was detected with larger population sizes in both methods examined. The differences in the estimates of variance components between the variational Bayesian method and Gibbs sampling were not found in the real pig data. However, the posterior distributions of the variance components obtained with the variational Bayesian method had shorter tails than those obtained with Gibbs sampling. Consequently, the posterior standard deviations of the genetic and residual variances of the variational Bayesian method were lower than those of the method using Gibbs sampling. The computing time required was much shorter with the variational Bayesian method than with the method using Gibbs sampling.
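As a point of comparison for the Gibbs-sampling alternative mentioned above, here is a minimal two-block Gibbs sampler for the mean and variance of i.i.d. normal data; the priors and all numbers are illustrative assumptions, not the genetic variance-component model used in the study:

```python
import random
import statistics

def gibbs_normal(data, n_iter=4000, burn=1000, seed=1):
    """Two-block Gibbs sampler for (mu, sigma^2) of i.i.d. normal data,
    with a flat prior on mu and a Jeffreys-type prior on sigma^2
    (illustrative choices only)."""
    random.seed(seed)
    n = len(data)
    xbar = sum(data) / n
    mu, var = xbar, statistics.variance(data)
    mu_draws, var_draws = [], []
    for it in range(n_iter):
        # Draw sigma^2 | mu, data: the precision is Gamma(n/2, rate=SSE/2),
        # i.e. gammavariate(shape, scale) with scale = 2/SSE.
        sse = sum((x - mu) ** 2 for x in data)
        var = 1.0 / random.gammavariate(n / 2.0, 2.0 / sse)
        # Draw mu | sigma^2, data: Normal(xbar, sigma^2 / n).
        mu = random.gauss(xbar, (var / n) ** 0.5)
        if it >= burn:
            mu_draws.append(mu)
            var_draws.append(var)
    return (sum(mu_draws) / len(mu_draws),
            sum(var_draws) / len(var_draws))
```

A variational treatment of the same model would instead posit a factorized approximation q(mu)q(sigma^2) and iterate deterministic coordinate updates; the abstract's observation about shorter posterior tails is a known consequence of that factorization.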
Bayesian Inference for Radio Observations - Going beyond deconvolution
Lochner, Michelle; Kunz, Martin; Natarajan, Iniyan; Oozeer, Nadeem; Smirnov, Oleg; Zwart, Jon
2015-01-01
Radio interferometers suffer from the problem of missing information in their data, due to the gaps between the antennas. This results in artifacts, such as bright rings around sources, in the images obtained. Multiple deconvolution algorithms have been proposed to solve this problem and produce cleaner radio images. However, these algorithms are unable to correctly estimate uncertainties in derived scientific parameters or to always include the effects of instrumental errors. We propose an alternative technique called Bayesian Inference for Radio Observations (BIRO) which uses a Bayesian statistical framework to determine the scientific parameters and instrumental errors simultaneously directly from the raw data, without making an image. We use a simple simulation of Westerbork Synthesis Radio Telescope data including pointing errors and beam parameters as instrumental effects, to demonstrate the use of BIRO.
Applying Bayesian belief networks in rapid response situations
Energy Technology Data Exchange (ETDEWEB)
Gibson, William L. [Los Alamos National Laboratory]; Leishman, Deborah A. [Los Alamos National Laboratory]; Van Eeckhout, Edward [Los Alamos National Laboratory]
2008-01-01
The authors have developed an enhanced Bayesian analysis tool called the Integrated Knowledge Engine (IKE) for monitoring and surveillance. The enhancements are suited for Rapid Response Situations where decisions must be made based on uncertain and incomplete evidence from many diverse and heterogeneous sources. The enhancements extend the probabilistic results of the traditional Bayesian analysis by (1) better quantifying uncertainty arising from model parameter uncertainty and uncertain evidence, (2) optimizing the collection of evidence to reach conclusions more quickly, and (3) allowing the analyst to determine the influence of the remaining evidence that cannot be obtained in the time allowed. These extended features give the analyst and decision maker a better comprehension of the adequacy of the acquired evidence and hence the quality of the hurried decisions. They also describe two example systems where the above features are highlighted.
Separating Gravitational Wave Signals from Instrument Artifacts
Littenberg, Tyson B.; Cornish, Neil J.
2010-01-01
Central to the gravitational wave detection problem is the challenge of separating features in the data produced by astrophysical sources from features produced by the detector. Matched filtering provides an optimal solution for Gaussian noise, but in practice, transient noise excursions or "glitches" complicate the analysis. Detector diagnostics and coincidence tests can be used to veto many glitches which may otherwise be misinterpreted as gravitational wave signals. The glitches that remain can lead to long tails in the matched filter search statistics and drive up the detection threshold. Here we describe a Bayesian approach that incorporates a more realistic model for the instrument noise allowing for fluctuating noise levels that vary independently across frequency bands, and deterministic "glitch fitting" using wavelets as "glitch templates", the number of which is determined by a trans-dimensional Markov chain Monte Carlo algorithm. We demonstrate the method's effectiveness on simulated data containing low amplitude gravitational wave signals from inspiraling binary black hole systems, and simulated non-stationary and non-Gaussian noise comprised of a Gaussian component with the standard LIGO/Virgo spectrum, and injected glitches of various amplitude, prevalence, and variety. Glitch fitting allows us to detect significantly weaker signals than standard techniques.
Bayesian Concordance Correlation Coefficient with Application to Repeatedly Measured Data
Directory of Open Access Journals (Sweden)
Atanu BHATTACHARJEE
2015-10-01
Full Text Available Objective: In medical research, Lin's classical concordance correlation coefficient (CCC) is frequently applied to evaluate the similarity of measurements produced by different raters or methods on the same subjects. It is particularly useful for continuous data. The objective of this paper is to propose a Bayesian counterpart to compute the CCC for continuous data. Material and Methods: A total of 33 patients with astrocytoma brain tumors treated in the Department of Radiation Oncology at Malabar Cancer Centre were enrolled in this work. The data consist of tumor volume and tumor size measured repeatedly during the baseline pretreatment workup and post-surgery follow-ups for all patients. Tumor volume and tumor size were measured separately by MRI and CT scan. The agreement of measurement between MRI and CT scan was calculated through the CCC. Statistical inference was performed through the Markov chain Monte Carlo (MCMC) technique. Results: The Bayesian CCC was found suitable for obtaining strong evidence about the relation between the concordance measurements. The posterior mean estimates and 95% credible intervals of the CCC for tumor size and tumor volume were 0.96 (0.87, 0.99) and 0.98 (0.95, 0.99), respectively. Conclusion: Bayesian inference was adopted for development of the computational algorithm. The approach illustrated in this work provides researchers an opportunity to find the most appropriate model for specific data and apply the CCC to test the desired hypothesis.
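For reference, the quantity being given a Bayesian treatment above is Lin's CCC, which for paired samples x and y is 2*s_xy / (s_x^2 + s_y^2 + (mean(x) - mean(y))^2). A plain sample-statistic sketch (the Bayesian version would evaluate this from MCMC draws of the model parameters rather than from sample moments):

```python
def concordance_ccc(x, y):
    """Lin's concordance correlation coefficient (sample version).
    Equals 1 only for perfect agreement y == x; penalizes both
    imprecision (low correlation) and location/scale shifts."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    vx = sum((a - mx) ** 2 for a in x) / n
    vy = sum((b - my) ** 2 for b in y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    return 2.0 * sxy / (vx + vy + (mx - my) ** 2)
```

Note how a pure shift (y = x + 1) lowers the CCC below 1 even though the Pearson correlation stays exactly 1; that is the "concordance" part of the coefficient.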
Bayesian Procedures for Identifying Aberrant Response-Time Patterns in Adaptive Testing
van der Linden, Wim J.; Guo, Fanmin
2008-01-01
In order to identify aberrant response-time patterns on educational and psychological tests, it is important to be able to separate the speed at which the test taker operates from the time the items require. A lognormal model for response times with this feature was used to derive a Bayesian procedure.
Seeded Bayesian Networks: Constructing genetic networks from microarray data
Directory of Open Access Journals (Sweden)
Quackenbush John
2008-07-01
Full Text Available Abstract Background DNA microarrays and other genomics-inspired technologies provide large datasets that often include hidden patterns of correlation between genes reflecting the complex processes that underlie cellular metabolism and physiology. The challenge in analyzing large-scale expression data has been to extract biologically meaningful inferences regarding these processes – often represented as networks – in an environment where the datasets are often imperfect and biological noise can obscure the actual signal. Although many techniques have been developed in an attempt to address these issues, to date their ability to extract meaningful and predictive network relationships has been limited. Here we describe a method that draws on prior information about gene-gene interactions to infer biologically relevant pathways from microarray data. Our approach consists of using preliminary networks derived from the literature and/or protein-protein interaction data as seeds for a Bayesian network analysis of microarray results. Results Through a bootstrap analysis of gene expression data derived from a number of leukemia studies, we demonstrate that seeded Bayesian Networks have the ability to identify high-confidence gene-gene interactions which can then be validated by comparison to other sources of pathway data. Conclusion The use of network seeds greatly improves the ability of Bayesian Network analysis to learn gene interaction networks from gene expression data. We demonstrate that the use of seeds derived from the biomedical literature or high-throughput protein-protein interaction data, or the combination, provides improvement over a standard Bayesian Network analysis, allowing networks involving dynamic processes to be deduced from the static snapshots of biological systems that represent the most common source of microarray data. Software implementing these methods has been included in the widely used TM4 microarray analysis package.
Separation logic: a logic for shared mutable data structures
DEFF Research Database (Denmark)
Reynolds, John C.
2002-01-01
expressions) for accessing and modifying shared structures, and for explicit allocation and deallocation of storage. Assertions are extended by introducing a "separating conjunction" that asserts that its sub-formulas hold for disjoint parts of the heap, and a closely related "separating implication". Coupled......, dynamically allocated arrays, and recursive procedures. We will also discuss promising future directions....
Bayesian Methods and Universal Darwinism
Campbell, John
2009-12-01
Bayesian methods since the time of Laplace have been understood by their practitioners as closely aligned to the scientific method. Indeed, a recent champion of Bayesian methods, E. T. Jaynes, titled his textbook on the subject Probability Theory: The Logic of Science. Many philosophers of science, including Karl Popper and Donald Campbell, have interpreted the evolution of science as a Darwinian process consisting of a `copy with selective retention' algorithm abstracted from Darwin's theory of Natural Selection. Arguments are presented for an isomorphism between Bayesian methods and Darwinian processes. Universal Darwinism, as the term has been developed by Richard Dawkins, Daniel Dennett and Susan Blackmore, is the collection of scientific theories which explain the creation and evolution of their subject matter as due to the operation of Darwinian processes. These subject matters span the fields of atomic physics, chemistry, biology and the social sciences. The principle of Maximum Entropy states that systems will evolve to states of highest entropy subject to the constraints of scientific law. This principle may be inverted to provide illumination as to the nature of scientific law. Our best cosmological theories suggest the universe contained much less complexity during the period shortly after the Big Bang than it does at present. The scientific subject matter of atomic physics, chemistry, biology and the social sciences has been created since that time. An explanation is proposed for the existence of this subject matter as due to the evolution of constraints in the form of adaptations imposed on Maximum Entropy. It is argued that these adaptations were discovered and instantiated through the operations of a succession of Darwinian processes.
Bayesian phylogeography finds its roots.
Directory of Open Access Journals (Sweden)
Philippe Lemey
2009-09-01
As a key factor in endemic and epidemic dynamics, the geographical distribution of viruses has been frequently interpreted in the light of their genetic histories. Unfortunately, inference of historical dispersal or migration patterns of viruses has mainly been restricted to model-free heuristic approaches that provide little insight into the temporal setting of the spatial dynamics. The introduction of probabilistic models of evolution, however, offers unique opportunities to engage in this statistical endeavor. Here we introduce a Bayesian framework for inference, visualization and hypothesis testing of phylogeographic history. By implementing character mapping in Bayesian software that samples time-scaled phylogenies, we enable the reconstruction of timed viral dispersal patterns while accommodating phylogenetic uncertainty. Standard Markov model inference is extended with a stochastic search variable selection procedure that identifies the parsimonious descriptions of the diffusion process. In addition, we propose priors that can incorporate geographical sampling distributions or characterize alternative hypotheses about the spatial dynamics. To visualize the spatial and temporal information, we summarize inferences using virtual globe software. We describe how Bayesian phylogeography compares with previous parsimony analysis in the investigation of the influenza A H5N1 origin and H5N1 epidemiological linkage among sampling localities. Analysis of rabies in West African dog populations reveals how virus diffusion may enable endemic maintenance through continuous epidemic cycles. From these analyses, we conclude that our phylogeographic framework will be an important asset in molecular epidemiology that can be easily generalized to infer biogeography from genetic data for many organisms.
BEAST: Bayesian evolutionary analysis by sampling trees
Directory of Open Access Journals (Sweden)
Drummond Alexei J
2007-11-01
Full Text Available Abstract Background The evolutionary analysis of molecular sequence variation is a statistical enterprise. This is reflected in the increased use of probabilistic models for phylogenetic inference, multiple sequence alignment, and molecular population genetics. Here we present BEAST: a fast, flexible software architecture for Bayesian analysis of molecular sequences related by an evolutionary tree. A large number of popular stochastic models of sequence evolution are provided and tree-based models suitable for both within- and between-species sequence data are implemented. Results BEAST version 1.4.6 consists of 81000 lines of Java source code, 779 classes and 81 packages. It provides models for DNA and protein sequence evolution, highly parametric coalescent analysis, relaxed clock phylogenetics, non-contemporaneous sequence data, statistical alignment and a wide range of options for prior distributions. BEAST source code is object-oriented, modular in design and freely available at http://beast-mcmc.googlecode.com/ under the GNU LGPL license. Conclusion BEAST is a powerful and flexible evolutionary analysis package for molecular sequence variation. It also provides a resource for the further development of new models and statistical methods of evolutionary analysis.
Thurman, Steven M; Lu, Hongjing
2014-01-01
Visual form analysis is fundamental to shape perception and likely plays a central role in perception of more complex dynamic shapes, such as moving objects or biological motion. Two primary form-based cues serve to represent the overall shape of an object: the spatial position and the orientation of locations along the boundary of the object. However, it is unclear how the visual system integrates these two sources of information in dynamic form analysis, and in particular how the brain resolves ambiguities due to sensory uncertainty and/or cue conflict. In the current study, we created animations of sparsely-sampled dynamic objects (human walkers or rotating squares) comprised of oriented Gabor patches in which orientation could either coincide or conflict with information provided by position cues. When the cues were incongruent, we found a characteristic trade-off between position and orientation information whereby position cues increasingly dominated perception as the relative uncertainty of orientation increased and vice versa. Furthermore, we found no evidence for differences in the visual processing of biological and non-biological objects, casting doubt on the claim that biological motion may be specialized in the human brain, at least in specific terms of form analysis. To explain these behavioral results quantitatively, we adopt a probabilistic template-matching model that uses Bayesian inference within local modules to estimate object shape separately from either spatial position or orientation signals. The outputs of the two modules are integrated with weights that reflect individual estimates of subjective cue reliability, and integrated over time to produce a decision about the perceived dynamics of the input data. Results of this model provided a close fit to the behavioral data, suggesting a mechanism in the human visual system that approximates rational Bayesian inference to integrate position and orientation signals in dynamic form analysis.
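The reliability-weighted integration described above is, for Gaussian cues, the standard inverse-variance fusion rule: the fused estimate weights each cue by its reliability (1/variance), and the fused variance is never larger than either input's. A minimal sketch (the numbers are assumptions, not the paper's fitted values):

```python
def fuse_cues(m1, var1, m2, var2):
    """Reliability-weighted fusion of two Gaussian cue estimates
    (e.g. position and orientation). Weights are inverse variances."""
    w1, w2 = 1.0 / var1, 1.0 / var2
    mean = (w1 * m1 + w2 * m2) / (w1 + w2)
    var = 1.0 / (w1 + w2)
    return mean, var
```

This reproduces the behavioral trade-off in the abstract: as the orientation cue's variance grows, its weight shrinks and the position cue increasingly dominates the fused percept.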
Bayesian frequency analysis of HD 201433 observations with BRITE
Kallinger, T
2016-01-01
Multiple oscillation frequencies separated by close to or less than the formal frequency resolution of a data set are a serious problem in the frequency analysis of time series data. We present a new and fully automated Bayesian approach that searches for close frequencies in time series data and assesses their significance by comparison to no signal and a mono-periodic signal. We extensively test the approach with synthetic data sets and apply it to the 156 days-long high-precision BRITE photometry of the SPB star HD 201433, for which we find a sequence of nine statistically significant rotationally split dipole modes.
Bayesian Query-Focused Summarization
Daumé, Hal
2009-01-01
We present BayeSum (for "Bayesian summarization"), a model for sentence extraction in query-focused summarization. BayeSum leverages the common case in which multiple documents are relevant to a single query. Using these documents as reinforcement for query terms, BayeSum is not afflicted by the paucity of information in short queries. We show that approximate inference in BayeSum is possible on large data sets and results in a state-of-the-art summarization system. Furthermore, we show how BayeSum can be understood as a justified query expansion technique in the language modeling for IR framework.
Numeracy, frequency, and Bayesian reasoning
Directory of Open Access Journals (Sweden)
Gretchen B. Chapman
2009-02-01
Previous research has demonstrated that Bayesian reasoning performance is improved if uncertainty information is presented as natural frequencies rather than single-event probabilities. A questionnaire study of 342 college students replicated this effect but also found that the performance-boosting benefits of the natural frequency presentation occurred primarily for participants who scored high in numeracy. This finding suggests that even comprehension and manipulation of natural frequencies requires a certain threshold of numeracy abilities, and that the beneficial effects of natural frequency presentation may not be as general as previously believed.
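The underlying computation that participants struggle with is a single application of Bayes' rule. A sketch using illustrative numbers in the style of the classic mammography problem (these are not the questionnaire items from the study):

```python
def posterior_prob(prior, sensitivity, false_pos_rate):
    """P(hypothesis | positive test) via Bayes' rule."""
    p_pos = prior * sensitivity + (1.0 - prior) * false_pos_rate
    return prior * sensitivity / p_pos

# Probability form: prevalence 1%, sensitivity 80%, false-positive rate 9.6%.
# The natural-frequency framing of the same numbers: of 10,000 people,
# 100 have the disease and 80 of them test positive; of the 9,900 without
# it, about 950 test positive -- so roughly 80 of ~1030 positives are true.
p = posterior_prob(0.01, 0.80, 0.096)
```

The natural-frequency framing makes the small posterior (under 8%) visible as "80 out of about 1030", which is exactly the presentation effect the study examines.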
Bayesian inference for Hawkes processes
DEFF Research Database (Denmark)
Rasmussen, Jakob Gulddahl
2013-01-01
The Hawkes process is a practically and theoretically important class of point processes, but parameter-estimation for such a process can pose various problems. In this paper we explore and compare two approaches to Bayesian inference. The first approach is based on the so-called conditional...... intensity function, while the second approach is based on an underlying clustering and branching structure in the Hawkes process. For practical use, MCMC (Markov chain Monte Carlo) methods are employed. The two approaches are compared numerically using three examples of the Hawkes process....
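The conditional-intensity approach mentioned above evaluates the Hawkes log-likelihood log L = sum_i log lambda(t_i) - integral_0^T lambda(s) ds; with an exponential kernel the compensator integral has a closed form. A minimal sketch (an assumed exponential kernel alpha*exp(-beta*dt), not necessarily the parameterization used in the paper):

```python
from math import exp, log

def hawkes_loglik(times, T, mu, alpha, beta):
    """Log-likelihood of a univariate Hawkes process on [0, T] with
    baseline rate mu and excitation kernel alpha * exp(-beta * dt)."""
    ll = 0.0
    for i, t in enumerate(times):
        # Conditional intensity just before event i, given earlier events.
        lam = mu + sum(alpha * exp(-beta * (t - s)) for s in times[:i])
        ll += log(lam)
    # Compensator: integral of the intensity over [0, T], in closed form.
    comp = mu * T + sum(alpha / beta * (1.0 - exp(-beta * (T - s)))
                        for s in times)
    return ll - comp
```

In an MCMC scheme like the paper's first approach, this function would sit inside the Metropolis-Hastings acceptance ratio as the likelihood term.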
Bayesian homeopathy: talking normal again.
Rutten, A L B
2007-04-01
Homeopathy has a communication problem: important homeopathic concepts are not understood by conventional colleagues. Homeopathic terminology seems to be comprehensible only after practical experience of homeopathy. The main problem lies in different handling of diagnosis. In conventional medicine, diagnosis is the starting point for randomised controlled trials to determine the effect of treatment. In homeopathy, diagnosis is combined with other symptoms and personal traits of the patient to guide treatment and predict response. Broadening our scope to include diagnostic as well as treatment research opens the possibility of multifactorial reasoning. Adopting Bayesian methodology opens the possibility of investigating homeopathy in everyday practice and of describing some aspects of homeopathy in conventional terms.
Bayesian LASSO, scale space and decision making in association genetics.
Directory of Open Access Journals (Sweden)
Leena Pasanen
LASSO is a penalized regression method that facilitates model fitting in situations where there are as many, or even more, explanatory variables than observations, and only a few variables are relevant in explaining the data. We focus on the Bayesian version of LASSO and consider four problems that need special attention: (i) controlling false positives, (ii) multiple comparisons, (iii) collinearity among explanatory variables, and (iv) the choice of the tuning parameter that controls the amount of shrinkage and the sparsity of the estimates. The particular application considered is association genetics, where LASSO regression can be used to find links between chromosome locations and phenotypic traits in a biological organism. However, the proposed techniques are relevant also in other contexts where LASSO is used for variable selection. We separate the true associations from false positives using the posterior distribution of the effects (regression coefficients) provided by Bayesian LASSO. We propose to solve the multiple comparisons problem by using simultaneous inference based on the joint posterior distribution of the effects. Bayesian LASSO also tends to distribute an effect among collinear variables, making detection of an association difficult. We propose to solve this problem by considering not only individual effects but also their functionals (i.e., sums and differences). Finally, whereas in Bayesian LASSO the tuning parameter is often regarded as a random variable, we adopt a scale space view and consider a whole range of fixed tuning parameters instead. The effect estimates and the associated inference are considered for all tuning parameters in the selected range and the results are visualized with color maps that provide useful insights into data and the association problem considered. The methods are illustrated using two sets of artificial data and one real data set, all representing typical settings in association genetics.
Bayesian credible interval construction for Poisson statistics
Institute of Scientific and Technical Information of China (English)
ZHU Yong-Sheng
2008-01-01
The construction of the Bayesian credible (confidence) interval for a Poisson observable including both signal and background, with and without systematic uncertainties, is presented. Introducing the conditional probability satisfying the requirement that the background not be larger than the observed events to construct the Bayesian credible interval is also discussed. A Fortran routine, BPOCI, has been developed to implement the calculation.
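In the background-free case with a flat prior, the posterior for the Poisson mean after observing n counts is Gamma(n+1, 1), and an equal-tailed credible interval follows by inverting its CDF. A stdlib-only sketch (BPOCI itself is a Fortran routine; this is an independent illustration, not its code):

```python
from math import exp

def gamma_cdf_int_shape(x, n):
    """CDF of Gamma(n+1, 1) at x, via the Poisson-sum identity:
    P(X <= x) = 1 - sum_{k=0}^{n} exp(-x) * x^k / k!"""
    term, s = exp(-x), exp(-x)
    for k in range(1, n + 1):
        term *= x / k
        s += term
    return 1.0 - s

def credible_interval(n, cl=0.90):
    """Equal-tailed Bayesian interval for a Poisson mean with a flat
    prior, given n observed counts (no background), found by bisection."""
    def invert(p):
        lo, hi = 0.0, 10.0 * (n + 5)
        for _ in range(200):
            mid = 0.5 * (lo + hi)
            if gamma_cdf_int_shape(mid, n) < p:
                lo = mid
            else:
                hi = mid
        return 0.5 * (lo + hi)
    a = (1.0 - cl) / 2.0
    return invert(a), invert(1.0 - a)
```

Handling background and systematics, as in the paper, amounts to replacing this simple gamma posterior with a conditioned or marginalized one; the interval-inversion step stays the same.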
Advances in Bayesian Modeling in Educational Research
Levy, Roy
2016-01-01
In this article, I provide a conceptually oriented overview of Bayesian approaches to statistical inference and contrast them with frequentist approaches that currently dominate conventional practice in educational research. The features and advantages of Bayesian approaches are illustrated with examples spanning several statistical modeling…
Nonparametric Bayesian Modeling of Complex Networks
DEFF Research Database (Denmark)
Schmidt, Mikkel Nørgaard; Mørup, Morten
2013-01-01
Modeling structure in complex networks using Bayesian nonparametrics makes it possible to specify flexible model structures and infer the adequate model complexity from the observed data. This article provides a gentle introduction to nonparametric Bayesian modeling of complex networks: Using...... for complex networks can be derived and point out relevant literature....
Modeling Diagnostic Assessments with Bayesian Networks
Almond, Russell G.; DiBello, Louis V.; Moulder, Brad; Zapata-Rivera, Juan-Diego
2007-01-01
This paper defines Bayesian network models and examines their applications to IRT-based cognitive diagnostic modeling. These models are especially suited to building inference engines designed to be synchronous with the finer grained student models that arise in skills diagnostic assessment. Aspects of the theory and use of Bayesian network models…
Using Bayesian Networks to Improve Knowledge Assessment
Millan, Eva; Descalco, Luis; Castillo, Gladys; Oliveira, Paula; Diogo, Sandra
2013-01-01
In this paper, we describe the integration and evaluation of an existing generic Bayesian student model (GBSM) into an existing computerized testing system within the Mathematics Education Project (PmatE--Projecto Matematica Ensino) of the University of Aveiro. This generic Bayesian student model had been previously evaluated with simulated…
The Bayesian Revolution Approaches Psychological Development
Shultz, Thomas R.
2007-01-01
This commentary reviews five articles that apply Bayesian ideas to psychological development, some with psychology experiments, some with computational modeling, and some with both experiments and modeling. The reviewed work extends the current Bayesian revolution into tasks often studied in children, such as causal learning and word learning, and…
Bayesian Network for multiple hypothesis tracking
W.P. Zajdel; B.J.A. Kröse
2002-01-01
For flexible camera-to-camera tracking of multiple objects, we model the objects' behavior with a Bayesian network and combine it with the multiple hypothesis framework that associates observations with objects. Bayesian networks offer a possibility to factor complex joint distributions into a product…
Bayesian Monitoring of Emerging Infectious Diseases.
Directory of Open Access Journals (Sweden)
Pavel Polyakov
We define data analyses to monitor a change in R, the average number of secondary cases caused by a typical infected individual. The input dataset consists of incident cases partitioned into outbreaks, each initiated from a single index case. We split the input dataset into two successive subsets, to evaluate two successive R values, according to the Bayesian paradigm. We used the Bayes factor between the model with two different R values and that with a single R value to justify that the change in R is statistically significant. We validated our approach using simulated data generated using known R. In particular, we found that claiming two distinct R values may depend significantly on the number of outbreaks. We then reanalyzed data previously studied by Jansen et al. [Jansen et al., Science 301 (5634), 804], concerning the effective reproduction number for measles in the UK during 1995-2002. Our analyses showed that the 1995-2002 dataset should be divided into two separate subsets for the periods 1995-1998 and 1999-2002. In contrast, Jansen et al. took this splitting point as an input to their analysis. Our estimated effective reproduction numbers R are in good agreement with those found by Jansen et al. In conclusion, our methodology for detecting temporal changes in R using outbreak-size data worked satisfactorily with both simulated and real-world data. The methodology may be used for updating R in real time, as surveillance outbreak data become available.
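A closed-form version of the split-vs-no-split Bayes factor can be sketched by treating per-outbreak counts as Poisson with a conjugate Gamma prior on R, so each model's marginal likelihood is analytic. This is a simplification of the paper's outbreak-size model (which works with whole outbreak sizes, not i.i.d. Poisson counts), and the Gamma(1, 1) priors are assumptions:

```python
from math import lgamma, log

def log_marglik_poisson_gamma(counts, a=1.0, b=1.0):
    """Log marginal likelihood of i.i.d. Poisson counts with a shared
    rate R ~ Gamma(a, b) (rate parameterization) integrated out:
    m = b^a/Gamma(a) * Gamma(a+s)/(b+n)^(a+s) / prod(x_i!)."""
    n, s = len(counts), sum(counts)
    return (a * log(b) - lgamma(a)
            + lgamma(a + s) - (a + s) * log(b + n)
            - sum(lgamma(x + 1) for x in counts))

def log_bayes_factor_split(counts, k):
    """log Bayes factor of 'two R values (change after index k)'
    versus 'a single R value for the whole series'."""
    return (log_marglik_poisson_gamma(counts[:k])
            + log_marglik_poisson_gamma(counts[k:])
            - log_marglik_poisson_gamma(counts))
```

A positive log Bayes factor favors the change-point model; for homogeneous data the single-R model wins automatically, which is the built-in Occam penalty the abstract relies on.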
A Fault Diagnosis Methodology for Gear Pump Based on EEMD and Bayesian Network.
Liu, Zengkai; Liu, Yonghong; Shan, Hongkai; Cai, Baoping; Huang, Qing
2015-01-01
This paper proposes a fault diagnosis methodology for a gear pump based on the ensemble empirical mode decomposition (EEMD) method and the Bayesian network. Essentially, the presented scheme is a multi-source information fusion-based methodology. Compared with conventional fault diagnosis using only EEMD, the proposed method is able to take advantage of all useful information besides sensor signals. The presented diagnostic Bayesian network consists of a fault layer, a fault feature layer and a multi-source information layer. Vibration signals from sensor measurement are decomposed by the EEMD method and the energies of intrinsic mode functions (IMFs) are calculated as fault features. These features are added into the fault feature layer in the Bayesian network. The other sources of useful information are added to the information layer. The generalized three-layer Bayesian network can be developed by fully incorporating faults and fault symptoms as well as other useful information such as naked-eye inspection and maintenance records. Therefore, diagnostic accuracy and capacity can be improved. The proposed methodology is applied to the fault diagnosis of a gear pump and the structure and parameters of the Bayesian network are established. Compared with artificial neural network and support vector machine classification algorithms, the proposed model has the best diagnostic performance when only sensor data is used. A case study has demonstrated that some information from human observation or system repair records is very helpful to the fault diagnosis. It is effective and efficient in diagnosing faults based on uncertain, incomplete information.
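The fault/feature/information layering can be illustrated with a toy two-cause network and inference by enumeration; the conditional-probability numbers below are hypothetical, not the gear-pump parameters from the study:

```python
from itertools import product

# Hypothetical CPTs for a toy gear-pump network (illustrative numbers only).
P_wear = {True: 0.10, False: 0.90}    # prior: gear wear fault
P_leak = {True: 0.05, False: 0.95}    # prior: pump leak fault
# P(high IMF energy feature | wear, leak)
P_feat = {(True, True): 0.99, (True, False): 0.90,
          (False, True): 0.70, (False, False): 0.05}

def posterior_wear(feature_high=True):
    """P(wear | feature) by enumerating out the hidden 'leak' variable."""
    def joint(w, l):
        p = P_feat[(w, l)] if feature_high else 1.0 - P_feat[(w, l)]
        return P_wear[w] * P_leak[l] * p
    num = sum(joint(True, l) for l in (True, False))
    den = sum(joint(w, l) for w, l in product((True, False), repeat=2))
    return num / den
```

Extra evidence nodes (maintenance records, visual inspection) would simply add more factors to `joint`, which is how the three-layer network in the paper fuses non-sensor information.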
BPDA - A Bayesian peptide detection algorithm for mass spectrometry
Directory of Open Access Journals (Sweden)
Braga-Neto Ulisses
2010-09-01
Abstract Background Mass spectrometry (MS) is an essential analytical tool in proteomics. Many existing algorithms for peptide detection are based on isotope template matching and usually work at different charge states separately, making them ineffective at detecting overlapping peptides and low-abundance peptides. Results We present BPDA, a Bayesian approach for peptide detection in data produced by MS instruments with high enough resolution to baseline-resolve isotopic peaks, such as MALDI-TOF and LC-MS. We model the spectra as a mixture of candidate peptide signals, and the model is parameterized by MS physical properties. BPDA is based on a rigorous statistical framework and avoids problems, such as voting and ad-hoc thresholding, generally encountered in algorithms based on template matching. It systematically evaluates all possible combinations of possible peptide candidates to interpret a given spectrum, and iteratively finds the best-fitting peptide signal in order to minimize the mean squared error of the inferred spectrum to the observed spectrum. In contrast to previous detection methods, BPDA performs deisotoping and deconvolution of mass spectra simultaneously, which enables better identification of weak peptide signals and produces higher sensitivities and more robust results. Unlike template-matching algorithms, BPDA can handle complex data where features overlap. Our experimental results indicate that BPDA performs well on simulated data and real MS data sets, for various resolutions and signal-to-noise ratios, and compares very favorably with commonly used commercial and open-source software, such as flexAnalysis, OpenMS, and Decon2LS, according to sensitivity and detection accuracy. Conclusion Unlike previous detection methods, which only employ isotopic distributions and work at each single charge state alone, BPDA takes into account the charge state distribution as well, thus lending information to better identify weak peptide
Hepatitis disease detection using Bayesian theory
Maseleno, Andino; Hidayati, Rohmah Zahroh
2017-02-01
This paper presents hepatitis disease diagnosis using Bayesian theory, with the aim of giving a better understanding of the theory. In this research, we used Bayesian theory to detect hepatitis disease and to display the result of the diagnosis process. The Bayesian algorithm, rediscovered and perfected by Laplace, rests on a basic idea: using the known prior probability and the conditional probability density, apply Bayes' theorem to calculate the corresponding posterior probability, and then use that posterior probability to draw inferences and make decisions. Bayesian methods combine existing knowledge, the prior probabilities, with additional knowledge derived from new data, the likelihood function. The initial symptoms of hepatitis include malaise, fever, and headache, so we compute the probability of hepatitis given the presence of malaise, fever, and headache. The results revealed that Bayesian theory successfully identified the existence of hepatitis disease.
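The posterior computation the abstract describes can be written out in a few lines. All probabilities below are made-up placeholders for illustration, not clinical data from the paper, and the three symptoms are assumed conditionally independent (a naive-Bayes simplification).

```python
# Hypothetical illustration of Bayes' theorem for a binary diagnosis.
# All numbers are placeholders, not clinical data.

def posterior(prior, lik_disease, lik_healthy):
    """P(disease | evidence) via Bayes' theorem for a binary hypothesis."""
    evidence = lik_disease * prior + lik_healthy * (1.0 - prior)
    return lik_disease * prior / evidence

# Naive-Bayes combination of the three symptoms (assumed independent).
p_hep = 0.01                      # prior P(hepatitis), placeholder
p_sym_hep = 0.8 * 0.7 * 0.6       # P(malaise, fever, headache | hepatitis)
p_sym_not = 0.1 * 0.05 * 0.2      # P(symptoms | no hepatitis)
print(round(posterior(p_hep, p_sym_hep, p_sym_not), 4))  # → 0.7724
```

Even with a 1% prior, observing all three symptoms pushes the posterior above 77% under these placeholder likelihoods, which is the prior-to-posterior update the abstract refers to.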
2nd Bayesian Young Statisticians Meeting
Bitto, Angela; Kastner, Gregor; Posekany, Alexandra
2015-01-01
The Second Bayesian Young Statisticians Meeting (BAYSM 2014) and the research presented here facilitate connections among researchers using Bayesian Statistics by providing a forum for the development and exchange of ideas. WU Vienna University of Business and Economics hosted BAYSM 2014 from September 18th to 19th. The guidance of renowned plenary lecturers and senior discussants is a critical part of the meeting and this volume, which follows publication of contributions from BAYSM 2013. The meeting's scientific program reflected the variety of fields in which Bayesian methods are currently employed or could be introduced in the future. Three brilliant keynote lectures by Chris Holmes (University of Oxford), Christian Robert (Université Paris-Dauphine), and Mike West (Duke University), were complemented by 24 plenary talks covering the major topics Dynamic Models, Applications, Bayesian Nonparametrics, Biostatistics, Bayesian Methods in Economics, and Models and Methods, as well as a lively poster session ...
Skarstrom, C.
1959-03-10
A centrifugal separator is described for separating gaseous mixtures, in which the temperature gradients both longitudinally and radially of the centrifuge may be controlled effectively to produce a maximum separation of the process gases flowing through. The invention provides for the balancing of increases and decreases in temperature in various zones of the centrifuge chamber resulting from compressions and expansions, respectively, of process gases, and may be employed effectively both to neutralize harmful temperature gradients and to utilize beneficial temperature gradients within the centrifuge.
Bayesian biclustering for patient stratification
Khakabimamaghani, Sahand; Ester, Martin
2016-01-01
The move from Empirical Medicine towards Personalized Medicine has attracted attention to Stratified Medicine (SM). Some methods for patient stratification, the central task of SM, are provided in the literature; however, significant open issues remain. First, it is still unclear whether integrating different datatypes will help in detecting disease subtypes more accurately and, if not, which datatype(s) are most useful for this task. Second, it is not clear how different methods of patient stratification can be compared. Third, as most of the proposed stratification methods are deterministic, there is a need to investigate the potential benefits of probabilistic methods. To address these issues, we introduce a novel integrative Bayesian biclustering method, called B2PS, for patient stratification, and we propose methods for evaluating the results. Our experimental results demonstrate the superiority of B2PS over a popular state-of-the-art method and the benefits of Bayesian approaches. Our results agree with the intuition that transcriptomic data forms a better basis for patient stratification than genomic data.
Directory of Open Access Journals (Sweden)
Shu Wing Ho
2011-12-01
The valuation of options and many other derivative instruments requires an estimation of ex-ante, or forward-looking, volatility. This paper adopts a Bayesian approach to estimate stock price volatility. We find evidence that, overall, Bayesian volatility estimates more closely approximate the implied volatility of stocks derived from traded call and put option prices than do historical volatility estimates sourced from IVolatility.com (“IVolatility”). Our evidence suggests that use of the Bayesian approach to estimate volatility can provide a more accurate measure of ex-ante stock price volatility and will be useful in the pricing of derivative securities where the implied stock price volatility cannot be observed.
Chiu, R.; Volkamer, R. M.; Blumenstock, T.; Hase, F.; Hannigan, J. W.; Kille, N.; Frey, M.; Kumar Sha, M.; Orphal, J.
2015-12-01
Methane sources in the Colorado Front Range include biogenic sources from cattle feedlots and natural gas operations. Although numerous studies have measured methane emissions, there remains significant uncertainty regarding the relative contributions of these various methane emission sources. Here we present data from a March 2015 field campaign that deployed two Bruker EM27 Sun Fourier Transform Spectrometers (FTS) and the University of Colorado Solar Occultation Flux (CU-SOF) FTS in Eaton, Colorado; the former were used to measure enhancements in the methane vertical column densities (VCD), while the latter was used to measure ethane and ammonia VCDs. A third EM27 FTS was deployed to a background site in Westminster, Colorado which was far removed from cattle and petroleum operations. Northerly winds make possible the determination of methane VCD column enhancement from Westminster to Eaton. All instruments were compared during several background days at the National Center for Atmospheric Research (NCAR) in Boulder, Colorado. This presentation explores the potential of methane source attribution using ammonia as a tracer for feedlot emissions and ethane as a tracer for petroleum emissions.
Quantum Bayesianism at the Perimeter
Fuchs, Christopher A
2010-01-01
The author summarizes the Quantum Bayesian viewpoint of quantum mechanics, developed originally by C. M. Caves, R. Schack, and himself. It is a view crucially dependent upon the tools of quantum information theory. Work at the Perimeter Institute for Theoretical Physics continues the development and is focused on the hard technical problem of finding a good representation of quantum mechanics purely in terms of probabilities, without amplitudes or Hilbert-space operators. The best candidate representation involves a mysterious entity called a symmetric informationally complete quantum measurement. Contemplation of it gives a way of thinking of the Born Rule as an addition to the rules of probability theory, applicable when one gambles on the consequences of interactions with physical systems. The article ends by outlining some directions for future work.
On Bayesian System Reliability Analysis
Energy Technology Data Exchange (ETDEWEB)
Soerensen Ringi, M.
1995-05-01
The view taken in this thesis is that reliability, the probability that a system will perform a required function for a stated period of time, depends on a person's state of knowledge. Reliability changes as this state of knowledge changes, i.e. when new relevant information becomes available. Most existing models for system reliability prediction are developed in a classical framework of probability theory, and they overlook some information that is always present. Probability is just an analytical tool to handle uncertainty, based on judgement and subjective opinions. It is argued that the Bayesian approach gives a much more comprehensive understanding of the foundations of probability than the so-called frequentist school. A new model for system reliability prediction is given in two papers. The model incorporates the fact that component failures are dependent because of a shared operational environment. The suggested model also naturally permits learning from failure data of similar components in non-identical environments. 85 refs.
Hedging Strategies for Bayesian Optimization
Brochu, Eric; de Freitas, Nando
2010-01-01
Bayesian optimization with Gaussian processes has become an increasingly popular tool in the machine learning community. It is efficient and can be used when very little is known about the objective function, making it popular in expensive black-box optimization scenarios. It is able to do this by sampling the objective using an acquisition function which incorporates the model's estimate of the objective and the uncertainty at any given point. However, there are several different parameterized acquisition functions in the literature, and it is often unclear which one to use. Instead of using a single acquisition function, we adopt a portfolio of acquisition functions governed by an online multi-armed bandit strategy. We describe the method, which we call GP-Hedge, and show that this method almost always outperforms the best individual acquisition function.
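The Hedge step the abstract describes, choosing among competing acquisition functions with probability proportional to their accumulated gains, can be sketched on a toy problem. This is a simplified illustration under stated assumptions, not the GP-Hedge implementation: the three "acquisition" strategies are plain heuristics standing in for GP-based EI/PI/UCB, and the true objective stands in for the GP posterior mean when rewarding each arm.

```python
import math
import random

random.seed(0)

def objective(x):
    """Toy black-box to maximize on [0, 1]; placeholder, maximum at x = 0.3."""
    return -(x - 0.3) ** 2

# Toy stand-ins for parameterized acquisition functions.
def acq_random(history):                      # pure exploration
    return random.uniform(0.0, 1.0)

def acq_exploit(history):                     # perturb the incumbent best
    best_x, _ = max(history, key=lambda h: h[1])
    return min(1.0, max(0.0, best_x + random.gauss(0.0, 0.05)))

def acq_explore(history):                     # bisect the largest unexplored gap
    xs = sorted(h[0] for h in history)
    width, left = max((b - a, a) for a, b in zip(xs, xs[1:]))
    return left + width / 2.0

acqs = [acq_random, acq_exploit, acq_explore]
gains = [0.0] * len(acqs)
eta = 1.0
history = [(0.0, objective(0.0)), (1.0, objective(1.0))]

for _ in range(30):
    proposals = [a(history) for a in acqs]
    # Hedge step: select one arm with probability proportional to exp(eta * gain).
    weights = [math.exp(eta * g) for g in gains]
    idx = random.choices(range(len(acqs)), weights=weights)[0]
    x = proposals[idx]
    history.append((x, objective(x)))
    # GP-Hedge rewards every arm at its own proposal; the true objective is
    # used here as a stand-in for the surrogate's posterior mean.
    for i, p in enumerate(proposals):
        gains[i] += objective(p)

best = max(history, key=lambda h: h[1])
print(round(best[0], 2))
```

The key design point carries over from the paper: no single arm is trusted a priori, and the portfolio shifts probability mass toward whichever acquisition strategy has recently proposed high-value points.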
Nonparametric Bayesian inference in biostatistics
Müller, Peter
2015-01-01
Nonparametric Bayesian approaches (BNP) play an ever-expanding role in biostatistical inference, from use in proteomics to clinical trials. As chapters in this book demonstrate, BNP has important uses in the clinical sciences and in inference for issues like unknown partitions in genomics. Many research problems involve an abundance of data and require flexible and complex probability models beyond the traditional parametric approaches. As this book's expert contributors show, BNP approaches can be the answer. Survival analysis, in particular survival regression, has traditionally used BNP, but BNP's potential is now very broad. This applies to important tasks like the arrangement of patients into clinically meaningful subpopulations and segmenting the genome into functionally distinct regions. This book is designed both to review and to introduce application areas for BNP. While existing books provide theoretical foundations, this book connects theory to practice through engaging examples and research questions. Chapters c...
Bayesian Inference with Optimal Maps
Moselhy, Tarek A El
2011-01-01
We present a new approach to Bayesian inference that entirely avoids Markov chain simulation, by constructing a map that pushes forward the prior measure to the posterior measure. Existence and uniqueness of a suitable measure-preserving map is established by formulating the problem in the context of optimal transport theory. We discuss various means of explicitly parameterizing the map and computing it efficiently through solution of an optimization problem, exploiting gradient information from the forward model when possible. The resulting algorithm overcomes many of the computational bottlenecks associated with Markov chain Monte Carlo. Advantages of a map-based representation of the posterior include analytical expressions for posterior moments and the ability to generate arbitrary numbers of independent posterior samples without additional likelihood evaluations or forward solves. The optimization approach also provides clear convergence criteria for posterior approximation and facilitates model selectio...
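In one dimension, the measure-preserving map the abstract refers to has a particularly simple form: the monotone map pushing the prior forward to the posterior is the posterior quantile function composed with the prior CDF. The sketch below illustrates this empirically with two placeholder Gaussians (for which the exact map is affine, T(x) = mu + sigma·x); it is a conceptual aid, not the optimization-based construction of the paper.

```python
import random

random.seed(1)

# Empirical 1-D transport map via quantile matching: the i-th prior order
# statistic is sent to the i-th posterior order statistic. Prior N(0, 1)
# and posterior N(2, 0.5^2) are placeholder choices.
n = 20000
prior = sorted(random.gauss(0.0, 1.0) for _ in range(n))
posterior = sorted(random.gauss(2.0, 0.5) for _ in range(n))

def transport(x):
    """Monotone map T with T#(prior) ~= posterior, via rank matching."""
    lo, hi = 0, n
    while lo < hi:                 # binary search for the rank of x
        mid = (lo + hi) // 2
        if prior[mid] < x:
            lo = mid + 1
        else:
            hi = mid
    return posterior[min(lo, n - 1)]

# Pushing prior samples through T should reproduce the posterior moments,
# which is the "independent posterior samples for free" property above.
mapped = [transport(x) for x in prior]
mean = sum(mapped) / n
var = sum((m - mean) ** 2 for m in mapped) / n
print(round(mean, 1))
```

Once such a map is in hand, every fresh prior draw yields an independent posterior draw through a single map evaluation, with no additional likelihood evaluations, which is the computational advantage the abstract emphasizes.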