Bayesian Source Separation and Localization
Knuth, K H
1998-01-01
The problem of mixed signals occurs in many different contexts, one of the most familiar being acoustics. The forward problem in acoustics consists of finding the sound pressure levels at various detectors resulting from sound signals emanating from the active acoustic sources. The inverse problem consists of using the sound recorded by the detectors to separate the signals and recover the original source waveforms. In general, the inverse problem is unsolvable without additional information. This general problem is called source separation, and several techniques have been developed that utilize maximum entropy, minimum mutual information, and maximum likelihood. In previous work, it has been demonstrated that these techniques can be recast in a Bayesian framework. This paper demonstrates the power of the Bayesian approach, which provides a natural means for incorporating prior information into a source model. An algorithm is developed that utilizes information regarding both the statistics of the amplitudes...
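The forward/inverse structure described here can be made concrete with a toy instantaneous linear mixing model (a sketch, not the paper's algorithm; the mixing matrix, sources, and noise level are all invented). With the mixing matrix known, the inverse problem is well posed; without it, extra information is needed:

```python
import numpy as np

# Toy instantaneous linear mixing: x(t) = A s(t) + noise.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 500)
s = np.vstack([np.sin(2 * np.pi * 5 * t),            # source 1: tone
               np.sign(np.sin(2 * np.pi * 3 * t))])  # source 2: square wave
A = np.array([[1.0, 0.6],
              [0.4, 1.0]])                           # hypothetical mixing matrix
x = A @ s + 0.01 * rng.standard_normal((2, t.size))  # detector recordings

# Forward problem: computing x from s is trivial.  The inverse problem is
# ill-posed when A is unknown; given A (the extra information), it is not:
s_hat = np.linalg.solve(A, x)
err = np.max(np.abs(s_hat - s))                      # small: noise level only
```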
Informed Source Separation: A Bayesian Tutorial
Knuth, Kevin
2013-01-01
Source separation problems are ubiquitous in the physical sciences; any situation where signals are superimposed calls for source separation to estimate the original signals. In this tutorial I will discuss the Bayesian approach to the source separation problem. This approach has a specific advantage in that it requires the designer to explicitly describe the signal model in addition to any other information or assumptions that go into the problem description. This leads naturally to the idea...
Low Complexity Bayesian Single Channel Source Separation
DEFF Research Database (Denmark)
Beierholm, Thomas; Pedersen, Brian Dam; Winther, Ole
2004-01-01
We propose a simple Bayesian model for performing single channel speech separation using factorized source priors in a sliding window linearly transformed domain. Using a one dimensional mixture of Gaussians to model each band source leads to fast tractable inference for the source signals...... can be estimated quite precisely using ML-II, but the estimation is quite sensitive to the accuracy of the priors as opposed to the source separation quality for known mixing coefficients, which is quite insensitive to the accuracy of the priors. Finally, we discuss how to improve our approach while...
Bayesian Blind Source Separation of Positive Non Stationary Sources
Ichir, Mahieddine M.; Mohammad-Djafari, Ali
2004-11-01
In this contribution, we address the problem of blind non-negative source separation, which finds application in many fields of data analysis. We propose a novel approach based on Gamma mixture probability priors: Gamma densities constrain the unobserved sources to lie on the positive half plane, and a mixture density with a first-order Markov model on the associated hidden variables accounts for possible non-stationarity of the sources. Posterior mean estimates are obtained via appropriate Markov chain Monte Carlo sampling.
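How a prior restricted to positive values enforces non-negativity in posterior sampling can be sketched with a one-parameter toy problem (all numbers invented; the paper's mixture and hidden-Markov structure are omitted, and a plain random-walk Metropolis sampler stands in for its MCMC scheme):

```python
import numpy as np

rng = np.random.default_rng(1)
# A positive source amplitude s observed through Gaussian noise,
# with a Gamma prior enforcing s > 0.
s_true, sigma = 2.0, 0.5
y = s_true + sigma * rng.standard_normal(200)    # observations

def log_post(s, a=2.0, b=1.0):                   # Gamma(a, b) prior (assumed shape)
    if s <= 0:
        return -np.inf                           # positivity constraint
    return (a - 1) * np.log(s) - b * s - np.sum((y - s) ** 2) / (2 * sigma ** 2)

# Random-walk Metropolis: a minimal stand-in for the paper's sampler
chain, s = [], 1.0
lp = log_post(s)
for _ in range(5000):
    prop = s + 0.1 * rng.standard_normal()
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        s, lp = prop, lp_prop
    chain.append(s)
post_mean = np.mean(chain[1000:])                # posterior mean after burn-in
```

Proposals outside the positive half-line are rejected outright, so every sample in the chain respects the constraint.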
Convergent Bayesian formulations of blind source separation and electromagnetic source estimation
Knuth, Kevin H
2015-01-01
We consider two areas of research that have been developing in parallel over the last decade: blind source separation (BSS) and electromagnetic source estimation (ESE). BSS deals with the recovery of source signals when only mixtures of signals can be obtained from an array of detectors and the only prior knowledge consists of some information about the nature of the source signals. On the other hand, ESE utilizes knowledge of the electromagnetic forward problem to assign source signals to their respective generators, while information about the signals themselves is typically ignored. We demonstrate that these two techniques can be derived from the same starting point using the Bayesian formalism. This suggests a means by which new algorithms can be developed that utilize as much relevant information as possible. We also briefly mention some preliminary work that supports the value of integrating information used by these two techniques and review the kinds of information that may be useful in addressing the...
Bayesian Source Separation Applied to Identifying Complex Organic Molecules in Space
Knuth, Kevin H; Choinsky, Joshua; Maunu, Haley A; Carbon, Duane F
2014-01-01
Emission from a class of benzene-based molecules known as Polycyclic Aromatic Hydrocarbons (PAHs) dominates the infrared spectrum of star-forming regions. The observed emission appears to arise from the combined emission of numerous PAH species, each with its unique spectrum. Linear superposition of the PAH spectra identifies this problem as a source separation problem. It belongs, however, to a formidable class of source separation problems, given that the different PAH sources potentially number in the hundreds, even thousands, and there is only one measured spectral signal for a given astrophysical site. Fortunately, the source spectra of the PAHs are known, but the signal is also contaminated by other spectral sources. We describe our ongoing work in developing Bayesian source separation techniques relying on nested sampling in conjunction with an ON/OFF mechanism enabling simultaneous estimation of the probability that a particular PAH species is present and its contribution to the spectrum.
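The ON/OFF idea can be illustrated on a toy spectrum with three invented Gaussian emission bands standing in for known PAH templates. Brute-force enumeration over ON/OFF indicators with a crude Occam penalty replaces the paper's nested sampling, which is needed once the species count is large:

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(2)
nu = np.linspace(0.0, 1.0, 120)                  # toy "wavenumber" grid
def band(center, width=0.03):                    # invented Gaussian emission band
    return np.exp(-0.5 * ((nu - center) / width) ** 2)
templates = np.vstack([band(0.2), band(0.5), band(0.8)])  # known source spectra

# Truth: species 1 and 3 are ON with amplitudes 2.0 and 1.0
y = 2.0 * templates[0] + 1.0 * templates[2] + 0.05 * rng.standard_normal(nu.size)

best_score, best_z = -np.inf, None
for z in product([0, 1], repeat=3):              # exhaustive ON/OFF search
    active = templates[np.array(z, dtype=bool)]
    if len(active):
        amps, *_ = np.linalg.lstsq(active.T, y, rcond=None)
        resid = y - amps @ active
    else:
        resid = y
    # Gaussian log-likelihood plus a crude Occam penalty per active species
    score = -np.sum(resid ** 2) / (2 * 0.05 ** 2) - 5.0 * sum(z)
    if score > best_score:
        best_score, best_z = score, z
```

The penalty plays the role of the prior odds against inclusion: adding a spurious species improves the fit only marginally, so the search recovers the true ON/OFF pattern.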
Convergent Bayesian formulations of blind source separation and electromagnetic source estimation
Knuth, Kevin H.; Vaughan Jr, Herbert G.
2015-01-01
We consider two areas of research that have been developing in parallel over the last decade: blind source separation (BSS) and electromagnetic source estimation (ESE). BSS deals with the recovery of source signals when only mixtures of signals can be obtained from an array of detectors and the only prior knowledge consists of some information about the nature of the source signals. On the other hand, ESE utilizes knowledge of the electromagnetic forward problem to assign source signals to th...
Bayesian kinematic earthquake source models
Minson, S. E.; Simons, M.; Beck, J. L.; Genrich, J. F.; Galetzka, J. E.; Chowdhury, F.; Owen, S. E.; Webb, F.; Comte, D.; Glass, B.; Leiva, C.; Ortega, F. H.
2009-12-01
Most coseismic, postseismic, and interseismic slip models are based on highly regularized optimizations which yield one solution which satisfies the data given a particular set of regularizing constraints. This regularization hampers our ability to answer basic questions such as whether seismic and aseismic slip overlap or instead rupture separate portions of the fault zone. We present a Bayesian methodology for generating kinematic earthquake source models with a focus on large subduction zone earthquakes. Unlike classical optimization approaches, Bayesian techniques sample the ensemble of all acceptable models presented as an a posteriori probability density function (PDF), and thus we can explore the entire solution space to determine, for example, which model parameters are well determined and which are not, or what is the likelihood that two slip distributions overlap in space. Bayesian sampling also has the advantage that all a priori knowledge of the source process can be used to mold the a posteriori ensemble of models. Although very powerful, Bayesian methods have up to now been of limited use in geophysical modeling because they are only computationally feasible for problems with a small number of free parameters due to what is called the "curse of dimensionality." However, our methodology can successfully sample solution spaces of many hundreds of parameters, which is sufficient to produce finite fault kinematic earthquake models. Our algorithm is a modification of the tempered Markov chain Monte Carlo (tempered MCMC or TMCMC) method. In our algorithm, we sample a "tempered" a posteriori PDF using many MCMC simulations running in parallel and evolutionary computation in which models which fit the data poorly are preferentially eliminated in favor of models which better predict the data. We present results for both synthetic test problems as well as for the 2007 Mw 7.8 Tocopilla, Chile earthquake, the latter of which is constrained by InSAR, local high
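The role of tempering can be sketched in one dimension (a schematic of the idea only, not the TMCMC algorithm): raising the posterior to a power β < 1 flattens the barriers between modes so chains can cross, and β is then marched back to 1.

```python
import numpy as np

# Bimodal toy posterior: two well-separated modes that an MCMC chain
# would struggle to hop between at beta = 1.
theta = np.linspace(-4.0, 4.0, 801)
log_post = np.logaddexp(-0.5 * (theta - 2.0) ** 2 / 0.1,
                        -0.5 * (theta + 2.0) ** 2 / 0.1)

barriers = []
for beta in (0.01, 0.1, 1.0):                    # schematic tempering ladder
    p = np.exp(beta * (log_post - log_post.max()))
    # relative probability density at the barrier between the modes
    barriers.append(p[np.argmin(np.abs(theta))] / p.max())
# barriers shrinks as beta -> 1: small beta lets chains move freely,
# and models that fit poorly are culled as beta increases
```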
Single channel signal component separation using Bayesian estimation
Institute of Scientific and Technical Information of China (English)
Cai Quanwei; Wei Ping; Xiao Xianci
2007-01-01
A Bayesian estimation method to separate multicomponent signals from a single-channel observation is presented in this paper. By using basis function projection, component separation becomes a problem of finite-dimensional parameter estimation. A Bayesian model for estimating the parameters is then set up, and the reversible jump MCMC (Markov chain Monte Carlo) algorithm is adopted to perform the Bayesian computation. The method can jointly estimate the parameters of each component and the number of components. Simulation results demonstrate that the method has a low SNR threshold and good performance.
Informed source separation: source coding meets source separation
Ozerov, Alexey; Liutkus, Antoine; Badeau, Roland; Richard, Gaël
2011-01-01
We consider the informed source separation (ISS) problem where, given the sources and the mixtures, any kind of side-information can be computed during a so-called encoding stage. This side-information is then used to assist source separation, given the mixtures only, at the so-called decoding stage. State of the art ISS approaches do not really consider ISS as a coding problem and rely on some purely source separation-inspired strategies, leading to performances that can at best reach those ...
A Bayesian approach to earthquake source studies
Minson, Sarah
Bayesian sampling has several advantages over conventional optimization approaches to solving inverse problems. It produces the distribution of all possible models sampled proportionally to how much each model is consistent with the data and the specified prior information, and thus images the entire solution space, revealing the uncertainties and trade-offs in the model. Bayesian sampling is applicable to both linear and non-linear modeling, and the values of the model parameters being sampled can be constrained based on the physics of the process being studied and do not have to be regularized. However, these methods are computationally challenging for high-dimensional problems. Until now the computational expense of Bayesian sampling has been too great for it to be practicable for most geophysical problems. I present a new parallel sampling algorithm called CATMIP for Cascading Adaptive Tempered Metropolis In Parallel. This technique, based on Transitional Markov chain Monte Carlo, makes it possible to sample distributions in many hundreds of dimensions, if the forward model is fast, or to sample computationally expensive forward models in smaller numbers of dimensions. The design of the algorithm is independent of the model being sampled, so CATMIP can be applied to many areas of research. I use CATMIP to produce a finite fault source model for the 2007 Mw 7.7 Tocopilla, Chile earthquake. Surface displacements from the earthquake were recorded by six interferograms and twelve local high-rate GPS stations. Because of the wealth of near-fault data, the source process is well-constrained. I find that the near-field high-rate GPS data have significant resolving power above and beyond the slip distribution determined from static displacements. The location and magnitude of the maximum displacement are resolved. The rupture almost certainly propagated at sub-shear velocities. The full posterior distribution can be used not only to calculate source parameters but also
Bayesian Kinematic Finite Fault Source Models (Invited)
Minson, S. E.; Simons, M.; Beck, J. L.
2010-12-01
Finite fault earthquake source models are inherently under-determined: there is no unique solution to the inverse problem of determining the rupture history at depth as a function of time and space when our data are only limited observations at the Earth's surface. Traditional inverse techniques rely on model constraints and regularization to generate one model from the possibly broad space of all possible solutions. However, Bayesian methods allow us to determine the ensemble of all possible source models which are consistent with the data and our a priori assumptions about the physics of the earthquake source. Until now, Bayesian techniques have been of limited utility because they are computationally intractable for problems with as many free parameters as kinematic finite fault models. We have developed a methodology called Cascading Adaptive Tempered Metropolis In Parallel (CATMIP) which allows us to sample very high-dimensional problems in a parallel computing framework. The CATMIP algorithm combines elements of simulated annealing and genetic algorithms with the Metropolis algorithm to dynamically optimize the algorithm's efficiency as it runs. We will present synthetic performance tests of finite fault models made with this methodology as well as a kinematic source model for the 2007 Mw 7.7 Tocopilla, Chile earthquake. This earthquake was well recorded by multiple ascending and descending interferograms and a network of high-rate GPS stations whose records can be used as near-field seismograms.
A Bayesian method for microseismic source inversion
Pugh, D. J.; White, R. S.; Christie, P. A. F.
2016-08-01
Earthquake source inversion is highly dependent on location determination and velocity models. Uncertainties in both the model parameters and the observations need to be rigorously incorporated into an inversion approach. Here, we show a probabilistic Bayesian method that allows formal inclusion of the uncertainties in the moment tensor inversion. This method allows the combination of different sets of far-field observations, such as P-wave and S-wave polarities and amplitude ratios, into one inversion. Additional observations can be included by deriving a suitable likelihood function from the uncertainties. This inversion produces samples from the source posterior probability distribution, including a best-fitting solution for the source mechanism and associated probability. The inversion can be constrained to the double-couple space or allowed to explore the gamut of moment tensor solutions, allowing volumetric and other non-double-couple components. The posterior probability of the double-couple and full moment tensor source models can be evaluated from the Bayesian evidence, using samples from the likelihood distributions for the two source models, producing an estimate of whether or not a source is double-couple. Such an approach is ideally suited to microseismic studies where there are many sources of uncertainty and it is often difficult to produce reliability estimates of the source mechanism, although this can be true of many other cases. Using full-waveform synthetic seismograms, we also show the effects of noise, location, network distribution and velocity model uncertainty on the source probability density function. The noise has the largest effect on the results, especially as it can affect other parts of the event processing. This uncertainty can lead to erroneous non-double-couple source probability distributions, even when no other uncertainties exist. Although including amplitude ratios can improve the constraint on the source probability
Institute of Scientific and Technical Information of China (English)
岳秀廷; 李志农; 陈金刚
2012-01-01
Decomposing and representing data using independent component analysers (ICA) assumes that the whole data distribution is adequately described by one coordinate frame. However, if the observed data consist of various self-similar, non-Gaussian manifolds, enforcing a single, global representation is not appropriate and will produce a sub-optimal representation. To make up for this shortcoming of ICA in blind source separation, blind separation of mechanical fault sources based on a variational Bayesian mixture of independent component analysers is presented, building on variational Bayesian theory. Considering that the source signals come from multiple coordinate frames, the method creates a mixture model of independent component analysers over those frames to learn the observed signals and separate them. The experimental results show that the proposed method is very effective.
Bayesian Separation of Non-Stationary Mixtures of Dependent Gaus
National Aeronautics and Space Administration — In this work, we propose a novel approach to perform Dependent Component Analysis (DCA). DCA can be thought of as the separation of latent, dependent sources from...
Convolutive Blind Source Separation Methods
DEFF Research Database (Denmark)
Pedersen, Michael Syskind; Larsen, Jan; Kjems, Ulrik;
2008-01-01
During the past decades, much attention has been given to the separation of mixed sources, in particular for the blind case where both the sources and the mixing process are unknown and only recordings of the mixtures are available. In several situations it is desirable to recover all sources fro...
A localization model to localize multiple sources using Bayesian inference
Dunham, Joshua Rolv
Accurate localization of a sound source in a room setting is important in both psychoacoustics and architectural acoustics. Binaural models have been proposed to explain how the brain processes and utilizes the interaural time differences (ITDs) and interaural level differences (ILDs) of sound waves arriving at the ears of a listener to determine source location. Recent work shows that applying Bayesian methods to this problem is proving fruitful. In this thesis, pink noise samples are convolved with head-related transfer functions (HRTFs) and compared to combinations of one and two anechoic speech signals convolved with different HRTFs or binaural room impulse responses (BRIRs) to simulate room positions. Through exhaustive calculation of Bayesian posterior probabilities and a maximum-likelihood approach, model selection determines the number of sources present, and parameter estimation yields the azimuthal direction of the source(s).
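The exhaustive posterior calculation over azimuth can be sketched with a Woodworth-style ITD model on a grid (all parameters illustrative; the thesis's HRTF/BRIR convolutions and model selection over source count are omitted):

```python
import numpy as np

rng = np.random.default_rng(7)
# Woodworth-style ITD model: ITD = (r / c) * (theta + sin theta);
# head radius, noise level, and sample count are invented.
def itd(azimuth_rad, head_radius=0.0875, c=343.0):
    return head_radius / c * (azimuth_rad + np.sin(azimuth_rad))

az_true = np.deg2rad(30.0)
sigma = 2e-5                                     # ITD measurement noise (s)
obs = itd(az_true) + sigma * rng.standard_normal(25)

# Exhaustive Bayesian grid over azimuth (flat prior)
az_grid = np.deg2rad(np.linspace(-90.0, 90.0, 361))
log_post = np.array([-np.sum((obs - itd(a)) ** 2) / (2 * sigma ** 2)
                     for a in az_grid])
az_map = az_grid[np.argmax(log_post)]            # maximum a posteriori azimuth
```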
A Bayesian Approach to Detection of Small Low Emission Sources
Xun, Xiaolei; Carroll, Raymond J; Kuchment, Peter
2011-01-01
The article addresses the problem of detecting the presence and location of a small low-emission source inside an object when the background noise dominates. This problem arises, for instance, in some homeland security applications. The goal is to reach signal-to-noise ratio (SNR) levels on the order of $10^{-3}$. A Bayesian approach to this problem is implemented in 2D. The method allows inference not only about the existence of the source, but also about its location. We derive Bayes factors for model selection and estimation of location based on Markov chain Monte Carlo (MCMC) simulation. A simulation study shows that with a sufficiently high total emission level, our method can effectively locate the source.
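A Bayes factor for "source present vs absent" can be illustrated with a one-dimensional Poisson stand-in (the paper works in 2D with MCMC; here the parameter space is small enough for direct quadrature, and all rates are invented):

```python
import numpy as np

rng = np.random.default_rng(3)
# Counts from a known background plus a possible weak source
n, bkg, s_true = 2000, 10.0, 0.6
y = rng.poisson(bkg + s_true, size=n)

def loglik(rate):                                # Poisson, log y! terms dropped
    return np.sum(y * np.log(rate) - rate)

# Model M0: no source.  Model M1: source strength s ~ Uniform(0, 2].
grid = np.linspace(1e-3, 2.0, 400)
l0 = loglik(bkg)
l1 = np.array([loglik(bkg + s) for s in grid])
m = l1.max()
log_evidence1 = np.log(np.mean(np.exp(l1 - m))) + m   # marginal over the prior
log_bayes_factor = log_evidence1 - l0                 # > 0 favours "source present"
```

Averaging the likelihood over the prior builds in the Occam penalty automatically: a weak source is detected only when the improved fit outweighs the cost of the extra parameter.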
Nitrate source apportionment in a subtropical watershed using Bayesian model
International Nuclear Information System (INIS)
Nitrate (NO₃⁻) pollution in aquatic systems is a worldwide problem. The temporal distribution pattern and sources of nitrate are of great concern for water quality. The nitrogen (N) cycling processes in a subtropical watershed located in Changxing County, Zhejiang Province, China were greatly influenced by the temporal variations of precipitation and temperature during the study period (September 2011 to July 2012). The highest NO₃⁻ concentration in water occurred in May (wet season, mean ± SD = 17.45 ± 9.50 mg L⁻¹) and the lowest in December (dry season, mean ± SD = 10.54 ± 6.28 mg L⁻¹). Nevertheless, no water sample in the study area exceeded the WHO drinking water limit of 50 mg L⁻¹ NO₃⁻. Four sources of NO₃⁻ (atmospheric deposition, AD; soil N, SN; synthetic fertilizer, SF; manure and sewage, M and S) were identified using both hydrochemical characteristics [Cl⁻, NO₃⁻, HCO₃⁻, SO₄²⁻, Ca²⁺, K⁺, Mg²⁺, Na⁺, dissolved oxygen (DO)] and a dual isotope approach (δ¹⁵N–NO₃⁻ and δ¹⁸O–NO₃⁻). Both chemical and isotopic characteristics indicated that denitrification was not the main N cycling process in the study area. Using a Bayesian model (stable isotope analysis in R, SIAR), the contribution of each source was apportioned. Source apportionment results showed that source contributions differed significantly between the dry and wet seasons: AD and M and S contributed more in December than in May, whereas SN and SF contributed more NO₃⁻ to water in May than in December. M and S and SF were the major contributors in December and May, respectively. Moreover, the shortcomings and uncertainties of SIAR were discussed to provide implications for future work. With the assessment of temporal variation and sources of NO₃⁻, better agricultural management practices and sewage disposal programs can be implemented to sustain water quality in subtropical watersheds. - Highlights: • Nitrate concentration in water displayed
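The deterministic core of an isotope mixing model can be sketched with invented source signatures (SIAR adds Dirichlet priors and residual error terms on top of this mass balance; three sources and two tracers make the system exactly determined):

```python
import numpy as np

# Hypothetical tracer signatures (delta-15N, delta-18O) for three sources;
# the numbers are made up for illustration only.
sources = np.array([[ 2.0, 55.0],   # atmospheric deposition
                    [ 5.0,  3.0],   # soil N
                    [18.0,  8.0]])  # manure and sewage
f_true = np.array([0.2, 0.5, 0.3])  # true mixing fractions
mixture = f_true @ sources          # tracer values observed in the water sample

# Two tracer balances plus the sum-to-one constraint give a 3x3 linear system
A = np.vstack([sources.T, np.ones(3)])
b = np.append(mixture, 1.0)
f_hat = np.linalg.solve(A, b)       # recovered source fractions
```

With more sources than tracers the system becomes under-determined, which is exactly why SIAR samples a posterior over fractions instead of solving directly.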
Fast Bayesian optimal experimental design for seismic source inversion
Long, Quan
2015-07-01
We develop a fast method for optimally designing experiments in the context of statistical seismic source inversion. In particular, we efficiently compute the optimal number and locations of the receivers or seismographs. The seismic source is modeled by a point moment tensor multiplied by a time-dependent function. The parameters include the source location, moment tensor components, and start time and frequency in the time function. The forward problem is modeled by elastodynamic wave equations. We show that the Hessian of the cost functional, which is usually defined as the square of the weighted L
Blind source separation dependent component analysis
Xiang, Yong; Yang, Zuyuan
2015-01-01
This book provides readers with a complete and self-contained account of dependent source separation, including the latest developments in this field. The book gives an overview of blind source separation, presenting three promising blind separation techniques that can tackle mutually correlated sources. It then focuses on the non-negativity based methods, the time-frequency analysis based methods, and the pre-coding based methods, respectively.
Albert, Carlo; Ulzega, Simone; Stoop, Ruedi
2016-04-01
Parameter inference is a fundamental problem in data-driven modeling. Given observed data that is believed to be a realization of some parameterized model, the aim is to find parameter values that are able to explain the observed data. In many situations, the dominant sources of uncertainty must be included into the model for making reliable predictions. This naturally leads to stochastic models. Stochastic models render parameter inference much harder, as the aim then is to find a distribution of likely parameter values. In Bayesian statistics, which is a consistent framework for data-driven learning, this so-called posterior distribution can be used to make probabilistic predictions. We propose a novel, exact, and very efficient approach for generating posterior parameter distributions for stochastic differential equation models calibrated to measured time series. The algorithm is inspired by reinterpreting the posterior distribution as a statistical mechanics partition function of an object akin to a polymer, where the measurements are mapped on heavier beads compared to those of the simulated data. To arrive at distribution samples, we employ a Hamiltonian Monte Carlo approach combined with a multiple time-scale integration. A separation of time scales naturally arises if either the number of measurement points or the number of simulation points becomes large. Furthermore, at least for one-dimensional problems, we can decouple the harmonic modes between measurement points and solve the fastest part of their dynamics analytically. Our approach is applicable to a wide range of inference problems and is highly parallelizable.
Audio Source Separation Using a Deep Autoencoder
Jang, Giljin; Kim, Han-Gyu; Oh, Yung-Hwan
2014-01-01
This paper proposes a novel framework for unsupervised audio source separation using a deep autoencoder. The characteristics of the unknown source signals in the mixed input are automatically learned by properly configured autoencoders implemented as a network with many layers, and the sources are separated by clustering the coefficient vectors in the code layer. By investigating the weight vectors to the final target representation layer, the primitive components of the audio signals in the frequency domain are o...
Zhang, Le; Karakci, Ata; Korotkov, Andrei; Sutter, P M; Timbie, Peter T; Tucker, Gregory S; Wandelt, Benjamin D
2016-01-01
We present in this paper a new Bayesian semi-blind approach for foreground removal in observations of the 21-cm signal with interferometers. The technique, which we call HIEMICA (HI Expectation-Maximization Independent Component Analysis), is an extension of the Independent Component Analysis (ICA) technique developed for two-dimensional (2D) CMB maps to three-dimensional (3D) 21-cm cosmological signals measured by interferometers. This technique provides a fully Bayesian inference of power spectra and maps and separates the foregrounds from the signal based on the diversity of their power spectra. Relying only on the statistical independence of the components, this approach can jointly estimate the 3D power spectrum of the 21-cm signal and the 2D angular power spectrum and frequency dependence of each foreground component, without any prior assumptions about the foregrounds. This approach has been tested extensively by applying it to mock data from interferometric 21-cm intensity mapping observations. Based on ...
Transform domain steganography with blind source separation
Jouny, Ismail
2015-05-01
This paper applies blind source separation, or independent component analysis, to images that may contain mixtures of text, audio, or other images for steganography purposes. The paper focuses on separating mixtures in a transform domain such as the Fourier domain or the wavelet domain. The study addresses the effectiveness of steganography when using linear mixtures of multimedia components and the ability of standard blind source separation techniques to discern hidden multimedia messages. Mixing in the space, frequency, and wavelet (scale) domains is compared. Effectiveness is measured using the mean square error between original and recovered images.
Removal of micropollutants in source separated sanitation
Butkovskyi, A.
2015-01-01
Source separated sanitation is an innovative sanitation method designed for minimizing use of energy and clean drinking water, and maximizing reuse of water, organics and nutrients from waste water. This approach is based on separate collection and treatment of toilet wastewater (black water) and the rest of the domestic wastewater (grey water). Different characteristics of wastewater streams facilitate recovery of energy, nutrients and fresh water. To ensure agricultural or ecological reuse ...
Blind source separation theory and applications
Yu, Xianchuan; Xu, Jindong
2013-01-01
A systematic exploration of both classic and contemporary algorithms in blind source separation with practical case studies. The book presents an overview of blind source separation, a relatively new signal processing method. Due to the multidisciplinary nature of the subject, the book has been written so as to appeal to an audience from very different backgrounds. Basic mathematical skills (e.g. in matrix algebra and foundations of probability theory) are essential in order to understand the algorithms, although the book is written in an introductory, accessible style. This book offers
Blind Source Separation: the Sparsity Revolution
Bobin, J.; Starck, Jean-Luc; Moudden, Y.; Fadili, Jalal M.
2008-01-01
Over the last few years, the development of multi-channel sensors motivated interest in methods for the coherent processing of multivariate data. Some specific issues have already been addressed as testified by the wide literature on the so-called blind source separation (BSS) problem. In this context, as clearly emphasized by previous work, it is fundamental that the sources to be retrieved present some quantitatively measurable diversity. Recently, sparsity and morphological diversity have ...
Grading learning for blind source separation
Institute of Scientific and Technical Information of China (English)
张贤达; 朱孝龙; 保铮
2003-01-01
By generalizing the learning rate parameter to a learning rate matrix, this paper proposes a grading learning algorithm for blind source separation. The whole learning process is divided into three stages: initial stage, capturing stage and tracking stage. In different stages, different learning rates are used for each output component, determined by its dependency on other output components. It is shown that the grading learning algorithm is equivariant and can keep the separating matrix from becoming singular. Simulations show that the proposed algorithm can achieve faster convergence, better steady-state performance and higher numerical robustness, as compared with existing algorithms using fixed, time-descending and adaptive learning rates.
Blind source separation problem in GPS time series
Gualandi, A.; Serpelloni, E.; Belardinelli, M. E.
2016-04-01
A critical point in the analysis of ground displacement time series, such as those recorded by space geodetic techniques, is the development of data-driven methods that allow the different sources of deformation to be discerned and characterized in the space and time domains. Multivariate statistics includes several approaches that can be considered data-driven methods. A widely used technique is principal component analysis (PCA), which allows us to reduce the dimensionality of the data space while keeping most of the variance of the dataset explained. However, PCA does not perform well in finding the solution to the so-called blind source separation (BSS) problem, i.e., in recovering and separating the original sources that generate the observed data. This is mainly due to the fact that PCA minimizes the misfit calculated using an L2 norm (χ²), looking for a new Euclidean space where the projected data are uncorrelated. Independent component analysis (ICA) is a popular technique adopted to approach the BSS problem. However, the independence condition is not easy to impose, and it is often necessary to introduce some approximations. To work around this problem, we test the use of a modified variational Bayesian ICA (vbICA) method to recover the multiple sources of ground deformation even in the presence of missing data. The vbICA method models the probability density function (pdf) of each source signal using a mix of Gaussian distributions, allowing for more flexibility in the description of the pdf of the sources with respect to standard ICA, and giving a more reliable estimate of them. Here we present its application to synthetic global positioning system (GPS) position time series, generated by simulating deformation near an active fault, including inter-seismic, co-seismic, and post-seismic signals, plus seasonal signals and noise, and an additional time-dependent volcanic source. We evaluate the ability of the PCA and ICA decomposition
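The PCA step described here can be sketched on synthetic data with a single shared mode (a minimal illustration, not the vbICA method; the "station" amplitudes and noise level are invented):

```python
import numpy as np

rng = np.random.default_rng(5)
# Synthetic displacement time series for 10 "stations": one shared seasonal
# mode with station-dependent amplitude, plus noise.
t = np.arange(365)
seasonal = np.sin(2 * np.pi * t / 365.0)
X = np.outer(rng.uniform(0.5, 2.0, size=10), seasonal)
X = X + 0.05 * rng.standard_normal(X.shape)

# PCA via SVD of the centred data matrix
Xc = X - X.mean(axis=1, keepdims=True)
U, svals, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = svals ** 2 / np.sum(svals ** 2)      # variance fractions
# the first temporal component Vt[0] recovers the common-mode signal
```

When several sources overlap in space and time, these uncorrelated components generally mix the true sources, which is the motivation for moving from PCA to an ICA-family decomposition.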
Multistage decomposition algorithm for blind source separation
Institute of Scientific and Technical Information of China (English)
(no author listed)
2002-01-01
A new algorithm for blind source separation is proposed that extracts a single independent component at each stage. The single independent component is obtained by an iterative search for the optimal solution of a defined cost function, and all the independent components are recovered through systematic multistage decomposition and reconstruction. In the presence of spatially colored noise, this algorithm outperforms joint approximate diagonalization of eigen-matrices (JADE). Simulation results show that when there are more than 25 source signals, its computational complexity is lower than that of JADE.
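The stage-wise idea, extracting one independent component per stage and then deflating before the next stage, can be sketched with a generic kurtosis-based fixed-point iteration; this is not the paper's specific cost function, and the sources and mixing matrix below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
n, T = 3, 5000
# Three non-Gaussian sources (illustrative): uniform, Laplacian, square wave
S = np.vstack([rng.uniform(-1, 1, T),
               rng.laplace(0, 1, T),
               np.sign(np.sin(np.linspace(0, 60, T)))])
A = rng.standard_normal((n, n))
X = A @ S

# Whiten the mixtures
X = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(X))
Z = E @ np.diag(d ** -0.5) @ E.T @ X

W = np.zeros((n, n))
for k in range(n):                       # one component per stage
    w = rng.standard_normal(n)
    for _ in range(200):                 # fixed-point kurtosis iteration
        w = (Z * (w @ Z) ** 3).mean(axis=1) - 3 * w
        w -= W[:k].T @ (W[:k] @ w)       # deflate against earlier stages
        w /= np.linalg.norm(w)
    W[k] = w
Y = W @ Z                                 # recovered sources (up to sign/order)
```

Each stage's Gram-Schmidt step against the already-extracted directions is what makes the multistage decomposition possible without re-estimating all components jointly.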
Validi, AbdoulAhad
2013-01-01
This study introduces a non-intrusive approach, in the context of low-rank separated representations, to construct a surrogate of high-dimensional stochastic functions, e.g., PDEs/ODEs, in order to decrease the computational cost of Markov chain Monte Carlo simulations in Bayesian inference. The surrogate model is constructed via a regularized alternating least-squares regression with Tikhonov regularization, using a roughening matrix that computes the gradient of the solution, in conjunction with a perturbation-based error indicator to detect optimal model complexities. The model approximates a vector of a continuous solution at discrete values of a physical variable. The required number of random realizations to achieve a successful approximation depends linearly on the function dimensionality. The computational cost of the model construction is quadratic in the number of random inputs, which potentially tackles the curse of dimensionality in high-dimensional stochastic functions. Furthermore, this vector-valued sep...
Directory of Open Access Journals (Sweden)
Zhujie Chu
2016-02-01
Municipal household solid waste (MHSW) has become a serious problem in China over the last two decades, with significant side effects on the environment. Effective management of MHSW has therefore attracted wide attention from both researchers and practitioners. Separate collection, the first and crucial step in solving the MHSW problem, has however not been thoroughly studied to date. In this study, an empirical survey was conducted among 387 households in Harbin, China. We use a Bayesian Belief Network model to determine the factors influencing separate collection. Four types of factors are identified (political, economic, sociocultural, and technological) based on the PEST (political, economic, social and technological) analytical method. In addition, we analyze the influential power of the different factors, based on the network structure and probability changes obtained with the Netica software. Results indicate that the technological dimension has the greatest impact on MHSW separate collection, followed by the political and economic dimensions; the sociocultural dimension has the least impact.
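The kind of influence comparison described here can be sketched with a toy two-factor Bayesian network evaluated by enumeration; the network structure and all CPT numbers below are invented for illustration and do not come from the Harbin survey:

```python
# Hypothetical CPTs: T = technological support, P = political support,
# C = household practices separate collection. Numbers are illustrative.
p_T = {1: 0.4, 0: 0.6}
p_P = {1: 0.5, 0: 0.5}
p_C = {(1, 1): 0.85, (1, 0): 0.60,        # P(C=1 | T, P)
       (0, 1): 0.35, (0, 0): 0.10}

def posterior_collect(evidence):
    """P(C=1 | evidence) by exhaustive enumeration over T and P."""
    num = den = 0.0
    for t in (0, 1):
        for p in (0, 1):
            # skip joint states inconsistent with the evidence
            if not all(evidence.get(k, v) == v for k, v in (("T", t), ("P", p))):
                continue
            joint = p_T[t] * p_P[p]
            num += joint * p_C[(t, p)]
            den += joint
    return num / den

print(posterior_collect({"T": 1}))   # influence of the technological factor
print(posterior_collect({"P": 1}))   # influence of the political factor
```

Comparing how the posterior shifts under different evidence is the same "probability change" probing that tools like Netica automate on larger networks.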
Energy Technology Data Exchange (ETDEWEB)
Zhang, Le; Timbie, Peter T. [Department of Physics, University of Wisconsin, Madison, WI 53706 (United States); Bunn, Emory F. [Physics Department, University of Richmond, Richmond, VA 23173 (United States); Karakci, Ata; Korotkov, Andrei; Tucker, Gregory S. [Department of Physics, Brown University, 182 Hope Street, Providence, RI 02912 (United States); Sutter, P. M. [Center for Cosmology and Astro-Particle Physics, Ohio State University, Columbus, OH 43210 (United States); Wandelt, Benjamin D., E-mail: lzhang263@wisc.edu [Department of Physics, University of Illinois at Urbana-Champaign, 1110 W Green Street, Urbana, IL 61801 (United States)
2016-01-15
In this paper, we present a new Bayesian semi-blind approach for foreground removal in observations of the 21 cm signal measured by interferometers. The technique, which we call H i Expectation–Maximization Independent Component Analysis (HIEMICA), is an extension of the Independent Component Analysis technique developed for two-dimensional (2D) cosmic microwave background maps to three-dimensional (3D) 21 cm cosmological signals measured by interferometers. This technique provides a fully Bayesian inference of power spectra and maps and separates the foregrounds from the signal based on the diversity of their power spectra. Relying only on the statistical independence of the components, this approach can jointly estimate the 3D power spectrum of the 21 cm signal, as well as the 2D angular power spectrum and the frequency dependence of each foreground component, without any prior assumptions about the foregrounds. This approach has been tested extensively by applying it to mock data from interferometric 21 cm intensity mapping observations under idealized assumptions of instrumental effects. We also discuss the impact when the noise properties are not known completely. As a first step toward solving the 21 cm power spectrum analysis problem, we compare the semi-blind HIEMICA technique to the commonly used Principal Component Analysis. Under the same idealized circumstances, the proposed technique provides significantly improved recovery of the power spectrum. This technique can be applied in a straightforward manner to all 21 cm interferometric observations, including epoch of reionization measurements, and can be extended to single-dish observations as well.
International Nuclear Information System (INIS)
We present, in this paper, a new unsupervised method for joint image super-resolution and separation between smooth and point sources. For this purpose, we propose a Bayesian approach with a Markovian model for the smooth part and a Student's t-distribution for the point sources. All model and noise parameters are considered unknown and are estimated jointly with the images. However, joint estimators (joint MAP or posterior mean) are intractable and an approximation is needed. Therefore, a new gradient-like variational Bayesian method is applied to approximate the true posterior by a free-form separable distribution. A parametric form is obtained by approximating the marginals, but with form parameters that are mutually dependent; their optimal values are reached by iterating until convergence. The method was tested on model-generated data and on a real dataset from the Herschel space observatory. (paper)
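The free-form separable (mean-field) approximation with mutually dependent factor parameters can be illustrated on the textbook problem of a Gaussian with unknown mean and precision, rather than the paper's image model; the data and the broad Gamma prior below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(2.0, 1.0, 200)
N, xbar = len(x), x.mean()

# Mean-field factorization q(mu, tau) = q(mu) q(tau); broad Gamma(a0, b0) prior
a0, b0 = 1e-3, 1e-3
E_tau = 1.0                                     # initial guess for E[tau]
for _ in range(50):                             # iterate coupled updates to convergence
    mu_N, lam_N = xbar, N * E_tau               # q(mu) = N(mu_N, 1/lam_N)
    a_N = a0 + N / 2
    b_N = b0 + 0.5 * (((x - mu_N) ** 2).sum() + N / lam_N)
    E_tau = a_N / b_N                           # q(tau) = Gamma(a_N, b_N)

print(mu_N, E_tau)
```

Each factor's parameters depend on expectations under the other factor, so, exactly as in the abstract, the optimal values are reached only by iterating the coupled updates until convergence.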
A Bayesian Approach for Localization of Acoustic Emission Source in Plate-Like Structures
Directory of Open Access Journals (Sweden)
Gang Yan
2015-01-01
This paper presents a Bayesian approach for localizing an acoustic emission (AE) source in plate-like structures with consideration of uncertainties from modeling error and measurement noise. A PZT sensor network is deployed to monitor and acquire AE wave signals released by possible damage. Using the continuous wavelet transform (CWT), the time-of-flight (TOF) information of the AE wave signals is extracted and measured. With a theoretical TOF model, a Bayesian parameter identification procedure is developed to obtain the AE source location and the wave velocity at a specific frequency simultaneously, while quantifying their uncertainties. Via Bayes' theorem, the posterior distributions of the AE source location and the wave velocity are obtained by combining their priors with the likelihood of the measured time-difference data. A Markov chain Monte Carlo (MCMC) algorithm is employed to draw samples approximating the posteriors. A data fusion scheme then fuses the results identified at multiple frequencies to increase the accuracy and reduce the uncertainty of the final localization results. Experimental studies on a stiffened aluminum panel with simulated AE events produced by pencil lead breaks (PLBs) validate the proposed Bayesian AE source localization approach.
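The core of such a TOF-based Bayesian identification can be sketched with a Metropolis-Hastings sampler over the source coordinates and the wave velocity; the sensor layout, noise level, priors, and proposal scales below are assumptions for illustration, not the paper's configuration:

```python
import numpy as np

rng = np.random.default_rng(3)
sensors = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])  # unit plate
true_xy, true_v = np.array([0.3, 0.6]), 5.0   # hypothetical source and wave speed
sigma = 1e-3                                   # assumed TOF-difference noise (s)

def tdoa(xy, v):
    t = np.linalg.norm(sensors - xy, axis=1) / v
    return t[1:] - t[0]                        # time differences w.r.t. sensor 0

data = tdoa(true_xy, true_v) + sigma * rng.standard_normal(3)

def log_post(theta):
    xy, v = theta[:2], theta[2]
    if not (0 <= xy[0] <= 1 and 0 <= xy[1] <= 1 and 1.0 <= v <= 10.0):
        return -np.inf                         # uniform prior over plate and speed
    r = data - tdoa(xy, v)
    return -0.5 * (r / sigma) @ (r / sigma)    # Gaussian likelihood

theta, lp = np.array([0.5, 0.5, 4.0]), log_post(np.array([0.5, 0.5, 4.0]))
samples = []
for i in range(20000):
    prop = theta + rng.standard_normal(3) * [0.02, 0.02, 0.1]
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:   # Metropolis accept/reject
        theta, lp = prop, lp_prop
    if i >= 10000:                              # discard burn-in
        samples.append(theta)
est = np.mean(samples, axis=0)
print(est)   # posterior mean of (x, y, v)
```

The spread of the retained samples around the posterior mean is what supplies the uncertainty quantification that a single triangulation estimate cannot provide.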
Blind Source Separation For Ion Mobility Spectra
International Nuclear Information System (INIS)
Miniaturization is a powerful trend for smart chemical instrumentation in a diversity of applications. It is known that miniaturization in IMS leads to a degradation of the system characteristics. In the present work, we are interested in signal processing solutions to mitigate limitations introduced by the limited drift tube length, which basically involve a loss of chemical selectivity. While blind source separation (BSS) techniques are popular in other domains, their application to smart chemical instrumentation has been limited. However, under some conditions, basically linearity, BSS may fully recover the concentration time evolution and the pure spectra with few underlying hypotheses. This is extremely helpful when unexpected chemical interferents appear or unwanted perturbations pollute the spectra. SIMPLISMA has been advocated by Harrington et al. in several papers; however, more modern BSS methods for bilinear decomposition under positivity constraints have appeared in the last decade. To explore and compare the performance of these methods, a series of experiments was performed.
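A bilinear decomposition under positivity constraints can be sketched with scikit-learn's NMF on synthetic spectra; the peak shapes, concentration profiles, and noise level are invented for illustration:

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(4)
channels = np.arange(100)

def peak(center, width):
    return np.exp(-0.5 * ((channels - center) / width) ** 2)

# Two hypothetical pure spectra and their concentration time profiles
spectra = np.vstack([peak(30, 4) + 0.5 * peak(55, 3), peak(70, 5)])
conc = np.vstack([np.linspace(0, 1, 40), np.linspace(1, 0, 40) ** 2]).T
X = conc @ spectra + 0.01 * rng.uniform(size=(40, 100))   # bilinear, positive

model = NMF(n_components=2, init="nndsvda", random_state=0, max_iter=1000)
C = model.fit_transform(X)        # estimated concentration profiles
S = model.components_             # estimated pure spectra

rel_err = np.linalg.norm(X - C @ S) / np.linalg.norm(X)
print(rel_err)
```

The positivity of both factors is what makes the recovered rows of `S` interpretable as spectra and the columns of `C` as concentration evolutions, the property the abstract highlights over unconstrained decompositions.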
Gualandi, Adriano; Serpelloni, Enrico; Elina Belardinelli, Maria; Bonafede, Maurizio; Pezzo, Giuseppe; Tolomei, Cristiano
2015-04-01
A critical point in the analysis of ground displacement time series, such as those measured by modern space geodetic techniques (primarily continuous GPS/GNSS and InSAR), is the development of data-driven methods that allow us to discern and characterize the different sources that generate the observed displacements. A widely used multivariate statistical technique is Principal Component Analysis (PCA), which reduces the dimensionality of the data space while retaining most of the explained variance of the dataset. It reproduces the original data using a limited number of Principal Components, but it also has deficiencies, since PCA does not perform well in solving the so-called Blind Source Separation (BSS) problem. Recovering and separating the different sources that generate the observed ground deformation is a fundamental task in order to give a physical meaning to each possible source. PCA fails in the BSS problem because it looks for a new Euclidean space where the projected data are uncorrelated. The uncorrelatedness condition is usually not strong enough, and it has been shown that the BSS problem can be tackled by requiring the components to be independent. Independent Component Analysis (ICA) is, in fact, another popular technique adopted to approach this problem, and it can be used wherever PCA is also applied. An ICA approach enables us to explain the displacement time series with fewer constraints on the model, and to reveal anomalies in the data such as transient deformation signals. However, the independence condition is not easy to impose, and it is often necessary to introduce some approximations. To work around this problem, we use a variational Bayesian ICA (vbICA) method, which models the probability density function (pdf) of each source signal using a mixture of Gaussian distributions. This technique allows for more flexibility in the description of the pdf of the sources
Gradient Flow Convolutive Blind Source Separation
DEFF Research Database (Denmark)
Pedersen, Michael Syskind; Nielsen, Chinton Møller
2004-01-01
Experiments have shown that the performance of instantaneous gradient flow beamforming by Cauwenberghs et al. is reduced significantly in reverberant conditions. By expanding the gradient flow principle to convolutive mixtures, separation in a reverberant environment is possible. By use of a circ...
Contaminant source reconstruction by empirical Bayes and Akaike's Bayesian Information Criterion.
Zanini, Andrea; Woodbury, Allan D
2016-01-01
The objective of this paper is to present an empirical Bayesian method, combined with Akaike's Bayesian Information Criterion (ABIC), to estimate the contaminant release history of a groundwater source from a few concentration measurements in space and/or in time. From the Bayesian point of view, the ABIC considers prior information on the unknown function, such as the prior distribution (assumed Gaussian) and the covariance function. The unknown statistical quantities, such as the noise variance and the covariance function parameters, are computed in the process; moreover, the method also quantifies the estimation error through confidence intervals. The methodology was successfully tested on three cases: the classic Skaggs and Kabala release function, three sharp releases (both concerning transport in a one-dimensional homogeneous medium), and data collected from a laboratory device consisting of a two-dimensional homogeneous unconfined aquifer. The performance of the method was tested with two different covariance functions (Gaussian and exponential) and with large measurement errors. The results are discussed and compared with the geostatistical approach of Kitanidis (1995).
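The empirical Bayes idea of choosing the prior (regularization) weight by maximizing the marginal likelihood of the data can be sketched for a linear release-history problem; the smoothing-kernel forward model, noise level, and Gaussian prior below are illustrative, and this omits ABIC's full treatment:

```python
import numpy as np

rng = np.random.default_rng(5)
nt = 60
t = np.arange(nt)
s_true = np.exp(-0.5 * ((t - 20) / 5.0) ** 2)          # smooth release history
# Forward operator: an advection-dispersion-like smoothing kernel (illustrative)
G = np.array([[np.exp(-0.5 * ((i - j - 8) / 4.0) ** 2) for j in range(nt)]
              for i in range(nt)])
sigma = 0.05
d = G @ s_true + sigma * rng.standard_normal(nt)

def log_evidence(lam):
    # marginal likelihood: d ~ N(0, sigma^2 I + lam^-1 G G^T)
    # under the Gaussian prior s ~ N(0, lam^-1 I)
    C = sigma ** 2 * np.eye(nt) + G @ G.T / lam
    sign, logdet = np.linalg.slogdet(C)
    return -0.5 * (logdet + d @ np.linalg.solve(C, d))

lams = np.logspace(-3, 3, 61)
lam_best = lams[np.argmax([log_evidence(lam) for lam in lams])]
# Posterior mean estimate with the evidence-optimal regularization
s_hat = np.linalg.solve(G.T @ G + sigma ** 2 * lam_best * np.eye(nt), G.T @ d)
```

Letting the data pick the prior weight through the evidence, rather than hand-tuning it, is the step that turns ordinary Tikhonov inversion into an empirical Bayesian estimate.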
Miller, Erin A.; Robinson, Sean M.; Anderson, Kevin K.; McCall, Jonathon D.; Prinke, Amanda M.; Webster, Jennifer B.; Seifert, Carolyn E.
2015-06-01
We present a novel technique for the localization of radiological sources in urban or rural environments from an aerial platform. The technique is based on a Bayesian approach to localization, in which measured count rates in a time series are compared with predicted count rates from a series of pre-calculated test sources to define a likelihood. This technique is extended by a localized treatment with a limited field of view (FOV), coupled with a likelihood-ratio reevaluation, allowing real-time computation on commodity hardware for arbitrarily complex detector models and terrain. In particular, detectors with inherent asymmetry of response (such as those employing internal collimation or self-shielding for enhanced directional awareness) are leveraged by this approach to provide improved localization. Results from the localization technique are shown for simulated flight data using monolithic as well as directionally-aware detector models, and the capability of the methodology to locate radioisotopes is estimated for several test cases. This localization technique is shown to facilitate urban search by allowing quick and adaptive estimates of source location, in many cases from a single flyover near a source. It represents a significant advance over earlier methods such as full-field Bayesian likelihood, which is generally not fast enough for broad-field search in real time, and highest-net-counts estimation, whose localization error depends strongly on the flight path and which generally cannot operate without exhaustive search.
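The grid-likelihood core of such an approach can be sketched for an idealized isotropic detector with known source strength; the flight geometry, count rates, and simple inverse-square-plus-background response below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(6)
# Flight path: measurement points along a line; a hypothetical point source
path = np.c_[np.linspace(-50, 50, 41), np.full(41, 10.0)]   # x, y in metres
src, strength, bkg = np.array([5.0, -20.0]), 2.0e4, 5.0

def expected(pos):
    d2 = ((path - pos) ** 2).sum(axis=1)
    return bkg + strength / d2          # inverse-square plus flat background

counts = rng.poisson(expected(src))     # simulated flyover count record

# Poisson log-likelihood over a grid of pre-computed test source positions
xs, ys = np.linspace(-40, 40, 81), np.linspace(-40, 0, 41)
ll = np.array([[np.sum(counts * np.log(expected(np.array([x, y])))
                       - expected(np.array([x, y]))) for x in xs] for y in ys])
iy, ix = np.unravel_index(np.argmax(ll), ll.shape)
print(xs[ix], ys[iy])                   # maximum-likelihood source position
```

Normalizing `exp(ll)` over the grid gives the Bayesian posterior under a flat prior; the limited-FOV and likelihood-ratio refinements of the abstract prune this grid so the evaluation stays real-time.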
A Bayesian approach to quantify the contribution of animal-food sources to human salmonellosis
DEFF Research Database (Denmark)
Hald, Tine; Vose, D.; Wegener, Henrik Caspar;
2004-01-01
Based on data from the integrated Danish Salmonella surveillance in 1999, we developed a mathematical model for quantifying the contribution of each of the major animal-food sources to human salmonellosis. The model was set up to calculate the number of domestic and sporadic cases caused by different Salmonella sero- and phage-types as a function of the prevalence of these Salmonella types in the animal-food sources and the amount of each food source consumed. A multiparameter prior accounting for the presumed but unknown differences between serotypes and food sources with respect to causing human salmonellosis was also included. The joint posterior distribution was estimated by fitting the model to the reported number of domestic and sporadic cases per Salmonella type in a Bayesian framework using Markov chain Monte Carlo simulation. The number of domestic and sporadic cases was obtained by subtracting...
DEFF Research Database (Denmark)
Oh, Geok Lian
... the probability density function permits the incorporation of a priori information about the parameters, and also allows for the incorporation of theoretical errors. This opens up the possibility of applying the inverse paradigm to real-world geophysical inversion problems. In this PhD study, the Bayesian ... source. The examples show that, with field data, inversion for localization is most advantageous when the forward model completely describes all the elastic wave components, as is the case for the FDTD 3D elastic model. The simulation results of the inversion for soil density values show that both...
Magnetic source separation in Earth's outer core.
Hoffman, Kenneth A; Singer, Brad S
2008-09-26
We present evidence that the source of Earth's axial dipole field is largely independent from the sources responsible for the rest of the geomagnetic field, the so-called nonaxial dipole (NAD) field. Support for this claim comes from correlations between the structure of the historic field and the behavior of the paleomagnetic field recorded in precisely dated lavas at those times when the axial dipole was especially weak or nearly absent. It is argued that a "stratification" of magnetic sources exists in the fluid core such that the axial dipole is the only observed field component that is nearly immune from the influence exerted by the lowermost mantle. It follows that subsequent work on spherical harmonic-based field descriptions may now incorporate an understanding of a dichotomy of spatial-temporal dynamo processes. PMID:18818352
Application of evidence theory in information fusion of multiple sources in bayesian analysis
Institute of Scientific and Technical Information of China (English)
周忠宝; 蒋平; 武小悦
2004-01-01
How to obtain a proper prior distribution is one of the most critical problems in Bayesian analysis. In many practical cases, the prior information comes from different sources, and the form of the prior distribution may be easy to establish while its parameters are hard to determine. In this paper, based on evidence theory, a new method is presented to fuse the information from multiple sources and determine the parameters of the prior distribution when its form is known. The prior distributions derived from the individual sources are converted into corresponding mass functions, which are combined by the Dempster-Shafer (D-S) method to obtain the combined mass function and the representative points of the prior distribution. These points are then fitted to the given distribution form to determine its parameters, yielding the fused prior distribution on which Bayesian analysis can be performed. How to convert the prior distributions into mass functions properly, and how to obtain the representative points of the fused prior distribution, are the central questions addressed in this paper. A simulation example shows that the proposed method is effective.
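Dempster's combination rule itself is compact. The sketch below combines two hypothetical mass functions over prior-distribution families; the focal elements and the numbers are invented for illustration:

```python
from itertools import product

def dempster(m1, m2):
    """Combine two mass functions whose focal elements are frozensets."""
    combined, conflict = {}, 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + x * y
        else:
            conflict += x * y            # mass assigned to the empty set
    k = 1.0 - conflict                   # renormalize away the conflict
    return {s: v / k for s, v in combined.items()}

# Two sources' opinions about which prior family fits (hypothetical numbers)
A, B = frozenset({"normal"}), frozenset({"lognormal"})
theta = A | B                            # the frame of discernment
m1 = {A: 0.6, B: 0.1, theta: 0.3}
m2 = {A: 0.5, B: 0.2, theta: 0.3}
fused = dempster(m1, m2)
print(fused)
```

Mass left on the full frame `theta` encodes each source's ignorance, which is exactly what lets D-S combination weight a confident source more heavily than a vague one.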
Application of hierarchical Bayesian unmixing models in river sediment source apportionment
Blake, Will; Smith, Hugh; Navas, Ana; Bodé, Samuel; Goddard, Rupert; Zou Kuzyk, Zou; Lennard, Amy; Lobb, David; Owens, Phil; Palazon, Leticia; Petticrew, Ellen; Gaspar, Leticia; Stock, Brian; Boeckx, Pascal; Semmens, Brice
2016-04-01
Fingerprinting and unmixing concepts are used widely across environmental disciplines for forensic evaluation of pollutant sources. In aquatic and marine systems, this includes tracking the source of organic and inorganic pollutants in water and linking problem sediment to soil erosion and land use sources. It is, however, the particular complexity of ecological systems that has driven the creation of the most sophisticated mixing models, primarily to (i) evaluate diet composition in complex ecological food webs, (ii) inform population structure and (iii) explore animal movement. In the context of the new hierarchical Bayesian unmixing model MixSIAR, developed to characterise intra-population niche variation in ecological systems, we evaluate the linkage between ecological 'prey' and 'consumer' concepts and river basin sediment 'source' and sediment 'mixture' concepts to exemplify the value of ecological modelling tools to river basin science. Recent studies have outlined the advantages of Bayesian unmixing approaches in handling complex source and mixture datasets while dealing appropriately with uncertainty in parameter probability distributions. MixSIAR is unique in that it allows individual fixed and random effects associated with the mixture hierarchy, i.e. factors that might influence the model outcome for mixture groups, to be explored within the source-receptor framework. This offers new and powerful ways of interpreting river basin apportionment data. In this contribution, key components of the model are evaluated in the context of common experimental designs for sediment fingerprinting studies, namely simple, nested and distributed catchment sampling programmes. Illustrative examples using geochemical and compound-specific stable isotope datasets are presented and used to discuss best practice, with specific attention to (1) the tracer selection process, and (2) incorporation of fixed effects relating to sample timeframe and sediment type in the modelling...
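A minimal Bayesian unmixing step, a Gaussian likelihood over the proportion simplex with a flat prior, can be sketched as follows; the tracer values, noise level, and source labels are illustrative, and this omits MixSIAR's hierarchy and random effects entirely:

```python
import numpy as np

# Hypothetical tracer means (rows: sources, cols: tracers) and a mixture sample
sources = np.array([[10.0, 2.0, 0.5],    # e.g. topsoil
                    [4.0, 8.0, 1.5],     # e.g. channel bank
                    [1.0, 3.0, 6.0]])    # e.g. subsoil
true_p = np.array([0.5, 0.3, 0.2])
sigma = 0.2
rng = np.random.default_rng(7)
mixture = true_p @ sources + sigma * rng.standard_normal(3)

# Posterior over the proportion simplex (flat prior, Gaussian likelihood)
step = 0.01
best, best_lp = None, -np.inf
for p1 in np.arange(0, 1 + step, step):
    for p2 in np.arange(0, 1 - p1 + step, step):
        p = np.array([p1, p2, 1 - p1 - p2])
        if p[2] < 0:
            continue                      # stay on the simplex
        r = mixture - p @ sources
        lp = -0.5 * (r / sigma) @ (r / sigma)
        if lp > best_lp:
            best, best_lp = p, lp
print(best)   # maximum a posteriori source proportions
```

Storing the full `lp` surface rather than its maximum gives the posterior over apportionments; MCMC replaces the grid once hierarchical effects make the parameter space too large to enumerate.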
Mustac, M.; Kim, S.; Tkalcic, H.; Rhie, J.; Chen, Y.; Ford, S. R.; Sebastian, N.
2015-12-01
Conventional approaches to inverse problems suffer from non-linearity and non-uniqueness in estimations of seismic structures and source properties. Estimated results and associated uncertainties are often biased by applied regularizations and additional constraints, which are commonly introduced to solve such problems. Bayesian methods, however, provide statistically meaningful estimations of models and their uncertainties constrained by data information. In addition, hierarchical and trans-dimensional (trans-D) techniques are inherently implemented in the Bayesian framework to account for involved error statistics and model parameterizations, and, in turn, allow more rigorous estimations of the same. Here, we apply Bayesian methods throughout the entire inference process to estimate seismic structures and source properties in Northeast Asia including east China, the Korean peninsula, and the Japanese islands. Ambient noise analysis is first performed to obtain a base three-dimensional (3-D) heterogeneity model using continuous broadband waveforms from more than 300 stations. As for the tomography of surface wave group and phase velocities in the 5-70 s band, we adopt a hierarchical and trans-D Bayesian inversion method using Voronoi partition. The 3-D heterogeneity model is further improved by joint inversions of teleseismic receiver functions and dispersion data using a newly developed high-efficiency Bayesian technique. The obtained model is subsequently used to prepare 3-D structural Green's functions for the source characterization. A hierarchical Bayesian method for point source inversion using regional complete waveform data is applied to selected events from the region. The seismic structure and source characteristics with rigorously estimated uncertainties from the novel Bayesian methods provide enhanced monitoring and discrimination of seismic events in northeast Asia.
Burky, A.; Mustac, M.; Tkalcic, H.; Dreger, D. S.
2015-12-01
The Geysers geothermal region in northern California is a valuable resource for the production of geothermal electric power. Injection of water into the reservoir is necessary to maintain pressure and causes an increase in the number of earthquakes per day, but their source mechanisms are not well understood (Johnson, 2015). Previous studies of source mechanisms for events in the Geysers have identified a large number of events with significant isotropic and compensated linear vector dipole components. These source complexities most likely arise from the presence of pressurized liquids and gases, as well as temperature changes, at depth. The existence of non-double-couple components in volcanic and geothermal environments has been extensively documented in previous studies, but it has also been shown that spurious components can occur due to a range of factors such as inadequate knowledge of Earth structure and earthquake location, or noisy waveform data. It is therefore not entirely surprising that non-double-couple components from different source studies, each following a different experimental method and using different data types, do not agree well (e.g. Guilhem et al., 2014). Assessing the robustness of the solutions is critical for the physical interpretation of source mechanisms. Here, we apply a hierarchical Bayesian approach (Mustac and Tkalcic, 2015) to waveform data from M>4.5 events in the Geysers to produce moment tensor "solutions" and simultaneously estimate their robustness. By using a Bayesian inversion, we quantify the uncertainties from an ensemble of probable solutions instead of a single optimized solution, and sample solutions over a range of centroid locations. Moreover, the hierarchical approach allows the noise in the data to be sampled as a free parameter in the inversion. A rigorous treatment of the correlated-noise covariance matrix of the data prevents "over-interpretation" of noise, thus avoiding erroneous solutions. We...
Using Bayesian Belief Network (BBN) modelling for Rapid Source Term Prediction. RASTEP Phase 1
Energy Technology Data Exchange (ETDEWEB)
Knochenhauer, M.; Swaling, V.H.; Alfheim, P. [Scandpower AB, Sundbyberg (Sweden)
2012-09-15
The project is connected to the development of RASTEP, a computerized source term prediction tool aimed at providing a basis for improving off-site emergency management. RASTEP uses Bayesian belief networks (BBN) to model severe accident progression in a nuclear power plant in combination with pre-calculated source terms (i.e., amount, timing, and pathway of released radio-nuclides). The output is a set of possible source terms with associated probabilities. In the NKS project, a number of complex issues associated with the integration of probabilistic and deterministic analyses are addressed. This includes issues related to the method for estimating source terms, signal validation, and sensitivity analysis. One major task within Phase 1 of the project addressed the problem of how to make the source term module flexible enough to give reliable and valid output throughout the accident scenario. Of the alternatives evaluated, it is recommended that RASTEP is connected to a fast running source term prediction code, e.g., MARS, with a possibility of updating source terms based on real-time observations. (Author)
Separating More Sources Than Sensors Using Time-Frequency Distributions
Directory of Open Access Journals (Sweden)
Belouchrani Adel
2005-01-01
We examine the problem of blind separation of nonstationary sources in the underdetermined case, where there are more sources than sensors. Since time-frequency (TF) signal processing provides effective tools for dealing with nonstationary signals, we propose a new separation method based on time-frequency distributions (TFDs). The underlying assumption is that the original sources are disjoint in the TF domain. The method recovers the sources through four main procedures. First, the spatial time-frequency distribution (STFD) matrices are computed from the observed mixtures. Next, the auto-source TF points are separated from cross-source TF points owing to the special structure of these mixture STFD matrices. Then, the vectors corresponding to the selected auto-source points are clustered into classes according to their spatial directions, which differ between sources; each class, now containing the auto-source points of only one source, gives an estimate of that source's TFD. Finally, the source waveforms are recovered from their TFD estimates using TF synthesis. Simulated experiments indicate the success of the proposed algorithm in different scenarios. We also contribute two modified versions of the algorithm that better handle auto-source point selection.
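The TF-disjointness idea can be sketched in the simplest setting where the sources occupy disjoint frequency bins, so a plain FFT plays the role of the TFD. Here the mixing directions used for clustering are taken as known for illustration, whereas the method above estimates them from the STFD structure:

```python
import numpy as np

T = 4096
t = np.arange(T)
# Three sources on disjoint frequency bins (the TF-disjointness assumption)
S = np.vstack([np.sin(2 * np.pi * 40 * t / T),
               np.sin(2 * np.pi * 200 * t / T),
               np.sin(2 * np.pi * 500 * t / T)])
A = np.array([[1.0, 0.6, 0.2],
              [0.3, 0.9, 1.1]])          # 2 sensors, 3 sources: underdetermined
X = A @ S

Xf = np.fft.rfft(X, axis=1)
power = np.abs(Xf).sum(axis=0)
active = power > 0.01 * power.max()      # keep only energetic ("auto-source") bins

# Cluster active bins by the direction of the 2-D mixture vector
dirs = np.real(Xf[1, active] / Xf[0, active])
centers = A[1] / A[0]                    # known here; estimated in practice
labels = np.argmin(np.abs(dirs[:, None] - centers[None, :]), axis=1)

# Mask each cluster's bins and synthesize that source from sensor 0
Y = np.zeros((3, T))
for k in range(3):
    mask = np.zeros(Xf.shape[1], dtype=bool)
    mask[np.where(active)[0][labels == k]] = True
    Y[k] = np.fft.irfft(Xf[0] * mask, n=T)
```

Three sources are recovered from two mixtures because each active bin is attributed to exactly one source, which is precisely why the disjointness assumption makes the underdetermined case tractable.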
Bayesian statistics applied to the location of the source of explosions at Stromboli Volcano, Italy
Saccorotti, G.; Chouet, B.; Martini, M.; Scarpa, R.
1998-01-01
We present a method for determining the location and spatial extent of the source of explosions at Stromboli Volcano, Italy, based on a Bayesian inversion of the slowness vector derived from frequency-slowness analyses of array data. The method searches for source locations that minimize the error between the expected and observed slowness vectors. For a given set of model parameters, the conditional probability density function of slowness vectors is approximated by a Gaussian distribution of expected errors. The method is tested with synthetics using a five-layer velocity model derived for the north flank of Stromboli and a smoothed velocity model derived from a power-law approximation of the layered structure. Application to data from Stromboli allows for a detailed examination of uncertainties in source location due to experimental errors and incomplete knowledge of the Earth model. Although the solutions are not constrained in the radial direction, excellent resolution is achieved in both transverse and depth directions. Under the assumption that the horizontal extent of the source does not exceed the crater dimension, the 90% confidence region in the estimate of the explosive source location corresponds to a small volume extending from a depth of about 100 m to a maximum depth of about 300 m beneath the active vents, with a maximum likelihood source region located in the 120- to 180-m-depth interval.
Novel blind source separation algorithm using Gaussian mixture density function
Institute of Scientific and Technical Information of China (English)
孔薇; 杨杰; 周越
2004-01-01
Blind source separation (BSS) is an important task for numerous applications in signal processing, communications and array processing. For many complex sources, however, blind separation algorithms are not efficient because the probability distribution of the sources cannot be estimated accurately. In this paper, to justify the ME (maximum entropy) approach, the relation between ME and MMI (minimum mutual information) is elucidated first. Then a novel algorithm that uses a Gaussian mixture density to approximate the probability distribution of the sources is presented, based on the ME approach. An experiment on the BSS of ship-radiated noise demonstrates that the proposed algorithm is valid and efficient.
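A minimal sketch of the core ingredient, assuming a generic one-dimensional Gaussian-mixture source model (the mixture parameters and the natural-gradient update below are standard textbook forms, not fitted to ship-radiated noise):

```python
import numpy as np

# Score function phi(y) = -d/dy log p(y) for a Gaussian mixture density;
# this is the nonlinearity that a ME/infomax-style BSS update needs.
def gmm_score(y, weights, means, stds):
    y = np.atleast_1d(y)[:, None]
    comp = weights * np.exp(-0.5 * ((y - means) / stds) ** 2) \
        / (stds * np.sqrt(2.0 * np.pi))
    p = comp.sum(axis=1)
    dp = (comp * (means - y) / stds ** 2).sum(axis=1)   # dp/dy
    return -dp / p

# One natural-gradient ME/infomax batch step with the mixture score:
# W <- W + mu * (I - phi(Y) Y^T / n) W, where Y = W X.
def infomax_step(W, X, mu, weights, means, stds):
    Y = W @ X
    Phi = np.vstack([gmm_score(y, weights, means, stds) for y in Y])
    n = X.shape[1]
    return W + mu * (np.eye(W.shape[0]) - Phi @ Y.T / n) @ W
```

For a single zero-mean Gaussian component of standard deviation s, the score reduces to y/s², which is a quick sanity check on the implementation.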
Single channel blind source separation based on ICA feature extraction
Institute of Scientific and Technical Information of China (English)
[no author listed]
2007-01-01
A new technique is proposed to solve the blind source separation (BSS) problem given only a single-channel observation. The basis functions and the density of the coefficients of the source signals learned by ICA are used as prior knowledge. Based on this learned prior information, learning rules for single-channel BSS are derived by maximizing the joint log likelihood of the mixed sources to obtain the source signals from the single observation, in which the posterior density of the given measurements is maximized. The experimental results exhibit successful separation performance for mixtures of speech and music signals.
Precise Measurement of Separation Between Two Spherical Source Masses
Institute of Scientific and Technical Information of China (English)
陈德才; 罗俊; 胡忠坤; 赵亮
2004-01-01
A driving gauge method is performed to determine the separation between two spherical source masses in the measurement of the Newtonian gravitational constant G. The experimental result shows that the uncertainty in determining the separation is about 0.35 μm, which would contribute an uncertainty of 7.3 ppm to the value of G.
Blind source separation based on generalized gaussian model
Institute of Scientific and Technical Information of China (English)
YANG Bin; KONG Wei; ZHOU Yue
2007-01-01
Since in most blind source separation (BSS) algorithms the estimates of the probability density function (pdf) of the sources are fixed, or can only switch between one super-Gaussian and one sub-Gaussian model, they may not separate sources with different distributions efficiently. To solve the problem of pdf mismatch and the separation of hybrid mixtures in BSS, the generalized Gaussian model (GGM) is introduced to model the pdfs of the sources, since it provides a general structure for univariate distributions. Its great advantage is that only one parameter needs to be determined when modeling the pdf of a source, so it is less complex than a Gaussian mixture model. By using a maximum likelihood (ML) approach, the convergence of the proposed algorithm is improved. Computer simulations show that it is more efficient and valid than conventional methods with fixed pdf estimation.
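The one-parameter flexibility of the GGM can be illustrated directly. The density and kurtosis expressions below are the standard generalized Gaussian formulas; the parameter values are illustrative:

```python
import math

# Generalized Gaussian density p(x) = b / (2 a Gamma(1/b)) * exp(-(|x|/a)^b).
# A single shape parameter b spans super-Gaussian (b < 2), Gaussian (b = 2),
# and sub-Gaussian (b > 2) source models.
def ggd_pdf(x, alpha, beta):
    c = beta / (2.0 * alpha * math.gamma(1.0 / beta))
    return c * math.exp(-(abs(x) / alpha) ** beta)

def ggd_kurtosis(beta):
    """Excess kurtosis of the GGD; independent of the scale alpha."""
    g = math.gamma
    return g(5.0 / beta) * g(1.0 / beta) / g(3.0 / beta) ** 2 - 3.0
```

For beta = 1 (Laplacian) the excess kurtosis is 3, for beta = 2 (Gaussian) it is 0, and for beta > 2 it turns negative, which is exactly the super/sub-Gaussian switch the abstract refers to.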
Bayesian CMB foreground separation with a correlated log-normal model
Oppermann, Niels
2014-01-01
The extraction of foreground and CMB maps from multi-frequency observations relies mostly on the different frequency behavior of the different components. Existing Bayesian methods additionally make use of a Gaussian prior for the CMB whose correlation structure is described by an unknown angular power spectrum. We argue for the natural extension of this by using non-trivial priors also for the foreground components. Focusing on diffuse Galactic foregrounds, we propose a log-normal model including unknown spatial correlations within each component and cross-correlations between the different foreground components. We present case studies at low resolution that demonstrate the superior performance of this model when compared to an analysis with flat priors for all components.
Real-time realizations of the Bayesian Infrasonic Source Localization Method
Pinsky, V.; Arrowsmith, S.; Hofstetter, A.; Nippress, A.
2015-12-01
The Bayesian Infrasonic Source Localization method (BISL), introduced by Modrak et al. (2010) and upgraded by Marcillo et al. (2014), is designed for accurate estimation of the origin of atmospheric events at local, regional and global scales by seismic and infrasonic networks and arrays. BISL is based on probabilistic models of the source-station infrasonic signal propagation time, picking time and azimuth estimate, merged with prior knowledge about the celerity distribution. It requires, at each hypothetical source location, integration of the product of the corresponding source-station likelihood functions multiplied by a prior probability density function of celerity over the multivariate parameter space. The present BISL realization is a generally time-consuming procedure based on numerical integration. The computational scheme proposed here simplifies the target function so that the integrals are taken exactly and are represented via standard functions. This makes the procedure much faster and realizable in real time without practical loss of accuracy. The procedure, executed as PYTHON-FORTRAN code, demonstrates high performance on a set of model and real data.
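A toy version of the pre-simplification (numerical-integration) scheme might look like the following; the station geometry, celerity prior, and timing error are invented for illustration and are far simpler than the BISL models:

```python
import numpy as np

# Each station's arrival-time likelihood is marginalized over an uncertain
# celerity prior, and the per-station likelihoods are multiplied on a grid
# of hypothetical source locations.
cel = np.linspace(0.25, 0.35, 101)                  # celerity grid, km/s
dc = cel[1] - cel[0]
prior_c = np.exp(-0.5 * ((cel - 0.30) / 0.02) ** 2)
prior_c /= prior_c.sum() * dc                       # normalize the prior

stations = np.array([[0.0, 0.0], [40.0, 0.0], [0.0, 40.0]])   # km
true_src = np.array([25.0, 15.0])
true_c, sigma_t = 0.30, 0.5                         # km/s, s
t_obs = np.linalg.norm(stations - true_src, axis=1) / true_c

def log_likelihood(src):
    """Sum over stations of log of integral N(t_i; d_i/c, sigma_t^2) p(c) dc."""
    ll = 0.0
    for ti, di in zip(t_obs, np.linalg.norm(stations - src, axis=1)):
        like = np.exp(-0.5 * ((ti - di / cel) / sigma_t) ** 2)
        ll += np.log((like * prior_c).sum() * dc + 1e-300)
    return ll

xs = np.linspace(0.0, 40.0, 41)
best = max(((x, y) for x in xs for y in xs),
           key=lambda p: log_likelihood(np.array(p)))
```

The inner quadrature over celerity at every grid node is exactly the cost that the paper's closed-form simplification removes.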
Source separation of household waste: A case study in China
International Nuclear Information System (INIS)
A pilot program concerning source separation of household waste was launched in Hangzhou, capital city of Zhejiang province, China. Detailed investigations on the composition and properties of household waste in the experimental communities revealed that high water content and high percentage of food waste are the main limiting factors in the recovery of recyclables, especially paper from household waste, and the main contributors to the high cost and low efficiency of waste disposal. On the basis of the investigation, a novel source separation method, according to which household waste was classified as food waste, dry waste and harmful waste, was proposed and performed in four selected communities. In addition, a corresponding household waste management system that involves all stakeholders, a recovery system and a mechanical dehydration system for food waste were constituted to promote source separation activity. Performances and the questionnaire survey results showed that the active support and investment of a real estate company and a community residential committee play important roles in enhancing public participation and awareness of the importance of waste source separation. In comparison with the conventional mixed collection and transportation system of household waste, the established source separation and management system is cost-effective. It could be extended to the entire city and used by other cities in China as a source of reference
Separation of synchronous sources through phase locked matrix factorization.
Almeida, Miguel S B; Vigário, Ricardo; Bioucas-Dias, José
2014-10-01
In this paper, we study the separation of synchronous sources (SSS) problem, which deals with the separation of sources whose phases are synchronous. This problem cannot be addressed through independent component analysis methods because synchronous sources are statistically dependent. We present a two-step algorithm, called phase locked matrix factorization (PLMF), to perform SSS. We also show that SSS is identifiable under some assumptions and that any global minimum of PLMF's cost function is a desirable solution for SSS. We extensively study the algorithm on simulated data and conclude that it can perform SSS with various numbers of sources and sensors and with various phase lags between the sources, both in the ideal (i.e., perfectly synchronous and non-noisy) case and with various levels of additive noise in the observed signals and of phase jitter in the sources. PMID:25291741
Using Bayesian Belief Network (BBN) modelling for rapid source term prediction. Final report
Energy Technology Data Exchange (ETDEWEB)
Knochenhauer, M.; Swaling, V.H.; Dedda, F.D.; Hansson, F.; Sjoekvist, S.; Sunnegaerd, K. [Lloyd' s Register Consulting AB, Sundbyberg (Sweden)
2013-10-15
The project presented in this report deals with a number of complex issues related to the development of a tool for rapid source term prediction (RASTEP), based on a plant model represented as a Bayesian belief network (BBN) and a source term module which is used for assigning relevant source terms to BBN end states. Thus, RASTEP uses a BBN to model severe accident progression in a nuclear power plant in combination with pre-calculated source terms (i.e., amount, composition, timing, and release path of released radio-nuclides). The output is a set of possible source terms with associated probabilities. One major issue addressed has been the integration of probabilistic and deterministic analyses, dealing with the challenge of making the source term determination flexible enough to give reliable and valid output throughout the accident scenario. The potential for connecting RASTEP to a fast running source term prediction code has been explored, as well as alternative ways of improving the deterministic connections of the tool. As part of the investigation, a comparison of two deterministic severe accident analysis codes has been performed. A second important task has been to develop a general method where experts' beliefs can be included in a systematic way when defining the conditional probability tables (CPTs) in the BBN. Using this iterative method results in a reliable BBN even though expert judgements, with their associated uncertainties, have been used. It also simplifies verification and validation of the considerable amounts of quantitative data included in a BBN. (Author)
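The basic BBN mechanics, CPTs plus summing out hidden states, can be shown with a deliberately tiny network. The states and probabilities below are invented and bear no relation to the actual RASTEP model:

```python
# Minimal discrete Bayesian-network sketch: a hidden damage state D, an
# observable symptom S, and a pre-assigned source-term class R. Inference
# over R given S is done by enumeration (summing out D).
P_D = {"intact": 0.9, "damaged": 0.1}
P_S_given_D = {("high", "intact"): 0.05, ("high", "damaged"): 0.8,
               ("low", "intact"): 0.95, ("low", "damaged"): 0.2}
P_R_given_D = {("none", "intact"): 0.99, ("large", "intact"): 0.01,
               ("none", "damaged"): 0.3, ("large", "damaged"): 0.7}

def posterior_release(symptom):
    """P(R | S = symptom), computed by summing out the damage state D."""
    joint = {}
    for r in ("none", "large"):
        joint[r] = sum(P_D[d] * P_S_given_D[(symptom, d)] * P_R_given_D[(r, d)]
                       for d in ("intact", "damaged"))
    z = sum(joint.values())
    return {r: v / z for r, v in joint.items()}
```

Observing the "high" symptom shifts probability mass toward the large-release source term, which is the qualitative behavior a rapid source-term predictor needs; RASTEP's network is of course far larger, with CPTs elicited from experts.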
Bayesian Estimation of Fugitive Methane Point Source Emission Rates from a Single Downwind High-Frequency Gas Sensor
With the tremendous advances in onshore oil and gas exploration and production (E&P) capability comes the realization that new tools are needed to support env...
Abban, B.; (Thanos) Papanicolaou, A. N.; Cowles, M. K.; Wilson, C. G.; Abaci, O.; Wacha, K.; Schilling, K.; Schnoebelen, D.
2016-06-01
An enhanced revision of the Fox and Papanicolaou (hereafter referred to as "F-P") (2008a) Bayesian, Markov Chain Monte Carlo fingerprinting framework for estimating sediment source contributions and their associated uncertainties is presented. The F-P framework included two key deterministic parameters, α and β, that, respectively, reflected the spatial origin attributes of sources and the time history of eroded material delivered to and collected at the watershed outlet. However, the deterministic treatment of α and β is limited to cases with well-defined spatial partitioning of sources, high sediment delivery, and relatively short travel times with little variability in transport within the watershed. For event-based studies in intensively managed landscapes, this may be inadequate since landscape heterogeneity results in variabilities in source contributions, their pathways, delivery times, and storage within the watershed. Thus, probabilistic treatments of α and β are implemented in the enhanced framework to account for these variabilities. To evaluate the effects of the treatments of α and β on source partitioning, both frameworks are applied to the South Amana Subwatershed (SASW) in the U.S. midwest. The enhanced framework is found to estimate mean source contributions that are in good agreement with estimates from other studies in SASW. The enhanced framework is also able to produce expected trends in uncertainty during the study period, unlike the F-P framework, which does not perform as expected. Overall, the enhanced framework is found to be less sensitive to changes in α and β than the F-P framework, and, therefore, is more robust and desirable from a management standpoint.
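A stripped-down illustration of Bayesian source un-mixing (not the F-P or enhanced frameworks themselves): the tracer signatures and noise level are invented, and the posterior over source fractions is evaluated on a simplex grid rather than by Markov Chain Monte Carlo:

```python
import numpy as np

# Three sediment sources with two tracer signatures, a flat prior on the
# source fractions, and a Gaussian likelihood evaluated on a simplex grid.
signatures = np.array([[10.0, 2.0],   # source 1: (tracer A, tracer B)
                       [4.0, 8.0],    # source 2
                       [1.0, 1.0]])   # source 3
true_f = np.array([0.5, 0.3, 0.2])
y_obs = true_f @ signatures            # mixture tracer concentrations
sigma = 0.2                            # measurement noise, invented

step = 0.01
fs, post = [], []
for f1 in np.arange(0.0, 1.0 + 1e-9, step):
    for f2 in np.arange(0.0, 1.0 - f1 + 1e-9, step):
        f = np.array([f1, f2, 1.0 - f1 - f2])
        r = f @ signatures - y_obs
        fs.append(f)
        post.append(np.exp(-0.5 * np.dot(r, r) / sigma ** 2))
fs, post = np.array(fs), np.array(post)
post /= post.sum()
mean_f = post @ fs                     # posterior mean source contributions
```

The full posterior (not just the mean) is what lets such frameworks report the uncertainty of each source's contribution; the probabilistic α and β of the enhanced framework would enter as additional uncertain parameters in the likelihood.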
Directory of Open Access Journals (Sweden)
Rasheda Arman Chowdhury
Localizing the generators of epileptic activity in the brain using Electro-EncephaloGraphy (EEG) or Magneto-EncephaloGraphy (MEG) signals is of particular interest during the pre-surgical investigation of epilepsy. Epileptic discharges can be detected from background brain activity, provided they are associated with spatially extended generators. Using realistic simulations of epileptic activity, this study evaluates the ability of distributed source localization methods to accurately estimate the location of the generators and their sensitivity to the spatial extent of such generators when using MEG data. Source localization methods based on two types of realistic models have been investigated: (i) brain activity may be modeled using cortical parcels, and (ii) brain activity is assumed to be locally smooth within each parcel. A Data Driven Parcellization (DDP) method was used to segment the cortical surface into non-overlapping parcels, and diffusion-based spatial priors were used to model local spatial smoothness within parcels. These models were implemented within the Maximum Entropy on the Mean (MEM) and the Hierarchical Bayesian (HB) source localization frameworks. We proposed new methods in this context and compared them with other standard ones using Monte Carlo simulations of realistic MEG data involving sources of several spatial extents and depths. Detection accuracy of each method was quantified using Receiver Operating Characteristic (ROC) analysis and localization error metrics. Our results showed that methods implemented within the MEM framework were sensitive to all spatial extents of the sources, ranging from 3 cm² to 30 cm², whatever the number and size of the parcels defining the model. To reach a similar level of accuracy within the HB framework, a model using parcels larger than the size of the sources should be considered.
Prasad, Sudhakar
2014-01-01
We present an asymptotic analysis of the minimum probability of error (MPE) in inferring the correct hypothesis in a Bayesian multi-hypothesis testing (MHT) formalism using many pixels of data that are corrupted by signal dependent shot noise, sensor read noise, and background illumination. We perform this error analysis for a variety of combined noise and background statistics, including a pseudo-Gaussian distribution that can be employed to treat approximately the photon-counting statistics of signal and background as well as purely Gaussian sensor read-out noise and more general, exponentially peaked distributions. We subsequently apply the MPE asymptotics to characterize the minimum conditions needed to localize a point source in three dimensions by means of a rotating-PSF imager and compare its performance with that of a conventional imager in the presence of background and sensor-noise fluctuations. In a separate paper, we apply the formalism to the related but qualitatively different problem of 2D supe...
Common source-multiple load vs. separate source-individual load photovoltaic system
Appelbaum, Joseph
1989-01-01
A comparison of system performance is made for two possible system setups: (1) individual loads powered by separate solar cell sources; and (2) multiple loads powered by a common solar cell source. A proof for resistive loads is given that shows the advantage of a common source over a separate source photovoltaic system for a large range of loads. For identical loads, both systems perform the same.
Extended Nonnegative Tensor Factorisation Models for Musical Sound Source Separation
Directory of Open Access Journals (Sweden)
Derry FitzGerald
2008-01-01
Recently, shift-invariant tensor factorisation algorithms have been proposed for the purposes of sound source separation of pitched musical instruments. However, in practice, existing algorithms require the use of log-frequency spectrograms to allow shift invariance in frequency which causes problems when attempting to resynthesise the separated sources. Further, it is difficult to impose harmonicity constraints on the recovered basis functions. This paper proposes a new additive synthesis-based approach which allows the use of linear-frequency spectrograms as well as imposing strict harmonic constraints, resulting in an improved model. Further, these additional constraints allow the addition of a source filter model to the factorisation framework, and an extended model which is capable of separating mixtures of pitched and percussive instruments simultaneously.
International Nuclear Information System (INIS)
We have developed a Bayesian approach to the analysis of neural electromagnetic (MEG/EEG) data that can incorporate or fuse information from other imaging modalities and addresses the ill-posed inverse problem by sampling the many different solutions which could have produced the given data. From these samples one can draw probabilistic inferences about regions of activation. Our source model assumes a variable number of variable-size cortical regions of stimulus-correlated activity. An active region consists of locations on the cortical surface, within a sphere centered on some location in cortex. The number and radii of active regions can vary up to defined maximum values. The goal of the analysis is to determine the posterior probability distribution for the set of parameters that govern the number, location, and extent of active regions. Markov Chain Monte Carlo is used to generate a large sample of sets of parameters distributed according to the posterior distribution. This sample is representative of the many different source distributions that could account for the given data, and allows identification of probable (i.e. consistent) features across solutions. Examples of the use of this analysis technique with both simulated and empirical MEG data are presented.
Separation of core and crustal magnetic field sources
Shure, L.; Parker, R. L.; Langel, R. A.
1985-01-01
Fluid motions in the electrically conducting core and magnetized crustal rocks are the two major sources of the magnetic field observed on or slightly above the Earth's surface. The exact separation of these two contributions is not possible without imposing a priori assumptions about the internal source distribution. Nonetheless, models of this kind have been developed for hundreds of years. Gauss' method, least-squares analysis with a truncated spherical harmonic expansion, was the method of choice for more than 100 years, although he did not address the separation of core and crustal sources, but rather internal versus external ones. Using some arbitrary criterion for the appropriate truncation level, we now extrapolate core field models downward through the (approximately) insulating mantle. Unfortunately, our view can change dramatically depending on the degree of truncation used to describe core sources.
Sparsity and Morphological Diversity in Blind Source Separation
Bobin, Jérome; Starck, Jean-Luc; Fadili, Jalal M.; Moudden, Yassir
2007-01-01
Over the last few years, the development of multichannel sensors has motivated interest in methods for the coherent processing of multivariate data. Some specific issues have already been addressed, as testified by the wide literature on the so-called blind source separation (BSS) problem. In this context, as clearly emphasized by previous work, it is fundamental that the sources to be retrieved present some quantitatively measurable diversity. Recently, sparsity and morphological diversity have e...
Hosseini-zad, K.; Stähler, S. C.; Sigloch, K.; Scheingraber, C.
2012-04-01
Seismic tomography has made giant progress in the last decade. This has been due to improvements in the method, which allow the high information content of waveform modeling to be combined with the mathematically sound methods of tomographic inversion, and to the vast growth of digitally available broadband seismograms. Both factors together require efficient processing schemes for seismic waveforms, which reduce the necessary manual interaction to a minimum. Since the data growth has mainly taken place in traditionally well instrumented regions, many areas are still sparsely instrumented, so the processing scheme should treat all data with the highest care. Our processing scheme "No data left behind", which is implemented in Python and incorporated into the seismology package ObsPy, automates the steps for global or regional body wave tomography: 1. Data retrieval: Downloading of event-based seismic waveforms from ORFEUS and IRIS. This way, around 1600 stations are available globally. Data from other sources can be added manually. 2. Preprocessing: Deconvolution of instrument responses, recognition of bad recordings and automated correction, if possible. No rejection is done at this stage. 3. Cutting of time windows around body wave phases, decomposition of the signals into 6 frequency bands (20 s to 1 Hz), individual determination of SNR and similarity to synthetic waveforms. 4. Rejection of bad windows. Since the rejection is done based on SNR or CC with synthetics independently for each of the 6 frequency bands, even very noisy stations like ocean islands are not discarded completely. 5. Bayesian source inversion: The source parameters, including depth, CMT and source time function, are determined in a probabilistic way using a wavelet basis and P- and SH-waveforms. The whole algorithm is modular and additional modules (e.g. for OBS preprocessing) can be selected individually.
Underdetermined Blind Audio Source Separation Using Modal Decomposition
Directory of Open Access Journals (Sweden)
Aïssa-El-Bey Abdeldjalil
2007-01-01
This paper introduces new algorithms for the blind separation of audio sources using modal decomposition. Indeed, audio signals and, in particular, musical signals can be well approximated by a sum of damped sinusoidal (modal) components. Based on this representation, we propose a two-step approach consisting of a signal analysis (extraction of the modal components) followed by a signal synthesis (grouping of the components belonging to the same source using vector clustering). For the signal analysis, two existing algorithms are considered and compared: namely the EMD (empirical mode decomposition) algorithm and a parametric estimation algorithm using the ESPRIT technique. A major advantage of the proposed method resides in its validity for both instantaneous and convolutive mixtures and its ability to separate more sources than sensors. Simulation results are given to compare and assess the performance of the proposed algorithms.
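The signal-analysis step can be illustrated with a Prony-type linear-prediction fit of a single damped sinusoid. This is a simplified stand-in for the EMD and ESPRIT analyses compared in the paper, with invented signal parameters:

```python
import numpy as np

# A damped sinusoid x[n] = exp(-d n) cos(2 pi f n) satisfies a 2nd-order
# linear recurrence whose characteristic roots are exp(-d +/- 2j pi f),
# so frequency and damping drop out of a least-squares prediction fit.
n = np.arange(200)
freq, damp = 0.11, 0.01                      # cycles/sample, per-sample decay
x = np.exp(-damp * n) * np.cos(2.0 * np.pi * freq * n)

# Fit x[k] = a1*x[k-1] + a2*x[k-2] by least squares.
A = np.column_stack([x[1:-1], x[:-2]])
a1, a2 = np.linalg.lstsq(A, x[2:], rcond=None)[0]

# Roots of z^2 - a1 z - a2 recover the modal parameters.
roots = np.roots([1.0, -a1, -a2])
est_freq = abs(np.angle(roots[0])) / (2.0 * np.pi)
est_damp = -np.log(abs(roots[0]))
```

In the multi-source setting of the paper, many such modal components are extracted and their spatial signatures are then clustered to group components by source.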
How Many Separable Sources? Model Selection In Independent Components Analysis
DEFF Research Database (Denmark)
Woods, Roger P.; Hansen, Lars Kai; Strother, Stephen
2015-01-01
Unlike mixtures consisting solely of non-Gaussian sources, mixtures including two or more Gaussian components cannot be separated using standard independent components analysis methods that are based on higher order statistics and independent observations. The mixed Independent Components Analysis...... computationally intensive alternative for model selection. Application of the algorithm is illustrated using Fisher’s iris data set and Howells’ craniometric data set. Mixed ICA/PCA is of potential interest in any field of scientific investigation where the authenticity of blindly separated non-Gaussian sources...... might otherwise be questionable. Failure of the Akaike Information Criterion in model selection also has relevance in traditional independent components analysis where all sources are assumed non-Gaussian....
Jutten, Christian; Karhunen, Juha
2004-10-01
In this paper, we review recent advances in blind source separation (BSS) and independent component analysis (ICA) for nonlinear mixing models. After a general introduction to BSS and ICA, we discuss in more detail uniqueness and separability issues, presenting some new results. A fundamental difficulty in the nonlinear BSS problem and even more so in the nonlinear ICA problem is that they provide non-unique solutions without extra constraints, which are often implemented by using a suitable regularization. In this paper, we explore two possible approaches. The first one is based on structural constraints. Especially, post-nonlinear mixtures are an important special case, where a nonlinearity is applied to linear mixtures. For such mixtures, the ambiguities are essentially the same as for the linear ICA or BSS problems. The second approach uses Bayesian inference methods for estimating the best statistical parameters, under almost unconstrained models in which priors can be easily added. In the latter part of this paper, various separation techniques proposed for post-nonlinear mixtures and general nonlinear mixtures are reviewed. PMID:15593377
Troldborg, M.; Nowak, W.; Binning, P. J.; Bjerg, P. L.
2012-12-01
Estimates of mass discharge (mass/time) are increasingly being used when assessing risks of groundwater contamination and designing remedial systems at contaminated sites. Mass discharge estimates are, however, prone to rather large uncertainties as they integrate uncertain spatial distributions of both concentration and groundwater flow velocities. For risk assessments or any other decisions that are being based on mass discharge estimates, it is essential to address these uncertainties. We present a novel Bayesian geostatistical approach for quantifying the uncertainty of the mass discharge across a multilevel control plane. The method decouples the flow and transport simulation and has the advantage of avoiding the heavy computational burden of three-dimensional numerical flow and transport simulation coupled with geostatistical inversion. It may therefore be of practical relevance to practitioners compared to existing methods that are either too simple or computationally demanding. The method is based on conditional geostatistical simulation and accounts for i) heterogeneity of both the flow field and the concentration distribution through Bayesian geostatistics (including the uncertainty in covariance functions), ii) measurement uncertainty, and iii) uncertain source zone geometry and transport parameters. The method generates multiple equally likely realizations of the spatial flow and concentration distribution, which all honour the measured data at the control plane. The flow realizations are generated by analytical co-simulation of the hydraulic conductivity and the hydraulic gradient across the control plane. These realizations are made consistent with measurements of both hydraulic conductivity and head at the site. An analytical macro-dispersive transport solution is employed to simulate the mean concentration distribution across the control plane, and a geostatistical model of the Box-Cox transformed concentration data is used to simulate observed
FREQUENCY OVERLAPPED SIGNAL IDENTIFICATION USING BLIND SOURCE SEPARATION
Institute of Scientific and Technical Information of China (English)
WANG Junfeng; SHI Tielin; HE Lingsong; YANG Shuzi
2006-01-01
The concepts, principles and usages of principal component analysis (PCA) and independent component analysis (ICA) are interpreted. The algorithm and methodology of ICA-based blind source separation (BSS), in which PCA-based pre-whitening of the observed signals is used, are then investigated. A simulation has been carried out in which the theory and approach of BSS are used to separate mixture signals whose frequency components overlap one another. The result shows that BSS has advantages that the traditional methodology of frequency analysis does not.
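The PCA pre-whitening stage can be written in a few lines; the 2x2 mixing matrix below is an arbitrary example:

```python
import numpy as np

# PCA-based pre-whitening, the standard first stage of ICA-based BSS:
# after whitening, the data have identity covariance, so the remaining
# un-mixing matrix can be restricted to a rotation.
def whiten(X):
    """X: (channels, samples). Returns whitened data and whitening matrix."""
    Xc = X - X.mean(axis=1, keepdims=True)
    cov = Xc @ Xc.T / Xc.shape[1]
    eigval, eigvec = np.linalg.eigh(cov)
    V = eigvec @ np.diag(1.0 / np.sqrt(eigval)) @ eigvec.T   # ZCA whitening
    return V @ Xc, V

rng = np.random.default_rng(0)
S = rng.standard_normal((2, 5000))           # surrogate sources
A = np.array([[1.0, 0.6], [0.4, 1.0]])       # example mixing matrix
Z, V = whiten(A @ S)
```

The follow-on ICA stage then only has to find a rotation of Z, which is what makes the pre-whitening worthwhile even for overlapping-spectrum mixtures.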
Source separation and clustering of phase-locked subspaces.
Almeida, Miguel; Schleimer, Jan-Hendrik; Bioucas-Dias, José Mario; Vigário, Ricardo
2011-09-01
It has been proven that there are synchrony (or phase-locking) phenomena present in multiple oscillating systems such as electrical circuits, lasers, chemical reactions, and human neurons. If the measurements of these systems cannot detect the individual oscillators but rather a superposition of them, as in brain electrophysiological signals (electro- and magnetoencephalogram), spurious phase locking will be detected. Current source-extraction techniques attempt to undo this superposition by assuming properties of the data which are not valid when the underlying sources are phase-locked. Statistical independence of the sources is one such invalid assumption, as phase-locked sources are dependent. In this paper, we introduce methods for source separation and clustering which make adequate assumptions for data where synchrony is present, and show with simulated data that they perform well even in cases where independent component analysis and other well-known source-separation methods fail. The results in this paper provide a proof of concept that synchrony-based techniques are useful for low-noise applications. PMID:21791409
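A common way to quantify the phase locking that breaks the independence assumption is the phase-locking value (PLV). The sketch below computes it with an FFT-based analytic signal (even-length input assumed; the test signals are invented, not from the paper):

```python
import numpy as np

# PLV between two signals: phase-locked oscillators give PLV near 1,
# unrelated signals give PLV near 0.
def analytic(x):
    """Discrete analytic signal via the frequency-domain Hilbert method."""
    n = len(x)                      # n assumed even here
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    h[1:n // 2] = 2.0
    h[n // 2] = 1.0
    return np.fft.ifft(X * h)

def plv(x, y):
    dphi = np.angle(analytic(x)) - np.angle(analytic(y))
    return abs(np.exp(1j * dphi).mean())

t = np.arange(1024) / 1024.0
a = np.sin(2 * np.pi * 50 * t)
b = np.sin(2 * np.pi * 50 * t + 0.8)   # same frequency, constant phase lag
c = np.random.default_rng(1).standard_normal(1024)
```

Signals a and b are perfectly phase-locked despite their lag, while a and c are not; a PLV matrix over all channel pairs is the natural input to the clustering step the abstract describes.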
A source separation approach to enhancing marine mammal vocalizations.
Gur, M Berke; Niezrecki, Christopher
2009-12-01
A common problem in passive acoustic based marine mammal monitoring is the contamination of vocalizations by a noise source, such as a surface vessel. The conventional approach in improving the vocalization signal to noise ratio (SNR) is to suppress the unwanted noise sources by beamforming the measurements made using an array. In this paper, an alternative approach to multi-channel underwater signal enhancement is proposed. Specifically, a blind source separation algorithm that extracts the vocalization signal from two-channel noisy measurements is derived and implemented. The proposed algorithm uses a robust decorrelation criterion to separate the vocalization from background noise, and hence is suitable for low SNR measurements. To overcome the convergence limitations resulting from temporally correlated recordings, the supervised affine projection filter update rule is adapted to the unsupervised source separation framework. The proposed method is evaluated using real West Indian manatee (Trichechus manatus latirostris) vocalizations and watercraft emitted noise measurements made within a typical manatee habitat in Florida. The results suggest that the proposed algorithm can improve the detection range of a passive acoustic detector five times on average (for input SNR between -10 and 5 dB) using only two receivers. PMID:20000920
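A second-order decorrelation separator in this spirit can be sketched with an AMUSE-style algorithm. This is a generic stand-in, not the authors' affine-projection method, and the narrowband "vocalization" and broadband "vessel noise" below are synthetic:

```python
import numpy as np

# AMUSE-style separation: whiten the two-channel mixture, then diagonalize
# a symmetrized time-lagged covariance. Sources with different spectra are
# recovered from second-order statistics alone.
def amuse(X, lag=1):
    Xc = X - X.mean(axis=1, keepdims=True)
    d, E = np.linalg.eigh(Xc @ Xc.T / Xc.shape[1])      # whitening
    V = E @ np.diag(1.0 / np.sqrt(d)) @ E.T
    Z = V @ Xc
    C = Z[:, lag:] @ Z[:, :-lag].T / (Z.shape[1] - lag)  # lagged covariance
    _, U = np.linalg.eigh((C + C.T) / 2.0)
    return U.T @ Z

n = np.arange(20000)
s1 = np.sin(2 * np.pi * 0.01 * n)              # narrowband "vocalization"
s2 = np.random.default_rng(2).standard_normal(n.size)  # broadband "noise"
X = np.array([[1.0, 0.7], [0.5, 1.0]]) @ np.vstack([s1, s2])
Y = amuse(X)
```

The lagged covariance is what distinguishes the narrowband source (lag-1 autocorrelation near 1) from the broadband one (near 0); this works without higher-order statistics, which is why decorrelation criteria suit low-SNR recordings.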
Jank, Anna; Müller, Wolfgang; Schneider, Irene; Gerke, Frederic; Bockreis, Anke
2015-05-01
An efficient biological treatment of source separated organic waste from household kitchens and gardens (biowaste) requires an adequate upfront mechanical preparation, which possibly includes hand sorting for the separation of contaminants. In this work, untreated biowaste from households and gardens and the screen overflow >60 mm of the same waste were mechanically treated by a Waste Separation Press (WSP). The WSP separates the waste into a wet fraction for biological treatment and a fraction of dry contaminants for incineration. The results show that it is possible to replace hand sorting of contaminants, milling and screening of organic waste before the biological treatment by using the WSP. A special focus was put on the separation of contaminants. The separation rate of plastic film from the untreated biowaste was 67% and the separation rate of glass was about 92%. About 90% of the organics were transferred to the fraction for further biological treatment. When treating the screen overflow >60 mm with the WSP, 86% of the plastic film and 88% of the glass were transferred to the contaminants fraction. 32% of the organics were transferred to the contaminants fraction and thereby lost for further biological treatment. Additionally, it was calculated that national standards for glass contaminants in compost can be met when using the WSP to mechanically treat the total biowaste. The loss of biogas by transferring biodegradable organics to the contaminants fraction was about 11% when preparing the untreated biowaste with the WSP. PMID:25761398
Ransom, Katherine M.; Grote, Mark N.; Deinhart, Amanda; Eppich, Gary; Kendall, Carol; Sanborn, Matthew E.; Souders, A. Kate; Wimpenny, Joshua; Yin, Qing-zhu; Young, Megan; Harter, Thomas
2016-07-01
Groundwater quality is a concern in alluvial aquifers that underlie agricultural areas, such as in the San Joaquin Valley of California. Shallow domestic wells (less than 150 m deep) in agricultural areas are often contaminated by nitrate. Agricultural and rural nitrate sources include dairy manure, synthetic fertilizers, and septic waste. Knowledge of the relative proportion that each of these sources contributes to nitrate concentration in individual wells can aid future regulatory and land management decisions. We show that nitrogen and oxygen isotopes of nitrate, boron isotopes, and iodine concentrations are a useful, novel combination of groundwater tracers to differentiate between manure, fertilizers, septic waste, and natural sources of nitrate. Furthermore, in this work, we develop a new Bayesian mixing model in which these isotopic and elemental tracers were used to estimate the probability distribution of the fractional contributions of manure, fertilizers, septic waste, and natural sources to the nitrate concentration found in an individual well. The approach was applied to 56 nitrate-impacted private domestic wells located in the San Joaquin Valley. Model analysis found that some domestic wells were clearly dominated by the manure source and suggests evidence for majority contributions from either the septic or fertilizer source for other wells. But, predictions of fractional contributions for septic and fertilizer sources were often of similar magnitude, perhaps because modeled uncertainty about the fraction of each was large. For validation of the Bayesian model, fractional estimates were compared to surrounding land use and estimated source contributions were broadly consistent with nearby land use types.
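A toy version of such a Bayesian mixing model can be sketched as below. The tracer signatures, well measurement, and noise scale are invented for illustration (they are not the study's data), and a simple softmax random-walk Metropolis sampler stands in for the authors' actual model:

```python
import numpy as np

rng = np.random.default_rng(1)

# hypothetical tracer signatures: rows are manure, fertilizer, septic, natural
mu = np.array([[12.0, 3.0], [2.0, 1.0], [8.0, 9.0], [4.0, 5.0]])
sigma = 1.0                        # assumed tracer spread / measurement noise
obs = np.array([9.0, 4.0])         # tracer values measured in one well (invented)

def log_post(f):
    pred = f @ mu                  # predicted tracer values for mixture f
    return -0.5 * np.sum((obs - pred) ** 2) / sigma ** 2

def softmax(theta):
    e = np.exp(theta - theta.max())
    return e / e.sum()

# random-walk Metropolis on a softmax parameterization of the simplex
theta = np.zeros(4)
lp = log_post(softmax(theta))
samples = []
for _ in range(20000):
    prop = theta + 0.3 * rng.normal(size=4)
    lp_prop = log_post(softmax(prop))
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    samples.append(softmax(theta))

frac = np.mean(samples[5000:], axis=0)   # posterior-mean source fractions
```

The posterior mean `frac` is the probabilistic analogue of a deterministic mixing-model solution: each component is the estimated fractional contribution of one source, and the spread of the retained samples quantifies the uncertainty the abstract describes.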
An autonomous surveillance system for blind sources localization and separation
Wu, Sean; Kulkarni, Raghavendra; Duraiswamy, Srikanth
2013-05-01
This paper aims at developing a new technology that enables one to conduct autonomous and silent surveillance to monitor sound sources, stationary or moving in 3D space, and to perform blind separation of target acoustic signals. The underlying principle of this technology is a hybrid approach that uses: (1) a passive sonic detection and ranging method that consists of iterative triangulation and redundant checking to locate the Cartesian coordinates of arbitrary sound sources in 3D space, (2) advanced signal processing to sanitize the measured data and enhance the signal-to-noise ratio, and (3) short-time source localization and separation to extract the target acoustic signals from the directly measured mixed ones. A prototype based on this technology has been developed; its hardware includes six B&K 1/4-in condenser microphones, Type 4935, two 4-channel data acquisition units, Type NI-9234, with a maximum sampling rate of 51.2 kS/s per channel, one NI-cDAQ 9174 chassis, a thermometer to measure the air temperature, a camera to view the relative positions of located sources, and a laptop to control data acquisition and post-processing. Test results for locating arbitrary sound sources emitting continuous, random, impulsive, and transient signals, and for blind separation of signals in various non-ideal environments, are presented. This system is invisible to any anti-surveillance device since it uses the acoustic signal emitted by a target source. It can be mounted on a robot or an unmanned vehicle to perform various covert operations, including intelligence gathering in an open or a confined field, or to carry out rescue missions to search for people trapped inside ruins or buried under wreckage.
Source Separation with One Ear: Proposition for an Anthropomorphic Approach
Directory of Open Access Journals (Sweden)
Ramin Pichevar
2005-06-01
Full Text Available We present an example of an anthropomorphic approach, in which auditory-based cues are combined with temporal correlation to implement a source separation system. The auditory features are based on spectral amplitude modulation and energy information obtained through 256 cochlear filters. Segmentation and binding of auditory objects are performed with a two-layered spiking neural network. The first layer performs the segmentation of the auditory images into objects, while the second layer binds the auditory objects belonging to the same source. The binding is further used to generate a mask (binary gain) to suppress the undesired sources from the original signal. Results are presented for a double-voiced (2-speaker) speech segment and for sentences corrupted with different noise sources. Comparative results are also given using PESQ (perceptual evaluation of speech quality) scores. The spiking neural network is fully adaptive and unsupervised.
Blind Source Separation with Compressively Sensed Linear Mixtures
Kleinsteuber, Martin
2011-01-01
This work studies the problem of simultaneously separating and reconstructing signals from compressively sensed linear mixtures. We assume that all source signals share a common sparse representation basis. The approach combines classical Compressive Sensing (CS) theory with a linear mixing model. It allows the mixtures to be sampled independently of each other. If samples are acquired in the time domain, this means that the sensors need not be synchronized. Since Blind Source Separation (BSS) from a linear mixture is only possible up to permutation and scaling, factoring out these ambiguities leads to a minimization problem on the so-called oblique manifold. We develop a geometric conjugate subgradient method that scales to large systems for solving the problem. Numerical results demonstrate the promising performance of the proposed algorithm compared to several state of the art methods.
Blind source separation advances in theory, algorithms and applications
Wang, Wenwu
2014-01-01
Blind Source Separation intends to report the new results of the efforts on the study of Blind Source Separation (BSS). The book collects novel research ideas and some training in BSS, independent component analysis (ICA), artificial intelligence and signal processing applications. Furthermore, the research results previously scattered in many journals and conferences worldwide are methodically edited and presented in a unified form. The book is likely to be of interest to university researchers, R&D engineers and graduate students in computer science and electronics who wish to learn the core principles, methods, algorithms, and applications of BSS. Dr. Ganesh R. Naik works at University of Technology, Sydney, Australia; Dr. Wenwu Wang works at University of Surrey, UK.
Meillier, Céline; Chatelain, Florent; Michel, Olivier; Bacon, Roland; Piqueras, Laure; Bacher, Raphael; Ayasso, Hacheme
2016-04-01
We present SELFI, the Source Emission Line FInder, a new Bayesian method optimized for detection of faint galaxies in Multi Unit Spectroscopic Explorer (MUSE) deep fields. MUSE is the new panoramic integral field spectrograph at the Very Large Telescope (VLT) that has unique capabilities for spectroscopic investigation of the deep sky. It has provided data cubes with 324 million voxels over a single 1 arcmin² field of view. To address the challenge of faint-galaxy detection in these large data cubes, we developed a new method that processes 3D data either for modeling or for estimation and extraction of source configurations. This object-based approach yields a natural sparse representation of the sources in massive data fields, such as MUSE data cubes. In the Bayesian framework, the parameters that describe the observed sources are considered random variables. The Bayesian model leads to a general and robust algorithm in which the parameters are estimated in a fully data-driven way. This detection algorithm was applied to the MUSE observation of the Hubble Deep Field-South. With 27 h of total integration time, these observations provide a catalog of 189 sources of various categories with secure redshifts. The algorithm retrieved 91% of the galaxies with only 9% false detections. The method also allowed the discovery of three new Lyα emitters and one [OII] emitter, all without any Hubble Space Telescope counterpart. We analyzed the reasons for failure for some targets and found that the most important limitation of the method arises when faint sources are located in the vicinity of bright, spatially resolved galaxies that cannot be approximated by a Sérsic elliptical profile. The software and its documentation are available on the MUSE science web service (muse-vlt.eu/science).
International Nuclear Information System (INIS)
Acoustic emission (AE) is a well-established nondestructive testing method for assessing the condition of liquid-filled tanks. Often the tank can be tested without the need for accurate location of AE sources. But sometimes accurate location is required, such as in the case of follow-up inspections after AE has indicated a significant defect. Traditional computed location techniques that consider only the wave traveling through the shell of the tank have not proved reliable when applied to liquid-filled tanks. This is because AE sensors often respond to liquid-borne waves, which are not considered in the traditional algorithms. This paper describes an approach for locating AE sources on the wall of liquid-filled tanks that includes two novel aspects: (i) the use of liquid-borne waves, and (ii) the use of a probabilistic algorithm. The proposed algorithm is developed within a Bayesian framework that considers uncertainties in the wave velocities and the times of arrival. A Markov chain Monte Carlo method is used to estimate the distribution of the AE source location. This approach was applied to a 102 inch diameter (29 000 gal) railroad tank car by estimating source locations from pencil lead breaks with recorded waveforms. Results show that the proposed Bayesian approach to source location can be used to calculate the most probable region of the tank wall where the AE source is located. (paper)
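The probabilistic location idea can be sketched with a small grid-evaluated posterior. The sensor layout, wave speed, and noise level below are hypothetical stand-ins, and a grid evaluation replaces the paper's Markov chain Monte Carlo sampler; the structure (a likelihood over arrival-time differences plus a prior over the uncertain wave velocity) is the same:

```python
import numpy as np

sensors = np.array([[0.0, 0.0], [4.0, 0.0], [0.0, 3.0], [4.0, 3.0]])  # assumed
v_mean, v_sd = 3.0, 0.3      # uncertain wave speed (arbitrary units)
sigma_t = 0.05               # assumed arrival-time uncertainty
true_src = np.array([1.0, 2.0])

# simulate noiseless time differences of arrival relative to sensor 0
d = np.linalg.norm(sensors - true_src, axis=1)
tdoa = (d - d[0]) / v_mean

# evaluate the posterior on a grid, marginalizing the wave speed numerically
xs, ys = np.linspace(0, 4, 81), np.linspace(0, 3, 61)
vs = np.linspace(2.1, 3.9, 19)
post = np.zeros((len(xs), len(ys)))
for i, x in enumerate(xs):
    for j, y in enumerate(ys):
        dd = np.linalg.norm(sensors - np.array([x, y]), axis=1)
        for v in vs:
            pred = (dd - dd[0]) / v
            ll = (-0.5 * np.sum((tdoa - pred) ** 2) / sigma_t ** 2
                  - 0.5 * ((v - v_mean) / v_sd) ** 2)  # likelihood x speed prior
            post[i, j] += np.exp(ll)

i, j = np.unravel_index(post.argmax(), post.shape)
est = np.array([xs[i], ys[j]])   # most probable source location
```

The normalized `post` array is exactly the "most probable region" the abstract refers to: instead of a single triangulated point, the method returns a probability map over the wall that absorbs the velocity and timing uncertainties.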
Ransom, K.; Grote, M. N.; Deinhart, A.; Eppich, G.; Kendall, C.; Sanborn, M.; Souders, K.; Wimpenny, J.; Yin, Q. Z.; Young, M. B.; Harter, T.
2015-12-01
Groundwater quality is a concern in alluvial aquifers that underlie agricultural areas, such as in the San Joaquin Valley of California. Nitrate from fertilizers and animal waste can leach to groundwater and contaminate drinking water resources. Dairy manure and synthetic fertilizers are prevailing sources of nitrate in groundwater for the San Joaquin Valley with septic waste contributing as a major source in some areas. The rural population in the San Joaquin Valley relies almost exclusively on shallow domestic wells (less than 150 m deep), of which many have been affected by nitrate. Knowledge of the proportion of each of the three main nitrate sources (manure, synthetic fertilizer, and septic waste) contributing to individual well nitrate can aid future regulatory decisions. Mixing models quantify the proportional contributions of sources to a mixture by using the concentration of conservative tracers within each source as a source signature. Deterministic mixing models are common, but do not allow for variability in the tracer source concentration or overlap of tracer concentrations between sources. In contrast, Bayesian mixing models treat source contributions probabilistically, building statistical variation into the inferences for each well. The authors developed a Bayesian mixing model on a pilot network of 56 private domestic wells in the San Joaquin Valley for which nitrogen, oxygen, and boron isotopes as well as nitrate and iodine were measured. Nitrogen, oxygen, and boron isotopes as well as iodine can be used as tracers to differentiate between manure, fertilizers, septic waste, and natural sources of nitrate (which can contribute nitrate in concentrations up to 4 mg/L). In this work, the isotopic and elemental tracers were used to estimate the proportional contribution of manure, fertilizers, septic waste, and natural sources to overall groundwater nitrate concentration in individual wells. Prior distributions for the four tracers for each of the four
The Leuven isotope separator on-line laser ion source
Kudryavtsev, Y; Franchoo, S; Huyse, M; Gentens, J; Kruglov, K; Müller, W F; Prasad, N V S; Raabe, R; Reusen, I; Van den Bergh, P; Van Duppen, P; Van Roosbroeck, J; Vermeeren, L; Weissman, L
2002-01-01
An element-selective laser ion source has been used to produce beams of exotic radioactive nuclei and to study their decay properties. The operational principle of the ion source is based on selective resonant laser ionization of nuclear reaction products thermalized and neutralized in a noble gas at high pressure. The ion source has been installed at the Leuven Isotope Separator On-Line (LISOL), which is coupled on-line to the cyclotron accelerator at Louvain-la-Neuve. 54,55Ni and 54,55Co isotopes were produced in light-ion-induced fusion reactions. Exotic nickel, cobalt and copper nuclei were produced in proton-induced fission of 238U. The beta decay of the 68-74Ni, 67-70Co, 70-75Cu and 110-114Rh isotopes has been studied by means of beta-gamma and gamma-gamma spectroscopy. Recently, the laser ion source has been used to produce neutron-d...
DEFF Research Database (Denmark)
Pires, Sara Monteiro; Hald, Tine
2010-01-01
Salmonella is a major cause of human gastroenteritis worldwide. To prioritize interventions and assess the effectiveness of efforts to reduce illness, it is important to attribute salmonellosis to the responsible sources. Studies have suggested that some Salmonella subtypes have a higher health impact than others. Likewise, some food sources appear to have a higher impact than others. These differences presumably represent multiple factors, such as differences in survivability through the food chain and/or pathogenicity. The relative importance of the source-dependent factors varied considerably over the years, reflecting, among others, variability in the surveillance programs for the different… Knowledge of variability in the impact of subtypes and sources may provide valuable added information for research, risk management, and public health strategies. We developed a Bayesian model that attributes…
DEFF Research Database (Denmark)
Troldborg, Mads; Nowak, Wolfgang; Binning, Philip John;
…of both concentration and groundwater flow. For risk assessments or any other decisions based on mass discharge estimates, it is essential to address these uncertainties. We present a novel Bayesian geostatistical approach for quantifying the uncertainty of the mass discharge across a multilevel control plane. The method decouples the flow and transport simulation and has the advantage of avoiding the heavy computational burden of three-dimensional numerical flow and transport simulation coupled with geostatistical inversion. It may therefore be of practical relevance to practitioners compared to existing methods that are either too simple or computationally demanding. The method is based on conditional geostatistical simulation and accounts for (i) heterogeneity of both the flow field and the concentration distribution through Bayesian geostatistics, (ii) measurement uncertainty, and (iii)…
Bayesian mixture models for Poisson astronomical images
Guglielmetti, Fabrizia; Fischer, Rainer; Dose, Volker
2012-01-01
Astronomical images in the Poisson regime are typically characterized by a spatially varying cosmic background, large variety of source morphologies and intensities, data incompleteness, steep gradients in the data, and few photon counts per pixel. The Background-Source separation technique is developed with the aim to detect faint and extended sources in astronomical images characterized by Poisson statistics. The technique employs Bayesian mixture models to reliably detect the background as...
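The core of a background/source mixture model for Poisson counts can be sketched with a two-component Poisson mixture fitted by EM; this maximum-likelihood scheme is a stand-in for the full Bayesian treatment in the paper, and the image, rates, and mixing fraction below are synthetic:

```python
import numpy as np

rng = np.random.default_rng(3)

# synthetic image: 90% background pixels (rate 2), 10% source pixels (rate 12)
is_src = rng.uniform(size=10000) < 0.1
counts = rng.poisson(np.where(is_src, 12.0, 2.0))

# EM for a two-component Poisson mixture (background vs. source)
lam = np.array([1.0, 5.0])          # initial rates
pi = np.array([0.5, 0.5])           # initial mixing weights
for _ in range(200):
    # E-step: responsibilities (the log n! term cancels between components)
    logp = counts[:, None] * np.log(lam) - lam + np.log(pi)
    r = np.exp(logp - logp.max(axis=1, keepdims=True))
    r /= r.sum(axis=1, keepdims=True)
    # M-step: update mixing weights and Poisson rates
    pi = r.mean(axis=0)
    lam = (r * counts[:, None]).sum(axis=0) / r.sum(axis=0)
```

After convergence, the per-pixel responsibilities `r` play the role of a background/source classification: a pixel with high responsibility for the large-rate component is a candidate source detection even at a few counts per pixel.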
Exploiting Narrowband Efficiency for Broadband Convolutive Blind Source Separation
Directory of Open Access Journals (Sweden)
Aichner Robert
2007-01-01
Full Text Available Based on a recently presented generic broadband blind source separation (BSS) algorithm for convolutive mixtures, we propose in this paper a novel algorithm combining the advantages of broadband algorithms with the computational efficiency of narrowband techniques. By selective application of the Szegö theorem, which relates properties of Toeplitz and circulant matrices, a new normalization is derived as a special case of the generic broadband algorithm. This results in a computationally efficient and fast-converging algorithm without introducing typical narrowband problems such as the internal permutation problem or circularity effects. Moreover, a novel regularization method for the generic broadband algorithm is proposed and subsequently also derived for the proposed algorithm. Experimental results in realistic acoustic environments show improved performance of the novel algorithm compared to previous approximations.
A multilevel voltage-source inverter with separate dc sources for static var generation
Energy Technology Data Exchange (ETDEWEB)
Peng, Fang Zheng [Tennessee Univ., Knoxville, TN (United States)]|[Oak Ridge National Lab., TN (United States); Lai, Jih-Sheng; McKeever, J.; VanCoevering, J. [Oak Ridge National Lab., TN (United States)
1995-09-01
A new multilevel voltage-source inverter with separate dc sources is proposed for high-voltage, high-power applications, such as flexible ac transmission systems (FACTS) including static var generation (SVG), power line conditioning, series compensation, phase shifting, voltage balancing, fuel cell and photovoltaic utility systems interfacing, etc. The new M-level inverter consists of (M-1)/2 single-phase full bridges, in which each bridge has its own separate dc source. This inverter can generate an almost sinusoidal voltage waveform while switching only once per cycle as the number of levels increases. It avoids the problems of conventional transformer-based multipulse inverters as well as those of the multilevel diode-clamped inverter and the multilevel flying-capacitor inverter. To demonstrate the superiority of the new inverter, an SVG system using the new inverter topology is discussed through analysis, simulation and experiment.
Blind source separation with unknown and dynamically changing number of source signals
Institute of Scientific and Technical Information of China (English)
YE Jimin; ZHANG Xianda; ZHU Xiaolong
2006-01-01
The contrast function remains an open problem in blind source separation (BSS) when the number of source signals is unknown and/or changes dynamically. This paper studies the problem and proves that mutual information is still a contrast function for BSS if the mixing matrix has full column rank. The mutual information reaches its minimum at the separation points, where the random outputs of the BSS system are the scaled and permuted source signals, while the remaining outputs are zero. Using the property that the transpose of the mixing matrix and a matrix composed of m observed signals have an identical null space with probability one, a practical method is proposed that detects the unknown number of source signals n and further tracks dynamical changes in the number of sources from a small amount of data. The effectiveness of the proposed theory and the novel algorithm is verified by adaptive BSS simulations with an unknown and dynamically changing number of source signals.
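The key observable behind detecting the number of sources can be illustrated with a covariance-eigenvalue sketch: when the mixing matrix has full column rank and the noise is weak, m - n eigenvalues of the observation covariance collapse to the noise floor. The sizes, noise level, and threshold below are illustrative assumptions, not the paper's algorithm:

```python
import numpy as np

rng = np.random.default_rng(2)
T, n, m = 5000, 3, 6               # 3 sources observed through 6 channels
S = rng.laplace(size=(n, T))       # toy non-Gaussian sources
A = rng.normal(size=(m, n))        # full-column-rank mixing matrix (assumed)
X = A @ S + 0.01 * rng.normal(size=(m, T))   # low-noise mixtures

# with full column rank and weak noise, m - n covariance eigenvalues
# collapse to the noise floor; counting the large ones estimates n
w = np.linalg.eigvalsh(X @ X.T / T)[::-1]    # eigenvalues, descending
n_est = int(np.sum(w > 100 * w[-1]))         # crude relative threshold
```

Repeating this estimate over a sliding window of observations gives a simple way to track a dynamically changing source count, in the spirit of (though much cruder than) the null-space test proposed in the paper.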
Decentralized modal identification using sparse blind source separation
International Nuclear Information System (INIS)
Popular ambient vibration-based system identification methods process information collected from a dense array of sensors centrally to yield the modal properties. In such methods, the need for a centralized processing unit capable of satisfying large memory and processing demands is unavoidable. With the advent of wireless smart sensor networks, it is now possible to process information locally at the sensor level instead. The information at the individual sensor level can then be concatenated to obtain the global structural characteristics. A novel decentralized algorithm based on wavelet transforms to infer global structural mode information using measurements obtained from a small group of sensors at a time is proposed in this paper. The focus of the paper is on algorithmic development; the actual hardware and software implementation is not pursued here. The problem of identification is cast within the framework of under-determined blind source separation, invoking transformations of measurements to the time–frequency domain resulting in a sparse representation. The partial mode shape coefficients so identified are then combined to yield complete modal information. The transformations are undertaken using the stationary wavelet packet transform (SWPT), yielding a sparse representation in the wavelet domain. Principal component analysis (PCA) is then performed on the resulting wavelet coefficients, yielding the partial mixing matrix coefficients from a few measurement channels at a time. This process is repeated using measurements obtained from multiple sensor groups, and the results so obtained from each group are concatenated to obtain the global modal characteristics of the structure.
A DIGITAL COLOR IMAGE WATERMARKING SYSTEM USING BLIND SOURCE SEPARATION
Directory of Open Access Journals (Sweden)
Sangeeta D. Jadhav
2013-12-01
Full Text Available An attempt is made to implement a digital color image-adaptive watermarking scheme in the spatial domain and in a hybrid domain, i.e., the host image in the wavelet domain and the watermark in the spatial domain. Blind Source Separation (BSS) is used to extract the watermark. The novelty of the presented scheme lies in determining the mixing matrix for the BSS model using the BFGS (Broyden–Fletcher–Goldfarb–Shanno) optimization technique. The method is based on the smooth and textured portions of the image. Texture analysis is carried out based on the energy content of the image (using the GLCM), which makes the method image-adaptive for embedding a color watermark. The performance evaluation is carried out for hybrid domains of various color spaces such as YIQ, HSI and YCbCr, and the feasibility of the optimization algorithm for finding the mixing matrix is also checked for these color spaces. Three ICA (Independent Component Analysis)/BSS algorithms are used in the extraction procedure, through which the watermark can be retrieved efficiently. An effort is made to find the best-suited color space for embedding the watermark that satisfies the conditions of imperceptibility and robustness against various attacks.
Bayesian Integration of Isotope Ratios for Geographic Sourcing of Castor Beans
Energy Technology Data Exchange (ETDEWEB)
Webb-Robertson, Bobbie-Jo M.; Kreuzer, Helen W.; Hart, Garret L.; Ehleringer, James; West, Jason B.; Gill, Gary A.; Duckworth, Douglas C.
2012-08-15
Recent years have seen an increase in the forensic interest associated with the poison ricin, which is extracted from the seeds of the Ricinus communis plant. Both light element (C, N, O, and H) and strontium (Sr) isotope ratios have previously been used to associate organic material with geographic regions of origin. We present a Bayesian integration methodology that can more accurately predict the region of origin for a castor bean than individual models developed independently for light element stable isotopes or Sr isotope ratios. Our results demonstrate a clear improvement in the ability to correctly classify regions based on the integrated model, with a class accuracy of 60.9 ± 2.1% versus 55.9 ± 2.1% and 40.2 ± 1.8% for the light element and strontium (Sr) isotope ratios, respectively. In addition, we show graphically the strengths and weaknesses of each dataset with respect to class prediction and how the integration of these datasets strengthens the overall model.
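The integration step can be sketched as a naive-Bayes fusion of per-model class posteriors; the regions and probabilities below are invented for illustration, and the paper's actual methodology may differ:

```python
import numpy as np

regions = ["Asia", "Europe", "N. America"]
p_light = np.array([0.50, 0.30, 0.20])   # light-element model posterior (invented)
p_sr = np.array([0.20, 0.55, 0.25])      # Sr isotope model posterior (invented)
prior = np.ones(3) / 3                   # uniform prior over regions

# assuming the two datasets are conditionally independent given the region,
# divide out the shared prior once and renormalize to fuse the posteriors
fused = p_light * p_sr / prior
fused /= fused.sum()
best = regions[int(np.argmax(fused))]
```

Fusion helps precisely when the two models disagree: here the light-element model alone would pick the first region, but combining the evidence shifts the decision, mirroring the accuracy gain the abstract reports for the integrated model.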
The Effects of Environmental Management Systems on Source Separation in the Work and Home Settings
Directory of Open Access Journals (Sweden)
Chris von Borgstede
2012-06-01
Full Text Available Measures that challenge the generation of waste are needed to address the global problem of the increasing volumes of waste generated in both private homes and workplaces. Source separation at the workplace is commonly implemented through environmental management systems (EMS). In the present study, the relationship between source separation at work and at home was investigated. A questionnaire that maps psychological and behavioural predictors of source separation was distributed to employees at different workplaces. The results show that respondents who are aware of EMS report higher levels of source separation at work, stronger environmental concern, stronger personal and social norms, and perceive source separation to be less difficult. Furthermore, the results support the notion that after the adoption of EMS at the workplace, source separation at work spills over into source separation in the household. The potential implications for environmental management systems are discussed.
Using the FASST source separation toolbox for noise robust speech recognition
Ozerov, Alexey; Vincent, Emmanuel
2011-01-01
We describe our submission to the 2011 CHiME Speech Separation and Recognition Challenge. Our speech separation algorithm was built using the Flexible Audio Source Separation Toolbox (FASST) we developed recently. This toolbox is an implementation of a general flexible framework based on a library of structured source models that enable the incorporation of prior knowledge about a source separation problem via user-specifiable constraints. We show how to use FASST to develop an efficient spee...
Energy Technology Data Exchange (ETDEWEB)
Keats, A.; Lien, F.S. [Waterloo Univ., ON (Canada). Dept. of Mechanical Engineering; Yee, E. [Defence Research and Development Canada, Medicine Hat, AB (Canada)
2006-07-01
A Bayesian probabilistic inferential framework capable of incorporating errors and prior information was presented. Bayesian inference was used to find the posterior probability density function of the source parameters given a set of concentration measurements. A method of calculating the source-receptor relationship required for the determination of the direct probability was provided, which used the adjoint of the transport equation for the scalar concentration. The posterior distribution of the source parameters was sampled using a Markov chain Monte Carlo method. The inverse source determination method was validated against real data sets obtained from a highly disturbed, complex flow field in an urban environment. The data sets included a water-channel simulation of near-field dispersion of contaminant plumes in a large array of building-like obstacles and a full-scale experiment in Oklahoma City. It was concluded that the two examples validated the proposed approach for inverse source determination.
DEFF Research Database (Denmark)
Stahlhut, Carsten; Mørup, Morten; Winther, Ole;
2011-01-01
We present an approach to handle forward model uncertainty for EEG source reconstruction. A stochastic forward model representation is motivated by the many random contributions to the path from sources to measurements including the tissue conductivity distribution, the geometry of the cortical...... models. Analysis of simulated and real EEG data provide evidence that reconstruction of the forward model leads to improved source estimates....
Towards Enhanced Underwater Lidar Detection via Source Separation
Illig, David W.
Interest in underwater optical sensors has grown as technologies enabling autonomous underwater vehicles have been developed. Propagation of light through water is complicated by the dual challenges of absorption and scattering. While absorption can be reduced by operating in the blue-green region of the visible spectrum, reducing scattering is a more significant challenge. Collection of scattered light negatively impacts underwater optical ranging, imaging, and communications applications. This thesis concentrates on the ranging application, where scattering reduces operating range as well as range accuracy. The focus of this thesis is on the problem of backscatter, which can create a "clutter" return that may obscure submerged target(s) of interest. The main contributions of this thesis are explorations of signal processing approaches to increase the separation between the target and backscatter returns. Increasing this separation allows detection of weak targets in the presence of strong scatter, increasing both operating range and range accuracy. Simulation and experimental results will be presented for a variety of approaches as functions of water clarity and target position. This work provides several novel contributions to the underwater lidar field:
1. Quantification of temporal separation approaches: While temporal separation has been studied extensively, this work provides a quantitative assessment of the extent to which both high-frequency modulation and spatial filter approaches improve the separation between target and backscatter.
2. Development and assessment of frequency separation: This work includes the first frequency-based separation approach for underwater lidar, in which the channel frequency response is measured with a wideband waveform. Transforming to the time domain gives a channel impulse response, in which target and backscatter returns may appear in unique range bins and thus be separated.
3. Development and assessment of statistical
Wood ash as a magnesium source for phosphorus recovery from source-separated urine.
Sakthivel, S Ramesh; Tilley, Elizabeth; Udert, Kai M
2012-03-01
Struvite precipitation is a simple technology for phosphorus recovery from source-separated urine. However, production costs can be high if expensive magnesium salts are used as precipitants. Therefore, waste products can be interesting alternatives to industrially-produced magnesium salts. We investigated the technical and financial feasibility of wood ash as a magnesium source in India. In batch experiments with source-separated urine, we could precipitate 99% of the phosphate with a magnesium dosage of 2.7 mol Mg per mol P. The availability of the magnesium from the wood ash used in our experiment was only about 50%, but this could be increased by burning the wood at temperatures well above 600 °C. Depending on the wood ash used, the precipitate can contain high concentrations of heavy metals. This could be problematic if the precipitate were used as fertilizer, depending on the applicable fertilizer regulations. The financial study revealed that wood ash is considerably cheaper than industrially-produced magnesium sources and even cheaper than bittern. However, the solid precipitated with wood ash is not pure struvite. Due to the high calcite content and the low phosphorus content (3%), the precipitate would be better used as a phosphorus-enhanced conditioner for acidic soils. The estimated fertilizer value of the precipitate was actually slightly lower than that of wood ash, because 60% of the potassium dissolved into solution during precipitation and was not present in the final product. From a financial point of view and due to the high heavy metal content, wood ash is not a very suitable precipitant for struvite production. Phosphate precipitation from urine with wood ash can be useful if (1) a strong need for a soil conditioner that also contains phosphate exists, (2) potassium is abundant in the soil and (3) no other cheap precipitant, such as bittern or magnesium oxide, is available. PMID:22297249
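A back-of-envelope dosing calculation follows from the 2.7 mol Mg per mol P ratio reported in the abstract; the urine phosphorus concentration and the ash magnesium mass fraction below are assumed values for illustration only:

```python
# back-of-envelope ash dosing from the 2.7 mol Mg per mol P ratio in the
# abstract; P concentration and ash Mg content are assumed, not measured
MW_P, MW_Mg = 30.97, 24.31   # molar masses, g/mol
p_conc = 0.2                 # g P per litre of stored urine (assumed)
mg_per_p = 2.7               # mol Mg dosed per mol P (from the abstract)
ash_mg_frac = 0.04           # assumed Mg mass fraction of the wood ash

mol_p = p_conc / MW_P                 # mol P per litre
mg_dose = mg_per_p * mol_p * MW_Mg    # g Mg to dose per litre
ash_dose = mg_dose / ash_mg_frac      # g ash to dose per litre
```

Under these assumptions, roughly ten grams of ash per litre of urine are needed, which illustrates why the low Mg availability and the bulk of the ash matter for the financial comparison with bittern or magnesium oxide.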
Gaspar, Leticia; Owens, Philip; Petticrew, Ellen; Lobb, David; Koiter, Alexander; Reiffarth, Dominic; Barthod, Louise; Liu, Kui; Martinez-Carreras, Nuria
2015-04-01
An understanding of sediment redistribution processes and the main sediment sources within a watershed is needed to support catchment management strategies, to control soil erosion processes, and to preserve water quality and ecological status. The fingerprinting technique is increasingly recognised as a method for establishing the source of the sediment transported within a catchment. However, the different behaviour of the various fingerprinting properties has been recognised as a major limitation of the technique, and the uncertainty associated with tracer selection has to be addressed. Do the different properties give similar results? Can we combine different groups of tracers? This study aims to compare and evaluate the differences between fingerprinting predictions provided by a Bayesian mixing model using different groups of tracer properties for use in sediment source identification. We are employing fallout radionuclides (137Cs, 210Pbex) and geochemical elements as conventional fingerprinting properties, and colour parameters and compound-specific stable isotopes (CSSIs) as emerging properties; both alone and in combination. These fingerprinting properties are being used to determine the proportional contributions of fine sediment in the South Tobacco Creek Watershed, an agricultural catchment located in south-central Manitoba in Canada. We present preliminary results to evaluate the use of different statistical procedures to increase the accuracy of fingerprinting outputs and establish protocols for the selection of appropriate fingerprint properties.
Residents’ Household Solid Waste (HSW) Source Separation Activity: A Case Study of Suzhou, China
Directory of Open Access Journals (Sweden)
Hua Zhang
2014-09-01
Though the Suzhou government has provided household solid waste (HSW) source separation since 2000, the program remains largely ineffective. Between January and March 2014, the authors conducted an intercept survey in five different community groups in Suzhou, and 505 valid surveys were completed. Based on the survey, the authors used an ordered probit regression to study residents’ HSW source separation activities, both for Suzhou as a whole and for the five community groups. Results showed that 43% of the respondents in Suzhou thought they knew how to source separate HSW, and 29% of them source separated HSW accurately. The results also indicate that the current HSW source separation pilot program in Suzhou is effective, as both source separation facilities and residents’ separation behavior improved as the program was implemented. The main determinants of residents’ HSW source separation behavior are residents’ age, HSW source separation facilities and government preferential policies. Accessibility of waste management services is particularly important. Attitudes and willingness do not have significant impacts on residents’ HSW source separation behavior.
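The ordered probit regression mentioned here models an ordered response (e.g. never / sometimes / accurately separates) through a latent normal variable falling between cutpoints. A minimal log-likelihood sketch, with invented data and a single predictor, might look like:

```python
# Ordered probit log-likelihood (sketch): latent z = beta*x + eps,
# eps ~ N(0,1); observed y in {0,1,2} by where z falls between cutpoints.
# Data and coefficients below are invented for illustration.
import numpy as np
from scipy.stats import norm

def ordered_probit_loglik(beta, c1, c2, x, y):
    xb = beta * x
    # P(y=0)=Phi(c1-xb), P(y=1)=Phi(c2-xb)-Phi(c1-xb), P(y=2)=1-Phi(c2-xb)
    p = np.column_stack([norm.cdf(c1 - xb),
                         norm.cdf(c2 - xb) - norm.cdf(c1 - xb),
                         1.0 - norm.cdf(c2 - xb)])
    return float(np.log(p[np.arange(len(y)), y]).sum())

x = np.array([0.0, 1.0, 2.0, 3.0])   # e.g. an attitude score
y = np.array([0, 1, 1, 2])           # never / sometimes / accurately
print(ordered_probit_loglik(0.8, 0.5, 2.0, x, y))
```

In practice the coefficients and cutpoints would be fitted by maximizing this likelihood rather than fixed by hand.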
A Bayesian Approach to Discovering Truth from Conflicting Sources for Data Integration
Zhao, Bo; Gemmell, Jim; Han, Jiawei
2012-01-01
In practical data integration systems, it is common for the data sources being integrated to provide conflicting information about the same entity. Consequently, a major challenge for data integration is to derive the most complete and accurate integrated records from diverse and sometimes conflicting sources. We term this challenge the truth finding problem. We observe that some sources are generally more reliable than others, and therefore a good model of source quality is the key to solving the truth finding problem. In this work, we propose a probabilistic graphical model that can automatically infer true records and source quality without any supervision. In contrast to previous methods, our principled approach leverages a generative process of two types of errors (false positive and false negative) by modeling two different aspects of source quality. In so doing, ours is also the first approach designed to merge multi-valued attribute types. Our method is scalable, due to an efficient sampling-based inf...
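The paper's unsupervised truth finding uses a probabilistic graphical model; a much simpler weighted-voting caricature of the same loop (estimate each source's reliability from agreement with the consensus, then re-derive the consensus) can illustrate the idea. Sources, entities and values below are invented:

```python
# Iterative truth finding (simplified weighted-voting sketch, not the
# paper's latent-variable model). Sources claim values for entities;
# reliability and consensus are re-estimated until they stabilize.
claims = {  # source -> {entity: claimed value}; all invented
    "s1": {"e1": "a", "e2": "x"},
    "s2": {"e1": "a", "e2": "y"},
    "s3": {"e1": "b", "e2": "x"},
}
entities = ["e1", "e2"]
reliab = {s: 0.5 for s in claims}    # start with uninformative weights
for _ in range(10):
    truth = {}
    for e in entities:               # weighted vote per entity
        votes = {}
        for s, cs in claims.items():
            votes[cs[e]] = votes.get(cs[e], 0.0) + reliab[s]
        truth[e] = max(votes, key=votes.get)
    # reliability = fraction of a source's claims matching the consensus
    reliab = {s: sum(cs[e] == truth[e] for e in cs) / len(cs)
              for s, cs in claims.items()}
print(truth, reliab["s1"])
```

Here the two majority claims win and source s1, which agrees with both, ends up with the highest reliability.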
2014-01-01
Time series studies have suggested that air pollution can negatively impact health. These studies have typically focused on the total mass of fine particulate matter air pollution or the individual chemical constituents that contribute to it, and not source-specific contributions to air pollution. Source-specific contribution estimates are useful from a regulatory standpoint by allowing regulators to focus limited resources on reducing emissions from sources that are major cont...
The Effects of Environmental Management Systems on Source Separation in the Work and Home Settings
Chris von Borgstede; Maria Andersson; Ola Eriksson
2012-01-01
Measures that challenge the generation of waste are needed to address the global problem of the increasing volumes of waste that are generated in both private homes and workplaces. Source separation at the workplace is commonly implemented by environmental management systems (EMS). In the present study, the relationship between source separation at work and at home was investigated. A questionnaire that maps psychological and behavioural predictors of source separation was distributed to empl...
Hierarchical Bayesian Model for Simultaneous EEG Source and Forward Model Reconstruction (SOFOMORE)
DEFF Research Database (Denmark)
Stahlhut, Carsten; Mørup, Morten; Winther, Ole;
2009-01-01
In this paper we propose an approach to handle forward model uncertainty for EEG source reconstruction. A stochastic forward model is motivated by the many uncertain contributions that form the forward propagation model, including the tissue conductivity distribution, the cortical surface, and electrode positions. Simulations and real EEG data demonstrate that invoking a stochastic forward model leads to improved source estimates.
Directory of Open Access Journals (Sweden)
Lin Wang
2010-01-01
Frequency-domain blind source separation (BSS) performs poorly under high reverberation because the independence assumption collapses in each frequency bin as the number of bins increases. To improve the separation result, this paper proposes a method that combines two techniques, using beamforming as a preprocessor for blind source separation. With the sound source locations assumed known, the mixed signals are dereverberated and enhanced by beamforming; the beamformed signals are then further separated by blind source separation. To implement the proposed method, a superdirective fixed beamformer is designed for beamforming, and an interfrequency-dependence-based permutation alignment scheme is presented for frequency-domain blind source separation. With beamforming shortening the mixing filters and reducing noise before blind source separation, the combined method works better in reverberation. The performance of the proposed method is investigated by separating up to four sources in different environments with reverberation times from 100 ms to 700 ms. Simulation results confirm that the proposed method outperforms beamforming or blind source separation used alone. Analysis demonstrates that the proposed method is computationally efficient and appropriate for real-time processing.
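The beamforming preprocessing step can be illustrated with the simplest fixed beamformer, delay-and-sum, rather than the superdirective design used in the paper. With known integer sample delays toward the target, aligning and averaging the microphone channels passes the target while attenuating other directions:

```python
# Delay-and-sum beamformer (sketch): align channels toward a known
# direction (integer sample delays, assumed known) and average.
import numpy as np

def delay_and_sum(mics, delays):
    """mics: (n_mics, n_samples); delays: per-mic arrival delays in samples."""
    out = np.zeros(mics.shape[1])
    for sig, d in zip(mics, delays):
        out += np.roll(sig, -d)      # advance each channel by its delay
    return out / len(mics)

fs, f = 8000, 500.0
t = np.arange(0, 0.1, 1.0 / fs)
target = np.sin(2 * np.pi * f * t)
mics = np.vstack([target, np.roll(target, 3)])  # arrives 3 samples later at mic 2
y = delay_and_sum(mics, [0, 3])
print(np.allclose(y, target))        # aligned channels reconstruct the target
```

A superdirective design additionally optimizes the channel weights against a noise-field model, but the alignment idea is the same.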
Single-channel source separation using non-negative matrix factorization
DEFF Research Database (Denmark)
Schmidt, Mikkel Nørgaard
This thesis presents a number of methods for single-channel source separation based on non-negative matrix factorization. In the papers, the methods are applied to separating audio signals such as speech and musical instruments, and to separating different types of tissue in chemical shift imaging.
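The core NMF machinery behind such methods can be sketched with multiplicative updates for the Euclidean cost on a toy magnitude spectrogram, followed by Wiener-style masking to reconstruct each source. This is a generic illustration, not the thesis's specific algorithms:

```python
# NMF with multiplicative updates (Euclidean cost) on a toy magnitude
# spectrogram, plus Wiener-style masks to reconstruct each source.
# Generic illustration; sizes and data are arbitrary.
import numpy as np

rng = np.random.default_rng(0)
V = np.abs(rng.normal(size=(16, 40)))            # magnitude "spectrogram"
K = 2
W = np.abs(rng.normal(size=(16, K))) + 1e-6      # spectral bases
H = np.abs(rng.normal(size=(K, 40))) + 1e-6      # activations
err0 = np.linalg.norm(V - W @ H)
for _ in range(200):                             # multiplicative updates
    H *= (W.T @ V) / (W.T @ W @ H + 1e-12)       # keep H nonnegative
    W *= (V @ H.T) / (W @ (H @ H.T) + 1e-12)     # keep W nonnegative
approx = W @ H
err1 = np.linalg.norm(V - approx)
# Wiener-style masks: component k's share of the model at each TF cell.
sources = [np.outer(W[:, k], H[k]) / (approx + 1e-12) * V for k in range(K)]
print(err1 < err0, np.allclose(sum(sources), V))
```

The masks sum to one at every time-frequency cell, so the reconstructed sources add back up to the mixture.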
30 CFR 57.6404 - Separation of blasting circuits from power source.
2010-07-01
Title 30 — Mineral Resources; Metal and Nonmetal Mines; Explosives; Electric Blasting — Surface and Underground; § 57.6404 Separation of blasting circuits from power source. (a) Switches used to connect the power source to a blasting circuit shall...
30 CFR 56.6404 - Separation of blasting circuits from power source.
2010-07-01
Title 30 — Mineral Resources; Metal and Nonmetal Mines; Explosives; Electric Blasting; § 56.6404 Separation of blasting circuits from power source. (a) Switches used to connect the power source to a blasting circuit shall be locked in the open position...
TCA Cycle Turnover And Serum Glucose Sources By Automated Bayesian Analysis Of NMR Spectra
Merritt, Matthew E.; Burgess, Shawn; Jeffrey, F. Mark; Sherry, A. Dean; Malloy, Craig; Bretthorst, G. Larry
2004-04-01
Changes in the sources of serum glucose are indicative of a variety of pathological metabolic states. The sources of serum glucose can be measured by administering deuterated water to a subject, followed by 2H NMR analysis of the 2H enrichment levels in glucose extracted from plasma obtained in a single blood draw. Markov chain Monte Carlo simulations of the posterior probability densities may then be used to evaluate the contributions of glycogenolysis, glycerol, and the Krebs cycle to serum glucose. Experiments with simulated NMR spectra show that, for spectra with a S/N of 20 to 1, the resulting metabolic information may be evaluated with an accuracy of about 4 percent.
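The Markov chain Monte Carlo step can be illustrated with a tiny Metropolis sampler for one fractional contribution under an invented binomial-style enrichment model; the model and data below are assumptions for illustration, not the paper's NMR likelihood:

```python
# Metropolis sampler (sketch) for one fractional source contribution f,
# under an invented binomial-style likelihood -- a stand-in for the
# paper's 2H NMR model, just to show the posterior-sampling mechanics.
import numpy as np

rng = np.random.default_rng(6)
k, n = 35, 100                    # hypothetical "labeled" / total counts

def log_post(f):                  # flat prior on (0,1) + binomial likelihood
    if not 0.0 < f < 1.0:
        return -np.inf
    return k * np.log(f) + (n - k) * np.log(1.0 - f)

f, samples = 0.5, []
for _ in range(20000):            # random-walk Metropolis
    prop = f + 0.05 * rng.normal()
    if np.log(rng.random()) < log_post(prop) - log_post(f):
        f = prop
    samples.append(f)
post = np.array(samples[2000:])   # drop burn-in
print(round(float(post.mean()), 2))   # posterior mean near k/n
```

The retained samples approximate the posterior density of the fractional contribution, from which point and interval estimates follow.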
A laser ion source for on-line mass separation
Van Duppen, P; Dendooven, P; Huyse, M; Vermeeren, L; Qamhieh, Z N; Silverans, R E; Vandeweert, E
1992-01-01
A laser ion source based on resonance photo ionization in a gas cell is proposed. The gas cell, filled with helium, consists of a target chamber in which the recoil products are stopped and neutralized, and an ionization chamber where the atoms of interest are selectively ionized by the laser light.
Procedure for Separating Noise Sources in Measurements of Turbofan Engine Core Noise
Miles, Jeffrey Hilton
2006-01-01
The study of core noise from turbofan engines has become more important as noise from other sources like the fan and jet have been reduced. A multiple microphone and acoustic source modeling method to separate correlated and uncorrelated sources has been developed. The auto and cross spectrum in the frequency range below 1000 Hz is fitted with a noise propagation model based on a source couplet consisting of a single incoherent source with a single coherent source or a source triplet consisting of a single incoherent source with two coherent point sources. Examples are presented using data from a Pratt & Whitney PW4098 turbofan engine. The method works well.
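The basic tool for splitting a coherent source from uncorrelated noise with multiple microphones is the coherence function and the coherent output power; the paper's couplet/triplet propagation models build on this idea. A sketch with synthetic signals (SciPy's `welch`/`csd` estimators) at 0 dB SNR, where the expected magnitude-squared coherence is 1/4:

```python
# Coherence / coherent output power (sketch): two mics see a common
# source plus independent noise; coherence apportions the spectrum.
# At 0 dB SNR the expected magnitude-squared coherence is 1/4.
import numpy as np
from scipy.signal import csd, welch

rng = np.random.default_rng(5)
fs, n = 1024, 2 ** 15
common = rng.normal(size=n)            # coherent source at both mics
x = common + rng.normal(size=n)        # mic 1 = source + its own noise
y = common + rng.normal(size=n)        # mic 2
f, Pxx = welch(x, fs=fs, nperseg=1024)
_, Pyy = welch(y, fs=fs, nperseg=1024)
_, Pxy = csd(x, y, fs=fs, nperseg=1024)
coh = np.abs(Pxy) ** 2 / (Pxx * Pyy)   # magnitude-squared coherence
cop = coh * Pxx                        # coherent output power at mic 1
print(round(float(coh.mean()), 2))     # close to 0.25
```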
Bayesian Independent Component Analysis
DEFF Research Database (Denmark)
Winther, Ole; Petersen, Kaare Brandt
2007-01-01
In this paper we present an empirical Bayesian framework for independent component analysis. The framework provides estimates of the sources, the mixing matrix and the noise parameters, and is flexible with respect to choice of source prior and the number of sources and sensors. Inside the engine...
Overcomplete Blind Source Separation by Combining ICA and Binary Time-Frequency Masking
DEFF Research Database (Denmark)
Pedersen, Michael Syskind; Wang, DeLiang; Larsen, Jan;
2005-01-01
A limitation in many source separation tasks is that the number of source signals has to be known in advance. Further, in order to achieve good performance, the number of sources cannot exceed the number of sensors. In many real-world applications these limitations are too strict. We propose a novel method for overcomplete blind source separation. Two powerful source separation techniques have been combined: independent component analysis and binary time-frequency masking. Hereby, it is possible to iteratively extract each speech signal from the mixture. By using merely two microphones we can separate up to six mixed speech signals under anechoic conditions. The number of source signals is not assumed to be known in advance. It is also possible to maintain the extracted signals as stereo signals.
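The binary time-frequency masking half of the combination can be sketched on toy sparse "spectrograms". The mask below is an oracle mask (computed from the true sources) purely for illustration; in the paper the masks are estimated from the ICA outputs:

```python
# Binary time-frequency masking (sketch) on toy sparse "spectrograms".
# Oracle mask from the true sources, for illustration only; the paper
# estimates masks from the ICA outputs instead.
import numpy as np

rng = np.random.default_rng(1)
S1 = rng.random((8, 20)) * (rng.random((8, 20)) > 0.7)   # sparse source 1
S2 = rng.random((8, 20)) * (rng.random((8, 20)) > 0.7)   # sparse source 2
mix = S1 + S2
mask1 = np.abs(S1) > np.abs(S2)        # keep cells where source 1 dominates
est1 = mix * mask1
overlap = (S1 > 0) & (S2 > 0)
# Wherever the sources do not overlap, masking recovers source 1 exactly.
print(np.allclose(est1[~overlap], S1[~overlap]))
```

Masking works because speech is sparse in the time-frequency plane, so most cells are dominated by a single source.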
Liu, Xianhua; Randall, R. B.
2005-11-01
Internal combustion engines have several vibration sources, such as combustion, fuel injection, piston slap and valve operation. For machine condition monitoring or design improvement purposes, it is necessary to separate the vibration signals caused by the different sources and then analyse each of them individually. However, traditional frequency analysis techniques are not very useful due to the overlap of the different sources over a wide frequency range. This paper attempts to separate the vibration sources, especially piston slap, by using blind source separation techniques, with the intention of revealing the potential of the new technique for solving mechanical vibration problems. The BSS method and the blind least mean square algorithm, using Gray's variable norm as a measure of non-Gaussianity of the sources, are briefly described, and separation results for both simulated and measured data are presented and discussed.
Quantum Rate Distortion, Reverse Shannon Theorems, and Source-Channel Separation
Datta, Nilanjana; Hsieh, Min-Hsiu; Wilde, Mark M.
2013-01-01
We derive quantum counterparts of two key theorems of classical information theory, namely, the rate distortion theorem and the source-channel separation theorem. The rate-distortion theorem gives the ultimate limits on lossy data compression, and the source-channel separation theorem implies that a two-stage protocol consisting of compression and channel coding is optimal for transmitting a memoryless source over a memoryless channel. In spite of their importance in the classical domain, the...
Student support and perceptions of urine source separation in a university community.
Ishii, Stephanie K L; Boyer, Treavor H
2016-09-01
Urine source separation, i.e., the collection and treatment of human urine as a separate waste stream, has the potential to improve many aspects of water resource management and wastewater treatment. However, social considerations must be taken into account for successful implementation of this alternative wastewater system. This work evaluated the perceptions of urine source separation held by students living on campus at a major university in the southeastern United States. Perceptions were evaluated in the context of the Theory of Planned Behavior. The survey population represents one group within a community type (universities) that is expected to be an excellent testbed for urine source separation. Overall, respondents reported high levels of support for urine source separation after watching a video on expected benefits and risks; e.g., 84% indicated that they would vote in favor of urine source separation in residence halls. Support was less apparent when measured by willingness to pay, as 33% of respondents were unwilling to pay for the implementation of urine source separation and 40% were only willing to pay $1 to $10 per semester. Water conservation was largely identified as the most important benefit of urine source separation, and there was little concern reported about the use of urine-based fertilizers. Statistical analyses showed that one's environmental attitude, environmental behavior, perceptions of support within the university community, and belief that student opinions have an impact on university decision makers were significantly correlated with one's support for urine source separation. This work helps identify community characteristics that lend themselves to acceptance of urine source separation, such as those related to environmental attitudes/behaviors and perceptions of behavioral control and subjective norm. Critical aspects of these alternative wastewater systems that require attention in order to foster public
Blind Source Separation Based on Covariance Ratio and Artificial Bee Colony Algorithm
Directory of Open Access Journals (Sweden)
Lei Chen
2014-01-01
The computational cost of blind source separation based on bio-inspired intelligence optimization is high. To address this problem, we propose an effective blind source separation algorithm based on the artificial bee colony algorithm. In the proposed algorithm, the covariance ratio of the signals is used as the objective function, and the artificial bee colony algorithm is used to optimize it. The separated source signal component is then removed from the mixtures using the deflation method. All the source signals can be recovered by repeating the separation process. Simulation experiments demonstrate that the proposed algorithm achieves significant improvements in computational cost and separation quality compared to previous algorithms.
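The deflation step, removing an extracted source from the mixtures before the next extraction, can be sketched with a least-squares projection. The bee-colony optimizer itself is not reproduced; an oracle source stands in for the extracted component:

```python
# Deflation by projection (sketch): remove an already-extracted source
# from the mixtures, then repeat extraction on the residual. An oracle
# source stands in for the bee-colony extraction step.
import numpy as np

def deflate(X, s):
    """Project source s (1-D) out of each mixture row of X."""
    u = s / np.linalg.norm(s)
    return X - np.outer(X @ u, u)

rng = np.random.default_rng(2)
s1, s2 = rng.normal(size=100), rng.normal(size=100)
A = np.array([[1.0, 0.5], [0.3, 1.0]])          # mixing matrix (invented)
X = A @ np.vstack([s1, s2])
R = deflate(X, s1)
print(np.allclose(R @ s1, 0.0, atol=1e-8))      # residual orthogonal to s1
```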
Empirical Study on Factors Influencing Residents' Behavior of Separating Household Wastes at Source
Institute of Scientific and Technical Information of China (English)
Qu Ying; Zhu Qinghua; Murray Haight
2007-01-01
Source separation is the basic premise for making effective use of household wastes. In eight cities in China, however, several pilot projects on source separation ultimately failed because of poor resident participation rates. To solve this problem, identifying the factors that influence residents' source separation behavior becomes crucial. By means of a questionnaire survey, we conducted descriptive analysis and exploratory factor analysis. The results show that perceived inconvenience, moral norms, environmental protection, public education, environmental values and lack of knowledge are the main factors in residents' decisions to separate their household wastes. The influencing power of these six factors is also analyzed according to their percentage contribution to overall source separation behavior, providing suggestions on household waste management for policy makers and decision makers in China.
Source Separation and Higher-Order Causal Analysis of MEG and EEG
Zhang, Kun
2012-01-01
Separation of the sources and analysis of their connectivity have been an important topic in EEG/MEG analysis. To solve this problem in an automatic manner, we propose a two-layer model, in which the sources are conditionally uncorrelated from each other, but not independent; the dependence is caused by the causality in their time-varying variances (envelopes). The model is identified in two steps. We first propose a new source separation technique which takes into account the autocorrelations (which may be time-varying) and time-varying variances of the sources. The causality in the envelopes is then discovered by exploiting a special kind of multivariate GARCH (generalized autoregressive conditional heteroscedasticity) model. The resulting causal diagram gives the effective connectivity between the separated sources; in our experimental results on MEG data, sources with similar functions are grouped together, with negative influences between groups, and the groups are connected via some interesting sources.
Semi-blind Source Separation Using Head-Related Transfer Functions
DEFF Research Database (Denmark)
Pedersen, Michael Syskind; Hansen, Lars Kai; Kjems, Ulrik;
2004-01-01
An online blind source separation algorithm, which is a special case of the geometric algorithm by Parra and Fancourt, has been implemented for the purpose of separating sounds recorded at microphones placed at each side of the head. Using the assumption that the positions of the two sound sources are known, the source separation algorithm has been geometrically constrained. Since the separation takes place in a non-free-field, a head-related transfer function (HRTF) is used to simulate the response between microphones placed at the two ears. The use of an HRTF instead of assuming free-field improves...
Separation of radiation from two sources from their known radiated sum field
DEFF Research Database (Denmark)
Laitinen, Tommi; Pivnenko, Sergey
2011-01-01
This paper presents a technique for complete and exact separation of the radiated fields of two sources (at the same frequency) from the knowledge of their radiated sum field. The two sources can be arbitrary but it must be possible to enclose the sources inside their own non-intersecting minimum...
Monaural separation of dependent audio sources based on a generalized Wiener filter
DEFF Research Database (Denmark)
Ma, Guilin; Agerkvist, Finn T.; Luther, J.B.
2007-01-01
In the first stage, the distribution of the transform coefficients of the dependent sources is modeled by complex Gaussian mixture models in the frequency domain, from samples of the individual sources, to capture the properties of the sources and their correlation. In the second stage, the mixture is separated through a generalized Wiener filter, which takes...
Xu, Wanying; Zhou, Chuanbin; Lan, Yajun; Jin, Jiasheng; Cao, Aixin
2015-05-01
Municipal solid waste (MSW) management (MSWM) is among the most important and challenging tasks in large urban communities. Sound community-based waste management systems normally include waste reduction and material recycling elements, often entailing the separation of recyclable materials by the residents. To increase the efficiency of source separation and recycling, an incentive-based source separation model was designed and tested in 76 households in Guiyang, a city of almost three million people in southwest China. This model embraced the concepts of rewarding households for sorting organic waste, government funds for waste reduction, and introducing small recycling enterprises to promote source separation. Results show that after one year of operation, the waste reduction rate was 87.3%, and the comprehensive net benefit under the incentive-based source separation model increased by 18.3 CNY tonne(-1) (2.4 Euros tonne(-1)) compared to that under the normal model. The stakeholder analysis (SA) shows that the centralized MSW disposal enterprises had minimal interest and may oppose the start-up of a new recycling system, while small recycling enterprises had a primary interest in promoting the incentive-based source separation model but the least ability to change the current recycling system. Strategies for promoting this incentive-based source separation model are also discussed in this study. PMID:25819930
Granade, Christopher; Cory, D G
2015-01-01
In recent years, Bayesian methods have been proposed as a solution to a wide range of issues in quantum state and process tomography. State-of-the-art Bayesian tomography solutions suffer from three problems: numerical intractability, a lack of informative prior distributions, and an inability to track time-dependent processes. Here, we solve all three problems. First, we use modern statistical methods, as pioneered by Huszár and Houlsby and by Ferrie, to make Bayesian tomography numerically tractable. Our approach allows for practical computation of Bayesian point and region estimators for quantum states and channels. Second, we propose the first informative priors on quantum states and channels. Finally, we develop a method that allows online tracking of time-dependent states and estimates the drift and diffusion processes affecting a state. We provide source code and animated visual examples for our methods.
Blind separation of sound sources from the principle of least spatial entropy
Dong, Bin; Antoni, Jérôme; Zhang, Erliang
2014-04-01
The aim of the paper is to offer a method for separating incoherent and compact sound sources which may overlap in both the space and frequency domains. This is of interest in acoustical applications involving the identification and ranking of sound sources stemming from different physical origins. The principle proceeds in two steps, the first one being reminiscent of source reconstruction (e.g. as in near-field acoustical holography) and the second one of blind source separation. Specifically, the source mixture is first expanded into a linear combination of spatial basis functions whose coefficients are set by backpropagating the pressures measured by an array of microphones to the source domain. This leads to a formulation similar, but not identical, to blind source separation. In the second step, these coefficients are blindly separated into uncorrelated latent variables, assigned to incoherent "virtual sources". These are shown to be defined up to an arbitrary rotation. A unique set of sound sources is finally recovered by searching for the rotation (by conjugate gradient descent in the Stiefel manifold of unitary matrices) which maximizes their spatial compactness, as measured either by their spatial variance or their spatial entropy. This results in the proposal of two separation criteria, coined "least spatial variance" and "least spatial entropy", respectively. The same concept of spatial entropy, which is central to the paper, is also exploited in defining a new criterion, the entropic L-curve, dedicated to determining the number of active sound sources. The idea consists in considering the number of sources that achieves the best compromise between a low spatial entropy (as expected from compact sources) and a low statistical entropy (as expected from a low residual error). The proposed methodology is validated on both laboratory experiments and numerical data, and illustrated on an industrial example concerned with the ranking of sound sources on
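The spatial entropy that the "least spatial entropy" criterion minimizes can be sketched directly: normalize a source's spatial power distribution to a probability mass function and take its Shannon entropy, which is lower for compact sources (the normalization here is an illustrative choice, not necessarily the paper's):

```python
# Spatial entropy as a compactness measure (sketch): normalize a source's
# spatial power to a probability mass function, take Shannon entropy.
# Lower entropy = more compact source; the normalization is illustrative.
import numpy as np

def spatial_entropy(power):
    p = power / power.sum()
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

compact = np.zeros(100); compact[40:45] = 1.0   # energy in 5 cells
spread = np.ones(100)                           # energy everywhere
print(spatial_entropy(compact) < spatial_entropy(spread))
```

Here the compact source scores ln 5 against ln 100 for the fully spread one, which is why minimizing spatial entropy favours compact solutions.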
Source Separation and Clustering of Phase-Locked Subspaces: Derivations and Proofs
Almeida, Miguel; Schleimer, Jan-Hendrik; Bioucas-Dias, José; Vigário, Ricardo
2011-01-01
Due to space limitations, our submission "Source Separation and Clustering of Phase-Locked Subspaces", accepted for publication on the IEEE Transactions on Neural Networks in 2011, presented some results without proof. Those proofs are provided in this paper.
Fate of pharmaceuticals in full-scale source separated sanitation system
Butkovskyi, A.; Hernandez Leal, L.; Rijnaarts, H.H.M.; Zeeman, G.
2015-01-01
Removal of 14 pharmaceuticals and 3 of their transformation products was studied in a full-scale source separated sanitation system with separate collection and treatment of black water and grey water. Black water is treated in an up-flow anaerobic sludge blanket (UASB) reactor followed by oxygen
Phase recovery in NMF for audio source separation: an insightful benchmark
Magron, Paul; Badeau, Roland; David, Bertrand
2016-01-01
Nonnegative Matrix Factorization (NMF) is a powerful tool for decomposing mixtures of audio signals in the Time-Frequency (TF) domain. In applications such as source separation, the phase recovery for each extracted component is a major issue since it often leads to audible artifacts. In this paper, we present a methodology for evaluating various NMF-based source separation techniques involving phase reconstruction. For each model considered, a comparison between two approaches (blind separat...
Directory of Open Access Journals (Sweden)
Wang Pidong
2016-01-01
Blind source separation is a hot topic in signal processing. Most existing work focuses on linearly combined signals, while in practice nonlinearly mixed signals are often encountered. To address the problem of nonlinear source separation, in this paper we propose a novel algorithm using a radial basis function neural network, optimized by a multi-universe parallel quantum genetic algorithm. Experiments show the efficiency of the proposed method.
A cost evaluation method for transferring municipalities to solid waste source-separated system.
Lavee, Doron; Nardiya, Shlomit
2013-05-01
Most of Israel's waste is disposed of in landfills, threatening scarce land resources and posing environmental and health risks. The aim of this study is to estimate the expected costs of transferring municipalities to solid waste source separation in Israel, aimed at reducing the amount of waste directed to landfills and increasing the efficiency and amount of recycled waste. Information on the expected costs of operating a solid waste source separation system was gathered from 47 municipalities and compiled into a database, taking into consideration factors such as the costs of equipment, construction adjustments, and waste collection and disposal. This database may serve as a model for estimating the costs of entering the waste source separation system for any municipality in Israel, while taking into consideration its specific characteristics, such as size and region. The model was used in Israel to determine municipalities' eligibility to receive a governmental grant for entering an accelerated process of solid waste source separation. This study presents a simple, user-friendly operational tool for assessing municipalities' costs of entering a process of waste source separation, providing policy makers a powerful tool for diverting funds effectively to promote solid waste source separation. PMID:23465315
Separation of Correlated Astrophysical Sources Using Multiple-Lag Data Covariance Matrices
Directory of Open Access Journals (Sweden)
Baccigalupi C
2005-01-01
This paper proposes a new strategy to separate astrophysical sources that are mutually correlated. The strategy is based on second-order statistics and exploits prior information about the possible structure of the mixing matrix. Unlike ICA blind separation approaches, where the sources are assumed mutually independent and no prior knowledge about the mixing matrix is assumed, our strategy allows the independence assumption to be relaxed and performs the separation of even significantly correlated sources. Besides the mixing matrix, our strategy is also capable of evaluating the source covariance functions at several lags. Moreover, once the mixing parameters have been identified, a simple deconvolution can be used to estimate the probability density functions of the source processes. To benchmark our algorithm, we used a database that simulates the one expected from the instruments that will operate onboard ESA's Planck Surveyor satellite to measure the CMB anisotropies over the whole celestial sphere.
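The simplest second-order-statistics separator in this family is the AMUSE procedure: whiten the mixtures, then eigendecompose a symmetrized lagged covariance. Unlike the paper's method it assumes uncorrelated sources and no prior on the mixing matrix, but it shows why lagged covariances identify the unmixing rotation:

```python
# AMUSE-style separation (sketch): whiten, then eigendecompose the
# symmetrized lag-1 covariance. Assumes uncorrelated sources with
# distinct autocorrelations -- simpler than the paper's method.
import numpy as np

rng = np.random.default_rng(3)
n = 5000

def ar1(a):                       # AR(1) source with pole a
    e, s = rng.normal(size=n), np.zeros(n)
    for t in range(1, n):
        s[t] = a * s[t - 1] + e[t]
    return s

S = np.vstack([ar1(0.9), ar1(-0.5)])            # distinct lag-1 correlations
A = np.array([[1.0, 0.6], [0.4, 1.0]])          # mixing matrix (invented)
X = A @ S
X -= X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(X @ X.T / n)              # whitening
Z = E @ np.diag(d ** -0.5) @ E.T @ X
C1 = Z[:, 1:] @ Z[:, :-1].T / (n - 1)           # lag-1 covariance
_, U = np.linalg.eigh((C1 + C1.T) / 2)          # unmixing rotation
Y = U.T @ Z                                     # recovered (order/sign free)
corr = np.abs(np.corrcoef(np.vstack([Y, S]))[:2, 2:])
print((corr.max(axis=1) > 0.9).all())
```

Because the two sources have different lag-1 autocorrelations, the eigenvalues are distinct and the rotation is identifiable; multiple-lag methods generalize this by diagonalizing several lags jointly.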
A Modified Infomax ICA Algorithm for fMRI Data Source Separation
Directory of Open Access Journals (Sweden)
Amir A. Khaliq
2013-05-01
This study presents a modified infomax model of Independent Component Analysis (ICA) for the source separation problem in fMRI data. Functional MRI data are processed by different blind source separation techniques, including Independent Component Analysis (ICA). ICA is a statistical decomposition method used for multivariate data source separation. ICA algorithms are based on the independence of the extracted sources, for which different measures are used, such as kurtosis, negentropy and information maximization. The infomax method of ICA extracts unknown sources from a number of mixtures by maximizing negentropy, thus ensuring independence. In the proposed modified infomax model, a higher-order contrast function is used, which results in fast convergence and improved accuracy. The proposed algorithm is applied to general simulated signals and simulated fMRI data. Comparison of the correlation results of the proposed algorithm with the conventional infomax algorithm shows better performance.
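The conventional infomax baseline that the paper modifies can be sketched with the natural-gradient update W <- W + mu * (I - tanh(Y) Y^T / n) W on whitened mixtures of super-Gaussian (Laplacian) sources; the paper's higher-order contrast function is not reproduced here:

```python
# Natural-gradient infomax ICA (sketch of the conventional baseline):
# W <- W + mu * (I - tanh(Y) Y^T / n) W on whitened mixtures of
# super-Gaussian (Laplacian) sources. Data are synthetic.
import numpy as np

rng = np.random.default_rng(4)
n = 20000
S = np.vstack([rng.laplace(size=n), rng.laplace(size=n)])
A = np.array([[1.0, 0.5], [0.7, 1.0]])          # mixing matrix (invented)
X = A @ S
X -= X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(X @ X.T / n)              # whiten first
X = E @ np.diag(d ** -0.5) @ E.T @ X
W = np.eye(2)
for _ in range(300):                            # natural-gradient steps
    Y = W @ X
    W += 0.1 * (np.eye(2) - np.tanh(Y) @ Y.T / n) @ W
Y = W @ X
corr = np.abs(np.corrcoef(np.vstack([Y, S]))[:2, 2:])
print((corr.max(axis=1) > 0.9).all())
```

The tanh score matches super-Gaussian sources such as these; a modified contrast function would replace the tanh term in the update.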
Servière, C.; Lacoume, J.-L.; El Badaoui, M.
2005-11-01
This paper is devoted to the blind separation of combustion noise and piston slap in diesel engines. The two phenomena are recovered only from signals issued from accelerometers placed on one of the cylinders. A blind source separation (BSS) method is developed, based on a convolutive model of non-stationary mixtures. We introduce a new method based on the joint diagonalisation of the time-varying spectral matrices of the observation records, and a new technique to handle the problem of permutation ambiguity in the frequency domain. This method is then applied to real data, and the estimated sources are validated by several physical arguments. Thus, the contributions of piston slap and combustion noise can be recovered for all the sensors, and the energy of the two phenomena can be given with respect to the position of the accelerometers.
Bamber, J. L.; Schoen, N.; Zammit-Mangion, A.; Rougier, J.; Flament, T.; Luthcke, S. B.; Petrie, E. J.; Rémy, F.
2013-12-01
There remains considerable inconsistency between different methods and approaches for determining ice mass trends for Antarctica from satellite observations. There are three approaches that can provide near-global coverage for mass trends: altimetry, gravimetry and mass budget calculations. All three approaches suffer from a source separation problem where other geophysical processes limit the capability of the method to resolve the origin and magnitude of a mass change. A fourth approach, GPS vertical motion, provides localised estimates of mass change due to elastic uplift and an indirect estimate of GIA. Each approach has different source separation issues and different spatio-temporal error characteristics. In principle, it should be possible to combine the data and process covariances to minimize the uncertainty in the solution and to produce robust posterior errors for the trends. In practice, this is a challenging problem in statistics because of the large number of degrees of freedom, the variable spatial and temporal sampling between the different observations, and the fact that some processes, such as firn compaction, remain under-sampled. Here, we present a novel solution to this problem using the latest methods in statistical modelling of spatio-temporal processes. We use Bayesian hierarchical modelling and employ stochastic partial differential equations to capture our physical understanding of the key processes that influence our observations. Due to the huge number of observations involved (>10^8), methods are required to reduce the dimensionality of the problem, and care is required in the treatment of the observations as they are not independent. Here, we focus mainly on the results rather than the full suite of methods, and we present time-evolving fields of surface mass balance, ice-dynamic-driven mass loss, and firn compaction for the period 2003-2009, derived from a combination of ICESat, ENVISAT, GRACE, InSAR, GPS and regional climate model output.
Prospects of Source-Separation-Based Sanitation Concepts: A Model-Based Study
Directory of Open Access Journals (Sweden)
Cees Buisman
2013-07-01
Separation of different domestic wastewater streams and targeted on-site treatment for resource recovery has been recognized as one of the most promising sanitation concepts to re-establish the balance in carbon, nutrient and water cycles. In this study a model was developed based on literature data to compare the energy and water balance, nutrient recovery, chemical use, effluent quality and land area requirement of four different sanitation concepts: (1) centralized; (2) centralized with source separation of urine; (3) source separation of black water, kitchen refuse and grey water; and (4) source separation of urine, feces, kitchen refuse and grey water. The highest primary energy consumption, 914 MJ per capita (cap) per year, was attained within the centralized sanitation concept, and the lowest, 437 MJ/cap/year, within source separation of urine, feces, kitchen refuse and grey water. Grey water bio-flocculation and subsequent grey water sludge co-digestion decreased the primary energy consumption, but it was not energetically favorable to couple this with grey water effluent reuse. Source separation of urine improved the energy balance, nutrient recovery and effluent quality, but required a larger land area and higher chemical use than the centralized concept.
A Computational Auditory Scene Analysis-Enhanced Beamforming Approach for Sound Source Separation
Directory of Open Access Journals (Sweden)
L. A. Drake
2009-01-01
Hearing aid users have difficulty hearing target signals, such as speech, in the presence of competing signals or noise. Most solutions proposed to date enhance or extract target signals from background noise and interference based on either location attributes or source attributes. Location attributes typically involve arrival angles at a microphone array. Source attributes include characteristics that are specific to a signal, such as fundamental frequency, or statistical properties that differentiate signals. This paper describes a novel approach to sound source separation, called computational auditory scene analysis-enhanced beamforming (CASA-EB), that achieves increased separation performance by combining the complementary techniques of CASA (a source-attribute technique) with beamforming (a location-attribute technique), complementary in the sense that they use independent attributes for signal separation. CASA-EB performs sound source separation by temporally and spatially filtering a multichannel input signal, and then grouping the resulting signal components into separated signals based on source and location attributes. Experimental results show an increased signal-to-interference ratio with CASA-EB over beamforming or CASA alone.
Non-Stationary Brain Source Separation for Multi-Class Motor Imagery
Gouy-Pailler, Cedric; Congedo, Marco; Brunner, Clemens; Jutten, Christian; Pfurtscheller, Gert
2010-01-01
This article describes a method to recover task-related brain sources in the context of multi-class Brain-Computer Interfaces (BCIs) based on non-invasive electroencephalography (EEG). We extend the method of Joint Approximate Diagonalization (JAD) for spatial filtering using a maximum likelihood framework. This generic formulation (1) bridges the gap between Common Spatial Patterns (CSP) and Blind Source Separation (BSS) of non-stationary sources, and (2) leads to ...
ICAR, a tool for Blind Source Separation using Fourth Order Statistics only
Albera, Laurent; Férreol, Anne; Chevalier, Pascal; Comon, Pierre
2005-01-01
The problem of blind separation of overdetermined mixtures of sources, that is, with fewer sources than (or as many sources as) sensors, is addressed in this paper. A new method, named ICAR (Independent Component Analysis using Redundancies in the quadricovariance), is proposed in order to process complex data. This method, without any whitening operation, only exploits some redundancies of a particular quadricovariance matrix of the data. Computer simulations demonstrate that ICAR offers in ...
Lesaffre, Emmanuel
2012-01-01
The growth of biostatistics has been phenomenal in recent years and has been marked by considerable technical innovation in both methodology and computational practicality. One area that has experienced significant growth is Bayesian methods. The growing use of Bayesian methodology has taken place partly due to an increasing number of practitioners valuing the Bayesian paradigm as matching that of scientific discovery. In addition, computational advances have allowed for more complex models to be fitted routinely to realistic data sets. Through examples, exercises and a combination of introd
Institute of Scientific and Technical Information of China (English)
Yunhan Luo; Houxin Cui; Xiaoyu Gu; Rong Liu; Kexin Xu
2005-01-01
Based on an analysis of the relation between mean penetration depth and source-detector separation in a three-layer model using Monte-Carlo simulation, an optimal source-detector separation is derived from the mean penetration depth for monitoring the change in chromophore concentration of the sandwiched layer. To verify the separation, we perform Monte-Carlo simulations with varied absorption coefficients of the sandwiched layer. All these diffuse reflectances are used to construct a calibration model with the method of partial least squares (PLS). High correlation coefficients and a low root mean square error of prediction (RMSEP) at the optimal separation confirm the correctness of the selection. This technique is expected to shed light on noninvasive diagnosis by near-infrared spectroscopy.
Draper, D.
2001-01-01
Article Outline: Glossary; Definition of the Subject and Introduction; The Bayesian Statistical Paradigm; Three Examples; Comparison with the Frequentist Statistical Paradigm; Future Directions; Bibliography
Gran, Bjørn Axel
2002-01-01
The objective of the research has been to investigate the possibility to transfer the requirements of a software safety standard into Bayesian belief networks (BBNs). The BBN methodology has mainly been developed and applied in the AI society, but more recently it has been proposed to apply it to the assessment of programmable systems. The relation to AI application is relevant in the sense that the method reflects the way of an assessor's thinking during the assessment process. Conceptually,...
Subband-based Single-channel Source Separation of Instantaneous Audio Mixtures
Taghia, Jalil; Doostari, Mohammad Ali
2009-01-01
In this paper, a new algorithm is developed to separate audio sources from a single instantaneous mixture. The algorithm is based on subband decomposition and uses a hybrid system of Empirical Mode Decomposition (EMD) and Principal Component Analysis (PCA) to construct artificial observations from the single mixture. In the separation stage of the algorithm, we use Independent Component Analysis (ICA) to find independent components. At first the observed mixture is divided into a finite numbe...
Blind source separation of ship-radiated noise based on generalized Gaussian model
Institute of Scientific and Technical Information of China (English)
Kong Wei; Yang Bin
2006-01-01
When the distribution of the sources cannot be estimated accurately, ICA algorithms fail to separate the mixtures blindly. The generalized Gaussian model (GGM) is introduced into the ICA algorithm since it can easily model the non-Gaussian statistical structure of different source signals. By inferring only one parameter, a wide class of statistical distributions can be characterized. Using a maximum likelihood (ML) approach and natural gradient descent, the learning rules of blind source separation (BSS) based on the GGM are presented. The experiment on ship-radiated noise demonstrates that the GGM can model the distributions of ship-radiated noise and sea noise efficiently, and that the learning rules based on the GGM give more successful separation results than several conventional methods such as higher-order cumulants and the Gaussian mixture density function.
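The ML learning rule described above hinges on the score function of the generalized Gaussian density. A minimal sketch, assuming unit scale and a single shared shape parameter beta (the paper's setup is richer), might look like:

```python
import numpy as np

def gg_score(y, beta):
    """Score function phi(y) = -d/dy log p(y) for a zero-mean
    generalized Gaussian density p(y) proportional to exp(-|y|**beta),
    unit scale assumed. beta < 2 models super-Gaussian sources,
    beta > 2 sub-Gaussian ones (beta = 2 is Gaussian)."""
    return beta * np.sign(y) * np.abs(y) ** (beta - 1)

def ggm_bss_step(W, X, beta, lr=0.05):
    """One natural-gradient ML update of the unmixing matrix W with a
    GGM source prior (illustrative single shared beta)."""
    n, T = X.shape
    Y = W @ X
    phi = gg_score(Y, beta)
    # Natural-gradient rule: dW = (I - E[phi(y) y^T]) W
    return W + lr * (np.eye(n) - (phi @ Y.T) / T) @ W
```

With beta = 1 the score reduces to sign(y), i.e., the standard Laplacian ML rule; iterating the step separates super-Gaussian mixtures.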
Institute of Scientific and Technical Information of China (English)
黄晋英; 潘宏侠; 毕世华; 杨喜旺
2008-01-01
Blind source separation (BSS) technology was applied to vibration signal processing of a gearbox for separating different fault vibration sources and enhancing fault information. An improved BSS algorithm based on particle swarm optimization (PSO) is proposed. It changes the traditional de-noising-based approach to fault enhancement, and it addresses the practical difficulties of fault location and the low fault diagnosis rate in the early stage. It was applied to the vibration signal of a gearbox under three working states. The results prove that the BSS greatly enhances fault information and supplies a technological method for the diagnosis of weak faults.
Separation of beam and electrons in the spallation neutron source H- ion source
International Nuclear Information System (INIS)
The Spallation Neutron Source (SNS) requires an ion source producing an H- beam with a peak current of 35 mA at a 6.2% duty factor. For the design of this ion source, extracted electrons must be transported and dumped without adversely affecting the H- beam optics. Two issues are considered: (1) electron containment, transport and controlled removal; and (2) first-order H- beam steering. For electron containment, various magnetic, geometric and electrode biasing configurations are analyzed. A kinetic description for the negative ions and electrons is employed, with self-consistent fields obtained from a steady-state solution to Poisson's equation. Guiding-center electron trajectories are used when the gyroradius is sufficiently small. The magnetic fields used to control the transport of the electrons, and the asymmetric sheath produced by the gyrating electrons, steer the ion beam. Scenarios for correcting this steering by split acceleration and focusing electrodes are considered in some detail.
Blind Source Separation with Conjugate Gradient Algorithm and Kurtosis Maximization Criterion
Directory of Open Access Journals (Sweden)
Sanjeev N Jain
2016-02-01
Blind source separation (BSS) is a technique for estimating individual source components from their mixtures at multiple sensors. It is called blind because no information other than the mixtures themselves is used. Recently, blind source separation has received attention because of its potential applications in signal processing, such as speech recognition systems, telecommunications and medical signal processing. Blind source separation of super- and sub-Gaussian signals is proposed utilizing a conjugate gradient algorithm and a kurtosis maximization criterion. In our previous paper, the ABC algorithm was utilized for blind source separation; here, we improve the technique with changes to the fitness function and the scout bee phase. The fitness function is improved with the use of the kurtosis maximization criterion, and the scout bee phase is improved with the use of the conjugate gradient algorithm. The evaluation metrics used for performance evaluation are fitness function values and distance values. A comparative analysis is also carried out by comparing the proposed technique to other prominent techniques. The technique achieved an average distance of 38.39, an average fitness value of 6.94, an average Gaussian distance of 58.60 and an average Gaussian fitness of 5.02. It attained the lowest average distance value among all techniques and good values for all other evaluation metrics, which shows the effectiveness of the proposed technique.
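Kurtosis, the contrast maximized in this work, can be estimated from samples as follows. This is the standard sample estimator of excess kurtosis, not the authors' full fitness function:

```python
import numpy as np

def excess_kurtosis(y):
    """Sample excess kurtosis E[(y - mean)^4] / Var(y)^2 - 3.
    Positive for super-Gaussian signals, negative for sub-Gaussian
    signals, and approximately zero for Gaussian noise."""
    y = y - y.mean()
    return (y ** 4).mean() / (y ** 2).mean() ** 2 - 3.0
```

Because Gaussian signals have zero excess kurtosis, driving |kurtosis| away from zero is a common independence criterion in BSS.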
Gearbox Fault Diagnosis in a Wind Turbine Using Single Sensor Based Blind Source Separation
Directory of Open Access Journals (Sweden)
Yuning Qian
2016-01-01
This paper presents a single-sensor-based blind source separation approach, namely, the wavelet-assisted stationary subspace analysis (WSSA), for gearbox fault diagnosis in a wind turbine. Continuous wavelet transform (CWT) is used as a preprocessing tool to decompose a single-sensor measurement into a set of wavelet coefficients to meet the multidimensional requirement of stationary subspace analysis (SSA). SSA is a blind source separation technique that can separate multidimensional signals into stationary and nonstationary source components without the need for independence or prior information about the source signals. After that, the separated nonstationary source component with the maximum kurtosis value is analyzed by envelope spectral analysis to identify potential fault-related characteristic frequencies. Case studies performed on a wind turbine gearbox test system verify the effectiveness of the WSSA approach and indicate that it outperforms independent component analysis (ICA) and empirical mode decomposition (EMD), as well as spectral-kurtosis-based enveloping, for wind turbine gearbox fault diagnosis.
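The envelope spectral analysis step mentioned above is commonly implemented via the analytic signal. The following numpy-only sketch, using an FFT-based Hilbert transform (an assumption about the implementation, not the paper's exact code), recovers the modulation frequency of an amplitude-modulated carrier, which is how fault-related modulation frequencies are exposed in bearing and gear signals:

```python
import numpy as np

def envelope_spectrum(x, fs):
    """Amplitude envelope spectrum of a real signal x sampled at fs Hz.

    Builds the analytic signal with an FFT-based Hilbert transform,
    takes its magnitude (the envelope), removes the mean, and returns
    (freqs, magnitude spectrum) of the envelope.
    """
    n = x.size
    Xf = np.fft.fft(x)
    # One-sided analytic-signal filter: keep DC/Nyquist, double positives
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    analytic = np.fft.ifft(Xf * h)
    env = np.abs(analytic)
    env = env - env.mean()
    spec = np.abs(np.fft.rfft(env)) / n
    freqs = np.fft.rfftfreq(n, 1.0 / fs)
    return freqs, spec
```

For a 100 Hz carrier amplitude-modulated at 7 Hz, the dominant peak of the envelope spectrum falls at 7 Hz rather than at the carrier frequency.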
Separating Turbofan Engine Noise Sources Using Auto and Cross Spectra from Four Microphones
Miles, Jeffrey Hilton
2008-01-01
The study of core noise from turbofan engines has become more important as noise from other sources, such as the fan and jet, has been reduced. A multiple-microphone and acoustic-source modeling method to separate correlated and uncorrelated sources is discussed. The auto- and cross-spectra in the frequency range below 1000 Hz are fitted with a noise propagation model based on either a source couplet, consisting of a single incoherent monopole source with a single coherent monopole source, or a source triplet, consisting of a single incoherent monopole source with two coherent monopole point sources. Examples are presented using data from a Pratt & Whitney PW4098 turbofan engine. The method separates the low-frequency jet noise from the core noise at the nozzle exit. It is shown that at low power settings the core noise is a major contributor to the noise; even at higher power settings, it can be more important than jet noise. However, at low frequencies, uncorrelated broadband noise and jet noise become the important factors as the engine power setting is increased.
Ahuja, Chaitanya; Nathwani, Karan; Rajesh M. Hegde
2014-01-01
Conventional NMF methods for source separation factorize the matrix of spectral magnitudes. Spectral Phase is not included in the decomposition process of these methods. However, phase of the speech mixture is generally used in reconstructing the target speech signal. This results in undesired traces of interfering sources in the target signal. In this paper the spectral phase is incorporated in the decomposition process itself. Additionally, the complex matrix factorization problem is reduce...
On-line isotope separation. Tests for targets and ion sources compatibility
International Nuclear Information System (INIS)
We have compiled the influence of various parameters on suitable targets (composition, structure and nuclear constraints) for fission and spallation reactions induced by charged particles. In that case, targets are generally located near or inside the ionization chamber. A survey of typical ion sources and separators, particularly those used with heavy-ion beams, is given. These sources are often fed either by a helium-jet transport system or by a catcher foil.
Directory of Open Access Journals (Sweden)
Takuya Isomura
2015-12-01
Blind source separation is the computation underlying the cocktail party effect: a partygoer can distinguish a particular talker's voice from the ambient noise. Early studies indicated that the brain might use blind source separation as a signal processing strategy for sensory perception, and numerous mathematical models have been proposed; however, it remains unclear how neural networks extract particular sources from a complex mixture of inputs. We discovered that neurons in cultures of dissociated rat cortical cells could learn to represent particular sources while filtering out other signals. Specifically, distinct classes of neurons in the culture learned to respond to the distinct sources after repeated training stimulation. Moreover, the neural network structures changed to reduce free energy, as predicted by the free-energy principle, a candidate unified theory of learning and memory, and by Jaynes' principle of maximum entropy. This implicit learning can only be explained by some form of Hebbian plasticity. These results are the first in vitro (as opposed to in silico) demonstration of neural networks performing blind source separation, and the first formal demonstration of neuronal self-organization under the free-energy principle.
Oweiss, Karim G.; Anderson, David J.
2006-12-01
We investigate a new approach for the problem of source separation in correlated multichannel signal and noise environments. The framework targets the specific case when nonstationary correlated signal sources contaminated by additive correlated noise impinge on an array of sensors. Existing techniques targeting this problem usually assume signal sources to be independent, and the contaminating noise to be spatially and temporally white, thus enabling orthogonal signal and noise subspaces to be separated using conventional eigendecomposition. In our context, we propose a solution to the problem when the sources are nonorthogonal, and the noise is correlated with an unknown temporal and spatial covariance. The approach is based on projecting the observations onto a nested set of multiresolution spaces prior to eigendecomposition. An inherent invariance property of the signal subspace is observed in a subset of the multiresolution spaces that depends on the degree of approximation expressed by the orthogonal basis. This feature, among others revealed by the algorithm, is eventually used to separate the signal sources in the context of "best basis" selection. The technique shows robustness to source nonstationarities as well as anisotropic properties of the unknown signal propagation medium under no constraints on the array design, and with minimal assumptions about the underlying signal and noise processes. We illustrate the high performance of the technique on simulated and experimental multichannel neurophysiological data measurements.
Yuan, Yalin; Yabe, Mitsuyasu
2014-01-01
A source separation program for household kitchen waste has been in place in Beijing since 2010. However, the participation rate of residents is far from satisfactory. This study was carried out to identify residents' preferences under an improved management strategy for household kitchen waste source separation. We determine the preferences of residents in an ad hoc sample, according to their age level, for source separation services, and their marginal willingness to accept compensation for the service attributes. We used a multinomial logit model to analyze the data, collected from 394 residents in the Haidian and Dongcheng districts of Beijing through a choice experiment. The results show differences in preferences for the service attributes among young, middle-aged and old residents. Low compensation is not a major factor in persuading young and middle-aged residents to accept the proposed separation services. However, on average, most of them prefer services with frequent collection, evening collection and plastic-bag attributes, and without an instructor. This study indicates that there is potential for local government to improve the current separation services accordingly. PMID:25546279
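For reference, the multinomial logit model used in such choice experiments assigns choice probabilities via a softmax over the deterministic utilities of the alternatives. A minimal sketch (the utilities here are placeholders, not the study's estimated coefficients):

```python
import numpy as np

def mnl_probs(V):
    """Multinomial logit choice probabilities from a vector V of
    deterministic utilities, one entry per alternative:
    P_j = exp(V_j) / sum_k exp(V_k)."""
    e = np.exp(V - V.max())  # subtract max for numerical stability
    return e / e.sum()
```

Equal utilities yield uniform probabilities; raising one alternative's utility raises its choice probability, which is what ties estimated attribute coefficients to willingness-to-accept measures.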
DEFF Research Database (Denmark)
Glover, Kevin A.; Hansen, Michael Møller; Skaala, Oystein
2009-01-01
...Accuracy of assignment varied greatly among the individual samples. For the Bayesian clustered data set consisting of five genetic groups, overall accuracy of self-assignment was 99%, demonstrating the effectiveness of this strategy to significantly increase accuracy of assignment, albeit at the expense... Performing self-assignment simulations with the data divided into different sub-sets, overall accuracy of assignment was 44% within the entire material (44 samples), 44% for the 28 spring samples, 59% for the 16 autumn samples, and 70% for 8 autumn samples collected from a geographically restricted area...
Co-Parenting: Sharing Your Child Equally. A Source Book for the Separated or Divorced Family.
Galper, Miriam
This source book introduces perspectives and skills which can contribute to successful "co-parenting" (joint custody, joint parenting, co-custody or shared custody) of preadolescent children after parents are separated or divorced. Chapter One introduces the concept of co-parenting. Chapter Two advances an approach to developing flexible…
RESEARCH OF QUANTUM GENETIC ALGORITHM AND ITS APPLICATION IN BLIND SOURCE SEPARATION
Institute of Scientific and Technical Information of China (English)
Yang Junan; Li Bin; Zhuang Zhenquan
2003-01-01
This letter proposes two algorithms: a novel Quantum Genetic Algorithm (QGA) based on an improvement of Han's Genetic Quantum Algorithm (GQA), and a new Blind Source Separation (BSS) method based on the QGA and Independent Component Analysis (ICA). Simulation results show that the efficiency of the new BSS method is obviously higher than that of a Conventional Genetic Algorithm (CGA).
Blind Separation of Nonstationary Sources Based on Spatial Time-Frequency Distributions
Directory of Open Access Journals (Sweden)
Zhang Yimin
2006-01-01
Blind source separation (BSS) based on spatial time-frequency distributions (STFDs) provides improved performance over blind source separation methods based on second-order statistics when dealing with signals that are localized in the time-frequency (t-f) domain. In this paper, we propose the use of STFD matrices for both whitening and recovery of the mixing matrix, which are two stages commonly required in many BSS methods, to provide BSS performance that is robust to noise. In addition, a simple method is proposed to select the auto- and cross-term regions of the time-frequency distribution (TFD). To further improve the BSS performance, t-f grouping techniques are introduced to reduce the number of signals under consideration and to allow the receiver array to separate more sources than the number of array sensors, provided that the sources have disjoint t-f signatures. With the use of one or more of the techniques proposed in this paper, improved performance in blind separation of nonstationary signals can be achieved.
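Whitening, the first of the two stages mentioned above, can be done with second-order statistics as follows. The paper instead builds this stage from STFD matrices; this sketch is the conventional covariance-based variant shown for comparison:

```python
import numpy as np

def whiten(X):
    """Second-order whitening of an (n_channels, n_samples) matrix X.

    Returns a whitening matrix W and Z = W @ (X - mean) such that Z
    has an identity sample covariance. Many BSS methods perform this
    stage before recovering the (then orthogonal) mixing matrix.
    """
    Xc = X - X.mean(axis=1, keepdims=True)
    C = (Xc @ Xc.T) / Xc.shape[1]          # sample covariance
    d, E = np.linalg.eigh(C)               # eigendecomposition C = E d E^T
    W = E @ np.diag(1.0 / np.sqrt(d)) @ E.T  # symmetric C^{-1/2}
    return W, W @ Xc
```

After whitening, the remaining unknown in the linear mixing model is a rotation, which the second stage (here, joint diagonalization of STFD matrices) resolves.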
Blind source separation of fMRI data by means of factor analytic transformations
Langers, Dave R. M.
2009-01-01
In this study, the application of factor analytic (FA) rotation methods in the context of neuroimaging data analysis was explored. Three FA algorithms (ProMax, QuartiMax, and VariMax) were employed to carry out blind source separation in a functional magnetic resonance imaging (fMRI) experiment that
ENVIRONMENTAL EFFICIENCY, SEPARABILITY AND ABATEMENT COSTS OF NON-POINT SOURCE POLLUTION
Wossink, Ada; Denaux, Zulal Sogutlu
2002-01-01
This paper presents a new framework for analyzing abatement costs of nonpoint-source pollution. Unlike previous studies, this framework treats production and pollution as non-separable and also recognizes that production inefficiency is a fundamental cause of pollution. The implications of this approach are illustrated using an empirical analysis for cotton producers.
Micropollutant removal in an algal treatment system fed with source separated wastewater streams
de Wilt, Arnoud; Butkovskyi, Andrii; Tuantet, Kanjana; Hernandez Leal, Lucia; Fernandes, T.V.; Langenhoff, Alette; Zeeman, Grietje
2016-01-01
Micropollutant removal in an algal treatment system fed with source-separated wastewater streams was studied. Batch experiments with the microalga Chlorella sorokiniana grown on urine, anaerobically treated black water and synthetic urine were performed to assess the removal of six spiked pharmaceuticals.
Resource recovery from source separated domestic waste(water) streams; Full scale results
Zeeman, G.; Kujawa, K.
2011-01-01
A major fraction of the nutrients emitted from households is originally present in only 1% of the total wastewater volume. New sanitation concepts enable the recovery and reuse of these nutrients from feces and urine. Two possible sanitation concepts are presented, with varying degrees of source separation.
DEFF Research Database (Denmark)
Naroznova, Irina; Møller, Jacob; Scheutz, Charlotte
2013-01-01
The environmental performance of two pretreatment technologies for source-separated organic waste was compared using life cycle assessment (LCA). An innovative pulping process, where source-separated organic waste is pulped with cold water to form a volatile-solid-rich biopulp, was compared to a more... A number of non-toxic and toxic impact categories were assessed. No big difference in the overall performance of the two technologies was observed; the difference for the separate life cycle steps was, however, more pronounced. More efficient material transfer in the scenario with waste pulping... resulted in a higher biogas output and nutrient recovery and, thus, higher impact savings related to biogas production and digestate utilization. Meanwhile, the larger reject amount in the scenario with the screw press led to more savings obtained by utilization of the reject in that scenario.
Blind separation of sources in nonlinear convolved mixture based on a novel network
Institute of Scientific and Technical Information of China (English)
胡英; 杨杰; 沈利
2004-01-01
Blind separation of independent sources from their nonlinear convolutive mixtures is a more realistic problem than separation from linear ones. A solution to this problem based on the entropy maximization principle is presented. First, we propose a novel two-layer network as the de-mixing system to separate sources in a nonlinear convolutive mixture. In the output layer of our network, we use a feedback architecture to cope with convolutive mixtures. We then derive learning algorithms for the two-layer network by maximizing the information entropy. Comparison of computer simulation results shows that the proposed algorithm achieves better nonlinear convolutive blind signal separation than H.H.Y.'s algorithm.
Difficulties applying recent blind source separation techniques to EEG and MEG
Knuth, Kevin H
2015-01-01
High temporal resolution measurements of human brain activity can be performed by recording the electric potentials on the scalp surface (electroencephalography, EEG), or by recording the magnetic fields near the surface of the head (magnetoencephalography, MEG). The analysis of the data is problematic due to the fact that multiple neural generators may be simultaneously active and the potentials and magnetic fields from these sources are superimposed on the detectors. It is highly desirable to un-mix the data into signals representing the behaviors of the original individual generators. This general problem is called blind source separation and several recent techniques utilizing maximum entropy, minimum mutual information, and maximum likelihood estimation have been applied. These techniques have had much success in separating signals such as natural sounds or speech, but appear to be ineffective when applied to EEG or MEG signals. Many of these techniques implicitly assume that the source distributions hav...
Tang, Gang; Luo, Ganggang; Zhang, Weihua; Yang, Caijin; Wang, Huaqing
2016-01-01
In the condition monitoring of roller bearings, the measured signals are often compounded due to unknown multiple vibration sources and complex transfer paths. Moreover, the sensors are limited in both placement and number. The estimation of the vibration sources is thus an underdetermined blind source separation problem, which makes it difficult to extract fault features exactly by ordinary methods in running tests. To improve the effectiveness of compound fault diagnosis in roller bearings, the present paper proposes a new method to solve the underdetermined problem and extract fault features based on variational mode decomposition. To surmount the shortcomings of the inadequate signals collected through limited sensors, a vibration signal is first decomposed into a number of band-limited intrinsic mode functions by variational mode decomposition. Then, the demodulated signal obtained with the Hilbert transform of these multi-channel functions is used as the input matrix for independent component analysis. Finally, the compound faults are separated effectively by carrying out independent component analysis, which enables the fault features to be extracted more easily and identified more clearly. Experimental results validate the effectiveness of the proposed method in compound fault separation, and a comparison experiment shows that the proposed method has higher adaptability and practicability in separating strong noise signals than the commonly used ensemble empirical mode decomposition method. PMID:27322268
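The Hilbert-transform demodulation step can be illustrated in isolation. The sketch below recovers a bearing-fault modulation frequency from the envelope of an amplitude-modulated resonance; the sampling rate, carrier, and fault frequency are made-up values, and the VMD and ICA stages of the actual method are omitted.

```python
import numpy as np
from scipy.signal import hilbert

fs = 12000                               # sampling rate (assumed)
t = np.arange(0, 1, 1 / fs)
# bearing-fault-like signal: a 3 kHz resonance amplitude-modulated
# at a 100 Hz fault characteristic frequency (made-up values)
mod = 1.0 + 0.8 * np.sin(2 * np.pi * 100 * t)
x = mod * np.sin(2 * np.pi * 3000 * t)

envelope = np.abs(hilbert(x))            # Hilbert-transform demodulation
spec = np.abs(np.fft.rfft(envelope - envelope.mean()))
freqs = np.fft.rfftfreq(len(envelope), 1 / fs)
fault_freq = freqs[np.argmax(spec)]      # peak of the envelope spectrum
```

The envelope spectrum peaks at the modulation (fault) frequency even though the raw spectrum is dominated by the resonance carrier.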
Directory of Open Access Journals (Sweden)
Gang Tang
2016-06-01
Yan, Jun; Dong, Danan; Chen, Wen
2016-04-01
Due to the development of GNSS technology and the improvement of its positioning accuracy, observational data obtained by GNSS are widely used in space geodesy and geodynamics research. The GNSS coordinate time series of observation stations contain a wealth of information, including geographic changes, deformation of the Earth, migration of subsurface material, instantaneous deformation, weak deformation, and other blind signals. In order to extract the instantaneous underground deformation, weak deformation, and other blind signals hidden in GNSS time series, we apply Independent Component Analysis (ICA) to daily station coordinate time series of the Southern California Integrated GPS Network. ICA is based on the statistical characteristics of the observed signals: it exploits non-Gaussianity and independence to recover the source signals underlying basic geophysical events. As part of the post-processing of precise GNSS time series, this paper applies the principal component analysis (PCA) module of QOCA and an ICA algorithm to separate the source signals, and compares the two separation techniques in recovering signals related to geophysical disturbances from the observations. The analysis demonstrates that when multiple factors are present, PCA suffers from ambiguity in separating the source signals, i.e. the attribution of the results is unclear, whereas ICA performs better; ICA is therefore the more suitable choice for GNSS time series in which the combination of source signals is unknown.
Wang, Fa-Yu; Chi, Chong-Yung; Chan, Tsung-Han; Wang, Yue
2010-05-01
Although significant efforts have been made in developing nonnegative blind source separation techniques, accurate separation of positive yet dependent sources remains a challenging task. In this paper, a joint correlation function of multiple signals is proposed to reveal and confirm that the observations after nonnegative mixing would have higher joint correlation than the original unknown sources. Accordingly, a new nonnegative least-correlated component analysis (n/LCA) method is proposed to design the unmixing matrix by minimizing the joint correlation function among the estimated nonnegative sources. In addition to a closed-form solution for unmixing two mixtures of two sources, the general algorithm of n/LCA for the multisource case is developed based on an iterative volume maximization (IVM) principle and linear programming. The source identifiability and required conditions are discussed and proven. The proposed n/LCA algorithm, denoted by n/LCA-IVM, is evaluated with both simulation data and real biomedical data to demonstrate its superior performance over several existing benchmark methods. PMID:20299711
Directory of Open Access Journals (Sweden)
Y. Yokoo
2014-09-01
This study compared a time source hydrograph separation method to a geographic source separation method, to assess if the two methods produced similar results. The time source separation of a hydrograph was performed using a numerical filter method and the geographic source separation was performed using an end-member mixing analysis employing hourly discharge, electric conductivity, and turbidity data. These data were collected in 2006 at the Kuroiwa monitoring station on the Abukuma River, Japan. The results of the methods corresponded well in terms of both surface flow components and inter-flow components. In terms of the baseflow component, the result of the time source separation method corresponded with the moving average of the baseflow calculated by the geographic source separation method. These results suggest that the time source separation method is not only able to estimate numerical values for the discharge components, but that the estimates are also reasonable from a geographical viewpoint in the 3000 km2 watershed discussed in this study. The consistent results obtained using the time source and geographic source separation methods demonstrate that it is possible to characterize dominant runoff processes using hourly discharge data, thereby enhancing our capability to interpret the dominant runoff processes of a watershed using observed discharge data alone.
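A common form of numerical-filter time source separation is the one-parameter recursive digital filter. The sketch below is a minimal Lyne-Hollick-type implementation; the abstract does not state which filter was used, so the filter form and the parameter alpha are illustrative assumptions.

```python
import numpy as np

def baseflow_filter(q, alpha=0.925):
    """Split streamflow q into baseflow and quickflow with a one-parameter
    recursive digital filter (Lyne-Hollick type); alpha=0.925 is a typical value."""
    q = np.asarray(q, dtype=float)
    quick = np.zeros_like(q)
    for i in range(1, len(q)):
        quick[i] = alpha * quick[i - 1] + 0.5 * (1 + alpha) * (q[i] - q[i - 1])
        quick[i] = min(max(quick[i], 0.0), q[i])   # constrain to [0, q]
    return q - quick, quick

# synthetic hourly hydrograph: constant baseflow plus a storm pulse
q = np.full(200, 5.0)
q[50:60] += np.array([10, 20, 30, 25, 20, 15, 10, 6, 3, 1], dtype=float)
base, quick = baseflow_filter(q)
```

The high-pass recursion routes rapid discharge changes into the quickflow component, while the slowly varying remainder is attributed to baseflow.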
Blind source separation of multichannel electroencephalogram based on wavelet transform and ICA
Institute of Scientific and Technical Information of China (English)
You Rong-Yi; Chen Zhong
2005-01-01
Combination of the wavelet transform and independent component analysis (ICA) was employed for blind source separation (BSS) of multichannel electroencephalogram (EEG) recordings. After denoising the original signals by the discrete wavelet transform, the high-frequency components of some noises and artifacts were removed from the original signals. The denoised signals were then reconstructed for ICA, so that the drawback that ICA cannot distinguish noise from source signals is overcome effectively. Practical processing results showed that this method is an effective approach to BSS of multichannel EEG. Since the method combines the wavelet transform with an adaptive neural network, it is also useful for BSS of other complex signals.
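The wavelet-denoising stage can be sketched with a single-level Haar transform and soft thresholding; the abstract does not state which wavelet or threshold rule was used, so both are assumptions here, as is the synthetic signal.

```python
import numpy as np

def haar_denoise(x, threshold):
    """Single-level Haar wavelet decomposition, soft-threshold the detail
    coefficients, then reconstruct (a minimal stand-in for the DWT stage)."""
    x = np.asarray(x, dtype=float)
    n = len(x) - len(x) % 2
    a = (x[0:n:2] + x[1:n:2]) / np.sqrt(2)     # approximation coefficients
    d = (x[0:n:2] - x[1:n:2]) / np.sqrt(2)     # detail (high-frequency) coefficients
    d = np.sign(d) * np.maximum(np.abs(d) - threshold, 0.0)   # soft threshold
    out = np.empty(n)
    out[0::2] = (a + d) / np.sqrt(2)
    out[1::2] = (a - d) / np.sqrt(2)
    return out

rng = np.random.default_rng(1)
t = np.linspace(0, 1, 1024)
clean = np.sin(2 * np.pi * 5 * t)              # slow "EEG-like" component
noisy = clean + 0.3 * rng.standard_normal(1024)
# universal threshold sigma * sqrt(2 ln n), with the noise level assumed known
denoised = haar_denoise(noisy, 0.3 * np.sqrt(2 * np.log(1024)))
```

Thresholding suppresses the high-frequency detail band, where white noise dominates a slow signal, before the cleaned channels are passed on to ICA.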
RESEARCH OF QUANTUM GENETIC ALGORITHM AND ITS APPLICATION IN BLIND SOURCE SEPARATION
Institute of Scientific and Technical Information of China (English)
Yang Junan; Li Bin; et al.
2003-01-01
This letter proposes two algorithms: a novel Quantum Genetic Algorithm (QGA) based on an improvement of Han's Genetic Quantum Algorithm (GQA), and a new Blind Source Separation (BSS) method based on QGA and Independent Component Analysis (ICA). The simulation results show that the efficiency of the new BSS method is obviously higher than that of the Conventional Genetic Algorithm (CGA).
Estimating International Tourism Demand to Spain Separately by the Major Source Markets
Marcos Alvarez-Díaz; Manuel González-Gómez; Mª Soledad Otero-Giraldez
2012-01-01
The objective of this paper is to estimate international tourism demand to Spain separately by major source markets (Germany, United Kingdom, France, Italy and The Netherlands) that represent 67% of the international tourism to Spain. In order to investigate how the tourism demand reacts to price and income changes, we apply the bounds testing approach to cointegration and construct confidence intervals using the bootstrap technique. The results show differences in tourism behavior depending ...
Congedo, Marco; Gouy-Pailler, Cedric; Jutten, Christian
2008-01-01
Over the last ten years blind source separation (BSS) has become a prominent processing tool in the study of human electroencephalography (EEG). Without relying on head modeling, BSS aims at estimating both the waveform and the scalp spatial pattern of the intracranial dipolar currents responsible for the observed EEG. In this review we begin by placing the BSS linear instantaneous model of EEG within the framework of brain volume conduction theory. We then review the concept and current practic...
Role of the source to building lateral separation distance in petroleum vapor intrusion
Verginelli, Iason; Capobianco, Oriana; Baciocchi, Renato
2016-06-01
The adoption of source-to-building separation distances to screen sites that need further field investigation is becoming common practice in the evaluation of the vapor intrusion pathway at sites contaminated by petroleum hydrocarbons. The screening criteria for the source-to-building vertical distance have been deeply investigated in the recent literature and are fully addressed in the recent guidelines issued by ITRC and U.S. EPA. Conversely, owing to the lack of field and modeling studies, the source-to-building lateral distance has received relatively little attention. To address this issue, in this work we present a steady-state vapor intrusion analytical model incorporating piecewise first-order aerobic biodegradation limited by oxygen availability that accounts for lateral source-to-building separation. The developed model can be used to evaluate the role and relevance of lateral vapor attenuation, as well as to provide a site-specific assessment of the lateral screening distances needed to attenuate vapor concentrations to risk-based values. The simulation outcomes proved consistent with field data and 3-D numerical modeling results reported in previous studies and, for shallow sources, with the screening criteria recommended by U.S. EPA for the vertical separation distance. Indeed, although petroleum vapors can cover maximum lateral distances of up to 25-30 m, as highlighted by comparison of model outputs with field evidence of vapor migration in the subsurface, the simulation results of this new model indicated that, regardless of source concentration and depth, lateral distances of 6 m and 7 m are sufficient to attenuate petroleum vapors below risk-based values for groundwater and soil sources, respectively. However, for deep sources (> 5 m) and for low to moderate source concentrations (benzene concentrations lower than 5 mg/L in groundwater and 0.5 mg/kg in soil) the above criteria were found extremely conservative as
Directory of Open Access Journals (Sweden)
Duofang Chen
2015-01-01
By recording a time series of tomographic images, dynamic fluorescence molecular tomography (FMT) allows exploring perfusion, biodistribution, and pharmacokinetics of labeled substances in vivo. Usually, dynamic tomographic images are first reconstructed frame by frame, and then unmixing based on principal component analysis (PCA) or independent component analysis (ICA) is performed to detect and visualize functional structures with different kinetic patterns. PCA and ICA assume that the sources are statistically uncorrelated or independent and do not perform well when correlated sources are present. In this paper, we deduce the relationship between the measured imaging data and the kinetic patterns and present a temporal unmixing approach based on a nonnegative blind source separation (BSS) method with a convex analysis framework to separate the measured data. The presented method requires no assumption of source independence or zero correlation. Several numerical simulations and phantom experiments are conducted to investigate the performance of the proposed temporal unmixing method. The results indicate that it is feasible to unmix the measured data before tomographic reconstruction, and that the BSS-based method provides better unmixing quality than PCA and ICA.
Energy Technology Data Exchange (ETDEWEB)
Biollaz, S.; Ludwig, Ch.; Stucki, S. [Paul Scherrer Inst. (PSI), Villigen (Switzerland)
1999-08-01
A literature search was carried out to determine sources and speciation of heavy metals in MSW. A combination of thermal and mechanical separation techniques is necessary to achieve the required high degrees of metal separation. Metallic goods should be separated mechanically, chemically bound heavy metals by a thermal process. (author) 1 fig., 1 tab., 6 refs.
Energy Technology Data Exchange (ETDEWEB)
Pennline, H.W.; Luebke, D.R.; Jones, K.L.; Morsi, B.I. (Univ. of Pittsburgh, PA); Heintz, Y.J. (Univ. of Pittsburgh, PA); Ilconich, J.B. (Parsons)
2007-06-01
The capture/separation step for carbon dioxide (CO2) from large-point sources is a critical one with respect to the technical feasibility and cost of the overall carbon sequestration scenario. For large-point sources, such as those found in power generation, the carbon dioxide capture techniques being investigated by the in-house research area of the National Energy Technology Laboratory possess the potential for improved efficiency and reduced costs as compared to more conventional technologies. The investigated techniques can have wide applications, but the research has focused on capture/separation of carbon dioxide from flue gas (post-combustion from fossil fuel-fired combustors) and from fuel gas (precombustion, such as integrated gasification combined cycle or IGCC). With respect to fuel gas applications, novel concepts are being developed in wet scrubbing with physical absorption; chemical absorption with solid sorbents; and separation by membranes. In one concept, a wet scrubbing technique is being investigated that uses a physical solvent process to remove CO2 from fuel gas of an IGCC system at elevated temperature and pressure. The need to define an ideal solvent has led to the study of the solubility and mass transfer properties of various solvents. Pertaining to another separation technology, fabrication techniques and mechanistic studies for membranes separating CO2 from the fuel gas produced by coal gasification are also being performed. Membranes that consist of CO2-philic ionic liquids encapsulated into a polymeric substrate have been investigated for permeability and selectivity. Finally, dry, regenerable processes based on sorbents are additional techniques for CO2 capture from fuel gas. An overview of these novel techniques is presented along with a research progress status of technologies related to membranes and physical solvents.
Bayesian analysis of exoplanet and binary orbits
Schulze-Hartung, Tim; Launhardt, Ralf; Henning, Thomas
2012-01-01
We introduce BASE (Bayesian astrometric and spectroscopic exoplanet detection and characterisation tool), a novel program for the combined or separate Bayesian analysis of astrometric and radial-velocity measurements of potential exoplanet hosts and binary stars. The capabilities of BASE are demonstrated using all publicly available data of the binary Mizar A.
Resonance ionization laser ion sources for on-line isotope separators (invited).
Marsh, B A
2014-02-01
A Resonance Ionization Laser Ion Source (RILIS) is today considered an essential component of the majority of Isotope Separator On Line (ISOL) facilities; there are seven laser ion sources currently operational at ISOL facilities worldwide and several more are under development. The ionization mechanism is a highly element selective multi-step resonance photo-absorption process that requires a specifically tailored laser configuration for each chemical element. For some isotopes, isomer selective ionization may even be achieved by exploiting the differences in hyperfine structures of an atomic transition for different nuclear spin states. For many radioactive ion beam experiments, laser resonance ionization is the only means of achieving an acceptable level of beam purity without compromising isotope yield. Furthermore, by performing element selection at the location of the ion source, the propagation of unwanted radioactivity downstream of the target assembly is reduced. Whilst advances in laser technology have improved the performance and reliability of laser ion sources and broadened the range of suitable commercially available laser systems, many recent developments have focused rather on the laser/atom interaction region in the quest for increased selectivity and/or improved spectral resolution. Much of the progress in this area has been achieved by decoupling the laser ionization from competing ionization processes through the use of a laser/atom interaction region that is physically separated from the target chamber. A new application of gas catcher laser ion source technology promises to expand the capabilities of projectile fragmentation facilities through the conversion of otherwise discarded reaction fragments into high-purity low-energy ion beams. A summary of recent RILIS developments and the current status of laser ion sources worldwide is presented.
Resonance ionization laser ion sources for on-line isotope separators (invited)
International Nuclear Information System (INIS)
Non-Cancellation Multistage Kurtosis Maximization with Prewhitening for Blind Source Separation
Directory of Open Access Journals (Sweden)
Xiang Chen
2009-01-01
Chi et al. recently proposed two effective non-cancellation multistage (NCMS) blind source separation algorithms, one using the turbo source extraction algorithm (TSEA), called the NCMS-TSEA, and the other using the fast kurtosis maximization algorithm (FKMA), called the NCMS-FKMA. Their computational complexity and performance depend heavily on the dimension of the multisensor data, that is, the number of sensors. This paper proposes the inclusion of prewhitening processing in the NCMS-TSEA and NCMS-FKMA prior to source extraction. We come up with four improved algorithms, referred to as the PNCMS-TSEA, the PNCMS-FKMA, the PNCMS-TSEA(p), and the PNCMS-FKMA(p). Compared with the existing NCMS-TSEA and NCMS-FKMA, the former two algorithms achieve a significant reduction in computational complexity together with some performance improvements. The latter two algorithms are generalized counterparts of the former two, with the single source extraction module replaced by a bank of source extraction modules operating in parallel at each stage. Although the PNCMS-TSEA and PNCMS-TSEA(p) (and likewise the PNCMS-FKMA and PNCMS-FKMA(p)) perform identically, the merit of this parallel extraction structure is a much shorter processing latency, making the PNCMS-TSEA(p) and PNCMS-FKMA(p) well suited to software and hardware implementations. Simulation results are presented to verify the efficacy and computational efficiency of the proposed algorithms.
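The combination of prewhitening followed by kurtosis-maximization extraction can be sketched as follows. This is a generic fixed-point kurtosis iteration on whitened data, not the paper's FKMA or TSEA modules; the mixing matrix and the source distributions are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 20000
S = np.vstack([rng.laplace(size=n),            # super-Gaussian source
               rng.uniform(-1, 1, size=n)])    # sub-Gaussian source
A = np.array([[0.8, 0.4],
              [0.3, 0.9]])                     # mixing matrix (assumed)
X = A @ S

# --- prewhitening: zero-mean, decorrelate, and normalize the mixtures ---
Xc = X - X.mean(axis=1, keepdims=True)
vals, vecs = np.linalg.eigh(Xc @ Xc.T / n)
Z = np.diag(vals ** -0.5) @ vecs.T @ Xc        # whitened data, covariance ~ I

# --- extract one source by fixed-point kurtosis maximization ---
w = rng.standard_normal(2)
w /= np.linalg.norm(w)
for _ in range(100):
    y = w @ Z
    w_new = (Z * y ** 3).mean(axis=1) - 3 * w  # kurtosis-contrast fixed point
    w_new /= np.linalg.norm(w_new)
    w = w_new
y = w @ Z                                      # one extracted source
```

After whitening, the extraction vector lives on the unit sphere and the update only has to search over rotations, which is the computational saving that motivates prewhitening.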
Płuska, Mariusz; Czerwinski, Andrzej; Ratajczak, Jacek; Katcki, Jerzy; Oskwarek, Lukasz; Rak, Remigiusz
2009-01-01
The electron-microscope image distortion generated by electromagnetic interference (EMI) is an important problem for accurate imaging in scanning electron microscopy (SEM). Available commercial solutions to this problem utilize sophisticated hardware for EMI detection and compensation. Their efficiency depends on how the distortions influence the SEM system, so selection of a proper method for reducing them is crucial. The current investigation separated the impact of the distortions on several components of the SEM system. A sum of signals from distortion sources causes wavy deformations of specimen shapes in SEM images. The separation of the various causes of distortion is based on measurements of the periodic deformations of the images for different electron-beam energies and working distances between the microscope final aperture and the specimen. Using the SEM images, the direct influence of an alternating magnetic field on the electron beam was distinguished, and distortions of electric signals in the scanning block of the SEM were also separated. The presented method separates the direct magnetic-field influence on the electron beam below the SEM final aperture (in the chamber) from its influence above this aperture (in the electron column). It also allows measurement of the magnetic field present inside the SEM chamber. The investigation yielded practical guidelines for selecting the most efficient way to reduce the distortions.
PWC-ICA: A Method for Stationary Ordered Blind Source Separation with Application to EEG
Bigdely-Shamlo, Nima; Mullen, Tim; Robbins, Kay
2016-01-01
Independent component analysis (ICA) is a class of algorithms widely applied to separate sources in EEG data. Most ICA approaches use optimization criteria derived from temporal statistical independence and are invariant with respect to the actual ordering of individual observations. We propose a method of mapping real signals into a complex vector space that takes into account the temporal order of signals and enforces certain mixing stationarity constraints. The resulting procedure, which we call Pairwise Complex Independent Component Analysis (PWC-ICA), performs the ICA in a complex setting and then reinterprets the results in the original observation space. We examine the performance of our candidate approach relative to several existing ICA algorithms for the blind source separation (BSS) problem on both real and simulated EEG data. On simulated data, PWC-ICA is often capable of achieving a better solution to the BSS problem than AMICA, Extended Infomax, or FastICA. On real data, the dipole interpretations of the BSS solutions discovered by PWC-ICA are physically plausible, are competitive with existing ICA approaches, and may represent sources undiscovered by other ICA methods. In conjunction with this paper, the authors have released a MATLAB toolbox that performs PWC-ICA on real, vector-valued signals. PMID:27340397
Ayllón, David; Gil-Pita, Roberto; Rosa-Zurera, Manuel
2013-12-01
A recent trend in hearing aids is the connection of the left and right devices to collaborate between them. Binaural systems can provide natural binaural hearing and support the improvement of speech intelligibility in noise, but they require data transmission between both devices, which increases the power consumption. This paper presents a novel sound source separation algorithm for binaural speech enhancement based on supervised machine learning and time-frequency masking. The system is designed considering the power restrictions in hearing aids, constraining both the computational cost of the algorithm and the transmission bit rate. The transmission schema is optimized using a tailored evolutionary algorithm that assigns a different number of bits to each frequency band. The proposed algorithm requires less than 10% of the available computational resources for signal processing and obtains good separation performance using bit rates lower than 64 kbps.
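Time-frequency masking of the kind the algorithm applies can be sketched with an ideal binary mask computed from known sources; a supervised system like the one described would instead predict the mask from features of the mixture. The two pure tones standing in for speech, and the STFT parameters, are illustrative assumptions.

```python
import numpy as np
from scipy.signal import stft, istft

fs = 16000
t = np.arange(0, 1, 1 / fs)
s1 = np.sin(2 * np.pi * 440 * t)     # stand-ins for the two speech sources
s2 = np.sin(2 * np.pi * 1800 * t)
mix = s1 + s2                        # single observed mixture

_, _, M = stft(mix, fs=fs, nperseg=512)
_, _, S1 = stft(s1, fs=fs, nperseg=512)
_, _, S2 = stft(s2, fs=fs, nperseg=512)

# ideal binary mask: keep time-frequency cells dominated by source 1
mask = (np.abs(S1) > np.abs(S2)).astype(float)
_, est1 = istft(M * mask, fs=fs, nperseg=512)
est1 = est1[:len(s1)]                # trim padding from the inverse STFT
```

Because the mask is binary, transmitting it between the two hearing-aid devices costs at most one bit per time-frequency cell, which is the kind of quantity the paper's bit-allocation scheme then optimizes per frequency band.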
Rey, Valentine; Rey, Christian
2016-01-01
This article deals with the computation of guaranteed lower bounds of the error in the framework of finite element (FE) and domain decomposition (DD) methods. In addition to a fully parallel computation, the proposed lower bounds separate the algebraic error (due to the use of a DD iterative solver) from the discretization error (due to the FE), which enables the steering of the iterative solver by the discretization error. These lower bounds are also used to improve goal-oriented error estimation in a substructured context. Assessments on 2D static linear mechanics problems illustrate the relevance of the separation of the sources of error and the lower bounds' independence from the substructuring. We also steer the iterative solver by an objective of precision on a quantity of interest. This strategy consists of a sequence of solves and takes advantage of adaptive remeshing and recycling of search directions.
AN EME BLIND SOURCE SEPARATION ALGORITHM BASED ON GENERALIZED EXPONENTIAL FUNCTION
Institute of Scientific and Technical Information of China (English)
Miao Hao; Li Xiaodong; Tian Jing
2008-01-01
This letter investigates an improved blind source separation algorithm based on the Maximum Entropy (ME) criterion. The original ME algorithm chooses a fixed exponential or sigmoid function as the nonlinear mapping function, which cannot match the original signal very well. A parameter-estimation method is employed in this letter to approximate the probability density function of any signal with a parameter-steered generalized exponential function. An improved learning rule and a natural-gradient update formula for the unmixing matrix are also presented. The algorithm of this letter can separate mixtures of super-Gaussian signals as well as mixtures of sub-Gaussian signals. The simulation experiment demonstrates the efficiency of the algorithm.
Natural gradient-based recursive least-squares algorithm for adaptive blind source separation
Institute of Scientific and Technical Information of China (English)
ZHU Xiaolong; ZHANG Xianda; YE Jimin
2004-01-01
This paper focuses on the problem of adaptive blind source separation (BSS). First, a recursive least-squares (RLS) whitening algorithm is proposed. By combining it with a natural gradient-based RLS algorithm for nonlinear principal component analysis (PCA), and using reasonable approximations, a novel RLS algorithm is obtained that achieves BSS without additional pre-whitening of the observed mixtures. Analysis of the equilibrium points shows that both the RLS whitening algorithm and the natural gradient-based RLS algorithm for BSS have the desired convergence properties. It is also proved that the combined RLS algorithm for BSS is equivariant and keeps the separating matrix from becoming singular. Finally, the effectiveness of the proposed algorithm is verified by extensive simulation results.
DEFF Research Database (Denmark)
Larsen, Anna Warberg; Astrup, Thomas
2011-01-01
variations between emission factors for different incinerators, but the background for these variations has not been thoroughly examined. One important reason may be variations in collection of recyclable materials as source separation alters the composition of the residual waste incinerated. The objective...... routed to incineration. Emission factors ranged from 27 to 40kg CO2/GJ. The results appeared most sensitive towards variations in waste composition and water content. Recycling rates and lower heating values could not be used as simple indicators of the resulting emission factors for residual household...
Bessiere, Pierre; Ahuactzin, Juan Manuel; Mekhnacha, Kamel
2013-01-01
Probability as an Alternative to Boolean Logic. While logic is the mathematical foundation of rational reasoning and the fundamental principle of computing, it is restricted to problems where information is both complete and certain. However, many real-world problems, from financial investments to email filtering, are incomplete or uncertain in nature. Probability theory and Bayesian computing together provide an alternative framework for dealing with incomplete and uncertain data. Decision-Making Tools and Methods for Incomplete and Uncertain Data. Emphasizing probability as an alternative to Boolean
Generic Uniqueness of a Structured Matrix Factorization and Applications in Blind Source Separation
Domanov, Ignat; Lathauwer, Lieven De
2016-06-01
Algebraic geometry, although little explored in signal processing, provides tools that are very convenient for investigating generic properties in a wide range of applications. Generic properties are properties that hold "almost everywhere". We present a set of conditions that are sufficient for demonstrating the generic uniqueness of a certain structured matrix factorization. This set of conditions may be used as a checklist for generic uniqueness in different settings. We discuss two particular applications in detail. We provide a relaxed generic uniqueness condition for joint matrix diagonalization that is relevant for independent component analysis in the underdetermined case. We present generic uniqueness conditions for a recently proposed class of deterministic blind source separation methods that rely on mild source models. For the interested reader we provide some intuition on how the results are connected to their algebraic geometric roots.
Escolano, Jose; Xiang, Ning; Perez-Lorenzo, Jose M; Cobos, Maximo; Lopez, Jose J
2014-02-01
Sound source localization using a two-microphone array is an active area of research, with considerable potential for use in video conferencing, mobile devices, and robotics. Based on the observed time-differences of arrival between sound signals, a probability distribution of the location of the sources is considered to estimate the actual source positions. However, these algorithms assume a given number of sound sources. This paper describes an updated research account of the solution presented in Escolano et al. [J. Acoust. Soc. Am. 132(3), 1257-1260 (2012)], where nested sampling is used to explore a probability distribution of the source position using a Laplacian mixture model, which allows both the number and position of speech sources to be inferred. This paper presents different experimental setups and scenarios to demonstrate the viability of the proposed method, which is compared with some of the most popular sampling methods, demonstrating that nested sampling is an accurate tool for speech localization. PMID:25234883
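The basic observable in the record above is the time-difference of arrival (TDOA) between microphones. As a minimal illustration of how a TDOA is extracted from raw signals (a sketch under assumed toy parameters, not the authors' nested-sampling method), the snippet below peak-picks the cross-correlation of a signal and its delayed copy:

```python
import numpy as np

def estimate_tdoa(a, b, fs):
    """Estimate the delay of signal `a` relative to `b` (in seconds)
    by locating the peak of their full cross-correlation."""
    corr = np.correlate(a, b, mode="full")
    lag = int(np.argmax(corr)) - (len(b) - 1)  # peak index -> lag in samples
    return lag / fs

# toy check: delay a white-noise signal by 5 samples
fs = 8000
rng = np.random.default_rng(0)
x = rng.standard_normal(256)
y = np.concatenate([np.zeros(5), x[:-5]])      # x delayed by 5 samples
tdoa = estimate_tdoa(y, x, fs)
```

A two-microphone array maps each such TDOA to a hyperbola of candidate positions; probabilistic methods like the paper's Laplacian mixture model then place distributions over these observables.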
Separation of Radio-Frequency Sources and Localization of Partial Discharges in Noisy Environments
Directory of Open Access Journals (Sweden)
Guillermo Robles
2015-04-01
Full Text Available The detection of partial discharges (PD) can help in early-warning detection systems to protect critical assets in power systems. The radio-frequency emission of these events can be measured with antennas even when the equipment is in service, which reduces maintenance costs dramatically and favours the implementation of condition-based monitoring systems. The drawback of this type of measurement is the difficulty of having a reference signal with which to study the events in a classical phase-resolved partial discharge pattern (PRPD). Therefore, in open-air substations and overhead lines, where interferences from radio and TV broadcasting and mobile communications are important sources of noise and other pulsed interferences from rectifiers or inverters can be present, it is difficult to identify whether there is partial discharge activity or not. This paper proposes a robust method to separate the events captured with the antennas, identify which of them are partial discharges, and localize the piece of equipment that is having problems. The separation is done with power ratio (PR) maps based on the spectral characteristics of the signal, and the identification of the type of event is done by localizing the source with an array of four antennas. Several classical methods to calculate the time differences of arrival (TDOA) of the emission at the antennas have been tested, and the localization is done using particle swarm optimization (PSO) to minimize a distance function.
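The localization step above minimizes a distance function with particle swarm optimization. A minimal PSO sketch is given below, with an assumed four-antenna square layout, a hypothetical source position, and a range-difference misfit standing in for the paper's TDOA-based distance function (range differences are what TDOAs provide, up to the propagation speed):

```python
import numpy as np

rng = np.random.default_rng(1)
antennas = np.array([[0.0, 0.0], [4.0, 0.0], [0.0, 4.0], [4.0, 4.0]])  # assumed layout (m)
true_src = np.array([1.0, 2.5])                                        # hypothetical PD site
meas = np.linalg.norm(antennas - true_src, axis=1)                     # "measured" ranges

def residual(p):
    """Squared misfit between candidate and measured range differences."""
    d = np.linalg.norm(antennas - p, axis=1)
    return np.sum(((d - d[0]) - (meas - meas[0])) ** 2)

# minimal particle swarm: inertia plus cognitive/social pulls toward best positions
n, iters = 30, 200
pos = rng.uniform(0.0, 4.0, size=(n, 2))
vel = np.zeros((n, 2))
pbest, pbest_val = pos.copy(), np.array([residual(p) for p in pos])
g = pbest[np.argmin(pbest_val)].copy()

for _ in range(iters):
    r1, r2 = rng.random((n, 1)), rng.random((n, 1))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (g - pos)
    pos = pos + vel
    vals = np.array([residual(p) for p in pos])
    better = vals < pbest_val
    pbest[better], pbest_val[better] = pos[better], vals[better]
    g = pbest[np.argmin(pbest_val)].copy()
```

The swarm's global best `g` converges toward the point whose range differences match the measurements; the inertia and acceleration constants are common textbook choices, not values from the paper.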
Carabias-Orti, Julio J.; Cobos, Máximo; Vera-Candeas, Pedro; Rodríguez-Serrano, Francisco J.
2013-12-01
Close-microphone techniques are extensively employed in many live music recordings, allowing for interference rejection and reducing the amount of reverberation in the resulting instrument tracks. However, despite the use of directional microphones, the recorded tracks are not completely free from source interference, a problem which is commonly known as microphone leakage. While source separation methods are potentially a solution to this problem, few approaches take into account the huge amount of prior information available in this scenario. In fact, besides the special properties of close-microphone tracks, the knowledge on the number and type of instruments making up the mixture can also be successfully exploited for improved separation performance. In this paper, a nonnegative matrix factorization (NMF) method making use of all the above information is proposed. To this end, a set of instrument models are learnt from a training database and incorporated into a multichannel extension of the NMF algorithm. Several options to initialize the algorithm are suggested, exploring their performance in multiple music tracks and comparing the results to other state-of-the-art approaches.
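The record above builds on nonnegative matrix factorization of spectrogram data. As a minimal sketch of plain NMF with the classic Lee-Seung multiplicative updates (a toy two-source example with an assumed synthetic spectrogram, not the paper's trained multichannel extension):

```python
import numpy as np

rng = np.random.default_rng(0)
# toy magnitude "spectrogram": two instruments with fixed spectral templates
W_true = np.array([[1.0, 0.0], [0.8, 0.1], [0.0, 1.0], [0.1, 0.9]])  # 4 bins x 2 sources
H_true = rng.random((2, 50))                                          # activations over time
V = W_true @ H_true

k = 2                                   # number of instruments, assumed known
W = rng.random((4, k)) + 0.1
H = rng.random((k, 50)) + 0.1
eps = 1e-9
for _ in range(500):                    # Lee-Seung multiplicative updates (Euclidean cost)
    H *= (W.T @ V) / (W.T @ W @ H + eps)
    W *= (V @ H.T) / (W @ H @ H.T + eps)

rel_err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
```

The multiplicative form keeps `W` and `H` nonnegative throughout; the paper's contribution is to constrain `W` with instrument models learnt from training data and to extend the factorization across channels.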
Directory of Open Access Journals (Sweden)
Valeriy Bekmuradov
2014-10-01
Full Text Available Production of biofuel such as ethanol from lignocellulosic biomass is a beneficial way to meet sustainability and energy security goals in the future. The main challenge in bioethanol conversion is the high cost of processing, in which enzymatic hydrolysis and fermentation are the major steps. Among the strategies to lower processing costs is utilizing both the glucose and xylose sugars present in biomass for conversion. An approach featuring enzymatic hydrolysis and fermentation steps, identified as separate hydrolysis and fermentation (SHF), was used in this work. The proposed solution is to use "pre-processing" technologies, including the thermal screw press (TSP) and cellulose-organic-solvent based lignocellulose fractionation (COSLIF) pretreatments. Such treatments were conducted on a widely available feedstock, source separated organic waste (SSO), to liberate all sugars to be used in the fermentation process. Enzymatic hydrolysis was carried out with the addition of a commercially available enzyme, Accellerase 1500. On average, the sugar yield from the TSP and COSLIF pretreatments followed by enzymatic hydrolysis was remarkable at 90%. In this work, the effect of the SSO hydrolysate obtained from COSLIF pretreatment and enzymatic hydrolysis on ethanol yields was evaluated by comparing fermentation results with two different recombinant strains: Zymomonas mobilis 8b and Saccharomyces cerevisiae DA2416. At 48 hours of fermentation, the ethanol yield was equivalent to 0.48 g of ethanol produced per gram of SSO biomass for Z. mobilis 8b and 0.50 g per gram of SSO biomass for S. cerevisiae DA2416. This study provides important insights into ethanol production from source-separated organic (SSO) waste by different strains and becomes a useful tool to facilitate future process optimization for pilot scale facilities.
Fate of pharmaceuticals in full-scale source separated sanitation system.
Butkovskyi, A; Hernandez Leal, L; Rijnaarts, H H M; Zeeman, G
2015-11-15
Removal of 14 pharmaceuticals and 3 of their transformation products was studied in a full-scale source separated sanitation system with separate collection and treatment of black water and grey water. Black water is treated in an up-flow anaerobic sludge blanket (UASB) reactor followed by oxygen-limited autotrophic nitrification-denitrification in a rotating biological contactor and struvite precipitation. Grey water is treated in an aerobic activated sludge process. Concentrations of 10 pharmaceuticals and 2 transformation products in black water ranged from low μg/l to low mg/l. Additionally, 5 pharmaceuticals were also present in grey water in the low μg/l range. Pharmaceutical influent loads were distributed over the two streams; e.g. 70% of the diclofenac load was present in grey water, while the other compounds were predominantly associated with black water. Removal in the UASB reactor fed with black water exceeded 70% for 9 of the 12 detected pharmaceuticals, with only two pharmaceuticals removed by sorption to sludge. Ibuprofen and the transformation product of naproxen, desmethylnaproxen, were removed in the rotating biological contactor. In contrast, only paracetamol removal exceeded 90% in the grey water treatment system, while removal of the other 7 pharmaceuticals was below 40% or even negative. The efficiency of pharmaceutical removal in the source separated sanitation system was compared with removal in conventional sewage treatment plants. Furthermore, effluent concentrations of the black water and grey water treatment systems were compared with predicted no-effect concentrations to assess toxicity of the effluent. Concentrations of diclofenac, ibuprofen and oxazepam in both effluents were higher than predicted no-effect concentrations, indicating the necessity of post-treatment. Ciprofloxacin, metoprolol and propranolol were found in UASB sludge in the μg/g range, while pharmaceutical concentrations in struvite did not exceed the detection limits. PMID:26364222
Bayesian demography 250 years after Bayes.
Bijak, Jakub; Bryant, John
2016-01-01
Bayesian statistics offers an alternative to classical (frequentist) statistics. It is distinguished by its use of probability distributions to describe uncertain quantities, which leads to elegant solutions to many difficult statistical problems. Although Bayesian demography, like Bayesian statistics more generally, is around 250 years old, only recently has it begun to flourish. The aim of this paper is to review the achievements of Bayesian demography, address some misconceptions, and make the case for wider use of Bayesian methods in population studies. We focus on three applications: demographic forecasts, limited data, and highly structured or complex models. The key advantages of Bayesian methods are the ability to integrate information from multiple sources and to describe uncertainty coherently. Bayesian methods also allow for including additional (prior) information next to the data sample. As such, Bayesian approaches are complementary to many traditional methods, which can be productively re-expressed in Bayesian terms. PMID:26902889
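A key advantage named in the abstract above is the ability to integrate information from multiple sources while describing uncertainty coherently. A minimal, exact illustration (with hypothetical counts, not data from the paper) is conjugate Beta-Binomial updating, where two independent data sources observing the same proportion are simply folded into one posterior:

```python
from fractions import Fraction

# uniform Beta(1, 1) prior over a proportion, updated with two
# hypothetical data sources observing the same quantity
alpha, beta = Fraction(1), Fraction(1)
for successes, trials in [(30, 100), (12, 50)]:
    alpha += successes
    beta += trials - successes

posterior_mean = alpha / (alpha + beta)   # Fraction(43, 152)
```

The prior carries any additional information next to the data sample; here it is deliberately uninformative, and each source contributes to the posterior in proportion to its sample size.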
Welesameil, Mengstab Tilahun
2012-01-01
Filtration performance of three different non-woven geotextiles (i.e. polypropylene and jute wool) treating highly concentrated source-separated black wastewater influent was evaluated at laboratory scale, with the aim of optimizing the treatment process through their use as pretreatment.
Pires, Carlos A. L.; Ribeiro, Andreia F. S.
2016-04-01
We develop an expansion of space-distributed time series into statistically independent uncorrelated subspaces (statistical sources) of low dimension, exhibiting enhanced non-Gaussian probability distributions with geometrically simple chosen shapes (projection pursuit rationale). The method relies upon a generalization of principal component analysis, which is optimal for Gaussian mixed signals, and of independent component analysis (ICA), which is optimized to split non-Gaussian scalar sources. The proposed method, supported by information theory concepts and methods, is independent subspace analysis (ISA), which looks for multi-dimensional, intrinsically synergetic subspaces such as dyads (2D) and triads (3D) that are not separable by ICA. Basically, we optimize rotated variables maximizing certain nonlinear correlations (contrast functions) coming from the non-Gaussianity of the joint distribution. As a by-product, it provides nonlinear variable changes 'unfolding' the subspaces into nearly Gaussian scalars that are easier to post-process. Moreover, the new variables still work as nonlinear data exploratory indices of the non-Gaussian variability of the analysed climatic and geophysical fields. The method (ISA, followed by nonlinear unfolding) is tested on three datasets. The first one comes from the Lorenz'63 three-dimensional chaotic model, showing a clear separation into a non-Gaussian dyad plus an independent scalar. The second one is a mixture of propagating waves of random correlated phases in which the emergence of triadic wave resonances imprints a statistical signature in terms of a non-Gaussian non-separable triad. Finally, the method is applied to the monthly variability of a high-dimensional quasi-geostrophic (QG) atmospheric model of the Northern Hemispheric winter. We find that quite enhanced non-Gaussian dyads of parabolic shape perform much better than the unrotated variables as concerns the separation of the four model centroid regimes
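The pipeline described above (PCA whitening followed by rotation optimizing a non-Gaussianity contrast function) can be sketched in a scalar-source toy case. The example below, with assumed uniform sources and an assumed mixing matrix, whitens a two-channel mixture and then grid-searches a rotation angle minimizing a kurtosis contrast, the ICA step that the paper generalizes to subspaces:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 5000
S = rng.uniform(-1.0, 1.0, size=(2, n))         # independent non-Gaussian (uniform) sources
A = np.array([[1.0, 0.6], [0.4, 1.0]])          # assumed mixing of the "fields"
X = A @ S

# step 1: PCA whitening (optimal for the Gaussian part of the variability)
Xc = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(Xc @ Xc.T / n)
Z = np.diag(d ** -0.5) @ E.T @ Xc

# step 2: projection pursuit with a kurtosis contrast function:
# rotate the whitened pair to the most sub-Gaussian direction
def kurt(u):
    return np.mean(u ** 4) - 3.0 * np.mean(u ** 2) ** 2

angles = np.linspace(0.0, np.pi, 361)
best = min(angles, key=lambda a: kurt(np.cos(a) * Z[0] + np.sin(a) * Z[1]))
u = np.cos(best) * Z[0] + np.sin(best) * Z[1]   # one recovered source (up to sign/scale)
```

Uniform sources are sub-Gaussian (negative excess kurtosis), so minimizing the contrast aligns the projection with one original source; ISA replaces this scalar contrast with contrasts over multi-dimensional subspaces.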
Instantaneous and Frequency-Warped Signal Processing Techniques for Auditory Source Separation.
Wang, Avery Li-Chun
This thesis summarizes several contributions to the areas of signal processing and auditory source separation. The philosophy of Frequency-Warped Signal Processing is introduced as a means for separating the AM and FM contributions to the bandwidth of a complex-valued, frequency-varying sinusoid p(n), transforming it into a signal with slowly-varying parameters. This transformation facilitates the removal of p(n) from an additive mixture while minimizing the amount of damage done to other signal components. The average winding rate of a complex-valued phasor is explored as an estimate of the instantaneous frequency. Theorems are provided showing the robustness of this measure. To implement frequency tracking, a Frequency-Locked Loop algorithm is introduced which uses the complex winding error to update its frequency estimate. The input signal is dynamically demodulated and filtered to extract the envelope. This envelope may then be remodulated to reconstruct the target partial, which may be subtracted from the original signal mixture to yield a new, quickly-adapting form of notch filtering. Enhancements to the basic tracker are made which, under certain conditions, attain the Cramer-Rao bound for the instantaneous frequency estimate. To improve tracking, the novel idea of Harmonic-Locked Loop tracking, using N harmonically constrained trackers, is introduced for tracking signals such as voices and certain musical instruments. The estimated fundamental frequency is computed from a maximum-likelihood weighting of the N tracking estimates, making it highly robust. The result is that harmonic signals, such as voices, can be isolated from complex mixtures in the presence of other spectrally overlapping signals. Additionally, since phase information is preserved, the resynthesized harmonic signals may be removed from the original mixtures with relatively little damage to the residual signal. Finally, a new methodology is given for designing linear-phase FIR filters
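The "average winding rate" idea above has a compact numerical form: the mean per-sample phase increment of a complex phasor directly estimates its instantaneous frequency. The sketch below demonstrates this on an assumed synthetic 440 Hz phasor (a toy check, not the thesis' tracker):

```python
import numpy as np

fs = 8000.0
t = np.arange(2048) / fs
f0 = 440.0
p = np.exp(2j * np.pi * f0 * t)            # complex-valued phasor at 440 Hz

# average winding rate: mean per-sample phase increment of the phasor,
# computed via the angle of p[n] * conj(p[n-1]) so wrapping is handled
dphi = np.angle(p[1:] * np.conj(p[:-1]))
f_est = float(np.mean(dphi) * fs / (2.0 * np.pi))
```

In a Frequency-Locked Loop, this winding measure would drive the frequency-estimate update each step rather than being averaged over the whole record.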
Optimizing source detector separation for an implantable perfusion and oxygenation sensor
Akl, T. J.; King, T. J.; Long, R.; Baba, J. S.; McShane, M. J.; Ericson, M. N.; Wilson, M. A.; Coté, G. L.
2011-03-01
Each year thousands of patients are added to the waiting list for liver transplants. The first 7-10 days after transplant have proven to be the most critical in patient recovery and it is hypothesized that monitoring organ vital signals in this period can increase patient and graft survival rates. An implantable sensor to monitor the organ perfusion and oxygenation signals following surgery is being developed by our group. The sensor operates based on measuring diffuse reflection from three light emitting diodes (735, 805 and 940 nm). In this work the optimal source detector spacing to maximize oxygenation signal level is investigated for a portal vein model. Monte Carlo simulations provided signal levels and corresponding penetration depths as a function of separation between a point optical source and detector. The modeling results indicated a rapid decay in the optical signal with increasing distance. Through further analysis, it was found that there exists an optimal range of point source to detector spacing, between roughly 1 and 2 mm, in which the blood signal from the simulated portal vein was maximized. Overall, these results are being used to guide the placement and configuration of our probe for in vivo animal studies.
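The Monte Carlo analysis described above tracks photons through a scattering medium and tallies those re-emerging at various source-detector separations. A heavily simplified random-walk sketch is shown below; the optical coefficients, isotropic scattering, and 1 mm radial bins are all assumptions for illustration, not the authors' tissue model:

```python
import math
import random

random.seed(42)
mu_s, mu_a = 10.0, 0.1      # assumed scattering/absorption coefficients (mm^-1)
n_photons = 20000
bins = [0.0] * 4            # detected photon weight per 1 mm radial bin

for _ in range(n_photons):
    x = y = z = 0.0
    ux, uy, uz = 0.0, 0.0, 1.0                   # launch straight into the tissue
    w = 1.0
    for _ in range(1000):
        s = -math.log(random.random()) / mu_s    # exponential free path
        x, y, z = x + ux * s, y + uy * s, z + uz * s
        if z < 0.0:                              # photon re-emerges at the surface
            r = math.hypot(x, y)                 # source-detector separation
            if r < 4.0:
                bins[int(r)] += w
            break
        w *= math.exp(-mu_a * s)                 # absorption along the step
        uz = 2.0 * random.random() - 1.0         # isotropic rescattering
        phi = 2.0 * math.pi * random.random()    # (real tissue is forward-peaked)
        sint = math.sqrt(1.0 - uz * uz)
        ux, uy = sint * math.cos(phi), sint * math.sin(phi)
```

Even this toy model reproduces the paper's qualitative finding: the detected signal decays rapidly with increasing source-detector separation, which is why an optimal spacing window exists once penetration depth is traded against signal level.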
The Doppler Effect based acoustic source separation for a wayside train bearing monitoring system
Zhang, Haibin; Zhang, Shangbin; He, Qingbo; Kong, Fanrang
2016-01-01
Wayside acoustic condition monitoring and fault diagnosis for train bearings depend on acquired acoustic signals, which consist of mixed signals from different train bearings with obvious Doppler distortion as well as background noises. This study proposes a novel scheme to overcome the difficulties, especially the multi-source problem in wayside acoustic diagnosis system. In the method, a time-frequency data fusion (TFDF) strategy is applied to weaken the Heisenberg's uncertainty limit for a signal's time-frequency distribution (TFD) of high resolution. Due to the Doppler Effect, the signals from different bearings have different time centers even with the same frequency. A Doppler feature matching search (DFMS) algorithm is then put forward to locate the time centers of different bearings in the TFD spectrogram. With the determined time centers, time-frequency filters (TFF) are designed with thresholds to separate the acoustic signals in the time-frequency domain. Then the inverse STFT (ISTFT) is taken and the signals are recovered and filtered aiming at each sound source. Subsequently, a dynamical resampling method is utilized to remove the Doppler Effect. Finally, accurate diagnosis for train bearing faults can be achieved by applying conventional spectrum analysis techniques to the resampled data. The performance of the proposed method is verified by both simulated and experimental cases. It shows that it is effective to detect and diagnose multiple defective bearings even though they produce multi-source acoustic signals.
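The final step of the scheme above, dynamical resampling to remove the Doppler Effect, can be illustrated on a synthetic pass-by. In the sketch below (assumed speeds, geometry, and tone frequency; not the authors' TFDF/DFMS pipeline), the received signal is interpolated onto a uniform emission-time grid, restoring a constant-frequency tone:

```python
import numpy as np

fs, c, v, f0 = 44100, 343.0, 30.0, 1000.0   # sample rate, sound speed (m/s), train speed, tone
d = 5.0                                      # assumed closest-approach distance (m)
t = np.arange(int(0.8 * fs)) / fs - 0.4      # reception times around the passage

# emission time tau satisfies t = tau + r(tau)/c; solve by fixed-point iteration
tau = t.copy()
for _ in range(20):
    tau = t - np.hypot(v * tau, d) / c

received = np.sin(2.0 * np.pi * f0 * tau)    # Doppler-distorted tone at the wayside mic

# dynamical resampling: interpolate onto a uniform emission-time grid,
# which undoes the distortion and makes conventional spectrum analysis valid
tau_u = np.linspace(tau[0], tau[-1], len(tau))
restored = np.interp(tau_u, tau, received)
spec = np.abs(np.fft.rfft(restored))
peak = np.fft.rfftfreq(len(restored), tau_u[1] - tau_u[0])[int(np.argmax(spec))]
```

After resampling, the spectral peak returns to the emitted frequency, which is the precondition for applying standard bearing-fault spectrum analysis to the recovered per-source signals.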
Tungkasthan, Anucha; Jongsawat, Nipat; Poompuang, Pittaya; Intarasema, Sarayut; Premchaiswadi, Wichian
2010-01-01
This paper presented a practical framework for automating the building of diagnostic BN models from data sources obtained from the WWW and demonstrates the use of a SMILE web-based interface to represent them. The framework consists of the following components: RSS agent, transformation/conversion tool, core reasoning engine, and the SMILE web-based interface. The RSS agent automatically collects and reads the provided RSS feeds according to the agent's predefined URLs. A transformation/conve...
Ichimaru, Satoshi; Hatayama, Masatoshi; Ohchi, Tadayuki; Gullikson, Eric M; Oku, Satoshi
2016-02-10
We describe the design and fabrication of a ruthenium beam separator used to simultaneously attenuate infrared light and reflect soft x-rays. Measurements in the infrared and soft x-ray regions showed the beam separator to have a reflectivity of 50%-85% in the wavelength region from 6 to 10 nm at a grazing incidence angle of 7.5 deg, and a reflectivity of 4.3% at 800 nm at the same grazing incidence angle, indicating that the amount of attenuation is 0.05-0.09. These results show that this beam separator could provide an effective means of separating IR light from soft x-rays in light generated by high-order harmonic generation sources. PMID:26906363
Sukholthaman, Pitchayanin; Sharp, Alice
2016-06-01
Municipal solid waste has been considered one of the most immediate and serious problems confronting urban governments in most developing and transitional economies. Solid waste service performance depends highly on the effectiveness of the waste collection and transportation process. Generally, this process involves a large amount of expenditure and has very complex and dynamic operational problems. Source separation has a major impact on the effectiveness of the waste management system, as it causes significant changes in the quantity and quality of waste reaching final disposal. To evaluate the impact of effective source separation on waste collection and transportation, this study adopts a decision support tool to comprehend cause-and-effect interactions of different variables in the waste management system. A system dynamics model that envisages the relationships between source separation and the effectiveness of waste management in Bangkok, Thailand is presented. Influential factors that affect waste separation attitudes are addressed, and the result of a change in perception on waste separation is explained. The impacts of different separation rates on the effectiveness of the provided collection service are compared in six scenarios. 'Scenario 5' gives the most promising opportunities, as 40% of residents are willing to conduct organic and recyclable waste separation. The results show that better waste collection and transportation service, lower monthly expense, extended landfill life, and a satisfactory efficiency of the provided service at 60.48% will be achieved at the end of the simulation period. Implications for how to get the public involved in source separation are proposed. PMID:27026497
DEFF Research Database (Denmark)
Oh, Geok Lian; Brunskog, Jonas
2014-01-01
Techniques have been studied for the localization of an underground source with seismic interrogation signals. Much of the work has involved defining either a P-wave acoustic model or a dispersive surface wave model to the received signal and applying the time-delay processing technique and frequency-wavenumber processing to determine the location of the underground tunnel. Considering the case of determining the location of an underground tunnel, this paper proposed two physical models, the acoustic approximation ray tracing model and the finite difference time domain three-dimensional (3D......
Time-Domain Convolutive Blind Source Separation Employing Selective-Tap Adaptive Algorithms
Directory of Open Access Journals (Sweden)
Pan Qiongfeng
2007-01-01
Full Text Available We investigate novel algorithms to improve the convergence and reduce the complexity of time-domain convolutive blind source separation (BSS) algorithms. First, we propose the MMax partial update time-domain convolutive BSS (MMax BSS) algorithm. We demonstrate that the partial update scheme applied in the MMax LMS algorithm for a single channel can be extended to multichannel time-domain convolutive BSS with little deterioration in performance and a possible computational complexity saving. Next, we propose an exclusive maximum selective-tap time-domain convolutive BSS algorithm (XM BSS) that reduces the interchannel coherence of the tap-input vectors and improves the conditioning of the autocorrelation matrix, resulting in an improved convergence rate and reduced misalignment. Moreover, the computational complexity is reduced since only half of the tap inputs are selected for updating. Simulation results have shown a significant improvement in convergence rate compared to existing techniques.
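The MMax partial-update idea referenced above is easiest to see in its single-channel LMS form: at each step, only the M filter taps whose inputs have the largest magnitude are updated. The sketch below identifies an assumed unknown system with such a partial-update adaptive filter (toy parameters, not the paper's convolutive BSS setting):

```python
import numpy as np

rng = np.random.default_rng(3)
L, M, mu, n = 16, 8, 0.05, 4000               # filter length, taps updated/step, step, samples
h_true = rng.standard_normal(L) * np.exp(-0.4 * np.arange(L))  # assumed unknown system
x = rng.standard_normal(n)                    # white tap-input signal
d = np.convolve(x, h_true)[:n]                # desired signal (no measurement noise)

w = np.zeros(L)
for k in range(L, n):
    u = x[k - L + 1:k + 1][::-1]              # tap-input vector, most recent first
    e = d[k] - w @ u
    sel = np.argsort(np.abs(u))[-M:]          # MMax: pick the M largest-magnitude inputs
    w[sel] += mu * e * u[sel]                 # update only the selected taps

misalignment = np.linalg.norm(w - h_true) / np.linalg.norm(h_true)
```

Updating half the taps roughly halves the per-step update cost while, as the abstract notes for the multichannel case, costing little in steady-state performance.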
Non-contact time-resolved diffuse reflectance imaging at null source-detector separation.
Mazurenka, M; Jelzow, A; Wabnitz, H; Contini, D; Spinelli, L; Pifferi, A; Cubeddu, R; Mora, A Dalla; Tosi, A; Zappa, F; Macdonald, R
2012-01-01
We report results of the proof-of-principle tests of a novel non-contact tissue imaging system. The system utilizes a quasi-null source-detector separation approach for time-domain near-infrared spectroscopy, taking advantage of an innovative state-of-the-art fast-gated single photon counting detector. Measurements on phantoms demonstrate the feasibility of the non-contact approach for the detection of optically absorbing perturbations buried up to a few centimeters beneath the surface of a tissue-like turbid medium. The measured depth sensitivity and spatial resolution of the new system are close to the values predicted by Monte Carlo simulations for the inhomogeneous medium and an ideal fast-gated detector, thus proving the feasibility of the non-contact approach for high density diffuse reflectance measurements on tissue. Potential applications of the system are also discussed. PMID:22274351
Designing of a lead ion model source for plasma separation of spent nuclear fuel
Antonov, N. N.; Vorona, N. A.; Gavrikov, A. V.; Samokhin, A. A.; Smirnov, V. P.
2016-02-01
Plasma sources of model substances are required for solving problems associated with the development of a plasma separation method for spent nuclear fuel (SNF). Lead is chosen as the substance simulating the kinetics and dynamics of the heavy SNF component. We report on the results of analysis of the discharge in lead vapor with a concentration of 10(12)-10(13) cm(-3). Ionization is produced by an electron beam (with electron energy up to 500 eV) in the centimeter gap between planar electrodes. The discharge is simulated using the hydrodynamic and one-particle approximations. The current-voltage characteristics and efficiencies of single ionization depending on the vapor concentrations and thermoelectron current are obtained. The experimentally determined ion currents on the order of 100 μA for an ionization efficiency on the order of 0.1% are in conformity with the results of simulation.
Saline sewage treatment and source separation of urine for more sustainable urban water management.
Ekama, G A; Wilsenach, J A; Chen, G H
2011-01-01
While energy consumption and its associated carbon emission should be minimized in wastewater treatment, it has a much lower priority than human and environmental health, which are both closely related to efficient water quality management. Conservation of surface water quality and quantity is thus more important for sustainable development than greenhouse gas (GHG) emissions per se. In this paper, two urban water management strategies to conserve fresh water quality and quantity are considered: (1) source separation of urine for improved water quality and (2) saline (e.g. sea) water toilet flushing for reduced fresh water consumption in coastal and mining cities. The former holds promise for simpler and shorter sludge age activated sludge wastewater treatment plants (no nitrification and denitrification), nutrient (Mg, K, P) recovery and improved effluent quality (reduced endocrine disruptor and environmental oestrogen concentrations), and the latter for significantly reduced fresh water consumption, sludge production and oxygen demand (through using anaerobic bioprocesses) and hence energy consumption. Combining source separation of urine and saline water toilet flushing can reduce sewer crown corrosion and effluent P concentrations. To realize the advantages of these two approaches will require significant urban water management changes, in that both need dual (fresh and saline) water distribution and (yellow and grey/brown) wastewater collection systems. While considerable work is still required to evaluate these new approaches and quantify their advantages and disadvantages, it would appear that the investment for dual water distribution and wastewater collection systems may be worth making to unlock their benefits for more sustainable urban development. PMID:22214085
System identification through nonstationary data using Time-Frequency Blind Source Separation
Guo, Yanlin; Kareem, Ahsan
2016-06-01
Classical output-only system identification (SI) methods are based on the assumption of stationarity of the system response. However, measured response of buildings and bridges is usually non-stationary due to strong winds (e.g. typhoon, and thunder storm etc.), earthquakes and time-varying vehicle motions. Accordingly, the response data may have time-varying frequency contents and/or overlapping of modal frequencies due to non-stationary colored excitation. This renders traditional methods problematic for modal separation and identification. To address these challenges, a new SI technique based on Time-Frequency Blind Source Separation (TFBSS) is proposed. By selectively utilizing "effective" information in local regions of the time-frequency plane, where only one mode contributes to energy, the proposed technique can successfully identify mode shapes and recover modal responses from the non-stationary response where the traditional SI methods often encounter difficulties. This technique can also handle response with closely spaced modes which is a well-known challenge for the identification of large-scale structures. Based on the separated modal responses, frequency and damping can be easily identified using SI methods based on a single degree of freedom (SDOF) system. In addition to the exclusive advantage of handling non-stationary data and closely spaced modes, the proposed technique also benefits from the absence of the end effects and low sensitivity to noise in modal separation. The efficacy of the proposed technique is demonstrated using several simulation based studies, and compared to the popular Second-Order Blind Identification (SOBI) scheme. It is also noted that even some non-stationary response data can be analyzed by the stationary method SOBI. This paper also delineates non-stationary cases where SOBI and the proposed scheme perform comparably and highlights cases where the proposed approach is more advantageous. Finally, the performance of the
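Both the proposed TFBSS scheme and the SOBI baseline above exploit second-order temporal structure to separate modal responses. A compact relative of SOBI, the AMUSE algorithm, is sketched below on an assumed two-mode synthetic response (whitening followed by eigendecomposition of one symmetrized time-lagged covariance; SOBI jointly diagonalizes several lags):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 20000
t = np.arange(n)
# two "modal responses" with distinct spectral content, plus sensor noise
s1 = np.sin(0.05 * t) + 0.1 * rng.standard_normal(n)
s2 = np.sin(0.013 * t + 1.0) + 0.1 * rng.standard_normal(n)
S = np.vstack([s1, s2])
A = np.array([[1.0, 0.7], [0.5, 1.0]])   # assumed mode-shape mixing matrix
X = A @ S                                 # measured multichannel response

# AMUSE: whiten, then eigendecompose a single symmetrized lagged covariance
Xc = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(Xc @ Xc.T / n)
Z = np.diag(d ** -0.5) @ E.T @ Xc
lag = 5
C = Z[:, :-lag] @ Z[:, lag:].T / (n - lag)
_, V = np.linalg.eigh((C + C.T) / 2.0)
Y = V.T @ Z                               # recovered modal responses (up to order/sign/scale)
```

Once modal responses are separated this way, frequency and damping can be identified per mode with SDOF methods, as the abstract describes; the paper's TFBSS variant replaces the global lagged covariances with covariances restricted to single-mode time-frequency regions to handle non-stationary data.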
Directory of Open Access Journals (Sweden)
Hongyun Han
2016-07-01
Full Text Available This paper examines how and to what degree government policies of garbage fees and voluntary source separation programs, with free indoor containers and garbage bags, can affect the effectiveness of municipal solid waste (MSW) management, in the sense of achieving a desirable reduction of per capita MSW generation. Based on city-level panel data for the years 1998-2012 in China, our empirical analysis indicates that per capita MSW generated increases with per capita disposable income, average household size, education levels of households, and the lagged per capita MSW. While both garbage fees and source separation programs have separately led to reductions in per capita waste generation, the interaction of the two policies has resulted in an increase in per capita waste generation due to the following crowding-out effects: Firstly, the positive effect of income dominates the negative effect of the garbage fee. Secondly, there are crowding-out effects between the mandatory charging system and the subsidized voluntary source separation on per capita MSW generation. Thirdly, small subsidies and tax punishments have reduced the intrinsic motivation for voluntary source separation of MSW. Thus, a compatible fee charging system, higher levels of subsidies, and well-designed public information and education campaigns are required to promote household waste source separation and reduction.
Jaatinen, Sanna; Kivistö, Anniina; Palmroth, Marja R T; Karp, Matti
2016-09-01
The objective was to demonstrate that a microbial whole-cell biosensor, the bioluminescent yeast Saccharomyces cerevisiae (BMAEREluc/ERα), can be applied to detect overall estrogenic activity in fresh and stored human urine. The use of source-separated urine in agriculture removes a human-originated estrogen source from wastewater influents, subsequently enabling nutrient recycling. Estrogenic activity in urine should be diminished prior to its agricultural use in order to prevent migration to soil. A storage period of 6 months is required for hygienic reasons; therefore, monitoring of estrogenic activity is of interest. The method measured cumulative female hormone-like activity. Calibration curves were prepared for estrone, 17β-estradiol, 17α-ethinylestradiol and estriol. Estrogen concentrations of 0.29-29,640 μg L(-1) were detectable, while the limit of detection corresponded to 0.28-35 μg L(-1) of estrogens. The yeast sensor responded well to fresh and stored urine and gave high signals corresponding to 0.38-3,804 μg L(-1) of estrogens in different urine samples. Estrogenic activity decreased during storage, but was still higher than in fresh urine, implying insufficient storage length. The biosensor was suitable for monitoring hormonal activity in urine and can be used in screening anthropogenic estrogen-like compounds interacting with the receptor.
Detection of sudden structural damage using blind source separation and time-frequency approaches
Morovati, V.; Kazemi, M. T.
2016-05-01
Seismic signal processing is one of the most reliable methods of detecting the structural damage during earthquakes. In this paper, the use of the hybrid method of blind source separation (BSS) and time-frequency analysis (TFA) is explored to detect the changes in the structural response data. The combination of the BSS and TFA is applied to the seismic signals due to the non-stationary nature of them. Firstly, the second-order blind identification technique is used to decompose the response signal of structural vibration into modal coordinate signals which will be mono-components for TFA. Then each mono-component signal is analyzed to extract instantaneous frequency of structure. Numerical simulations and a real-world seismic-excited structure with time-varying frequencies show the accuracy and robustness of the developed algorithm. TFA of extracted sources shows that used method can be successfully applied to structural damage detection. The results also demonstrate that the combined method can be used to identify the time instant of structural damage occurrence more sharply and effectively than by the use of TFA alone.
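The TFA step described above, extracting the instantaneous frequency of a mono-component signal, can be sketched with the Hilbert transform; the synthetic frequency drop standing in for damage is an assumption of this example, not data from the paper:

```python
import numpy as np
from scipy.signal import hilbert

fs = 1000.0
t = np.arange(0, 2.0, 1 / fs)
# Mono-component signal whose frequency drops from 10 Hz to 8 Hz at t = 1 s,
# a stand-in for a modal coordinate affected by sudden damage.
f_true = np.where(t < 1.0, 10.0, 8.0)
x = np.cos(2 * np.pi * np.cumsum(f_true) / fs)

# Instantaneous frequency from the analytic signal (Hilbert transform).
inst_phase = np.unwrap(np.angle(hilbert(x)))
inst_freq = np.diff(inst_phase) * fs / (2 * np.pi)

# The abrupt drop in instantaneous frequency marks the damage instant.
print(np.median(inst_freq[200:800]))    # ≈ 10 Hz before the jump
print(np.median(inst_freq[1200:1800]))  # ≈ 8 Hz after it
```

In the paper this step is applied to each mono-component signal produced by the second-order blind identification stage.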
The source regions of solar energetic particles detected by widely separated spacecraft
Energy Technology Data Exchange (ETDEWEB)
Park, Jinhye; Moon, Y.-J. [School of Space Research, Kyung Hee University, Yongin 446-701 (Korea, Republic of); Innes, D. E.; Bucik, R., E-mail: jinhye@khu.ac.kr [Max Planck Institute for Solar System Research, D-37191 Katlenburg-Lindau (Germany)
2013-12-20
We studied the source regions of 12 solar energetic particle (SEP) events seen between 2010 August and 2012 January at STEREO-A, B, and/or Earth (Advanced Composition Explorer/Solar and Heliospheric Observatory/GOES), when the two STEREO spacecraft were separated by about 180°. All events were associated with flares (C1 to X6) and fast coronal mass ejections and, except for one, accompanied by type II radio bursts. We have determined the arrival times of the SEPs at the three positions. Extreme ultraviolet (EUV) waves, observed in the 195 Å and 193 Å channels of STEREO and the Solar Dynamics Observatory, are tracked across the Sun to determine their arrival time at the photospheric source of open field lines connecting to the spacecraft. There is a good correlation between the EUV wave arrival times at the connecting footpoints and the SEP onset times. The delay time between electron onset and the EUV wave reaching the connecting footpoint is independent of distance from the flare site. The proton delay time increases with distance from the flare site. In three of the events, secondary flare sites may have also contributed to the wide longitudinal spread of SEPs.
Introduction to Bayesian statistics
Bolstad, William M
2016-01-01
There is a strong upsurge in the use of Bayesian methods in applied statistical analysis, yet most introductory statistics texts present only frequentist methods. Bayesian statistics has many important advantages that students should learn about if they are going into fields where statistics will be used. In this Third Edition, four newly added chapters address topics that reflect the rapid advances in the field of Bayesian statistics. The author continues to provide a Bayesian treatment of introductory statistical topics, such as scientific data gathering, discrete random variables, robust Bayesian methods, and Bayesian approaches to inference for discrete random variables, binomial proportion, Poisson, normal mean, and simple linear regression. In addition, newly developing topics in the field are presented in four new chapters: Bayesian inference with unknown mean and variance; Bayesian inference for a Multivariate Normal mean vector; Bayesian inference for the Multiple Linear Regression Model; and Computati...
Wright, L.; Coddington, O.; Pilewskie, P.
2015-12-01
Current challenges in Earth remote sensing require improved instrument spectral resolution, spectral coverage, and radiometric accuracy. Hyperspectral instruments, deployed on both aircraft and spacecraft, are a growing class of Earth observing sensors designed to meet these challenges. They collect large amounts of spectral data, allowing thorough characterization of both atmospheric and surface properties. The higher accuracy and increased spectral and spatial resolutions of new imagers require new numerical approaches for processing imagery and separating surface and atmospheric signals. One potential approach is source separation, which allows us to determine the underlying physical causes of observed changes. Improved signal separation will allow hyperspectral instruments to better address key science questions relevant to climate change, including land-use changes, trends in clouds and atmospheric water vapor, and aerosol characteristics. In this work, we investigate a Non-negative Matrix Factorization (NMF) method for the separation of atmospheric and land surface signal sources. NMF offers marked benefits over other commonly employed techniques, including non-negativity, which avoids physically impossible results, and adaptability, which allows the method to be tailored to hyperspectral source separation. We adapt our NMF algorithm to distinguish between contributions from physically distinct sources by introducing constraints on spectral and spatial variability and by using library spectra to inform the separation. We evaluate our NMF algorithm with simulated hyperspectral images as well as hyperspectral imagery from several instruments, including the NASA Airborne Visible/Infrared Imaging Spectrometer (AVIRIS), the NASA Hyperspectral Imager for the Coastal Ocean (HICO) and the National Ecological Observatory Network (NEON) Imaging Spectrometer.
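A minimal sketch of the underlying factorization (plain Lee-Seung multiplicative updates on a synthetic non-negative image, without the paper's spectral/spatial constraints or library spectra):

```python
import numpy as np

rng = np.random.default_rng(0)
n_pixels, n_bands, n_sources = 200, 50, 3

# Synthetic non-negative "image": abundance maps times endmember spectra.
spectra = rng.random((n_sources, n_bands))
abundance = rng.random((n_pixels, n_sources))
X = abundance @ spectra                      # pixels x bands

# Lee-Seung multiplicative updates for X ~ W @ H with W, H >= 0.
W = rng.random((n_pixels, n_sources))
H = rng.random((n_sources, n_bands))
eps = 1e-12
for _ in range(500):
    H *= (W.T @ X) / (W.T @ W @ H + eps)
    W *= (X @ H.T) / (W @ H @ H.T + eps)

rel_err = np.linalg.norm(X - W @ H) / np.linalg.norm(X)
print(rel_err)  # small: a rank-3 non-negative image is well recovered
```

The multiplicative form guarantees W and H stay non-negative, which is the "physically impossible results avoided" property the abstract refers to.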
Bayesian artificial intelligence
Korb, Kevin B
2003-01-01
As the power of Bayesian techniques has become more fully realized, the field of artificial intelligence has embraced Bayesian methodology and integrated it to the point where an introduction to Bayesian techniques is now a core course in many computer science programs. Unlike other books on the subject, Bayesian Artificial Intelligence keeps mathematical detail to a minimum and covers a broad range of topics. The authors integrate Bayesian network technology and the learning of Bayesian networks, and apply both to knowledge engineering. They emphasize understanding and intuition but also provide the algorithms and technical background needed for applications. Software, exercises, and solutions are available on the authors' website.
Bayesian artificial intelligence
Korb, Kevin B
2010-01-01
Updated and expanded, Bayesian Artificial Intelligence, Second Edition provides a practical and accessible introduction to the main concepts, foundation, and applications of Bayesian networks. It focuses on both the causal discovery of networks and Bayesian inference procedures. Adopting a causal interpretation of Bayesian networks, the authors discuss the use of Bayesian networks for causal modeling. They also draw on their own applied research to illustrate various applications of the technology.New to the Second EditionNew chapter on Bayesian network classifiersNew section on object-oriente
Bayesian Stable Isotope Mixing Models
Parnell, Andrew C.; Phillips, Donald L.; Bearhop, Stuart; Semmens, Brice X.; Ward, Eric J.; Moore, Jonathan W.; Jackson, Andrew L.; Inger, Richard
2012-01-01
In this paper we review recent advances in Stable Isotope Mixing Models (SIMMs) and place them into an over-arching Bayesian statistical framework which allows for several useful extensions. SIMMs are used to quantify the proportional contributions of various sources to a mixture. The most widely used application is quantifying the diet of organisms based on the food sources they have been observed to consume. At the centre of the multivariate statistical model we propose is a compositional m...
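At the core of a SIMM is a linear mixing equation: the mixture signature is a proportion-weighted average of the source signatures. With d isotopes and d+1 sources the deterministic version has a unique solution, which the Bayesian treatment then wraps with priors and residual uncertainty. A sketch with invented source signatures:

```python
import numpy as np

# Hypothetical source signatures: rows = sources, columns = (d13C, d15N).
sources = np.array([[-28.0,  4.0],   # e.g. terrestrial plants
                    [-20.0,  8.0],   # e.g. aquatic algae
                    [-14.0, 12.0]])  # e.g. marine input
p_true = np.array([0.5, 0.3, 0.2])   # true diet proportions
mixture = p_true @ sources           # observed consumer signature

# With 2 isotopes and 3 sources, adding the sum-to-one constraint gives a
# square linear system: [signatures^T; 1 1 1] p = [mixture; 1].
A = np.vstack([sources.T, np.ones(3)])
b = np.append(mixture, 1.0)
p_hat = np.linalg.solve(A, b)
print(np.round(p_hat, 3))  # → [0.5 0.3 0.2]
```

Real data have more sources than isotopes, trophic fractionation, and measurement error, which is exactly why the Bayesian framework reviewed here replaces this exact solve with posterior distributions over the proportions.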
Feature Selection and Blind Source Separation in an EEG-Based Brain-Computer Interface
Directory of Open Access Journals (Sweden)
Michael H. Thaut
2005-11-01
Most EEG-based BCI systems make use of well-studied patterns of brain activity. However, those systems involve tasks that indirectly map to simple binary commands such as "yes" or "no" or require many weeks of biofeedback training. We hypothesized that signal processing and machine learning methods can be used to discriminate EEG in a direct "yes"/"no" BCI from a single session. Blind source separation (BSS) and spectral transformations of the EEG produced a 180-dimensional feature space. We used a modified genetic algorithm (GA) wrapped around a support vector machine (SVM) classifier to search the space of feature subsets. The GA-based search found feature subsets that outperform full feature sets and random feature subsets. Also, BSS transformations of the EEG outperformed the original time series, particularly in conjunction with a subset search of both spaces. The results suggest that BSS and feature selection can be used to improve the performance of even a "direct," single-session BCI.
High acceptance of urine source separation in seven European countries: a review.
Lienert, Judit; Larsen, Tove A
2010-01-15
Urine source separation (NoMix-technology) is a promising innovation aiming at a resource-oriented, decentralized approach in urban water management. However, NoMix-technology has a sensitive end-point: people's bathrooms. NoMix-technology is increasingly applied in European pilot projects, but the success from a user point of view has rarely been systematically monitored. We aim to close this gap. We review surveys on acceptance, including reuse of human urine as fertilizer, from 38 NoMix-projects in 7 Northern and Central European countries with 2700 respondents. Additionally, we identify explanatory variables with logistic regression of a representative Swiss library survey. NoMix-technology is well accepted; around 80% of users liked the idea, 75-85% were satisfied with the design, hygiene, smell, and seating comfort of NoMix-toilets, 85% regarded urine-fertilizers as a good idea (50% of farmers), and 70% would purchase such food. However, 60% of users encountered problems; NoMix-toilets need further development. We found few differences among countries, but systematic differences between public and private settings, where people seem more critical. Information was positively correlated with acceptance and, e.g., a good mood or environmentally friendly behavior. For the future success of NoMix-projects, we recommend that authorities follow an integral strategy. Lay people will then find the NoMix-concept appealing and support this promising bathroom innovation.
Haraldsen, Trond Knapp; Andersen, Uno; Krogstad, Tore; Sørheim, Roald
2011-12-01
This study examined the efficiency of different organic waste materials as NPK fertilizer, in addition to the risk of leaching losses related to shower precipitation in the first part of the growing season. The experiment was conducted as a pot trial on a sandy soil in a greenhouse. Six organic fertilizers were evaluated: liquid anaerobic digestate (LAD) sourced from separated household waste, nitrified liquid anaerobic digestate (NLAD) of the same origin as LAD, meat and bone meal (MBM), hydrolysed salmon protein (HSP), reactor-composted catering waste (CW) and cattle manure (CM). An unfertilized control, calcium nitrate (CN) and Fullgjødsel® 21-4-10 were used as reference fertilizers. At equal amounts of mineral nitrogen, both LAD and Fullgjødsel® gave equal yields of barley, in addition to equal uptake of N, P and K in barley grain. NLAD gave significantly lower barley yield than the original LAD due to leaching of nitrate-N after a simulated surplus of precipitation (28 mm) at Zadoks 14. There was significantly increased leaching of nitrate-N from the treatments receiving 160 kg N ha(-1) of CN and NLAD in comparison with all the other organic fertilizers. In this study LAD performed as well as Fullgjødsel® NPK fertilizer, and it was concluded that LAD can be recommended as a fertilizer for cereals. Nitrification of the ammonium-N in the digestate caused significantly increased nitrate leaching, and cannot be recommended.
Insects associated with the composting process of solid urban waste separated at the source
Directory of Open Access Journals (Sweden)
Gladis Estela Morales
2010-01-01
Sarcosaprophagous macroinvertebrates (earthworms, termites and a number of Diptera larvae) enhance changes in the physical and chemical properties of organic matter during the degradation and stabilization processes of composting, causing a decrease in the molecular weights of compounds. This activity makes these organisms excellent recyclers of organic matter. This article evaluates the succession of insects associated with the decomposition of solid urban waste separated at the source. The study was carried out in the city of Medellin, Colombia. A total of 11,732 individuals were determined, belonging to the classes Insecta and Arachnida. Species of three orders of Insecta were identified: Diptera, Coleoptera and Hymenoptera. Diptera, corresponding to 98.5% of the total, was the most abundant and diverse group, with 16 families (Calliphoridae, Drosophilidae, Psychodidae, Fanniidae, Muscidae, Milichiidae, Ulidiidae, Scatopsidae, Sepsidae, Sphaeroceridae, Heleomyzidae, Stratiomyidae, Syrphidae, Phoridae, Tephritidae and Curtonotidae), followed by Coleoptera with five families (Carabidae, Staphylinidae, Ptiliidae, Hydrophilidae and Phalacridae). Three stages were observed during the composting process, allowing species associated with each stage to be identified; other species were present throughout the whole process. In terms of number of species, Diptera was the most important group observed, particularly Ornidia obesa, considered a highly invasive species, and Hermetia illucens, both reported as beneficial for the decomposition of organic matter.
Directory of Open Access Journals (Sweden)
Farkhan Rosi
2013-09-01
In underwater communication systems, the signal received by a sensor is often a mixture of the source signal and other acoustic signals present in the underwater environment, so the acquired signal differs from the desired one. Blind Source Separation (BSS) is used here to separate these mixed signals. In this final project, acoustic signal separation is performed using Natural Gradient ICA based on a Generalized Gaussian Model, derived from the distribution characteristics of non-Gaussian acoustic sources, namely ship-radiated noise and sea ambient noise. The separation was carried out in three ways: by simulation, with the ICALABS V3 toolbox, and on real measurement data. The simulation results show that separation with the Natural Gradient ICA algorithm based on the Generalized Gaussian Model performs well, as indicated by SIR values of 48.9946 dB for shrimp.wav and 46.9309 dB for ferry.wav, and mean MSE values of 1.2605 x 10-5 for shrimp.wav and 2.0272 x 10-5 for ferry.wav.
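A minimal natural-gradient ICA sketch (a tanh score function as a stand-in for the generalized Gaussian score, Laplacian noise as a stand-in for the recorded sources; the mixing matrix and learning rate are invented):

```python
import numpy as np

rng = np.random.default_rng(5)
N = 20000
S = rng.laplace(size=(2, N))            # super-Gaussian stand-in sources
A = np.array([[1.0, 0.6],
              [0.4, 1.0]])              # invented mixing matrix
X = A @ S                               # mixed "hydrophone" channels

# Natural-gradient ICA: W <- W + lr * (I - E[tanh(y) y^T]) W, with tanh a
# common score for super-Gaussian (Laplacian-like) sources.
W = np.eye(2)
lr = 0.1
for _ in range(300):
    Y = W @ X
    W += lr * (np.eye(2) - np.tanh(Y) @ Y.T / N) @ W

Y = W @ X
# Each recovered component should match one source up to scale/permutation.
C = np.abs(np.corrcoef(np.vstack([Y, S]))[:2, 2:])
print(np.round(C.max(axis=1), 3))  # both close to 1
```

The generalized Gaussian model in the paper adapts the score function to the estimated source distribution instead of fixing it to tanh.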
Powerline noise elimination in biomedical signals via blind source separation and wavelet analysis
Directory of Open Access Journals (Sweden)
Samuel Akwei-Sekyere
2015-07-01
The distortion of biomedical signals by powerline noise from recording biomedical devices has the potential to reduce the quality and convolute the interpretation of the data. Usually, powerline noise in biomedical recordings is suppressed via band-stop filters. However, due to the instability of biomedical signals, the distribution of the signals filtered out may not be centered at 50/60 Hz. As a result, self-correction methods are needed to optimize the performance of these filters. Since powerline noise is additive in nature, it is intuitive to model the powerline noise in a raw recording and subtract it from the raw data in order to obtain a relatively clean signal. This paper proposes a method that utilizes this approach by decomposing the recorded signal and extracting powerline noise via blind source separation and wavelet analysis. The performance of this algorithm was compared with that of a 4th-order band-stop Butterworth filter, empirical mode decomposition, independent component analysis, and a combination of empirical mode decomposition with independent component analysis. The proposed method was able to expel sinusoidal signals within the powerline noise frequency range with higher fidelity than the aforementioned techniques, especially at low signal-to-noise ratio.
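For contrast, the conventional band-stop baseline the paper compares against can be sketched with a standard IIR notch filter (the signal, noise level, and filter settings are illustrative):

```python
import numpy as np
from scipy.signal import iirnotch, filtfilt

fs = 500.0
t = np.arange(0, 4.0, 1 / fs)
clean = np.sin(2 * np.pi * 1.2 * t)              # stand-in slow biomedical rhythm
x = clean + 0.5 * np.sin(2 * np.pi * 50.0 * t)   # plus powerline interference

b, a = iirnotch(w0=50.0, Q=30.0, fs=fs)          # narrow notch at 50 Hz
filtered = filtfilt(b, a, x)                     # zero-phase filtering

# Away from the edges, the 50 Hz component is gone and the slow rhythm survives.
err = np.max(np.abs(filtered[200:-200] - clean[200:-200]))
print(err)
```

The paper's point is that when the interference drifts off 50/60 Hz, a fixed notch like this fails, motivating the adaptive BSS-plus-wavelet subtraction approach.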
Musafere, F.; Sadhu, A.; Liu, K.
2016-01-01
In the last few decades, structural health monitoring (SHM) has become an indispensable subject in the field of vibration engineering. With the aid of modern sensing technology, SHM has garnered significant attention for the diagnosis and risk management of large-scale civil structures and mechanical systems. In SHM, system identification is one of the major building blocks, through which unknown system parameters are extracted from vibration data of the structures. This system information is then utilized to detect the instant of damage and its severity, so as to rehabilitate the structures and prolong their service life. In recent years, blind source separation (BSS) has emerged as an advanced signal processing technique for output-only system identification of civil structures. In this paper, a novel damage detection technique is proposed by integrating BSS with time-varying auto-regressive modeling to identify the instant and severity of damage. The proposed method is validated using a suite of numerical studies and experimental models, followed by a full-scale structure.
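The time-varying auto-regressive ingredient can be sketched on its own: fit a low-order AR model in sliding windows and watch the model's pole frequency shift at the (simulated) damage instant. All signal parameters here are invented:

```python
import numpy as np

fs = 200.0
t = np.arange(0, 10.0, 1 / fs)
f = np.where(t < 5.0, 20.0, 16.0)  # dominant frequency drops at the "damage" instant
rng = np.random.default_rng(1)
x = np.cos(2 * np.pi * np.cumsum(f) / fs) + 0.05 * rng.standard_normal(t.size)

def ar2_freq(seg, fs):
    # Least-squares AR(2) fit x[n] = a1*x[n-1] + a2*x[n-2]; the angle of the
    # fitted complex pole gives the dominant frequency of the window.
    A = np.column_stack([seg[1:-1], seg[:-2]])
    a1, a2 = np.linalg.lstsq(A, seg[2:], rcond=None)[0]
    pole = np.roots([1.0, -a1, -a2])[0]
    return abs(np.angle(pole)) * fs / (2 * np.pi)

win = 400  # 2 s windows
freqs = [ar2_freq(x[i:i + win], fs) for i in range(0, x.size - win + 1, win)]
print(np.round(freqs, 1))  # the tracked frequency drops across the damage instant
```

In the proposed method this kind of tracking is applied to the modal coordinates recovered by BSS rather than to the raw multichannel response.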
Edwards, Joel; Othman, Maazuza; Burn, Stewart; Crossin, Enda
2016-10-01
The collection of source-separated kerbside municipal food waste (SSFW) is being incentivised in Australia; however, such a collection is likely to increase the fuel and time a collection truck fleet requires, so waste managers need to determine whether the incentives outweigh the cost. With the literature scarcely describing the magnitude of the increase, and with local parameters playing a crucial role in accurately modelling kerbside collection, this paper develops a new general mathematical model that predicts the energy and time requirements of a collection regime whilst incorporating the unique variables of different jurisdictions. The model, Municipal Solid Waste Collect (MSW-Collect), is validated and shown to be more accurate at predicting fuel consumption and trucks required than other common collection models. When predicting changes incurred for five different SSFW collection scenarios, results show that SSFW scenarios require an increase in fuel ranging from 1.38% to 57.59%. There is also a need for additional trucks across most SSFW scenarios tested. All SSFW scenarios are ranked and analysed with regard to fuel consumption; sensitivity analysis is conducted to test key assumptions.
Sun, Y; Finlayson-Pitts, B J; Xin, J
2011-01-01
Differential optical absorption spectroscopy (DOAS) is a powerful tool for detecting and quantifying trace gases in atmospheric chemistry (Platt and Stutz, 2008). DOAS spectra consist of a linear combination of complex multi-peak multi-scale structures. Most DOAS analysis routines in use today are based on least squares techniques; for example, the approach developed in the 1970s uses polynomial fits to remove a slowly varying background, and known reference spectra to retrieve the identity and concentrations of reference gases. An open problem is to identify unknown gases in the fitting residuals for complex atmospheric mixtures. In this work, we develop a novel three-step semi-blind source separation method. The first step uses a multi-resolution analysis to remove the slowly varying and fast varying components in the DOAS spectral data matrix $X$. The second step decomposes the preprocessed data $\hat{X}$ from the first step into a linear combination of the reference spectra plus a remainder, or $\hat{X} = A\,S +...$
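The classical least-squares step that the paper builds on, a joint fit of a low-order polynomial background and known reference spectra, can be sketched as follows; the wavelength grid, reference shapes, and coefficients are invented:

```python
import numpy as np

rng = np.random.default_rng(2)
wav = np.linspace(300.0, 330.0, 200)   # wavelength grid in nm (illustrative)

# Invented reference absorption structures for two trace gases.
ref1 = np.exp(-0.5 * ((wav - 310.0) / 1.5) ** 2)
ref2 = np.sin(2 * np.pi * (wav - 300.0) / 4.0) ** 2

# Measured optical density: slow polynomial background + gas signatures + noise.
background = 1e-3 * (wav - 315.0) ** 2 + 0.05 * (wav - 300.0)
od = background + 0.8 * ref1 + 0.3 * ref2 + 0.001 * rng.standard_normal(wav.size)

# Joint least-squares fit of the polynomial and the reference spectra; the
# coefficients on ref1/ref2 play the role of the retrieved column densities.
design = np.column_stack([wav**0, wav, wav**2, ref1, ref2])
coef, *_ = np.linalg.lstsq(design, od, rcond=None)
print(np.round(coef[3:], 2))  # ≈ [0.8, 0.3]
```

The paper's contribution starts where this sketch stops: identifying unknown gases hiding in the residual of such a fit.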
Fehr, M
2014-09-01
Business opportunities in the household waste sector in emerging economies still revolve around bulk collection and tipping with an open material balance. This research, conducted in Brazil, pursued the objective of shifting opportunities from tipping to reverse logistics in order to close the balance. To do this, it illustrated how specific knowledge of sorted waste composition and reverse logistics operations can be used to determine realistic temporal and quantitative landfill diversion targets in an emerging economy context. Experimentation constructed and confirmed the recycling trilogy that consists of source separation, collection infrastructure and reverse logistics. The study on source separation demonstrated the vital difference between raw and sorted waste compositions. Raw waste contained 70% biodegradable and 30% inert matter. Source separation produced 47% biodegradable, 20% inert and 33% mixed material. The study on collection infrastructure developed the necessary receiving facilities. The study on reverse logistics identified private operators capable of collecting and processing all separated inert items. Recycling activities for biodegradable material were scarce and erratic: only farmers would take the material as animal feed, and no composting initiatives existed. The management challenge was identified as stimulating these activities in order to complete the trilogy and divert the 47% source-separated biodegradable discards from landfills.
Gelman, Andrew; Stern, Hal S; Dunson, David B; Vehtari, Aki; Rubin, Donald B
2013-01-01
FUNDAMENTALS OF BAYESIAN INFERENCE: Probability and Inference; Single-Parameter Models; Introduction to Multiparameter Models; Asymptotics and Connections to Non-Bayesian Approaches; Hierarchical Models. FUNDAMENTALS OF BAYESIAN DATA ANALYSIS: Model Checking; Evaluating, Comparing, and Expanding Models; Modeling Accounting for Data Collection; Decision Analysis. ADVANCED COMPUTATION: Introduction to Bayesian Computation; Basics of Markov Chain Simulation; Computationally Efficient Markov Chain Simulation; Modal and Distributional Approximations. REGRESSION MODELS: Introduction to Regression Models; Hierarchical Linear
Yuan, Ying; MacKinnon, David P.
2009-01-01
This article proposes Bayesian analysis of mediation effects. Compared to conventional frequentist mediation analysis, the Bayesian approach has several advantages. First, it allows researchers to incorporate prior information into the mediation analysis, thus potentially improving the efficiency of estimates. Second, under the Bayesian mediation analysis, inference is straightforward and exact, which makes it appealing for studies with small samples. Third, the Bayesian approach is conceptua...
Bayesian Games with Intentions
Bjorndahl, Adam; Halpern, Joseph Y.; Pass, Rafael
2016-01-01
We show that standard Bayesian games cannot represent the full spectrum of belief-dependent preferences. However, by introducing a fundamental distinction between intended and actual strategies, we remove this limitation. We define Bayesian games with intentions, generalizing both Bayesian games and psychological games, and prove that Nash equilibria in psychological games correspond to a special class of equilibria as defined in our setting.
Institute of Scientific and Technical Information of China (English)
AaziTakpaya; WeiGang
2003-01-01
Blind identification-blind equalization for Finite Impulse Response (FIR) Multiple Input-Multiple Output (MIMO) channels can be reformulated as the problem of blind source separation. It has been shown that the blind identification via decorrelating sub-channels method can recover the input sources. The Blind Identification via Decorrelating Sub-channels (BIDS) algorithm first constructs a set of decorrelators, which decorrelate the output signals of the sub-channels, then estimates the channel matrix using the transfer functions of the decorrelators, and finally recovers the input signal using the estimated channel matrix. In this paper, a new approximation of the input source for FIR-MIMO channels based on the maximum likelihood source separation method is proposed. The proposed method outperforms BIDS in the presence of additive white Gaussian noise.
Deep Transform: Cocktail Party Source Separation via Probabilistic Re-Synthesis
Simpson, Andrew J. R.
2015-01-01
In cocktail party listening scenarios, the human brain is able to separate competing speech signals. However, the signal processing implemented by the brain to perform cocktail party listening is not well understood. Here, we trained two separate convolutive autoencoder deep neural networks (DNN) to separate monaural and binaural mixtures of two concurrent speech streams. We then used these DNNs as convolutive deep transform (CDT) devices to perform probabilistic re-synthesis. The CDTs operat...
Wu, Weiliang; Lin, Tian Ran; Tan, Andy C. C.
2015-12-01
A signal processing technique is presented in this paper to normalize and separate the source of non-linear acoustic emission (AE) signals of a multi-cylinder diesel engine for condition monitoring applications and fault detection. The normalization technique presented in the paper overcomes the long-existing non-linearity problem of AE sensors so that responses measured by different AE sensors can be quantitatively analysed and compared. A source separation algorithm is also developed in the paper to separate the mixture of the normalized AE signals produced by a multi-cylinder diesel engine by utilising the system parameters (i.e., wave attenuation constant and the arrival time delay) of AE wave propagation determined by a standard pencil lead break test on the engine cylinder head. It is shown that the source separation algorithm is able to separate the signal interference of adjacent cylinders from the monitored cylinder once the wave attenuation constant and the arrival time delay along the propagation path are known. The algorithm is particularly useful in the application of AE technique for condition monitoring of small-size diesel engines where signal interference from the neighbouring cylinders is strong.
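Ignoring the arrival-time delays for brevity, the attenuation-based part of such a separation reduces to building the mixing matrix from the measured attenuation constant and inverting it; all numbers here are illustrative:

```python
import numpy as np

alpha = 2.0                   # assumed wave attenuation constant, 1/m
d = np.array([[0.05, 0.40],   # sensor-to-cylinder distances, m (illustrative)
              [0.40, 0.05]])
A = np.exp(-alpha * d)        # amplitude attenuation along each propagation path

rng = np.random.default_rng(3)
s = rng.standard_normal((2, 1000))  # stand-in per-cylinder AE sources
x = A @ s                           # mixed recordings at the two sensors

# With the mixing matrix known from the pencil-lead break calibration,
# separation is a direct inversion.
s_hat = np.linalg.solve(A, x)
print(np.max(np.abs(s_hat - s)))    # essentially zero (exact inversion)
```

The full algorithm in the paper additionally aligns the channels using the measured arrival-time delays before removing the cross-cylinder interference.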
DEFF Research Database (Denmark)
Hansen, Trine Lund; Svärd, Å; Angelidaki, Irini;
2003-01-01
A research project has investigated the biogas potential of pre-screened source-separated organic waste. Wastes from five Danish cities have been pre-treated by three methods: screw press; disc screen; and shredder and magnet. This paper outlines the sampling procedure used, the chemical composition of the wastes and the estimated methane potentials.
Zeeman, G.; Kujawa, K.; Mes, de T.Z.D.; Graaff, de M.S.; Abu-Ghunmi, L.N.A.H.; Mels, A.R.; Meulman, B.; Temmink, B.G.; Buisman, C.J.N.; Lier, van J.B.; Lettinga, G.
2008-01-01
Based on results of pilot scale research with source-separated black water (BW) and grey water (GW), a new sanitation concept is proposed. BW and GW are both treated in a UASB (-septic tank) for recovery of CH4 gas. Kitchen waste is added to the anaerobic BW treatment for doubling the biogas product
DEFF Research Database (Denmark)
Jensen, Finn Verner; Nielsen, Thomas Dyhre
2016-01-01
Mathematically, a Bayesian graphical model is a compact representation of the joint probability distribution for a set of variables. The most frequently used type of Bayesian graphical model is the Bayesian network. The structural part of a Bayesian graphical model is a graph consisting of nodes and edges. The nodes represent variables, which may be either discrete or continuous. An edge between two nodes A and B indicates a direct influence between the state of A and the state of B, which in some domains can also be interpreted as a causal relation. The wide-spread use of Bayesian networks is largely due to the availability of efficient inference algorithms for answering probabilistic queries about the states of the variables in the network. Furthermore, to support the construction of Bayesian network models, learning algorithms are also available. We give an overview of the Bayesian network...
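A probabilistic query on a toy Bayesian network can be answered by brute-force enumeration of the joint distribution (the classic rain/sprinkler/wet-grass example; the CPT values are illustrative):

```python
from itertools import product

# Tiny Bayesian network Rain -> WetGrass <- Sprinkler, with illustrative CPTs.
P_rain = {True: 0.2, False: 0.8}
P_sprinkler = {True: 0.1, False: 0.9}
P_wet = {  # P(WetGrass=True | Rain, Sprinkler)
    (True, True): 0.99, (True, False): 0.9,
    (False, True): 0.8, (False, False): 0.05,
}

def joint(r, s, w):
    # Joint probability factorizes along the graph structure.
    pw = P_wet[(r, s)]
    return P_rain[r] * P_sprinkler[s] * (pw if w else 1 - pw)

# Probabilistic query by enumeration: P(Rain=True | WetGrass=True).
num = sum(joint(True, s, True) for s in (True, False))
den = sum(joint(r, s, True) for r, s in product((True, False), repeat=2))
print(round(num / den, 3))  # → 0.645
```

Enumeration is exponential in the number of variables; the efficient inference algorithms mentioned above (e.g. junction-tree propagation) exploit the factorization to avoid this blow-up.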
Abban, B. K.; Papanicolaou, T.; Wilson, C. G.; Abaci, O.; Wacha, K.; Schnoebelen, D. J.; Rhoads, B. L.; Yu, M.
2015-12-01
We apply an enhanced revision of a Bayesian un-mixing framework for estimating sediment source contributions to an intensively managed watershed in the US Midwest that is characterized by spatiotemporal heterogeneity. The framework has two key parameters, namely a and b, that account for spatial origin attributes and the time history of sediments delivered to the watershed outlet, respectively. The probabilistic treatment of these parameters incorporates variability in source erosion processes, as well as the delivery times and storage of eroded material within the watershed. Uncertainty in source contribution estimates in our approach is quantified naturally through the use of Markov Chain Monte Carlo simulations for estimating the posterior probability density functions. The studied watershed is the 26 km² South Amana Sub-Watershed located within the Clear Creek Watershed (CCW), IA, which is part of the Critical Zone Observatory for Intensively Managed Landscapes (IML-CZO). Utilizing stable isotopes of C and N, the Bayesian framework predicted trends in mean source contributions and uncertainty that were in agreement with observations from other studies. Terrestrial sources were found to dominate sediment contributions in the earlier stages of the growing season, when land cover was sparse and the hydrologic forcing was large. Instream sources became more dominant during the latter stages, when land cover was well-established and more extensive. Also, the effects of spatial heterogeneity and of sediment travel time and delivery on uncertainty in source contribution estimates were adequately captured. The results clearly showed a reduction in uncertainty when watershed connectivity was greatest and considerable amounts of material were delivered to the collection point at the outlet. On-going application of the framework to the Upper Sangamon River Basin (USRB), which is also part of the IML-CZO and has distinct features from CCW, is expected to shed more light on the
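The MCMC machinery behind such un-mixing frameworks can be illustrated with a toy two-source tracer model: a single mixing fraction f with a uniform prior, Gaussian tracer measurements, and a Metropolis sampler for the posterior (the tracer signatures, noise level, and one-parameter model below are illustrative assumptions, not the study's actual setup):

```python
import numpy as np

rng = np.random.default_rng(0)

# hypothetical tracer signatures for two sources (illustrative values)
mu = np.array([-26.0, -14.0])     # e.g. terrestrial vs. instream end-members
sigma = 1.0                       # measurement noise
f_true = 0.7                      # true contribution of source 1
y = f_true * mu[0] + (1 - f_true) * mu[1] + rng.normal(0, sigma, size=50)

def log_post(f):
    # uniform prior on [0, 1]; Gaussian likelihood of the tracer data
    if not 0.0 <= f <= 1.0:
        return -np.inf
    pred = f * mu[0] + (1 - f) * mu[1]
    return -0.5 * np.sum((y - pred) ** 2) / sigma ** 2

# Metropolis random-walk sampler for the posterior of f
samples, f = [], 0.5
lp = log_post(f)
for _ in range(20000):
    prop = f + rng.normal(0, 0.05)
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:
        f, lp = prop, lp_prop
    samples.append(f)
post = np.array(samples[5000:])   # discard burn-in
```

The posterior mean recovers the true fraction, and the posterior spread quantifies the contribution uncertainty, as in the paper's framework.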
Directory of Open Access Journals (Sweden)
Saswati Swapna Dash
2014-07-01
This paper presents an overall study of feedback control of a Z-source converter fed separately excited DC motor with a centrifugal pump set. The Z-source converter can be used in both voltage buck and boost modes using an LC impedance network. In this paper the dynamic modeling of the Z-source converter with motor load and centrifugal pump set is carried out with new findings. The compensators for the speed feedback loop are designed using average state space analysis and a small signal model of the system. The feedback loop is designed by classical control methods. The experiment is done in the MATLAB environment and the results are verified by simulation.
Saswati Swapna Dash; Byamakesh Nayak; Subrat Kumar
2014-01-01
This paper presents an overall study of feedback control of a Z-source converter fed separately excited DC motor with a centrifugal pump set. The Z-source converter can be used in both voltage buck and boost modes using an LC impedance network. In this paper the dynamic modeling of the Z-source converter with motor load and centrifugal pump set is carried out with new findings. The compensators for the speed feedback loop are designed using average state space analysis and a small signal model of the system. The feedba...
Directory of Open Access Journals (Sweden)
He Peng Ju
2016-01-01
Detecting the fetal ECG (FECG) from the abdominal signal is a commonly used method, but the fetal ECG is contaminated by the maternal ECG (MECG). Current FECG extraction algorithms mainly target multi-channel signals; they often assume there is only one fetus and do not consider multiple births. This paper proposes a single-channel blind source separation (SCBSS) algorithm, based on source number estimation with multi-algorithm fusion, to process a single abdominal signal. The method decomposes the collected single-channel signal into multiple intrinsic mode functions (IMFs) using Empirical Mode Decomposition (EMD), mapping the single channel into multiple channels. Four multi-channel source number estimation (MCSNE) methods (Bootstrap, Hough, AIC and PCA) are fused with weights to estimate the source number accurately, and the particle swarm optimization (PSO) algorithm is employed to determine the weighting coefficients. According to the source number and the IMFs, a nonnegative matrix is constructed and nonnegative matrix factorization (NMF) is employed to separate the mixed signals. Experiments used a single-channel signal mixed from four synthetic signals and a single-channel ECG mixture of two signals to verify the proposed algorithm. Results showed that the proposed algorithm can determine the number of independent signals in a single acquired signal, that the FECG can be extracted from a single-channel observed signal, and that the algorithm can be used to separate the MECG and FECG.
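Of the source-number estimators fused in this paper, the PCA-based one is the simplest to sketch: count how many principal components of the multi-channel (here, IMF-expanded) data are needed to capture most of the energy. The 95% energy threshold and the toy mixing below are illustrative assumptions:

```python
import numpy as np

def estimate_num_sources(X, energy=0.95):
    # PCA-based source-number estimate: number of principal components
    # needed to capture `energy` of the total variance of channels X (rows)
    Xc = X - X.mean(axis=1, keepdims=True)
    vals = np.linalg.eigvalsh(np.cov(Xc))[::-1]   # descending eigenvalues
    frac = np.cumsum(vals) / vals.sum()
    return int(np.searchsorted(frac, energy) + 1)
```

In the paper this estimate is one of four that are weight-fused (with PSO-tuned weights) before the NMF separation step.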
Yang, Yang; Li, Xiukun
2016-06-01
Separation of the components of rigid acoustic scattering by underwater objects is essential in obtaining the structural characteristics of such objects. To overcome the problem of rigid structures appearing to have the same spectral structure in the time domain, time-frequency Blind Source Separation (BSS) can be used in combination with image morphology to separate the rigid scattering components of different objects. Based on a highlight model, the separation of the rigid scattering structure of objects with a time-frequency distribution is deduced. Using a morphological filter, the different characteristics observed in a Wigner-Ville Distribution (WVD) for auto terms and cross terms can be used to remove cross-term interference. By selecting the time and frequency points of the auto-term signal, the accuracy of BSS can be improved. An experimental simulation has been used, with changes in the pulse width of the transmitted signal, the relative amplitude and the time delay parameter, to analyze the feasibility of this new method. Simulation results show that the new method is not only able to separate rigid scattering components, but can also separate the components when elastic scattering and rigid scattering exist at the same time. Experimental results confirm that the new method can be used in separating the rigid scattering structure of underwater objects.
Single channel source separation of radar fuze mixed signal based on phase difference analysis
Institute of Scientific and Technical Information of China (English)
Hang ZHU; Shu-ning ZHANG; Hui-chang ZHAO
2014-01-01
A new method based on phase difference analysis is proposed for the separation of single-channel mixed radar fuze signals. The method estimates the mixing coefficients of the de-noised signals through the cumulants of the mixed signals, solves for the candidate data set using the mixing coefficients and the analytical form of the signals, and resolves the vector ambiguity problem by analyzing the phase differences. The signal separation is realized by exchanging data between the solutions. The waveform similarity coefficients are calculated, and the time-frequency distributions of the separated signals are analyzed. The results show that the proposed method is effective.
Anaerobic treatment in decentralised and source-separation-based sanitation concepts
Kujawa-Roeleveld, K.; Zeeman, G.
2006-01-01
Anaerobic digestion of wastewater should be a core technology employed in decentralised sanitation systems, especially when their objective is also resource conservation and reuse. The most efficient system involves separate collection and anaerobic digestion of the most concentrated domestic wastewater
Nonnegative Matrix Factor 2-D Deconvolution for Blind Single Channel Source Separation
DEFF Research Database (Denmark)
Schmidt, Mikkel N.; Mørup, Morten
2006-01-01
We present a novel method for blind separation of instruments in polyphonic music based on a non-negative matrix factor 2-D deconvolution algorithm. Using a model which is convolutive in both time and frequency we factorize a spectrogram representation of music into components corresponding to...... individual instruments. Based on this factorization we separate the instruments using spectrogram masking. The proposed algorithm has applications in computational auditory scene analysis, music information retrieval, and automatic music transcription....
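The non-negative factorization at the heart of this method can be sketched with plain multiplicative updates (Lee-Seung rules for Euclidean NMF; the paper's 2-D convolutive extension and spectrogram masking are omitted here, so this is a simplified stand-in, not the authors' algorithm):

```python
import numpy as np

def nmf(V, r, iters=500, seed=0):
    # multiplicative-update NMF: V ~= W @ H with all factors nonnegative
    rng = np.random.default_rng(seed)
    n, m = V.shape
    W = rng.random((n, r)) + 1e-3
    H = rng.random((r, m)) + 1e-3
    for _ in range(iters):
        # updates never turn a nonnegative entry negative
        H *= (W.T @ V) / (W.T @ W @ H + 1e-9)
        W *= (V @ H.T) / (W @ H @ H.T + 1e-9)
    return W, H
```

Applied to a magnitude spectrogram, each column of `W` is a spectral template and each row of `H` its activation in time; instrument separation then masks the spectrogram by each component's share of the reconstruction.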
DEFF Research Database (Denmark)
Fernandez Grande, Efren; Jacobsen, Finn
2010-01-01
A method of estimating the sound field radiated by a source under non-anechoic conditions has been examined. The method uses near field acoustic holography based on a combination of pressure and particle velocity measurements in a plane near the source for separating outgoing and ingoing wave...... components. The outgoing part of the sound field is composed of both radiated and scattered waves. The method compensates for the scattered components of the outgoing field on the basis of the boundary condition of the problem, exploiting the fact that the sound field is reconstructed very close...... to the source. Thus the radiated free-field component is estimated simultaneously with solving the inverse problem of reconstructing the sound field near the source. The method is particularly suited to cases in which the overall contribution of reflected sound in the measurement plane is significant....
DEFF Research Database (Denmark)
Naroznova, Irina; Møller, Jacob; Larsen, Bjarne;
2016-01-01
A new technology for pre-treating source-separated organic household waste prior to anaerobic digestion was assessed, and its performance was compared to existing alternative pre-treatment technologies. This pre-treatment technology is based on waste pulping with water, using a specially developed...... screw mechanism. The pre-treatment technology rejects more than 95% (wet weight) of non-biodegradable impurities in waste collected from households and generates biopulp ready for anaerobic digestion. Overall, 84-99% of biodegradable material (on a dry weight basis) in the waste was recovered......) to the produced biomass. The data generated in this study could be used for the environmental assessment of the technology and thus help in selecting the best pre-treatment technology for source separated organic household waste....
Understanding Computational Bayesian Statistics
Bolstad, William M
2011-01-01
A hands-on introduction to computational statistics from a Bayesian point of view Providing a solid grounding in statistics while uniquely covering the topics from a Bayesian perspective, Understanding Computational Bayesian Statistics successfully guides readers through this new, cutting-edge approach. With its hands-on treatment of the topic, the book shows how samples can be drawn from the posterior distribution when the formula giving its shape is all that is known, and how Bayesian inferences can be based on these samples from the posterior. These ideas are illustrated on common statistic
Bayesian statistics an introduction
Lee, Peter M
2012-01-01
Bayesian Statistics is the school of thought that combines prior beliefs with the likelihood of a hypothesis to arrive at posterior beliefs. The first edition of Peter Lee’s book appeared in 1989, but the subject has moved ever onwards, with increasing emphasis on Monte Carlo based techniques. This new fourth edition looks at recent techniques such as variational methods, Bayesian importance sampling, approximate Bayesian computation and Reversible Jump Markov Chain Monte Carlo (RJMCMC), providing a concise account of the way in which the Bayesian approach to statistics develops as wel
Prospects of Source-Separation-Based Sanitation Concepts: A Model-Based Study
Tervahauta, T.H.; Trang Hoang,; Hernández, L.; Zeeman, G.; Buisman, C.J.N.
2013-01-01
Separation of different domestic wastewater streams and targeted on-site treatment for resource recovery has been recognized as one of the most promising sanitation concepts to re-establish the balance in carbon, nutrient and water cycles. In this study a model was developed based on literature data
Bai, Mingsian R; Yao, Yueh Hua; Lai, Chang-Sheng; Lo, Yi-Yang
2016-03-01
In this paper, four delay-and-sum (DAS) beamformers formulated in the modal domain and the space domain for open and solid spherical apertures are examined through numerical simulations. The resulting beampatterns reveal that the mainlobe of the solid spherical DAS array is only slightly narrower than that of the open array, whereas the sidelobes of the modal domain array are more significant than those of the space domain array due to the discrete approximation of continuous spherical Fourier transformation. To verify the theory experimentally, a three-dimensionally printed spherical array on which 32 micro-electro-mechanical system microphones are mounted is utilized for localization and separation of sound sources. To overcome the basis mismatch problem in signal separation, source localization is first carried out using minimum variance distortionless response beamformer. Next, Tikhonov regularization (TIKR) and compressive sensing (CS) are employed to extract the source signal amplitudes. Simulations and experiments are conducted to validate the proposed spherical array system. Objective perceptual evaluation of speech quality test and a subjective listening test are undertaken in performance evaluation. The experimental results demonstrate better separation quality achieved by the CS approach than by the TIKR approach at the cost of computational complexity.
Tactile length contraction as Bayesian inference.
Tong, Jonathan; Ngo, Vy; Goldreich, Daniel
2016-08-01
To perceive, the brain must interpret stimulus-evoked neural activity. This is challenging: The stochastic nature of the neural response renders its interpretation inherently uncertain. Perception would be optimized if the brain used Bayesian inference to interpret inputs in light of expectations derived from experience. Bayesian inference would improve perception on average but cause illusions when stimuli violate expectation. Intriguingly, tactile, auditory, and visual perception are all prone to length contraction illusions, characterized by the dramatic underestimation of the distance between punctate stimuli delivered in rapid succession; the origin of these illusions has been mysterious. We previously proposed that length contraction illusions occur because the brain interprets punctate stimulus sequences using Bayesian inference with a low-velocity expectation. A novel prediction of our Bayesian observer model is that length contraction should intensify if stimuli are made more difficult to localize. Here we report a tactile psychophysical study that tested this prediction. Twenty humans compared two distances on the forearm: a fixed reference distance defined by two taps with 1-s temporal separation and an adjustable comparison distance defined by two taps with temporal separation t ≤ 1 s. We observed significant length contraction: As t was decreased, participants perceived the two distances as equal only when the comparison distance was made progressively greater than the reference distance. Furthermore, the use of weaker taps significantly enhanced participants' length contraction. These findings confirm the model's predictions, supporting the view that the spatiotemporal percept is a best estimate resulting from a Bayesian inference process. PMID:27121574
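The model's central prediction can be reproduced with a one-line Gaussian posterior: under a zero-mean low-velocity prior, the posterior mean of the separation shrinks more as the inter-tap time t falls or as localization noise grows (a sketch of the idea only; the observer model in the paper is more detailed, and the parameter values below are illustrative):

```python
def perceived_length(l_measured, t, sigma_s, sigma_v):
    # Posterior mean of the true separation L, assuming a prior
    # v ~ N(0, sigma_v^2) on velocity (so L = v*t has variance (sigma_v*t)^2)
    # and a measured separation L_m = L + noise with variance 2*sigma_s^2
    # (one localization error per tap).
    prior_var = (sigma_v * t) ** 2
    noise_var = 2.0 * sigma_s ** 2
    return l_measured * prior_var / (prior_var + noise_var)
```

Shorter temporal separation or weaker (harder to localize) taps both increase the shrinkage, matching the psychophysical findings above.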
International Nuclear Information System (INIS)
An alkali-metal ion source working without a store of alkali metals is described. The alkali-metal ions are produced by evaporation of alkali salts and ionization in a low-voltage arc discharge stabilized with a noble-gas plasma or, in the case of small alkali-metal ion currents, by the well-known thermal ionization at a hot tungsten wire. The source is very simple in construction and produces a stable ion current of 0.3 μA for more than 100 h. It is possible to change the ion species in a short time. This source is applicable to all SIMS equipment using mass separation for primary ions. (author)
Frühwirth-Schnatter, Sylvia
1990-01-01
In the paper at hand we apply fuzzy set theory to Bayesian statistics to obtain "Fuzzy Bayesian Inference". In the subsequent sections we will discuss a fuzzy-valued likelihood function, Bayes' theorem for both fuzzy data and fuzzy priors, a fuzzy Bayes' estimator, fuzzy predictive densities and distributions, and fuzzy H.P.D.-regions. (author's abstract)
Yuan, Ying; MacKinnon, David P.
2009-01-01
In this article, we propose Bayesian analysis of mediation effects. Compared with conventional frequentist mediation analysis, the Bayesian approach has several advantages. First, it allows researchers to incorporate prior information into the mediation analysis, thus potentially improving the efficiency of estimates. Second, under the Bayesian…
Time-domain beamforming and blind source separation speech input in the car environment
Bourgeois, Julien
2009-01-01
The development of computer and telecommunication technologies led to a revolution in the way that people work and communicate with each other. One of the results is that large amounts of information will increasingly be held in a form that is natural for users, as speech in natural language. In the presented work, we investigate the speech signal capture problem, which includes the separation of multiple interfering speakers using microphone arrays. Adaptive beamforming is a classical approach which has been developed since the seventies. However, it requires a double-talk detector (DTD) that interrupts th
von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo
2014-06-01
Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramer-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.
Directory of Open Access Journals (Sweden)
Kellermann Walter
2007-01-01
We address the problem of underdetermined blind source separation (BSS). While most previous approaches are designed for instantaneous mixtures, we propose a time-frequency-domain algorithm for convolutive mixtures. We adopt a two-step method based on a general maximum a posteriori (MAP) approach. In the first step, we estimate the mixing matrix based on hierarchical clustering, assuming that the source signals are sufficiently sparse. The algorithm works directly on the complex-valued data in the time-frequency domain and shows better convergence than algorithms based on self-organizing maps. The assumption of Laplacian priors for the source signals in the second step leads to an algorithm for estimating the source signals. It involves the ℓ1-norm minimization of complex numbers because of the use of the time-frequency-domain approach. We compare a combinatorial approach initially designed for real numbers with a second-order cone programming (SOCP) approach designed for complex numbers. We found that although the former approach is not theoretically justified for complex numbers, its results are comparable to, or even better than, the SOCP solution. The advantage is a lower computational cost for problems with low input/output dimensions.
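The combinatorial approach mentioned can be sketched for the real-valued case: with m mixtures, a minimum-ℓ1 solution of x = As lies at a basic solution with at most m active sources, so one can enumerate column subsets and keep the cheapest exact fit (whether this carries over to complex numbers is precisely what the paper examines; the 2-mixture, 3-source dimensions below are illustrative):

```python
import numpy as np
from itertools import combinations

def l1_sparse_solve(A, x):
    # combinatorial l1 minimizer for underdetermined x = A s:
    # try every size-m subset of columns, solve exactly, keep the
    # solution with the smallest l1 norm
    m, n = A.shape
    best, best_cost = None, np.inf
    for idx in combinations(range(n), m):
        sub = A[:, idx]
        if abs(np.linalg.det(sub)) < 1e-12:
            continue                      # skip singular column subsets
        s_active = np.linalg.solve(sub, x)
        cost = np.abs(s_active).sum()
        if cost < best_cost:
            best_cost = cost
            best = np.zeros(n, dtype=complex)
            best[list(idx)] = s_active
    return best
```

In the paper's setting this is applied per time-frequency point, with the mixing matrix estimated beforehand by hierarchical clustering.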
Degani, D.
1983-01-01
A numerical study of the conjugated problem of a separated supersonic flow field and a conductive solid wall with an embedded heat source is presented. Implicit finite-difference schemes were used to solve the two-dimensional time-dependent compressible Navier-Stokes equations and the time-dependent heat-conduction equation for the solid in both general coordinate systems. A detailed comparison between the thin-layer and Navier-Stokes models was made for steady and unsteady supersonic flow and showed insignificant differences. Steady-state and transient cases were computed and the results show that a temperature pulse at the solid-fluid interface can be used to detect the flow direction near the wall in the vicinity of separation without significant distortion of the flow field.
Directory of Open Access Journals (Sweden)
Yen-Chun Chou
2010-01-01
Perfusion magnetic resonance brain imaging induces temporal signal changes in brain tissues, manifesting distinct blood-supply patterns for the profound analysis of cerebral hemodynamics. We employed independent factor analysis to blindly separate such dynamic images into different maps, that is, artery, gray matter, white matter, vein and sinus, and choroid plexus, in conjunction with corresponding signal-time curves. The averaged signal-time curve on the segmented arterial area was further used to calculate the relative cerebral blood volume (rCBV), relative cerebral blood flow (rCBF), and mean transit time (MTT). The averaged ratios for rCBV, rCBF, and MTT between gray and white matter for normal subjects were congruent with those in the literature.
Institute of Scientific and Technical Information of China (English)
苏宏升
2008-01-01
To give the conventional Bayesian optimal classifier the ability to handle fuzzy information and to automate the reasoning process, a new Bayesian optimal classifier is proposed with fuzzy information embedded. It can not only handle fuzzy information effectively, but also retains the learning properties of the Bayesian optimal classifier. In addition, following the evolution of fuzzy set theory, vague sets are also embedded into it to generate a vague Bayesian optimal classifier, which can simultaneously simulate the twofold characteristics of fuzzy information from the positive and reverse directions. Further, a set pair Bayesian optimal classifier is also proposed, considering the threefold characteristics of fuzzy information from the positive, reverse, and indeterminate sides. In the end, a knowledge-based artificial neural network (KBANN) is presented to realize automatic reasoning of the Bayesian optimal classifier. It not only reduces the computational cost of the Bayesian optimal classifier but also improves its classification learning quality.
Zeeman, Grietje; Kujawa, Katarzyna; de Mes, Titia; Hernandez, Lucia; de Graaff, Marthe; Abu-Ghunmi, Lina; Mels, Adriaan; Meulman, Brendo; Temmink, Hardy; Buisman, Cees; van Lier, Jules; Lettinga, Gatze
2008-01-01
Based on results of pilot scale research with source-separated black water (BW) and grey water (GW), a new sanitation concept is proposed. BW and GW are both treated in a UASB (-septic tank) for recovery of CH4 gas. Kitchen waste is added to the anaerobic BW treatment for doubling the biogas production. Post-treatment of the effluent is providing recovery of phosphorus and removal of remaining COD and nitrogen. The total energy saving of the new sanitation concept amounts to 200 MJ/year in comparison with conventional sanitation, moreover 0.14 kg P/p/year and 90 litres of potential reusable water are produced.
Noise-Source Separation Using Internal and Far-Field Sensors for a Full-Scale Turbofan Engine
Hultgren, Lennart S.; Miles, Jeffrey H.
2009-01-01
Noise-source separation techniques for the extraction of the sub-dominant combustion noise from the total noise signatures obtained in static-engine tests are described. Three methods are applied to data from a static, full-scale engine test. Both 1/3-octave and narrow-band results are discussed. The results are used to assess the combustion-noise prediction capability of the Aircraft Noise Prediction Program (ANOPP). A new additional phase-angle-based discriminator for the three-signal method is also introduced.
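The three-signal method referenced here estimates the coherent power of a source common to three sensors from cross-spectra alone, so that uncorrelated sensor noise drops out on average. A numpy sketch under simplifying assumptions (white common source, uncorrelated noises, non-overlapping rectangular segments, no windowing; not NASA's implementation):

```python
import numpy as np

def cross_spectrum(x, y, nseg=64, nfft=256):
    # averaged cross-spectrum over non-overlapping segments
    acc = np.zeros(nfft, dtype=complex)
    for k in range(nseg):
        xs = np.fft.fft(x[k * nfft:(k + 1) * nfft])
        ys = np.fft.fft(y[k * nfft:(k + 1) * nfft])
        acc += xs * np.conj(ys)
    return acc / nseg

def three_signal_power(x1, x2, x3):
    # coherent auto-spectrum of the common source as seen at sensor 3:
    # |S13| |S23| / |S12|; the uncorrelated noise terms average out
    s13 = cross_spectrum(x1, x3)
    s23 = cross_spectrum(x2, x3)
    s12 = cross_spectrum(x1, x2)
    return np.abs(s13) * np.abs(s23) / (np.abs(s12) + 1e-12)
```

For x_i = a_i s + n_i the estimator converges to |a3|² times the source power spectrum, which is how sub-dominant combustion noise can be extracted from total far-field signatures.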
Energy Technology Data Exchange (ETDEWEB)
Bruegger, M.; Hildebrand, N.; Karlewski, T.; Trautmann, N. (Mainz Univ. (Germany, F.R.). Inst. fuer Kernchemie); Mazumdar, A.K. (Marburg Univ. (Germany, F.R.). Fachbereich Physik); Herrmann, G. (Mainz Univ. (Germany, F.R.). Inst. fuer Kernchemie; Gesellschaft fuer Schwerionenforschung m.b.H., Darmstadt (Germany, F.R.))
1985-02-01
The performance of a high-temperature ion source coupled to a helium gas-jet transport system for efficient mass separation of neutron-rich alkaline earth and lanthanide isotopes is reported, and the results of overall efficiency measurements using different cluster materials in the gas-jet are given. A fast, microprocessor-controlled tape transport system for γ-spectroscopic studies of short-lived isotopes is described. Some results on the decay of 3.8-s ¹⁵²Pr are presented.
Hydrocyclonic separation of invasive New Zealand mudsnails from an aquaculture water source
Nielson, R. Jordan; Moffitt, Christine M.; Watten, Barnaby J.
2012-01-01
Invasive New Zealand mudsnails (Potamopyrgus antipodarum, NZMS) have infested freshwater aquaculture facilities in the western United States and disrupted stocking or fish transportation activities because of the risk of transporting NZMS to naive locations. We tested the efficacy of a gravity-fed, hydrocyclonic separation system to remove NZMS from an aquaculture water source at two design flows: 367 L/min and 257 L/min. The hydrocyclone effectively filtered all sizes of snails (including newly emerged neonates) from inflows. We modeled cumulative recovery of three sizes of snails, and determined that both juvenile- and adult-sized snails were transported similarly through the filtration system, but the transit of neonates was faster and similar to the transport of water particles. We found that transit times through the filtration system differed between the two flows regardless of snail size, and the hydrocyclone filter operated more as a plug flow system with dispersion, especially when transporting and removing the larger adult- and juvenile-sized snails. Our study supports hydrocyclonic filtration as an important tool to provide snail-free water for aquaculture operations that require uninfested water sources.
Directory of Open Access Journals (Sweden)
Goto Masataka
2010-01-01
We describe a novel query-by-example (QBE) approach in music information retrieval that allows a user to customize query examples by directly modifying the volume of different instrument parts. The underlying hypothesis of this approach is that the musical mood of retrieved results changes in relation to the volume balance of different instruments. On the basis of this hypothesis, we aim to clarify the relationship between the change in the volume balance of a query and the genre of the retrieved pieces, called genre classification shift. Such an understanding would allow us to instruct users in how to generate alternative queries without finding other appropriate pieces. Our QBE system first separates all instrument parts from the audio signal of a piece with the help of its musical score, and then it allows users to remix these parts to change the acoustic features that represent the musical mood of the piece. Experimental results showed that the genre classification shift was actually caused by the volume change in the vocal, guitar, and drum parts.
Impact of organic polyelectrolytes on coagulation of source-separated black water.
Kozminykh, Pavlo; Heistad, Arve; Ratnaweera, Harsha C; Todt, Daniel
2016-07-01
Household wastewater originates from everyday domestic activities and has a potentially harmful impact on the environment if discharged directly without proper treatment. Toilet wastewater, or black water (BW), contains urine, faeces, toilet paper and flushing water, and it accounts for the majority of pollutants from a single household. In this study, the focus was on BW treatment using chemical methods. The main goal of the current research was to define the possibility and applicability of conventional coagulants and flocculants in the direct chemical treatment of vacuum-collected BW to remove particles, organic matter and phosphorus. After defining dosing ranges based on equivalent doses in conventional municipal and industrial wastewater treatment, aluminium and iron coagulants, organic polyelectrolytes (polymers with anionic, neutral and cationic charge and different molecular weights) and their various combinations were tested using the well-known jar-test laboratory method to study aggregation and solid-liquid separation processes in raw BW. The most important process parameter during coagulation was the pH level, dependent on the type and doses of the metal salts. Some side processes were found to occur when using iron-based coagulants. Dosing of either single coagulants or single polymers did not give satisfactory results, while a combination of aluminium salts and cationic polymers showed high removal rates for total suspended solids, total chemical oxygen demand and ortho-phosphates, reaching 97.8%, 92% and 98.6%, respectively, at the optimal doses of chemicals. Cationic polymers with the lowest molecular weight and highest charge density were the most efficient in combination with aluminium coagulants. PMID:26672384
Korucu, M Kemal; Kaplan, Özgür; Büyük, Osman; Güllü, M Kemal
2016-10-01
In this study, we investigate the usability of sound recognition for source separation of packaging wastes in reverse vending machines (RVMs). For this purpose, an experimental setup equipped with a sound recording mechanism was prepared. Packaging waste sounds generated by three physical impact types, namely free falling, pneumatic hitting and hydraulic crushing, were recorded separately using two different microphones. To classify waste types and sizes based on the sound features of the wastes, sound classification systems based on a support vector machine (SVM) and a hidden Markov model (HMM) were developed. In the basic experimental setup, in which only the free-falling impact type was considered, the SVM and HMM systems provided 100% classification accuracy for both microphones. In the expanded experimental setup, which included all three impact types, material type classification accuracies were 96.5% for the dynamic microphone and 97.7% for the condenser microphone. When both the material type and the size of the wastes were classified, the accuracy was 88.6% for both microphones. The modeling studies indicated that hydraulic crushing recordings were too noisy for effective sound recognition. A detailed analysis of the recognition errors showed that most errors occurred with the hitting impact type. According to the experimental results, the proposed approach to the separation of packaging wastes could provide high classification performance for RVMs.
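The classification pipeline the abstract describes (impact-sound features in, material label out) can be sketched with a deliberately simplified stand-in for the SVM/HMM classifiers: a nearest-centroid rule over two-dimensional feature vectors. The features, values and labels below are invented for illustration only; they are not the paper's data.

```python
# Minimal sketch of feature-based material classification; the paper uses
# SVM/HMM classifiers, this nearest-centroid rule is a simplified stand-in.
# Features are hypothetical (e.g. energy and spectral centroid of an impact).

def centroid(vectors):
    """Mean of a list of equal-length feature vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def nearest_centroid_fit(labelled):
    """labelled: dict mapping label -> list of feature vectors."""
    return {lab: centroid(vecs) for lab, vecs in labelled.items()}

def nearest_centroid_predict(model, x):
    def dist2(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(model, key=lambda lab: dist2(model[lab], x))

# Toy training data: two-dimensional features per impact recording.
train = {
    "glass":   [[0.9, 0.8], [1.0, 0.7], [0.8, 0.9]],
    "plastic": [[0.2, 0.3], [0.1, 0.2], [0.3, 0.1]],
}
model = nearest_centroid_fit(train)
print(nearest_centroid_predict(model, [0.85, 0.75]))  # -> glass
```

A real RVM classifier would replace the toy features with short-time spectral descriptors of the recorded impact and the centroid rule with the trained SVM or HMM.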
Hirayama, Y.; Watanabe, Y. X.; Imai, N.; Ishiyama, H.; Jeong, S. C.; Jung, H. S.; Miyatake, H.; Oyaizu, M.; Kimura, S.; Mukai, M.; Kim, Y. H.; Sonoda, T.; Wada, M.; Huyse, M.; Kudryavtsev, Yu.; Van Duppen, P.
2016-06-01
The KEK Isotope Separation System (KISS) has been developed at RIKEN to produce neutron-rich isotopes with N = 126 and study their β-decay properties for application to astrophysics. KISS is an element-selective mass-separation system consisting of an argon-gas-cell-based laser ion source for atomic-number selection and an ISOL mass-separation system. The argon gas cell of KISS is a key component: it stops and collects the unstable nuclei produced in multi-nucleon transfer reactions, and the isotopes of interest are then selectively ionized by laser resonance ionization. We have performed off- and on-line experiments to study the basic properties of the gas cell and of KISS as a whole. During the commissioning on-line experiments we successfully extracted laser-ionized stable 56Fe atoms (from direct implantation of a 56Fe beam into the gas cell) and 198Pt atoms (emitted from a 198Pt target by elastic scattering with a 136Xe beam). We furthermore extracted laser-ionized unstable 199Pt atoms and confirmed that the measured half-life agreed well with the reported value.
Kilohertz Quasi-Periodic Oscillation Peak Separation is not Constant in the Atoll Source 4U 1608-52
Méndez, M; Wijnands, R; Ford, E C; Van Paradijs, J; Vaughan, B A; Méndez, Mariano; Van der Klis, Michiel; Wijnands, Rudy; Ford, Eric C.; Van Paradijs, Jan; Vaughan, Brian A.
1998-01-01
We present new Rossi X-ray Timing Explorer observations of the low-mass X-ray binary 4U 1608-52 during the decay of its 1998 outburst. We detect by a direct FFT method the existence of a second kilohertz quasi-periodic oscillation (kHz QPO) in its power density spectrum, previously only seen by means of the sensitivity-enhancing `shift and add' technique. This result confirms that 4U 1608-52 is a twin kHz QPO source. The frequency separation between these two QPO decreased significantly, from 325.5 +/- 3.4 Hz to 225.3 +/- 12.0 Hz, as the frequency of the lower kHz QPO increased from 470 Hz to 865 Hz, in contradiction with a simple beat-frequency interpretation. This change in the peak separation of the kHz QPOs is closely similar to that previously seen in Sco X-1, but takes place at a ten times lower average luminosity. We discuss this result within the framework of models that have been proposed for kHz QPO. Beat frequency models where the peak separation is identified with the neutron star spin rate, as we...
Rousta, Kamran; Bolton, Kim; Lundin, Magnus; Dahlén, Lisa
2015-06-01
The present study measures the participation of households in a source separation scheme and, in particular, whether the households' application of the scheme improved after two interventions: (a) a shorter distance to the drop-off point and (b) easy access to correct sorting information. The effect of these interventions was quantified and, as far as possible, isolated from other factors that can influence recycling behaviour. The study was based on households located in an urban residential area in Sweden, where waste composition studies were performed before and after the interventions by manual sorting (pick analysis). Statistical analyses indicated a significant decrease (28%) of packaging and newsprint in the residual waste after establishing a property-close collection system (intervention (a)), as well as a significant decrease (70%) of the mis-sorted fraction in bags intended for food waste after new information stickers were introduced (intervention (b)). Providing a property-close collection system that collects more waste fractions, and finding new communication channels for sorting information, can be used as tools to increase the source separation ratio. This contribution also highlights the need to evaluate the effects of different types of information and communication concerning sorting instructions in a property-close collection system.
Naroznova, Irina; Møller, Jacob; Larsen, Bjarne; Scheutz, Charlotte
2016-04-01
A new technology for pre-treating source-separated organic household waste prior to anaerobic digestion was assessed, and its performance was compared to existing alternative pre-treatment technologies. This pre-treatment technology is based on waste pulping with water, using a specially developed screw mechanism. The pre-treatment technology rejects more than 95% (wet weight) of non-biodegradable impurities in waste collected from households and generates biopulp ready for anaerobic digestion. Overall, 84-99% of biodegradable material (on a dry weight basis) in the waste was recovered in the biopulp. The biochemical methane potential for the biopulp was 469±7mL CH4/g ash-free mass. Moreover, all Danish and European Union requirements regarding the content of hazardous substances in biomass intended for land application were fulfilled. Compared to other pre-treatment alternatives, the screw-pulping technology showed higher biodegradable material recovery, lower electricity consumption and comparable water consumption. The higher material recovery achieved with the technology was associated with greater transfer of nutrients (N and P), carbon (total and biogenic) but also heavy metals (except Pb) to the produced biomass. The data generated in this study could be used for the environmental assessment of the technology and thus help in selecting the best pre-treatment technology for source separated organic household waste.
Noncausal Bayesian Vector Autoregression
DEFF Research Database (Denmark)
Lanne, Markku; Luoto, Jani
We propose a Bayesian inferential procedure for the noncausal vector autoregressive (VAR) model that is capable of capturing nonlinearities and incorporating effects of missing variables. In particular, we devise a fast and reliable posterior simulator that yields the predictive distribution...
Bayesian Lensing Shear Measurement
Bernstein, Gary M
2013-01-01
We derive an estimator of weak gravitational lensing shear from background galaxy images that avoids noise-induced biases through a rigorous Bayesian treatment of the measurement. The Bayesian formalism requires a prior describing the (noiseless) distribution of the target galaxy population over some parameter space; this prior can be constructed from low-noise images of a subsample of the target population, attainable from long integrations of a fraction of the survey field. We find two ways to combine this exact treatment of noise with rigorous treatment of the effects of the instrumental point-spread function and sampling. The Bayesian model fitting (BMF) method assigns a likelihood of the pixel data to galaxy models (e.g. Sersic ellipses), and requires the unlensed distribution of galaxies over the model parameters as a prior. The Bayesian Fourier domain (BFD) method compresses galaxies to a small set of weighted moments calculated after PSF correction in Fourier space. It requires the unlensed distributi...
Malicious Bayesian Congestion Games
Gairing, Martin
2008-01-01
In this paper, we introduce malicious Bayesian congestion games as an extension to congestion games where players might act in a malicious way. In such a game each player has two types. Either the player is a rational player seeking to minimize her own delay, or - with a certain probability - the player is malicious, in which case her only goal is to disturb the other players as much as possible. We show that such games do in general not possess a Bayesian Nash equilibrium in pure strategies (i.e. a pure Bayesian Nash equilibrium). Moreover, given a game, we show that it is NP-complete to decide whether it admits a pure Bayesian Nash equilibrium. This result even holds when resource latency functions are linear, each player is malicious with the same probability, and all strategy sets consist of singleton sets. For a slightly more restricted class of malicious Bayesian congestion games, we provide easily checkable properties that are necessary and sufficient for the existence of a pure Bayesian Nash equilibrium....
A Fast Algorithm for Blind Sparse Source Separation
Institute of Scientific and Technical Information of China (English)
董天宝; 杨景曙
2012-01-01
In this paper, a fast algorithm for blind sparse source separation is proposed. Sources are estimated by minimizing the l0-norm, which is approximated by a predefined continuous and differentiable function. The proposed algorithm is easy to implement and runs fast. The algorithm is compared, on synthetic data, with several fast sparse reconstruction algorithms, such as a fast l1-norm minimization algorithm and OMP. Finally, we apply the proposed algorithm to underdetermined blind source separation on real-world data. Experiments show that the proposed algorithm runs faster than the other algorithms while achieving almost the same (or better) separation quality.
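The abstract does not give the algorithm itself; the sketch below follows the well-known smoothed-l0 (SL0) scheme that matches its description: approximate the l0-norm by a sum of Gaussians of width sigma, anneal sigma downward, and project each gradient step back onto the feasible set. All parameter values are conventional defaults, not the authors', and the test problem is synthetic.

```python
import numpy as np

def sl0(A, x, sigma_min=1e-4, sigma_decay=0.5, mu=2.0, inner=3):
    """Smoothed-l0 sparse recovery: approximate the l0-norm by Gaussians of
    width sigma, anneal sigma downward, and re-project every gradient step
    onto the feasible set {s : A s = x}."""
    A_pinv = np.linalg.pinv(A)
    s = A_pinv @ x                        # minimum-l2 feasible starting point
    sigma = 2.0 * np.max(np.abs(s))
    while sigma > sigma_min:
        for _ in range(inner):
            s = s - mu * s * np.exp(-s**2 / (2 * sigma**2))  # sparsity step
            s = s - A_pinv @ (A @ s - x)                     # feasibility step
        sigma *= sigma_decay
    return s

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 50))         # underdetermined mixing (20 x 50)
s_true = np.zeros(50)
s_true[[3, 17, 41]] = [1.5, -2.0, 1.0]    # 3-sparse source
x = A @ s_true
s_hat = sl0(A, x)
print(np.allclose(A @ s_hat, x, atol=1e-8))   # solution stays feasible
```

On this toy instance the large entries of `s_hat` land on the true support; the annealing of sigma is what lets the Gaussian surrogate first behave like an l2 penalty and then sharpen toward the l0 count.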
New algorithm for underdetermined blind source separation
Institute of Scientific and Technical Information of China (English)
董天宝; 杨景曙
2012-01-01
A new two-step algorithm for underdetermined source separation is proposed. The mixing matrix is estimated using clustering methods. The sources are estimated using a fast sparse reconstruction algorithm that defines a continuous and differentiable function to approximate the l0-norm. The new algorithm runs fast and is easily implemented. Experiments show that the proposed algorithm runs faster than two other underdetermined source separation algorithms, based on fast l1-norm minimization and OMP, while achieving almost the same quality.
Directory of Open Access Journals (Sweden)
G. F. Zhu
2014-01-01
Based on direct measurements of half-hourly canopy evapotranspiration (ET; W m−2) using an eddy covariance (EC) system and of daily soil evaporation (E; mm d−1) using microlysimeters over a crop ecosystem in arid northwest China from 27 May to 14 September 2013, a Bayesian method was used to simultaneously parameterize the soil-surface and canopy resistances in the Shuttleworth–Wallace (S–W) model. The posterior distributions of the parameters were in most cases well updated by the multiple-measurement dataset, with relatively narrow high-probability intervals. There was good agreement between measured and simulated values of half-hourly ET and daily E, with linear regressions of y = 0.84x + 0.18 (R² = 0.83) and y = 1.01x + 0.01 (R² = 0.82), respectively. The underestimation of ET by the S–W model was mainly attributed to micro-scale advection, which can contribute added energy to the ET process in the form of downward sensible heat fluxes. The advection process should therefore be taken into account when simulating ET over heterogeneous land surfaces. Underestimations were also observed on, or shortly after, rainy days due to direct evaporation of liquid water intercepted by the canopy. Thus, a canopy interception model should be coupled to the S–W model for long-term ET simulation.
Computationally efficient Bayesian inference for inverse problems.
Energy Technology Data Exchange (ETDEWEB)
Marzouk, Youssef M.; Najm, Habib N.; Rahn, Larry A.
2007-10-01
Bayesian statistics provides a foundation for inference from noisy and incomplete data, a natural mechanism for regularization in the form of prior information, and a quantitative assessment of uncertainty in the inferred results. Inverse problems - representing indirect estimation of model parameters, inputs, or structural components - can be fruitfully cast in this framework. Complex and computationally intensive forward models arising in physical applications, however, can render a Bayesian approach prohibitive. This difficulty is compounded by high-dimensional model spaces, as when the unknown is a spatiotemporal field. We present new algorithmic developments for Bayesian inference in this context, showing strong connections with the forward propagation of uncertainty. In particular, we introduce a stochastic spectral formulation that dramatically accelerates the Bayesian solution of inverse problems via rapid evaluation of a surrogate posterior. We also explore dimensionality reduction for the inference of spatiotemporal fields, using truncated spectral representations of Gaussian process priors. These new approaches are demonstrated on scalar transport problems arising in contaminant source inversion and in the inference of inhomogeneous material or transport properties. We also present a Bayesian framework for parameter estimation in stochastic models, where intrinsic stochasticity may be intermingled with observational noise. Evaluation of a likelihood function may not be analytically tractable in these cases, and thus several alternative Markov chain Monte Carlo (MCMC) schemes, operating on the product space of the observations and the parameters, are introduced.
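The Bayesian treatment of an inverse problem described above can be illustrated on a toy version of contaminant source inversion: infer a hidden 1-D source location from noisy sensor readings using random-walk Metropolis. The forward model, prior and all numbers below are illustrative assumptions, not the paper's surrogate-accelerated scheme.

```python
import numpy as np

rng = np.random.default_rng(1)
sensors = np.linspace(-3.0, 3.0, 15)
theta_true, noise_sd = 0.7, 0.05

def forward(theta):
    # Hypothetical forward model: Gaussian plume read at fixed sensor sites.
    return np.exp(-(sensors - theta) ** 2)

y = forward(theta_true) + noise_sd * rng.standard_normal(sensors.size)

def log_post(theta):
    resid = y - forward(theta)
    # Gaussian likelihood plus a broad N(0, 2^2) prior on the location.
    return -0.5 * np.sum(resid ** 2) / noise_sd ** 2 - 0.5 * theta ** 2 / 4.0

theta, lp, samples = 0.0, log_post(0.0), []
for _ in range(5000):
    prop = theta + 0.2 * rng.standard_normal()   # random-walk proposal
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:     # Metropolis accept/reject
        theta, lp = prop, lp_prop
    samples.append(theta)

post = np.array(samples[1000:])                  # discard burn-in
print(round(float(post.mean()), 2))
```

The posterior mean lands close to the true location 0.7; the surrogate-posterior idea in the paper amounts to replacing the expensive `forward` call inside `log_post` with a cheap spectral approximation.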
DEFF Research Database (Denmark)
Naroznova, Irina; Møller, Jacob; Scheutz, Charlotte
2016-01-01
This study is dedicated to characterising the chemical composition and biochemical methane potential (BMP) of individual material fractions in untreated Danish source-separated organic household waste (SSOHW). First, data on SSOHW in different countries, available in the literature, were evaluated...... and then, secondly, laboratory analyses for eight organic material fractions comprising Danish SSOHW were conducted. No data were found in the literature that fully covered the objectives of the present study. Based on laboratory analyses, all fractions were assigned according to their specific properties...... in Denmark (untreated) was calculated, and the BMP contribution of the individual material fractions was then evaluated. Material fractions of the two general waste types, defined as "food waste" and "fibre-rich waste," were found to be anaerobically degradable with considerable BMP. Material degradability...
DEFF Research Database (Denmark)
Müller, L.; Schultz, Anna Charlotte; Fonager, J.;
2015-01-01
Norovirus outbreaks occur frequently in Denmark and it can be difficult to establish whether apparently independent outbreaks have the same origin. Here we report on six outbreaks linked to frozen raspberries, investigated separately over a period of 3 months. Norovirus from stools were sequence...... capsid P2 region. In one outbreak at a hospital canteen, frozen raspberries was associated with illness by cohort investigation (relative risk 6·1, 95% confidence interval 3·2–11). Bags of raspberries suspected to be the source were positive for genogroup I and II noroviruses, one typable virus...... was genotype GI.6 (capsid). These molecular investigations showed that the apparently independent outbreaks were the result of one contamination event of frozen raspberries. The contaminated raspberries originated from a single producer in Serbia and were originally not considered to belong to the same batch...
Pires, Carlos; Ribeiro, Andreia
2016-04-01
An efficient nonlinear method of statistical source separation of space-distributed non-Gaussian data is proposed. The method relies on so-called Independent Subspace Analysis (ISA) and is tested on a long time series of the stream-function field of an atmospheric quasi-geostrophic 3-level model (QG3) simulating the winter monthly variability of the Northern Hemisphere. ISA generalizes Independent Component Analysis (ICA): whereas ICA is restricted to scalar sources, ISA looks for multidimensional, minimally dependent, uncorrelated and non-Gaussian statistical sources among the rotated projections or subspaces of the multivariate probability distribution of the leading principal components of the working field. The rationale of the technique rests on projection pursuit, the search for data projections of enhanced interest. To accomplish the decomposition, we maximize measures of the sources' non-Gaussianity through contrast functions given by squares of nonlinear, cross-cumulant-based correlations involving the variables spanning the sources; sources are thus sought that match certain nonlinear data structures. The maximized contrast function is built so that it minimizes the mean square of the residuals of certain nonlinear regressions. The resulting residuals, after sphering, provide a new set of nonlinear variable changes that are at once uncorrelated, quasi-independent and quasi-Gaussian, an advantage over the independent components (scalar sources) obtained by ICA, where the non-Gaussianity is concentrated in the non-Gaussian scalar sources. The new scalar sources obtained by this process capture the attractor's curvature, providing improved nonlinear model indices of low-frequency atmospheric variability, which is useful since large circulation indices are nonlinearly correlated. The non-Gaussian tested sources (dyads and
Jimenez, Jose; Bott, Charles; Love, Nancy; Bratby, John
2015-12-01
Municipal wastewater contains a mixture of brown (feces and toilet paper), yellow (urine), and gray (kitchen, bathroom and wash) waters. Urine contributes approximately 70-80% of the nitrogen (N) load, 50-70% of the phosphorus (P) load and 60-70% of the pharmaceutical residues in normal domestic sewage. This study evaluated the impact of different levels of source separation of urine on an existing biological nutrient removal (BNR) process, using a process model of an existing BNR plant. Increasing the amount of urine diverted from the water reclamation facility has little impact on the effluent ammonia (NH₃-N) concentration, but the effluent nitrate (NO₃-N) concentration decreases. If nitrification is necessary, no reduction in the sludge age can be realized. However, a point is reached where the remaining influent nitrogen load matches the nitrogen requirement for biomass growth and no residual nitrogen needs to be nitrified; this allows a significant reduction in sludge age, implying reduced process volume requirements. In situations where nitrification is required, lower effluent NO₃-N concentrations were realized due to both the lower influent nitrogen content of the wastewater and a more favorable nitrogen-to-carbon ratio for denitrification; for the same reasons, the external carbon requirement for denitrification decreases as the urine separation efficiency increases. The effluent phosphorus concentration decreases as the amount of urine sent to the water reclamation facility decreases, owing to the lower influent phosphorus concentrations. In the case of chemical phosphate removal, urine separation reduces the amount of chemicals required.
Jaatinen, Sanna T; Palmroth, Marja R T; Rintala, Jukka A; Tuhkanen, Tuula A
2016-09-01
The behaviour of pharmaceuticals related to human immunodeficiency virus treatment was studied in the liquid phase of source-separated urine during six-month storage at 20°C. Six months is the recommended time for hygienization of urine before use as fertilizer. Compounds were spiked into urine at concentrations calculated to appear in urine. Assays were performed with separate compounds and with therapeutic groups of antivirals, antibiotics and anti-tuberculotics. In addition, urine was amended with either faeces or a urease inhibitor. The pharmaceutical concentrations were monitored in filtered samples by solid phase extraction and liquid chromatography. The concentration reductions of the studied compounds, with or without amendments, ranged from less than 1% to more than 99% after six-month storage. The reductions without amendments were 41.9-99% for anti-tuberculotics, <52% for antivirals (except 3TC, 75.6%) and <50% for antibiotics. In the assays with amendments, the reductions were all <50%. Faeces amendment resulted in similar or lower reduction than without it, even though bacterial activity should have increased. The urease inhibitor prevented ureolysis and the rise in pH but did not affect pharmaceutical removal. In conclusion, removal during storage might not be enough to reduce the risks associated with the studied pharmaceuticals, in which case other feasible treatment practices or means of urine utilization should be considered.
Directory of Open Access Journals (Sweden)
Duarte L.T.
2014-03-01
The development of chemical sensor arrays based on Blind Source Separation (BSS) provides a promising solution to the interference problem associated with Ion-Selective Electrodes (ISE). The main motivation behind this new approach is to ease the time-demanding calibration stage. While the first works on this problem considered only the case in which the ions under analysis have equal valences, the present work aims at developing a BSS technique that works when the ions have different charges. In this situation, the resulting mixing model belongs to a particular class of nonlinear systems that have never been studied in the BSS literature. To tackle this sort of mixing process, we adopt a recurrent network as the separating system. Concerning the BSS learning strategy, we develop a mutual-information minimization approach based on the notion of the differential of the mutual information. The method requires batch operation and can thus be used for off-line analysis. The validity of our approach is supported by experiments in which the mixing model parameters were extracted from actual data.
Bayesian least squares deconvolution
Ramos, A Asensio
2015-01-01
Aims. To develop a fully Bayesian least squares deconvolution (LSD) that can be applied to the reliable detection of magnetic signals in noise-limited stellar spectropolarimetric observations using multiline techniques. Methods. We consider LSD under the Bayesian framework and we introduce a flexible Gaussian Process (GP) prior for the LSD profile. This prior allows the result to automatically adapt to the presence of signal. We exploit several linear algebra identities to accelerate the calculations. The final algorithm can deal with thousands of spectral lines in a few seconds. Results. We demonstrate the reliability of the method with synthetic experiments and we apply it to real spectropolarimetric observations of magnetic stars. We are able to recover the magnetic signals using a small number of spectral lines, together with the uncertainty at each velocity bin. This allows the user to consider if the detected signal is reliable. The code to compute the Bayesian LSD profile is freely available.
Bayesian least squares deconvolution
Asensio Ramos, A.; Petit, P.
2015-11-01
Aims: We develop a fully Bayesian least squares deconvolution (LSD) that can be applied to the reliable detection of magnetic signals in noise-limited stellar spectropolarimetric observations using multiline techniques. Methods: We consider LSD under the Bayesian framework and we introduce a flexible Gaussian process (GP) prior for the LSD profile. This prior allows the result to automatically adapt to the presence of signal. We exploit several linear algebra identities to accelerate the calculations. The final algorithm can deal with thousands of spectral lines in a few seconds. Results: We demonstrate the reliability of the method with synthetic experiments and we apply it to real spectropolarimetric observations of magnetic stars. We are able to recover the magnetic signals using a small number of spectral lines, together with the uncertainty at each velocity bin. This allows the user to consider if the detected signal is reliable. The code to compute the Bayesian LSD profile is freely available.
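The linear-algebra core of the Bayesian LSD described above can be sketched directly: the multiline spectrum is modelled as V = M Z + noise, with a Gaussian-process prior on the common profile Z, and the posterior mean follows in closed form. The line weights in M, the squared-exponential prior and all sizes below are illustrative assumptions, not the authors' exact setup.

```python
import numpy as np

rng = np.random.default_rng(2)
n_v, n_lines = 30, 30                 # velocity bins, spectral lines
v = np.linspace(-20.0, 20.0, n_v)
z_true = np.exp(-v ** 2 / 18.0)       # hidden common line profile

# Each spectral line is a weighted copy of Z; stack them into M (n_lines*n_v, n_v).
weights = rng.uniform(0.2, 1.0, n_lines)
M = np.vstack([w * np.eye(n_v) for w in weights])
noise_sd = 0.3
V = M @ z_true + noise_sd * rng.standard_normal(M.shape[0])

# Squared-exponential GP prior on Z (length scale 5, unit variance): an assumption.
K = np.exp(-0.5 * (v[:, None] - v[None, :]) ** 2 / 5.0 ** 2)

# Gaussian posterior mean: Z_post = K M^T (M K M^T + noise_sd^2 I)^{-1} V
S = M @ K @ M.T + noise_sd ** 2 * np.eye(M.shape[0])
z_post = K @ M.T @ np.linalg.solve(S, V)
print(float(np.max(np.abs(z_post - z_true))) < 0.2)
```

The linear-algebra identities the paper exploits amount to rearranging this solve so that the matrix being inverted has the size of Z rather than of V, which is what makes thousands of lines tractable.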
Hybrid Batch Bayesian Optimization
Azimi, Javad; Fern, Xiaoli
2012-01-01
Bayesian Optimization aims at optimizing an unknown non-convex/concave function that is costly to evaluate. We are interested in application scenarios where concurrent function evaluations are possible. Under such a setting, BO could choose to either sequentially evaluate the function, one input at a time and wait for the output of the function before making the next selection, or evaluate the function at a batch of multiple inputs at once. These two different settings are commonly referred to as the sequential and batch settings of Bayesian Optimization. In general, the sequential setting leads to better optimization performance as each function evaluation is selected with more information, whereas the batch setting has an advantage in terms of the total experimental time (the number of iterations). In this work, our goal is to combine the strength of both settings. Specifically, we systematically analyze Bayesian optimization using Gaussian process as the posterior estimator and provide a hybrid algorithm t...
Loredo, T J
2004-01-01
I describe a framework for adaptive scientific exploration based on iterating an Observation--Inference--Design cycle that allows adjustment of hypotheses and observing protocols in response to the results of observation on-the-fly, as data are gathered. The framework uses a unified Bayesian methodology for the inference and design stages: Bayesian inference to quantify what we have learned from the available data and predict future data, and Bayesian decision theory to identify which new observations would teach us the most. When the goal of the experiment is simply to make inferences, the framework identifies a computationally efficient iterative ``maximum entropy sampling'' strategy as the optimal strategy in settings where the noise statistics are independent of signal properties. Results of applying the method to two ``toy'' problems with simulated data--measuring the orbit of an extrasolar planet, and locating a hidden one-dimensional object--show the approach can significantly improve observational eff...
Bayesian Exploratory Factor Analysis
DEFF Research Database (Denmark)
Conti, Gabriella; Frühwirth-Schnatter, Sylvia; Heckman, James J.;
2014-01-01
This paper develops and applies a Bayesian approach to Exploratory Factor Analysis that improves on ad hoc classical approaches. Our framework relies on dedicated factor models and simultaneously determines the number of factors, the allocation of each measurement to a unique factor, and the corresponding factor loadings. Classical identification criteria are applied and integrated into our Bayesian procedure to generate models that are stable and clearly interpretable. A Monte Carlo study confirms the validity of the approach. The method is used to produce interpretable low dimensional aggregates...
Bayesian multiple target tracking
Streit, Roy L
2013-01-01
This second edition has undergone substantial revision from the 1999 first edition, recognizing that a lot has changed in the multiple target tracking field. One of the most dramatic changes is in the widespread use of particle filters to implement nonlinear, non-Gaussian Bayesian trackers. This book views multiple target tracking as a Bayesian inference problem. Within this framework it develops the theory of single target tracking, multiple target tracking, and likelihood ratio detection and tracking. In addition to providing a detailed description of a basic particle filter that implements
Bayesian and frequentist inequality tests
David M. Kaplan; Zhuo, Longhao
2016-01-01
Bayesian and frequentist criteria are fundamentally different, but often posterior and sampling distributions are asymptotically equivalent (and normal). We compare Bayesian and frequentist hypothesis tests of inequality restrictions in such cases. For finite-dimensional parameters, if the null hypothesis is that the parameter vector lies in a certain half-space, then the Bayesian test has (frequentist) size $\\alpha$; if the null hypothesis is any other convex subspace, then the Bayesian test...
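The half-space case mentioned in the abstract has a simple numeric illustration: when the posterior for a scalar parameter is N(theta_hat, se²), the posterior probability of H0: theta ≤ 0 equals the one-sided frequentist p-value exactly, so rejecting when that probability falls below alpha gives a test of size alpha. The estimate and standard error below are hypothetical.

```python
import math

def std_normal_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

theta_hat, se = 1.3, 0.8          # hypothetical estimate and standard error
post_prob_H0 = std_normal_cdf((0.0 - theta_hat) / se)   # P(theta <= 0 | data)
p_value = 1.0 - std_normal_cdf(theta_hat / se)          # one-sided p-value
print(abs(post_prob_H0 - p_value) < 1e-12)              # -> True
```

The identity holds because Phi(-z) = 1 - Phi(z); the paper's point is that for other convex null sets this coincidence breaks down.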
A. Korattikara; V. Rathod; K. Murphy; M. Welling
2015-01-01
We consider the problem of Bayesian parameter estimation for deep neural networks, which is important in problem settings where we may have little data, and/or where we need accurate posterior predictive densities p(y|x, D), e.g., for applications involving bandits or active learning. One simple ap
Bayesian logistic regression analysis
Van Erp, H.R.N.; Van Gelder, P.H.A.J.M.
2012-01-01
In this paper we present a Bayesian logistic regression analysis. It is found that if one wishes to derive the posterior distribution of the probability of some event, then, together with the traditional Bayes Theorem and the integrating out of nuisance parameters, the Jacobian transformation is an
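The posterior described in this record can also be explored numerically. The sketch below is not the paper's Jacobian-transformation derivation; it is a minimal illustration, under assumed synthetic data, of sampling the posterior of a single logistic-regression coefficient with a Gaussian prior via random-walk Metropolis. All settings (prior scale, proposal width, sample sizes) are illustrative assumptions.

```python
import math
import random

random.seed(0)

# Synthetic data (assumption): y ~ Bernoulli(sigmoid(beta_true * x))
beta_true = 1.5
xs = [random.uniform(-2, 2) for _ in range(200)]
ys = [1 if random.random() < 1/(1 + math.exp(-beta_true*x)) else 0 for x in xs]

def log_post(beta):
    # Gaussian prior N(0, 10^2) plus the Bernoulli log-likelihood
    lp = -beta**2 / (2 * 10**2)
    for x, y in zip(xs, ys):
        p = 1/(1 + math.exp(-beta*x))
        p = min(max(p, 1e-12), 1 - 1e-12)   # guard against log(0)
        lp += y*math.log(p) + (1 - y)*math.log(1 - p)
    return lp

# Random-walk Metropolis: accept a proposal with prob min(1, post ratio)
beta, lp_cur, samples = 0.0, log_post(0.0), []
for i in range(5000):
    prop = beta + random.gauss(0, 0.3)
    lp_prop = log_post(prop)
    if math.log(random.random()) < lp_prop - lp_cur:
        beta, lp_cur = prop, lp_prop
    if i >= 1000:                            # discard burn-in
        samples.append(beta)

post_mean = sum(samples)/len(samples)
print(round(post_mean, 2))
```

With enough data the posterior mean concentrates near the generating coefficient; the grid-free sampler avoids having to normalize the posterior explicitly.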
Loredo, Thomas J.
2004-04-01
I describe a framework for adaptive scientific exploration based on iterating an Observation-Inference-Design cycle that allows adjustment of hypotheses and observing protocols in response to the results of observation on-the-fly, as data are gathered. The framework uses a unified Bayesian methodology for the inference and design stages: Bayesian inference to quantify what we have learned from the available data and predict future data, and Bayesian decision theory to identify which new observations would teach us the most. When the goal of the experiment is simply to make inferences, the framework identifies a computationally efficient iterative ``maximum entropy sampling'' strategy as the optimal strategy in settings where the noise statistics are independent of signal properties. Results of applying the method to two ``toy'' problems with simulated data-measuring the orbit of an extrasolar planet, and locating a hidden one-dimensional object-show the approach can significantly improve observational efficiency in settings that have well-defined nonlinear models. I conclude with a list of open issues that must be addressed to make Bayesian adaptive exploration a practical and reliable tool for optimizing scientific exploration.
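The "maximum entropy sampling" idea — observe next where the predictive entropy (for Gaussian noise, the predictive variance) is largest — can be sketched in a toy setting. The example below is an illustrative assumption, not the paper's exoplanet-orbit or object-location model: it infers the slope of a line on a grid and repeatedly picks the candidate measurement location with the largest predictive variance.

```python
import math
import random

random.seed(1)

a_true, noise_sd = 2.0, 0.5
grid = [i/10 - 3 for i in range(61)]   # candidate slope values a in [-3, 3]
logp = [0.0]*len(grid)                 # flat prior over the grid

def posterior_var():
    m = max(logp)
    w = [math.exp(l - m) for l in logp]
    s = sum(w)
    mean = sum(wi*a for wi, a in zip(w, grid))/s
    return sum(wi*(a - mean)**2 for wi, a in zip(w, grid))/s

candidates = [0.5, 1.0, 2.0, 4.0]      # possible measurement locations x
for step in range(6):
    # Maximum entropy sampling: for y = a*x + noise, the predictive variance
    # at x is Var(a)*x^2 + sigma^2, so the most informative point is the one
    # maximizing it (here, the extreme x -- as expected for a linear model).
    va = posterior_var()
    x = max(candidates, key=lambda c: va*c*c + noise_sd**2)
    y = a_true*x + random.gauss(0, noise_sd)   # simulate the observation
    for i, a in enumerate(grid):               # Bayesian update of the grid
        logp[i] += -(y - a*x)**2 / (2*noise_sd**2)

print(round(posterior_var(), 4))
```

Each cycle alternates inference (the grid update) with design (the argmax over candidates), mirroring the Observation-Inference-Design loop described above.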
DEFF Research Database (Denmark)
Antoniou, Constantinos; Harrison, Glenn W.; Lau, Morten I.;
2015-01-01
A large literature suggests that many individuals do not apply Bayes’ Rule when making decisions that depend on them correctly pooling prior information and sample data. We replicate and extend a classic experimental study of Bayesian updating from psychology, employing the methods of experimental...
DEFF Research Database (Denmark)
Hartelius, Karsten; Carstensen, Jens Michael
2003-01-01
A method for locating distorted grid structures in images is presented. The method is based on the theories of template matching and Bayesian image restoration. The grid is modeled as a deformable template. Prior knowledge of the grid is described through a Markov random field (MRF) model which...
Institute of Scientific and Technical Information of China (English)
陈霞; 田荣艳
2012-01-01
Based on an analysis of the noise-generation mechanism of pyrotechnically actuated separation devices that separate underwater, and subject to the required separation margin, a dedicated pyrotechnic separation device was studied to explore the relationship between charge weight and the sound source level of the separation noise, and comparative underwater noise tests were carried out. The results provide a basis for the noise-reduction design of underwater weapons during underwater separation.
Isomer separation of $^{70g}Cu$ and $^{70m}Cu$ with a resonance ionization laser ion source
Köster, U; Mishin, V I; Weissman, L; Huyse, M; Kruglov, K; Müller, W F; Van Duppen, P; Van Roosbroeck, J; Thirolf, P G; Thomas, H C; Weisshaar, D W; Schulze, W; Borcea, R; La Commara, M; Schatz, H; Schmidt, K; Röttger, S; Huber, G; Sebastian, V; Kratz, K L; Catherall, R; Georg, U; Lettry, Jacques; Oinonen, M; Ravn, H L; Simon, H
2000-01-01
Radioactive copper isotopes were ionized with the resonance ionization laser ion source at the on-line isotope separator ISOLDE (CERN). Using the different hyperfine structure in the 3d^10 4s ^2S_1/2 - 3d^10 4p ^2P^o_1/2 transition, the low- and high-spin isomers of ^70Cu were selectively enhanced by tuning the laser wavelength. The light was provided by a narrow-bandwidth dye laser pumped by copper vapor lasers and frequency doubled in a BBO crystal. The ground-state to isomeric-state intensity ratio could be varied by a factor of 30, allowing gamma transitions to be assigned unambiguously to the decay of the individual isomers. It is shown that the method can also be used to determine magnetic moments. In a first experiment, a magnetic moment of (+)1.8(3) μ_N was deduced for the 1^+ ground state of ^70Cu and (±)1.2(3) μ_N for the high-spin isomer. (20 refs).
Sharma, Manu; Hennessy, Ricky; Markey, Mia K; Tunnell, James W
2013-12-01
A two-layer Monte Carlo lookup table-based inverse model is validated with two-layered phantoms across physiologically relevant optical property ranges. Reflectance data for source-detector separations of 370 μm and 740 μm were collected from these two-layered phantoms and top layer thickness, reduced scattering coefficient and the top and bottom layer absorption coefficients were extracted using the inverse model and compared to the known values. The results of the phantom verification show that this method is able to accurately extract top layer thickness and scattering when the top layer thickness ranges from 0 to 550 μm. In this range, top layer thicknesses were measured with an average error of 10% and the reduced scattering coefficient was measured with an average error of 15%. The accuracy of top and bottom layer absorption coefficient measurements was found to be highly dependent on top layer thickness, which agrees with physical expectation; however, within appropriate thickness ranges, the error for absorption properties varies from 12-25%. PMID:24466475
Graf, John; Taylor, Dale; Martinez, James
2014-01-01
Combined with a mechanical compressor, a Solid Electrolyte Oxygen Separator (SEOS) should be capable of producing ABO-grade oxygen at pressures >2400 psia on the space station. Feasibility tests using a SEOS integrated with a mechanical compressor identified an unexpected contaminant in the oxygen: water vapour was found in the oxygen product, sometimes at concentrations higher than 40 ppm (the ABO limit for water vapour is 7 ppm). If solid electrolyte membranes are really "infinitely selective" to oxygen, as they are reported to be, where did the water come from? And if water is getting into the oxygen, what other contaminants might? Microscopic analyses of wafers, welds, and oxygen delivery tubes were performed in an attempt to find the source of the water vapour contamination. Hot and cold pressure decay tests were performed. Measurements of water vapour as a function of O2 delivery rate, O2 delivery pressure, and process air humidity levels were the most instructive in finding the source of the contamination (Fig 3). Water contamination was directly affected by the oxygen delivery rate (doubling the oxygen production rate cut the water level in half). It was also affected by process air humidity levels and delivery pressure in a way that indicates the water was diffusing into the oxygen delivery system.
Liao, Wei; Hua, Xue-Ming; Zhang, Wang; Li, Fang
2014-05-01
In the present paper, the authors calculated the plasma's peak electron temperatures under different heat source separation distances in laser-pulse GMAW hybrid welding based on Boltzmann spectrometry. Peak electron densities of the plasma under the corresponding conditions were also calculated using the Stark width of the plasma spectrum. Combined with high-speed photography, the effect of heat source separation distance on electron temperature and electron density was studied. The results show that with increasing heat source separation distance, the electron temperatures and electron densities of the laser plasma did not change significantly. However, the electron temperatures of the arc plasma decreased, and the electron densities of the arc plasma first increased and then decreased. PMID:25095401
State Information in Bayesian Games
Cuff, Paul
2009-01-01
Two-player zero-sum repeated games are well understood. Computing the value of such a game is straightforward. Additionally, if the payoffs are dependent on a random state of the game known to one, both, or neither of the players, the resulting value of the game has been analyzed under the framework of Bayesian games. This investigation considers the optimal performance in a game when a helper is transmitting state information to one of the players. Encoding information for an adversarial setting (game) requires a different result than rate-distortion theory provides. Game theory has accentuated the importance of randomization (mixed strategy), which does not find a significant role in most communication modems and source coding codecs. Higher rates of communication, used in the right way, allow the message to include the necessary random component useful in games.
Kernel Approximate Bayesian Computation for Population Genetic Inferences
Nakagome, Shigeki; Fukumizu, Kenji; Mano, Shuhei
2012-01-01
Approximate Bayesian computation (ABC) is a likelihood-free approach for Bayesian inferences based on a rejection algorithm method that applies a tolerance of dissimilarity between summary statistics from observed and simulated data. Although several improvements to the algorithm have been proposed, none of these improvements avoid the following two sources of approximation: 1) lack of sufficient statistics: sampling is not from the true posterior density given data but from an approximate po...
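The rejection algorithm with a tolerance on summary statistics that this record refers to can be sketched directly. The Gaussian toy model, uniform prior range, summary statistic, and tolerance below are illustrative assumptions, not the population-genetic setting of the paper.

```python
import random
import statistics

random.seed(42)

# "Observed" data from an unknown Gaussian (assumption: mu_true = 3.0, sd = 1.0)
mu_true = 3.0
obs = [random.gauss(mu_true, 1.0) for _ in range(100)]
s_obs = statistics.mean(obs)            # summary statistic of the observed data

# Rejection ABC: draw a parameter from the prior, simulate data under it,
# and accept when simulated and observed summaries are within the tolerance.
accepted = []
while len(accepted) < 300:
    mu = random.uniform(-10, 10)        # draw from a broad uniform prior
    sim = [random.gauss(mu, 1.0) for _ in range(100)]
    if abs(statistics.mean(sim) - s_obs) < 0.2:
        accepted.append(mu)

print(round(statistics.mean(accepted), 1))
```

The accepted draws approximate the posterior only up to the two sources of error the abstract names: the summary statistic may lose information, and the nonzero tolerance flattens the approximation.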
Bayesian information fusion networks for biosurveillance applications.
Mnatsakanyan, Zaruhi R; Burkom, Howard S; Coberly, Jacqueline S; Lombardo, Joseph S
2009-01-01
This study introduces new information fusion algorithms to enhance disease surveillance systems with Bayesian decision support capabilities. A detection system was built and tested using chief complaints from emergency department visits, International Classification of Diseases Revision 9 (ICD-9) codes from records of outpatient visits to civilian and military facilities, and influenza surveillance data from health departments in the National Capital Region (NCR). Data anomalies were identified, and the distribution of time offsets between events in the multiple data streams was established. A Bayesian network was built to fuse data from multiple sources and identify influenza-like, epidemiologically relevant events. Results showed increased specificity compared with the alerts generated by temporal anomaly detection algorithms currently deployed by NCR health departments. Further research should be done to investigate correlations between data sources for efficient fusion of the collected data.
Probability and Bayesian statistics
1987-01-01
This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics", which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985, the symposium was dedicated to his memory and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are published especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability across psychological aspects of formulating subjective probability statements, abstract measure theoretical considerations, contributions to theoretical statistics an...
DEFF Research Database (Denmark)
Mørup, Morten; Schmidt, Mikkel N
2012-01-01
Many networks of scientific interest naturally decompose into clusters or communities with comparatively fewer external than internal links; however, current Bayesian models of network communities do not exert this intuitive notion of communities. We formulate a nonparametric Bayesian model...... for community detection consistent with an intuitive definition of communities and present a Markov chain Monte Carlo procedure for inferring the community structure. A Matlab toolbox with the proposed inference procedure is available for download. On synthetic and real networks, our model detects communities...... consistent with ground truth, and on real networks, it outperforms existing approaches in predicting missing links. This suggests that community structure is an important structural property of networks that should be explicitly modeled....
Brody, Samuel; Lapata, Mirella
2009-01-01
Sense induction seeks to automatically identify word senses directly from a corpus. A key assumption underlying previous work is that the context surrounding an ambiguous word is indicative of its meaning. Sense induction is thus typically viewed as an unsupervised clustering problem where the aim is to partition a word’s contexts into different classes, each representing a word sense. Our work places sense induction in a Bayesian context by modeling the contexts of the ambiguous word as samp...
Bayesian Generalized Rating Curves
Helgi Sigurðarson 1985
2014-01-01
A rating curve is a curve or a model that describes the relationship between water elevation, or stage, and discharge in an observation site in a river. The rating curve is fit from paired observations of stage and discharge. The rating curve then predicts discharge given observations of stage and this methodology is applied as stage is substantially easier to directly observe than discharge. In this thesis a statistical rating curve model is proposed working within the framework of Bayesian...
Efficient Bayesian Phase Estimation
Wiebe, Nathan; Granade, Chris
2016-07-01
We introduce a new method called rejection filtering that we use to perform adaptive Bayesian phase estimation. Our approach has several advantages: it is classically efficient, easy to implement, achieves Heisenberg limited scaling, resists depolarizing noise, tracks time-dependent eigenstates, recovers from failures, and can be run on a field programmable gate array. It also outperforms existing iterative phase estimation algorithms such as Kitaev's method.
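The inference problem behind this record can be illustrated with a grid-based Bayesian update using the standard iterative phase-estimation likelihood P(0 | φ; M, θ) = (1 + cos(M(φ − θ)))/2; the paper's rejection filtering replaces such an explicit grid with a sampled, filter-style approximation. The measurement schedule and grid size below are illustrative assumptions.

```python
import math
import random

random.seed(7)

phi_true = 1.0                              # unknown eigenphase (assumption)
grid = [2*math.pi*k/500 for k in range(500)]
w = [1.0/500]*500                           # uniform prior over the grid

def measure(M, theta):
    # Simulate one experiment with the standard likelihood P(0 | phi; M, theta)
    p0 = (1 + math.cos(M*(phi_true - theta)))/2
    return 0 if random.random() < p0 else 1

# Exponentially growing M narrows the posterior while the earlier low-M
# measurements resolve the 2*pi/M aliasing of the later ones.
for M in [1, 2, 4, 8, 16, 32]:
    for _ in range(20):
        theta = random.uniform(0, 2*math.pi)
        d = measure(M, theta)
        like = [(1 + math.cos(M*(phi - theta)))/2 for phi in grid]
        if d == 1:
            like = [1 - l for l in like]
        w = [wi*li for wi, li in zip(w, like)]
        s = sum(w)
        w = [wi/s for wi in w]              # renormalize after each update

est = max(zip(w, grid))[1]                  # MAP estimate on the grid
print(round(est, 2))
```

The grid update costs O(grid size) per measurement, which is what rejection filtering is designed to avoid while keeping the Bayesian update.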
Bayesian theory and applications
Dellaportas, Petros; Polson, Nicholas G; Stephens, David A
2013-01-01
The development of hierarchical models and Markov chain Monte Carlo (MCMC) techniques forms one of the most profound advances in Bayesian analysis since the 1970s and provides the basis for advances in virtually all areas of applied and theoretical Bayesian statistics. This volume guides the reader along a statistical journey that begins with the basic structure of Bayesian theory, and then provides details on most of the past and present advances in this field. The book has a unique format. There is an explanatory chapter devoted to each conceptual advance followed by journal-style chapters that provide applications or further advances on the concept. Thus, the volume is both a textbook and a compendium of papers covering a vast range of topics. It is appropriate for a well-informed novice interested in understanding the basic approach, methods and recent applications. Because of its advanced chapters and recent work, it is also appropriate for a more mature reader interested in recent applications and devel...
Wiegerinck, Wim; Schoenaker, Christiaan; Duane, Gregory
2016-04-01
Recently, methods for model fusion by dynamically combining model components in an interactive ensemble have been proposed. In these proposals, fusion parameters have to be learned from data. One can view these systems as parametrized dynamical systems. We address the question of learnability of dynamical systems with respect to both short-term (vector field) and long-term (attractor) behavior. In particular we are interested in learning in the imperfect model class setting, in which the ground truth has a higher complexity than the models, e.g. due to unresolved scales. We take a Bayesian point of view and we define a joint log-likelihood that consists of two terms: one is the vector field error and the other is the attractor error, for which we take the L1 distance between the stationary distributions of the model and the assumed ground truth. In the context of linear models (like so-called weighted supermodels), and assuming a Gaussian error model in the vector fields, vector field learning leads to a tractable Gaussian solution. This solution can then be used as a prior for the next step, Bayesian attractor learning, in which the attractor error is used as a log-likelihood term. Bayesian attractor learning is implemented by elliptical slice sampling, a sampling method for systems with a Gaussian prior and a non-Gaussian likelihood. Simulations with a partially observed driven Lorenz 63 system illustrate the approach.
Nara, T.; Koiwa, K.; Takagi, S.; Oyama, D.; Uehara, G.
2014-05-01
This paper presents an algebraic reconstruction method for dipole-quadrupole sources using magnetoencephalography data. Compared to the conventional methods with the equivalent current dipoles source model, our method can more accurately reconstruct two close, oppositely directed sources. Numerical simulations show that two sources on both sides of the longitudinal fissure of cerebrum are stably estimated. The method is verified using a quadrupolar source phantom, which is composed of two isosceles-triangle-coils with parallel bases.
Wada, Ted S.
In this dissertation, we build a foundation for what we refer to as the system approach to signal enhancement as we focus on the acoustic echo cancellation (AEC) problem. Such a “system” perspective aims for the integration of individual components, or algorithms, into a cohesive unit for the benefit of the system as a whole to cope with real-world enhancement problems. The standard system identification approach by minimizing the mean square error (MSE) of a linear system is sensitive to distortions that greatly affect the quality of the identification result. Therefore, we begin by examining in detail the technique of using a noise-suppressing nonlinearity in the adaptive filter error feedback-loop of the LMS algorithm when there is an interference at the near end, where the source of distortion may be linear or nonlinear. We provide a thorough derivation and analysis of the error recovery nonlinearity (ERN) that “enhances” the filter estimation error prior to the adaptation to transform the corrupted error’s distribution into a desired one, or very close to it, in order to assist the linear adaptation process. We reveal important connections of the residual echo enhancement (REE) technique to other existing AEC and signal enhancement procedures, where the technique is well-founded in the information-theoretic sense and has strong ties to independent component analysis (ICA), which is the basis for blind source separation (BSS) that permits unsupervised adaptation in the presence of multiple interfering signals. Notably, the single-channel AEC problem can be viewed as a special case of semi-blind source separation (SBSS) where one of the source signals is partially known, i.e., the far-end microphone signal that generates the near-end acoustic echo. Indeed, SBSS optimized via ICA leads to the system combination of the LMS algorithm with the ERN that allows continuous and stable adaptation even during double talk. Next, we extend the system perspective
Energy Technology Data Exchange (ETDEWEB)
Karim Ghani, Wan Azlina Wan Ab., E-mail: wanaz@eng.upm.edu.my [Department of Chemical and Environmental Engineering, Faculty of Engineering, University Putra Malaysia, 43400 Serdang, Selangor Darul Ehsan (Malaysia); Rusli, Iffah Farizan, E-mail: iffahrusli@yahoo.com [Department of Chemical and Environmental Engineering, Faculty of Engineering, University Putra Malaysia, 43400 Serdang, Selangor Darul Ehsan (Malaysia); Biak, Dayang Radiah Awang, E-mail: dayang@eng.upm.edu.my [Department of Chemical and Environmental Engineering, Faculty of Engineering, University Putra Malaysia, 43400 Serdang, Selangor Darul Ehsan (Malaysia); Idris, Azni, E-mail: azni@eng.upm.edu.my [Department of Chemical and Environmental Engineering, Faculty of Engineering, University Putra Malaysia, 43400 Serdang, Selangor Darul Ehsan (Malaysia)
2013-05-15
Highlights: ► The theory of planned behaviour (TPB) was used to identify the factors influencing participation in source separation of food waste, using self-administered questionnaires. ► The findings suggest several implications for the development and implementation of a waste-separation-at-home programme. ► The analysis indicates that attitude towards waste separation is the main predictor, which in turn could be a significant predictor of the respondents' actual food waste separation behaviour. ► To date, no similar findings have been reported elsewhere, and this finding will be beneficial to local authorities as an indicator in designing campaigns that promote waste separation programmes and reinforce positive attitudes. - Abstract: Tremendous increases in biodegradable (food) waste generation significantly impact the local authorities, who are responsible for managing, treating and disposing of this waste. Separation of food waste at its generation source is identified as an effective means of reducing the amount of food waste sent to landfill; the separated waste can be reused as feedstock for downstream treatment processes, namely composting or anaerobic digestion. However, these efforts will only succeed with positive attitudes and high participation rates by the public. Thus, a social survey (using questionnaires) to analyse the public's views and the factors influencing participation in source separation of food waste in households, based on the theory of planned behaviour (TPB), was performed in June and July 2011 among selected staff at Universiti Putra Malaysia, Serdang, Selangor. The survey demonstrates that the public has positive intentions to participate provided that the opportunities, facilities and knowledge on waste separation at source are adequately prepared by the respective local authorities. Furthermore, good moral values and situational factors such as storage convenience and
Karim Ghani, Wan Azlina Wan Ab; Rusli, Iffah Farizan; Biak, Dayang Radiah Awang; Idris, Azni
2013-05-01
Tremendous increases in biodegradable (food) waste generation significantly impact the local authorities, who are responsible for managing, treating and disposing of this waste. Separation of food waste at its generation source is identified as an effective means of reducing the amount of food waste sent to landfill; the separated waste can be reused as feedstock for downstream treatment processes, namely composting or anaerobic digestion. However, these efforts will only succeed with positive attitudes and high participation rates by the public. Thus, a social survey (using questionnaires) to analyse the public's views and the factors influencing participation in source separation of food waste in households, based on the theory of planned behaviour (TPB), was performed in June and July 2011 among selected staff at Universiti Putra Malaysia, Serdang, Selangor. The survey demonstrates that the public has positive intentions to participate provided that the opportunities, facilities and knowledge on waste separation at source are adequately prepared by the respective local authorities. Furthermore, good moral values and situational factors such as storage convenience and collection times also encourage the public's involvement and, consequently, the participation rate. The findings from this study may provide a useful indicator to the waste management authorities in Malaysia in identifying mechanisms for the future development and implementation of food waste source separation activities in household programmes and communication campaigns which advocate the use of these programmes. PMID:23415709
Bayesian optimization for materials design
Frazier, Peter I.; Wang, Jialei
2015-01-01
We introduce Bayesian optimization, a technique developed for optimizing time-consuming engineering simulations and for fitting machine learning models on large datasets. Bayesian optimization guides the choice of experiments during materials design and discovery to find good material designs in as few experiments as possible. We focus on the case when materials designs are parameterized by a low-dimensional vector. Bayesian optimization is built on a statistical technique called Gaussian pro...
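A minimal Bayesian-optimization loop — a Gaussian-process surrogate with an expected-improvement acquisition — might look like the sketch below. The one-dimensional objective, RBF kernel length scale, and experiment budget are illustrative assumptions, not a materials-design model; a real application would use a tested GP library rather than this hand-rolled solver.

```python
import math

def f(x):                          # hypothetical expensive black-box objective
    return -(x - 0.6)**2

def k(a, b, ls=0.2):               # squared-exponential (RBF) kernel
    return math.exp(-(a - b)**2 / (2*ls*ls))

def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            fac = M[r][c] / M[c][c]
            for j in range(c, n + 1):
                M[r][j] -= fac * M[c][j]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][j]*x[j] for j in range(r + 1, n))) / M[r][r]
    return x

def pdf(z):
    return math.exp(-z*z/2) / math.sqrt(2*math.pi)

def cdf(z):
    return (1 + math.erf(z / math.sqrt(2))) / 2

X = [0.0, 1.0]                     # initial experiments
Y = [f(x) for x in X]
grid = [i/200 for i in range(201)] # candidate designs in [0, 1]

for _ in range(8):
    n = len(X)
    K = [[k(X[i], X[j]) + (1e-8 if i == j else 0.0) for j in range(n)]
         for i in range(n)]
    alpha = solve(K, Y)            # K^{-1} y for the GP posterior mean
    best = max(Y)

    def ei(x):
        ks = [k(x, xi) for xi in X]
        mu = sum(a*b for a, b in zip(ks, alpha))           # GP posterior mean
        v = solve(K, ks)
        var = max(k(x, x) - sum(a*b for a, b in zip(ks, v)), 1e-12)
        sd = math.sqrt(var)
        z = (mu - best) / sd
        return (mu - best)*cdf(z) + sd*pdf(z)              # expected improvement

    x_next = max(grid, key=ei)     # run the "experiment" with the highest EI
    X.append(x_next)
    Y.append(f(x_next))

x_best = X[Y.index(max(Y))]
print(round(x_best, 2))
```

The acquisition function trades off the surrogate's mean (exploitation) against its variance (exploration), which is how the method finds good designs in few experiments.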
Ri, Yong-Wu; Im, Song-Jin
2014-01-01
The modified Beer-Lambert law (MBL) and spatially resolved spectroscopy are used to measure tissue oxygenation in muscles and brains by continuous-wave near-infrared spectroscopy. Spatially resolved spectroscopy predicts the change in the concentration of the absorber by measuring the slope of the attenuation data as a function of source-detector separation, and calculates the absorption coefficients of tissue from that slope at separation distances where the slope is linear. This study analyzed the appropriate source-detector separation distance by using the diffusion approximation solution for photon migration when predicting the absorption coefficient by spatially resolved spectroscopy on the basis of the reflective image of the tissue. We constructed a three-dimensional attenuation image with the absorption coefficient, reduced scattering coefficient and separation distance as its axes, and obtained the attenuation data cube by calculating the attenuation on a certain interva...
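The slope-based recovery described above can be illustrated under a simplified large-separation diffusion form R(ρ) ∝ exp(−μ_eff·ρ)/ρ², with μ_eff ≈ sqrt(3·μ_a·μ_s′). The optical properties and separations below are illustrative assumptions, and the noiseless synthetic data are generated from the same model that is inverted.

```python
import math

mu_a, mu_sp = 0.1, 10.0                 # assumed mu_a, mu_s' in cm^-1
mu_eff = math.sqrt(3*mu_a*mu_sp)        # effective attenuation (mu_a << mu_s')

rhos = [2.0, 2.5, 3.0, 3.5, 4.0]        # source-detector separations (cm)
# Simplified large-separation diffusion form: R(rho) ~ exp(-mu_eff*rho)/rho^2,
# so attenuation A = -ln R = mu_eff*rho + 2*ln(rho) (up to a constant)
atten = [mu_eff*r + 2*math.log(r) for r in rhos]

# Subtract the geometric 2*ln(rho) term; the slope versus rho is then mu_eff
ys = [a - 2*math.log(r) for a, r in zip(atten, rhos)]
n = len(rhos)
sx, sy = sum(rhos), sum(ys)
sxx = sum(r*r for r in rhos)
sxy = sum(r*y for r, y in zip(rhos, ys))
slope = (n*sxy - sx*sy) / (n*sxx - sx*sx)   # least-squares slope = mu_eff

mu_a_est = slope**2 / (3*mu_sp)             # invert mu_eff = sqrt(3*mu_a*mu_s')
print(round(mu_a_est, 4))
```

In practice the linearity of the slope only holds over a restricted separation range, which is exactly the question the study above addresses.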
Bayesian Posteriors Without Bayes' Theorem
Hill, Theodore P
2012-01-01
The classical Bayesian posterior arises naturally as the unique solution of several different optimization problems, without the necessity of interpreting data as conditional probabilities and then using Bayes' Theorem. For example, the classical Bayesian posterior is the unique posterior that minimizes the loss of Shannon information in combining the prior and the likelihood distributions. These results, direct corollaries of recent results about conflations of probability distributions, reinforce the use of Bayesian posteriors, and may help partially reconcile some of the differences between classical and Bayesian statistics.
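The conflation view — the posterior as the normalized pointwise product of the prior and likelihood densities — can be checked numerically against the conjugate Beta-binomial result. The grid, prior, and data below are illustrative assumptions.

```python
# Grid over theta in (0, 1); prior Beta(2, 2); data: 7 heads in 10 flips
grid = [(i + 0.5)/200 for i in range(200)]        # midpoint grid
prior = [t*(1 - t) for t in grid]                 # Beta(2,2) up to a constant
like  = [t**7 * (1 - t)**3 for t in grid]         # binomial likelihood

# "Conflation": normalized pointwise product -- identical to Bayes' rule here
post = [p*l for p, l in zip(prior, like)]
s = sum(post)
post = [p/s for p in post]

mean = sum(t*p for t, p in zip(grid, post))
print(round(mean, 3))
```

The product is proportional to a Beta(9, 5) kernel, so the grid posterior mean should agree with the conjugate answer 9/14 up to discretization error, matching the claim that the conflation coincides with the classical posterior.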
Institute of Scientific and Technical Information of China (English)
谭清磊; 陈国明; 付建民
2012-01-01
In a high-sulfur gas gathering station, risk analysis of the wellhead separator is an important part of safe operation and management. The traditional cause-effect diagram method of risk assessment is concise, intuitive and logically rigorous, but it has limitations: it is restricted to binary event states and fixed logical relationships among the problems involved. The Bayesian network is a newer method of system risk analysis that can better express the uncertain relationships among variables, supports two-way uncertainty reasoning, and can describe multiple states of events more clearly, though it is less visually intuitive than the cause-effect diagram. This paper analyzes a separator low-liquid-level accident with both a cause-effect diagram and a Bayesian network, compares the results, and makes full use of the advantages of both methods. The results show that the two methods together provide a better risk analysis of the separator. The accident caused by low liquid level in the separator was first modeled with the cause-effect diagram; the failures of the wellhead separator safety barriers and the likely causes of the accident were then tested and analyzed so as to reduce the likely causes to the minimal set. In case
A high-resolution direction-of-arrival estimation based on Bayesian method
Institute of Scientific and Technical Information of China (English)
HUANG Jianguo; SUN Yi; XU Pu; LU Ying; LIU Kewei
2004-01-01
A Bayesian high-resolution direction-of-arrival (DOA) estimator is proposed based on the maximum a posteriori principle. The statistical performance of the Bayesian high-resolution DOA estimator is also investigated. Comparison with MUSIC and the maximum likelihood estimator (MLE) shows that the Bayesian method has higher resolution and more accurate estimates for both incoherent and coherent sources. It is also more robust in the case of low SNR.
Institute of Scientific and Technical Information of China (English)
刘海林; 谢胜利; 章晋龙
2003-01-01
This paper investigates the separability of blind sources in convolutive mixtures under the assumption that the number of sensors is less than the number of sources. When the time delay is small, a necessary condition for the separability of the ill-conditioned convolutive mixtures is given by transforming the blind separation of convolutive mixtures into that of linear mixtures.
Dongliang Zhang; Guangqing Huang; Xiaoling Yin; Qinghua Gong
2015-01-01
Understanding the factors that affect residents' waste separation behaviors helps in constructing effective environmental campaigns for a community. Using the theory of planned behavior (TPB), this study examines factors associated with waste separation behaviors by analyzing responses to questionnaires distributed in Guangzhou, China. Data drawn from 208 of 1,000 field questionnaires were used to assess socio-demographic factors and the TPB constructs (i.e., attitudes, subjective norms, perc...
Zou, Yonghong; Wang, Lixia; Christensen, Erik R
2015-10-01
This work explains the challenges of fingerprint-based source apportionment of polycyclic aromatic hydrocarbons (PAH) in the aquatic environment and illustrates a practical and robust solution. PAH data detected in sediment cores from the Illinois River provide the basis of this study. Principal component analysis (PCA) separates PAH compounds into two groups reflecting their possible airborne transport patterns, but it is not able to suggest specific sources. Not all positive matrix factorization (PMF) determined sources are distinguishable, owing to the variability of source fingerprints; however, they provide useful candidate inputs for a Bayesian chemical mass balance (CMB) analysis. The Bayesian CMB analysis takes into account the measurement errors as well as the variations of source fingerprints, and provides a credible source apportionment. Major PAH sources for Illinois River sediments are traffic (35%), coke oven (24%), coal combustion (18%), and wood combustion (14%). PMID:26208321
Bayesian parameter estimation for effective field theories
Wesolowski, S; Furnstahl, R J; Phillips, D R; Thapaliya, A
2015-01-01
We present procedures based on Bayesian statistics for effective field theory (EFT) parameter estimation from data. The extraction of low-energy constants (LECs) is guided by theoretical expectations that supplement such information in a quantifiable way through the specification of Bayesian priors. A prior for natural-sized LECs reduces the possibility of overfitting, and leads to a consistent accounting of different sources of uncertainty. A set of diagnostic tools is developed that analyzes the fit and ensures that the priors do not bias the EFT parameter estimation. The procedures are illustrated using representative model problems and the extraction of LECs for the nucleon mass expansion in SU(2) chiral perturbation theory from synthetic lattice data.
Applications of Bayesian spectrum representation in acoustics
Botts, Jonathan M.
framework. The application to reflection data is useful for representing frequency-dependent impedance boundaries in finite difference acoustic simulations. Furthermore, since the filter transfer function is a parametric model, it can be modified to incorporate arbitrary frequency weighting and account for the band-limited nature of measured reflection spectra. Finally, the model is modified to compensate for dispersive error in the finite difference simulation as part of the filter design process. Stemming from the filter boundary problem, the implementation of pressure sources in finite difference simulation is addressed in order to ensure that schemes properly converge. A class of parameterized source functions is proposed and shown to offer straightforward control of residual error in the simulation. Guided by the notion that the solution to be approximated affects the approximation error, sources are designed which reduce residual dispersive error to the size of round-off errors. The early part of a room impulse response can be characterized by a series of isolated plane waves. Measured with an array of microphones, plane waves map to a directional response of the array or spatial intensity map. Probabilistic inversion of this response results in estimates of the number and directions of image source arrivals. The model-based inversion is shown to avoid ambiguities associated with peak-finding or inspection of the spatial intensity map. For this problem, determining the number of arrivals in a given frame is critical for properly inferring the state of the sound field. This analysis is effectively compression of the spatial room response, which is useful for analysis or encoding of the spatial sound field. Parametric, model-based formulations of these problems enhance the solution in all cases, and a Bayesian interpretation provides a principled approach to model comparison and parameter estimation.
Bayesian parameter estimation for effective field theories
Wesolowski, S.; Klco, N.; Furnstahl, R. J.; Phillips, D. R.; Thapaliya, A.
2016-07-01
We present procedures based on Bayesian statistics for estimating, from data, the parameters of effective field theories (EFTs). The extraction of low-energy constants (LECs) is guided by theoretical expectations in a quantifiable way through the specification of Bayesian priors. A prior for natural-sized LECs reduces the possibility of overfitting, and leads to a consistent accounting of different sources of uncertainty. A set of diagnostic tools is developed that analyzes the fit and ensures that the priors do not bias the EFT parameter estimation. The procedures are illustrated using representative model problems, including the extraction of LECs for the nucleon-mass expansion in SU(2) chiral perturbation theory from synthetic lattice data.
Multisnapshot Sparse Bayesian Learning for DOA
Gerstoft, Peter; Mecklenbrauker, Christoph F.; Xenaki, Angeliki; Nannuru, Santosh
2016-10-01
The directions of arrival (DOA) of plane waves are estimated from multi-snapshot sensor array data using sparse Bayesian learning (SBL). The prior on the source amplitudes is assumed independent zero-mean complex Gaussian, with the unknown variances (i.e., the source powers) as hyperparameters. For a complex Gaussian likelihood with the unknown noise variance as hyperparameter, the corresponding Gaussian posterior distribution is derived. For a given number of DOAs, the hyperparameters are automatically selected by maximizing the evidence, which promotes sparse DOA estimates. The SBL scheme for DOA estimation is discussed and evaluated competitively against LASSO ($\ell_1$-regularization), conventional beamforming, and MUSIC.
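As a rough illustration of the evidence-maximization idea in this abstract, the sketch below applies a commonly used SBL fixed-point update for per-angle source powers on a DOA grid. The array geometry, SNR, iteration count, and the exact update rule are my assumptions for a minimal demo, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

N, L = 10, 100                      # sensors (half-wavelength spacing), snapshots
grid = np.arange(-90, 91)           # DOA grid in degrees
A = np.exp(1j * np.pi * np.outer(np.arange(N), np.sin(np.deg2rad(grid))))

true_doas = [-20, 30]
At = A[:, [np.flatnonzero(grid == d)[0] for d in true_doas]]
S = (rng.standard_normal((2, L)) + 1j * rng.standard_normal((2, L))) / np.sqrt(2)
sigma2 = 0.01
noise = np.sqrt(sigma2 / 2) * (rng.standard_normal((N, L)) + 1j * rng.standard_normal((N, L)))
X = At @ S + noise
R = X @ X.conj().T / L              # sample covariance

# SBL: maximize the evidence over per-angle powers gamma via fixed-point updates
gamma = np.real(np.einsum('im,im->m', A.conj(), R @ A)) / N**2  # beamformer init
for _ in range(200):
    Sigma = sigma2 * np.eye(N) + (A * gamma) @ A.conj().T       # modeled data covariance
    B = np.linalg.solve(Sigma, A)                               # Sigma^{-1} A
    num = np.real(np.einsum('im,im->m', B.conj(), R @ B))       # a^H S^-1 R S^-1 a
    den = np.real(np.einsum('im,im->m', A.conj(), B))           # a^H S^-1 a
    gamma *= num / den                                          # sparsifying update

# the two largest, well-separated peaks of gamma are the DOA estimates
i1 = int(np.argmax(gamma))
masked = gamma.copy()
masked[max(0, i1 - 5):i1 + 6] = 0
i2 = int(np.argmax(masked))
est = sorted([int(grid[i1]), int(grid[i2])])
```

With two well-separated sources at moderate SNR, the power spectrum `gamma` concentrates near the true directions, illustrating how evidence maximization promotes sparse estimates.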
Congdon, Peter
2014-01-01
This book provides an accessible approach to Bayesian computing and data analysis, with an emphasis on the interpretation of real data sets. Following in the tradition of the successful first edition, this book aims to make a wide range of statistical modeling applications accessible using tested code that can be readily adapted to the reader's own applications. The second edition has been thoroughly reworked and updated to take account of advances in the field. A new set of worked examples is included. The novel aspect of the first edition was the coverage of statistical modeling using WinBU
Computationally efficient Bayesian tracking
Aughenbaugh, Jason; La Cour, Brian
2012-06-01
In this paper, we describe the progress we have achieved in developing a computationally efficient, grid-based Bayesian fusion tracking system. In our approach, the probability surface is represented by a collection of multidimensional polynomials, each computed adaptively on a grid of cells representing state space. Time evolution is performed using a hybrid particle/grid approach and knowledge of the grid structure, while sensor updates use a measurement-based sampling method with a Delaunay triangulation. We present an application of this system to the problem of tracking a submarine target using a field of active and passive sonar buoys.
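The grid-based Bayesian recursion this abstract describes (predict with a motion kernel, update with a measurement likelihood) can be illustrated in one dimension; the adaptive polynomial representation and sonar specifics of the paper are beyond this sketch, and all numbers here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

cells = np.linspace(-10.0, 10.0, 401)      # 1-D state grid
q, r = 0.3, 0.5                            # process / measurement noise std

def gaussian(z, mu, s):
    return np.exp(-0.5 * ((z - mu) / s) ** 2)

belief = np.ones_like(cells) / cells.size  # flat prior over the grid
truth = -4.0
for _ in range(30):
    truth += q * rng.standard_normal()             # target random walk
    meas = truth + r * rng.standard_normal()       # noisy observation
    # predict: convolve the belief with the (symmetric, centered) motion kernel
    kernel = gaussian(cells, 0.0, q)
    kernel /= kernel.sum()
    belief = np.convolve(belief, kernel, mode='same')
    # update: multiply by the likelihood and renormalize (Bayes' rule on the grid)
    belief *= gaussian(cells, meas, r)
    belief /= belief.sum()

estimate = float(np.sum(cells * belief))           # posterior mean
```

The posterior mean tracks the target to within the filter's steady-state uncertainty; the paper's contribution is making this kind of recursion efficient in higher dimensions.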
Bayesian nonparametric data analysis
Müller, Peter; Jara, Alejandro; Hanson, Tim
2015-01-01
This book reviews nonparametric Bayesian methods and models that have proven useful in the context of data analysis. Rather than providing an encyclopedic review of probability models, the book’s structure follows a data analysis perspective. As such, the chapters are organized by traditional data analysis problems. In selecting specific nonparametric models, simpler and more traditional models are favored over specialized ones. The discussed methods are illustrated with a wealth of examples, including applications ranging from stylized examples to case studies from recent literature. The book also includes an extensive discussion of computational methods and details on their implementation. R code for many examples is included in on-line software pages.
Bayesian Geostatistical Design
DEFF Research Database (Denmark)
Diggle, Peter; Lophaven, Søren Nymand
2006-01-01
This paper describes the use of model-based geostatistics for choosing the set of sampling locations, collectively called the design, to be used in a geostatistical analysis. Two types of design situation are considered. These are retrospective design, which concerns the addition of sampling locations to, or deletion of locations from, an existing design, and prospective design, which consists of choosing positions for a new set of sampling locations. We propose a Bayesian design criterion which focuses on the goal of efficient spatial prediction whilst allowing for the fact that model...
Inference in hybrid Bayesian networks
DEFF Research Database (Denmark)
Langseth, Helge; Nielsen, Thomas Dyhre; Rumí, Rafael
2009-01-01
Since the 1980s, Bayesian Networks (BNs) have become increasingly popular for building statistical models of complex systems. This is particularly true for boolean systems, where BNs often prove to be a more efficient modelling framework than traditional reliability techniques (like fault trees). The paper reviews the past decade's research on inference in hybrid Bayesian networks. The discussions are linked to an example model for estimating human reliability.
Bayesian Inference on Gravitational Waves
Directory of Open Access Journals (Sweden)
Asad Ali
2015-12-01
The Bayesian approach is becoming increasingly popular among the astrophysics data analysis communities. However, the Pakistan statistics communities are unaware of this fertile interaction between the two disciplines. Bayesian methods have been in use to address astronomical problems since the birth of Bayes' probability in the eighteenth century. Today the Bayesian methods for the detection and parameter estimation of gravitational waves have solid theoretical grounds and strong promise for realistic applications. This article aims to introduce the Pakistan statistics communities to the applications of Bayesian Monte Carlo methods in the analysis of gravitational wave data, with an overview of the Bayesian signal detection and estimation methods and a demonstration with a couple of simplified examples.
Fedosseev, V; Marsh, B A; CERN. Geneva. AB Department
2006-01-01
At the ISOLDE on-line isotope separation facility, the resonance ionization laser ion source (RILIS) can be used to ionize reaction products as they effuse from the target. The RILIS process of laser step-wise resonance ionization of atoms in a hot metal cavity provides a highly element selective stage in the preparation of the radioactive ion beam. As a result, the ISOLDE mass separators can provide beams of a chosen isotope with greatly reduced isobaric contamination. The number of elements available at RILIS has been extended to 26, with the addition of a new three-step ionization scheme for gold. The optimal ionization scheme was determined during an extensive study of the atomic energy levels and auto-ionizing states of gold, carried out by means of in-source resonance ionization spectroscopy. Details of the ionization scheme and a summary of the spectroscopy study are presented.
Implementing Bayesian Vector Autoregressions
Directory of Open Access Journals (Sweden)
Richard M. Todd
1988-03-01
This paper discusses how the Bayesian approach can be used to construct a type of multivariate forecasting model known as a Bayesian vector autoregression (BVAR). In doing so, we mainly explain Doan, Litterman, and Sims' (1984) propositions on how to estimate a BVAR based on a certain family of prior probability distributions, indexed by a fairly small set of hyperparameters. There is also a discussion on how to specify a BVAR and set up a BVAR database. A 4-variable model is used to illustrate the BVAR approach.
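The shrinkage idea behind BVAR estimation can be sketched numerically. This is a deliberate simplification: the Minnesota prior of Doan, Litterman, and Sims (per-lag scaling, multiple hyperparameters) is collapsed here to a single ridge-type hyperparameter `lam` and a random-walk prior mean, and all dimensions are my assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

# simulate a stable 2-variable VAR(1): y_t = B y_{t-1} + e_t
B_true = np.array([[0.5, 0.1],
                   [-0.2, 0.7]])
T = 400
Y = np.zeros((T, 2))
for t in range(1, T):
    Y[t] = B_true @ Y[t - 1] + 0.1 * rng.standard_normal(2)

X, Z = Y[:-1], Y[1:]                  # lagged regressors / targets

# Gaussian prior on coefficients, B ~ N(B0, (1/lam) I), gives a
# ridge-type posterior mean, computed equation by equation
lam = 1.0
B0 = np.eye(2)                        # random-walk prior mean (Minnesota flavor)
B_post = np.linalg.solve(X.T @ X + lam * np.eye(2),
                         X.T @ Z + lam * B0.T).T
```

With a long sample the data dominate the prior and the posterior mean lands close to the true coefficients; with short samples the same formula shrinks estimates toward the random-walk prior, which is the practical point of the BVAR approach.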
The Application of Compressed Sensing in Blind Source Separation
Institute of Scientific and Technical Information of China (English)
王涛文
2012-01-01
Compressed sensing is a signal sampling theory developed in recent years that overcomes the high sampling-rate requirement of traditional Nyquist sampling. This paper presents the basic principles of compressed sensing and introduces its three fundamental questions: the sparse representation of signals, the incoherence between the sparse basis and the measurement matrix, and the reconstruction of signals. It then analyzes the connection between compressed sensing and blind source separation, offering a new way to solve the blind source separation problem. Finally, an experiment demonstrates its application to blind source separation.
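The reconstruction question named in this abstract can be illustrated with orthogonal matching pursuit (OMP), one standard sparse-recovery algorithm; the paper does not specify which reconstruction method it uses, and the dimensions below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

n, m, k = 128, 40, 4                      # signal length, measurements, sparsity
Phi = rng.standard_normal((m, n))
Phi /= np.linalg.norm(Phi, axis=0)        # unit-norm measurement columns
support = rng.choice(n, k, replace=False)
x = np.zeros(n)
vals = rng.standard_normal(k)
x[support] = vals + np.sign(vals)         # nonzero entries pushed away from zero
y = Phi @ x                               # compressed (noiseless) measurements

# OMP: greedily pick the column most correlated with the residual,
# then re-fit by least squares on the chosen support
residual, chosen = y.copy(), []
for _ in range(k):
    chosen.append(int(np.argmax(np.abs(Phi.T @ residual))))
    coef, *_ = np.linalg.lstsq(Phi[:, chosen], y, rcond=None)
    residual = y - Phi[:, chosen] @ coef

x_hat = np.zeros(n)
x_hat[chosen] = coef
```

In the noiseless, incoherent setting the k-sparse signal is recovered exactly from far fewer measurements than its length, which is the core claim of compressed sensing.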
Ravat, D.; Kirkham, K.; Hildenbrand, T.G.
2002-01-01
An overview is given on the benefits of applying the Euler method on derivatives of anomalies to enhance the location of shallow and deep sources. Used properly, the method is suitable for characterizing sources from all potential-field data and/or their derivative, as long as the data can be regarded mathematically as "continuous". Furthermore, the reasons why the use of the Euler method on derivatives of anomalies is particularly helpful in the analysis and interpretation of shallow features are explained.
Bloomfield, John; Ward, Rob; Garcia-Bajo, Marieta; Hart, Alwyn
2014-05-01
A number of potential pathways can be identified for the migration of methane and contaminants associated with the shale gas extraction process to aquifers. These include the possible movement of contaminants from shale gas reservoirs that have been hydraulically fractured to overlying aquifers. The risk of contamination of an overlying aquifer is a function of i.) the separation of the potential shale gas source rock and the aquifer, ii.) the hydraulic characteristics (e.g. hydraulic conductivity, storage and hydrogeochemistry) of the rocks in the intervening interval, and iii.) regional and local physico-chemical gradients. Here we report on a national-scale study from the UK to assess the former, i.e. the vertical separation between potential shale gas source rocks and major aquifers, as a contribution to more informed management of the risks associated with shale gas development if and when it takes place in the UK. Eleven aquifers are considered in the study. These are aquifers that have been designated by the environment agencies of England (Environment Agency) and Wales (Natural Resources Wales) under the EU Water Framework Directive as being nationally important (Principal Aquifers). The shale gas source rocks have been defined on the best publicly available evidence for potential gas productivity and include both shales and clay formations. Based on a national geological fence diagram consisting of ~80 geological sections, totalling ~12,000km in length, down to >5km in depth, and with a typical spacing of 30km, the lower surfaces of each aquifer unit and upper surfaces of each shale/clay unit have been estimated at a spatial resolution of 3x3km. These surfaces have then been used to estimate vertical separations between pairs of shale/clay and aquifer units. The modelling process will be described and the aquifer, shale and separation maps presented and discussed. The aquifers are defined by geological units and since these geological units may be found at
Directory of Open Access Journals (Sweden)
Chengjie Li
2016-01-01
In a passive radar system, extracting the mixed weak object signal in the presence of a super-power signal (jamming) is still a challenging task. In this paper, a novel framework based on the passive radar system is designed for weak object signal separation. Firstly, we propose an interference cancellation algorithm (IC-algorithm) to extract the mixed weak object signals from the strong jamming. Then, an improved FastICA algorithm with k-means clustering is designed to separate each weak signal from the mixed weak object signals. At last, we discuss the performance of the proposed method and verify it with several simulations. The experimental results demonstrate the effectiveness of the proposed method.
Dynamic Bayesian diffusion estimation
Dedecius, K
2012-01-01
The rapidly increasing complexity of (mainly wireless) ad-hoc networks stresses the need for reliable distributed estimation of several variables of interest. The widely used centralized approach, in which the network nodes communicate their data with a single specialized point, suffers from high communication overheads and represents a potentially dangerous concept with a single point of failure needing special treatment. This paper aims to contribute to another quite recent method called diffusion estimation. By decentralizing the operating environment, the network nodes communicate only within a close neighbourhood. We adopt the Bayesian framework for modelling and estimation, which, unlike the traditional approaches, abstracts from a particular model case. This leads to a very scalable and universal method, applicable to a wide class of different models. A particularly interesting case, the Gaussian regressive model, is derived as an example.
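A minimal sketch of Bayesian diffusion estimation in the spirit of this abstract: each node performs a conjugate Gaussian update on a common unknown mean from its own observations, then diffuses its posterior by averaging natural parameters with its neighbours. The ring topology, known noise variance, and the averaging combination rule are my assumptions, not the paper's exact scheme.

```python
import numpy as np

rng = np.random.default_rng(4)

theta = 2.5                               # unknown parameter all nodes estimate
sigma2 = 1.0                              # known observation noise variance
nodes = 6                                 # ring network; neighbourhood = self + 2 neighbours
neigh = {i: [(i - 1) % nodes, i, (i + 1) % nodes] for i in range(nodes)}

# Gaussian posterior per node in natural parameters (precision, precision*mean)
prec = np.full(nodes, 1e-3)               # vague prior
pm = np.zeros(nodes)

for _ in range(50):
    # local Bayesian update with a fresh observation at every node
    y = theta + np.sqrt(sigma2) * rng.standard_normal(nodes)
    prec += 1.0 / sigma2
    pm += y / sigma2
    # diffusion step: average natural parameters over each neighbourhood
    prec = np.array([prec[neigh[i]].mean() for i in range(nodes)])
    pm = np.array([pm[neigh[i]].mean() for i in range(nodes)])

means = pm / prec                          # per-node posterior means
```

After a few dozen rounds every node's posterior mean sits close to the true parameter, with no node ever communicating beyond its immediate neighbours.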
A New Method of Blind Source Separation Using Single-Channel ICA Based on Higher-Order Statistics
Guangkuo Lu; Manlin Xiao; Ping Wei; Huaguo Zhang
2015-01-01
Methods based on independent component analysis (ICA) give little guidance about practical considerations for separating single-channel real-world data, much of which is nonlinear, nonstationary, and even chaotic in many fields. To solve this problem, a three-step method is provided in this paper. In the first step, the measured signal, which is assumed to be a piecewise higher-order stationary time series, is introduced and divided into a series of higher-order stationary segments b...
Book review: Bayesian analysis for population ecology
Link, William A.
2011-01-01
Brian Dennis described the field of ecology as “fertile, uncolonized ground for Bayesian ideas.” He continued: “The Bayesian propagule has arrived at the shore. Ecologists need to think long and hard about the consequences of a Bayesian ecology. The Bayesian outlook is a successful competitor, but is it a weed? I think so.” (Dennis 2004)
MCMC joint separation and segmentation of hidden Markov fields
Snoussi, H; Snoussi, Hichem; Mohammad-Djafari, Ali
2002-01-01
In this contribution, we consider the problem of the blind separation of noisy instantaneously mixed images. The images are modeled by hidden Markov fields with unknown parameters. Given the observed images, we give a Bayesian formulation and we propose to solve the resulting data augmentation problem by implementing a Markov chain Monte Carlo (MCMC) procedure. We separate the unknown variables into two categories: 1. The parameters of interest, which are the mixing matrix, the noise covariance and the parameters of the source distributions. 2. The hidden variables, which are the unobserved sources and the unobserved pixel classification labels. The proposed algorithm provides, in the stationary regime, samples drawn from the posterior distributions of all the variables involved in the problem, leading to flexibility in the choice of cost function. We discuss and characterize some problems of non-identifiability and degeneracies of the parameter likelihood and the behavior of the MCMC algorithm in this case. F...
Current trends in Bayesian methodology with applications
Upadhyay, Satyanshu K; Dey, Dipak K; Loganathan, Appaia
2015-01-01
Collecting Bayesian material scattered throughout the literature, Current Trends in Bayesian Methodology with Applications examines the latest methodological and applied aspects of Bayesian statistics. The book covers biostatistics, econometrics, reliability and risk analysis, spatial statistics, image analysis, shape analysis, Bayesian computation, clustering, uncertainty assessment, high-energy astrophysics, neural networking, fuzzy information, objective Bayesian methodologies, empirical Bayes methods, small area estimation, and many more topics.Each chapter is self-contained and focuses on
Energy Technology Data Exchange (ETDEWEB)
Baba, Justin S [ORNL]; Akl, Tony [Texas A&M University]; Cote, Gerard L. [Texas A&M University]; Wilson, Mark A. [University of Pittsburgh School of Medicine, Pittsburgh PA]; Ericson, Milton Nance [ORNL]
2011-01-01
An implanted system is being developed to monitor transplanted liver health during the critical 7-10 day period posttransplantation. The unit will monitor organ perfusion and oxygen consumption using optically based probes placed on both the inflow and outflow blood vessels, and on the liver parenchymal surface. Sensing probes are based on a 3-wavelength LED source and a photodiode detector. Sample diffuse reflectance is measured at 735, 805, and 940 nm. To ascertain optimal source-to-photodetector spacing for perfusion measurement in blood vessels, an ex vivo study was conducted. In this work, a dye mixture simulating 80% blood oxygen saturation was developed and perfused through excised porcine arteries while collecting data for various preset probe source-to-photodetector spacings. The results from this study demonstrate a decrease in the optical signal with decreasing LED drive current and a reduction in perfusion index signal with increasing probe spacing. They also reveal a 2- to 4-mm optimal range for blood vessel perfusion probe source-to-photodetector spacing that allows for sufficient perfusion signal modulation depth with maximized signal to noise ratio (SNR). These findings are currently being applied to guide electronic configuration and probe placement for in vivo liver perfusion porcine model studies.
Bayesian networks as a tool for epidemiological systems analysis
Lewis, F. I.
2012-11-01
Bayesian network analysis is a form of probabilistic modeling which derives from empirical data a directed acyclic graph (DAG) describing the dependency structure between random variables. Bayesian networks are increasingly finding application in areas such as computational and systems biology, and more recently in epidemiological analyses. The key distinction between standard empirical modeling approaches, such as generalised linear modeling, and Bayesian network analyses is that the latter attempts not only to identify statistically associated variables, but to additionally, and empirically, separate these into those directly and indirectly dependent with one or more outcome variables. Such discrimination is vastly more ambitious but has the potential to reveal far more about key features of complex disease systems. Applying Bayesian network modeling to biological and medical data has considerable computational demands, combined with the need to ensure robust model selection given the vast model space of possible DAGs. These challenges require the use of approximation techniques, such as the Laplace approximation, Markov chain Monte Carlo simulation and parametric bootstrapping, along with computational parallelization. A case study in structure discovery - identification of an optimal DAG for given data - is presented which uses additive Bayesian networks to explore veterinary disease data of industrial and medical relevance.
Irregular-Time Bayesian Networks
Ramati, Michael
2012-01-01
In many fields observations are performed irregularly along time, due to either measurement limitations or lack of a constant immanent rate. While discrete-time Markov models (as Dynamic Bayesian Networks) introduce either inefficient computation or an information loss to reasoning about such processes, continuous-time Markov models assume either a discrete state space (as Continuous-Time Bayesian Networks), or a flat continuous state space (as stochastic differential equations). To address these problems, we present a new modeling class called Irregular-Time Bayesian Networks (ITBNs), generalizing Dynamic Bayesian Networks, allowing substantially more compact representations, and increasing the expressivity of the temporal dynamics. In addition, a globally optimal solution is guaranteed when learning temporal systems, provided that they are fully observed at the same irregularly spaced time-points, and a semiparametric subclass of ITBNs is introduced to allow further adaptation to the irregular nature of t...
Separation and quantification of frequency coupled noise sources of submarine cabin
Institute of Scientific and Technical Information of China (English)
李思纯; 宫元彬; 时胜国; 于树华; 韩闯
2016-01-01
Traditional methods do not effectively separate and quantify coupled vibration noise sources in submarines. A multivariate statistical analysis method, partial least squares regression (PLS), is therefore presented to separate and quantify frequency-coupled noise sources. PLS simultaneously extracts principal input/output components that carry maximum information and maximum input-output correlation, and supports regression modeling when multiple correlations exist among the variables. Simulation and cabin-model experiments show that, when frequency coupling exists between multiple excitation sources, PLS can rank the energy contributions of internal noise sources to the submarine hull, of the hull to the underwater acoustic field, and of the noise sources to the underwater acoustic field. The feasibility of PLS for separating and quantifying frequency-coupled sources is thus demonstrated, providing a basis for the control of the main noise sources.
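The PLS component extraction described above can be sketched with the NIPALS algorithm for a single response (PLS1) on deliberately collinear inputs, the setting where ordinary regression struggles; the data, dimensions, and number of components below are synthetic assumptions, not the paper's cabin-model data.

```python
import numpy as np

rng = np.random.default_rng(5)

# collinear inputs: 5 variables driven by only 2 latent factors
n, p, ncomp = 60, 5, 2
base = rng.standard_normal((n, 2))
X = base @ rng.standard_normal((2, p)) + 0.01 * rng.standard_normal((n, p))
beta = rng.standard_normal(p)
y = X @ beta + 0.01 * rng.standard_normal(n)

Xc, yc = X - X.mean(0), y - y.mean()
Xd, yd = Xc.copy(), yc.copy()
W, P, Q = [], [], []
for _ in range(ncomp):                  # NIPALS for PLS1
    w = Xd.T @ yd
    w /= np.linalg.norm(w)              # weight: direction of max covariance with y
    t = Xd @ w                          # score
    p_load = Xd.T @ t / (t @ t)         # X loading
    q = (yd @ t) / (t @ t)              # y loading
    Xd -= np.outer(t, p_load)           # deflate before the next component
    yd -= q * t
    W.append(w); P.append(p_load); Q.append(q)

W, P, Q = np.array(W).T, np.array(P).T, np.array(Q)
B = W @ np.linalg.solve(P.T @ W, Q)     # coefficients in the original X space
y_hat = Xc @ B + y.mean()
```

Two PLS components recover essentially all of the response variance here despite the severe multicollinearity, which is the property the paper exploits for ranking source contributions.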
Energy Technology Data Exchange (ETDEWEB)
Chen, Yi-Ren; Chou, Li-Chang; Yang, Ying-Jay [Graduate Institute of Electronics Engineering, National Taiwan University, Taipei 10617, Taiwan (China); Lin, Hao-Hsiung, E-mail: hhlin@ntu.edu.tw [Graduate Institute of Electronics Engineering, National Taiwan University, Taipei 10617, Taiwan (China); Department of Electrical Engineering and Graduate Institute of Photonics and Optoelectronics, National Taiwan University, Taipei 10617, Taiwan (China)
2012-04-30
This work describes a regular solution model that considers the free energy of the surface monolayer to explain the orientation-dependent phase separation in GaAsSb. In the proposed model, only the interaction between the second-nearest-neighboring atoms sitting on the same monolayer contributes to the interaction parameter. Consequently, the parameter reduces to Ω/2 and Ω/3 for (111)B GaAsSb and (100) GaAsSb, respectively, where Ω denotes the parameter of bulk GaAsSb. By including the strain effect, the proposed model thoroughly elucidates the immiscibility behavior of (111)B GaAsSb and (100) GaAsSb. Highlights: (111)B GaAsSb exhibits more severe phase separation than (100) GaAsSb. We propose a model to calculate the monolayer free energy of different planes. The monolayer model suggests that (111)B GaAsSb has a larger interaction parameter. The monolayer model including strain well explains the immiscibility of GaAsSb.
Neuronanatomy, neurology and Bayesian networks
Bielza Lozoya, Maria Concepcion
2014-01-01
Bayesian networks are data mining models with clear semantics and a sound theoretical foundation. In this keynote talk we will pinpoint a number of neuroscience problems that can be addressed using Bayesian networks. In neuroanatomy, we will show computer simulation models of dendritic trees and classification of neuron types, both based on morphological features. In neurology, we will present the search for genetic biomarkers in Alzheimer's disease and the prediction of health-related qualit...
Vesselinov, V. V.; Alexandrov, B.
2014-12-01
The identification of the physical sources causing spatial and temporal fluctuations of state variables such as river stage levels and aquifer hydraulic heads is challenging. The fluctuations can be caused by variations in natural and anthropogenic sources such as precipitation events, infiltration, groundwater pumping, barometric pressures, etc. The source identification and separation can be crucial for conceptualization of the hydrological conditions and characterization of system properties. If the original signals that cause the observed state-variable transients can be successfully "unmixed", decoupled physics models may then be applied to analyze the propagation of each signal independently. We propose a new model-free inverse analysis of transient data based on the Non-negative Matrix Factorization (NMF) method for Blind Source Separation (BSS), coupled with a k-means clustering algorithm, which we call NMFk. NMFk is capable of identifying a set of unique sources from a set of experimentally measured mixed signals, without any information about the sources, their transients, and the physical mechanisms and properties controlling the signal propagation through the system. A classical BSS conundrum is the so-called "cocktail-party" problem, where several microphones are recording the sounds in a ballroom (music, conversations, noise, etc.). Each of the microphones is recording a mixture of the sounds. The goal of BSS is to "unmix" and reconstruct the original sounds from the microphone records. Similar to the "cocktail-party" problem, our model-free analysis only requires information about state-variable transients at a number of observation points, m, where m > r, and r is the number of unknown unique sources causing the observed fluctuations. We apply the analysis to a dataset from the Los Alamos National Laboratory (LANL) site. We identify barometric-pressure and water-supply pumping effects as the sources and estimate their impacts. We also estimate the
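The factorization engine behind the NMFk approach described above can be sketched with the Lee-Seung multiplicative updates for non-negative matrix factorization; the clustering-over-restarts machinery that selects the number of sources is omitted, and the two synthetic source transients below are my assumptions.

```python
import numpy as np

rng = np.random.default_rng(6)

# two nonnegative source transients mixed at five observation points
t = np.linspace(0, 1, 200)
sources = np.vstack([np.exp(-((t - 0.3) / 0.05) ** 2),        # pulse-like source
                     0.5 * (1 + np.sin(2 * np.pi * 3 * t))])  # periodic source
mixing = rng.random((5, 2))                                   # nonnegative mixing
V = mixing @ sources                                          # observed mixtures

# rank-2 NMF via Lee-Seung multiplicative updates: V ~ W H with W, H >= 0
r, eps = 2, 1e-9
W = rng.random((5, r))
H = rng.random((r, 200))
for _ in range(2000):
    H *= (W.T @ V) / (W.T @ W @ H + eps)
    W *= (V @ H.T) / (W @ H @ H.T + eps)

rel_err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
```

The factorization recovers a nonnegative mixing matrix and source transients that reconstruct the observed mixtures; in NMFk, many such runs from random initializations are clustered to judge how many sources are robustly identifiable.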
Directory of Open Access Journals (Sweden)
Hiroshi Sawada
2007-01-01
We address the problem of underdetermined blind source separation (BSS). While most previous approaches are designed for instantaneous mixtures, we propose a time-frequency-domain algorithm for convolutive mixtures. We adopt a two-step method based on a general maximum a posteriori (MAP) approach. In the first step, we estimate the mixing matrix based on hierarchical clustering, assuming that the source signals are sufficiently sparse. The algorithm works directly on the complex-valued data in the time-frequency domain and shows better convergence than algorithms based on self-organizing maps. The assumption of Laplacian priors for the source signals in the second step leads to an algorithm for estimating the source signals. It involves the ℓ1-norm minimization of complex numbers because of the use of the time-frequency-domain approach. We compare a combinatorial approach initially designed for real numbers with a second-order cone programming (SOCP) approach designed for complex numbers. We found that although the former approach is not theoretically justified for complex numbers, its results are comparable to, or even better than, the SOCP solution. The advantage is a lower computational cost for problems with low input/output dimensions.
Advanced REACH tool: A Bayesian model for occupational exposure assessment
McNally, K.; Warren, N.; Fransman, W.; Entink, R.K.; Schinkel, J.; Van Tongeren, M.; Cherrie, J.W.; Kromhout, H.; Schneider, T.; Tielemans, E.
2014-01-01
This paper describes a Bayesian model for the assessment of inhalation exposures in an occupational setting; the methodology underpins a freely available web-based application for exposure assessment, the Advanced REACH Tool (ART). The ART is a higher tier exposure tool that combines disparate sourc
Stan: A Probabilistic Programming Language for Bayesian Inference and Optimization
Gelman, Andrew; Lee, Daniel; Guo, Jiqiang
2015-01-01
Stan is a free and open-source C++ program that performs Bayesian inference or optimization for arbitrary user-specified models. It can be called from the command line, R, Python, Matlab, or Julia, and has great promise for fitting large and complex statistical models in many areas of application. We discuss Stan from users' and developers'…
International Nuclear Information System (INIS)
Catherall, R.; Fedosseev, V. N.; Köster, U.; Lettry, J.; Suberlucq, G.; Marsh, B. A.; Tengborn, E.
2004-05-01
The resonance ionization laser ion source (RILIS) of the ISOLDE on-line isotope separation facility is based on the method of laser stepwise resonance ionization of atoms in a hot metal cavity. The atomic selectivity of the RILIS complements the mass selection process of the ISOLDE separator magnets to provide beams of a chosen isotope with greatly reduced isobaric contamination. Using a system of dye lasers pumped by copper vapor lasers, ion beams of 22 elements have been generated at ISOLDE with ionization efficiencies in the range of 0.5%-30%. As part of the ongoing RILIS development, recent off-line resonance ionization spectroscopy studies have determined the optimal three-step ionization schemes for yttrium, scandium, and antimony.
Energy Technology Data Exchange (ETDEWEB)
Naresh Shah; Frank E. Huggins; Gerald P. Huffman
2006-07-31
Coal combustion is generally viewed as a major source of PM2.5 emissions into the atmosphere. For some time, toxicologists have been asking for an exposure environment enriched with coal-combustion-specific PM2.5 in order to conduct meaningful exposure studies and better understand the mechanisms of the adverse health effects of coal-combustion-specific PM2.5 in the ambient environment. There are several unique characteristics of primary PM generated from coal combustion. In this research project, an attempt has been made to exploit some of the unique properties of PM generated from coal-fired power plants to preferentially separate it from the rest of the primary and secondary PM in the ambient environment. An existing FRM sampler used for monitoring the amount of PM2.5 in the ambient air was modified to incorporate an electrostatic field. A DC corona charging device was also installed at the ambient air inlet to impart positive or negative charge to the PM. Visual Basic software was written to simulate the lateral movement of PM as it passes through the electrostatic separator under varying operating conditions. The PM samples collected on polycarbonate filters under varying operating conditions were extensively examined for clustering and/or separation of PM in the direction parallel to the electric field. No systematic PM separation was observed under any of the operating conditions. A solution to overcome this kind of turbulence-caused remixing has been offered. However, due to major programmatic changes in the DOE UCR program, there are no venues available to further pursue this research.
Dale Poirier
2008-01-01
This paper provides Bayesian rationalizations for White's heteroskedasticity-consistent (HC) covariance estimator and various modifications of it. An informed Bayesian bootstrap provides the statistical framework.
Dynamic Batch Bayesian Optimization
Azimi, Javad; Fern, Xiaoli
2011-01-01
Bayesian optimization (BO) algorithms try to optimize an unknown function that is expensive to evaluate using a minimum number of evaluations/experiments. Most proposed BO algorithms are sequential, where only one experiment is selected at each iteration. This can be time-inefficient when each experiment takes a long time and more than one experiment can be run concurrently. On the other hand, requesting a fixed-size batch of experiments at each iteration causes performance inefficiency in BO compared to sequential policies. In this paper, we present an algorithm that requests a batch of experiments at each time step t, where the batch size p_t is dynamically determined at each step. Our algorithm is based on the observation that the experiments selected by the sequential policy can sometimes be almost independent of each other. Our algorithm identifies such scenarios and requests those experiments at the same time without degrading the performance. We evaluate our proposed method us...
Nonparametric Bayesian Classification
Coram, M A
2002-01-01
A Bayesian approach to the classification problem is proposed in which random partitions play a central role. It is argued that the partitioning approach has the capacity to take advantage of a variety of large-scale spatial structures, if they are present in the unknown regression function $f_0$. An idealized one-dimensional problem is considered in detail. The proposed nonparametric prior uses random split points to partition the unit interval into a random number of pieces. This prior is found to provide a consistent estimate of the regression function in the $L^p$ topology, for any $1 \leq p < \infty$, and for arbitrary measurable $f_0:[0,1] \rightarrow [0,1]$. A Markov chain Monte Carlo (MCMC) implementation is outlined and analyzed. Simulation experiments are conducted to show that the proposed estimate compares favorably with a variety of conventional estimators. A striking resemblance between the posterior mean estimate and the bagged CART estimate is noted and discussed. For higher dimensions, a ...
Uncertainty Modeling Based on Bayesian Network in Ontology Mapping
Institute of Scientific and Technical Information of China (English)
LI Yuhua; LIU Tao; SUN Xiaolin
2006-01-01
How to deal with uncertainty is crucial in exact concept mapping between ontologies. This paper presents a new framework for modeling uncertainty in ontologies based on Bayesian networks (BN). In our approach, the Ontology Web Language (OWL) is extended with probabilistic markups for attaching probability information; the source and target ontologies (expressed in the extended OWL) are translated into Bayesian networks; and the mapping between the two ontologies is derived by constructing the conditional probability tables (CPTs) of the BN using an improved algorithm, named I-IPFP, based on the iterative proportional fitting procedure (IPFP). The framework and algorithm are validated by positive results from computer experiments.
Bayesian Inference in the Modern Design of Experiments
DeLoach, Richard
2008-01-01
This paper provides an elementary tutorial overview of Bayesian inference and its potential for application in aerospace experimentation in general and wind tunnel testing in particular. Bayes Theorem is reviewed and examples are provided to illustrate how it can be applied to objectively revise prior knowledge by incorporating insights subsequently obtained from additional observations, resulting in new (posterior) knowledge that combines information from both sources. A logical merger of Bayesian methods and certain aspects of Response Surface Modeling is explored. Specific applications to wind tunnel testing, computational code validation, and instrumentation calibration are discussed.
Comparing Bayesian models for multisensory cue combination without mandatory integration
Beierholm, Ulrik R.; Shams, Ladan; Kording, Konrad P; Ma, Wei Ji
2009-01-01
Bayesian models of multisensory perception traditionally address the problem of estimating an underlying variable that is assumed to be the cause of the two sensory signals. The brain, however, has to solve a more general problem: it also has to establish which signals come from the same source and should be integrated, and which ones do not and should be segregated. In the last couple of years, a few models have been proposed to solve this problem in a Bayesian fashion. One of these ha...
International Nuclear Information System (INIS)
We report on a two-photon interference experiment in a quantum relay configuration using two picosecond regime periodically poled lithium niobate (PPLN) waveguide based sources emitting paired photons at 1550 nm. The results show that the picosecond regime associated with a guided-wave scheme should have important repercussions for quantum relay implementations in real conditions, essential for improving both the working distance and the efficiency of quantum cryptography and networking systems. In contrast to already reported regimes, namely, femtosecond and CW, it allows achieving a 99% net visibility two-photon interference while maintaining a high effective photon pair rate using only standard telecom components and detectors.
International Nuclear Information System (INIS)
An experiment automation system is to be developed for the experimental facility for material science at ITEP, based on a Bernas ion source. The CAMFT program is to be incorporated into the experiment automation software. CAMFT simulates the motion of intense charged-particle bunches in external magnetic fields of arbitrary geometry by means of accurate solution of the particle equations of motion. The program allows bunch intensities up to 10^10 particles per bunch (ppb). Preliminary calculations are performed on the ITEP supercomputer. The results of simulating the beam pre-acceleration and the subsequent turn in the magnetic field are presented for different initial conditions
Energy Technology Data Exchange (ETDEWEB)
Barminova, H. Y., E-mail: barminova@bk.ru; Saratovskyh, M. S. [National Research Nuclear University MEPhI, Kashirskoye sh. 31, Moscow 115409 (Russian Federation)
2016-02-15
An experiment automation system is to be developed for the experimental facility for material science at ITEP, based on a Bernas ion source. The CAMFT program is to be incorporated into the experiment automation software. CAMFT simulates the motion of intense charged-particle bunches in external magnetic fields of arbitrary geometry by means of accurate solution of the particle equations of motion. The program allows bunch intensities up to 10^10 particles per bunch (ppb). Preliminary calculations are performed on the ITEP supercomputer. The results of simulating the beam pre-acceleration and the subsequent turn in the magnetic field are presented for different initial conditions.
Catherall, Richard; Köster, U; Lettry, Jacques; Suberlucq, Guy; Marsh, Bruce A; Tengborn, Elisabeth
2004-01-01
The production of radioactive ion beams with the resonance ionization laser ion source (RILIS) of the ISOLDE on-line isotope separation facility was investigated. The RILIS setup included three dye lasers, and ionization schemes employing three resonant transitions were used. The RILIS efficiency can be reduced by nuclear effects such as hyperfine splitting and isotope shifts. Off-line resonance ionization spectroscopy determined optimal three-step ionization schemes for yttrium, scandium, and antimony. The results show that the best ionization scheme for Y provided a gain factor of 15 with respect to surface ionization. (Edited abstract) 8 Refs.
Directory of Open Access Journals (Sweden)
C. O. Torres-Cortés
2016-08-01
Full Text Available The aim of this work is twofold: to optimize the radiochemical separation of plutonium (Pu) from soil samples, and to measure the Pu concentration. Soil samples were prepared using microwave-assisted acid digestion; then, Pu purification was carried out with AG1X8 resin. Pu isotopes were measured using Inductively Coupled Plasma Sector Field Mass Spectrometry (ICP-SFMS). In order to reduce the interference due to the presence of 238UH+ in the samples, a desolvation system (Apex) was used. The limit of detection (LOD) of Pu was determined. The efficiency of Pu recovery from soil samples varied from 70 to 93%.
Bayesian inference tools for inverse problems
Mohammad-Djafari, Ali
2013-08-01
In this paper, the basics of Bayesian inference with a parametric model of the data are first presented. The extensions needed when dealing with inverse problems are then given, in particular for linear models such as deconvolution or image reconstruction in Computed Tomography (CT). The main point discussed is the prior modeling of signals and images. A classification of these priors is presented: first separable and Markovian models, then simple or hierarchical models with hidden variables. For practical applications, we also need to consider the estimation of the hyperparameters. Finally, we see that we have to infer simultaneously the unknowns, the hidden variables, and the hyperparameters. Very often, the expression of this joint posterior law is too complex to be handled directly; indeed, we can rarely obtain analytical solutions for point estimators such as the Maximum A Posteriori (MAP) or Posterior Mean (PM). Three main tools can then be used: the Laplace approximation (LAP), Markov Chain Monte Carlo (MCMC), and Bayesian Variational Approximations (BVA). To illustrate these aspects, we consider a deconvolution problem where we know that the input signal is sparse and propose a Student-t prior for it. To handle the Bayesian computations with this model, we use the property that the Student-t can be expressed as an infinite mixture of Gaussians, thus introducing hidden variables, which are the variances. The expression of the joint posterior of the input signal samples, the hidden variables (here the inverse variances of those samples), and the hyperparameters of the problem (for example, the variance of the noise) is then given. From this point, we present the joint maximization by alternate optimization and the three possible approximation methods. Finally, the proposed methodology is applied in different applications such as mass spectrometry, spectrum estimation of quasi-periodic biological signals and
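The Student-t-as-infinite-mixture-of-Gaussians property used above is easy to check numerically: drawing a Gamma-distributed precision and then a Gaussian conditional on it yields Student-t samples. A quick pure-Python check (degrees of freedom chosen arbitrarily):

```python
import math
import random
import statistics

random.seed(42)
nu = 6.0          # degrees of freedom; Var of a Student-t is nu/(nu-2) = 1.5
n = 100_000

samples = []
for _ in range(n):
    # Hidden precision lambda ~ Gamma(shape=nu/2, rate=nu/2),
    # then x | lambda ~ N(0, 1/lambda).
    lam = random.gammavariate(nu / 2, 2 / nu)   # gammavariate takes scale = 1/rate
    samples.append(random.gauss(0.0, 1.0 / math.sqrt(lam)))

var = statistics.pvariance(samples)
print(f"sample variance {var:.3f} vs theoretical {nu / (nu - 2):.3f}")
```

This is exactly the augmentation that introduces the hidden variance variables the abstract refers to, making the conditional posteriors Gaussian and tractable.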
Magnetic Separation in Romania
Rezlescu, Nicolae; Bradu, Elena-Brandusa; Iacob, Gheorghe; Badescu, Vasile; Iacob, Lavinia
1986-01-01
The utilization of the magnetic separators of foreign and Romanian source is presented and the most important achievements in research, engineering design and manufacturing activity concerning the magnetic separation in Romania are reviewed.
A New Method of Blind Source Separation Using Single-Channel ICA Based on Higher-Order Statistics
Directory of Open Access Journals (Sweden)
Guangkuo Lu
2015-01-01
Full Text Available Methods utilizing independent component analysis (ICA) give little guidance about practical considerations for separating single-channel real-world data, most of which are nonlinear, nonstationary, and even chaotic in many fields. To solve this problem, a three-step method is provided in this paper. In the first step, the measured signal, assumed to be a piecewise higher-order stationary time series, is divided into a series of higher-order stationary segments by applying a modified segmentation algorithm. Then the state space is reconstructed and the single-channel signal is transformed into a pseudo multiple-input multiple-output (MIMO) mode using a method of nonlinear analysis based on higher-order statistics (HOS). In the last step, ICA is performed on the pseudo MIMO data to decompose the single-channel recording into its underlying independent components (ICs), and the ICs of interest are then extracted. Finally, the effectiveness and advantages of the higher-order single-channel ICA (SCICA) method are validated with measured data through experiments. The proposed method is also shown to be more robust under different SNRs and/or embedding dimensions via explicit formulae and simulations.
Technical note: Bayesian calibration of dynamic ruminant nutrition models.
Reed, K F; Arhonditsis, G B; France, J; Kebreab, E
2016-08-01
Mechanistic models of ruminant digestion and metabolism have advanced our understanding of the processes underlying ruminant animal physiology. Deterministic modeling practices ignore the inherent variation within and among individual animals and thus have no way to assess how sources of error influence model outputs. We introduce Bayesian calibration of mathematical models to address the need for robust mechanistic modeling tools that can accommodate error analysis by remaining within the bounds of data-based parameter estimation. For the purpose of prediction, the Bayesian approach generates a posterior predictive distribution that represents the current estimate of the value of the response variable, taking into account both the uncertainty about the parameters and model residual variability. Predictions are expressed as probability distributions, thereby conveying significantly more information than point estimates in regard to uncertainty. Our study illustrates some of the technical advantages of Bayesian calibration and discusses the future perspectives in the context of animal nutrition modeling.
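The posterior predictive idea in the abstract above can be sketched with the simplest conjugate case: a normal prior on a scalar parameter with known residual variance. All numbers are invented for illustration; this is not the authors' ruminant nutrition model.

```python
import math

# Hypothetical calibration: observations y_i = theta + e_i, e_i ~ N(0, sigma2),
# with sigma2 treated as known and a vague normal prior on theta.
y = [2.1, 1.9, 2.4, 2.0, 2.2]
sigma2 = 0.04                  # residual variance (assumed known here)
mu0, tau2 = 0.0, 100.0         # prior mean and variance for theta

n = len(y)
ybar = sum(y) / n

# Conjugate normal-normal update: precisions add, means are precision-weighted.
post_var = 1 / (1 / tau2 + n / sigma2)
post_mean = post_var * (mu0 / tau2 + n * ybar / sigma2)

# Posterior predictive for a new observation: N(post_mean, post_var + sigma2),
# i.e. parameter uncertainty plus residual variability, as the abstract describes.
pred_var = post_var + sigma2
print(f"posterior mean {post_mean:.3f}, predictive sd {math.sqrt(pred_var):.3f}")
```

The decomposition `post_var + sigma2` is the key point: a prediction interval from this distribution is wider than the parameter uncertainty alone, which is what point-estimate reporting hides.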
Bayesian inference and Markov chain Monte Carlo in imaging
Higdon, David M.; Bowsher, James E.
1999-05-01
Over the past 20 years, many problems in Bayesian inference that were previously intractable can now be fairly routinely dealt with using a computationally intensive technique for exploring the posterior distribution called Markov chain Monte Carlo (MCMC). Primarily because of insufficient computing capabilities, most MCMC applications have been limited to rather standard statistical models. However, with the computing power of modern workstations, a fully Bayesian approach with MCMC is now possible for many imaging applications. Such an approach can be quite useful because it leads not only to 'point' estimates of an underlying image or emission source, but also gives a means for quantifying uncertainties regarding the image. This paper gives an overview of Bayesian image analysis and focuses on applications relevant to medical imaging. Particular focus is on prior image models and outlining MCMC methods for these models.
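The MCMC machinery the abstract refers to can be illustrated with a random-walk Metropolis sampler on a toy one-dimensional target (a standard normal stands in for a posterior over a single image parameter; all tuning values are invented):

```python
import math
import random

random.seed(7)

def log_post(x):
    # Toy target: standard normal "posterior", known mean 0 and variance 1.
    return -0.5 * x * x

x = 0.0
chain = []
for _ in range(50_000):
    prop = x + random.gauss(0.0, 1.0)          # symmetric random-walk proposal
    if math.log(random.random()) < log_post(prop) - log_post(x):
        x = prop                               # accept; otherwise keep current x
    chain.append(x)

burn = chain[5_000:]                           # discard burn-in
mean = sum(burn) / len(burn)
var = sum((v - mean) ** 2 for v in burn) / len(burn)
print(f"posterior mean ~ {mean:.3f}, variance ~ {var:.3f}")
```

The same accept/reject loop applies when `log_post` is a full image model; the chain then provides both point estimates and the uncertainty quantification the paper emphasizes.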
Bayesian seismic AVO inversion
Energy Technology Data Exchange (ETDEWEB)
Buland, Arild
2002-07-01
A new linearized AVO inversion technique is developed in a Bayesian framework. The objective is to obtain posterior distributions for P-wave velocity, S-wave velocity and density. Distributions for other elastic parameters can also be assessed, for example acoustic impedance, shear impedance and P-wave to S-wave velocity ratio. The inversion algorithm is based on the convolutional model and a linearized weak contrast approximation of the Zoeppritz equation. The solution is represented by a Gaussian posterior distribution with explicit expressions for the posterior expectation and covariance, hence exact prediction intervals for the inverted parameters can be computed under the specified model. The explicit analytical form of the posterior distribution provides a computationally fast inversion method. Tests on synthetic data show that all inverted parameters were almost perfectly retrieved when the noise approached zero. With realistic noise levels, acoustic impedance was the best determined parameter, while the inversion provided practically no information about the density. The inversion algorithm has also been tested on a real 3-D dataset from the Sleipner Field. The results show good agreement with well logs but the uncertainty is high. The stochastic model includes uncertainties of both the elastic parameters, the wavelet and the seismic and well log data. The posterior distribution is explored by Markov chain Monte Carlo simulation using the Gibbs sampler algorithm. The inversion algorithm has been tested on a seismic line from the Heidrun Field with two wells located on the line. The uncertainty of the estimated wavelet is low. In the Heidrun examples the effect of including uncertainty of the wavelet and the noise level was marginal with respect to the AVO inversion results. We have developed a 3-D linearized AVO inversion method with spatially coupled model parameters where the objective is to obtain posterior distributions for P-wave velocity, S
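The closed-form Gaussian posterior that makes this kind of linearized inversion fast can be sketched for a generic linear forward model. Here a random matrix G stands in for the linearized convolutional/Zoeppritz operator, and all sizes and variances are invented:

```python
import numpy as np

rng = np.random.default_rng(3)

# Linear forward model d = G m + e with prior m ~ N(mu0, C0) and
# noise e ~ N(0, sigma2 I): the posterior over m is Gaussian in closed form.
n_d, n_m = 30, 5
G = rng.standard_normal((n_d, n_m))
m_true = rng.standard_normal(n_m)
sigma2 = 0.1
d = G @ m_true + np.sqrt(sigma2) * rng.standard_normal(n_d)

mu0, C0 = np.zeros(n_m), np.eye(n_m)

# Precisions add; the posterior mean solves a regularized least-squares system.
C_post = np.linalg.inv(np.linalg.inv(C0) + G.T @ G / sigma2)
mu_post = C_post @ (np.linalg.inv(C0) @ mu0 + G.T @ d / sigma2)

print("posterior sd per parameter:", np.round(np.sqrt(np.diag(C_post)), 3))
```

Because the posterior is Gaussian with explicit mean and covariance, exact prediction intervals follow directly, with no sampling required; this is the computational advantage the abstract highlights.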
Directory of Open Access Journals (Sweden)
Shu-Yin Chiang
2002-01-01
Full Text Available In this paper, we study the simplified models of the ATM (Asynchronous Transfer Mode multiplexer network with Bernoulli random traffic sources. Based on the model, the performance measures are analyzed by the different output service schemes.
Case studies in Bayesian microbial risk assessments
Directory of Open Access Journals (Sweden)
Turner Joanne
2009-12-01
Full Text Available Abstract Background The quantification of uncertainty and variability is a key component of quantitative risk analysis. Recent advances in Bayesian statistics make it ideal for integrating multiple sources of information, of different types and quality, and providing a realistic estimate of the combined uncertainty in the final risk estimates. Methods We present two case studies related to foodborne microbial risks. In the first, we combine models to describe the sequence of events resulting in illness from consumption of milk contaminated with VTEC O157. We used Monte Carlo simulation to propagate uncertainty in some of the inputs to computer models describing the farm and pasteurisation process. Resulting simulated contamination levels were then assigned to consumption events from a dietary survey. Finally we accounted for uncertainty in the dose-response relationship and uncertainty due to limited incidence data to derive uncertainty about yearly incidences of illness in young children. Options for altering the risk were considered by running the model with different hypothetical policy-driven exposure scenarios. In the second case study we illustrate an efficient Bayesian sensitivity analysis for identifying the most important parameters of a complex computer code that simulated VTEC O157 prevalence within a managed dairy herd. This was carried out in 2 stages, first to screen out the unimportant inputs, then to perform a more detailed analysis on the remaining inputs. The method works by building a Bayesian statistical approximation to the computer code using a number of known code input/output pairs (training runs. Results We estimated that the expected total number of children aged 1.5-4.5 who become ill due to VTEC O157 in milk is 8.6 per year, with 95% uncertainty interval (0,11.5. The most extreme policy we considered was banning on-farm pasteurisation of milk, which reduced the estimate to 6.4 with 95% interval (0,11. In the second
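The Monte Carlo uncertainty-propagation pattern used in the first case study can be shown in a toy form. The dose-response constant, input distributions, and units below are invented and are not the VTEC O157 model; only the sampling pattern is the point.

```python
import math
import random
import statistics

random.seed(11)

def risk_model(contamination, servings):
    # Hypothetical dose-response: P(illness) per serving = 1 - exp(-beta * dose).
    beta = 1e-3
    return (1 - math.exp(-beta * contamination)) * servings

# Propagate uncertainty in the inputs by repeated sampling through the model.
draws = []
for _ in range(20_000):
    contamination = random.lognormvariate(2.0, 0.8)   # uncertain input (cfu/serving)
    servings = random.randint(50, 150)                # uncertain input (per year)
    draws.append(risk_model(contamination, servings))

draws.sort()
lo, hi = draws[int(0.025 * len(draws))], draws[int(0.975 * len(draws))]
print(f"expected cases/yr {statistics.fmean(draws):.2f}, "
      f"95% interval ({lo:.2f}, {hi:.2f})")
```

Policy scenarios are then compared by changing the input distributions and rerunning the same loop, exactly as the case study does with its hypothetical exposure scenarios.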
Bayesian Analysis of High Dimensional Classification
Mukhopadhyay, Subhadeep; Liang, Faming
2009-12-01
Modern data mining and bioinformatics have presented an important playground for statistical learning techniques, where the number of input variables may be much larger than the sample size of the training data. In supervised learning, logistic regression or probit regression can be used to model a binary output and form perceptron classification rules based on Bayesian inference. In these cases, there is much interest in searching for sparse models in the high-dimensional regression/classification setup. We first discuss two common challenges in analyzing high-dimensional data. The first is the curse of dimensionality: the complexity of many existing algorithms scales exponentially with the dimensionality of the space, so the algorithms soon become computationally intractable and therefore inapplicable in many real applications. The second is multicollinearity among the predictors, which severely slows down the algorithms. To make Bayesian analysis operational in high dimensions, we propose a novel Hierarchical Stochastic Approximation Monte Carlo (HSAMC) algorithm, which overcomes the curse of dimensionality and the multicollinearity of predictors in high dimensions, and also possesses a self-adjusting mechanism to avoid local minima separated by high energy barriers. Models and methods are illustrated by simulations inspired by the field of genomics. Numerical results indicate that HSAMC can work as a general model-selection sampler in high-dimensional complex model spaces.
DOA estimation method based on blind source separation algorithm
Institute of Scientific and Technical Information of China (English)
徐先峰; 刘义艳; 段晨东
2012-01-01
A new DOA (direction-of-arrival) estimation method based on a fast blind source separation algorithm (FBSS-DOA) is proposed in this paper. A group of correlation matrices possessing a diagonal structure is constructed, and a joint-diagonalization cost function for blind source separation is introduced. A fast multiplicative iterative algorithm in the complex domain is used to solve this cost function, yielding an estimate of the demixing matrix and thereby realizing DOA estimation. Compared with similar algorithms, the proposed algorithm is more widely applicable and achieves more accurate DOA estimation. Simulation results verify its fast convergence and superior estimation performance.
BayesWave: Bayesian Inference for Gravitational Wave Bursts and Instrument Glitches
Cornish, Neil J
2014-01-01
A central challenge in Gravitational Wave Astronomy is identifying weak signals in the presence of non-stationary and non-Gaussian noise. The separation of gravitational wave signals from noise requires good models for both. When accurate signal models are available, such as for binary neutron star systems, it is possible to make robust detection statements even when the noise is poorly understood. In contrast, searches for "un-modeled" transient signals are strongly impacted by the methods used to characterize the noise. Here we take a Bayesian approach and introduce a multi-component, variable-dimension, parameterized noise model that explicitly accounts for non-stationarity and non-Gaussianity in data from interferometric gravitational wave detectors. Instrumental transients (glitches) and burst sources of gravitational waves are modeled using a Morlet-Gabor continuous wavelet basis. The number and placement of the wavelets are determined by a trans-dimensional Reversible Jump Markov Chain Monte Carlo algor...
Attention in a bayesian framework
DEFF Research Database (Denmark)
Whiteley, Louise Emma; Sahani, Maneesh
2012-01-01
The behavioral phenomena of sensory attention are thought to reflect the allocation of a limited processing resource, but there is little consensus on the nature of the resource or why it should be limited. Here we argue that a fundamental bottleneck emerges naturally within Bayesian models of perception, and use this observation to frame a new computational account of the need for, and action of, attention - unifying diverse attentional phenomena in a way that goes beyond previous inferential, probabilistic and Bayesian models. Attentional effects are most evident in cluttered environments, and include both selective phenomena, where attention is invoked by cues that point to particular stimuli, and integrative phenomena, where attention is invoked dynamically by endogenous processing. However, most previous Bayesian accounts of attention have focused on describing relatively simple experimental
Probability biases as Bayesian inference
Directory of Open Access Journals (Sweden)
Andre; C. R. Martins
2006-11-01
Full Text Available In this article, I will show how several observed biases in human probabilistic reasoning can be partially explained as good heuristics for making inferences in an environment where probabilities have uncertainties associated with them. Previous results show that the weight functions and the observed violations of coalescing and stochastic dominance can be understood from a Bayesian point of view. We will review those results and see that Bayesian methods should also be used as part of the explanation behind other known biases. That means that, although the observed errors are still errors, they can be understood as adaptations to the solution of real-life problems. Heuristics that allow fast evaluations and mimic a Bayesian inference would be an evolutionary advantage, since they would give us an efficient way of making decisions. In that sense, it should be no surprise that humans reason with probability as has been observed.
Bayesian Methods and Universal Darwinism
Campbell, John
2010-01-01
Bayesian methods since the time of Laplace have been understood by their practitioners as closely aligned to the scientific method. Indeed a recent champion of Bayesian methods, E. T. Jaynes, titled his textbook on the subject Probability Theory: the Logic of Science. Many philosophers of science including Karl Popper and Donald Campbell have interpreted the evolution of Science as a Darwinian process consisting of a 'copy with selective retention' algorithm abstracted from Darwin's theory of Natural Selection. Arguments are presented for an isomorphism between Bayesian Methods and Darwinian processes. Universal Darwinism, as the term has been developed by Richard Dawkins, Daniel Dennett and Susan Blackmore, is the collection of scientific theories which explain the creation and evolution of their subject matter as due to the operation of Darwinian processes. These subject matters span the fields of atomic physics, chemistry, biology and the social sciences. The principle of Maximum Entropy states that system...
Bayesian modeling using WinBUGS
Ntzoufras, Ioannis
2009-01-01
A hands-on introduction to the principles of Bayesian modeling using WinBUGS. Bayesian Modeling Using WinBUGS provides an easily accessible introduction to the use of WinBUGS programming techniques in a variety of Bayesian modeling settings. The author provides an accessible treatment of the topic, offering readers a smooth introduction to the principles of Bayesian modeling with detailed guidance on the practical implementation of key principles. The book begins with a basic introduction to Bayesian inference and the WinBUGS software and goes on to cover key topics, including: Markov Chain Monte Carlo algorithms in Bayesian inference; generalized linear models; Bayesian hierarchical models; predictive distribution and model checking; and Bayesian model and variable evaluation. Computational notes and screen captures illustrate the use of both WinBUGS and R software to apply the discussed techniques. Exercises at the end of each chapter allow readers to test their understanding of the presented concepts and all ...
Shashilov, V. A.; Lednev, I. K.
2007-11-01
Amyloid fibrils are associated with many neurodegenerative diseases. The application of conventional biophysical techniques, including solution NMR and X-ray crystallography, to the structural characterization of fibrils is limited because fibrils are neither crystalline nor soluble. The Bayesian approach was utilized for extracting the deep UV resonance Raman (DUVRR) spectrum of the lysozyme fibrillar β-sheet based on hydrogen-deuterium exchange spectral data. The problem was shown to be unsolvable using blind source separation or conventional chemometrics methods because of the 100% correlation of the concentration profiles of the species under study. Information about the mixing process was incorporated by forcing the columns of the concentration matrix to be proportional to the expected concentration profiles. The ill-conditioning of the matrix was removed by concatenating it with a diagonal matrix whose entries correspond to the known pure spectra (sources). Prior information about the spectral features and characteristic bands of the spectra was taken into account using the Bayesian signal dictionary approach. The extracted DUVRR spectrum of the cross-β sheet core exhibited sharp bands indicating a highly ordered structure. Well-resolved sub-bands in the Amide I and Amide III regions enabled us to assign the fibril core structure to an anti-parallel β-sheet and estimate the amide group facial angle Ψ in the cross-β structure. The elaborated Bayesian approach was demonstrated to be applicable to studying correlated biochemical processes.
A Bayesian estimation of the helioseismic solar age
Bonanno, Alfio
2015-01-01
The helioseismic determination of the solar age has been the subject of several studies because it provides us with an independent estimate of the age of the solar system. We present Bayesian estimates of the helioseismic age of the Sun, determined by means of calibrated solar models that employ different equations of state and nuclear reaction rates. We use 17 frequency separation ratios $r_{02}(n)=(\nu_{n,0}-\nu_{n-1,2})/(\nu_{n,1}-\nu_{n-1,1})$ ...
2015-01-01
Sources: Fondation Pablo Iglesias, Alcalá de Henares. Sections: private archives of Manuel Arija; external archives; archives of the FNJS de España; press. General Archives of the Administration, Alcalá de Henares. Sections: opposition to Francoism; Sig. 653; Sig. TOP 82/68.103-68.602; index of the collective letters, Relaciones, letters to the Minister of Information, March 1965, c. 662. Film sources: Spanish National Film Archive, NO.DO. No. 1157C, 08/03/1965. Aguirre, Javier, Blanco vertical....
Bayesian variable selection with spherically symmetric priors
De Kock, M B
2014-01-01
We propose that Bayesian variable selection for linear parametrisations with Gaussian iid likelihoods be based on the spherical symmetry of the diagonalised parameter space. This reduces the multidimensional parameter space problem to one dimension without the need for conjugate priors. Combining this likelihood with what we call the r-prior results in a framework in which we can derive closed forms for the evidence, posterior and characteristic function for four different r-priors, including the hyper-g prior and the Zellner-Siow prior, which are shown to be special cases of our r-prior. Two scenarios of a single variable dispersion parameter and of fixed dispersion are studied separately, and asymptotic forms comparable to the traditional information criteria are derived. In a simple simulation exercise, we find that model comparison based on our uniform r-prior appears to fare better than the current model comparison schemes.
Bayesian test and Kuhn's paradigm
Institute of Scientific and Technical Information of China (English)
Chen Xiaoping
2006-01-01
Kuhn's theory of paradigm reveals a pattern of scientific progress, in which normal science alternates with scientific revolution. But Kuhn greatly underrated the function of scientific testing in his pattern, because he focused all his attention on the hypothetico-deductive schema instead of the Bayesian schema. This paper employs the Bayesian schema to re-examine Kuhn's theory of paradigm, to uncover its logical and rational components, and to illustrate the tensional structure of logic and belief, rationality and irrationality, in the process of scientific revolution.
Perception, illusions and Bayesian inference.
Nour, Matthew M; Nour, Joseph M
2015-01-01
Descriptive psychopathology makes a distinction between veridical perception and illusory perception. In both cases a perception is tied to a sensory stimulus, but in illusions the perception is of a false object. This article re-examines this distinction in light of new work in theoretical and computational neurobiology, which views all perception as a form of Bayesian statistical inference that combines sensory signals with prior expectations. Bayesian perceptual inference can solve the 'inverse optics' problem of veridical perception and provides a biologically plausible account of a number of illusory phenomena, suggesting that veridical and illusory perceptions are generated by precisely the same inferential mechanisms.
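The inferential step this account rests on, combining a noisy sensory signal with a prior expectation, has a closed form when prior and likelihood are both Gaussian: the posterior mean is a precision-weighted average of the two. A minimal sketch of that computation (all numbers are illustrative, not taken from the article):

```python
# Bayesian cue combination with a Gaussian prior and Gaussian likelihood.
# Posterior precision = prior precision + likelihood precision;
# posterior mean = precision-weighted average of prior mean and observation.

def gaussian_posterior(mu_prior, var_prior, x_obs, var_obs):
    """Combine a prior N(mu_prior, var_prior) with a likelihood centred
    on the observation x_obs with variance var_obs."""
    precision = 1.0 / var_prior + 1.0 / var_obs
    var_post = 1.0 / precision
    mu_post = var_post * (mu_prior / var_prior + x_obs / var_obs)
    return mu_post, var_post

# Example: a prior expectation about a stimulus vs. a noisy sensory measurement.
mu, var = gaussian_posterior(mu_prior=10.0, var_prior=4.0, x_obs=14.0, var_obs=1.0)
print(mu, var)  # posterior pulled toward the more precise observation (~13.2, ~0.8)
```

The same precision-weighting explains why a strong prior can dominate a weak sensory signal, which is the proposed mechanism for illusory percepts.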
3D Bayesian contextual classifiers
DEFF Research Database (Denmark)
Larsen, Rasmus
2000-01-01
We extend a series of multivariate Bayesian 2-D contextual classifiers to 3-D by specifying a simultaneous Gaussian distribution for the feature vectors as well as a prior distribution of the class variables of a pixel and its 6 nearest 3-D neighbours.
Bayesian methods for proteomic biomarker development
Directory of Open Access Journals (Sweden)
Belinda Hernández
2015-12-01
In this review we provide an introduction to Bayesian inference and demonstrate some of the advantages of using a Bayesian framework. We summarize how Bayesian methods have been used previously in proteomics and other areas of bioinformatics. Finally, we describe some popular and emerging Bayesian models from the statistical literature and provide a worked tutorial including code snippets to show how these methods may be applied for the evaluation of proteomic biomarkers.
Bayesian variable order Markov models: Towards Bayesian predictive state representations
C. Dimitrakakis
2009-01-01
We present a Bayesian variable order Markov model that shares many similarities with predictive state representations. The resulting models are compact and much easier to specify and learn than classical predictive state representations. Moreover, we show that they significantly outperform a more st
Bayesian networks and food security - An introduction
Stein, A.
2004-01-01
This paper gives an introduction to Bayesian networks. Networks are defined and put into a Bayesian context. Directed acyclical graphs play a crucial role here. Two simple examples from food security are addressed. Possible uses of Bayesian networks for implementation and further use in decision sup
Bayesian Model Averaging for Propensity Score Analysis
Kaplan, David; Chen, Jianshen
2013-01-01
The purpose of this study is to explore Bayesian model averaging in the propensity score context. Previous research on Bayesian propensity score analysis does not take into account model uncertainty. In this regard, an internally consistent Bayesian framework for model building and estimation must also account for model uncertainty. The…
A Bayesian Nonparametric Approach to Test Equating
Karabatsos, George; Walker, Stephen G.
2009-01-01
A Bayesian nonparametric model is introduced for score equating. It is applicable to all major equating designs, and has advantages over previous equating models. Unlike the previous models, the Bayesian model accounts for positive dependence between distributions of scores from two tests. The Bayesian model and the previous equating models are…
Phycas: software for Bayesian phylogenetic analysis.
Lewis, Paul O; Holder, Mark T; Swofford, David L
2015-05-01
Phycas is open source, freely available Bayesian phylogenetics software written primarily in C++ but with a Python interface. Phycas specializes in Bayesian model selection for nucleotide sequence data, particularly the estimation of marginal likelihoods, central to computing Bayes Factors. Marginal likelihoods can be estimated using newer methods (Thermodynamic Integration and Generalized Steppingstone) that are more accurate than the widely used Harmonic Mean estimator. In addition, Phycas supports two posterior predictive approaches to model selection: Gelfand-Ghosh and Conditional Predictive Ordinates. The General Time Reversible family of substitution models, as well as a codon model, are available, and data can be partitioned with all parameters unlinked except tree topology and edge lengths. Phycas provides for analyses in which the prior on tree topologies allows polytomous trees as well as fully resolved trees, and provides for several choices for edge length priors, including a hierarchical model as well as the recently described compound Dirichlet prior, which helps avoid overly informative induced priors on tree length. PMID:25577605
Tsyganov, Y S
2015-01-01
The general philosophy of the procedure for detecting rare events in recent experiments with a 48Ca projectile at the Dubna Gas-Filled Recoil Separator (DGFRS), aimed at the synthesis of superheavy elements (SHE), is reviewed. Specific instruments and methods are considered, as are some historical sources of the successful experiments for Z = 112-118. Special attention is paid to the application of the method of active correlations in heavy-ion-induced complete-fusion nuclear reactions; an example of its application in the Z = 115 experiment is presented, together with a brief description of the 243Am + 48Ca -> 291-x115 + xn experiment. Some attention is also paid to the role of chemical experiments in the discoveries of SHEs. The DGFRS detection/monitoring system is presented in full for the first time.
Gibson, John; Yi, Yi; Birks, Jean
2016-04-01
Hydrograph separation using stable isotopes of water is used to partition streamflow sources in the Athabasca River and its tributaries in the oil sands region of northern Alberta, Canada. Snow, rain, groundwater and surface water contributions to total streamflow are estimated for multi-year records and provide considerable insight into runoff generation mechanisms operating in six tributaries and at four stations along the Athabasca River. Groundwater, found to be an important flow source at all stations, is the dominant component of the hydrograph in three tributaries (Steepbank R., Muskeg R., Firebag R.), accounting for 39 to 50% of annual streamflow. Surface water, mainly drainage from peatlands, is also widely important, and dominant in three tributaries (Clearwater R., Mackay R., Ells R.), accounting for 45 to 81% of annual streamflow. Direct runoff of precipitation sources, including rain (7-19%) and snowmelt (3-7%), accounts for the remainder. The fairly limited contributions from direct precipitation illustrate that most snow and rain events result in indirect displacement of pre-event water (surface water and groundwater), due in part to the prevalence of fill-and-spill mechanisms and limited overland flow. Systematic shifts in the groundwater:surface-water ratios, noted for the main stem of the Athabasca River and in its tributaries, are an important control on the spatial and temporal distribution of major and minor ions, trace elements, dissolved organics and contaminants, as well as on the susceptibility of the rivers to climate and development-related impacts. Runoff partitioning is likely to be a useful monitoring tool for better understanding flow drivers and water quality controls, and for determining the underlying causes of climate or industrial impacts.
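The arithmetic behind two-component isotope hydrograph separation is a simple mass balance: the stream's isotope signature is a mixture of the end-member signatures, so each source fraction follows from one linear equation. A toy sketch with invented δ18O values (not data from this study):

```python
# Two-component hydrograph separation by isotope mass balance.
# Streamflow is treated as a mixture of "event" water (rain/snowmelt) and
# "pre-event" water (groundwater/surface storage):
#   delta_stream = f_pre * delta_pre + (1 - f_pre) * delta_event

def pre_event_fraction(delta_stream, delta_pre, delta_event):
    """Fraction of pre-event water in streamflow, from d18O signatures."""
    if delta_pre == delta_event:
        raise ValueError("End members must be isotopically distinct.")
    return (delta_stream - delta_event) / (delta_pre - delta_event)

# Hypothetical d18O values (permil): groundwater -17, snowmelt -22, stream -18.5
f = pre_event_fraction(delta_stream=-18.5, delta_pre=-17.0, delta_event=-22.0)
print(f"pre-event (groundwater) fraction: {f:.2f}")  # 0.70
```

Multi-component separations such as the four sources estimated in the study require additional tracers, one independent tracer per extra end member, but the mixing logic is the same.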
Blind source separation of ship-radiated noise using a single observing channel
Institute of Scientific and Technical Information of China (English)
刘佳; 杨士莪; 朴胜春; 黄益旺
2011-01-01
A method of blind source separation for ship-radiated noise based on a single observing channel is proposed. According to the spatial distribution of ship-radiated noise in the far field, virtual channels are constructed from the single observed channel by time delay and filtering, converting the single channel into multiple channels so that blind source separation can be applied; this overcomes the limitation on the number of channels. Simulation and experimental data analyses show that the correlation coefficients of the separated signals improve stably under different signal-to-noise ratios, indicating that the method can, to some extent, separate ship-radiated noise from the sea ambient noise background using a single observing channel; the experimental data also show that the method has some effect in separating two target ships.
Amirov, R. Kh.; Vorona, N. A.; Gavrikov, A. V.; Liziakin, G. D.; Polistchook, V. P.; Samoylov, I. S.; Smirnov, V. P.; Usmanov, R. A.; Yartsev, I. M.
2015-12-01
One of the key problems in the development of plasma separation technology is designing a plasma source which uses condensed spent nuclear fuel (SNF) or nuclear wastes as a raw material. This paper covers the experimental study of the evaporation and ionization of model materials (gadolinium, niobium oxide, and titanium oxide). For these purposes, a vacuum arc with a heated cathode on the studied material was initiated and its parameters in different regimes were studied. During the experiment, the cathode temperature, arc current, arc voltage, and plasma radiation spectra were measured, and also probe measurements were carried out. It was found that the increase in the cathode heating power leads to the decrease in the arc voltage (to 3 V). This fact makes it possible to reduce the electron energy and achieve singly ionized plasma with a high degree of ionization to fulfill one of the requirements for plasma separation of SNF. This finding is supported by the analysis of the plasma radiation spectrum and the results of the probe diagnostics.
Energy Technology Data Exchange (ETDEWEB)
Amirov, R. Kh.; Vorona, N. A.; Gavrikov, A. V.; Liziakin, G. D.; Polistchook, V. P.; Samoylov, I. S.; Smirnov, V. P.; Usmanov, R. A., E-mail: ravus46@yandex.ru; Yartsev, I. M. [Russian Academy of Sciences, Joint Institute for High Temperatures (Russian Federation)
2015-12-15
International Nuclear Information System (INIS)
MERGING DIGITAL SURFACE MODELS IMPLEMENTING BAYESIAN APPROACHES
Directory of Open Access Journals (Sweden)
H. Sadeq
2016-06-01
In this research different DSMs from different sources have been merged. The merging is based on a probabilistic model using a Bayesian approach. The implemented data have been sourced from very high resolution satellite imagery sensors (e.g. WorldView-1 and Pleiades). It is deemed preferable to use a Bayesian approach when the data obtained from the sensors are limited and it is difficult or very costly to obtain many measurements; the lack of data can then be addressed by introducing a priori estimations. To infer the prior data, the roofs of the buildings are assumed to be smooth, and for that purpose local entropy has been implemented. In addition to the a priori estimations, GNSS RTK measurements have been collected in the field and are used as check points to assess the quality of the DSMs and to validate the merging result. The model has been applied in the West End of Glasgow, containing different kinds of buildings, such as flat-roofed and hipped-roofed buildings. Both quantitative and qualitative methods have been employed to validate the merged DSM. The validation results have shown that the model was able to improve the quality of the DSMs and some characteristics such as the roof surfaces, which consequently led to better representations. In addition, the developed model has been compared with the well-established Maximum Likelihood model and showed similar quantitative statistical results and better qualitative results. Although the proposed model has been applied to DSMs derived from satellite imagery, it can be applied to any other sourced DSMs.
Merging Digital Surface Models Implementing Bayesian Approaches
Sadeq, H.; Drummond, J.; Li, Z.
2016-06-01
Huang, Pei; Mukherji, Sachiyo T; Wu, Sha; Muller, James; Goel, Ramesh
2016-10-15
Recently, research on source separation followed by treatment of urine and/or resource recovery from human urine has shown promise as an emerging management strategy. Despite contributing only 1% of the total volume of wastewater, human urine contributes about 80% of the nitrogen, 70% of the potassium, and up to 50% of the total phosphorus in wastewater. It is also known that many micropollutants, especially selected estrogens, enter municipal wastewater through urine excretion. In this research, we investigated the fate of 17β-estradiol (E2) as a model estrogen during struvite precipitation from synthetic urine followed by treatment of the urine using a partial nitritation-anammox (PN/A) system. Single-stage and two-stage suspended growth PN/A configurations were used to remove the nitrogen in urine after struvite precipitation. The results showed almost 95% phosphorus and 5% nitrogen recovery/removal from the synthetic urine due to struvite precipitation. The single- and two-stage PN/A processes were able to remove around 50% and 75%, respectively, of the ammonia and nitrogen present in the post-struvite urine solution. After struvite precipitation, more than 95% of the E2 remained in solution, and the transformation of E2 to E1 happened during urine storage. Most of the E2 removal that occurred during the PN/A process was due to sorption on the biomass and biodegradation (transformation of E2 to E1, and slow degradation of E1 to other metabolites). These results demonstrate that a combination of chemical and biological unit processes will be needed to recover and manage nutrients in source-separated urine. PMID:27566951
Bayesian Classification of Image Structures
DEFF Research Database (Denmark)
Goswami, Dibyendu; Kalkan, Sinan; Krüger, Norbert
2009-01-01
In this paper, we describe work on Bayesian classifiers for distinguishing between homogeneous structures, textures, edges and junctions. We build semi-local classifiers from hand-labeled images to distinguish between these four different kinds of structures based on the concept of intrinsic dimensi...
Bayesian Agglomerative Clustering with Coalescents
Teh, Yee Whye; Daumé III, Hal; Roy, Daniel
2009-01-01
We introduce a new Bayesian model for hierarchical clustering based on a prior over trees called Kingman's coalescent. We develop novel greedy and sequential Monte Carlo inferences which operate in a bottom-up agglomerative fashion. We show experimentally the superiority of our algorithms over others, and demonstrate our approach in document clustering and phylolinguistics.
Bayesian NL interpretation and learning
H. Zeevat
2011-01-01
Everyday natural language communication is normally successful, even though contemporary computational linguistics has shown that NL is characterised by a very high degree of ambiguity, and the results of stochastic methods are not good enough to explain the high success rate. Bayesian natural language
Differentiated Bayesian Conjoint Choice Designs
Z. Sándor (Zsolt); M. Wedel (Michel)
2003-01-01
Previous conjoint choice design construction procedures have produced a single design that is administered to all subjects. This paper proposes to construct a limited set of different designs. The designs are constructed in a Bayesian fashion, taking into account prior uncertainty about
Bayesian inference for Hawkes processes
DEFF Research Database (Denmark)
Rasmussen, Jakob Gulddahl
2013-01-01
The Hawkes process is a practically and theoretically important class of point processes, but parameter-estimation for such a process can pose various problems. In this paper we explore and compare two approaches to Bayesian inference. The first approach is based on the so-called conditional...
3-D contextual Bayesian classifiers
DEFF Research Database (Denmark)
Larsen, Rasmus
In this paper we will consider extensions of a series of Bayesian 2-D contextual classification procedures proposed by Owen (1984), Hjort & Mohn (1984), Welch & Salter (1971) and Haslett (1985) to 3 spatial dimensions. It is evident that compared to classical pixelwise classification further...
Bayesian image restoration, using configurations
DEFF Research Database (Denmark)
Thorarinsdottir, Thordis Linda
2006-01-01
configurations are expressed in terms of the mean normal measure of the random set. These probabilities are used as prior probabilities in a Bayesian image restoration approach. Estimation of the remaining parameters in the model is outlined for salt and pepper noise. The inference in the model is discussed...
Bayesian Analysis of Experimental Data
Directory of Open Access Journals (Sweden)
Lalmohan Bhar
2013-10-01
Analysis of experimental data from a Bayesian point of view is considered, and an appropriate methodology is developed for application to designed experiments. A Normal-Gamma distribution is taken as the prior distribution. The developed methodology is applied to real experimental data taken from long-term fertilizer experiments.
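The Normal-Gamma prior mentioned above is conjugate for a normal likelihood with unknown mean and precision, so the posterior update is available in closed form. A minimal sketch of that update (the hyperparameter values and yield data are invented for illustration, not taken from the paper):

```python
# Conjugate Normal-Gamma update for a normal model with unknown mean mu and
# precision lambda. Prior: (mu, lambda) ~ NormalGamma(mu0, kappa0, alpha0, beta0).

def normal_gamma_update(data, mu0, kappa0, alpha0, beta0):
    n = len(data)
    xbar = sum(data) / n
    ss = sum((x - xbar) ** 2 for x in data)          # within-sample scatter
    kappa_n = kappa0 + n
    mu_n = (kappa0 * mu0 + n * xbar) / kappa_n       # shrunken posterior mean
    alpha_n = alpha0 + n / 2
    beta_n = beta0 + 0.5 * ss + kappa0 * n * (xbar - mu0) ** 2 / (2 * kappa_n)
    return mu_n, kappa_n, alpha_n, beta_n

# Hypothetical plot yields under one fertilizer treatment:
yields = [4.1, 3.8, 4.5, 4.0, 4.3]
post = normal_gamma_update(yields, mu0=4.0, kappa0=1.0, alpha0=1.0, beta0=1.0)
print(post)  # posterior (mu_n, kappa_n, alpha_n, beta_n)
```

The posterior mean `mu_n` is a weighted average of the prior mean and the sample mean, with the prior acting like `kappa0` pseudo-observations.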
Topics in Bayesian statistics and maximum entropy
International Nuclear Information System (INIS)
Notions of Bayesian decision theory and maximum entropy methods are reviewed with particular emphasis on probabilistic inference and Bayesian modeling. The axiomatic approach is considered as the best justification of Bayesian analysis and maximum entropy principle applied in natural sciences. Particular emphasis is put on solving the inverse problem in digital image restoration and Bayesian modeling of neural networks. Further topics addressed briefly include language modeling, neutron scattering, multiuser detection and channel equalization in digital communications, genetic information, and Bayesian court decision-making. (author)
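The maximum entropy principle reviewed here has a classic worked example: among all distributions over the faces of a die that match a prescribed mean, the entropy-maximizing one is exponential in the face value, with the Lagrange multiplier found by one-dimensional root finding. A self-contained sketch (the target mean 4.5 is Jaynes' textbook example, not a figure from this abstract):

```python
import math

# Maximum-entropy distribution over die faces 1..6 subject to a prescribed
# mean: p_k is proportional to exp(-lam * k). Solve for lam by bisection.

def maxent_die(target_mean, lo=-10.0, hi=10.0, iters=200):
    def mean_for(lam):
        w = [math.exp(-lam * k) for k in range(1, 7)]
        z = sum(w)
        return sum(k * wk for k, wk in zip(range(1, 7), w)) / z

    for _ in range(iters):
        mid = (lo + hi) / 2
        if mean_for(mid) > target_mean:  # mean decreases as lam grows
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2
    w = [math.exp(-lam * k) for k in range(1, 7)]
    z = sum(w)
    return [wk / z for wk in w]

p = maxent_die(4.5)  # mean above 3.5, so probabilities rise with face value
print([round(x, 3) for x in p])
```

With no constraint beyond normalization the same machinery returns the uniform distribution, which is the sense in which maximum entropy encodes "no further information".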
Bayesian analysis of rare events
Straub, Daniel; Papaioannou, Iason; Betz, Wolfgang
2016-06-01
In many areas of engineering and science there is an interest in predicting the probability of rare events, in particular in applications related to safety and security. Increasingly, such predictions are made through computer models of physical systems in an uncertainty quantification framework. Additionally, with advances in IT, monitoring and sensor technology, an increasing amount of data on the performance of the systems is collected. This data can be used to reduce uncertainty, improve the probability estimates and consequently enhance the management of rare events and associated risks. Bayesian analysis is the ideal method to include the data into the probabilistic model. It ensures a consistent probabilistic treatment of uncertainty, which is central in the prediction of rare events, where extrapolation from the domain of observation is common. We present a framework for performing Bayesian updating of rare event probabilities, termed BUS. It is based on a reinterpretation of the classical rejection-sampling approach to Bayesian analysis, which enables the use of established methods for estimating probabilities of rare events. By drawing upon these methods, the framework makes use of their computational efficiency. These methods include the First-Order Reliability Method (FORM), tailored importance sampling (IS) methods and Subset Simulation (SuS). In this contribution, we briefly review these methods in the context of the BUS framework and investigate their applicability to Bayesian analysis of rare events in different settings. We find that, for some applications, FORM can be highly efficient and is surprisingly accurate, enabling Bayesian analysis of rare events with just a few model evaluations. In a general setting, BUS implemented through IS and SuS is more robust and flexible.
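The rejection-sampling reinterpretation at the heart of BUS is easy to state: draw parameters from the prior, accept each draw with probability proportional to its likelihood, and the accepted draws follow the posterior; a rare-event probability is then the posterior fraction of accepted samples in the failure domain. A minimal Monte Carlo sketch of that idea (the model, numbers and threshold are invented for illustration; the paper's FORM/IS/SuS machinery replaces this brute-force loop when failures are truly rare):

```python
import math
import random

# Accept-reject Bayesian updating: theta ~ prior, accept with probability
# L(theta)/c where c >= max L. Accepted draws are posterior samples.

random.seed(0)

def likelihood(theta, obs=1.2, sigma=0.5):
    """Gaussian measurement of theta with noise sigma (toy model)."""
    return math.exp(-0.5 * ((obs - theta) / sigma) ** 2)  # bounded by c = 1

posterior = []
while len(posterior) < 5000:
    theta = random.gauss(0.0, 1.0)            # prior N(0, 1)
    if random.random() < likelihood(theta):   # accept w.p. L(theta)/c, c = 1
        posterior.append(theta)

# "Failure" defined as theta exceeding a high threshold.
p_fail = sum(t > 2.0 for t in posterior) / len(posterior)
print(f"posterior P(theta > 2.0) ~ {p_fail:.4f}")
```

For this conjugate toy model the exact posterior is N(0.96, 0.2), so the sampled estimate can be checked analytically; the practical value of BUS is that the acceptance event can be handed to efficient rare-event estimators instead of this naive loop.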
Bayesian methods for measures of agreement
Broemeling, Lyle D
2009-01-01
Using WinBUGS to implement Bayesian inferences of estimation and testing hypotheses, Bayesian Methods for Measures of Agreement presents useful methods for the design and analysis of agreement studies. It focuses on agreement among the various players in the diagnostic process.The author employs a Bayesian approach to provide statistical inferences based on various models of intra- and interrater agreement. He presents many examples that illustrate the Bayesian mode of reasoning and explains elements of a Bayesian application, including prior information, experimental information, the likelihood function, posterior distribution, and predictive distribution. The appendices provide the necessary theoretical foundation to understand Bayesian methods as well as introduce the fundamentals of programming and executing the WinBUGS software.Taking a Bayesian approach to inference, this hands-on book explores numerous measures of agreement, including the Kappa coefficient, the G coefficient, and intraclass correlation...
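Among the agreement measures listed, the kappa coefficient has a compact point-estimate form worth keeping in mind alongside the book's Bayesian treatment: observed agreement corrected for the agreement expected by chance. A minimal sketch with an invented 2x2 rating table (not an example from the book):

```python
# Cohen's kappa from a square contingency table of two raters' calls:
# kappa = (p_observed - p_chance) / (1 - p_chance).

def cohens_kappa(table):
    n = sum(sum(row) for row in table)
    k = len(table)
    p_obs = sum(table[i][i] for i in range(k)) / n
    row = [sum(table[i]) for i in range(k)]
    col = [sum(table[i][j] for i in range(k)) for j in range(k)]
    p_chance = sum(row[i] * col[i] for i in range(k)) / n ** 2
    return (p_obs - p_chance) / (1 - p_chance)

# Two raters classifying 50 cases as positive/negative (hypothetical counts):
ratings = [[20, 5],   # rater A positive: rater B positive / negative
           [10, 15]]  # rater A negative
print(cohens_kappa(ratings))  # ~0.4: moderate agreement beyond chance
```

A Bayesian treatment, as in the book, would instead place a prior on the cell probabilities and report a posterior distribution for kappa rather than this single point estimate.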
Plug & Play object oriented Bayesian networks
DEFF Research Database (Denmark)
Bangsø, Olav; Flores, J.; Jensen, Finn Verner
2003-01-01
Object oriented Bayesian networks have proven themselves useful in recent years. The idea of applying an object oriented approach to Bayesian networks has extended their scope to larger domains that can be divided into autonomous but interrelated entities. Object oriented Bayesian networks have been shown to be quite suitable for dynamic domains as well. However, processing object oriented Bayesian networks in practice does not take advantage of their modular structure. Normally the object oriented Bayesian network is transformed into a Bayesian network and inference is performed by constructing a junction tree from this network. In this paper we propose a method for translating directly from object oriented Bayesian networks to junction trees, avoiding the intermediate translation. We pursue two main purposes: firstly, to maintain the original structure organized in an instance tree...
Bayesian calibration of simultaneity in audiovisual temporal order judgments.
Directory of Open Access Journals (Sweden)
Shinya Yamamoto
After repeated exposures to two successive audiovisual stimuli presented in one frequent order, participants eventually perceive a pair separated by some lag time in the same order as occurring simultaneously (lag adaptation). In contrast, we previously found that perceptual changes occurred in the opposite direction in response to tactile stimuli, conforming to Bayesian integration theory (Bayesian calibration). We further showed, in theory, that the effect of Bayesian calibration cannot be observed when lag adaptation is fully operational. This led to the hypothesis that Bayesian calibration affects judgments regarding the order of audiovisual stimuli, but that this effect is concealed behind the lag adaptation mechanism. In the present study, we showed that lag adaptation is pitch-insensitive using two sounds at 1046 and 1480 Hz. This enabled us to cancel lag adaptation by associating one pitch with sound-first stimuli and the other with light-first stimuli. When we presented each type of stimulus (high- or low-tone) in a different block, the point of simultaneity shifted to "sound-first" for the pitch associated with sound-first stimuli, and to "light-first" for the pitch associated with light-first stimuli. These results are consistent with lag adaptation. In contrast, when we delivered each type of stimulus in a randomized order, the point of simultaneity shifted to "light-first" for the pitch associated with sound-first stimuli, and to "sound-first" for the pitch associated with light-first stimuli. The results clearly show that Bayesian calibration is pitch-specific and is at work behind pitch-insensitive lag adaptation during temporal order judgment of audiovisual stimuli.
Flexible Bayesian Nonparametric Priors and Bayesian Computational Methods
Zhu, Weixuan
2016-01-01
The definition of vectors of dependent random probability measures is a topic of interest in Bayesian nonparametrics. They represent dependent nonparametric prior distributions that are useful for modelling observables for which specific covariate values are known. Our first contribution is the introduction of novel multivariate vectors of two-parameter Poisson-Dirichlet processes. The dependence is induced by applying a Lévy copula to the marginal Lévy intensities. Our attenti...
Bayesian versus 'plain-vanilla Bayesian' multitarget statistics
Mahler, Ronald P. S.
2004-08-01
Finite-set statistics (FISST) is a direct generalization of single-sensor, single-target Bayes statistics to the multisensor-multitarget realm, based on random set theory. Various aspects of FISST are being investigated by several research teams around the world. In recent years, however, a few partisans have claimed that a "plain-vanilla Bayesian approach" suffices as down-to-earth, "straightforward," and general "first principles" for multitarget problems, and that FISST is therefore mere mathematical "obfuscation." In this and a companion paper I demonstrate the speciousness of these claims. In this paper I summarize general Bayes statistics, what is required to use it in multisensor-multitarget problems, and why FISST is necessary to make it practical. Then I demonstrate that the "plain-vanilla Bayesian approach" is so heedlessly formulated that it is erroneous and not even Bayesian, that it denigrates FISST concepts while unwittingly assuming them, and that it has resulted in a succession of algorithms afflicted by inherent -- but less than candidly acknowledged -- computational "logjams."
A Bayesian Framework for SNP Identification
Energy Technology Data Exchange (ETDEWEB)
Webb-Robertson, Bobbie-Jo M.; Havre, Susan L.; Payne, Deborah A.
2005-07-01
Current proteomics techniques, such as mass spectrometry, focus on protein identification, usually ignoring most types of modifications beyond post-translational modifications, with the assumption that only a small number of peptides have to be matched to a protein for a positive identification. However, not all proteins are being identified with current techniques, and improved methods to locate points of mutation are becoming a necessity. When single-nucleotide polymorphisms (SNPs) are observed, brute force is the most common method of locating them, which quickly becomes computationally unattractive as the size of the database associated with the model organism grows. We have developed a Bayesian model for SNPs, BSNP, incorporating evolutionary information at both the nucleotide and amino acid levels. Formulating SNPs as a Bayesian inference problem allows probabilities of interest to be easily obtained, for example the probability of a specific SNP or specific type of mutation over a gene or entire genome. Three SNP databases were used in the evaluation of the BSNP model: the first is a disease-specific gene in human, hemoglobin; the second is also a disease-specific gene in human, p53; and the third is a more general SNP database for multiple genes in mouse. We validate that the BSNP model assigns higher posterior probabilities to the SNPs defined in all three databases than can be attributed to chance under specific evolutionary information, for example the amino acid model described by Majewski and Ott in conjunction with either the four-parameter nucleotide model by Bulmer or the seven-parameter nucleotide model by Majewski and Ott.
A Bayesian Estimator of Protein-Protein Association Probabilities
Energy Technology Data Exchange (ETDEWEB)
Gilmore, Jason M.; Auberry, Deanna L.; Sharp, Julia L.; White, Amanda M.; Anderson, Kevin K.; Daly, Don S.
2008-07-01
The Bayesian Estimator of Protein-Protein Association Probabilities (BEPro3) is a software tool for estimating probabilities of protein-protein association between bait and prey protein pairs using data from multiple-bait, multiple-replicate, protein pull-down LC-MS assay experiments. BEPro3 is open source software that runs on both Windows XP and Mac OS 10.4 or newer versions, and is freely available from http://www.pnl.gov/statistics/BEPro3.
Dean, Lee; Kwon, Ye Jin; Philpott, M Katherine; Stanciu, Cristina E; Seashols-Williams, Sarah J; Dawson Cruz, Tracey; Sturgill, Jamie; Ehrhardt, Christopher J
2015-07-01
Analysis of biological mixtures is a significant problem for forensic laboratories, particularly when the mixture contains only one cell type. Contributions from multiple individuals to biological evidence can complicate DNA profile interpretation and often lead to a reduction in the probative value of DNA evidence or, worse, its total loss. To address this, we have utilized an analytical technique that exploits the intrinsic immunological variation among individuals to physically separate cells from different sources in a mixture prior to DNA profiling. Specifically, we applied a fluorescently labeled antibody probe to selectively bind to one contributor in a mixture through allele-specific interactions with human leukocyte antigen (HLA) proteins that are expressed on the surfaces of most nucleated cells. Once the contributor's cells were bound to the probe, they were isolated from the mixture using fluorescence activated cell sorting (FACS)-a high throughput technique for separating cell populations based on their optical properties-and then subjected to STR analysis. We tested this approach on two-person and four-person whole blood mixtures where one contributor possessed an HLA allele (A*02) that was not shared by other contributors to the mixture. Results showed that hybridization of the mixture with a fluorescently-labeled antibody probe complementary to the A*02 allele's protein product created a cell population with a distinct optical profile that could be easily differentiated from other cells in the mixture. After sorting the cells with FACS, genetic analysis showed that the STR profile of this cell population was consistent with that of the contributor who possessed the A*02 allele. Minor peaks from the A*02-negative contributor(s) were observed but could be easily distinguished from the profile generated from A*02-positive cells. Overall, this indicates that HLA antibody probes coupled to FACS may be an effective approach for generating STR profiles of
Bayesian decision making in human collectives with binary choices
Eguíluz, Víctor M; Fernández-Gracia, J
2015-01-01
Here we focus on the description of the mechanisms behind the process of information aggregation and decision making, a basic step to understand emergent phenomena in society, such as trends, information spreading or the wisdom of crowds. In many situations, agents choose between discrete options. We analyze experimental data on binary opinion choices in humans. The data consists of two separate experiments in which humans answer questions with a binary response, where one is correct and the other is incorrect. The questions are answered without and with information on the answers of some previous participants. We find that a Bayesian approach captures the probability of choosing one of the answers. The influence of peers is uncorrelated with the difficulty of the question. The data is inconsistent with Weber's law, which states that the probability of choosing an option depends on the proportion of previous answers choosing that option and not on the total number of those answers. Last, the present Bayesian ...
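The Bayesian account described in this abstract can be sketched with a minimal toy model (my own illustration, not the authors' analysis): put a uniform prior on which of the two options is correct, assume each previous respondent answers correctly with some probability q, and update on observing that k of n peers chose option A. The function name and the value of q are illustrative assumptions.

```python
def p_correct(k, n, q=0.6):
    """Posterior probability that option A is correct, given that k of n
    previous respondents chose A, each respondent answering correctly with
    probability q, and a uniform prior over which option is correct."""
    like_a = q**k * (1 - q)**(n - k)   # likelihood of the k-of-n data if A is correct
    like_b = (1 - q)**k * q**(n - k)   # likelihood of the same data if B is correct
    return like_a / (like_a + like_b)
```

Note that the posterior depends on the margin k − (n − k), not on the proportion k/n alone: p_correct(3, 5) and p_correct(6, 10) differ even though both have 60% of peers choosing A. This is the kind of distinction the abstract draws against a proportion-only (Weber-like) rule.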
Methods for Bayesian power spectrum inference with galaxy surveys
Jasche, Jens
2013-01-01
We derive and implement a full Bayesian large scale structure inference method aiming at precision recovery of the cosmological power spectrum from galaxy redshift surveys. Our approach improves over previous Bayesian methods by performing a joint inference of the three dimensional density field, the cosmological power spectrum, luminosity dependent galaxy biases and corresponding normalizations. We account for all joint and correlated uncertainties between all inferred quantities. Classes of galaxies with different biases are treated as separate sub samples. The method therefore also allows the combined analysis of more than one galaxy survey. In particular, it solves the problem of inferring the power spectrum from galaxy surveys with non-trivial survey geometries by exploring the joint posterior distribution with efficient implementations of multiple block Markov chain and Hybrid Monte Carlo methods. Our Markov sampler achieves high statistical efficiency in low signal to noise regimes by using a determini...
Institute of Scientific and Technical Information of China (English)
王彪; 朱志慧; 戴跃伟
2016-01-01
Most existing DOA estimation algorithms based on the CS-MMV (Compressed Sensing-Multiple Measurement Vectors) model assume independent and identically distributed (i.i.d.) sources and analyze only the spatial structure of the signals; when the source signals have temporal structure, these algorithms show poor performance and robustness. To overcome this problem, we propose a DOA estimation algorithm based on Sparse Bayesian Learning (SBL) with temporally correlated source vectors. In this method, an underwater acoustic source is modelled as a first-order autoregressive (AR) process, so that the temporal structure of the signal is fully incorporated into the DOA estimation model. A multiple-measurement-vector Sparse Bayesian Learning algorithm is then used to reconstruct the signal spatial spectrum, and a CS model for recovering the unknown sparse sources from multiple measurement vectors is established to complete the DOA estimation. Simulations show that, compared with traditional algorithms, the proposed method offers higher spatial resolution and estimation accuracy, as well as stronger robustness to interference.
Bayesian inference on proportional elections.
Brunello, Gabriel Hideki Vatanabe; Nakano, Eduardo Yoshio
2015-01-01
Polls for majoritarian voting systems usually show estimates of the percentage of votes for each candidate. However, proportional vote systems do not necessarily guarantee the candidate with the most percentage of votes will be elected. Thus, traditional methods used in majoritarian elections cannot be applied on proportional elections. In this context, the purpose of this paper was to perform a Bayesian inference on proportional elections considering the Brazilian system of seats distribution. More specifically, a methodology to answer the probability that a given party will have representation on the chamber of deputies was developed. Inferences were made on a Bayesian scenario using the Monte Carlo simulation technique, and the developed methodology was applied on data from the Brazilian elections for Members of the Legislative Assembly and Federal Chamber of Deputies in 2010. A performance rate was also presented to evaluate the efficiency of the methodology. Calculations and simulations were carried out using the free R statistical software. PMID:25786259
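The kind of Monte Carlo computation this abstract describes can be sketched as follows (an illustration under simplifying assumptions, not the authors' implementation, which targets the Brazilian seat-distribution rules): draw vote shares from a Dirichlet posterior over party shares and allocate seats with a highest-averages rule. The D'Hondt rule, the uniform Dirichlet prior, and all names below are my assumptions.

```python
import random

def dhondt(votes, seats):
    """Allocate seats by the D'Hondt highest-averages rule."""
    alloc = [0] * len(votes)
    for _ in range(seats):
        # next seat goes to the party with the largest quotient votes/(alloc+1)
        i = max(range(len(votes)), key=lambda j: votes[j] / (alloc[j] + 1))
        alloc[i] += 1
    return alloc

def prob_representation(counts, seats, party, n_sim=2000, seed=1):
    """Monte Carlo posterior probability that `party` wins at least one seat:
    shares are drawn from a Dirichlet(counts + 1) posterior (uniform prior
    over shares), then seats are allocated by D'Hondt."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_sim):
        g = [rng.gammavariate(c + 1, 1.0) for c in counts]  # Dirichlet via gammas
        total = sum(g)
        shares = [x / total for x in g]
        if dhondt(shares, seats)[party] >= 1:
            hits += 1
    return hits / n_sim
```

The Dirichlet draw is built from independent gamma variates normalized to sum to one, which is the standard stdlib-only construction.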
Bayesian approach to rough set
Marwala, Tshilidzi
2007-01-01
This paper proposes an approach to training rough set models within a Bayesian framework, using the Markov Chain Monte Carlo (MCMC) method. The prior probabilities are constructed from the prior knowledge that good rough set models have fewer rules. Markov Chain Monte Carlo sampling is conducted in the rough set granule space, with the Metropolis algorithm used as the acceptance criterion. The proposed method is tested on estimating the risk of HIV given demographic data. The results show that the proposed approach achieves an average accuracy of 58%, with the accuracy varying up to 66%. In addition, the Bayesian rough set gives the probabilities of the estimated HIV status as well as the linguistic rules describing how the demographic parameters drive the risk of HIV.
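The Metropolis acceptance step mentioned in this abstract can be sketched generically. The toy sampler below targets a simple continuous posterior rather than the paper's rough-set granule space, and all names are illustrative.

```python
import math
import random

def metropolis(log_post, init, propose, n_iter=1000, seed=0):
    """Generic Metropolis sampler: propose a new state and accept it with
    probability min(1, exp(log_post(new) - log_post(old)))."""
    rng = random.Random(seed)
    x = init
    chain = [x]
    for _ in range(n_iter):
        y = propose(x, rng)
        # accept/reject on the log scale (guard against log(0))
        if math.log(rng.random() + 1e-300) < log_post(y) - log_post(x):
            x = y
        chain.append(x)
    return chain
```

For example, with `log_post = lambda x: -x * x / 2` (a standard normal up to a constant) and a symmetric random-walk proposal, the chain's long-run mean and variance approach 0 and 1.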
Bayesian priors for transiting planets
Kipping, David M
2016-01-01
As astronomers push towards discovering ever-smaller transiting planets, it is increasingly common to deal with low signal-to-noise ratio (SNR) events, where the choice of priors plays an influential role in Bayesian inference. In the analysis of exoplanet data, the selection of priors is often treated as a nuisance, with observers typically defaulting to uninformative distributions. Such treatments miss a key strength of the Bayesian framework, especially in the low SNR regime, where even weak a priori information is valuable. When estimating the parameters of a low-SNR transit, two key pieces of information are known: (i) the planet has the correct geometric alignment to transit and (ii) the transit event exhibits sufficient signal-to-noise to have been detected. These represent two forms of observational bias. Accordingly, when fitting transits, the model parameter priors should not follow the intrinsic distributions of said terms, but rather those of both the intrinsic distributions and the observational ...
Bayesian Inference for Radio Observations
Lochner, Michelle; Zwart, Jonathan T L; Smirnov, Oleg; Bassett, Bruce A; Oozeer, Nadeem; Kunz, Martin
2015-01-01
(Abridged) New telescopes like the Square Kilometre Array (SKA) will push into a new sensitivity regime and expose systematics, such as direction-dependent effects, that could previously be ignored. Current methods for handling such systematics rely on alternating best estimates of instrumental calibration and models of the underlying sky, which can lead to inaccurate uncertainty estimates and biased results because such methods ignore any correlations between parameters. These deconvolution algorithms produce a single image that is assumed to be a true representation of the sky, when in fact it is just one realisation of an infinite ensemble of images compatible with the noise in the data. In contrast, here we report a Bayesian formalism that simultaneously infers both systematics and science. Our technique, Bayesian Inference for Radio Observations (BIRO), determines all parameters directly from the raw data, bypassing image-making entirely, by sampling from the joint posterior probability distribution. Thi...
A Bayesian Nonparametric IRT Model
Karabatsos, George
2015-01-01
This paper introduces a flexible Bayesian nonparametric Item Response Theory (IRT) model, which applies to dichotomous or polytomous item responses, and which can apply to either unidimensional or multidimensional scaling. This is an infinite-mixture IRT model, with person ability and item difficulty parameters, and with a random intercept parameter that is assigned a mixing distribution, with mixing weights a probit function of other person and item parameters. As a result of its flexibility...
Elements of Bayesian experimental design
Energy Technology Data Exchange (ETDEWEB)
Sivia, D.S. [Rutherford Appleton Lab., Oxon (United Kingdom)
1997-09-01
We consider some elements of the Bayesian approach that are important for optimal experimental design. While the underlying principles used are very general, and are explained in detail in a recent tutorial text, they are applied here to the specific case of characterising the inferential value of different resolution peakshapes. This particular issue was considered earlier by Silver, Sivia and Pynn (1989, 1990a, 1990b), and the following presentation confirms and extends the conclusions of their analysis.
Bayesian Network--Response Regression
WANG, LU; Durante, Daniele; Dunson, David B.
2016-01-01
There is an increasing interest in learning how human brain networks vary with continuous traits (e.g., personality, cognitive abilities, neurological disorders), but flexible procedures to accomplish this goal are limited. We develop a Bayesian semiparametric model, which combines low-rank factorizations and Gaussian process priors to allow flexible shifts of the conditional expectation for a network-valued random variable across the feature space, while including subject-specific random eff...
Bayesian segmentation of hyperspectral images
Mohammadpour, Adel; Mohammad-Djafari, Ali
2007-01-01
In this paper we consider the problem of joint segmentation of hyperspectral images in the Bayesian framework. The proposed approach is based on a Hidden Markov Modeling (HMM) of the images with common segmentation, or equivalently with common hidden classification label variables which is modeled by a Potts Markov Random Field. We introduce an appropriate Markov Chain Monte Carlo (MCMC) algorithm to implement the method and show some simulation results.
Bayesian segmentation of hyperspectral images
Mohammadpour, Adel; Féron, Olivier; Mohammad-Djafari, Ali
2004-11-01
In this paper we consider the problem of joint segmentation of hyperspectral images in the Bayesian framework. The proposed approach is based on a Hidden Markov Modeling (HMM) of the images with common segmentation, or equivalently with common hidden classification label variables which is modeled by a Potts Markov Random Field. We introduce an appropriate Markov Chain Monte Carlo (MCMC) algorithm to implement the method and show some simulation results.
Bayesian analysis of contingency tables
Gómez Villegas, Miguel A.; González Pérez, Beatriz
2005-01-01
The display of the data by means of contingency tables is used in different approaches to statistical inference, for example, to broach the test of homogeneity of independent multinomial distributions. We develop a Bayesian procedure to test simple null hypotheses versus bilateral alternatives in contingency tables. Given independent samples of two binomial distributions and taking a mixed prior distribution, we calculate the posterior probability that the proportion of successes in the first...
Bayesian estimation of turbulent motion
Héas, P.; Herzet, C.; Mémin, E.; Heitz, D.; Mininni, P. D.
2013-01-01
Based on physical laws describing the multi-scale structure of turbulent flows, this article proposes a regularizer for fluid motion estimation from an image sequence. Regularization is achieved by imposing some scale invariance property between histograms of motion increments computed at different scales. By reformulating this problem from a Bayesian perspective, an algorithm is proposed to jointly estimate motion, regularization hyper-parameters, and to select the ...
Bayesian Kernel Mixtures for Counts
Canale, Antonio; Dunson, David B.
2011-01-01
Although Bayesian nonparametric mixture models for continuous data are well developed, there is a limited literature on related approaches for count data. A common strategy is to use a mixture of Poissons, which unfortunately is quite restrictive in not accounting for distributions having variance less than the mean. Other approaches include mixing multinomials, which requires finite support, and using a Dirichlet process prior with a Poisson base measure, which does not allow smooth deviatio...
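The restriction this abstract points out, that Poisson mixtures cannot be underdispersed, follows from the law of total variance: Var(Y) = E[λ] + Var(λ) ≥ E[Y]. A quick numeric check (an illustrative helper of my own, not from the paper):

```python
def poisson_mixture_moments(weights, rates):
    """Mean and variance of a finite mixture of Poissons:
    E[Y] = E[lambda] and Var(Y) = E[lambda] + Var(lambda) >= E[Y]."""
    m = sum(w * r for w, r in zip(weights, rates))               # E[lambda]
    var_lam = sum(w * r * r for w, r in zip(weights, rates)) - m * m
    return m, m + var_lam
```

For an equal-weight mixture of Poisson(2) and Poisson(6), the mean is 4 but the variance is 8, so the variance can never fall below the mean no matter how the mixture is chosen.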
Space Shuttle RTOS Bayesian Network
Morris, A. Terry; Beling, Peter A.
2001-01-01
With shrinking budgets and the requirements to increase reliability and operational life of the existing orbiter fleet, NASA has proposed various upgrades for the Space Shuttle that are consistent with national space policy. The cockpit avionics upgrade (CAU), a high priority item, has been selected as the next major upgrade. The primary functions of cockpit avionics include flight control, guidance and navigation, communication, and orbiter landing support. Secondary functions include the provision of operational services for non-avionics systems such as data handling for the payloads and caution and warning alerts to the crew. Recently, a process to select the optimal commercial-off-the-shelf (COTS) real-time operating system (RTOS) for the CAU was conducted by United Space Alliance (USA) Corporation, which is a joint venture between Boeing and Lockheed Martin, the prime contractor for space shuttle operations. In order to independently assess the RTOS selection, NASA has used the Bayesian network-based scoring methodology described in this paper. Our two-stage methodology addresses the issue of RTOS acceptability by incorporating functional, performance and non-functional software measures related to reliability, interoperability, certifiability, efficiency, correctness, business, legal, product history, cost and life cycle. The first stage of the methodology involves obtaining scores for the various measures using a Bayesian network. The Bayesian network incorporates the causal relationships between the various and often competing measures of interest while also assisting the inherently complex decision analysis process with its ability to reason under uncertainty. The structure and selection of prior probabilities for the network is extracted from experts in the field of real-time operating systems. Scores for the various measures are computed using Bayesian probability. In the second stage, multi-criteria trade-off analyses are performed between the scores
Bayesian second law of thermodynamics.
Bartolotta, Anthony; Carroll, Sean M; Leichenauer, Stefan; Pollack, Jason
2016-08-01
We derive a generalization of the second law of thermodynamics that uses Bayesian updates to explicitly incorporate the effects of a measurement of a system at some point in its evolution. By allowing an experimenter's knowledge to be updated by the measurement process, this formulation resolves a tension between the fact that the entropy of a statistical system can sometimes fluctuate downward and the information-theoretic idea that knowledge of a stochastically evolving system degrades over time. The Bayesian second law can be written as ΔH(ρ_{m},ρ)+〈Q〉_{F|m}≥0, where ΔH(ρ_{m},ρ) is the change in the cross entropy between the original phase-space probability distribution ρ and the measurement-updated distribution ρ_{m} and 〈Q〉_{F|m} is the expectation value of a generalized heat flow out of the system. We also derive refined versions of the second law that bound the entropy increase from below by a non-negative number, as well as Bayesian versions of integral fluctuation theorems. We demonstrate the formalism using simple analytical and numerical examples. PMID:27627241
Bayesian second law of thermodynamics
Bartolotta, Anthony; Carroll, Sean M.; Leichenauer, Stefan; Pollack, Jason
2016-08-01
We derive a generalization of the second law of thermodynamics that uses Bayesian updates to explicitly incorporate the effects of a measurement of a system at some point in its evolution. By allowing an experimenter's knowledge to be updated by the measurement process, this formulation resolves a tension between the fact that the entropy of a statistical system can sometimes fluctuate downward and the information-theoretic idea that knowledge of a stochastically evolving system degrades over time. The Bayesian second law can be written as ΔH(ρ_m, ρ) + ⟨Q⟩_{F|m} ≥ 0, where ΔH(ρ_m, ρ) is the change in the cross entropy between the original phase-space probability distribution ρ and the measurement-updated distribution ρ_m, and ⟨Q⟩_{F|m} is the expectation value of a generalized heat flow out of the system. We also derive refined versions of the second law that bound the entropy increase from below by a non-negative number, as well as Bayesian versions of integral fluctuation theorems. We demonstrate the formalism using simple analytical and numerical examples.
12th Brazilian Meeting on Bayesian Statistics
Louzada, Francisco; Rifo, Laura; Stern, Julio; Lauretto, Marcelo
2015-01-01
Through refereed papers, this volume focuses on the foundations of the Bayesian paradigm; their comparison to objectivistic or frequentist Statistics counterparts; and the appropriate application of Bayesian foundations. This research in Bayesian Statistics is applicable to data analysis in biostatistics, clinical trials, law, engineering, and the social sciences. EBEB, the Brazilian Meeting on Bayesian Statistics, is held every two years by the ISBrA, the International Society for Bayesian Analysis, one of the most active chapters of the ISBA. The 12th meeting took place March 10-14, 2014 in Atibaia. Interest in foundations of inductive Statistics has grown recently in accordance with the increasing availability of Bayesian methodological alternatives. Scientists need to deal with the ever more difficult choice of the optimal method to apply to their problem. This volume shows how Bayes can be the answer. The examination and discussion on the foundations work towards the goal of proper application of Bayesia...
Andree, Stefan; Reble, Carina; Helfmann, Jurgen; Gersonde, Ingo; Illing, Gerd
2010-01-01
We present a new variant of a noncontact, oblique incidence spatially resolved reflectance setup. The continuously variable source-detector separation enables adaptation to high and low albedo samples. The absorption (μa) and reduced scattering (μs′) coefficients are determined in the wavelength range of 400-1000 nm using a lookup table calculated by a Monte Carlo simulation of the light transport. The method is characterized by a silicone phantom study covering a wide parameter range, 0.01 mm⁻¹ ≤ μa ≤ 2.5 mm⁻¹ and 0.2 mm⁻¹ ≤ μs′ ≤ 10 mm⁻¹, which includes the optical parameters of tissue in the visible and near infrared. The influence of the incident angle and the detection aperture on the simulated remission was examined. Using perpendicular incidence and a 90-deg detection aperture in the Monte Carlo simulation, in contrast to the experimental situation with 30-deg incidence and a 4.6-deg detection aperture, is shown to be valid for the parameter range μs′ > 1 mm⁻¹ and μa … coefficient for increasing absorption can be the consequence of real physics instead of cross talk. PMID:21198213
Separation of multiple evoked responses using differential amplitude and latency variability
Knuth, Kevin H.; Truccolo, Wilson A.; Bressler, Steven L.; Ding, Mingzhou
2001-01-01
In neuroelectrophysiology one records electric potentials or magnetic fields generated by ensembles of synchronously active neurons in response to externally presented stimuli. These evoked responses are often produced by multiple generators in the presence of ongoing background activity. While source localization techniques or current source density estimation are usually used to identify generators, application of blind source separation techniques to obtain independent components has become more popular. We approach this problem by applying the Bayesian methodology to a more physiologically-realistic source model. As it is generally accepted that single trials vary in amplitude and latency, we incorporate this variability into the model. Rather than making the unrealistic assumption that these cortical components are independent of one another, our algorithm utilizes the differential amplitude and latency variability of the evoked waveforms to identify the cortical components. The algorithm is applied to i...
Compiling Relational Bayesian Networks for Exact Inference
DEFF Research Database (Denmark)
Jaeger, Manfred; Chavira, Mark; Darwiche, Adnan
2004-01-01
We describe a system for exact inference with relational Bayesian networks as defined in the publicly available \\primula\\ tool. The system is based on compiling propositional instances of relational Bayesian networks into arithmetic circuits and then performing online inference by evaluating...... and differentiating these circuits in time linear in their size. We report on experimental results showing the successful compilation, and efficient inference, on relational Bayesian networks whose {\\primula}--generated propositional instances have thousands of variables, and whose jointrees have clusters...
Bayesian Posterior Distributions Without Markov Chains
Cole, Stephen R.; Chu, Haitao; Greenland, Sander; Hamra, Ghassan; Richardson, David B.
2012-01-01
Bayesian posterior parameter distributions are often simulated using Markov chain Monte Carlo (MCMC) methods. However, MCMC methods are not always necessary and do not help the uninitiated understand Bayesian inference. As a bridge to understanding Bayesian inference, the authors illustrate a transparent rejection sampling method. In example 1, they illustrate rejection sampling using 36 cases and 198 controls from a case-control study (1976–1983) assessing the relation between residential ex...
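Rejection sampling of the kind this abstract uses as a teaching bridge can be sketched for the simplest case: a binomial likelihood with a uniform prior on the success probability. The function below is an illustrative reconstruction under those assumptions, not the authors' code, and the case/control counts are generic.

```python
import random

def rejection_sample_posterior(k, n, n_draws=2000, seed=0):
    """Rejection sampler for a binomial success probability with a uniform
    prior: draw theta ~ U(0,1) from the prior and accept with probability
    L(theta)/L_max, where L(theta) = theta^k (1-theta)^(n-k)."""
    rng = random.Random(seed)
    mode = k / n                             # the likelihood peaks at k/n
    l_max = mode**k * (1 - mode)**(n - k)
    samples = []
    while len(samples) < n_draws:
        theta = rng.random()                 # proposal from the prior
        if rng.random() < theta**k * (1 - theta)**(n - k) / l_max:
            samples.append(theta)
    return samples
```

The accepted draws follow the Beta(k+1, n−k+1) posterior, so for k=7, n=10 the sample mean should sit near (k+1)/(n+2) = 8/12 ≈ 0.667, which makes the method easy to check without any Markov chain machinery.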
Bayesian models for comparative analysis integrating phylogenetic uncertainty
Directory of Open Access Journals (Sweden)
Villemereuil Pierre de
2012-06-01
Background Uncertainty in comparative analyses can come from at least two sources: (a) phylogenetic uncertainty in the tree topology or branch lengths, and (b) uncertainty due to intraspecific variation in trait values, either due to measurement error or natural individual variation. Most phylogenetic comparative methods do not account for such uncertainties. Not accounting for these sources of uncertainty leads to false perceptions of precision (confidence intervals will be too narrow) and inflated significance in hypothesis testing (e.g. p-values will be too small). Although there is some application-specific software for fitting Bayesian models accounting for phylogenetic error, more general and flexible software is desirable. Methods We developed models to directly incorporate phylogenetic uncertainty into a range of analyses that biologists commonly perform, using a Bayesian framework and Markov Chain Monte Carlo analyses. Results We demonstrate applications in linear regression, quantification of phylogenetic signal, and measurement error models. Phylogenetic uncertainty was incorporated by applying a prior distribution for the phylogeny, where this distribution consisted of the posterior tree sets from Bayesian phylogenetic tree estimation programs. The models were analysed using simulated data sets, and applied to a real data set on plant traits, from rainforest plant species in Northern Australia. Analyses were performed using the free and open source software OpenBUGS and JAGS. Conclusions Incorporating phylogenetic uncertainty through an empirical prior distribution of trees leads to more precise estimation of regression model parameters than using a single consensus tree and enables a more realistic estimation of confidence intervals. In addition, models incorporating measurement errors and/or individual variation, in one or both variables, are easily formulated in the Bayesian framework. We show that BUGS is a useful, flexible
A Comparison of BBN, ADTree and MLP in separating Quasars from Large Survey Catalogues
Institute of Scientific and Technical Information of China (English)
Yan-Xia Zhang; Yong-Heng Zhao
2007-01-01
We compare the performance of Bayesian Belief Networks (BBN), Multilayer Perceptron (MLP) networks and Alternating Decision Trees (ADTree) in separating quasars from stars, using data from the 2MASS and FIRST survey catalogs. Given a training sample of sources of known object types, the classifiers are trained to separate quasars from stars. From the statistical properties of the sample, the features important for classification are selected. We compare the classification results with and without feature selection. Experiments show that the results with feature selection are better than those without. From the high accuracy achieved, it is concluded that these automated methods are robust and effective for classifying point sources. They may all be applied to large survey projects (e.g. selecting input catalogs) and to other astronomical problems, such as the parameter measurement of stars and the redshift estimation of galaxies and quasars.
Variational bayesian method of estimating variance components.
Arakawa, Aisaku; Taniguchi, Masaaki; Hayashi, Takeshi; Mikawa, Satoshi
2016-07-01
We developed a Bayesian analysis approach by using a variational inference method, a so-called variational Bayesian method, to determine the posterior distributions of variance components. This variational Bayesian method and an alternative Bayesian method using Gibbs sampling were compared in estimating genetic and residual variance components from both simulated data and publicly available real pig data. In the simulated data set, we observed strong bias toward overestimation of genetic variance for the variational Bayesian method in the case of low heritability and small population size, and less bias was detected with larger population sizes in both methods examined. No differences in the estimates of variance components between the variational Bayesian method and Gibbs sampling were found in the real pig data. However, the posterior distributions of the variance components obtained with the variational Bayesian method had shorter tails than those obtained with Gibbs sampling. Consequently, the posterior standard deviations of the genetic and residual variances from the variational Bayesian method were lower than those from Gibbs sampling. The computing time required was much shorter with the variational Bayesian method than with Gibbs sampling.
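The shorter posterior tails the abstract describes are a well-known property of factorized variational approximations. As a minimal sketch (not the authors' animal model: a single Gaussian with unknown mean and precision, factorized q(mu)q(tau), all variable names ours):

```python
import random

# Variational Bayes for a Gaussian with unknown mean mu and precision tau,
# factorized posterior q(mu) q(tau); conjugate priors
# mu ~ N(mu0, (lam0*tau)^-1), tau ~ Gamma(a0, b0).
def vb_normal(x, mu0=0.0, lam0=1e-3, a0=1e-3, b0=1e-3, iters=50):
    n = len(x)
    xbar = sum(x) / n
    sum_sq = sum(xi * xi for xi in x)
    e_tau = 1.0                                  # initial guess for E[tau]
    for _ in range(iters):
        # q(mu) = N(mu_n, 1/lam_n)
        mu_n = (lam0 * mu0 + n * xbar) / (lam0 + n)
        lam_n = (lam0 + n) * e_tau
        e_mu, e_mu2 = mu_n, mu_n * mu_n + 1.0 / lam_n
        # q(tau) = Gamma(a_n, b_n), using expectations under q(mu)
        a_n = a0 + (n + 1) / 2.0
        b_n = b0 + 0.5 * (sum_sq - 2 * e_mu * n * xbar + n * e_mu2
                          + lam0 * (e_mu2 - 2 * e_mu * mu0 + mu0 * mu0))
        e_tau = a_n / b_n
    return mu_n, a_n, b_n                        # E[mu] and Gamma params of tau

random.seed(0)
data = [random.gauss(5.0, 2.0) for _ in range(2000)]   # true precision is 0.25
mu_hat, a_n, b_n = vb_normal(data)
print(mu_hat, a_n / b_n)   # estimates of the mean and the precision
```

The coordinate updates converge in a handful of iterations; a Gibbs sampler for the same model would instead draw alternately from the full conditionals and typically report slightly wider posterior spreads.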
SYNTHESIZED EXPECTED BAYESIAN METHOD OF PARAMETRIC ESTIMATE
Institute of Scientific and Technical Information of China (English)
Ming HAN; Yuanyao DING
2004-01-01
This paper develops a new method of parametric estimation, named the "synthesized expected Bayesian method". When samples of products are tested and no failure events occur, the definition of the expected Bayesian estimate is introduced and estimates of the failure probability and failure rate are provided. After some failure information is introduced through an extra test, a synthesized expected Bayesian method is defined and used to estimate the failure probability, failure rate and other parameters of exponential and Weibull distributions of populations. Finally, calculations are performed on practical problems, which show that the synthesized expected Bayesian method is feasible and easy to operate.
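To make the zero-failure setting concrete, here is a hedged simplification (our own variant, not the paper's exact estimator): put a Beta(1, b) prior on the failure probability p, observe zero failures in n tests, and average the resulting Bayes estimate over a uniform hyperprior b ~ U(0, c), which has a closed form.

```python
import math

def bayes_estimate(n, b):
    """Posterior mean of p under a Beta(1, b) prior with 0 failures in n trials.

    Posterior is Beta(1, b + n), whose mean is 1 / (n + b + 1)."""
    return 1.0 / (n + b + 1.0)

def expected_bayes_estimate(n, c):
    """Average of bayes_estimate over b ~ Uniform(0, c), in closed form:
    (1/c) * ln((n + c + 1) / (n + 1))."""
    return math.log((n + c + 1.0) / (n + 1.0)) / c

# Check the closed form against naive midpoint integration over b.
n, c = 50, 5.0
numeric = sum(bayes_estimate(n, (k + 0.5) * c / 10000) for k in range(10000)) / 10000
print(expected_bayes_estimate(n, c), numeric)
```

The averaging over a hyperprior is what makes the estimate usable when no failures at all have been observed, since any single fixed prior choice would dominate the answer.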
Directory of Open Access Journals (Sweden)
Futoshi Asano
2004-09-01
Full Text Available A method of detecting speech events in a multiple-sound-source condition using audio and video information is proposed. For detecting speech events, sound localization using a microphone array and human tracking by stereo vision are combined by a Bayesian network. From the inference results of the Bayesian network, the time and location of speech events can be obtained. The information on the detected speech events is then utilized in a robust speech interface. A maximum likelihood adaptive beamformer is employed as a preprocessor of the speech recognizer to separate the speech signal from environmental noise. The coefficients of the beamformer are kept updated based on the information of the speech events. The information on the speech events is also used by the speech recognizer for extracting the speech segment.
Dust SEDs in the era of Herschel and Planck: a Hierarchical Bayesian fitting technique
Kelly, Brandon C; Stutz, Amelia M; Kauffmann, Jens; Goodman, Alyssa A; Launhardt, Ralf
2012-01-01
We present a hierarchical Bayesian method for fitting infrared spectral energy distributions (SEDs) of dust emission to observed fluxes. Under the standard assumption of optically thin, single-temperature (T) sources, the dust SED, represented by a power-law modified black body, is subject to a strong degeneracy between T and the spectral index beta. Traditional non-hierarchical approaches, typically based on chi-square minimization, are severely limited by this degeneracy, as it produces an artificial anti-correlation between T and beta even with modest levels of observational noise. The hierarchical Bayesian method rigorously and self-consistently treats measurement uncertainties, including calibration and noise, resulting in more precise SED fits. As a result, the Bayesian fits do not produce any spurious anti-correlations between the SED parameters due to measurement uncertainty. We demonstrate that the Bayesian method is substantially more accurate than the chi-square fit in recovering the SED paramet...
Application of Wavelet Denoising Algorithm in Noisy Blind Source Separation
Institute of Scientific and Technical Information of China (English)
吴微; 彭华; 王彬
2015-01-01
Blind source separation (BSS) algorithms based on the noise-free model are not applicable when the SNR is low. One way to deal with this issue is to first denoise the mixtures corrupted by white Gaussian noise and then apply the BSS algorithms. Therefore, a WaveShrink algorithm based on translation invariance is proposed to denoise mixtures with strong noise. A sliding-window method over the high-frequency coefficients is used to estimate the noise variance accurately, and the BayesShrink algorithm is used to obtain a more reasonable threshold. Consequently, the range of the translation invariants is narrowed without degrading the denoising performance, thus reducing the computational load. Simulation results indicate that the proposed approach performs better in denoising than the traditional WaveShrink algorithm and can remarkably enhance the separation performance of BSS algorithms, especially at low SNRs.
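The BayesShrink threshold itself is compact enough to sketch. In this illustrative version (not the paper's sliding-window estimator: noise is estimated with the standard median-absolute-deviation rule, and the input is a list of detail coefficients assumed to come from some wavelet transform):

```python
import math
from statistics import median

def bayes_shrink_threshold(detail):
    """BayesShrink threshold T = sigma^2 / sigma_x for one detail subband."""
    sigma = median(abs(d) for d in detail) / 0.6745      # noise std via MAD
    var_y = sum(d * d for d in detail) / len(detail)     # variance of noisy coeffs
    var_x = max(var_y - sigma * sigma, 0.0)              # estimated signal variance
    if var_x == 0.0:
        return max(abs(d) for d in detail)               # pure noise: kill everything
    return sigma * sigma / math.sqrt(var_x)

def soft_threshold(detail, t):
    """Shrink each coefficient toward zero by t (soft thresholding)."""
    return [math.copysign(max(abs(d) - t, 0.0), d) for d in detail]

coeffs = [0.1, -0.05, 3.0, 0.08, -2.5, 0.02, 0.12, -0.07]  # toy detail band
t = bayes_shrink_threshold(coeffs)
print(t, soft_threshold(coeffs, t))
```

Small coefficients (mostly noise) are driven toward zero while the large, signal-bearing ones survive nearly intact, which is what makes the denoised mixtures usable as input to a BSS algorithm.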
Objective Bayesian analysis of "on/off" measurements
Casadei, Diego
2014-01-01
In high-energy astrophysics, it is common practice to account for the background overlaid with the counts from the source of interest with the help of auxiliary measurements carried out by pointing off-source. In this "on/off" measurement, one knows the number of photons detected while pointing at the source, the number of photons collected while pointing away from the source, and how to estimate the background counts in the source region from the flux observed in the auxiliary measurements. For very faint sources, the number of detected photons is so low that the approximations which hold asymptotically are not valid. On the other hand, an analytical solution exists for the Bayesian statistical inference, which is valid at both low and high counts. The Bayesian approach to statistical inference provides a probability distribution describing our degree of belief in the possible values of the parameter of interest, over its entire range. In addition to the specification of the model, in this case assumed to obey Po...
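The on/off posterior can also be obtained numerically, which makes the structure of the inference easy to see. A grid-approximation sketch (our own toy setup with flat priors, not the paper's analytic solution): n_on photons in the source region, n_off in an off-source region with exposure ratio alpha, marginalizing over the background rate b.

```python
import math

def log_poisson(n, mu):
    """Log of the Poisson pmf; mu is assumed positive."""
    return n * math.log(mu) - mu - math.lgamma(n + 1)

def signal_posterior(n_on, n_off, alpha, s_grid, b_grid):
    """Posterior over the signal rate s, marginalized over background b on a grid."""
    post = []
    for s in s_grid:
        acc = 0.0
        for b in b_grid:
            # on-region sees s + b; off-region (alpha times larger) sees alpha * b
            acc += math.exp(log_poisson(n_on, s + b) + log_poisson(n_off, alpha * b))
        post.append(acc)
    z = sum(post)
    return [p / z for p in post]

s_grid = [0.05 * i for i in range(1, 400)]
b_grid = [0.05 * i for i in range(1, 400)]
post = signal_posterior(n_on=9, n_off=12, alpha=3.0, s_grid=s_grid, b_grid=b_grid)
mean_s = sum(s * p for s, p in zip(s_grid, post))
print(mean_s)   # posterior mean of the source rate
```

With 12 off-source counts over three times the exposure, the expected background in the on-region is about 4, so the posterior mass for the signal concentrates around n_on minus that, without ever going negative, which is exactly where the asymptotic approximations break down at low counts.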
Applying Bayesian belief networks in rapid response situations
Energy Technology Data Exchange (ETDEWEB)
Gibson, William L [Los Alamos National Laboratory]; Leishman, Deborah A. [Los Alamos National Laboratory]; Van Eeckhout, Edward [Los Alamos National Laboratory]
2008-01-01
The authors have developed an enhanced Bayesian analysis tool called the Integrated Knowledge Engine (IKE) for monitoring and surveillance. The enhancements are suited for Rapid Response Situations where decisions must be made based on uncertain and incomplete evidence from many diverse and heterogeneous sources. The enhancements extend the probabilistic results of the traditional Bayesian analysis by (1) better quantifying uncertainty arising from model parameter uncertainty and uncertain evidence, (2) optimizing the collection of evidence to reach conclusions more quickly, and (3) allowing the analyst to determine the influence of the remaining evidence that cannot be obtained in the time allowed. These extended features give the analyst and decision maker a better comprehension of the adequacy of the acquired evidence and hence the quality of the hurried decisions. They also describe two example systems where the above features are highlighted.
Background subtraction and transient timing with Bayesian Blocks
Worpel, Hauke
2015-01-01
Aims: To incorporate background subtraction into the Bayesian Blocks algorithm so that transient events can be timed accurately and precisely even in the presence of a substantial, rapidly variable, background. Methods: We developed several modifications to the algorithm and tested them on a simulated XMM-Newton observation of a bursting and eclipsing object. Results: We found that bursts can be found to good precision for almost all background subtraction methods, but eclipse ingresses and egresses present problems for most methods. We found one method that recovered these events with precision comparable to the interval between individual photons, in which both source and background region photons are combined into a single list and weighted according to the exposure area. We have also found that adjusting the Bayesian Blocks change points nearer to blocks with higher count rate removes a systematic bias towards blocks of low count rate.
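For reference, the core dynamic program underlying Bayesian Blocks is short. A minimal sketch for event (time-tagged) data after Scargle et al. (2013), without the background weighting that the paper above adds; the prior constant and the toy event stream are our choices:

```python
import math, random

def bayesian_blocks(times, ncp_prior=6.0):
    """Optimal piecewise-constant segmentation of event times (event-data fitness)."""
    t = sorted(times)
    n = len(t)
    # cell edges: midpoints between consecutive events, padded by the data range
    edges = [t[0]] + [(t[i] + t[i + 1]) / 2 for i in range(n - 1)] + [t[-1]]
    best = [0.0] * n
    last = [0] * n
    for k in range(n):
        f_best, r_best = -math.inf, 0
        for r in range(k + 1):
            width = edges[k + 1] - edges[r]         # length of candidate last block
            cnt = k - r + 1                          # events in that block
            fit = cnt * (math.log(cnt) - math.log(width)) - ncp_prior
            if r > 0:
                fit += best[r - 1]
            if fit > f_best:
                f_best, r_best = fit, r
        best[k], last[k] = f_best, r_best
    # backtrack the optimal change points
    cps = []
    k = n
    while k > 0:
        cps.append(last[k - 1])
        k = last[k - 1]
    return [edges[i] for i in reversed(cps)] + [edges[-1]]

random.seed(1)
events = [random.uniform(0, 10) for _ in range(50)] + \
         [random.uniform(10, 12) for _ in range(150)]   # rate jump at t = 10
blocks = bayesian_blocks(events)
print(blocks)   # block edges; an interior edge should land near the jump
```

The paper's modification amounts to changing what enters this fitness function: combining source and background photons into one weighted list so the same dynamic program times transients against a variable background.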
bgc: Software for Bayesian estimation of genomic clines.
Gompert, Z; Buerkle, C A
2012-11-01
Introgression in admixed populations can be used to identify candidate loci that might underlie adaptation or reproductive isolation. The Bayesian genomic cline model provides a framework for quantifying variable introgression in admixed populations and identifying regions of the genome with extreme introgression that are potentially associated with variation in fitness. Here we describe the bgc software, which uses Markov chain Monte Carlo to estimate the joint posterior probability distribution of the parameters in the Bayesian genomic cline model and designate outlier loci. This software can be used with next-generation sequence data, accounts for uncertainty in genotypic state, and can incorporate information from linked loci on a genetic map. Output from the analysis is written to an HDF5 file for efficient storage and manipulation. This software is written in C++. The source code, software manual, compilation instructions and example data sets are available under the GNU Public License at http://sites.google.com/site/bgcsoftware/. PMID:22978657
Bayesian Methods and Universal Darwinism
Campbell, John
2009-12-01
Bayesian methods since the time of Laplace have been understood by their practitioners as closely aligned to the scientific method. Indeed a recent champion of Bayesian methods, E. T. Jaynes, titled his textbook on the subject Probability Theory: The Logic of Science. Many philosophers of science including Karl Popper and Donald Campbell have interpreted the evolution of science as a Darwinian process consisting of a `copy with selective retention' algorithm abstracted from Darwin's theory of Natural Selection. Arguments are presented for an isomorphism between Bayesian methods and Darwinian processes. Universal Darwinism, as the term has been developed by Richard Dawkins, Daniel Dennett and Susan Blackmore, is the collection of scientific theories which explain the creation and evolution of their subject matter as due to the operation of Darwinian processes. These subject matters span the fields of atomic physics, chemistry, biology and the social sciences. The principle of Maximum Entropy states that systems will evolve to states of highest entropy subject to the constraints of scientific law. This principle may be inverted to provide illumination as to the nature of scientific law. Our best cosmological theories suggest the universe contained much less complexity during the period shortly after the Big Bang than it does at present. The scientific subject matter of atomic physics, chemistry, biology and the social sciences has been created since that time. An explanation is proposed for the existence of this subject matter as due to the evolution of constraints in the form of adaptations imposed on Maximum Entropy. It is argued these adaptations were discovered and instantiated through the operations of a succession of Darwinian processes.
Bayesian phylogeography finds its roots.
Directory of Open Access Journals (Sweden)
Philippe Lemey
2009-09-01
Full Text Available As a key factor in endemic and epidemic dynamics, the geographical distribution of viruses has been frequently interpreted in the light of their genetic histories. Unfortunately, inference of historical dispersal or migration patterns of viruses has mainly been restricted to model-free heuristic approaches that provide little insight into the temporal setting of the spatial dynamics. The introduction of probabilistic models of evolution, however, offers unique opportunities to engage in this statistical endeavor. Here we introduce a Bayesian framework for inference, visualization and hypothesis testing of phylogeographic history. By implementing character mapping in Bayesian software that samples time-scaled phylogenies, we enable the reconstruction of timed viral dispersal patterns while accommodating phylogenetic uncertainty. Standard Markov model inference is extended with a stochastic search variable selection procedure that identifies the parsimonious descriptions of the diffusion process. In addition, we propose priors that can incorporate geographical sampling distributions or characterize alternative hypotheses about the spatial dynamics. To visualize the spatial and temporal information, we summarize inferences using virtual globe software. We describe how Bayesian phylogeography compares with previous parsimony analysis in the investigation of the influenza A H5N1 origin and H5N1 epidemiological linkage among sampling localities. Analysis of rabies in West African dog populations reveals how virus diffusion may enable endemic maintenance through continuous epidemic cycles. From these analyses, we conclude that our phylogeographic framework will be an important asset in molecular epidemiology that can be easily generalized to infer biogeography from genetic data for many organisms.
Wei, Chang; Jerabek, Elihu Calvin; LeBlanc, Jr., Oliver Harris
2001-03-06
An ultracapacitor includes two solid, nonporous current collectors, two porous electrodes separating the collectors, a porous separator between the electrodes and an electrolyte occupying the pores in the electrodes and separator. The electrolyte is a polar aprotic organic solvent and a salt. The porous separator comprises a wet laid cellulosic material.
Bayesian reasoning with ifs and ands and ors
Directory of Open Access Journals (Sweden)
Nicole eCruz
2015-02-01
Full Text Available The Bayesian approach to the psychology of reasoning generalizes binary logic, extending the binary concept of consistency to that of coherence, and allowing the study of deductive reasoning from uncertain premises. Studies in judgment and decision making have found that people's probability judgments can fail to be coherent. We investigated people's coherence further for judgments about conjunctions, disjunctions and conditionals, and asked whether their coherence would increase when they were given the explicit task of drawing inferences. Participants gave confidence judgments about a list of separate statements (the statements group) or the statements grouped as explicit inferences (the inferences group). Their responses were generally coherent at above chance levels for all the inferences investigated, regardless of the presence of an explicit inference task. An exception was that they were incoherent in the context known to cause the conjunction fallacy, and remained so even when they were given an explicit inference. The participants were coherent under the assumption that they interpreted the natural language conditional as it is represented in Bayesian accounts of conditional reasoning, but they were incoherent under the assumption that they interpreted the natural language conditional as the material conditional of elementary binary logic. Our results provide further support for the descriptive adequacy of Bayesian reasoning principles in the study of deduction under uncertainty.
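The coherence constraints at issue have a simple probabilistic form: a set of judgments is coherent only if some single joint distribution could produce them. A tiny illustration (example values are ours, not the study's items):

```python
def conjunction_coherent(p_a, p_b, p_and):
    """P(A and B) must lie between max(0, P(A)+P(B)-1) and min(P(A), P(B))."""
    return max(0.0, p_a + p_b - 1.0) <= p_and <= min(p_a, p_b)

def disjunction_coherent(p_a, p_b, p_or):
    """P(A or B) must lie between max(P(A), P(B)) and min(1, P(A)+P(B))."""
    return max(p_a, p_b) <= p_or <= min(1.0, p_a + p_b)

# The conjunction fallacy: judging the conjunction as more probable than one
# of its conjuncts violates the upper bound min(P(A), P(B)).
print(conjunction_coherent(0.3, 0.8, 0.5))   # incoherent: 0.5 > min(0.3, 0.8)
print(conjunction_coherent(0.3, 0.8, 0.25))  # coherent
```

These interval bounds are exactly what "coherent at above chance levels" is measured against in studies of this kind.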
Bayesian reasoning with ifs and ands and ors.
Cruz, Nicole; Baratgin, Jean; Oaksford, Mike; Over, David E
2015-01-01
The Bayesian approach to the psychology of reasoning generalizes binary logic, extending the binary concept of consistency to that of coherence, and allowing the study of deductive reasoning from uncertain premises. Studies in judgment and decision making have found that people's probability judgments can fail to be coherent. We investigated people's coherence further for judgments about conjunctions, disjunctions and conditionals, and asked whether their coherence would increase when they were given the explicit task of drawing inferences. Participants gave confidence judgments about a list of separate statements (the statements group) or the statements grouped as explicit inferences (the inferences group). Their responses were generally coherent at above chance levels for all the inferences investigated, regardless of the presence of an explicit inference task. An exception was that they were incoherent in the context known to cause the conjunction fallacy, and remained so even when they were given an explicit inference. The participants were coherent under the assumption that they interpreted the natural language conditional as it is represented in Bayesian accounts of conditional reasoning, but they were incoherent under the assumption that they interpreted the natural language conditional as the material conditional of elementary binary logic. Our results provide further support for the descriptive adequacy of Bayesian reasoning principles in the study of deduction under uncertainty. PMID:25762965
Bayesian decision making in human collectives with binary choices.
Eguíluz, Víctor M; Masuda, Naoki; Fernández-Gracia, Juan
2015-01-01
Here we focus on the description of the mechanisms behind the process of information aggregation and decision making, a basic step to understand emergent phenomena in society, such as trends, information spreading or the wisdom of crowds. In many situations, agents choose between discrete options. We analyze experimental data on binary opinion choices in humans. The data consists of two separate experiments in which humans answer questions with a binary response, where one is correct and the other is incorrect. The questions are answered without and with information on the answers of some previous participants. We find that a Bayesian approach captures the probability of choosing one of the answers. The influence of peers is uncorrelated with the difficulty of the question. The data is inconsistent with Weber's law, which states that the probability of choosing an option depends on the proportion of previous answers choosing that option and not on the total number of those answers. Last, the present Bayesian model fits reasonably well to the data as compared to some other previously proposed functions, although the latter sometimes perform slightly better than the Bayesian model. The strength of the present model is its simplicity and mechanistic explanation of the behavior. PMID:25867176
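The contrast with Weber's law can be made concrete with a toy formalization (ours, not the paper's fitted model): an agent holds prior p that option A is correct and sees k of m previous answers favouring A, treating each previous answer as independently correct with probability q.

```python
def posterior_a(prior, k, m, q):
    """Posterior that A is correct after seeing k of m votes for A."""
    # Likelihood of the vote pattern if A is correct vs. if A is incorrect;
    # the binomial coefficient cancels between the two.
    like_a = (q ** k) * ((1 - q) ** (m - k))
    like_b = ((1 - q) ** k) * (q ** (m - k))
    return prior * like_a / (prior * like_a + (1 - prior) * like_b)

# The Bayesian posterior depends on the vote margin k - (m - k), not only on
# the proportion k / m -- unlike the Weber-law description the data reject.
print(posterior_a(0.5, 6, 10, 0.6))    # 6-of-10 majority
print(posterior_a(0.5, 60, 100, 0.6))  # same proportion, much larger margin
```

Both calls use the same 60% proportion, yet the second produces a far more confident posterior, which is precisely the total-number dependence that distinguishes the Bayesian account from Weber's law.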
Bayesian decision making in human collectives with binary choices.
Directory of Open Access Journals (Sweden)
Víctor M Eguíluz
Full Text Available Here we focus on the description of the mechanisms behind the process of information aggregation and decision making, a basic step to understand emergent phenomena in society, such as trends, information spreading or the wisdom of crowds. In many situations, agents choose between discrete options. We analyze experimental data on binary opinion choices in humans. The data consists of two separate experiments in which humans answer questions with a binary response, where one is correct and the other is incorrect. The questions are answered without and with information on the answers of some previous participants. We find that a Bayesian approach captures the probability of choosing one of the answers. The influence of peers is uncorrelated with the difficulty of the question. The data is inconsistent with Weber's law, which states that the probability of choosing an option depends on the proportion of previous answers choosing that option and not on the total number of those answers. Last, the present Bayesian model fits reasonably well to the data as compared to some other previously proposed functions, although the latter sometimes perform slightly better than the Bayesian model. The strength of the present model is its simplicity and mechanistic explanation of the behavior.
Bayesian Concordance Correlation Coefficient with Application to Repeatedly Measured Data
Directory of Open Access Journals (Sweden)
Atanu BHATTACHARJEE
2015-10-01
Full Text Available Objective: In medical research, Lin's classical concordance correlation coefficient (CCC) is frequently applied to evaluate the similarity of the measurements produced by different raters or methods on the same subjects. It is particularly useful for continuous data. The objective of this paper is to propose the Bayesian counterpart to compute the CCC for continuous data. Material and Methods: A total of 33 patients with brain astrocytoma treated in the Department of Radiation Oncology at Malabar Cancer Centre were enrolled in this work. The data are continuous measurements of tumor volume and tumor size, repeatedly taken during baseline pretreatment workup and post-surgery follow-ups for all patients. The tumor volume and tumor size are measured separately by MRI and CT scan. The agreement of measurement between MRI and CT scan is calculated through the CCC. The statistical inference is performed through the Markov Chain Monte Carlo (MCMC) technique. Results: The Bayesian CCC is found suitable for obtaining prominent evidence for the test statistics to explore the relation between concordance measurements. The posterior mean estimates and 95% credible intervals of the CCC on tumor size and tumor volume are 0.96 (0.87, 0.99) and 0.98 (0.95, 0.99), respectively. Conclusion: Bayesian inference is adopted for the development of the computational algorithm. The approach illustrated in this work provides researchers an opportunity to find the most appropriate model for specific data and apply the CCC to fulfill the desired hypothesis.
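The quantity the Bayesian version puts a posterior on is Lin's sample CCC, which is straightforward to compute from paired measurements. A sketch with hypothetical paired readings (the numbers are illustrative, not the study's data):

```python
from statistics import mean

def ccc(x, y):
    """Lin's concordance correlation coefficient for paired measurements."""
    mx, my = mean(x), mean(y)
    sx = sum((a - mx) * (a - mx) for a in x) / len(x)    # variance of x
    sy = sum((b - my) * (b - my) for b in y) / len(y)    # variance of y
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y)) / len(x)
    # CCC penalizes both poor correlation and location/scale shifts
    return 2 * sxy / (sx + sy + (mx - my) ** 2)

mri = [10.1, 12.3, 8.7, 15.2, 11.0]
ct  = [10.4, 12.0, 9.1, 14.8, 11.3]   # hypothetical paired tumor sizes
print(ccc(mri, ct))    # close to 1 indicates strong agreement
print(ccc(mri, mri))   # identical measurements give 1
```

Unlike Pearson's correlation, the CCC drops below 1 if one method is systematically offset or scaled relative to the other, even when the two are perfectly correlated.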
Separating Underdetermined Convolutive Speech Mixtures
DEFF Research Database (Denmark)
Pedersen, Michael Syskind; Wang, DeLiang; Larsen, Jan;
2006-01-01
a method for underdetermined blind source separation of convolutive mixtures. The proposed framework is applicable for separation of instantaneous as well as convolutive speech mixtures. It is possible to iteratively extract each speech signal from the mixture by combining blind source separation...
Brenna, Marco; Németh, Károly; Cronin, Shane J.; Sohn, Young Kwan; Smith, Ian E. M.; Wijbrans, Jan
2015-05-01
New eruptions in monogenetic volcanic fields conceptually occur independently of previous ones. In some instances, however, younger volcanic structures and vents may overlap with older edifices. The genetic links between such co-located eruptions remain unclear. We mapped and analysed the stratigraphic relationships between eruptive units on the 400 × 900-m island of Chagwido off the western coast of Jeju Island, a Pleistocene to Holocene intraplate volcanic field. Chagwido consists of an eastern, older tuff ring with a nested scoria cone and a western tuff, scoria and lava flow sequence. The two stratigraphic packages are separated by a prominent paleosol. The East-Chagwido tuff and scoria deposits were eroded and a period of intense weathering and soil development occurred, before a subsequent West-Chagwido tuff ring and scoria cone and lava complex was erupted. The two eruptions were fed by three chemically distinct magmas. The older eastern eruption consists of magma with composition transitional between high-Al alkalic basalt and low-Al alkalic basalt and has stratigraphic characteristics, composition and syn-eruptive trends akin to the neighbouring Dangsanbong tuff cone. This magma type is typical for the transitional stage from high-Al alkalic (pre 500 ka) to low-Al alkalic (post 250 ka) identified for the greater Jeju volcanic system. The East-Chagwido volcanic complex thus formed as the westernmost in a chain of three volcanoes along a fissure system, with a small volcanic remnant island Wado 1 km to the east and the large Dangsanbong tuff cone another 1 km eastward. A new Ar/Ar age of 446 ± 22 ka for Dangsanbong likely characterizes the age of the whole chain. The second, West-Chagwido eruption started with low-Al alkalic basalt forming a phreatomagmatic phase and ended with subalkalic basalt forming a scoria cone and lava flows. The occurrence of subalkalic lavas is known across Jeju to have started only at ~250 ka, and thus, the well
Numeracy, frequency, and Bayesian reasoning
Directory of Open Access Journals (Sweden)
Gretchen B. Chapman
2009-02-01
Full Text Available Previous research has demonstrated that Bayesian reasoning performance is improved if uncertainty information is presented as natural frequencies rather than single-event probabilities. A questionnaire study of 342 college students replicated this effect but also found that the performance-boosting benefits of the natural frequency presentation occurred primarily for participants who scored high in numeracy. This finding suggests that even comprehension and manipulation of natural frequencies requires a certain threshold of numeracy abilities, and that the beneficial effects of natural frequency presentation may not be as general as previously believed.
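The two presentation formats compute the same posterior; the natural-frequency version just replaces probability arithmetic with counting. A side-by-side sketch using a standard textbook-style screening example (the numbers are ours, not from the study):

```python
# Single-event probability format: base rate 1%, hit rate 80%,
# false-positive rate 9.6%.
p_d, p_pos_d, p_pos_nd = 0.01, 0.80, 0.096
posterior_prob = (p_d * p_pos_d) / (p_d * p_pos_d + (1 - p_d) * p_pos_nd)

# Natural frequency format: "out of 1000 people, 10 have the disease,
# 8 of them test positive, and about 95 of the 990 healthy people also
# test positive" -- the answer is just a ratio of counts.
with_disease_pos = 8
without_disease_pos = round(990 * 0.096)   # about 95 people
posterior_freq = with_disease_pos / (with_disease_pos + without_disease_pos)

print(posterior_prob, posterior_freq)   # both near 0.078
```

The counting version requires no explicit application of Bayes' theorem, which is the usual explanation for why natural frequencies help, though as the study above shows, the benefit still presupposes a threshold of numeracy.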
Bayesian Query-Focused Summarization
Daumé, Hal
2009-01-01
We present BayeSum (for ``Bayesian summarization''), a model for sentence extraction in query-focused summarization. BayeSum leverages the common case in which multiple documents are relevant to a single query. Using these documents as reinforcement for query terms, BayeSum is not afflicted by the paucity of information in short queries. We show that approximate inference in BayeSum is possible on large data sets and results in a state-of-the-art summarization system. Furthermore, we show how BayeSum can be understood as a justified query expansion technique in the language modeling for IR framework.
Bayesian Sampling using Condition Indicators
DEFF Research Database (Denmark)
Faber, Michael H.; Sørensen, John Dalsgaard
2002-01-01
The problem of controlling the quality of components is considered for the special case where the acceptable failure rate is low, the test costs are high and where it may be difficult or impossible to test the condition of interest directly. Based on the classical control theory and the concept of condition indicators introduced by Benjamin and Cornell (1970), a Bayesian approach to quality control is formulated. The formulation is then extended to the case where the quality control is based on sampling of indirect information about the condition of the components, i.e. condition indicators...
Seeded Bayesian Networks: Constructing genetic networks from microarray data
Directory of Open Access Journals (Sweden)
Quackenbush John
2008-07-01
Full Text Available Abstract Background DNA microarrays and other genomics-inspired technologies provide large datasets that often include hidden patterns of correlation between genes reflecting the complex processes that underlie cellular metabolism and physiology. The challenge in analyzing large-scale expression data has been to extract biologically meaningful inferences regarding these processes – often represented as networks – in an environment where the datasets are often imperfect and biological noise can obscure the actual signal. Although many techniques have been developed in an attempt to address these issues, to date their ability to extract meaningful and predictive network relationships has been limited. Here we describe a method that draws on prior information about gene-gene interactions to infer biologically relevant pathways from microarray data. Our approach consists of using preliminary networks derived from the literature and/or protein-protein interaction data as seeds for a Bayesian network analysis of microarray results. Results Through a bootstrap analysis of gene expression data derived from a number of leukemia studies, we demonstrate that seeded Bayesian Networks have the ability to identify high-confidence gene-gene interactions which can then be validated by comparison to other sources of pathway data. Conclusion The use of network seeds greatly improves the ability of Bayesian Network analysis to learn gene interaction networks from gene expression data. We demonstrate that the use of seeds derived from the biomedical literature or high-throughput protein-protein interaction data, or the combination, provides improvement over a standard Bayesian Network analysis, allowing networks involving dynamic processes to be deduced from the static snapshots of biological systems that represent the most common source of microarray data. Software implementing these methods has been included in the widely used TM4 microarray analysis package.
BEAST: Bayesian evolutionary analysis by sampling trees
Directory of Open Access Journals (Sweden)
Drummond Alexei J
2007-11-01
Full Text Available Abstract Background The evolutionary analysis of molecular sequence variation is a statistical enterprise. This is reflected in the increased use of probabilistic models for phylogenetic inference, multiple sequence alignment, and molecular population genetics. Here we present BEAST: a fast, flexible software architecture for Bayesian analysis of molecular sequences related by an evolutionary tree. A large number of popular stochastic models of sequence evolution are provided and tree-based models suitable for both within- and between-species sequence data are implemented. Results BEAST version 1.4.6 consists of 81000 lines of Java source code, 779 classes and 81 packages. It provides models for DNA and protein sequence evolution, highly parametric coalescent analysis, relaxed clock phylogenetics, non-contemporaneous sequence data, statistical alignment and a wide range of options for prior distributions. BEAST source code is object-oriented, modular in design and freely available at http://beast-mcmc.googlecode.com/ under the GNU LGPL license. Conclusion BEAST is a powerful and flexible evolutionary analysis package for molecular sequence variation. It also provides a resource for the further development of new models and statistical methods of evolutionary analysis.
Using Bayesian Networks to Improve Knowledge Assessment
Millan, Eva; Descalco, Luis; Castillo, Gladys; Oliveira, Paula; Diogo, Sandra
2013-01-01
In this paper, we describe the integration and evaluation of an existing generic Bayesian student model (GBSM) into an existing computerized testing system within the Mathematics Education Project (PmatE--Projecto Matematica Ensino) of the University of Aveiro. This generic Bayesian student model had been previously evaluated with simulated…
Bayesian credible interval construction for Poisson statistics
Institute of Scientific and Technical Information of China (English)
ZHU Yong-Sheng
2008-01-01
The construction of the Bayesian credible (confidence) interval for a Poisson observable including both the signal and background, with and without systematic uncertainties, is presented. Introducing the conditional probability satisfying the requirement that the background not be larger than the observed events to construct the Bayesian credible interval is also discussed. A Fortran routine, BPOCI, has been developed to implement the calculation.
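The simplest case of such a construction is easy to sketch numerically (our illustration with a flat prior and no background or systematics, not the BPOCI routine itself): for n observed counts the posterior for the Poisson mean is Gamma(n + 1, 1), and a central credible interval can be read off a grid over the posterior.

```python
import math

def poisson_posterior(n, mu):
    """Gamma(n+1, 1) density: the flat-prior posterior for the Poisson mean."""
    return math.exp(n * math.log(mu) - mu - math.lgamma(n + 1))

def credible_interval(n, level=0.68, dmu=0.001, mu_max=100.0):
    """Central credible interval for the mean, by accumulating posterior mass."""
    grid = [(i + 0.5) * dmu for i in range(int(mu_max / dmu))]
    dens = [poisson_posterior(n, mu) * dmu for mu in grid]
    tail = (1.0 - level) / 2.0
    cum, lo, hi = 0.0, None, None
    for mu, d in zip(grid, dens):
        cum += d
        if lo is None and cum >= tail:
            lo = mu
        if hi is None and cum >= 1.0 - tail:
            hi = mu
    return lo, hi

lo, hi = credible_interval(n=5)
print(lo, hi)   # central 68% interval for the mean given 5 observed counts
```

Adding a known background b shifts the model to Poisson(n; s + b) with the constraint s >= 0, which is where the conditional-probability construction discussed in the abstract comes in.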
Modeling Diagnostic Assessments with Bayesian Networks
Almond, Russell G.; DiBello, Louis V.; Moulder, Brad; Zapata-Rivera, Juan-Diego
2007-01-01
This paper defines Bayesian network models and examines their applications to IRT-based cognitive diagnostic modeling. These models are especially suited to building inference engines designed to be synchronous with the finer grained student models that arise in skills diagnostic assessment. Aspects of the theory and use of Bayesian network models…
Advances in Bayesian Modeling in Educational Research
Levy, Roy
2016-01-01
In this article, I provide a conceptually oriented overview of Bayesian approaches to statistical inference and contrast them with frequentist approaches that currently dominate conventional practice in educational research. The features and advantages of Bayesian approaches are illustrated with examples spanning several statistical modeling…