Informed Source Separation: A Bayesian Tutorial
Knuth, Kevin
2013-01-01
Source separation problems are ubiquitous in the physical sciences; any situation where signals are superimposed calls for source separation to estimate the original signals. In this tutorial I will discuss the Bayesian approach to the source separation problem. This approach has a specific advantage in that it requires the designer to explicitly describe the signal model in addition to any other information or assumptions that go into the problem description. This leads naturally to the idea...
Low Complexity Bayesian Single Channel Source Separation
Beierholm, Thomas; Pedersen, Brian Dam; Winther, Ole
We propose a simple Bayesian model for performing single-channel speech separation using factorized source priors in a sliding-window, linearly transformed domain. Using a one-dimensional mixture of Gaussians to model each band source leads to fast, tractable inference for the source signals. ... Simulations with separation of a male and a female speaker, using priors trained on the same speakers, show performance comparable with the blind separation approach of G.-J. Jang and T.-W. Lee (see NIPS, vol. 15, 2003), with an SNR improvement of 4.9 dB for both the male and the female speaker. Mixing coefficients ... keeping the complexity low using machine learning and CASA (computational auditory scene analysis) approaches (Jang and Lee, 2003; Roweis, 2001; Wang and Brown, 1999; Hu and Wang, 2003). ...
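As an illustration of the kind of closed-form inference a one-dimensional mixture-of-Gaussians prior permits, the sketch below computes the MMSE estimate of one source from a single scalar observation x = s1 + s2 by enumerating component pairs. This is a toy single-band reconstruction with my own naming and setup, not the authors' implementation:

```python
import numpy as np

def gmm_mmse_separate(x, w1, m1, v1, w2, m2, v2):
    """MMSE estimate of source 1 from a scalar observation x = s1 + s2,
    where each source has a 1-D Gaussian mixture prior
    (weights w, means m, variances v)."""
    M1, M2 = np.meshgrid(m1, m2, indexing="ij")
    V1, V2 = np.meshgrid(v1, v2, indexing="ij")
    var = V1 + V2
    # evidence of x under each component pair: x ~ N(m1 + m2, v1 + v2)
    lik = (np.outer(w1, w2)
           * np.exp(-0.5 * (x - M1 - M2) ** 2 / var)
           / np.sqrt(2 * np.pi * var))
    post = lik / lik.sum()
    # conditional mean of s1 given x and the pair, then mix over pairs
    cond = M1 + V1 / var * (x - M1 - M2)
    return float((post * cond).sum())
```

With single-component "mixtures" of equal variance the observation is split evenly; with unequal variances the wider prior absorbs proportionally more of the observation.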
Sparsity in Bayesian Blind Source Separation and Deconvolution
Šmídl, Václav; Tichý, Ondřej
Berlin Heidelberg: Springer, 2013, pp. 548-563 (Lecture Notes in Computer Science, vol. 8189, part II). ISBN 978-3-642-40990-5. ISSN 0302-9743. [The European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases (ECMLPKDD 2013), Praha (CZ), 24.09.2013-26.09.2013] R&D Projects: GA ČR GA13-29225S. Keywords: Blind Source Separation * Deconvolution * Sparsity * Scintigraphy. Subject RIV: BB - Applied Statistics, Operational Research. http://library.utia.cas.cz/separaty/2013/AS/tichy-sparsity in bayesian blind source separation and deconvolution.pdf
Bayesian Blind Source Separation of Positive Non Stationary Sources
Ichir, Mahieddine M.; Mohammad-Djafari, Ali
2004-11-01
In this contribution, we address the problem of blind non-negative source separation, a problem that finds application in many fields of data analysis. We propose a novel approach based on Gamma-mixture priors: Gamma densities constrain the unobserved sources to lie in the positive half-plane, and a mixture density with a first-order Markov model on the associated hidden variables accounts for possible non-stationarity of the sources. Posterior mean estimates are obtained via appropriate Markov chain Monte Carlo sampling.
Reconstruction of Zeff profiles at TEXTOR through Bayesian source separation
We describe a work in progress on the reconstruction of radial profiles for the ion effective charge Zeff on the TEXTOR tokamak, using statistical data analysis techniques. We introduce our diagnostic for the measurement of Bremsstrahlung emissivity signals. Zeff profiles can be determined by Abel inversion of line-integrated measurements of the Bremsstrahlung emissivity (εff) from the plasma and the plasma electron density (ne) and temperature (Te). However, at the plasma edge only estimated values are routinely used for ne and Te, which are moreover determined at different toroidal locations. These various uncertainties hinder the interpretation of a Zeff profile outside the central plasma. In order to circumvent this problem, we propose several scenarios meant to allow the extraction by (Bayesian) Blind Source Separation techniques of either (line-integrated) Zeff wave shapes or absolutely calibrated signals from (line-integrated) emissivity signals, using also density and temperature signals, as required. (authors)
Bayesian Blind Source Separation with Unknown Prior Covariance
Tichý, Ondřej; Šmídl, Václav
Cham: Springer, 2015 (Vincent, E.; Yeredor, A.; Koldovský, Z.; Tichavský, P., eds.), pp. 352-359. ISBN 978-3-319-22481-7. ISSN 0302-9743. (Lecture Notes in Computer Science, 9237). [12th International Conference on Latent Variable Analysis and Signal Separation, Liberec (CZ), 25.08.2015-28.08.2015] R&D Projects: GA ČR GA13-29225S. Institutional support: RVO:67985556. Keywords: Blind source separation * Covariance model * Variational Bayes approximation * Non-negative matrix factorization. Subject RIV: BB - Applied Statistics, Operational Research. http://library.utia.cas.cz/separaty/2015/AS/tichy-0447092.pdf
Bayesian Source Separation Applied to Identifying Complex Organic Molecules in Space
Knuth, Kevin H; Choinsky, Joshua; Maunu, Haley A; Carbon, Duane F
2014-01-01
Emission from a class of benzene-based molecules known as Polycyclic Aromatic Hydrocarbons (PAHs) dominates the infrared spectrum of star-forming regions. The observed emission appears to arise from the combined emission of numerous PAH species, each with its unique spectrum. Linear superposition of the PAH spectra identifies this problem as a source separation problem. It is, however, of a formidable class of source separation problems given that different PAH sources potentially number in the hundreds, even thousands, and there is only one measured spectral signal for a given astrophysical site. Fortunately, the source spectra of the PAHs are known, but the signal is also contaminated by other spectral sources. We describe our ongoing work in developing Bayesian source separation techniques relying on nested sampling in conjunction with an ON/OFF mechanism enabling simultaneous estimation of the probability that a particular PAH species is present and its contribution to the spectrum.
Convergent Bayesian formulations of blind source separation and electromagnetic source estimation
Knuth, Kevin H.; Vaughan Jr, Herbert G.
2015-01-01
We consider two areas of research that have been developing in parallel over the last decade: blind source separation (BSS) and electromagnetic source estimation (ESE). BSS deals with the recovery of source signals when only mixtures of signals can be obtained from an array of detectors and the only prior knowledge consists of some information about the nature of the source signals. On the other hand, ESE utilizes knowledge of the electromagnetic forward problem to assign source signals to th...
On Sparsity in Bayesian Blind Source Separation for Dynamic Medical Imaging
Tichý, Ondřej
Praha: Katedra matematiky, FSv ČVUT, 2014, pp. 20-21. [Rektorysova Soutěž, Praha (CZ), 3.12.2014] R&D Projects: GA ČR GA13-29225S. Institutional support: RVO:67985556. Keywords: blind source separation * dynamic medical imaging * sparsity constraint. Subject RIV: BB - Applied Statistics, Operational Research. http://library.utia.cas.cz/separaty/2014/AS/tichy-0436843.pdf
Estimation of Input Function from Dynamic PET Brain Data Using Bayesian Blind Source Separation
Tichý, Ondřej; Šmídl, Václav
2015-01-01
Vol. 12, No. 4 (2015), pp. 1273-1287. ISSN 1820-0214. R&D Projects: GA ČR GA13-29225S. Institutional support: RVO:67985556. Keywords: blind source separation * Variational Bayes method * dynamic PET * input function * deconvolution. Subject RIV: BB - Applied Statistics, Operational Research. Impact factor: 0.477, year: 2014. http://library.utia.cas.cz/separaty/2015/AS/tichy-0450509.pdf
Bayesian kinematic earthquake source models
Minson, S. E.; Simons, M.; Beck, J. L.; Genrich, J. F.; Galetzka, J. E.; Chowdhury, F.; Owen, S. E.; Webb, F.; Comte, D.; Glass, B.; Leiva, C.; Ortega, F. H.
2009-12-01
Most coseismic, postseismic, and interseismic slip models are based on highly regularized optimizations that yield a single solution satisfying the data under a particular set of regularizing constraints. This regularization hampers our ability to answer basic questions, such as whether seismic and aseismic slip overlap or instead rupture separate portions of the fault zone. We present a Bayesian methodology for generating kinematic earthquake source models, with a focus on large subduction zone earthquakes. Unlike classical optimization approaches, Bayesian techniques sample the ensemble of all acceptable models, presented as an a posteriori probability density function (PDF), and thus we can explore the entire solution space to determine, for example, which model parameters are well determined and which are not, or what the likelihood is that two slip distributions overlap in space. Bayesian sampling also has the advantage that all a priori knowledge of the source process can be used to mold the a posteriori ensemble of models. Although very powerful, Bayesian methods have up to now been of limited use in geophysical modeling because they are computationally feasible only for problems with a small number of free parameters, due to what is called the "curse of dimensionality." However, our methodology can successfully sample solution spaces of many hundreds of parameters, which is sufficient to produce finite-fault kinematic earthquake models. Our algorithm is a modification of the tempered Markov chain Monte Carlo (tempered MCMC or TMCMC) method. In our algorithm, we sample a "tempered" a posteriori PDF using many MCMC simulations running in parallel, with evolutionary computation in which models that fit the data poorly are preferentially eliminated in favor of models that better predict the data. We present results both for synthetic test problems and for the 2007 Mw 7.8 Tocopilla, Chile earthquake, the latter of which is constrained by InSAR, local high
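The tempered-resampling idea this abstract describes — many parallel chains, with poorly fitting models preferentially eliminated as the likelihood is gradually "turned on" — can be sketched on a one-dimensional toy problem. This is a generic transitional-MCMC illustration with an assumed Gaussian prior and likelihood of my own choosing, not the authors' TMCMC code:

```python
import numpy as np

rng = np.random.default_rng(0)

def log_like(theta):
    # toy 1-D "data misfit": likelihood peaked at theta = 3
    return -0.5 * ((theta - 3.0) / 0.5) ** 2

def log_prior_ratio(prop, theta):
    # broad Gaussian prior N(0, 5^2)
    return -0.5 * (prop**2 - theta**2) / 5.0**2

def tmcmc(n=2000, stages=10, steps=20):
    theta = rng.normal(0.0, 5.0, n)          # stage 0: draw from the prior
    betas = np.linspace(0.0, 1.0, stages + 1)
    for b0, b1 in zip(betas[:-1], betas[1:]):
        # reweight by the tempering increment, then resample (selection step)
        w = np.exp((b1 - b0) * log_like(theta))
        theta = theta[rng.choice(n, n, p=w / w.sum())]
        # Metropolis moves targeting prior * likelihood^b1
        for _ in range(steps):
            prop = theta + rng.normal(0.0, 0.5, n)
            log_a = (b1 * (log_like(prop) - log_like(theta))
                     + log_prior_ratio(prop, theta))
            theta = np.where(np.log(rng.uniform(size=n)) < log_a, prop, theta)
    return theta

samples = tmcmc()  # analytic posterior here is ~N(2.97, 0.497^2)
```

The final population approximates the full posterior ensemble rather than a single regularized solution, which is the point of the methodology.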
Single channel signal component separation using Bayesian estimation
Cai Quanwei; Wei Ping; Xiao Xianci
2007-01-01
A Bayesian estimation method to separate multicomponent signals from a single-channel observation is presented in this paper. By using a basis-function projection, component separation becomes a problem of finite-dimensional parameter estimation. A Bayesian model for estimating the parameters is then set up, and the reversible-jump MCMC (Markov chain Monte Carlo) algorithm is adopted to perform the Bayesian computation. The method can jointly estimate the parameters of each component and the number of components. Simulation results demonstrate that the method has a low SNR threshold and good performance.
Informed source separation: source coding meets source separation
Ozerov, Alexey; Liutkus, Antoine; Badeau, Roland; Richard, Gaël
2011-01-01
We consider the informed source separation (ISS) problem where, given the sources and the mixtures, any kind of side-information can be computed during a so-called encoding stage. This side-information is then used to assist source separation, given the mixtures only, at the so-called decoding stage. State of the art ISS approaches do not really consider ISS as a coding problem and rely on some purely source separation-inspired strategies, leading to performances that can at best reach those ...
A Bayesian approach to earthquake source studies
Minson, Sarah
Bayesian sampling has several advantages over conventional optimization approaches to solving inverse problems. It produces the distribution of all possible models sampled proportionally to how much each model is consistent with the data and the specified prior information, and thus images the entire solution space, revealing the uncertainties and trade-offs in the model. Bayesian sampling is applicable to both linear and non-linear modeling, and the values of the model parameters being sampled can be constrained based on the physics of the process being studied and do not have to be regularized. However, these methods are computationally challenging for high-dimensional problems. Until now the computational expense of Bayesian sampling has been too great for it to be practicable for most geophysical problems. I present a new parallel sampling algorithm called CATMIP for Cascading Adaptive Tempered Metropolis In Parallel. This technique, based on Transitional Markov chain Monte Carlo, makes it possible to sample distributions in many hundreds of dimensions, if the forward model is fast, or to sample computationally expensive forward models in smaller numbers of dimensions. The design of the algorithm is independent of the model being sampled, so CATMIP can be applied to many areas of research. I use CATMIP to produce a finite fault source model for the 2007 Mw 7.7 Tocopilla, Chile earthquake. Surface displacements from the earthquake were recorded by six interferograms and twelve local high-rate GPS stations. Because of the wealth of near-fault data, the source process is well-constrained. I find that the near-field high-rate GPS data have significant resolving power above and beyond the slip distribution determined from static displacements. The location and magnitude of the maximum displacement are resolved. The rupture almost certainly propagated at sub-shear velocities. The full posterior distribution can be used not only to calculate source parameters but also
Bayesian Kinematic Finite Fault Source Models (Invited)
Minson, S. E.; Simons, M.; Beck, J. L.
2010-12-01
Finite fault earthquake source models are inherently under-determined: there is no unique solution to the inverse problem of determining the rupture history at depth as a function of time and space when our data are only limited observations at the Earth's surface. Traditional inverse techniques rely on model constraints and regularization to generate one model from the possibly broad space of all possible solutions. However, Bayesian methods allow us to determine the ensemble of all possible source models which are consistent with the data and our a priori assumptions about the physics of the earthquake source. Until now, Bayesian techniques have been of limited utility because they are computationally intractable for problems with as many free parameters as kinematic finite fault models. We have developed a methodology called Cascading Adaptive Tempered Metropolis In Parallel (CATMIP) which allows us to sample very high-dimensional problems in a parallel computing framework. The CATMIP algorithm combines elements of simulated annealing and genetic algorithms with the Metropolis algorithm to dynamically optimize the algorithm's efficiency as it runs. We will present synthetic performance tests of finite fault models made with this methodology as well as a kinematic source model for the 2007 Mw 7.7 Tocopilla, Chile earthquake. This earthquake was well recorded by multiple ascending and descending interferograms and a network of high-rate GPS stations whose records can be used as near-field seismograms.
A Bayesian method for microseismic source inversion
Pugh, D. J.; White, R. S.; Christie, P. A. F.
2016-08-01
Earthquake source inversion is highly dependent on location determination and velocity models. Uncertainties in both the model parameters and the observations need to be rigorously incorporated into an inversion approach. Here, we show a probabilistic Bayesian method that allows formal inclusion of the uncertainties in the moment tensor inversion. This method allows the combination of different sets of far-field observations, such as P-wave and S-wave polarities and amplitude ratios, into one inversion. Additional observations can be included by deriving a suitable likelihood function from the uncertainties. This inversion produces samples from the source posterior probability distribution, including a best-fitting solution for the source mechanism and associated probability. The inversion can be constrained to the double-couple space or allowed to explore the gamut of moment tensor solutions, allowing volumetric and other non-double-couple components. The posterior probability of the double-couple and full moment tensor source models can be evaluated from the Bayesian evidence, using samples from the likelihood distributions for the two source models, producing an estimate of whether or not a source is double-couple. Such an approach is ideally suited to microseismic studies where there are many sources of uncertainty and it is often difficult to produce reliability estimates of the source mechanism, although this can be true of many other cases. Using full-waveform synthetic seismograms, we also show the effects of noise, location, network distribution and velocity model uncertainty on the source probability density function. The noise has the largest effect on the results, especially as it can affect other parts of the event processing. This uncertainty can lead to erroneous non-double-couple source probability distributions, even when no other uncertainties exist. Although including amplitude ratios can improve the constraint on the source probability
Bayesian Separation of Non-Stationary Mixtures of Dependent Gaus
National Aeronautics and Space Administration — In this work, we propose a novel approach to perform Dependent Component Analysis (DCA). DCA can be thought of as the separation of latent, dependent sources from...
Convolutive Blind Source Separation Methods
Pedersen, Michael Syskind; Larsen, Jan; Kjems, Ulrik
2008-01-01
During the past decades, much attention has been given to the separation of mixed sources, in particular for the blind case where both the sources and the mixing process are unknown and only recordings of the mixtures are available. In several situations it is desirable to recover all sources from...
Le Cam, Steven; Caune, Vairis; Ranta, Radu; Korats, Gundars; Louis-Dorr, Valerie
2015-08-01
The brain source localization problem has been extensively studied in the past years, yielding a large panel of methodologies, each bringing its own strengths and weaknesses. Combining several of these approaches might help to enhance their respective performance. Our study is carried out in the particular context of intracranial recordings, with the objective of explaining the measurements with a reduced number of dipolar activities. We take advantage of the sparse nature of Bayesian approaches to separate the noise from the source space and to distinguish between several source contributions on the electrodes. This first step provides accurate estimates of the dipole projections, which can be used as input to an equivalent-current-dipole fitting procedure. We demonstrate on simulations that the localization results are significantly enhanced by this post-processing step when up to five dipoles are activated simultaneously. PMID:26736344
A localization model to localize multiple sources using Bayesian inference
Dunham, Joshua Rolv
Accurate localization of a sound source in a room setting is important in both psychoacoustics and architectural acoustics. Binaural models have been proposed to explain how the brain processes and utilizes the interaural time differences (ITDs) and interaural level differences (ILDs) of sound waves arriving at the ears of a listener in determining source location. Recent work shows that applying Bayesian methods to this problem is proving fruitful. In this thesis, pink noise samples are convolved with head-related transfer functions (HRTFs) and compared to combinations of one and two anechoic speech signals convolved with different HRTFs or binaural room impulse responses (BRIRs) to simulate room positions. Through exhaustive calculation of Bayesian posterior probabilities and a maximum-likelihood approach, model selection determines the number of sources present, and parameter estimation yields the azimuthal direction of the source(s).
Source separation as an exercise in logical induction
Knuth, Kevin H.
2002-01-01
We examine the relationship between the Bayesian and information-theoretic formulations of source separation algorithms. This work makes use of the relationship between the work of Claude E. Shannon and the "Recent Contributions" by Warren Weaver (Shannon & Weaver 1949) as clarified by Richard T. Cox (1979) and expounded upon by Robert L. Fry (1996) as a duality between a logic of assertions and a logic of questions. Working with the logic of assertions requires the use of probability as a me...
Compressing Data by Source Separation
Schmidt, A.; Tréguier, E.; Schmidt, F.; Moussaoui, S.
2012-04-01
We interpret source separation of hyperspectral data as a way of applying lossy compression. In settings where datacubes can be interpreted as a linear combination of source spectra and their abundances, and the number of sources is small, we try to quantify the trade-offs and the benefits of source separation and its implementation with non-negative source factorisation. Various methods to implement non-negative matrix factorisation have been used successfully for factoring hyperspectral images into physically meaningful sources which combine linearly to approximate the original image. This is useful for modelling the processes which make up the image. At the same time, the approximation opens up the potential for a significant reduction of the data by keeping only the sources and their corresponding abundances instead of the original complete data cube. This presentation will try to explore the potential of the idea and also to establish the limits of its use. Formally, the setting is as follows: we consider P pixels of a hyperspectral image which are acquired at L frequency bands and which are represented as a P x L data matrix X. Each row of this matrix represents a spectrum at a pixel with spatial index p = 1..P; this implies that the original topology may be disregarded. Since we work under the assumption of linear mixing, the p-th spectrum, 1 <= p <= P, can be expressed as a linear combination of R source spectra. Thus X = AS + E, where E is an error matrix to be minimised, and X, A, and S have only non-negative entries. The rows of the matrix S are the estimates of the R source spectra, and each entry of A expresses the contribution of the r-th component, 1 <= r <= R, to the pixel with spatial index p. There are applications where we may interpret the rows of S as physical sources which can be combined using the columns of A to approximate the original data. If the source signals are few and strong (but not even necessarily meaningful), the data volume that has to
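The factorisation X = AS + E and the resulting compression ratio P·L / (P·R + R·L) can be sketched with standard Lee-Seung multiplicative updates. The toy data sizes and the function below are my own illustration, not the presentation's implementation:

```python
import numpy as np

rng = np.random.default_rng(1)

def nmf(X, R, iters=500, eps=1e-9):
    """Factor X (P x L) into non-negative A (P x R) and S (R x L) using
    Lee-Seung multiplicative updates that minimise ||X - AS||_F."""
    P, L = X.shape
    A = rng.random((P, R)) + eps
    S = rng.random((R, L)) + eps
    for _ in range(iters):
        S *= (A.T @ X) / (A.T @ A @ S + eps)
        A *= (X @ S.T) / (A @ S @ S.T + eps)
    return A, S

# toy "hyperspectral" cube: P = 100 pixels, L = 50 bands, R = 3 sources
A_true = rng.random((100, 3))
S_true = rng.random((3, 50))
X = A_true @ S_true
A, S = nmf(X, 3)
rel_err = np.linalg.norm(X - A @ S) / np.linalg.norm(X)
ratio = X.size / (A.size + S.size)   # 5000 / 450, roughly 11x smaller
```

Storing only A and S in place of X realises the compression; the reconstruction error quantifies what the lossy step gives up.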
A Bayesian analysis of regularised source inversions in gravitational lensing
Suyu, S H; Hobson, M P; Marshall, P J
2006-01-01
Strong gravitational lens systems with extended sources are of special interest because they provide additional constraints on the models of the lens systems. To use a gravitational lens system for measuring the Hubble constant, one would need to determine the lens potential and the source intensity distribution simultaneously. A linear inversion method to reconstruct a pixellated source distribution of a given lens potential model was introduced by Warren and Dye. In the inversion process, a regularisation on the source intensity is often needed to ensure a successful inversion with a faithful resulting source. In this paper, we use Bayesian analysis to determine the optimal regularisation constant (strength of regularisation) of a given form of regularisation and to objectively choose the optimal form of regularisation given a selection of regularisations. We consider and compare quantitatively three different forms of regularisation previously described in the literature for source inversions in gravitatio...
A Bayesian Approach to Detection of Small Low Emission Sources
Xun, Xiaolei; Carroll, Raymond J; Kuchment, Peter
2011-01-01
The article addresses the problem of detecting the presence and location of a small low-emission source inside an object when the background noise dominates. This problem arises, for instance, in some homeland security applications. The goal is to reach signal-to-noise ratio (SNR) levels on the order of $10^{-3}$. A Bayesian approach to this problem is implemented in 2D. The method allows inference not only about the existence of the source but also about its location. We derive Bayes factors for model selection and estimation of location based on Markov chain Monte Carlo (MCMC) simulation. A simulation study shows that, with a sufficiently high total emission level, our method can effectively locate the source.
Dirichlet Methods for Bayesian Source Detection in Radio Astronomy Images
Friedlander, A. M.
2014-02-01
The sheer volume of data to be produced by the next generation of radio telescopes - exabytes of data on hundreds of millions of objects - makes automated methods for the detection of astronomical objects ("sources") essential. Of particular importance are low surface brightness objects, which are not well found by current automated methods. This thesis explores Bayesian methods for source detection that use Dirichlet or multinomial models for pixel intensity distributions in discretised radio astronomy images. A novel image discretisation method that incorporates uncertainty about how the image should be discretised is developed. Latent Dirichlet allocation - a method originally developed for inferring latent topics in document collections - is used to estimate source and background distributions in radio astronomy images. A new Dirichlet-multinomial ratio, indicating how well a region conforms to a well-specified model of background versus a loosely-specified model of foreground, is derived. Finally, latent Dirichlet allocation and the Dirichlet-multinomial ratio are combined for source detection in astronomical images. The methods developed in this thesis perform source detection well in comparison to two widely-used source detection packages and, importantly, find dim sources not well found by other algorithms.
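The Dirichlet-multinomial ratio the thesis derives — a well-specified background model against a loosely specified foreground — can be sketched for a single region's discretised pixel counts. The multinomial coefficient is omitted because it cancels in the ratio; the prior values below are illustrative, not those of the thesis:

```python
import numpy as np
from math import lgamma

def log_dirmult(counts, alpha):
    """Log marginal likelihood of counts under a Dirichlet prior
    (multinomial coefficient omitted: it cancels in any ratio)."""
    counts = np.asarray(counts, float)
    alpha = np.asarray(alpha, float)
    return (lgamma(alpha.sum()) - lgamma(alpha.sum() + counts.sum())
            + sum(lgamma(a + n) - lgamma(a) for a, n in zip(alpha, counts)))

def dm_ratio(counts, bkg_alpha):
    """Dirichlet-multinomial log-ratio: well-specified background model
    versus a loosely specified (flat Dirichlet) foreground model."""
    flat = np.ones(len(counts))
    return log_dirmult(counts, bkg_alpha) - log_dirmult(counts, flat)
```

A region whose intensity histogram matches the background prior scores higher than one that departs from it, flagging the latter as candidate foreground (source).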
Nitrate source apportionment in a subtropical watershed using Bayesian model
Yang, Liping; Han, Jiangpei; Xue, Jianlong; Zeng, Lingzao; Shi, Jiachun (e-mail: jcshi@zju.edu.cn); Wu, Laosheng (e-mail: laowu@zju.edu.cn) [all: College of Environmental and Natural Resource Sciences, Zhejiang Provincial Key Laboratory of Subtropical Soil and Plant Nutrition, Zhejiang University, Hangzhou, 310058, China]; Jiang, Yonghai [State Key Laboratory of Environmental Criteria and Risk Assessment, Chinese Research Academy of Environmental Sciences, Beijing, 100012, China]
2013-10-01
Nitrate (NO3−) pollution in aquatic systems is a worldwide problem. The temporal distribution pattern and sources of nitrate are of great concern for water quality. The nitrogen (N) cycling processes in a subtropical watershed located in Changxing County, Zhejiang Province, China were greatly influenced by the temporal variations of precipitation and temperature during the study period (September 2011 to July 2012). The highest NO3− concentration in water was in May (wet season, mean ± SD = 17.45 ± 9.50 mg L−1) and the lowest concentration occurred in December (dry season, mean ± SD = 10.54 ± 6.28 mg L−1). Nevertheless, no water sample in the study area exceeded the WHO drinking water limit of 50 mg L−1 NO3−. Four sources of NO3− (atmospheric deposition, AD; soil N, SN; synthetic fertilizer, SF; manure and sewage, M and S) were identified using both hydrochemical characteristics [Cl−, NO3−, HCO3−, SO42−, Ca2+, K+, Mg2+, Na+, dissolved oxygen (DO)] and a dual isotope approach (δ15N–NO3− and δ18O–NO3−). Both chemical and isotopic characteristics indicated that denitrification was not the main N cycling process in the study area. Using a Bayesian model (stable isotope analysis in R, SIAR), the contribution of each source was apportioned. Source apportionment results showed that source contributions differed significantly between the dry and wet seasons: AD and M and S contributed more in December than in May, whereas SN and SF contributed more NO3− to the water in May than in December. M and S and SF were the major contributors in December and May, respectively. Moreover, the shortcomings and uncertainties of SIAR are discussed to provide implications for future work. With the assessment of the temporal variation and sources of NO3−, better agricultural management practices and sewage disposal programs can be implemented to sustain water quality in subtropical watersheds.
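The essence of a SIAR-style mixing model can be sketched for a single tracer: source proportions receive a flat Dirichlet prior, the mixture signature is a linear combination of the source signatures, and the posterior is approximated here by importance sampling. The real SIAR uses multiple tracers, fractionation terms, and MCMC; the names and numbers below are purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)

def mixing_posterior(obs, src_mu, src_sd, n=200_000):
    """Posterior mean of source proportions in a single-tracer linear
    mixing model, via importance sampling from a flat Dirichlet prior."""
    src_mu = np.asarray(src_mu, float)
    src_sd = np.asarray(src_sd, float)
    p = rng.dirichlet(np.ones(len(src_mu)), size=n)  # prior proportion draws
    mix_mu = p @ src_mu                              # predicted mixture value
    mix_sd = np.sqrt(p**2 @ src_sd**2)               # propagated source spread
    w = np.exp(-0.5 * ((obs - mix_mu) / mix_sd) ** 2) / mix_sd
    return (w / w.sum()) @ p                         # weighted mean proportions

# two hypothetical end-members with signatures 0 and 10; mixture observed at 5
props = mixing_posterior(5.0, [0.0, 10.0], [1.0, 1.0])
```

With the observation midway between the two end-member signatures, the posterior apportions the mixture roughly evenly between them.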
Bayesian Source Attribution of Salmonellosis in South Australia.
Glass, K; Fearnley, E; Hocking, H; Raupach, J; Veitch, M; Ford, L; Kirk, M D
2016-03-01
Salmonellosis is a significant cause of foodborne gastroenteritis in Australia, and rates of illness have increased over recent years. We adopt a Bayesian source attribution model to estimate the contribution of different animal reservoirs to illness due to Salmonella spp. in South Australia between 2000 and 2010, together with 95% credible intervals (CrI). We excluded known travel associated cases and those of rare subtypes (fewer than 20 human cases or fewer than 10 isolates from included sources over the 11-year period), and the remaining 76% of cases were classified as sporadic or outbreak associated. Source-related parameters were included to allow for different handling and consumption practices. We attributed 35% (95% CrI: 20-49) of sporadic cases to chicken meat and 37% (95% CrI: 23-53) of sporadic cases to eggs. Of outbreak-related cases, 33% (95% CrI: 20-62) were attributed to chicken meat and 59% (95% CrI: 29-75) to eggs. A comparison of alternative model assumptions indicated that biases due to possible clustering of samples from sources had relatively minor effects on these estimates. Analysis of source-related parameters showed higher risk of illness from contaminated eggs than from contaminated chicken meat, suggesting that consumption and handling practices potentially play a bigger role in illness due to eggs, considering low Salmonella prevalence on eggs. Our results strengthen the evidence that eggs and chicken meat are important vehicles for salmonellosis in South Australia. PMID:26133008
Fast Bayesian optimal experimental design for seismic source inversion
Long, Quan
2015-07-01
We develop a fast method for optimally designing experiments in the context of statistical seismic source inversion. In particular, we efficiently compute the optimal number and locations of the receivers or seismographs. The seismic source is modeled by a point moment tensor multiplied by a time-dependent function. The parameters include the source location, moment tensor components, and start time and frequency in the time function. The forward problem is modeled by elastodynamic wave equations. We show that the Hessian of the cost functional, which is usually defined as the square of the weighted L
Blind source separation dependent component analysis
Xiang, Yong; Yang, Zuyuan
2015-01-01
This book provides readers with a complete and self-contained body of knowledge about dependent source separation, including the latest developments in this field. The book gives an overview of blind source separation, in which three promising blind separation techniques that can tackle mutually correlated sources are presented. The book then focuses on non-negativity based methods, time-frequency analysis based methods, and pre-coding based methods.
Improved Bayesian Infrasonic Source Localization for regional infrasound
Blom, Philip S.; Marcillo, Omar; Arrowsmith, Stephen J.
2015-12-01
The mathematical framework used in the Bayesian Infrasonic Source Localization (BISL) methodology is examined and simplified providing a generalized method of estimating the source location and time for an infrasonic event. The likelihood function describing an infrasonic detection used in BISL has been redefined to include the von Mises distribution developed in directional statistics and propagation-based, physically derived celerity-range and azimuth deviation models. Frameworks for constructing propagation-based celerity-range and azimuth deviation statistics are presented to demonstrate how stochastic propagation modelling methods can be used to improve the precision and accuracy of the posterior probability density function describing the source localization. Infrasonic signals recorded at a number of arrays in the western United States produced by rocket motor detonations at the Utah Test and Training Range are used to demonstrate the application of the new mathematical framework and to quantify the improvement obtained by using the stochastic propagation modelling methods. Using propagation-based priors, the spatial and temporal confidence bounds of the source decreased by more than 40 per cent in all cases and by as much as 80 per cent in one case. Further, the accuracy of the estimates remained high, keeping the ground truth within the 99 per cent confidence bounds for all cases.
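The von Mises likelihood adopted above for azimuth deviations can be sketched as follows. The observed and predicted azimuths and the concentration parameters are hypothetical, and the modified Bessel function I0 is evaluated with a plain power series rather than a library call:

```python
import math

def bessel_i0(x, terms=60):
    # Modified Bessel function I0 via its power series (adequate for small x).
    return sum((x / 2.0) ** (2 * k) / math.factorial(k) ** 2 for k in range(terms))

def von_mises_pdf(theta, mu, kappa):
    """Directional likelihood of an observed back azimuth theta (radians)
    given the propagation-model azimuth mu and concentration kappa."""
    return math.exp(kappa * math.cos(theta - mu)) / (2.0 * math.pi * bessel_i0(kappa))

# Hypothetical detection: observed azimuth 62 deg, candidate source predicts 60 deg.
obs, pred = math.radians(62.0), math.radians(60.0)
for kappa in (1.0, 10.0):
    print(kappa, round(von_mises_pdf(obs, pred, kappa), 4))
```

Larger kappa concentrates the likelihood around the predicted azimuth, which is how propagation-based statistics tighten the posterior on source location.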
Albert, Carlo; Ulzega, Simone; Stoop, Ruedi
2016-04-01
Parameter inference is a fundamental problem in data-driven modeling. Given observed data that is believed to be a realization of some parameterized model, the aim is to find parameter values that are able to explain the observed data. In many situations, the dominant sources of uncertainty must be included into the model for making reliable predictions. This naturally leads to stochastic models. Stochastic models render parameter inference much harder, as the aim then is to find a distribution of likely parameter values. In Bayesian statistics, which is a consistent framework for data-driven learning, this so-called posterior distribution can be used to make probabilistic predictions. We propose a novel, exact, and very efficient approach for generating posterior parameter distributions for stochastic differential equation models calibrated to measured time series. The algorithm is inspired by reinterpreting the posterior distribution as a statistical mechanics partition function of an object akin to a polymer, where the measurements are mapped on heavier beads compared to those of the simulated data. To arrive at distribution samples, we employ a Hamiltonian Monte Carlo approach combined with a multiple time-scale integration. A separation of time scales naturally arises if either the number of measurement points or the number of simulation points becomes large. Furthermore, at least for one-dimensional problems, we can decouple the harmonic modes between measurement points and solve the fastest part of their dynamics analytically. Our approach is applicable to a wide range of inference problems and is highly parallelizable.
Zhang, Le; Karakci, Ata; Korotkov, Andrei; Sutter, P M; Timbie, Peter T; Tucker, Gregory S; Wandelt, Benjamin D
2016-01-01
We present in this paper a new Bayesian semi-blind approach for foreground removal in observations of the 21-cm signal with interferometers. The technique, which we call HIEMICA (HI Expectation-Maximization Independent Component Analysis), is an extension of the Independent Component Analysis (ICA) technique developed for two-dimensional (2D) CMB maps to three-dimensional (3D) 21-cm cosmological signals measured by interferometers. This technique provides a fully Bayesian inference of power spectra and maps and separates the foregrounds from the signal based on the diversity of their power spectra. Relying only on the statistical independence of the components, this approach can jointly estimate the 3D power spectrum of the 21-cm signal, the 2D angular power spectrum, and the frequency dependence of each foreground component, without any prior assumptions about the foregrounds. This approach has been tested extensively by applying it to mock data from interferometric 21-cm intensity mapping observations. Based on ...
Audio Source Separation Using a Deep Autoencoder
Jang, Giljin; Kim, Han-Gyu; Oh, Yung-Hwan
2014-01-01
This paper proposes a novel framework for unsupervised audio source separation using a deep autoencoder. The characteristics of the unknown source signals in the mixed input are automatically learned by properly configured autoencoders implemented as a network with many layers, and the sources are separated by clustering the coefficient vectors in the code layer. By investigating the weight vectors to the final target representation layer, the primitive components of the audio signals in the frequency domain are o...
Blind Source Separation Using Hessian Evaluation
Jyothirmayi M; Elavaar Kuzhali S; Sethu Selvi S
2012-01-01
This paper focuses on blind image separation using sparse representation for natural images. The statistics of natural images are based on one particular statistical property called sparseness, which is closely related to the super-Gaussian distribution. Since natural images can have both Gaussian and non-Gaussian distributions, the original infomax algorithm cannot be directly used for source separation, as it is better suited to estimating super-Gaussian sources. Hence, we explore the...
Transform domain steganography with blind source separation
Jouny, Ismail
2015-05-01
This paper applies blind source separation, or independent component analysis, to images that may contain mixtures of text, audio, or other images for steganography purposes. The paper focuses on separating mixtures in a transform domain such as the Fourier domain or the wavelet domain. The study addresses the effectiveness of steganography when using linear mixtures of multimedia components and the ability of standard blind source separation techniques to discern hidden multimedia messages. Mixing in the space, frequency, and wavelet (scale) domains is compared. Effectiveness is measured using the mean square error between the original and recovered images.
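The linear mixing model underlying this setup can be sketched as below. For brevity the mixing matrix is assumed known and simply inverted; an actual BSS method would estimate the unmixing matrix blindly from the statistics of the mixtures. All coefficient values are hypothetical:

```python
# Two sources (cover-image coefficients, hidden message) combined by a 2x2
# mixing matrix A: observation = A @ [cover, message].
cover   = [0.9, 0.1, 0.4, 0.8]   # hypothetical cover coefficients
message = [0.0, 1.0, 1.0, 0.0]   # hypothetical hidden bits

A = [[1.0, 0.1],
     [0.3, 1.0]]

mix1 = [A[0][0] * c + A[0][1] * m for c, m in zip(cover, message)]
mix2 = [A[1][0] * c + A[1][1] * m for c, m in zip(cover, message)]

# Unmix with W = A^{-1} (known here; a BSS algorithm would have to learn W).
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
W = [[ A[1][1] / det, -A[0][1] / det],
     [-A[1][0] / det,  A[0][0] / det]]

rec_cover   = [W[0][0] * x + W[0][1] * y for x, y in zip(mix1, mix2)]
rec_message = [W[1][0] * x + W[1][1] * y for x, y in zip(mix1, mix2)]
print([round(v, 6) for v in rec_message])
```

In the transform-domain setting the same mixing is applied to Fourier or wavelet coefficients rather than raw pixels, which is what the paper compares.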
Removal of micropollutants in source separated sanitation
Butkovskyi, A.
2015-01-01
Source separated sanitation is an innovative sanitation method designed for minimizing use of energy and clean drinking water, and maximizing reuse of water, organics and nutrients from waste water. This approach is based on separate collection and treatment of toilet wastewater (black water) and the rest of the domestic wastewater (grey water). Different characteristics of wastewater streams facilitate recovery of energy, nutrients and fresh water. To ensure agricultural or ecological reuse ...
Rigid Structure from Motion from a Blind Source Separation Perspective
Fortuna, Jeff
2013-01-01
We present an information theoretic approach to define the problem of structure from motion (SfM) as a blind source separation one. Given that, for almost all practical joint densities of shape points, the marginal densities are non-Gaussian, we show how higher-order statistics can be used to provide improvements in shape estimates over the methods of factorization via Singular Value Decomposition (SVD), bundle adjustment and Bayesian approaches. Previous techniques have either explicitly or implicitly used only second-order statistics in models of shape or noise. A further advantage of viewing SfM as a blind source problem is that it easily allows for the inclusion of noise and shape models, resulting in Maximum Likelihood (ML) or Maximum a Posteriori (MAP) shape and motion estimates. A key result is that the blind source separation approach has the ability to recover the motion and shape matrices without the need to explicitly know the motion or shape pdf. We demonstrate that it suffices to know whether the pdf is sub- or super-Gaussian (i.e., semi-parametric estimation) and derive a simple formulation to determine this from the data. We provide extensive experimental results on synthetic and real tracked points in order to quantify the improvement obtained from this technique. PMID:23682206
Blind source separation theory and applications
Yu, Xianchuan; Xu, Jindong
2013-01-01
A systematic exploration of both classic and contemporary algorithms in blind source separation, with practical case studies. The book presents an overview of Blind Source Separation, a relatively new signal processing method. Due to the multidisciplinary nature of the subject, the book has been written so as to appeal to an audience from very different backgrounds. Basic mathematical skills (e.g. in matrix algebra and foundations of probability theory) are essential in order to understand the algorithms, although the book is written in an introductory, accessible style. This book offers
Predicting cytotoxicity from heterogeneous data sources with Bayesian learning
Langdon Sarah R
2010-12-01
Background: We collected data from over 80 different cytotoxicity assays from Pfizer in-house work as well as from public sources and investigated the feasibility of using these datasets, which come from a variety of assay formats (having, for instance, different measured endpoints, incubation times and cell types), to derive a general cytotoxicity model. Our main aim was to derive a computational model based on this data that can highlight potentially cytotoxic series early in the drug discovery process. Results: We developed Bayesian models for each assay using Scitegic FCFP_6 fingerprints together with the default physical property descriptors. Pairs of assays that are mutually predictive were identified by calculating the ROC score of the model derived from one predicting the experimental outcome of the other, and vice versa. The prediction pairs were visualised in a network where nodes are assays and edges are drawn for ROC scores >0.60 in both directions. We observed that, if assay pairs (A, B) and (B, C) were mutually predictive, this was often not the case for the pair (A, C). The results from 48 assays connected to each other were merged into one training set of 145590 compounds, and a general cytotoxicity model was derived. The model has been cross-validated as well as validated with a set of 89 FDA-approved drug compounds. Conclusions: We have generated a predictive model for general cytotoxicity which could speed up the drug discovery process in multiple ways. Firstly, this analysis has shown that the outcomes of different assay formats can be mutually predictive, thus removing the need to submit a potentially toxic compound to multiple assays. Furthermore, this analysis enables selection of (a) the easiest-to-run assay as corporate standard, or (b) the most descriptive panel of assays by including assays whose outcomes are not mutually predictive. The model is no replacement for a cytotoxicity assay but opens the opportunity to be
Blind Source Separation: the Sparsity Revolution
Bobin, J.; Starck, Jean-Luc; Moudden, Y.; Fadili, Jalal M.
2008-01-01
Over the last few years, the development of multi-channel sensors motivated interest in methods for the coherent processing of multivariate data. Some specific issues have already been addressed as testified by the wide literature on the so-called blind source separation (BSS) problem. In this context, as clearly emphasized by previous work, it is fundamental that the sources to be retrieved present some quantitatively measurable diversity. Recently, sparsity and morphological diversity have ...
Grading learning for blind source separation
张贤达; 朱孝龙; 保铮
2003-01-01
By generalizing the learning rate parameter to a learning rate matrix, this paper proposes a grading learning algorithm for blind source separation. The whole learning process is divided into three stages: initial stage, capturing stage and tracking stage. In different stages, different learning rates are used for each output component, determined by its dependency on the other output components. It is shown that the grading learning algorithm is equivariant and can keep the separating matrix from becoming singular. Simulations show that the proposed algorithm can achieve faster convergence, better steady-state performance and higher numerical robustness, as compared with existing algorithms using fixed, time-descending and adaptive learning rates.
Blind source separation problem in GPS time series
Gualandi, A.; Serpelloni, E.; Belardinelli, M. E.
2016-04-01
A critical point in the analysis of ground displacement time series, such as those recorded by space geodetic techniques, is the development of data-driven methods that allow the different sources of deformation to be discerned and characterized in the space and time domains. Multivariate statistics includes several approaches that can be considered as a part of data-driven methods. A widely used technique is the principal component analysis (PCA), which allows us to reduce the dimensionality of the data space while retaining most of the explained variance of the dataset. However, PCA does not perform well in finding the solution to the so-called blind source separation (BSS) problem, i.e., in recovering and separating the original sources that generate the observed data. This is mainly due to the fact that PCA minimizes the misfit calculated using an L2 norm (χ2), looking for a new Euclidean space where the projected data are uncorrelated. The independent component analysis (ICA) is a popular technique adopted to approach the BSS problem. However, the independence condition is not easy to impose, and it is often necessary to introduce some approximations. To work around this problem, we test the use of a modified variational Bayesian ICA (vbICA) method to recover the multiple sources of ground deformation even in the presence of missing data. The vbICA method models the probability density function (pdf) of each source signal using a mix of Gaussian distributions, allowing for more flexibility in the description of the pdf of the sources with respect to standard ICA, and giving a more reliable estimate of them. Here we present its application to synthetic global positioning system (GPS) position time series, generated by simulating deformation near an active fault, including inter-seismic, co-seismic, and post-seismic signals, plus seasonal signals and noise, and an additional time-dependent volcanic source. We evaluate the ability of the PCA and ICA decomposition
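The PCA step that the abstract contrasts with vbICA can be sketched with power iteration on a synthetic two-channel series sharing one common-mode signal (all data below are synthetic; the vbICA machinery itself is beyond a short sketch):

```python
import random

random.seed(0)
# Synthetic two-station displacement series sharing one common-mode signal.
n = 500
common = [random.gauss(0.0, 1.0) for _ in range(n)]
x1 = [c + random.gauss(0.0, 0.1) for c in common]
x2 = [0.8 * c + random.gauss(0.0, 0.1) for c in common]

def cov(a, b):
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    return sum((u - ma) * (v - mb) for u, v in zip(a, b)) / (len(a) - 1)

C = [[cov(x1, x1), cov(x1, x2)],
     [cov(x2, x1), cov(x2, x2)]]

# Power iteration: repeatedly apply C and normalize to get the top eigenvector,
# i.e., the leading principal component direction.
v = [1.0, 0.0]
for _ in range(100):
    w = [C[0][0] * v[0] + C[0][1] * v[1], C[1][0] * v[0] + C[1][1] * v[1]]
    norm = (w[0] ** 2 + w[1] ** 2) ** 0.5
    v = [w[0] / norm, w[1] / norm]
print(round(v[1] / v[0], 2))  # direction ratio, close to the simulated 0.8
```

PCA recovers the dominant common-mode direction here because the sources are uncorrelated and the mixing is simple; the abstract's point is that this second-order view fails for the general BSS problem, which is why ICA variants are needed.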
Validi, AbdoulAhad
2013-01-01
Blind Source Separation for Speaker Recognition Systems
Unverdorben, Michael; Rothbucher, Martin; Diepold, Klaus
2014-01-01
In this thesis, a combined blind source separation (BSS) and speaker recognition approach for teleconferences is studied. Using a microphone array consisting of eight microphones, different methods to perform overdetermined independent vector analysis (IVA) are compared. One method is to select a subset of the microphones, or all available microphones, to perform IVA. The second method, the so-called subspace method, utilizes a principal component analysis (PCA) for dimensionality reductio...
Zhujie Chu
2016-02-01
Municipal household solid waste (MHSW) has become a serious problem in China over the course of the last two decades, resulting in significant side effects to the environment. Therefore, effective management of MHSW has attracted wide attention from both researchers and practitioners. Separate collection, the first and crucial step in solving the MHSW problem, has however not been thoroughly studied to date. An empirical survey was conducted among 387 households in Harbin, China in this study. We use a Bayesian Belief Network model to determine the factors influencing separate collection. Four types of factors are identified: political, economic, socio-cultural and technological, based on the PEST (political, economic, social and technological) analytical method. In addition, we further analyze the influential power of the different factors, based on the network structure and probability changes obtained with the Netica software. Results indicate that the technological dimension has the greatest impact on MHSW separate collection, followed by the political and economic dimensions; the socio-cultural dimension impacts MHSW separate collection the least.
This study introduces a non-intrusive approach, in the context of low-rank separated representation, to construct a surrogate of high-dimensional stochastic functions, e.g., PDEs/ODEs, in order to decrease the computational cost of Markov Chain Monte Carlo simulations in Bayesian inference. The surrogate model is constructed via a regularized alternating least-squares regression with Tikhonov regularization, using a roughening matrix to compute the gradient of the solution, in conjunction with a perturbation-based error indicator to detect optimal model complexities. The model approximates a vector of a continuous solution at discrete values of a physical variable. The required number of random realizations to achieve a successful approximation depends linearly on the function dimensionality. The computational cost of the model construction is quadratic in the number of random inputs, which potentially tackles the curse of dimensionality in high-dimensional stochastic functions. Furthermore, this vector-valued separated representation-based model, in comparison to the available scalar-valued case, leads to a significant reduction in the cost of approximation, by an order of magnitude equal to the vector size. The performance of the method is studied through its application to three numerical examples, including a 41-dimensional elliptic PDE and a 21-dimensional cavity flow
We present, in this paper, a new unsupervised method for joint image super-resolution and separation between smooth and point sources. For this purpose, we propose a Bayesian approach with a Markovian model for the smooth part and Student's t-distribution for point sources. All model and noise parameters are considered unknown and should be estimated jointly with the images. However, joint estimators (joint MAP or posterior mean) are intractable and an approximation is needed. Therefore, a new gradient-like variational Bayesian method is applied to approximate the true posterior by a free-form separable distribution. A parametric form is obtained by approximating marginals, but with form parameters that are mutually dependent. Their optimal values are achieved by iterating until convergence. The method was tested on model-generated data and on a real dataset from the Herschel space observatory.
Blind Source Separation For Ion Mobility Spectra
Miniaturization is a powerful trend for smart chemical instrumentation in a diversity of applications. It is known that miniaturization in IMS leads to a degradation of the system characteristics. For the present work, we are interested in signal processing solutions to mitigate limitations introduced by a limited drift tube length, which basically involve a loss of chemical selectivity. While blind source separation (BSS) techniques are popular in other domains, their application to smart chemical instrumentation is limited. However, under some conditions, basically linearity, BSS may fully recover the concentration time evolution and the pure spectra with few underlying hypotheses. This is extremely helpful in conditions where unexpected chemical interferents may appear, or unwanted perturbations may pollute the spectra. SIMPLISMA has been advocated by Harrington et al. in several papers. However, more modern BSS methods for bilinear decomposition with positivity constraints have appeared in the last decade. In order to explore and compare the performances of those methods, a series of experiments was performed.
Bayesian Blind Separation and Deconvolution of Dynamic Image Sequences Using Sparsity Priors
Tichý, Ondřej; Šmídl, Václav
2015-01-01
Roč. 34, č. 1 (2015), s. 258-266. ISSN 0278-0062 R&D Projects: GA ČR GA13-29225S Keywords : Functional imaging * Blind source separation * Computer-aided detection and diagnosis * Probabilistic and statistical methods Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 3.390, year: 2014 http://library.utia.cas.cz/separaty/2014/AS/tichy-0431090.pdf
Characterizing the Aperiodic Variability of 3XMM Sources using Bayesian Blocks
Salvetti, D.; De Luca, A.; Belfiore, A.; Marelli, M.
2016-06-01
I will present the Bayesian blocks algorithm and its application to XMM sources, the statistical properties of the entire 3XMM sample, and a few interesting cases. While XMM-Newton is the instrument best suited for the characterization of X-ray source variability, its most recent catalogue (3XMM) reports light curves only for the brightest sources and excludes periods of background flares from its analysis. One aim of the EXTraS ("Exploring the X-ray Transient and variable Sky") project is the characterization of the aperiodic variability of as many 3XMM sources as possible on time scales shorter than the XMM observation. We adapted the original Bayesian blocks algorithm to account for background contamination, including soft proton flares. In addition, we characterized the short-term aperiodic variability by performing a number of statistical tests on all the Bayesian blocks light curves. The EXTraS catalogue and products will be released to the community in 2017, together with tools that will allow the user to replicate EXTraS results and extend them through the next decade.
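A minimal version of the Bayesian blocks idea, piecewise-constant segmentation of binned counts by dynamic programming in the spirit of Scargle's algorithm, without the background-contamination handling developed by EXTraS, might look like this (the light curve and the prior penalty are hypothetical):

```python
import math

def bayesian_blocks(counts, ncp_prior=4.0):
    """Minimal Scargle-style dynamic program for binned Poisson data with
    unit-width bins; returns the optimal change-point indices."""
    n = len(counts)
    csum = [0]
    for c in counts:
        csum.append(csum[-1] + c)
    best = [0.0] * (n + 1)   # best[r]: max fitness of a partition of bins [0, r)
    last = [0] * (n + 1)     # last[r]: start of the final block in that partition
    for r in range(1, n + 1):
        cand = []
        for l in range(r):
            N, T = csum[r] - csum[l], r - l
            fit = N * math.log(N / T) if N > 0 else 0.0  # Poisson block fitness
            cand.append((best[l] + fit - ncp_prior, l))  # penalty per extra block
        best[r], last[r] = max(cand)
    cps, r = [], n           # backtrack the optimal change points
    while r > 0:
        cps.append(last[r])
        r = last[r]
    return sorted(set(cps) - {0})

# Hypothetical light curve: quiescence, a flare, quiescence.
lc = [2, 3, 2, 2, 20, 22, 19, 21, 2, 3, 2]
print(bayesian_blocks(lc))  # -> [4, 8]: the flare boundaries
```

The `ncp_prior` penalty controls how readily new blocks are opened; the EXTraS adaptation additionally models a time-varying background within each block.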
Kopka, P.; Wawrzynczak, A.; Borysiewicz, M.
2015-09-01
In many areas of application, a central problem is the solution of an inverse problem, especially the estimation of unknown model parameters to model the underlying dynamics of a physical system precisely. In this situation, Bayesian inference is a powerful tool to combine observed data with prior knowledge to obtain the probability distribution of the searched parameters. We have applied the modern methodology named Sequential Approximate Bayesian Computation (S-ABC) to the problem of tracing an atmospheric contaminant source. ABC is a technique commonly used in the Bayesian analysis of complex models and dynamic systems. Sequential methods can significantly increase the efficiency of the ABC. In the presented algorithm, the input data are the concentrations of the released substance, arriving on-line, registered by a distributed sensor network in the OVER-LAND ATMOSPHERIC DISPERSION (OLAD) experiment. The algorithm output is the probability distribution of the contamination source parameters, i.e., its location, release rate, speed and direction of movement, start time and duration. The stochastic approach presented in this paper is completely general and can be used in other fields where the parameters of a model best fitted to the observable data must be found.
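The sketch below illustrates plain rejection ABC on a toy linear "dispersion" model with a single release-rate parameter; the paper's sequential variant and full parameter set (location, speed, direction, timing) are omitted, and all numbers are hypothetical:

```python
import random

random.seed(0)

def forward(rate):
    """Toy stand-in for the dispersion model: sensor concentrations scale
    linearly with the release rate (gains are hypothetical)."""
    return [rate * g for g in (0.5, 0.2, 0.1)]

true_rate = 10.0
observed = [c + random.gauss(0.0, 0.1) for c in forward(true_rate)]

def distance(sim, obs):
    return sum((s - o) ** 2 for s, o in zip(sim, obs)) ** 0.5

# Basic rejection ABC: draw rates from a flat prior, keep those whose
# simulated sensor readings land within a tolerance of the observations.
accepted = []
for _ in range(20000):
    rate = random.uniform(0.0, 50.0)
    sim = [c + random.gauss(0.0, 0.1) for c in forward(rate)]
    if distance(sim, observed) < 0.5:
        accepted.append(rate)

est = sum(accepted) / len(accepted)
print(len(accepted), round(est, 1))
```

The accepted draws approximate the posterior without ever evaluating a likelihood, which is the point of ABC; the sequential variant reuses accepted particles with shrinking tolerances to cut the rejection cost.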
Blind source separation using second-order cyclostationary statistics
Abed-Meraim, Karim; Xiang, Yong; Manton, Jonathan H.; Hua, Yingbo
2001-01-01
This paper studies the blind source separation (BSS) problem with the assumption that the source signals are cyclostationary. Identifiability and separability criteria based on second-order cyclostationary statistics (SOCS) alone are derived. The identifiability condition is used to define an appropriate contrast function. An iterative algorithm (ATH2) is derived to minimize this contrast function. This algorithm separates the sou...
Contaminant source reconstruction by empirical Bayes and Akaike's Bayesian Information Criterion
Zanini, Andrea; Woodbury, Allan D.
2016-02-01
The objective of the paper is to present an empirical Bayesian method combined with Akaike's Bayesian Information Criterion (ABIC) to estimate the contaminant release history of a source in groundwater, starting from a few concentration measurements in space and/or in time. From the Bayesian point of view, the ABIC considers prior information on the unknown function, such as the prior distribution (assumed Gaussian) and the covariance function. The unknown statistical quantities, such as the noise variance and the covariance function parameters, are computed through the process; moreover, the method also quantifies the estimation error through confidence intervals. The methodology was successfully tested on three test cases: the classic Skaggs and Kabala release function, three sharp releases (both cases regard transport in a one-dimensional homogeneous medium), and data collected from laboratory equipment that consists of a two-dimensional homogeneous unconfined aquifer. The performance of the method was tested with two different covariance functions (Gaussian and exponential) and also with large measurement error. The obtained results were discussed and compared to the geostatistical approach of Kitanidis (1995).
The occurrence of hazardous accidents in nuclear power plants and industrial units usually leads to the release of radioactive materials and pollutants into the environment. These materials and pollutants can be transported far downstream by the wind flow. In this paper, we implemented an atmospheric dispersion code to solve the inverse problem. Having received and detected the pollutants in one region, we may estimate the rate and location of the unknown source. The modeling requires a model capable of atmospheric dispersion calculation, together with a mathematical approach to infer the source location and the related release rates. In this paper the AERMOD software and Bayesian inference with Markov Chain Monte Carlo have been applied. Implementing a Bayesian approach with Markov Chain Monte Carlo for this subject is not new, but AERMOD is a well-known regulatory model, and coupling it with these methods enhances the reliability of the outcomes. To evaluate the method, an example is considered by defining the pollutant concentration in a specific region and then obtaining the source location and intensity by a direct calculation. The calculation estimates the average source location at a distance of 7 km with an accuracy of 5 m, which is good enough to support the ability of the proposed algorithm.
Miller, Erin A.; Robinson, Sean M.; Anderson, Kevin K.; McCall, Jonathon D.; Prinke, Amanda M.; Webster, Jennifer B.; Seifert, Carolyn E.
2015-06-01
We present a novel technique for the localization of radiological sources in urban or rural environments from an aerial platform. The technique is based on a Bayesian approach to localization, in which measured count rates in a time series are compared with predicted count rates from a series of pre-calculated test sources to define likelihood. This technique is expanded by using a localized treatment with a limited field of view (FOV), coupled with a likelihood ratio reevaluation, allowing for real-time computation on commodity hardware for arbitrarily complex detector models and terrain. In particular, detectors with inherent asymmetry of response (such as those employing internal collimation or self-shielding for enhanced directional awareness) are leveraged by this approach to provide improved localization. Results from the localization technique are shown for simulated flight data using monolithic as well as directionally-aware detector models, and the capability of the methodology to locate radioisotopes is estimated for several test cases. This localization technique is shown to facilitate urban search by allowing quick and adaptive estimates of source location, in many cases from a single flyover near a source. In particular, this method represents a significant advancement from earlier methods like full-field Bayesian likelihood, which is not generally fast enough to allow for broad-field search in real time, and highest-net-counts estimation, which has a localization error that depends strongly on flight path and cannot generally operate without exhaustive search.
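The core of such a Bayesian localization can be sketched as a Poisson log-likelihood evaluated over a grid of candidate source positions; the count-rate model, flight path, and source below are hypothetical stand-ins for the pre-calculated detector responses and terrain described above:

```python
import math, random

random.seed(2)

def expected_rate(det, src, strength=50.0, bg=2.0):
    """Hypothetical count-rate model: background plus an inverse-square term
    (the real system uses pre-calculated responses for detector and terrain)."""
    dx, dy = det[0] - src[0], det[1] - src[1]
    return bg + strength / (dx * dx + dy * dy + 1.0)

def poisson(lam):
    # Knuth's sampling algorithm; adequate for the small rates used here.
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

# Two simulated flight passes over a hypothetical source at (3, 4).
truth = (3.0, 4.0)
path = [(0.5 * i, y) for y in (2.0, 5.0) for i in range(14)]
counts = [poisson(expected_rate(d, truth)) for d in path]

# Bayesian grid search: Poisson log-likelihood of the counts at each candidate
# location (flat prior, so the posterior mode is the likelihood maximum).
best, best_ll = None, float("-inf")
for ix in range(14):
    for iy in range(14):
        cand = (0.5 * ix, 0.5 * iy)
        ll = sum(c * math.log(expected_rate(d, cand)) - expected_rate(d, cand)
                 for c, d in zip(counts, path))
        if ll > best_ll:
            best, best_ll = cand, ll
print(best)
```

The limited-FOV treatment in the paper restricts this evaluation to candidates near the current detector position, which is what makes real-time computation feasible.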
Miller, Erin A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Robinson, Sean M. [Pacific Northwest National Lab. (PNNL), Seattle, WA (United States); Anderson, Kevin K. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); McCall, Jonathon D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Prinke, Amanda M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Webster, Jennifer B. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Seifert, Carolyn E. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)
2015-01-19
Here we present a novel technique for the localization of radiological sources in urban or rural environments from an aerial platform. The technique is based on a Bayesian approach to localization, in which measured count rates in a time series are compared with predicted count rates from a series of pre-calculated test sources to define likelihood. Furthermore, this technique is expanded by using a localized treatment with a limited field of view (FOV), coupled with a likelihood ratio reevaluation, allowing for real-time computation on commodity hardware for arbitrarily complex detector models and terrain. In particular, detectors with inherent asymmetry of response (such as those employing internal collimation or self-shielding for enhanced directional awareness) are leveraged by this approach to provide improved localization. Our results from the localization technique are shown for simulated flight data using monolithic as well as directionally-aware detector models, and the capability of the methodology to locate radioisotopes is estimated for several test cases. This localization technique is shown to facilitate urban search by allowing quick and adaptive estimates of source location, in many cases from a single flyover near a source. In particular, this method represents a significant advancement from earlier methods like full-field Bayesian likelihood, which is not generally fast enough to allow for broad-field search in real time, and highest-net-counts estimation, which has a localization error that depends strongly on flight path and cannot generally operate without exhaustive search
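The grid-based likelihood comparison described in the abstract can be sketched numerically. The following is a minimal, hypothetical illustration (not the PNNL implementation): a toy inverse-square count-rate model stands in for the pre-calculated test sources, two simulated flight passes provide the measured time series, and a Poisson log-likelihood is evaluated over a grid of candidate positions. All numbers and the `predicted_rate` model are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical grid of candidate ("pre-calculated test") source positions.
xs = np.linspace(0.0, 100.0, 51)
ys = np.linspace(0.0, 100.0, 51)

def predicted_rate(src_x, src_y, det_x, det_y, strength=5e4, background=20.0):
    """Toy count-rate model: background plus inverse-square falloff from the source."""
    r2 = (src_x - det_x) ** 2 + (src_y - det_y) ** 2 + 100.0  # +100: fixed altitude offset
    return background + strength / r2

# Two simulated flight passes and measured counts from a true source at (60, 40).
path = np.vstack([
    np.column_stack([np.linspace(0, 100, 40), np.full(40, 45.0)]),
    np.column_stack([np.linspace(0, 100, 40), np.full(40, 20.0)]),
])
counts = rng.poisson(predicted_rate(60.0, 40.0, path[:, 0], path[:, 1]))

# Poisson log-likelihood of the time series for every candidate source position.
gx, gy = np.meshgrid(xs, ys, indexing="ij")
loglik = np.zeros(gx.shape)
for (dx, dy), c in zip(path, counts):
    lam = predicted_rate(gx, gy, dx, dy)
    loglik += c * np.log(lam) - lam  # Poisson log-pmf up to a constant

# With a flat prior, the posterior mode is the maximum-likelihood grid point.
i, j = np.unravel_index(np.argmax(loglik), loglik.shape)
map_x, map_y = xs[i], ys[j]
```

A single pass cannot distinguish which side of the flight line the source is on (the abstract's motivation for directionally-aware detectors); the second pass here breaks that symmetry.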
A Bayesian approach to quantify the contribution of animal-food sources to human salmonellosis
Hald, Tine; Vose, D.; Wegener, Henrik Caspar; Koupeev, T.
2004-01-01
Based on the data from the integrated Danish Salmonella surveillance in 1999, we developed a mathematical model for quantifying the contribution of each of the major animal-food sources to human salmonellosis. The model was set up to calculate the number of domestic and sporadic cases caused by...... different Salmonella sero and phage types as a function of the prevalence of these Salmonella types in the animal-food sources and the amount of food source consumed. A multiparameter prior accounting for the presumed but unknown differences between serotypes and food sources with respect to causing human...... salmonellosis was also included. The joint posterior distribution was estimated by fitting the model to the reported number of domestic and sporadic cases per Salmonella type in a Bayesian framework using Markov Chain Monte Carlo simulation. The number of domestic and sporadic cases was obtained by subtracting...
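A minimal sketch of the kind of Bayesian machinery this abstract describes: expected case counts are modelled as prevalence times amount consumed times an unknown per-source factor, and the posterior over those factors is sampled with random-walk Metropolis-Hastings. The prevalence matrix, consumption amounts and prior are all hypothetical, and the model is far simpler than the published one.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical data: case counts of 5 Salmonella types; prevalence of each type
# in two food sources; amount of each source consumed. None of these numbers
# come from the paper.
prev = np.array([[0.30, 0.05],
                 [0.10, 0.20],
                 [0.05, 0.25],
                 [0.20, 0.10],
                 [0.02, 0.15]])
consumed = np.array([50.0, 80.0])
true_ability = np.array([2.0, 0.5])          # unknown per-source factors
cases = rng.poisson(prev @ (consumed * true_ability))

def log_post(theta):
    """Poisson log-likelihood plus a weak exponential prior on each factor."""
    if np.any(theta <= 0):
        return -np.inf
    lam = prev @ (consumed * theta)
    return np.sum(cases * np.log(lam) - lam) - 0.1 * np.sum(theta)

# Random-walk Metropolis-Hastings over the two source factors.
theta = np.array([1.0, 1.0])
lp = log_post(theta)
samples = []
for step in range(20000):
    prop = theta + rng.normal(0.0, 0.05, size=2)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:   # accept/reject
        theta, lp = prop, lp_prop
    if step >= 5000:                            # discard burn-in
        samples.append(theta.copy())
samples = np.asarray(samples)
post_mean = samples.mean(axis=0)
```

The retained samples approximate the joint posterior, so per-source contributions and their credible intervals fall out of the same chain.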
Application of evidence theory in information fusion of multiple sources in bayesian analysis
周忠宝; 蒋平; 武小悦
2004-01-01
How to obtain a proper prior distribution is one of the most critical problems in Bayesian analysis. In many practical cases the prior information comes from different sources, and while the form of the prior distribution may be known, its parameters are hard to determine. In this paper, based on evidence theory, a new method is presented to fuse the information of multiple sources and determine the parameters of the prior distribution when its form is known. By taking the prior distributions that result from the information of multiple sources and converting them into corresponding mass functions, which can be combined by the Dempster-Shafer (D-S) method, we obtain the combined mass function and the representative points of the prior distribution. These points are then fitted to the given distribution form to determine the parameters of the prior distribution, yielding the fused prior distribution with which Bayesian analysis can be performed. How to convert the prior distributions into mass functions properly and how to obtain the representative points of the fused prior distribution are the central questions addressed in this paper. A simulation example shows that the proposed method is effective.
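Dempster's rule of combination at the heart of this method can be sketched directly. Below is a small, self-contained implementation over frozenset focal elements; the two mass functions are invented examples, not values from the paper.

```python
from itertools import product

def ds_combine(m1, m2):
    """Dempster's rule: combine two mass functions keyed by frozenset focal elements."""
    combined = {}
    conflict = 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + x * y
        else:
            conflict += x * y                  # mass assigned to disjoint hypotheses
    if conflict >= 1.0:
        raise ValueError("total conflict: sources are incompatible")
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

# Two hypothetical "sources" of prior information over parameter-range hypotheses.
LOW, MID, HIGH = frozenset({"low"}), frozenset({"mid"}), frozenset({"high"})
ALL = LOW | MID | HIGH
m1 = {LOW: 0.5, MID: 0.3, ALL: 0.2}
m2 = {MID: 0.6, HIGH: 0.1, ALL: 0.3}
fused = ds_combine(m1, m2)
```

The normalization by `1 - conflict` redistributes the conflicting mass, which is what makes the combined mass function a valid starting point for extracting representative points.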
Magnetic source separation in Earth's outer core.
Hoffman, Kenneth A; Singer, Brad S
2008-09-26
We present evidence that the source of Earth's axial dipole field is largely independent from the sources responsible for the rest of the geomagnetic field, the so-called nonaxial dipole (NAD) field. Support for this claim comes from correlations between the structure of the historic field and the behavior of the paleomagnetic field recorded in precisely dated lavas at those times when the axial dipole was especially weak or nearly absent. It is argued that a "stratification" of magnetic sources exists in the fluid core such that the axial dipole is the only observed field component that is nearly immune from the influence exerted by the lowermost mantle. It follows that subsequent work on spherical harmonic-based field descriptions may now incorporate an understanding of a dichotomy of spatial-temporal dynamo processes. PMID:18818352
On merging rainfall data from diverse sources using a Bayesian approach
Bhattacharya, Biswa; Tarekegn, Tegegne
2014-05-01
Numerous studies have compared satellite rainfall products, such as those from the Tropical Rainfall Measuring Mission (TRMM), with rain gauge data and have concluded, in general, that the two sources of data are comparable at suitable space and time scales. The comparison is not a straightforward one, as the two employ different measurement techniques and depend on very different space-time scales of measurement. The number of available gauges in a catchment also influences the comparability and thus adds to the complexity. TRMM rainfall data have also been used directly in hydrological modelling; as the space-time scale reduces, so does the accuracy of these models. Combining two or more sources of rainfall data can therefore greatly benefit hydrological studies. Due to differences in their space-time structure, each rainfall product contains information about the spatio-temporal distribution of rainfall that is not available from a single source. In order to harness this benefit we have developed a method of merging two (or more) rainfall products under the framework of the Bayesian Data Fusion (BDF) principle, by which rainfall data from the various sources can be combined into a single time series. The usefulness of the approach has been explored in a case study on the Lake Tana Basin of the Upper Blue Nile Basin in Ethiopia. A 'leave one rain gauge out' cross-validation technique was employed to evaluate the accuracy of the merged rainfall time series against rainfall interpolated from rain gauge data using Inverse Distance Weighting (referred to as IDW), TRMM and the fused data (BDF). The results showed that the BDF prediction was better than TRMM and IDW. The three rainfall estimates were further evaluated on their capability to predict observed stream flow using the lumped conceptual rainfall-runoff model NAM. Visual inspection of the
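In the simplest Gaussian case, Bayesian fusion of two rainfall estimates reduces to a precision-weighted average. A minimal sketch, assuming known (hypothetical) error variances for the gauge-interpolated and satellite estimates; the BDF framework in the paper is more general than this:

```python
import numpy as np

# Hypothetical daily rainfall (mm) at one location from two sources, each treated
# as a Gaussian measurement of the true rainfall with its own error variance.
gauge = np.array([12.0, 0.0, 5.5, 30.2])   # rain-gauge (IDW-interpolated) estimate
trmm = np.array([9.0, 1.5, 7.0, 24.0])     # satellite (TRMM-like) estimate
var_gauge, var_trmm = 4.0, 16.0            # assumed error variances

# With a flat prior, the fused posterior is Gaussian: precisions add and the
# mean is the precision-weighted average of the two estimates.
w_gauge = (1 / var_gauge) / (1 / var_gauge + 1 / var_trmm)
fused = w_gauge * gauge + (1 - w_gauge) * trmm
fused_var = 1 / (1 / var_gauge + 1 / var_trmm)
```

Note that the fused variance is smaller than either source's variance, which is the formal sense in which merging the products benefits downstream hydrological modelling.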
Mustac, M.; Kim, S.; Tkalcic, H.; Rhie, J.; Chen, Y.; Ford, S. R.; Sebastian, N.
2015-12-01
Conventional approaches to inverse problems suffer from non-linearity and non-uniqueness in estimations of seismic structures and source properties. Estimated results and associated uncertainties are often biased by applied regularizations and additional constraints, which are commonly introduced to solve such problems. Bayesian methods, however, provide statistically meaningful estimations of models and their uncertainties constrained by data information. In addition, hierarchical and trans-dimensional (trans-D) techniques are inherently implemented in the Bayesian framework to account for involved error statistics and model parameterizations, and, in turn, allow more rigorous estimations of the same. Here, we apply Bayesian methods throughout the entire inference process to estimate seismic structures and source properties in Northeast Asia including east China, the Korean peninsula, and the Japanese islands. Ambient noise analysis is first performed to obtain a base three-dimensional (3-D) heterogeneity model using continuous broadband waveforms from more than 300 stations. As for the tomography of surface wave group and phase velocities in the 5-70 s band, we adopt a hierarchical and trans-D Bayesian inversion method using Voronoi partition. The 3-D heterogeneity model is further improved by joint inversions of teleseismic receiver functions and dispersion data using a newly developed high-efficiency Bayesian technique. The obtained model is subsequently used to prepare 3-D structural Green's functions for the source characterization. A hierarchical Bayesian method for point source inversion using regional complete waveform data is applied to selected events from the region. The seismic structure and source characteristics with rigorously estimated uncertainties from the novel Bayesian methods provide enhanced monitoring and discrimination of seismic events in northeast Asia.
Application of hierarchical Bayesian unmixing models in river sediment source apportionment
Blake, Will; Smith, Hugh; Navas, Ana; Bodé, Samuel; Goddard, Rupert; Zou Kuzyk, Zou; Lennard, Amy; Lobb, David; Owens, Phil; Palazon, Leticia; Petticrew, Ellen; Gaspar, Leticia; Stock, Brian; Boeckx, Pascal; Semmens, Brice
2016-04-01
Fingerprinting and unmixing concepts are used widely across environmental disciplines for forensic evaluation of pollutant sources. In aquatic and marine systems, this includes tracking the source of organic and inorganic pollutants in water and linking problem sediment to soil erosion and land use sources. It is, however, the particular complexity of ecological systems that has driven creation of the most sophisticated mixing models, primarily to (i) evaluate diet composition in complex ecological food webs, (ii) inform population structure and (iii) explore animal movement. In the context of the new hierarchical Bayesian unmixing model, MixSIAR, developed to characterise intra-population niche variation in ecological systems, we evaluate the linkage between ecological 'prey' and 'consumer' concepts and river basin sediment 'source' and sediment 'mixtures' to exemplify the value of ecological modelling tools to river basin science. Recent studies have outlined advantages presented by Bayesian unmixing approaches in handling complex source and mixture datasets while dealing appropriately with uncertainty in parameter probability distributions. MixSIAR is unique in that it allows individual fixed and random effects associated with mixture hierarchy, i.e. factors that might exert an influence on model outcome for mixture groups, to be explored within the source-receptor framework. This offers new and powerful ways of interpreting river basin apportionment data. In this contribution, key components of the model are evaluated in the context of common experimental designs for sediment fingerprinting studies, namely simple, nested and distributed catchment sampling programmes. Illustrative examples using geochemical and compound-specific stable isotope datasets are presented and used to discuss best practice with specific attention to (1) the tracer selection process, (2) incorporation of fixed effects relating to sample timeframe and sediment type in the modelling
Dosso, Stan E; Wilmut, Michael J; Nielsen, Peter L
2010-07-01
This paper applies Bayesian source tracking in an uncertain environment to Mediterranean Sea data, and investigates the resulting tracks and track uncertainties as a function of data information content (number of data time-segments, number of frequencies, and signal-to-noise ratio) and of prior information (environmental uncertainties and source-velocity constraints). To track low-level sources, acoustic data recorded for multiple time segments (corresponding to multiple source positions along the track) are inverted simultaneously. Environmental uncertainty is addressed by including unknown water-column and seabed properties as nuisance parameters in an augmented inversion. Two approaches are considered: Focalization-tracking maximizes the posterior probability density (PPD) over the unknown source and environmental parameters. Marginalization-tracking integrates the PPD over environmental parameters to obtain a sequence of joint marginal probability distributions over source coordinates, from which the most-probable track and track uncertainties can be extracted. Both approaches apply track constraints on the maximum allowable vertical and radial source velocity. The two approaches are applied for towed-source acoustic data recorded at a vertical line array at a shallow-water test site in the Mediterranean Sea where previous geoacoustic studies have been carried out. PMID:20649202
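The distinction between focalization (maximizing the PPD over nuisance environmental parameters) and marginalization (integrating over them) can be shown on a toy grid. The travel-time model, noise level and grids below are invented for illustration; note the characteristic ambiguity ridge between range and the environmental parameter:

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy model: measured travel times depend on source range r and an uncertain
# environmental sound speed c (nuisance parameter). All values are invented.
true_r, true_c, noise_sd = 800.0, 1500.0, 1e-3
obs = true_r / true_c + rng.normal(0.0, noise_sd, size=8)

r_grid = np.linspace(500.0, 1100.0, 241)
c_grid = np.linspace(1450.0, 1550.0, 101)
R, C = np.meshgrid(r_grid, c_grid, indexing="ij")

# Joint log-likelihood over (range, sound speed).
loglik = np.zeros(R.shape)
for t in obs:
    loglik += -0.5 * ((t - R / C) / noise_sd) ** 2

# Focalization: maximize over source AND environment jointly.
ri, _ = np.unravel_index(np.argmax(loglik), loglik.shape)
r_focal = r_grid[ri]

# Marginalization: integrate the posterior over the nuisance sound speed,
# then take the most probable range from the marginal.
post = np.exp(loglik - loglik.max())
r_marg = r_grid[np.argmax(post.sum(axis=1))]
```

The marginal over range also directly yields range uncertainty, which is the practical payoff of the marginalization approach described in the abstract.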
Adaptive blind source separation with HRTFs beamforming preprocessing
Maazaoui, Mounira; Abed-Meraim, Karim; Grenier, Yves
2012-01-01
We propose an adaptive blind source separation algorithm in the context of robot audition using a microphone array. Our algorithm presents two steps: a fixed beamforming step to reduce the reverberation and the background noise and a source separation step. In the fixed beamforming preprocessing, we build the beamforming filters using the Head Related Transfer Functions (HRTFs) which allows us to take into consideration the effect of the robot's head on the near acoustic field. In the source ...
Using Bayesian Belief Network (BBN) modelling for Rapid Source Term Prediction. RASTEP Phase 1
Knochenhauer, M.; Swaling, V.H.; Alfheim, P. [Scandpower AB, Sundbyberg (Sweden)]
2012-09-15
The project is connected to the development of RASTEP, a computerized source term prediction tool aimed at providing a basis for improving off-site emergency management. RASTEP uses Bayesian belief networks (BBN) to model severe accident progression in a nuclear power plant in combination with pre-calculated source terms (i.e., amount, timing, and pathway of released radio-nuclides). The output is a set of possible source terms with associated probabilities. In the NKS project, a number of complex issues associated with the integration of probabilistic and deterministic analyses are addressed. This includes issues related to the method for estimating source terms, signal validation, and sensitivity analysis. One major task within Phase 1 of the project addressed the problem of how to make the source term module flexible enough to give reliable and valid output throughout the accident scenario. Of the alternatives evaluated, it is recommended that RASTEP is connected to a fast running source term prediction code, e.g., MARS, with a possibility of updating source terms based on real-time observations. (Author)
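The BBN idea can be sketched with a deliberately tiny network evaluated by full enumeration; the states and probabilities below are invented and bear no relation to RASTEP's actual model.

```python
# A tiny, hypothetical "plant state" network evaluated by full enumeration.
p_state = {"intact_core": 0.7, "core_melt": 0.3}          # P(plant state)
p_contain = {"intact_core": 0.9, "core_melt": 0.6}        # P(containment holds | state)
p_release = {                                             # P(large release | state, holds)
    ("intact_core", True): 0.001, ("intact_core", False): 0.01,
    ("core_melt", True): 0.05,    ("core_melt", False): 0.8,
}

def posterior_release(containment_holds):
    """P(large release | observed containment status), enumerating plant states."""
    num = den = 0.0
    for state, ps in p_state.items():
        pc = p_contain[state] if containment_holds else 1.0 - p_contain[state]
        w = ps * pc
        num += w * p_release[(state, containment_holds)]
        den += w
    return num / den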
Gehrmann, Romina A. S.; Schwalenberg, Katrin; Riedel, Michael; Spence, George D.; Spieß, Volkhard; Dosso, Stan E.
2016-01-01
This paper applies nonlinear Bayesian inversion to marine controlled source electromagnetic (CSEM) data collected near two sites of the Integrated Ocean Drilling Program (IODP) Expedition 311 on the northern Cascadia Margin to investigate subseafloor resistivity structure related to gas hydrate deposits and cold vents. The Cascadia margin, off the west coast of Vancouver Island, Canada, has a large accretionary prism where sediments are under pressure due to convergent plate boundary tectonics. Gas hydrate deposits and cold vent structures have previously been investigated by various geophysical methods and seabed drilling. Here, we invert time-domain CSEM data collected at Sites U1328 and U1329 of IODP Expedition 311 using Bayesian methods to derive subsurface resistivity model parameters and uncertainties. The Bayesian information criterion is applied to determine the amount of structure (number of layers in a depth-dependent model) that can be resolved by the data. The parameter space is sampled with the Metropolis-Hastings algorithm in principal-component space, utilizing parallel tempering to ensure wider and efficient sampling and convergence. Nonlinear inversion allows analysis of uncertain acquisition parameters such as time delays between receiver and transmitter clocks as well as input electrical current amplitude. Marginalizing over these instrument parameters in the inversion accounts for their contribution to the geophysical model uncertainties. One-dimensional inversion of time-domain CSEM data collected at measurement sites along a survey line allows interpretation of the subsurface resistivity structure. The data sets can be generally explained by models with 1 to 3 layers. Inversion results at U1329, at the landward edge of the gas hydrate stability zone, indicate a sediment unconformity as well as potential cold vents which were previously unknown. The resistivities generally increase upslope due to sediment erosion along the slope. Inversion
Separating More Sources Than Sensors Using Time-Frequency Distributions
Linh-Trung, Nguyen; Belouchrani, Adel; Abed-Meraim, Karim; Boashash, Boualem
2005-12-01
We examine the problem of blind separation of nonstationary sources in the underdetermined case, where there are more sources than sensors. Since time-frequency (TF) signal processing provides effective tools for dealing with nonstationary signals, we propose a new separation method that is based on time-frequency distributions (TFDs). The underlying assumption is that the original sources are disjoint in the time-frequency (TF) domain. The successful method recovers the sources by performing the following four main procedures. First, the spatial time-frequency distribution (STFD) matrices are computed from the observed mixtures. Next, the auto-source TF points are separated from cross-source TF points thanks to the special structure of these mixture STFD matrices. Then, the vectors that correspond to the selected auto-source points are clustered into different classes according to the spatial directions which differ among different sources; each class, now containing the auto-source points of only one source, gives an estimation of the TFD of this source. Finally, the source waveforms are recovered from their TFD estimates using TF synthesis. Simulated experiments indicate the success of the proposed algorithm in different scenarios. We also contribute with two other modified versions of the algorithm to better deal with auto-source point selection.
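The clustering of auto-source TF points by spatial direction, the third procedure above, can be sketched in isolation. Below, mixture vectors at simulated auto-source points (3 sources, 2 sensors, TF-disjoint) are clustered by cosine similarity; the mixing matrix, noise level and the seeded initialization are illustrative assumptions, not part of the published algorithm.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical mixing: 3 sources, 2 sensors (underdetermined). At an auto-source
# TF point, the mixture vector is proportional to one column of the mixing matrix A.
A = np.array([[1.0, 1.0, 1.0],
              [0.2, 1.0, 3.0]])
A /= np.linalg.norm(A, axis=0)

# Simulate auto-source TF points: each belongs to one source (TF-disjointness).
labels_true = rng.integers(0, 3, size=300)
amplitudes = rng.uniform(0.5, 2.0, size=300)
V = (A[:, labels_true] * amplitudes).T + rng.normal(0.0, 0.01, size=(300, 2))

# Cluster the normalized spatial vectors by cosine similarity (k-means style).
# Initializing from perturbed true columns is a stand-in for an unsupervised
# start with a known number of sources.
V_hat = V / np.linalg.norm(V, axis=1, keepdims=True)
centers = A.T + rng.normal(0.0, 0.05, size=(3, 2))
for _ in range(20):
    labels = np.argmax(V_hat @ centers.T, axis=1)       # assign to nearest direction
    for k in range(3):
        pts = V_hat[labels == k]
        if len(pts):
            centers[k] = pts.mean(axis=0) / np.linalg.norm(pts.mean(axis=0))

accuracy = np.mean(labels == labels_true)
```

In the full algorithm each resulting class provides a TFD estimate of one source, which is then passed to TF synthesis to recover the waveform.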
Single channel blind source separation based on ICA feature extraction
Anonymous
2007-01-01
A new technique is proposed to solve the blind source separation (BSS) problem given only a single channel observation. The basis functions and the density of the coefficients of the source signals learned by ICA are used as prior knowledge. Based on the learned priors, learning rules for single channel BSS are derived by maximizing the joint log likelihood of the mixed sources, in which the posterior density of the given measurements is maximized, to obtain the source signals from the single observation. The experimental results exhibit successful separation performance for mixtures of speech and music signals.
Novel blind source separation algorithm using Gaussian mixture density function
孔薇; 杨杰; 周越
2004-01-01
Blind source separation (BSS) is an important task in numerous applications in signal processing, communications and array processing. For many complex sources, however, blind separation algorithms are not efficient because the probability distribution of the sources cannot be estimated accurately. In this paper, to justify the ME (maximum entropy) approach, the relation between ME and MMI (minimum mutual information) is elucidated first. Then a novel algorithm is presented, based on the ME approach, that uses a Gaussian mixture density to approximate the probability distribution of the sources. An experiment on the BSS of ship-radiated noise demonstrates that the proposed algorithm is valid and efficient.
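Approximating a source density with a Gaussian mixture is typically done with EM. A minimal 1-D sketch on synthetic bimodal data (standing in for something like ship-radiated noise); the data and initialization are invented:

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic bimodal "source" samples.
x = np.concatenate([rng.normal(-2.0, 0.5, 1500), rng.normal(1.0, 0.8, 1500)])

# EM for a 2-component, 1-D Gaussian mixture density.
w = np.array([0.5, 0.5])
mu = np.array([-1.0, 0.5])
var = np.array([1.0, 1.0])
for _ in range(100):
    # E-step: responsibility of each component for each sample
    pdf = w / np.sqrt(2 * np.pi * var) * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var)
    r = pdf / pdf.sum(axis=1, keepdims=True)
    # M-step: re-estimate weights, means and variances
    nk = r.sum(axis=0)
    w = nk / len(x)
    mu = (r * x[:, None]).sum(axis=0) / nk
    var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk

order = np.argsort(mu)   # sort components for stable reporting
```

The fitted mixture density (and its score function) is what a separation algorithm would then plug into its update rule in place of an assumed source distribution.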
Oh, Geok Lian
This PhD study examines the use of seismic technology for the problem of detecting underground facilities, whereby a seismic source such as a sledgehammer is used to generate seismic waves through the ground, sensed by an array of seismic sensors on the ground surface, and recorded by the digital...... device. The concept is similar to the techniques used in exploration seismology, in which explosions (that occur at or below the surface) or vibration wave-fronts generated at the surface reflect and refract off structures at the ground depth, so as to generate the ground profile of the elastic material...... density values of the discretized ground medium, which leads to time-consuming computations and instability behaviour of the inversion process. In addition, the geophysics inverse problem is generally ill-posed due to non-exact forward model that introduces errors. The Bayesian inversion method through...
Using Bayesian Belief Network (BBN) modelling for rapid source term prediction. Final report
Knochenhauer, M.; Swaling, V.H.; Dedda, F.D.; Hansson, F.; Sjoekvist, S.; Sunnegaerd, K. [Lloyd's Register Consulting AB, Sundbyberg (Sweden)]
2013-10-15
The project presented in this report deals with a number of complex issues related to the development of a tool for rapid source term prediction (RASTEP), based on a plant model represented as a Bayesian belief network (BBN) and a source term module used for assigning relevant source terms to BBN end states. Thus, RASTEP uses a BBN to model severe accident progression in a nuclear power plant in combination with pre-calculated source terms (i.e., amount, composition, timing, and release path of released radio-nuclides). The output is a set of possible source terms with associated probabilities. One major issue addressed has been the integration of probabilistic and deterministic analyses, dealing with the challenge of making the source term determination flexible enough to give reliable and valid output throughout the accident scenario. The potential for connecting RASTEP to a fast-running source term prediction code has been explored, as well as alternative ways of improving the deterministic connections of the tool. As part of the investigation, a comparison of two deterministic severe accident analysis codes has been performed. A second important task has been to develop a general method by which experts' beliefs can be included in a systematic way when defining the conditional probability tables (CPTs) in the BBN. Using this iterative method results in a reliable BBN even though expert judgements, with their associated uncertainties, have been used. It also simplifies verification and validation of the considerable amounts of quantitative data included in a BBN. (Author)
Anezaki, Katsunori; Nakano, Takeshi; Kashiwagi, Nobuhisa
2016-01-19
Using the chemical balance method, and considering the presence of unidentified sources, we estimated the origins of PCB contamination in surface sediments of Muroran Port, Japan. It was assumed that these PCBs originated from four types of Kanechlor products (KC300, KC400, KC500, and KC600), combustion and two kinds of pigments (azo and phthalocyanine). The characteristics of these congener patterns were summarized on the basis of principal component analysis and explanatory variables determined. A Bayesian semifactor model (CMBK2) was applied to the explanatory variables to analyze the sources of PCBs in the sediments. The resulting estimates of the contribution ratio of each kind of sediment indicate that the existence of unidentified sources can be ignored and that the assumed seven sources are adequate to account for the contamination. Within the port, the contribution ratio of KC500 and KC600 (used as paints for ship hulls) was extremely high, but outside the port, the influence of azo pigments was observable to a limited degree. This indicates that environmental PCBs not derived from technical PCBs are present at levels that cannot be ignored. PMID:26716388
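The chemical mass balance idea, expressing a sediment congener pattern as a weighted sum of source fingerprints, can be sketched as a least-squares problem. The profiles and contributions below are invented, and this ordinary least-squares treatment omits the Bayesian semifactor structure of CMBK2:

```python
import numpy as np

rng = np.random.default_rng(9)

# Hypothetical congener "fingerprints" for three sources (columns; fractions of
# four congener groups, each column summing to 1). Not real Kanechlor profiles.
profiles = np.array([
    [0.50, 0.10, 0.05],
    [0.30, 0.20, 0.15],
    [0.15, 0.40, 0.30],
    [0.05, 0.30, 0.50],
])
true_contrib = np.array([0.2, 0.5, 0.3])
sediment = profiles @ true_contrib + rng.normal(0.0, 0.002, size=4)

# Mass balance: least-squares source contributions, clipped and renormalized.
contrib, *_ = np.linalg.lstsq(profiles, sediment, rcond=None)
contrib = np.clip(contrib, 0.0, None)
contrib = contrib / contrib.sum()
```

A Bayesian treatment like the paper's additionally quantifies whether unidentified sources are needed; here a poor least-squares fit residual would play that diagnostic role.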
Source separation of household waste: A case study in China
A pilot program concerning source separation of household waste was launched in Hangzhou, capital city of Zhejiang province, China. Detailed investigations on the composition and properties of household waste in the experimental communities revealed that high water content and high percentage of food waste are the main limiting factors in the recovery of recyclables, especially paper from household waste, and the main contributors to the high cost and low efficiency of waste disposal. On the basis of the investigation, a novel source separation method, according to which household waste was classified as food waste, dry waste and harmful waste, was proposed and performed in four selected communities. In addition, a corresponding household waste management system that involves all stakeholders, a recovery system and a mechanical dehydration system for food waste were constituted to promote source separation activity. Performances and the questionnaire survey results showed that the active support and investment of a real estate company and a community residential committee play important roles in enhancing public participation and awareness of the importance of waste source separation. In comparison with the conventional mixed collection and transportation system of household waste, the established source separation and management system is cost-effective. It could be extended to the entire city and used by other cities in China as a source of reference
Separation of synchronous sources through phase locked matrix factorization.
Almeida, Miguel S B; Vigário, Ricardo; Bioucas-Dias, José
2014-10-01
In this paper, we study the separation of synchronous sources (SSS) problem, which deals with the separation of sources whose phases are synchronous. This problem cannot be addressed through independent component analysis methods because synchronous sources are statistically dependent. We present a two-step algorithm, called phase locked matrix factorization (PLMF), to perform SSS. We also show that SSS is identifiable under some assumptions and that any global minimum of PLMF's cost function is a desirable solution for SSS. We extensively study the algorithm on simulated data and conclude that it can perform SSS with various numbers of sources and sensors and with various phase lags between the sources, both in the ideal (i.e., perfectly synchronous and nonnoisy) case and with various levels of additive noise in the observed signals and of phase jitter in the sources. PMID:25291741
Prasad, Sudhakar
2014-01-01
We present an asymptotic analysis of the minimum probability of error (MPE) in inferring the correct hypothesis in a Bayesian multi-hypothesis testing (MHT) formalism using many pixels of data that are corrupted by signal dependent shot noise, sensor read noise, and background illumination. We perform this error analysis for a variety of combined noise and background statistics, including a pseudo-Gaussian distribution that can be employed to treat approximately the photon-counting statistics of signal and background as well as purely Gaussian sensor read-out noise and more general, exponentially peaked distributions. We subsequently apply the MPE asymptotics to characterize the minimum conditions needed to localize a point source in three dimensions by means of a rotating-PSF imager and compare its performance with that of a conventional imager in the presence of background and sensor-noise fluctuations. In a separate paper, we apply the formalism to the related but qualitatively different problem of 2D supe...
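For a single pixel and two hypotheses, the minimum probability of error of the Bayes rule is the prior-weighted overlap of the two count distributions. A small sketch with pure Poisson counts (hypothetical background and signal levels, ignoring the read-noise and pseudo-Gaussian cases treated in the paper):

```python
import math

def pois_pmf(lam, kmax):
    """Poisson pmf for k = 0..kmax-1, computed iteratively to avoid overflow."""
    p = math.exp(-lam)
    out = [p]
    for k in range(1, kmax):
        p *= lam / k
        out.append(p)
    return out

def min_prob_error(b, s, p1=0.5, kmax=100):
    """MPE of the Bayes rule for 'source present' (mean b+s) vs 'absent' (mean b).

    The Bayes decision picks the larger posterior at each count k, so the error
    is the prior-weighted mass of the losing hypothesis, summed over k.
    """
    pa = pois_pmf(b + s, kmax)
    pb = pois_pmf(b, kmax)
    return sum(min(p1 * a, (1.0 - p1) * q) for a, q in zip(pa, pb))

mpe = min_prob_error(4.0, 6.0)   # hypothetical background 4, signal 6 counts
```

As the signal level grows the two distributions separate and the MPE falls toward zero, which is the single-pixel version of the asymptotics the abstract analyzes over many pixels.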
Abban, B.; (Thanos) Papanicolaou, A. N.; Cowles, M. K.; Wilson, C. G.; Abaci, O.; Wacha, K.; Schilling, K.; Schnoebelen, D.
2016-06-01
An enhanced revision of the Fox and Papanicolaou (hereafter referred to as "F-P") (2008a) Bayesian, Markov Chain Monte Carlo fingerprinting framework for estimating sediment source contributions and their associated uncertainties is presented. The F-P framework included two key deterministic parameters, α and β, that, respectively, reflected the spatial origin attributes of sources and the time history of eroded material delivered to and collected at the watershed outlet. However, the deterministic treatment of α and β is limited to cases with well-defined spatial partitioning of sources, high sediment delivery, and relatively short travel times with little variability in transport within the watershed. For event-based studies in intensively managed landscapes, this may be inadequate since landscape heterogeneity results in variabilities in source contributions, their pathways, delivery times, and storage within the watershed. Thus, probabilistic treatments of α and β are implemented in the enhanced framework to account for these variabilities. To evaluate the effects of the treatments of α and β on source partitioning, both frameworks are applied to the South Amana Subwatershed (SASW) in the U.S. midwest. The enhanced framework is found to estimate mean source contributions that are in good agreement with estimates from other studies in SASW. The enhanced framework is also able to produce expected trends in uncertainty during the study period, unlike the F-P framework, which does not perform as expected. Overall, the enhanced framework is found to be less sensitive to changes in α and β than the F-P framework, and, therefore, is more robust and desirable from a management standpoint.
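As a toy illustration of the un-mixing idea behind such fingerprinting frameworks (not the F-P or enhanced framework itself), source proportions can be estimated from tracer signatures by constrained least squares; all tracer values below are hypothetical:

```python
import numpy as np

# Hypothetical tracer signatures: rows = tracers, columns = sources.
A = np.array([[10.0, 2.0],
              [ 3.0, 8.0],
              [ 5.0, 5.0]])
true_p = np.array([0.7, 0.3])           # true source proportions
b = A @ true_p                          # mixture tracer concentrations at the outlet

def unmix(A, b, steps=5000, lr=1e-3):
    """Least-squares unmixing with proportions kept on the simplex."""
    p = np.full(A.shape[1], 1.0 / A.shape[1])
    for _ in range(steps):
        p -= lr * A.T @ (A @ p - b)     # gradient step on ||Ap - b||^2 / 2
        p = np.clip(p, 0.0, None)       # non-negativity
        p /= p.sum()                    # proportions sum to one
    return p

p_hat = unmix(A, b)
```

A full Bayesian treatment, as in the paper, would place priors on the proportions (and on α and β) and sample the posterior rather than return a point estimate.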
Rasheda Arman Chowdhury
Localizing the generators of epileptic activity in the brain using Electro-EncephaloGraphy (EEG) or Magneto-EncephaloGraphy (MEG) signals is of particular interest during the pre-surgical investigation of epilepsy. Epileptic discharges can be detectable from background brain activity, provided they are associated with spatially extended generators. Using realistic simulations of epileptic activity, this study evaluates the ability of distributed source localization methods to accurately estimate the location of the generators and their sensitivity to the spatial extent of such generators when using MEG data. Source localization methods based on two types of realistic models have been investigated: (i) brain activity may be modeled using cortical parcels and (ii) brain activity is assumed to be locally smooth within each parcel. A Data Driven Parcellization (DDP) method was used to segment the cortical surface into non-overlapping parcels and diffusion-based spatial priors were used to model local spatial smoothness within parcels. These models were implemented within the Maximum Entropy on the Mean (MEM) and the Hierarchical Bayesian (HB) source localization frameworks. We proposed new methods in this context and compared them with other standard ones using Monte Carlo simulations of realistic MEG data involving sources of several spatial extents and depths. Detection accuracy of each method was quantified using Receiver Operating Characteristic (ROC) analysis and localization error metrics. Our results showed that methods implemented within the MEM framework were sensitive to all spatial extents of the sources ranging from 3 cm² to 30 cm², whatever the number and size of the parcels defining the model. To reach a similar level of accuracy within the HB framework, a model using parcels larger than the size of the sources should be considered.
Yee, Eugene
2007-04-01
Although a great deal of research effort has been focused on the forward prediction of the dispersion of contaminants (e.g., chemical and biological warfare agents) released into the turbulent atmosphere, much less work has been directed toward the inverse prediction of agent source location and strength from the measured concentration, even though the importance of this problem for a number of practical applications is obvious. In general, the inverse problem of source reconstruction is ill-posed and unsolvable without additional information. It is demonstrated that a Bayesian probabilistic inferential framework provides a natural and logically consistent method for source reconstruction from a limited number of noisy concentration data. In particular, the Bayesian approach permits one to incorporate prior knowledge about the source as well as additional information regarding both model and data errors. The latter enables a rigorous determination of the uncertainty in the inference of the source parameters (e.g., spatial location, emission rate, release time, etc.), hence extending the potential of the methodology as a tool for quantitative source reconstruction. A model (or, source-receptor relationship) that relates the source distribution to the concentration data measured by a number of sensors is formulated, and Bayesian probability theory is used to derive the posterior probability density function of the source parameters. A computationally efficient methodology for determination of the likelihood function for the problem, based on an adjoint representation of the source-receptor relationship, is described. Furthermore, we describe the application of efficient stochastic algorithms based on Markov chain Monte Carlo (MCMC) for sampling from the posterior distribution of the source parameters, the latter of which is required to undertake the Bayesian computation. The Bayesian inferential methodology for source reconstruction is validated against real
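The posterior-sampling step described above can be sketched with a minimal random-walk Metropolis sampler. The Gaussian source-receptor kernel, sensor layout, and noise level below are hypothetical stand-ins for the paper's adjoint-based model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 1-D toy setup: a point source at location x0 with emission rate q,
# observed by 8 sensors through a Gaussian source-receptor kernel plus noise.
sensors = np.linspace(0.0, 10.0, 8)

def forward(x0, q):
    return q * np.exp(-0.5 * (sensors - x0) ** 2)

true_x0, true_q, sigma = 4.0, 2.0, 0.05
data = forward(true_x0, true_q) + rng.normal(0.0, sigma, sensors.size)

def log_post(theta):
    x0, q = theta
    if not (0.0 < x0 < 10.0 and 0.0 < q < 10.0):   # flat priors on a box
        return -np.inf
    r = data - forward(x0, q)
    return -0.5 * np.sum(r ** 2) / sigma ** 2       # Gaussian likelihood

# Random-walk Metropolis over (x0, q).
theta = np.array([5.0, 1.0])
lp = log_post(theta)
samples = []
for _ in range(20000):
    prop = theta + rng.normal(0.0, 0.05, 2)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:        # accept/reject
        theta, lp = prop, lp_prop
    samples.append(theta.copy())

post = np.array(samples[5000:])                     # discard burn-in
x0_hat, q_hat = post.mean(axis=0)
```

The spread of `post` directly quantifies the uncertainty in the inferred source parameters, which is the point the abstract emphasizes.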
Extended Nonnegative Tensor Factorisation Models for Musical Sound Source Separation
Derry FitzGerald
2008-01-01
Recently, shift-invariant tensor factorisation algorithms have been proposed for the purposes of sound source separation of pitched musical instruments. However, in practice, existing algorithms require the use of log-frequency spectrograms to allow shift invariance in frequency, which causes problems when attempting to resynthesise the separated sources. Further, it is difficult to impose harmonicity constraints on the recovered basis functions. This paper proposes a new additive-synthesis-based approach which allows the use of linear-frequency spectrograms as well as imposing strict harmonic constraints, resulting in an improved model. Further, these additional constraints allow the addition of a source-filter model to the factorisation framework, and an extended model which is capable of separating mixtures of pitched and percussive instruments simultaneously.
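As a minimal sketch of the factorisation idea underlying such models, plain Euclidean NMF on a toy magnitude spectrogram (without the shift-invariance or harmonicity constraints of the paper):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy "spectrogram": 2 hypothetical sources with fixed spectra and activations.
F, T, K = 6, 40, 2
W_true = np.abs(rng.normal(size=(F, K)))
H_true = np.abs(rng.normal(size=(K, T)))
V = W_true @ H_true

# Multiplicative updates for Euclidean NMF: V ≈ W H with W, H >= 0.
W = np.abs(rng.normal(size=(F, K))) + 0.1
H = np.abs(rng.normal(size=(K, T))) + 0.1
for _ in range(500):
    H *= (W.T @ V) / (W.T @ W @ H + 1e-12)
    W *= (V @ H.T) / (W @ H @ H.T + 1e-12)

err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
```

The harmonic constraints in the paper would replace the free columns of `W` with sums of harmonically spaced partials, which is what makes resynthesis from linear-frequency spectrograms tractable.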
George, J.S.; Schmidt, D.M.; Wood, C.C.
1999-02-01
We have developed a Bayesian approach to the analysis of neural electromagnetic (MEG/EEG) data that can incorporate or fuse information from other imaging modalities and addresses the ill-posed inverse problem by sampling the many different solutions which could have produced the given data. From these samples one can draw probabilistic inferences about regions of activation. Our source model assumes a variable number of variable-size cortical regions of stimulus-correlated activity. An active region consists of locations on the cortical surface, within a sphere centered on some location in cortex. The number and radii of active regions can vary up to defined maximum values. The goal of the analysis is to determine the posterior probability distribution for the set of parameters that govern the number, location, and extent of active regions. Markov Chain Monte Carlo is used to generate a large sample of sets of parameters distributed according to the posterior distribution. This sample is representative of the many different source distributions that could account for given data, and allows identification of probable (i.e. consistent) features across solutions. Examples of the use of this analysis technique with both simulated and empirical MEG data are presented.
Koldovský, Zbyněk; Tichavský, Petr; Málek, J.
Vol. 6365. Heidelberg : Springer-Verlag, 2010 - (Gavrilova, M.; Kumar, V.; Mun, Y.; Tan, C.; Gervasi, O.), s. 17-24 ISBN 978-3-642-15994-7. [Latent Variable Analysis and Signal Separation. St. Malo (FR), 27.09.2010-30.09.2010] R&D Projects: GA MŠk 1M0572; GA ČR GA102/09/1278 Institutional research plan: CEZ:AV0Z10750506 Keywords : blind source separation * audio * convolutive mixture Subject RIV: BB - Applied Statistics, Operational Research http://library.utia.cas.cz/separaty/2010/SI/tichavsky-time-domain blind audio source separation method producing separating filters of generalized feedforward structure.pdf
Hosseini-zad, K.; Stähler, S. C.; Sigloch, K.; Scheingraber, C.
2012-04-01
Seismic tomography has made giant progress in the last decade. This has been due to improvements in the method, which allow combining the high information content of waveform modeling with the mathematically sound methods of tomographic inversion. The second factor is the vast growth of digitally available broadband seismograms. Both factors together require efficient processing schemes for seismic waveforms, which reduce the necessary manual interaction to a minimum. Since the data growth has mainly taken place in traditionally well-instrumented regions, many areas are still sparsely instrumented, so the processing scheme should treat all data with the highest care. Our processing scheme "No data left behind", which is implemented in Python and incorporated into the seismology package ObsPy, automates the steps for global or regional body wave tomography: 1. Data retrieval: Downloading of event-based seismic waveforms from ORFEUS and IRIS. This way around 1600 stations globally are available. Data from other sources can be added manually. 2. Preprocessing: Deconvolution of instrument responses, recognition of bad recordings and automated correction, if possible. No rejection is done in this stage. 3. Cutting of time windows around body wave phases, decomposition of the signals into 6 frequency bands (20 s to 1 Hz), individual determination of SNR and similarity to synthetic waveforms. 4. Rejection of bad windows. Since the rejection is done based on SNR or CC with synthetics independently for each of the 6 frequency bands, even very noisy stations like ocean islands are not discarded completely. 5. Bayesian source inversion: The source parameters, including depth, CMT and source time function, are determined in a probabilistic way using a wavelet basis and P- and SH-waveforms. The whole algorithm is modular and additional modules (e.g. for OBS preprocessing) can be selected individually.
Separation of core and crustal magnetic field sources
Shure, L.; Parker, R. L.; Langel, R. A.
1985-01-01
Fluid motions in the electrically conducting core and magnetized crustal rocks are the two major sources of the magnetic field observed on or slightly above the Earth's surface. The exact separation of these two contributions is not possible without imposing a priori assumptions about the internal source distribution. Nonetheless, models of this kind have been developed for hundreds of years. Gauss' method, least-squares analysis with a truncated spherical harmonic expansion, was the method of choice for more than 100 years, although Gauss himself addressed the separation of internal versus external sources rather than that of core and crustal ones. Using some arbitrary criterion for an appropriate truncation level, we now extrapolate core field models downward through the (approximately) insulating mantle. Unfortunately, our view can change dramatically depending on the degree of truncation used to describe core sources.
Sparsity and Morphological Diversity in Blind Source Separation
Bobin, Jérome; Starck, Jean-Luc; Fadili, Jalal M.; Moudden, Yassir
2007-01-01
Over the last few years, the development of multichannel sensors motivated interest in methods for the coherent processing of multivariate data. Some specific issues have already been addressed as testified by the wide literature on the so-called blind source separation (BSS) problem. In this context, as clearly emphasized by previous work, it is fundamental that the sources to be retrieved present some quantitatively measurable diversity. Recently, sparsity and morphological diversity have e...
Underdetermined Blind Audio Source Separation Using Modal Decomposition
Abdeldjalil Aïssa-El-Bey; Karim Abed-Meraim; Yves Grenier
2007-01-01
This paper introduces new algorithms for the blind separation of audio sources using modal decomposition. Indeed, audio signals and, in particular, musical signals can be well approximated by a sum of damped sinusoidal (modal) components. Based on this representation, we propose a two-step approach consisting of a signal analysis (extraction of the modal components) followed by a signal synthesis (grouping of the components belonging to the same source) using vector clustering. For the signa...
Jutten, Christian; Karhunen, Juha
2004-10-01
In this paper, we review recent advances in blind source separation (BSS) and independent component analysis (ICA) for nonlinear mixing models. After a general introduction to BSS and ICA, we discuss in more detail uniqueness and separability issues, presenting some new results. A fundamental difficulty in the nonlinear BSS problem and even more so in the nonlinear ICA problem is that they provide non-unique solutions without extra constraints, which are often implemented by using a suitable regularization. In this paper, we explore two possible approaches. The first one is based on structural constraints. Especially, post-nonlinear mixtures are an important special case, where a nonlinearity is applied to linear mixtures. For such mixtures, the ambiguities are essentially the same as for the linear ICA or BSS problems. The second approach uses Bayesian inference methods for estimating the best statistical parameters, under almost unconstrained models in which priors can be easily added. In the later part of this paper, various separation techniques proposed for post-nonlinear mixtures and general nonlinear mixtures are reviewed. PMID:15593377
Underdetermined Blind Audio Source Separation Using Modal Decomposition
Aïssa-El-Bey Abdeldjalil
2007-01-01
This paper introduces new algorithms for the blind separation of audio sources using modal decomposition. Indeed, audio signals and, in particular, musical signals can be well approximated by a sum of damped sinusoidal (modal) components. Based on this representation, we propose a two-step approach consisting of a signal analysis (extraction of the modal components) followed by a signal synthesis (grouping of the components belonging to the same source) using vector clustering. For the signal analysis, two existing algorithms are considered and compared: namely the EMD (empirical mode decomposition) algorithm and a parametric estimation algorithm using the ESPRIT technique. A major advantage of the proposed method resides in its validity for both instantaneous and convolutive mixtures and its ability to separate more sources than sensors. Simulation results are given to compare and assess the performance of the proposed algorithms.
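The modal signal model can be sketched as follows. Mode frequencies and dampings are assumed known here (in the paper they would be estimated, e.g., by ESPRIT), and all parameter values are hypothetical:

```python
import numpy as np

fs = 8000.0
t = np.arange(0, 0.1, 1.0 / fs)

# Modal model: each component is a damped sinusoid a * exp(-d t) * cos(2π f t).
modes = [(440.0, 12.0), (660.0, 20.0)]   # (frequency Hz, damping 1/s), hypothetical

def atom(f, d):
    return np.exp(-d * t) * np.cos(2 * np.pi * f * t)

amps_true = np.array([1.0, 0.5])
x = sum(a * atom(f, d) for a, (f, d) in zip(amps_true, modes))

# With the mode parameters known, the amplitudes follow by least squares.
Phi = np.column_stack([atom(f, d) for f, d in modes])
amps_hat, *_ = np.linalg.lstsq(Phi, x, rcond=None)
```

The separation step in the paper then groups estimated components by source via vector clustering; here only the modal decomposition itself is illustrated.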
How Many Separable Sources? Model Selection In Independent Components Analysis
Woods, Roger P.; Hansen, Lars Kai; Strother, Stephen
2015-01-01
Unlike mixtures consisting solely of non-Gaussian sources, mixtures including two or more Gaussian components cannot be separated using standard independent components analysis methods that are based on higher order statistics and independent observations. The mixed Independent Components Analysis...... computationally intensive alternative for model selection. Application of the algorithm is illustrated using Fisher’s iris data set and Howells’ craniometric data set. Mixed ICA/PCA is of potential interest in any field of scientific investigation where the authenticity of blindly separated non-Gaussian sources...... might otherwise be questionable. Failure of the Akaike Information Criterion in model selection also has relevance in traditional independent components analysis where all sources are assumed non-Gaussian....
Troldborg, M.; Nowak, W.; Binning, P. J.; Bjerg, P. L.
2012-12-01
Estimates of mass discharge (mass/time) are increasingly being used when assessing risks of groundwater contamination and designing remedial systems at contaminated sites. Mass discharge estimates are, however, prone to rather large uncertainties as they integrate uncertain spatial distributions of both concentration and groundwater flow velocities. For risk assessments or any other decisions that are being based on mass discharge estimates, it is essential to address these uncertainties. We present a novel Bayesian geostatistical approach for quantifying the uncertainty of the mass discharge across a multilevel control plane. The method decouples the flow and transport simulation and has the advantage of avoiding the heavy computational burden of three-dimensional numerical flow and transport simulation coupled with geostatistical inversion. It may therefore be of practical relevance to practitioners compared to existing methods that are either too simple or computationally demanding. The method is based on conditional geostatistical simulation and accounts for i) heterogeneity of both the flow field and the concentration distribution through Bayesian geostatistics (including the uncertainty in covariance functions), ii) measurement uncertainty, and iii) uncertain source zone geometry and transport parameters. The method generates multiple equally likely realizations of the spatial flow and concentration distribution, which all honour the measured data at the control plane. The flow realizations are generated by analytical co-simulation of the hydraulic conductivity and the hydraulic gradient across the control plane. These realizations are made consistent with measurements of both hydraulic conductivity and head at the site. An analytical macro-dispersive transport solution is employed to simulate the mean concentration distribution across the control plane, and a geostatistical model of the Box-Cox transformed concentration data is used to simulate observed
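A heavily simplified sketch of the underlying computation: integrating uncertain flux and concentration over a discretised control plane by plain Monte Carlo (no geostatistical correlation or conditioning; all distributions are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(2)

# Control plane discretised into cells of area dA; Darcy flux q and concentration c
# are uncertain in each cell (hypothetical lognormal marginals, no spatial correlation).
n_cells, dA, n_mc = 50, 0.25, 5000      # cells, m^2 per cell, Monte Carlo draws
q = rng.lognormal(mean=np.log(0.1), sigma=0.3, size=(n_mc, n_cells))  # m/day
c = rng.lognormal(mean=np.log(5.0), sigma=0.5, size=(n_mc, n_cells))  # g/m^3

# Mass discharge per realization: sum of q * c * dA over the plane (g/day).
md = (q * c * dA).sum(axis=1)
md_mean, md_p95 = md.mean(), np.percentile(md, 95)
```

The paper's contribution is precisely what this sketch omits: conditioning the `q` and `c` fields on measured data through Bayesian geostatistics so that every realization honours the observations at the control plane.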
A well-tested and validated Gibbs sampling code, that performs component separation and cosmic microwave background (CMB) power spectrum estimation, was applied to the WMAP five-year data. Using a simple model consisting of CMB, noise, monopoles, and dipoles, a 'per pixel' low-frequency power-law (fitting for both amplitude and spectral index), and a thermal dust template with a fixed spectral index, we found that the low-l (l < 50) CMB power spectrum is in good agreement with the published WMAP5 results. Residual monopoles and dipoles were found to be small (∼<3 μK) or negligible in the five-year data. We comprehensively tested the assumptions that were made about the foregrounds (e.g., dust spectral index, power-law spectral index prior, templates), and found that the CMB power spectrum was insensitive to these choices. We confirm the asymmetry of power between the north and south ecliptic hemispheres, which appears to be robust against foreground modeling. The map of low-frequency spectral indices indicates a steeper spectrum on average (β = -2.97 ± 0.21) relative to those found at low (∼GHz) frequencies.
Jank, Anna; Müller, Wolfgang; Schneider, Irene; Gerke, Frederic; Bockreis, Anke
2015-05-01
An efficient biological treatment of source separated organic waste from household kitchens and gardens (biowaste) requires an adequate upfront mechanical preparation which possibly includes a hand sorting for the separation of contaminants. In this work untreated biowaste from households and gardens and the screen overflow >60mm of the same waste were mechanically treated by a Waste Separation Press (WSP). The WSP separates the waste into a wet fraction for biological treatment and a fraction of dry contaminants for incineration. The results show that it is possible to replace a hand sorting of contaminants, the milling and a screening of organic waste before the biological treatment by using the WSP. A special focus was put on the contaminants separation. The separation of plastic film from the untreated biowaste was 67% and the separation rate of glass was about 92%. About 90% of the organics were transferred to the fraction for further biological treatment. When treating the screen overflow >60mm with the WSP 86% of the plastic film and 88% of the glass were transferred to the contaminants fraction. 32% of the organic was transferred to the contaminants fraction and thereby lost for a further biological treatment. Additionally it was calculated that national standards for glass contaminants in compost can be met when using the WSP to mechanically treat the total biowaste. The loss of biogas by transferring biodegradable organics to the contaminants fraction was about 11% when preparing the untreated biowaste with the WSP. PMID:25761398
FREQUENCY OVERLAPPED SIGNAL IDENTIFICATION USING BLIND SOURCE SEPARATION
WANG Junfeng; SHI Tielin; HE Lingsong; YANG Shuzi
2006-01-01
The concepts, principles and usages of principal component analysis (PCA) and independent component analysis (ICA) are interpreted. Then the algorithm and methodology of ICA-based blind source separation (BSS), in which PCA-based pre-whitening of the observed signals is used, are investigated. Aiming at mixture signals whose frequency components overlap each other, a simulation of separating this type of mixture using the theory and approach of BSS has been carried out. The result shows that BSS has some advantages that the traditional methodology of frequency analysis does not.
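The PCA pre-whitening step mentioned above can be sketched as follows (a generic whitening computation on a hypothetical two-channel mixture, not the paper's specific implementation):

```python
import numpy as np

# Two hypothetical sources with overlapping frequency content, linearly mixed.
t = np.linspace(0.0, 1.0, 4000)
S = np.vstack([np.sin(2 * np.pi * 50 * t),
               np.sign(np.sin(2 * np.pi * 53 * t))])
A = np.array([[1.0, 0.6], [0.4, 1.0]])
X = A @ S

# PCA pre-whitening: V = D^{-1/2} E^T from the eigendecomposition of cov(X).
Xc = X - X.mean(axis=1, keepdims=True)
C = Xc @ Xc.T / Xc.shape[1]
d, E = np.linalg.eigh(C)
V = np.diag(1.0 / np.sqrt(d)) @ E.T
Z = V @ Xc                                # whitened mixtures

I_hat = Z @ Z.T / Z.shape[1]              # should be close to the identity
```

After whitening, the remaining unknown in the ICA stage is only an orthogonal rotation, which is what makes the pre-whitening worthwhile.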
Source separation and clustering of phase-locked subspaces.
Almeida, Miguel; Schleimer, Jan-Hendrik; Bioucas-Dias, José Mario; Vigário, Ricardo
2011-09-01
It has been proven that there are synchrony (or phase-locking) phenomena present in multiple oscillating systems such as electrical circuits, lasers, chemical reactions, and human neurons. If the measurements of these systems cannot detect the individual oscillators but rather a superposition of them, as in brain electrophysiological signals (electro- and magnetoencephalogram), spurious phase locking will be detected. Current source-extraction techniques attempt to undo this superposition by assuming properties of the data that are not valid when the underlying sources are phase-locked. Statistical independence of the sources is one such invalid assumption, as phase-locked sources are dependent. In this paper, we introduce methods for source separation and clustering which make adequate assumptions for data where synchrony is present, and show with simulated data that they perform well even in cases where independent component analysis and other well-known source-separation methods fail. The results in this paper provide a proof of concept that synchrony-based techniques are useful for low-noise applications. PMID:21791409
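A minimal sketch of a synchrony measure commonly used in this setting, the phase-locking value (PLV). The phases below are constructed directly as a hypothetical example; for real signals they would typically come from an analytic-signal (Hilbert) transform:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 5000
t = np.linspace(0.0, 5.0, n)

# Instantaneous phases: phi2 tracks phi1 up to jitter (phase-locked pair),
# while phi3 drifts at a different rate (unlocked).
phi1 = 2 * np.pi * 10 * t
phi2 = phi1 + 0.5 + 0.1 * rng.normal(size=n)       # locked, small jitter
phi3 = 2 * np.pi * 11.3 * t + rng.normal(size=n)   # unlocked

def plv(pa, pb):
    """Phase-locking value: |mean of exp(i (pa - pb))|; 1 = perfect locking."""
    return np.abs(np.mean(np.exp(1j * (pa - pb))))

plv_locked = plv(phi1, phi2)
plv_unlocked = plv(phi1, phi3)
```

A PLV near 1 for superposed measurements is exactly the "spurious phase locking" the abstract warns about when sources mix at the sensors.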
A source separation approach to enhancing marine mammal vocalizations.
Gur, M Berke; Niezrecki, Christopher
2009-12-01
A common problem in passive acoustic based marine mammal monitoring is the contamination of vocalizations by a noise source, such as a surface vessel. The conventional approach in improving the vocalization signal to noise ratio (SNR) is to suppress the unwanted noise sources by beamforming the measurements made using an array. In this paper, an alternative approach to multi-channel underwater signal enhancement is proposed. Specifically, a blind source separation algorithm that extracts the vocalization signal from two-channel noisy measurements is derived and implemented. The proposed algorithm uses a robust decorrelation criterion to separate the vocalization from background noise, and hence is suitable for low SNR measurements. To overcome the convergence limitations resulting from temporally correlated recordings, the supervised affine projection filter update rule is adapted to the unsupervised source separation framework. The proposed method is evaluated using real West Indian manatee (Trichechus manatus latirostris) vocalizations and watercraft emitted noise measurements made within a typical manatee habitat in Florida. The results suggest that the proposed algorithm can improve the detection range of a passive acoustic detector five times on average (for input SNR between -10 and 5 dB) using only two receivers. PMID:20000920
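As a simple illustration of decorrelation-based separation (an AMUSE-style second-order method exploiting temporal correlation, not the authors' affine-projection algorithm):

```python
import numpy as np

rng = np.random.default_rng(5)
n = 20000

# Two temporally correlated sources (AR(1) with different coefficients), mixed.
def ar1(a):
    x = np.zeros(n)
    e = rng.normal(size=n)
    for i in range(1, n):
        x[i] = a * x[i - 1] + e[i]
    return x

S = np.vstack([ar1(0.9), ar1(-0.5)])
A = np.array([[1.0, 0.7], [0.5, 1.0]])
X = A @ S

# AMUSE: whiten, then diagonalise a symmetrised lag-1 covariance.
Xc = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(Xc @ Xc.T / n)
V = np.diag(d ** -0.5) @ E.T
Z = V @ Xc
C1 = Z[:, 1:] @ Z[:, :-1].T / (n - 1)
_, U = np.linalg.eigh(0.5 * (C1 + C1.T))
Y = U.T @ Z                               # recovered sources (up to order/sign/scale)

# Correlation of each output with its best-matching source.
corr = np.abs(np.corrcoef(np.vstack([Y, S]))[:2, 2:])
score = max(corr[0, 0] + corr[1, 1], corr[0, 1] + corr[1, 0]) / 2
```

Like the paper's criterion, this relies only on second-order (decorrelation) statistics, which is what makes such methods usable at low SNR.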
An autonomous surveillance system for blind sources localization and separation
Wu, Sean; Kulkarni, Raghavendra; Duraiswamy, Srikanth
2013-05-01
This paper aims at developing a new technology that will enable one to conduct autonomous and silent surveillance to monitor sound sources, stationary or moving in 3D space, and a blind separation of target acoustic signals. The underlying principle of this technology is a hybrid approach that uses: 1) a passive sonic detection and ranging method that consists of iterative triangulation and redundant checking to locate the Cartesian coordinates of arbitrary sound sources in 3D space, 2) advanced signal processing to sanitize the measured data and enhance the signal-to-noise ratio, and 3) short-time source localization and separation to extract the target acoustic signals from the directly measured mixed ones. A prototype based on this technology has been developed and its hardware includes six B&K 1/4-in condenser microphones, Type 4935, two 4-channel data acquisition units, Type NI-9234, with a maximum sampling rate of 51.2 kS/s per channel, one NI-cDAQ 9174 chassis, a thermometer to measure the air temperature, a camera to view the relative positions of located sources, and a laptop to control data acquisition and post-processing. Test results for locating arbitrary sound sources emitting continuous, random, impulsive, and transient signals, and for blind separation of signals in various non-ideal environments, are presented. This system is invisible to any anti-surveillance device since it uses the acoustic signal emitted by a target source. It can be mounted on a robot or an unmanned vehicle to perform various covert operations, including intelligence gathering in an open or a confined field, or to carry out rescue missions to search for people trapped inside ruins or buried under wreckage.
Evaluating source separation of plastic waste using conjoint analysis.
Nakatani, Jun; Aramaki, Toshiya; Hanaki, Keisuke
2008-11-01
Using conjoint analysis, we estimated the willingness to pay (WTP) of households for source separation of plastic waste and the improvement of related environmental impacts, the residents' loss of life expectancy (LLE), the landfill capacity, and the CO2 emissions. Unreliable respondents were identified and removed from the sample based on their answers to follow-up questions. It was found that the utility associated with reducing LLE and with the landfill capacity were both well expressed by logarithmic functions, but that residents were indifferent to the level of CO2 emissions even though they approved of CO2 reduction. In addition, residents derived utility from the act of separating plastic waste, irrespective of its environmental impacts; that is, they were willing to practice the separation of plastic waste at home in anticipation of its "invisible effects", such as the improvement of citizens' attitudes toward solid waste issues. PMID:18207727
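In choice-based conjoint analysis, the WTP for an attribute change is commonly computed as the ratio of the utility difference to the (negative) cost coefficient. A minimal sketch with a logarithmic utility, as the abstract reports for LLE; the coefficient values below are purely hypothetical:

```python
import math

# Hypothetical estimated utility coefficients from a conjoint (logit) model:
# U = beta_lle * log(LLE reduction) + beta_cost * cost  (cost in currency units)
beta_lle, beta_cost = 0.8, -0.002

def wtp_for_reduction(r1, r2):
    """WTP for improving the LLE reduction from r1 to r2 under log utility."""
    return (beta_lle * (math.log(r2) - math.log(r1))) / (-beta_cost)

wtp = wtp_for_reduction(10.0, 20.0)       # WTP for doubling the reduction
```

The logarithmic form implies diminishing marginal WTP, consistent with the paper's finding that utility in LLE reduction and landfill capacity is well expressed by logarithmic functions.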
Blind source separation advances in theory, algorithms and applications
Wang, Wenwu
2014-01-01
Blind Source Separation intends to report the new results of the efforts on the study of Blind Source Separation (BSS). The book collects novel research ideas and some training in BSS, independent component analysis (ICA), artificial intelligence and signal processing applications. Furthermore, the research results previously scattered in many journals and conferences worldwide are methodically edited and presented in a unified form. The book is likely to be of interest to university researchers, R&D engineers and graduate students in computer science and electronics who wish to learn the core principles, methods, algorithms, and applications of BSS. Dr. Ganesh R. Naik works at University of Technology, Sydney, Australia; Dr. Wenwu Wang works at University of Surrey, UK.
Blind source separation for robot audition using fixed HRTF beamforming
Maazaoui, Mounira; Abed-Meraim, Karim; Grenier, Yves
2012-12-01
In this article, we present a two-stage blind source separation (BSS) algorithm for robot audition. The first stage consists in a fixed beamforming preprocessing to reduce the reverberation and the environmental noise. Since we are in a robot audition context, the manifold of the sensor array in this case is hard to model due to the presence of the head of the robot, so we use pre-measured head related transfer functions (HRTFs) to estimate the beamforming filters. The use of the HRTF to estimate the beamformers allows capturing the effect of the head on the manifold of the microphone array. The second stage is a BSS algorithm based on a sparsity criterion, namely the minimization of the l1-norm of the sources. We present different configurations of our algorithm and show that it has promising results and that the fixed beamforming preprocessing improves the separation results.
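A minimal sketch of an l1 sparsity criterion for the separation stage: after whitening a hypothetical instantaneous two-channel mixture, search over rotations for the unmixing that minimizes the l1-norm of the outputs (a simpler stand-in for the authors' algorithm, without the beamforming stage):

```python
import numpy as np

rng = np.random.default_rng(6)
n = 10000

# Sparse (Laplacian) sources, instantaneous mixture.
S = rng.laplace(size=(2, n))
A = np.array([[1.0, 0.8], [0.3, 1.0]])
X = A @ S

# Whiten, then pick the rotation minimising the l1-norm of the outputs.
Xc = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(Xc @ Xc.T / n)
Z = np.diag(d ** -0.5) @ E.T @ Xc

def l1_of_rotation(theta):
    c, s = np.cos(theta), np.sin(theta)
    return np.abs(np.array([[c, -s], [s, c]]) @ Z).sum()

thetas = np.linspace(0.0, np.pi / 2, 181)   # l1 criterion is pi/2-periodic
best = min(thetas, key=l1_of_rotation)
c, s = np.cos(best), np.sin(best)
Y = np.array([[c, -s], [s, c]]) @ Z         # recovered sources (up to order/sign)

corr = np.abs(np.corrcoef(np.vstack([Y, S]))[:2, 2:])
score = max(corr[0, 0] + corr[1, 1], corr[0, 1] + corr[1, 0]) / 2
```

For super-Gaussian (sparse) sources, the l1-norm of the whitened outputs is minimized when they align with the source axes, which is the intuition behind sparsity-based BSS criteria.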
Blidholm, O.; Wiklund, S.E. [AaF-Energikonsult (Sweden)]; Bauer, A.C. [Energikonsult A. Bauer (Sweden)]
1997-02-01
The basic idea of this project is to study the possibilities of using source-separated combustible material for energy conversion in conventional solid fuel boilers (i.e. not municipal waste incineration plants). The project has been carried out in three phases. During phases 1 and 2 a number of fuel analyses of different fractions were carried out. During phase 3 two combustion tests were carried out: (1) in a grate-fired boiler equipped with a cyclone, an electrostatic precipitator and a flue gas condenser, and (2) in a bubbling fluidized bed boiler with an electrostatic precipitator and a flue gas condenser. During the tests, source-separated paper and plastic packaging was co-fired with biomass fuels. The mixing rate of packaging was approximately 15%. This study reports the results of phase 3 and the conclusions of the whole project. The technical conditions for using packaging as fuel are good. The technology is available for shredding both paper and plastic packaging, and the material can be co-fired with biomass. The economic conditions for using source-separated packaging for energy conversion can be very advantageous, but can also present obstacles; the outcome depends to a high degree on how the fuel is collected, transported, reduced in size and handled at the combustion plant. The results of the combustion tests show that the environmental conditions for using source-separated packaging for energy conversion are good. The emissions of heavy metals into the atmosphere are very low, well below the emission standards for waste incineration plants. 35 figs, 13 tabs, 8 appendices
Meillier, Céline; Chatelain, Florent; Michel, Olivier; Bacon, Roland; Piqueras, Laure; Bacher, Raphael; Ayasso, Hacheme
2016-04-01
We present SELFI, the Source Emission Line FInder, a new Bayesian method optimized for detection of faint galaxies in Multi Unit Spectroscopic Explorer (MUSE) deep fields. MUSE is the new panoramic integral field spectrograph at the Very Large Telescope (VLT) that has unique capabilities for spectroscopic investigation of the deep sky. It has provided data cubes with 324 million voxels over a single 1 arcmin² field of view. To address the challenge of faint-galaxy detection in these large data cubes, we developed a new method that processes 3D data either for modeling or for estimation and extraction of source configurations. This object-based approach yields a natural sparse representation of the sources in massive data fields, such as MUSE data cubes. In the Bayesian framework, the parameters that describe the observed sources are considered random variables. The Bayesian model leads to a general and robust algorithm where the parameters are estimated in a fully data-driven way. This detection algorithm was applied to the MUSE observation of Hubble Deep Field-South. With 27 h total integration time, these observations provide a catalog of 189 sources of various categories with secure redshifts. The algorithm retrieved 91% of the galaxies with only 9% false detections. This method also allowed the discovery of three new Lyα emitters and one [OII] emitter, all without any Hubble Space Telescope counterpart. We analyzed the reasons for failure for some targets, and found that the most important limitation of the method arises when faint sources are located in the vicinity of bright spatially resolved galaxies that cannot be approximated by the Sérsic elliptical profile. The software and its documentation are available on the MUSE science web service (muse-vlt.eu/science).
Acoustic emission (AE) is a well-established nondestructive testing method for assessing the condition of liquid-filled tanks. Often the tank can be tested without the need for accurate location of AE sources. But sometimes accurate location is required, such as in follow-up inspections after AE has indicated a significant defect. Traditional computed location techniques that considered only the wave traveling through the shell of the tank have not proved reliable when applied to liquid-filled tanks. This is because AE sensors often respond to liquid-borne waves, which are not considered in the traditional algorithms. This paper describes an approach for locating AE sources on the wall of liquid-filled tanks that includes two novel aspects: (i) the use of liquid-borne waves, and (ii) the use of a probabilistic algorithm. The proposed algorithm is developed within a Bayesian framework that considers uncertainties in the wave velocities and the times of arrival. A Markov chain Monte Carlo method is used to estimate the distribution of the AE source location. This approach was applied to a 102 inch diameter (29 000 gal) railroad tank car by estimating the source locations of pencil lead breaks from recorded waveforms. Results show that the proposed Bayesian approach for source location can be used to calculate the most probable region of the tank wall where the AE source is located. (paper)
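The Bayesian location idea in the entry above can be sketched in a few lines. This is a minimal sketch, not the paper's implementation: the sensor layout, noise level, and priors below are invented for illustration, and a flat 2-D wall stands in for the tank geometry. A random-walk Metropolis sampler targets the posterior over source position and wave speed given noisy arrival-time differences:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 2-D "unrolled" wall geometry (metres); not the paper's setup.
sensors = np.array([[0.0, 0.0], [4.0, 0.0], [0.0, 3.0], [4.0, 3.0]])
true_src = np.array([1.2, 2.1])
v_nom = 1500.0            # nominal wave speed (m/s), treated as uncertain
sigma_t = 1e-5            # arrival-time noise (s)

# Synthetic arrival times; the emission time is unknown, so we use differences
toa = np.linalg.norm(sensors - true_src, axis=1) / v_nom
toa = toa + rng.normal(0.0, sigma_t, size=toa.shape)

def log_post(src, v):
    """Gaussian likelihood on time-of-arrival differences, flat prior on
    position, log-normal prior on wave speed around the nominal value."""
    if v <= 0.0:
        return -np.inf
    pred = np.linalg.norm(sensors - src, axis=1) / v
    resid = (toa - toa[0]) - (pred - pred[0])
    return (-0.5 * np.sum(resid ** 2) / sigma_t ** 2
            - 0.5 * ((np.log(v) - np.log(v_nom)) / 0.1) ** 2)

# Random-walk Metropolis over (x, y, v)
state = np.array([2.0, 1.5, 1400.0])
cur = log_post(state[:2], state[2])
chain = []
for _ in range(20000):
    prop = state + rng.normal(0.0, [0.05, 0.05, 10.0])
    lp = log_post(prop[:2], prop[2])
    if np.log(rng.uniform()) < lp - cur:
        state, cur = prop, lp
    chain.append(state.copy())
chain = np.array(chain[10000:])        # discard burn-in
src_est = chain[:, :2].mean(axis=0)    # posterior-mean source location
```

Working with arrival-time differences removes the unknown emission time, and treating the wave speed as a sampled parameter is the simplest way to express the velocity uncertainty the paper emphasizes.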
Defining the force between separated sources on a light front
The Newtonian character of gauge theories on a light front requires that the longitudinal momentum P+, which plays the role of Newtonian mass, be conserved. This requirement conflicts with the standard definition of the force between two sources in terms of the minimal energy of quantum gauge fields in the presence of a quark and an anti-quark pinned to points separated by a distance R. We propose that, on a light front, the force be defined by minimizing the energy of gauge fields in the presence of a quark and an anti-quark pinned to lines (1-branes) oriented in the longitudinal direction singled out by the light front and separated by a transverse distance R. Such sources will have a limited 1+1 dimensional dynamics. We study this proposal for weak-coupling gauge theories by showing how it leads to the Coulomb force law. For QCD we also show how asymptotic freedom emerges by evaluating the S matrix through one loop for the scattering of a particle in the Nc representation of color SU(Nc) on a 1-brane by a particle in the conjugate N̄c representation of color on a parallel 1-brane separated from the first by a transverse distance R. Potential applications to the problem of confinement on a light front are discussed. copyright 1999 The American Physical Society
To identify different NO3− sources in surface water and to estimate their proportional contributions to the nitrate mixture in surface water, a dual isotope approach and a Bayesian isotope mixing model were applied to six different surface waters affected by agriculture, greenhouses in an agricultural area, and households. Annual mean δ15N–NO3− values were between 8.0 and 19.4‰, while annual mean δ18O–NO3− values ranged from 4.5 to 30.7‰. SIAR was used to estimate the proportional contributions of five potential NO3− sources (NO3− in precipitation, NO3− fertilizer, NH4+ in fertilizer and rain, soil N, and manure and sewage). SIAR showed that “manure and sewage” contributed the most, “soil N”, “NO3− fertilizer” and “NH4+ in fertilizer and rain” contributed intermediate amounts, and “NO3− in precipitation” contributed the least. The SIAR output can be considered a “fingerprint” of the NO3− source contributions. However, the wide range of isotope values observed in surface water and in the NO3− sources limits its applicability. - Highlights: ► The dual isotope approach (δ15N- and δ18O–NO3−) identifies the dominant nitrate sources in 6 surface waters. ► The SIAR model estimates proportional contributions for 5 nitrate sources. ► SIAR is a reliable approach to assess temporal and spatial variations of different NO3− sources. ► The wide range of isotope values observed in surface water and in the nitrate sources limits its applicability. - This paper successfully applied a dual isotope approach and a Bayesian isotopic mixing model to identify and quantify 5 potential nitrate sources in surface water.
Blind Separation of Piecewise Stationary Non-Gaussian Sources
Koldovský, Zbyněk; Málek, J.; Tichavský, Petr; Deville, Y.; Hosseini, S.
2009-01-01
Vol. 89, No. 12 (2009), pp. 2570-2584. ISSN 0165-1684 R&D Projects: GA MŠk 1M0572; GA ČR GA102/09/1278 Other grants: GA ČR (CZ) GA102/07/P384 Institutional research plan: CEZ:AV0Z10750506 Keywords: Independent component analysis * blind source separation * Cramér-Rao lower bound Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 1.135, year: 2009 http://library.utia.cas.cz/separaty/2009/SI/tichavsky-blind separationofpiecewisestationarynon-gaussiansources.pdf
Blind Source Separation for Robot Audition using fixed HRTF beamforming
Maazaoui, Mounira; Abed-Meraim, Karim; Grenier, Yves
2012-01-01
In this article, we present a two-stage blind source separation (BSS) algorithm for robot audition. The first stage consists of a fixed beamforming preprocessing step to reduce reverberation and environmental noise. Because we are in a robot audition context, the manifold of the sensor array is hard to model owing to the presence of the robot's head, so we use pre-measured head related transfer functions (HRTFs) to estimate the beamforming filters. The use of the HRTF to esti...
Pires, Sara Monteiro; Hald, Tine
2010-01-01
Salmonella is a major cause of human gastroenteritis worldwide. To prioritize interventions and assess the effectiveness of efforts to reduce illness, it is important to attribute salmonellosis to the responsible sources. Studies have suggested that some Salmonella subtypes have a higher health impact than others. Likewise, some food sources appear to have a higher impact than others. Knowledge of variability in the impact of subtypes and sources may provide valuable added information for research, risk management, and public health strategies. We developed a Bayesian model that attributes... disease. These differences presumably represent multiple factors, such as differences in survivability through the food chain and/or pathogenicity. The relative importance of the source-dependent factors varied considerably over the years, reflecting, among others, variability in the surveillance programs...
The Leuven isotope separator on-line laser ion source
An element-selective laser ion source has been used to produce beams of exotic radioactive nuclei and to study their decay properties. The operational principle of the ion source is based on selective resonant laser ionization of nuclear reaction products thermalized and neutralized in a noble gas at high pressure. The ion source has been installed at the Leuven Isotope Separator On-Line (LISOL), which is coupled on-line to the cyclotron accelerator at Louvain-la-Neuve. 54,55Ni and 54,55Co isotopes were produced in light-ion-induced fusion reactions. Exotic nickel, cobalt and copper nuclei were produced in proton-induced fission of 238U. The β decay of the 68-74Ni, 67-70Co, 70-75Cu and 110-114Rh isotopes has been studied by means of β-γ and γ-γ spectroscopy. Recently, the laser ion source has been used to produce neutron-deficient rhodium and ruthenium isotopes (91-95Rh, 98Rh, 90,91Ru) near the N=Z line in heavy ion-induced fusion reactions
Compressive Source Separation: Theory and Methods for Hyperspectral Imaging
Golbabaee, Mohammad; Arberet, Simon; Vandergheynst, Pierre
2013-12-01
With the development of numerous high-resolution data acquisition systems and the global requirement to lower energy consumption, the development of efficient sensing techniques becomes critical. Recently, Compressed Sampling (CS) techniques, which exploit the sparsity of signals, have made it possible to reconstruct signals and images with fewer measurements than the traditional Nyquist sensing approach requires. However, multichannel signals like hyperspectral images (HSI) have additional structure, such as inter-channel correlations, that is not taken into account in the classical CS scheme. In this paper we exploit the linear mixture of sources model, that is, the assumption that the multichannel signal is composed of a linear combination of sources, each having its own spectral signature, and propose new sampling schemes exploiting this model to considerably decrease the number of measurements needed for acquisition and source separation. Moreover, we give theoretical lower bounds on the number of measurements required to reconstruct both the multichannel signal and its sources. We also propose optimization algorithms and report extensive experiments on our target application, HSI, and show that our approach recovers HSI with far fewer measurements and less computational effort than traditional CS approaches.
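The linear mixture of sources model that this entry exploits can be illustrated in a few lines. The sizes and random signatures below are made up for illustration; real hyperspectral unmixing must also estimate the signatures, which this toy assumes known:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical sizes: 64 spectral channels, 3 source materials, 500 pixels
n_channels, n_sources, n_pixels = 64, 3, 500
M = rng.random((n_channels, n_sources))   # spectral signature of each source
S = rng.random((n_sources, n_pixels))     # per-pixel source abundances
X = M @ S                                 # observed hyperspectral cube (flattened)

# The mixture has rank n_sources, far below n_channels: this low-dimensional
# structure is what lets a sampling scheme get away with fewer measurements.
rank = np.linalg.matrix_rank(X)

# With known signatures, the sources are recovered by least squares.
S_hat, *_ = np.linalg.lstsq(M, X, rcond=None)
```

The point of the rank computation is that every pixel's 64-channel spectrum lives in a 3-dimensional subspace, so far fewer than 64 measurements per pixel suffice in principle.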
The Leuven isotope separator on-line laser ion source
Kudryavtsev, Y; Franchoo, S; Huyse, M; Gentens, J; Kruglov, K; Müller, W F; Prasad, N V S; Raabe, R; Reusen, I; Van den Bergh, P; Van Duppen, P; Van Roosbroeck, J; Vermeeren, L; Weissman, L
2002-01-01
An element-selective laser ion source has been used to produce beams of exotic radioactive nuclei and to study their decay properties. The operational principle of the ion source is based on selective resonant laser ionization of nuclear reaction products thermalized and neutralized in a noble gas at high pressure. The ion source has been installed at the Leuven Isotope Separator On-Line (LISOL), which is coupled on-line to the cyclotron accelerator at Louvain-la-Neuve. 54,55Ni and 54,55Co isotopes were produced in light-ion-induced fusion reactions. Exotic nickel, cobalt and copper nuclei were produced in proton-induced fission of 238U. The β decay of the 68-74Ni, 67-70Co, 70-75Cu and 110-114Rh isotopes has been studied by means of beta-gamma and gamma-gamma spectroscopy. Recently, the laser ion source has been used to produce neutron-d...
Bayesian mixture models for Poisson astronomical images
Guglielmetti, Fabrizia; Fischer, Rainer; Dose, Volker
2012-01-01
Astronomical images in the Poisson regime are typically characterized by a spatially varying cosmic background, a large variety of source morphologies and intensities, data incompleteness, steep gradients in the data, and few photon counts per pixel. The Background-Source separation technique is developed with the aim of detecting faint and extended sources in astronomical images characterized by Poisson statistics. The technique employs Bayesian mixture models to reliably detect the background as...
Exploiting Narrowband Efficiency for Broadband Convolutive Blind Source Separation
Aichner Robert
2007-01-01
Based on a recently presented generic broadband blind source separation (BSS) algorithm for convolutive mixtures, we propose in this paper a novel algorithm combining the advantages of broadband algorithms with the computational efficiency of narrowband techniques. By selective application of the Szegö theorem, which relates properties of Toeplitz and circulant matrices, a new normalization is derived as a special case of the generic broadband algorithm. This results in a computationally efficient and fast-converging algorithm that avoids typical narrowband problems such as the internal permutation problem and circularity effects. Moreover, a novel regularization method for the generic broadband algorithm is proposed and subsequently also derived for the proposed algorithm. Experimental results in realistic acoustic environments show improved performance of the novel algorithm compared to previous approximations.
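The Toeplitz/circulant connection behind the narrowband efficiency rests on a standard fact: a circulant matrix is diagonalized by the DFT, so multiplying by it costs O(n log n) via the FFT instead of O(n²). A toy numerical check of that identity (not the paper's normalization, whose derivation is more involved):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 256
c = rng.standard_normal(n)        # first column of a circulant matrix
x = rng.standard_normal(n)

# Dense circulant matrix: column j is c cyclically shifted down by j
C = np.empty((n, n))
for j in range(n):
    C[:, j] = np.roll(c, j)

# O(n^2) dense product versus O(n log n) product via DFT diagonalization:
# C = F^{-1} diag(fft(c)) F, so C @ x is a circular convolution.
y_dense = C @ x
y_fft = np.fft.ifft(np.fft.fft(c) * np.fft.fft(x)).real
```

The Szegö theorem extends this: long Toeplitz (convolution) matrices behave asymptotically like circulants, which is what licenses replacing broadband matrix operations with cheap per-frequency-bin multiplications.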
Iterative compressive sampling for hyperspectral images via source separation
Kamdem Kuiteing, S.; Barni, Mauro
2014-03-01
Compressive Sensing (CS) is receiving increasing attention as a way to lower storage and compression requirements for on-board acquisition of remote-sensing images. In the case of multi- and hyperspectral images, however, exploiting the spectral correlation poses severe computational problems. Yet, exploiting such correlation would provide significantly better performance in terms of reconstruction quality. In this paper, we build on a recently proposed 2D CS scheme based on blind source separation to develop a computationally simple, yet accurate, prediction-based scheme for acquisition and iterative reconstruction of hyperspectral images in a CS setting. Preliminary experiments carried out on different hyperspectral images show that our approach yields a dramatic reduction of computational time while ensuring reconstruction performance similar to that of much more complicated 3D reconstruction schemes.
Phan, Kevin; Xie, Ashleigh; Kumar, Narendra; Wong, Sophia; Medi, Caroline; La Meir, Mark; Yan, Tristan D
2015-08-01
Simplified maze procedures involving radiofrequency, cryoenergy and microwave energy sources have been increasingly utilized for surgical treatment of atrial fibrillation as an alternative to the traditional cut-and-sew approach. In the absence of direct comparisons, a Bayesian network meta-analysis offers an alternative way to assess the relative effects of different treatments using indirect evidence. A Bayesian meta-analysis of indirect evidence was performed using 16 published randomized trials identified from 6 databases. Rank probability analysis was used to rank each intervention in terms of its probability of having the best outcome. Sinus rhythm prevalence beyond the 12-month follow-up was similar between the cut-and-sew, microwave and radiofrequency approaches, which were all ranked better than cryoablation (39, 36, and 25 vs 1%, respectively). The cut-and-sew maze was ranked worst in terms of mortality outcomes compared with microwave, radiofrequency and cryoenergy (2 vs 19, 34, and 24%, respectively). The cut-and-sew maze procedure was associated with significantly lower stroke rates compared with microwave ablation [odds ratio <0.01; 95% confidence interval 0.00, 0.82], and ranked the best in terms of pacemaker requirements compared with microwave, radiofrequency and cryoenergy (81 vs 14, 1, and <0.01%, respectively). Bayesian rank probability analysis shows that the cut-and-sew approach is associated with the best outcomes in terms of sinus rhythm prevalence and stroke outcomes, and remains the gold standard approach for AF treatment. Given the limitations of indirect comparison analysis, these results should be viewed with caution and not over-interpreted. PMID:25391388
Blind source separation with unknown and dynamically changing number of source signals
YE Jimin; ZHANG Xianda; ZHU Xiaolong
2006-01-01
The contrast function remains an open problem in blind source separation (BSS) when the number of source signals is unknown and/or changes dynamically. This paper studies the problem and proves that mutual information is still a contrast function for BSS if the mixing matrix has full column rank. The mutual information reaches its minimum at the separation points, where the random outputs of the BSS system are the scaled and permuted source signals and the remaining outputs are zero. Using the property that the transpose of the mixing matrix and a matrix composed of m observed signals have an identical null space with probability one, a practical method is proposed that can detect the unknown number of source signals n and further track dynamic changes in the number of sources from a small amount of data. The effectiveness of the proposed theory and the developed algorithm is verified by adaptive BSS simulations with an unknown and dynamically changing number of source signals.
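The detectability of the source count rests on a rank argument: with a full-column-rank mixing matrix and no noise, the observation matrix has rank equal to the number of sources. The toy below illustrates that idea with a plain SVD threshold rather than the paper's null-space tracking; all sizes are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(3)

m, n, T = 6, 3, 2000                  # m sensors, n sources, T samples
A = rng.standard_normal((m, n))       # mixing matrix, full column rank w.p. 1
S = rng.laplace(size=(n, T))          # non-Gaussian source signals
X = A @ S                             # noiseless sensor observations

# The m x T observation matrix has rank n, so the unknown number of sources
# can be read off the singular-value spectrum (with a noise threshold in
# practice, since measurement noise lifts the trailing singular values).
sv = np.linalg.svd(X, compute_uv=False)
n_est = int(np.sum(sv > 1e-8 * sv[0]))
```

In the noisy case the threshold becomes a judgment call, which is one reason the paper works with the null space of the mixing matrix's transpose instead.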
Decentralized modal identification using sparse blind source separation
Popular ambient vibration-based system identification methods process information collected from a dense array of sensors centrally to yield the modal properties. In such methods, the need for a centralized processing unit capable of satisfying large memory and processing demands is unavoidable. With the advent of wireless smart sensor networks, it is now possible to process information locally at the sensor level, instead. The information at the individual sensor level can then be concatenated to obtain the global structure characteristics. A novel decentralized algorithm based on wavelet transforms to infer global structure mode information using measurements obtained using a small group of sensors at a time is proposed in this paper. The focus of the paper is on algorithmic development, while the actual hardware and software implementation is not pursued here. The problem of identification is cast within the framework of under-determined blind source separation invoking transformations of measurements to the time–frequency domain resulting in a sparse representation. The partial mode shape coefficients so identified are then combined to yield complete modal information. The transformations are undertaken using stationary wavelet packet transform (SWPT), yielding a sparse representation in the wavelet domain. Principal component analysis (PCA) is then performed on the resulting wavelet coefficients, yielding the partial mixing matrix coefficients from a few measurement channels at a time. This process is repeated using measurements obtained from multiple sensor groups, and the results so obtained from each group are concatenated to obtain the global modal characteristics of the structure
The Effects of Environmental Management Systems on Source Separation in the Work and Home Settings
Chris von Borgstede
2012-06-01
Measures that challenge the generation of waste are needed to address the global problem of the increasing volumes of waste generated in both private homes and workplaces. Source separation at the workplace is commonly implemented through environmental management systems (EMS). In the present study, the relationship between source separation at work and at home was investigated. A questionnaire that maps psychological and behavioural predictors of source separation was distributed to employees at different workplaces. The results show that respondents who are aware of EMS report higher levels of source separation at work, stronger environmental concern, stronger personal and social norms, and perceive source separation as less difficult. Furthermore, the results support the notion that after the adoption of EMS at the workplace, source separation at work spills over into source separation in the household. The potential implications for environmental management systems are discussed.
Using the FASST source separation toolbox for noise robust speech recognition
Ozerov, Alexey; Vincent, Emmanuel
2011-01-01
We describe our submission to the 2011 CHiME Speech Separation and Recognition Challenge. Our speech separation algorithm was built using the Flexible Audio Source Separation Toolbox (FASST) we developed recently. This toolbox is an implementation of a general flexible framework based on a library of structured source models that enable the incorporation of prior knowledge about a source separation problem via user-specifiable constraints. We show how to use FASST to develop an efficient spee...
Residents’ Household Solid Waste (HSW) Source Separation Activity: A Case Study of Suzhou, China
Hua Zhang
2014-09-01
Though the Suzhou government has provided household solid waste (HSW) source separation since 2000, the program remains largely ineffective. Between January and March 2014, the authors conducted an intercept survey in five different community groups in Suzhou, and 505 valid surveys were completed. Based on the survey, the authors used an ordered probit regression to study residents’ HSW source separation activities both for Suzhou as a whole and for the five community groups. Results showed that 43% of the respondents in Suzhou thought they knew how to source-separate HSW, and 29% of them had source-separated HSW accurately. The results also showed that the current HSW source separation pilot program in Suzhou is effective, as both HSW source separation facilities and residents’ separation behavior improved as the program was implemented. The main determinants of residents’ HSW source separation behavior are residents’ age, HSW source separation facilities and government preferential policies. Accessibility to waste management services is particularly important. Attitudes and willingness do not have significant impacts on residents’ HSW source separation behavior.
Wood ash as a magnesium source for phosphorus recovery from source-separated urine.
Sakthivel, S Ramesh; Tilley, Elizabeth; Udert, Kai M
2012-03-01
Struvite precipitation is a simple technology for phosphorus recovery from source-separated urine. However, production costs can be high if expensive magnesium salts are used as precipitants. Therefore, waste products can be interesting alternatives to industrially produced magnesium salts. We investigated the technical and financial feasibility of wood ash as a magnesium source in India. In batch experiments with source-separated urine, we could precipitate 99% of the phosphate with a magnesium dosage of 2.7 mol Mg (mol P)−1. The availability of the magnesium from the wood ash used in our experiment was only about 50%, but this could be increased by burning the wood at temperatures well above 600 °C. Depending on the wood ash used, the precipitate can contain high concentrations of heavy metals. This could be problematic if the precipitate were used as fertilizer, depending on the applicable fertilizer regulations. The financial study revealed that wood ash is considerably cheaper than industrially produced magnesium sources and even cheaper than bittern. However, the solid precipitated with wood ash is not pure struvite. Owing to the high calcite content and the low phosphorus content (3%), the precipitate would be better used as a phosphorus-enhanced conditioner for acidic soils. The estimated fertilizer value of the precipitate was actually slightly lower than that of wood ash, because 60% of the potassium dissolved into solution during precipitation and was not present in the final product. From a financial point of view and owing to the high heavy metal content, wood ash is not a very suitable precipitant for struvite production. Phosphate precipitation from urine with wood ash can be useful if (1) a strong need exists for a soil conditioner that also contains phosphate, (2) potassium is abundant in the soil and (3) no other cheap precipitant, such as bittern or magnesium oxide, is available. PMID:22297249
A Bayesian Approach to Discovering Truth from Conflicting Sources for Data Integration
Zhao, Bo; Gemmell, Jim; Han, Jiawei
2012-01-01
In practical data integration systems, it is common for the data sources being integrated to provide conflicting information about the same entity. Consequently, a major challenge for data integration is to derive the most complete and accurate integrated records from diverse and sometimes conflicting sources. We term this challenge the truth finding problem. We observe that some sources are generally more reliable than others, and therefore a good model of source quality is the key to solving the truth finding problem. In this work, we propose a probabilistic graphical model that can automatically infer true records and source quality without any supervision. In contrast to previous methods, our principled approach leverages a generative process of two types of errors (false positive and false negative) by modeling two different aspects of source quality. In so doing, ours is also the first approach designed to merge multi-valued attribute types. Our method is scalable, due to an efficient sampling-based inf...
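The entry above infers truths and source quality jointly with a probabilistic graphical model; the much simpler fixed-point iteration below (with hypothetical toy data, and not the authors' algorithm) captures the core feedback loop: believed truths are trust-weighted votes, and each source's trust is re-estimated from its agreement with the believed truths.

```python
from collections import defaultdict

# Toy, hypothetical claims: three sources reporting values for three entities.
claims = {
    "s1": {"e1": "a", "e2": "x", "e3": "p"},
    "s2": {"e1": "a", "e2": "x", "e3": "p"},
    "s3": {"e1": "b", "e2": "y", "e3": "p"},
}

trust = {s: 0.5 for s in claims}      # initial source-quality estimates
entities = {e for c in claims.values() for e in c}

for _ in range(10):
    # Step 1: per entity, pick the value with the most trust-weighted support
    truth = {}
    for e in entities:
        votes = defaultdict(float)
        for s, c in claims.items():
            if e in c:
                votes[c[e]] += trust[s]
        truth[e] = max(votes, key=votes.get)
    # Step 2: a source's trust is its (smoothed) agreement with current truths
    for s, c in claims.items():
        agree = sum(truth[e] == v for e, v in c.items())
        trust[s] = (agree + 1) / (len(c) + 2)
```

The add-one smoothing keeps trust away from 0 and 1, a crude stand-in for the priors that a full Bayesian treatment, like the paper's, would place on the two error types.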
2014-01-01
Time series studies have suggested that air pollution can negatively impact health. These studies have typically focused on the total mass of fine particulate matter air pollution or the individual chemical constituents that contribute to it, and not source-specific contributions to air pollution. Source-specific contribution estimates are useful from a regulatory standpoint by allowing regulators to focus limited resources on reducing emissions from sources that are major cont...
The Effects of Environmental Management Systems on Source Separation in the Work and Home Settings
Chris von Borgstede; Maria Andersson; Ola Eriksson
2012-01-01
Measures that challenge the generation of waste are needed to address the global problem of the increasing volumes of waste that are generated in both private homes and workplaces. Source separation at the workplace is commonly implemented by environmental management systems (EMS). In the present study, the relationship between source separation at work and at home was investigated. A questionnaire that maps psychological and behavioural predictors of source separation was distributed to empl...
Synthesis of blind source separation algorithms on reconfigurable FPGA platforms
Du, Hongtao; Qi, Hairong; Szu, Harold H.
2005-03-01
Recent advances in intelligence technology have boosted the development of micro Unmanned Air Vehicles (UAVs) including Silver Fox, Shadow, and Scan Eagle for various surveillance and reconnaissance applications. These affordable and reusable devices have to fit a series of size, weight, and power constraints. Cameras used on such micro-UAVs are therefore mounted directly at a fixed angle without any motion-compensated gimbals. This mounting scheme results in the so-called jitter effect, where jitter is defined as sub-pixel or small-amplitude vibration. The jitter blur caused by the jitter effect needs to be corrected before any other processing algorithms can be practically applied. Jitter restoration has been solved by various optimization techniques, including Wiener approximation, maximum a-posteriori probability (MAP), etc. However, these algorithms normally assume a spatially invariant blur model, which is not the case with jitter blur. Szu et al. developed a smart real-time algorithm based on auto-regression (AR), with its natural generalization to unsupervised artificial neural network (ANN) learning, to achieve restoration accuracy at the sub-pixel level. This algorithm resembles the capability of the human visual system, in which an agreement between the pair of eyes indicates "signal"; otherwise, jitter noise. Using this non-statistical method, for each single pixel, a deterministic blind source separation (BSS) process can then be carried out independently, based on a deterministic minimum of the Helmholtz free energy with a generalization of Shannon's information theory applied to open dynamic systems. From a hardware implementation point of view, the process of jitter restoration of an image using Szu's algorithm can be optimized by pixel-based parallelization. In our previous work, a parallel-structured independent component analysis (ICA) algorithm has been implemented on both Field Programmable Gate Array (FPGA) and Application
Korth, F.; Deutsch, B.; Frey, C.; Moros, C.; Voss, M.
2014-09-01
Nitrate (NO3-) is the major nutrient responsible for coastal eutrophication worldwide and its production is related to intensive food production and fossil-fuel combustion. In the Baltic Sea NO3- inputs have increased 4-fold over recent decades and now remain constantly high. NO3- source identification is therefore an important consideration in environmental management strategies. In this study focusing on the Baltic Sea, we used a method to estimate the proportional contributions of NO3- from atmospheric deposition, N2 fixation, and runoff from pristine soils as well as from agricultural land. Our approach combines data on the dual isotopes of NO3- (δ15N-NO3- and δ18O-NO3-) in winter surface waters with a Bayesian isotope mixing model (Stable Isotope Analysis in R, SIAR). Based on data gathered from 47 sampling locations over the entire Baltic Sea, the majority of the NO3- in the southern Baltic was shown to derive from runoff from agricultural land (33-100%), whereas in the northern Baltic, i.e. the Gulf of Bothnia, NO3- originates from nitrification in pristine soils (34-100%). Atmospheric deposition accounts for only a small percentage of NO3- levels in the Baltic Sea, except for contributions from northern rivers, where the levels of atmospheric NO3- are higher. An additional important source in the central Baltic Sea is N2 fixation by diazotrophs, which contributes 49-65% of the overall NO3- pool at this site. The results obtained with this method are in good agreement with source estimates based upon δ15N values in sediments and a three-dimensional ecosystem model, ERGOM. We suggest that this approach can be easily modified to determine NO3- sources in other marginal seas or larger near-coastal areas where NO3- is abundant in winter surface waters when fractionation processes are minor.
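SIAR fits a full Bayesian mixing model with uncertainty on the source signatures; the deterministic toy below shows only the underlying mass-balance idea for three sources and two isotope ratios plus a sum-to-one constraint. The endmember signatures and the sample values are invented for illustration and are not taken from the study:

```python
import numpy as np

# Invented endmember signatures (mean δ15N, δ18O in per mil) for three sources
sources = {
    "agricultural runoff": (12.0, 2.0),
    "atmospheric deposition": (2.0, 60.0),
    "soil nitrification": (5.0, 4.0),
}
E = np.array(list(sources.values())).T    # 2 x 3 endmember matrix

mixture = np.array([8.0, 10.0])           # measured δ15N, δ18O of a water sample

# Mass balance (no fractionation): E @ f = mixture, and the fractions sum to 1
A = np.vstack([E, np.ones(3)])
b = np.append(mixture, 1.0)
f = np.linalg.solve(A, b)                 # proportional contribution per source
```

With exactly as many constraints as sources this is a plain linear solve; the Bayesian model earns its keep when sources outnumber constraints or when endmember and fractionation uncertainties must propagate into the estimated proportions.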
A full-scope method is constructed to reveal source term uncertainties and to identify influential inputs during a severe accident at a nuclear power plant (NPP). An integrated severe accident code, MELCOR Ver. 1.8.5, is used as a tool to simulate an accident similar to that which occurred at Unit 2 of the Fukushima Daiichi NPP. In order to figure out how much radioactive material is released from the containment to the environment during the accident, Monte Carlo based uncertainty analysis is performed. Generally, in order to evaluate the influence of uncertain inputs on the output, a large number of code runs is required for global sensitivity analysis. To avoid the laborious computational cost of global sensitivity analysis via MELCOR, a surrogate stochastic model is built using a Bayesian nonparametric approach, the Dirichlet process. Probability distributions derived from uncertainty analysis using MELCOR and the stochastic model show good agreement. The appropriateness of the stochastic model is cross-validated through comparison with MELCOR results. The importance measures of uncertain input variables are calculated according to their influences on the uncertainty distribution, as first-order effects and total effects. The validity of the present methodology is demonstrated through an example with three uncertain input variables. - Highlights: • A method of source term uncertainty and sensitivity analysis is proposed. • Source term in the Fukushima Daiichi NPP severe accident is demonstrated. • Uncertainty distributions of source terms show non-standard shapes. • A surrogate model for the integrated code is constructed using a Dirichlet process. • Importance ranking of influential input variables is obtained.
Source Separation and Composting of Organic Municipal Solid Waste.
Gould, Mark; And Others
1992-01-01
Describes a variety of composting techniques that may be utilized in a municipal level solid waste management program. Suggests how composting system designers should determine the amount and type of organics in the waste stream, evaluate separation approaches and assess collection techniques. Outlines the advantages of mixed waste composting and…
Lin Wang
2010-01-01
Full Text Available Frequency-domain blind source separation (BSS) performs poorly under high reverberation because the independence assumption weakens in each frequency bin as the number of bins increases. To improve the separation result, this paper proposes a method that combines two techniques, using beamforming as a preprocessor for blind source separation. With the sound source locations assumed to be known, the mixed signals are dereverberated and enhanced by beamforming; the beamformed signals are then further separated by blind source separation. To implement the proposed method, a superdirective fixed beamformer is designed for beamforming, and an interfrequency dependence-based permutation alignment scheme is presented for frequency-domain blind source separation. With beamforming shortening the mixing filters and reducing noise before blind source separation, the combined method works better in reverberation. The performance of the proposed method is investigated by separating up to 4 sources in different environments with reverberation times from 100 ms to 700 ms. Simulation results verify that the proposed method outperforms beamforming or blind source separation used alone. Analysis demonstrates that the proposed method is computationally efficient and appropriate for real-time processing.
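As a rough illustration of why beamforming helps as a BSS preprocessor, the sketch below uses a plain delay-and-sum beamformer (not the superdirective design of the paper) on a toy scene with known integer-sample steering delays; all signals, delays, and noise levels are invented for the example.

```python
import numpy as np

def delay_and_sum(mics, delays):
    """Align each microphone signal by its steering delay (in samples) and average.

    mics   : array (n_mics, n_samples)
    delays : per-mic integer sample delays steering toward the target source
    """
    n_mics, n = mics.shape
    out = np.zeros(n)
    for m in range(n_mics):
        out += np.roll(mics[m], -delays[m])   # undo the propagation delay
    return out / n_mics

# Toy scene: one target sinusoid arriving with known per-mic delays,
# plus independent noise at each microphone.
fs = 8000
t = np.arange(2048) / fs
target = np.sin(2 * np.pi * 440 * t)
rng = np.random.default_rng(1)
delays = [0, 3, 6, 9]
mics = np.stack([np.roll(target, d) + 0.5 * rng.standard_normal(t.size)
                 for d in delays])
enhanced = delay_and_sum(mics, delays)

# Averaging coherent copies keeps the target but attenuates incoherent noise
noise_in = np.mean([np.var(mics[i] - np.roll(target, delays[i])) for i in range(4)])
noise_out = np.var(enhanced - target)
```

With four microphones the incoherent noise power drops by roughly a factor of four, which is the mechanism that makes the subsequent BSS stage easier.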
Improving the perceptual quality of single-channel blind audio source separation.
Stokes, Tobias W.
2015-01-01
Given a mixture of audio sources, a blind audio source separation (BASS) tool is required to extract audio relating to one specific source whilst attenuating that related to all others. This thesis answers the question “How can the perceptual quality of BASS be improved for broadcasting applications?” The most common source separation scenario, particularly in the field of broadcasting, is single channel, and this is particularly challenging as a limited set of cues are available. Broadcas...
Sparse Reverberant Audio Source Separation via Reweighted Analysis
Arberet, Simon; Vandergheynst, Pierre; Carrillo, Rafael; Thiran, Jean-Philippe; Wiaux, Yves
2013-01-01
We propose a novel algorithm for source signal estimation from an underdetermined convolutive mixture, assuming known mixing filters. Most state-of-the-art methods deal with anechoic or short reverberant mixtures, assuming a synthesis sparse prior in the time-frequency domain and a narrowband approximation of the convolutive mixing process. In this paper, we address the source estimation of convolutive mixtures with a new algorithm based on i) an analysis sparse prior, ii) a rewe...
Bayesian Independent Component Analysis
Winther, Ole; Petersen, Kaare Brandt
2007-01-01
In this paper we present an empirical Bayesian framework for independent component analysis. The framework provides estimates of the sources, the mixing matrix and the noise parameters, and is flexible with respect to choice of source prior and the number of sources and sensors. Inside the engine...
Hierarchical Bayesian Model for Simultaneous EEG Source and Forward Model Reconstruction (SOFOMORE)
Stahlhut, Carsten; Mørup, Morten; Winther, Ole; Hansen, Lars Kai
In this paper we propose an approach to handle forward model uncertainty for EEG source reconstruction. A stochastic forward model is motivated by the many uncertain contributions that form the forward propagation model including the tissue conductivity distribution, the cortical surface, and...
Single-channel source separation using non-negative matrix factorization
Schmidt, Mikkel Nørgaard
which a number of methods for single-channel source separation based on non-negative matrix factorization are presented. In the papers, the methods are applied to separating audio signals such as speech and musical instruments and separating different types of tissue in chemical shift imaging....
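A minimal sketch of the NMF building block behind such single-channel methods, using the standard Lee-Seung Euclidean multiplicative updates on a toy nonnegative "spectrogram"; the templates and activations below are invented, and real systems factor actual magnitude spectrograms.

```python
import numpy as np

def nmf(V, rank, n_iter=500, seed=0):
    """Euclidean NMF via Lee-Seung multiplicative updates: V ~= W @ H."""
    rng = np.random.default_rng(seed)
    n, m = V.shape
    W = rng.uniform(0.1, 1.0, (n, rank))
    H = rng.uniform(0.1, 1.0, (rank, m))
    for _ in range(n_iter):
        # Multiplicative updates keep W and H nonnegative by construction
        H *= (W.T @ V) / (W.T @ W @ H + 1e-12)
        W *= (V @ H.T) / (W @ H @ H.T + 1e-12)
    return W, H

# Toy "spectrogram": two spectral templates active at different times
tt = np.linspace(0, 6, 100)
H0 = np.vstack([np.abs(np.sin(tt)), np.abs(np.cos(tt))])   # activations
W0 = np.array([[1.0, 0.0], [0.8, 0.1], [0.0, 1.0], [0.1, 0.9]])  # templates
V = W0 @ H0
W, H = nmf(V, rank=2)
err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
```

In separation use, each learned template (column of W) and its activation row reconstruct one source's magnitude spectrogram.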
From Binaural to Multichannel Blind Source Separation using Fixed Beamforming with HRTFs
Maazaoui, Mounira; Grenier, Yves; Abed-Meraim, Karim
2012-01-01
In this article, we are interested in the problem of blind source separation (BSS) for robot audition. We study the performance of blind source separation with a varying number of sensors in a microphone array placed in the head of an infant-sized dummy. We propose a two-stage blind source separation algorithm based on a fixed beamforming preprocessing using the head related transfer functions (HRTF) of the dummy and a separation algorithm using a sparsity criterion. We show that in the ca...
30 CFR 57.6404 - Separation of blasting circuits from power source.
2010-07-01
... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Separation of blasting circuits from power... NONMETAL MINES Explosives Electric Blasting-Surface and Underground § 57.6404 Separation of blasting circuits from power source. (a) Switches used to connect the power source to a blasting circuit shall...
30 CFR 56.6404 - Separation of blasting circuits from power source.
2010-07-01
... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Separation of blasting circuits from power... MINES Explosives Electric Blasting § 56.6404 Separation of blasting circuits from power source. (a) Switches used to connect the power source to a blasting circuit shall be locked in the open position...
Sound fields separation and reconstruction of irregularly shaped sources
Totaro, N.; Vigoureux, D.; Leclère, Q.; Lagneaux, J.; Guyader, J. L.
2015-02-01
Nowadays, the need for source identification methods is still growing and application cases are more and more complex. As a consequence, it is necessary to develop methods allowing us to reconstruct sound fields on irregularly shaped sources in reverberant or confined acoustic environments. The inverse Patch Transfer Functions (iPTF) method is suitable to achieve these objectives. Indeed, as the iPTF method is based on Green's identity and double measurements of pressure and particle velocity on a surface surrounding the source, it is independent of the acoustic environment. In addition, the finite element solver used to compute the patch transfer functions permits us to handle sources with 3D irregular shapes. In the present paper, two experimental applications, on a flat plate and an oil pan, have been carried out to show the performance of the method on real applications. As for all ill-posed problems, it is shown that the crucial point of this method is the choice of the parameter of the Tikhonov regularization, one of the most widely used regularizations in the literature. The classical L-curve strategy sometimes fails to choose the best solution. This issue is clearly explained and an adapted strategy combining the L-curve and acoustic power conservation is proposed. The efficiency of this strategy is demonstrated on both applications and compared to results obtained with the Generalized Cross Validation (GCV) technique.
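The Tikhonov/L-curve machinery discussed above can be illustrated on a toy ill-posed deblurring problem; the forward operator, noise level, and regularization grid below are assumptions for the sketch, not the iPTF setup.

```python
import numpy as np

def tikhonov(A, b, lam):
    """Tikhonov-regularized least squares: min ||Ax - b||^2 + lam^2 ||x||^2."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam**2 * np.eye(n), A.T @ b)

# Toy ill-posed problem: a Gaussian smoothing (near-singular) forward operator
rng = np.random.default_rng(0)
n = 40
i = np.arange(n)
A = np.exp(-0.5 * ((i[:, None] - i[None, :]) / 3.0) ** 2)  # blur matrix
x_true = np.sin(2 * np.pi * i / n)
b = A @ x_true + 0.01 * rng.standard_normal(n)

# Sweep the regularization parameter and record the two L-curve coordinates
lams = np.logspace(-3, 1, 20)
residuals = [np.linalg.norm(A @ tikhonov(A, b, l) - b) for l in lams]
sol_norms = [np.linalg.norm(tikhonov(A, b, l)) for l in lams]
```

Plotting `sol_norms` against `residuals` on log axes gives the familiar L shape; the corner trades data fit against solution size, and it is exactly this corner selection that the paper reports as fragile, motivating the combined L-curve/power-conservation strategy.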
Overcomplete Blind Source Separation by Combining ICA and Binary Time-Frequency Masking
Pedersen, Michael Syskind; Wang, DeLiang; Larsen, Jan;
2005-01-01
A limitation in many source separation tasks is that the number of source signals has to be known in advance. Further, in order to achieve good performance, the number of sources cannot exceed the number of sensors. In many real-world applications these limitations are too strict. We propose a...... novel method for over-complete blind source separation. Two powerful source separation techniques have been combined, independent component analysis and binary time-frequency masking. Hereby, it is possible to iteratively extract each speech signal from the mixture. By using merely two microphones we...... can separate up to six mixed speech signals under anechoic conditions. The number of source signals is not assumed to be known in advance. It is also possible to maintain the extracted signals as stereo signals...
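A minimal sketch of binary time-frequency masking, using simple channel dominance rather than the paper's ICA-driven mask, on a toy two-channel mixture of spectrally disjoint tones; all signals and mixing gains are invented for the example.

```python
import numpy as np

def stft(x, nfft=256, hop=128):
    """Minimal magnitude-preserving STFT: Hann-windowed FFT frames."""
    frames = [np.fft.rfft(x[i:i + nfft] * np.hanning(nfft))
              for i in range(0, len(x) - nfft, hop)]
    return np.array(frames)            # shape (n_frames, n_bins)

# Two sources with disjoint spectral content, mixed with different gains
fs = 8000
t = np.arange(8192) / fs
s1 = np.sin(2 * np.pi * 300 * t)       # low tone, dominates the left channel
s2 = np.sin(2 * np.pi * 2000 * t)      # high tone, dominates the right channel
x_left = 1.0 * s1 + 0.3 * s2
x_right = 0.3 * s1 + 1.0 * s2

L, R = stft(x_left), stft(x_right)
mask1 = np.abs(L) > np.abs(R)          # bins where the left-dominant source wins
est1 = np.where(mask1, L, 0)           # keep only source 1's TF bins
spec = np.abs(est1).mean(axis=0)       # average magnitude per frequency bin
```

The binary mask keeps the 300 Hz bins (around bin 300/(fs/nfft) ~ 9.6) and zeroes the 2000 Hz bin (bin 64), which is the core idea the paper iterates to peel off one speech signal at a time.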
Quantum Rate Distortion, Reverse Shannon Theorems, and Source-Channel Separation
Datta, Nilanjana; Hsieh, Min-Hsiu; Wilde, Mark M.
2013-01-01
We derive quantum counterparts of two key theorems of classical information theory, namely, the rate distortion theorem and the source-channel separation theorem. The rate-distortion theorem gives the ultimate limits on lossy data compression, and the source-channel separation theorem implies that a two-stage protocol consisting of compression and channel coding is optimal for transmitting a memoryless source over a memoryless channel. In spite of their importance in the classical domain, the...
Student support and perceptions of urine source separation in a university community.
Ishii, Stephanie K L; Boyer, Treavor H
2016-09-01
Urine source separation, i.e., the collection and treatment of human urine as a separate waste stream, has the potential to improve many aspects of water resource management and wastewater treatment. However, social factors must be taken into consideration for successful implementation of this alternative wastewater system. This work evaluated the perceptions of urine source separation held by students living on campus at a major university in the southeastern United States. Perceptions were evaluated in the context of the Theory of Planned Behavior. The survey population represents one group within a community type (universities) that is expected to be an excellent testbed for urine source separation. Overall, respondents reported high levels of support for urine source separation after watching a video on expected benefits and risks; e.g., 84% indicated that they would vote in favor of urine source separation in residence halls. Support was less apparent when measured by willingness to pay, as 33% of respondents were unwilling to pay for the implementation of urine source separation and 40% were only willing to pay $1 to $10 per semester. Water conservation was largely identified as the most important benefit of urine source separation, and there was little concern reported about the use of urine-based fertilizers. Statistical analyses showed that one's environmental attitude, environmental behavior, perceptions of support within the university community, and belief that student opinions have an impact on university decision makers were significantly correlated with one's support for urine source separation. This work helps identify community characteristics that lend themselves to acceptance of urine source separation, such as those related to environmental attitudes/behaviors and perceptions of behavioral control and subjective norm. Critical aspects of these alternative wastewater systems that require attention in order to foster public
Objective Bayesian analysis of counting experiments with correlated sources of background
Casadei, Diego
2015-01-01
Searches for faint signals in counting experiments are often encountered in particle physics and astrophysics, as well as in other fields. Many problems can be reduced to the case of a model with independent and Poisson distributed signal and background. Often several background contributions are present at the same time, possibly correlated. We provide the analytic solution of the statistical inference problem of estimating the signal in the presence of multiple backgrounds, in the framework of objective Bayes statistics. The model can be written in the form of a product of a single Poisson distribution with a multinomial distribution. The first is related to the total number of events, whereas the latter describes the fraction of events coming from each individual source. Correlations among different backgrounds can be included in the inference problem by a suitable choice of the priors.
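The Poisson-multinomial decomposition the abstract relies on can be checked numerically: a product of independent Poisson pmfs equals a single Poisson pmf for the total number of events times a multinomial pmf for the split among sources. The rates and counts below are arbitrary illustration values.

```python
from math import exp, factorial

def poisson_pmf(n, mu):
    return exp(-mu) * mu**n / factorial(n)

def multinomial_pmf(ns, ps):
    """Multinomial pmf for counts ns with category probabilities ps."""
    N = sum(ns)
    coef = factorial(N)
    for n in ns:
        coef //= factorial(n)          # multinomial coefficient (exact integer)
    p = float(coef)
    for n, q in zip(ns, ps):
        p *= q**n
    return p

mus = [2.0, 0.7, 1.3]                  # signal + two background expectations
ns = [3, 1, 2]                         # an observed split of N = 6 counts

# Left side: independent Poisson counts per source
lhs = 1.0
for n, m in zip(ns, mus):
    lhs *= poisson_pmf(n, m)

# Right side: Poisson total times multinomial split
tot = sum(mus)
rhs = poisson_pmf(sum(ns), tot) * multinomial_pmf(ns, [m / tot for m in mus])
```

The identity holds exactly, which is why the inference can place a prior on the total rate separately from the priors on the source fractions.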
Blind Source Separation Based on Covariance Ratio and Artificial Bee Colony Algorithm
Lei Chen
2014-01-01
The computational cost of blind source separation based on bio-inspired intelligence optimization is high. In order to solve this problem, we propose an effective blind source separation algorithm based on the artificial bee colony algorithm. In the proposed algorithm, the covariance ratio of the signals is used as the objective function, and the artificial bee colony algorithm is used to optimize it. The separated source signal component is then removed from the mixtures using the deflation method. All the source signals can be recovered successfully by repeating the separation process. Simulation experiments demonstrate that the proposed algorithm achieves a significant improvement in computational cost and separation quality compared to previous algorithms.
Empirical Study on Factors Influencing Residents' Behavior of Separating Household Wastes at Source
Qu Ying; Zhu Qinghua; Murray Haight
2007-01-01
Source separation is the basic premise for making effective use of household wastes. In eight cities in China, however, several pilot projects of source separation ultimately failed because of the poor participation rate of residents. In order to solve this problem, identifying the factors that influence residents' source-separation behavior becomes crucial. By means of a questionnaire survey, we conducted descriptive analysis and exploratory factor analysis. The results show that trouble-feeling, moral notion, environmental protection, public education, environmental value and knowledge deficiency are the main factors that play an important role in residents' decisions to separate their household wastes. Also, according to the contribution percentage of the six main factors to the total behavior of source separation, their influencing power is analyzed, which will provide suggestions on household waste management for policy makers and decision makers in China.
Granade, Christopher; Cory, D G
2015-01-01
In recent years, Bayesian methods have been proposed as a solution to a wide range of issues in quantum state and process tomography. State-of-the-art Bayesian tomography solutions suffer from three problems: numerical intractability, a lack of informative prior distributions, and an inability to track time-dependent processes. Here, we solve all three problems. First, we use modern statistical methods, as pioneered by Huszár and Houlsby and by Ferrie, to make Bayesian tomography numerically tractable. Our approach allows for practical computation of Bayesian point and region estimators for quantum states and channels. Second, we propose the first informative priors on quantum states and channels. Finally, we develop a method that allows online tracking of time-dependent states and estimates the drift and diffusion processes affecting a state. We provide source code and animated visual examples for our methods.
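The particle-based Bayesian updating that makes such tomography tractable can be sketched in miniature with a single Bernoulli parameter standing in for the quantum setting; the true bias, particle count, and data size below are arbitrary choices for the illustration.

```python
import numpy as np

def smc_update(particles, weights, outcome):
    """One Bayes update: likelihood of outcome (1 or 0) under bias p is p or 1-p."""
    like = particles if outcome == 1 else 1.0 - particles
    w = weights * like
    return w / w.sum()                     # renormalize the particle weights

rng = np.random.default_rng(42)
particles = rng.uniform(0, 1, 5000)        # prior samples of the unknown bias
weights = np.full(5000, 1 / 5000)

true_p = 0.7
data = rng.uniform(size=500) < true_p      # simulated measurement outcomes
for d in data:
    weights = smc_update(particles, weights, int(d))

posterior_mean = np.sum(weights * particles)
```

The weighted particle cloud is the posterior; point estimators (here the posterior mean) and credible regions are read off directly from it, which is the computational pattern the paper scales up to states and channels.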
Source Separation and Higher-Order Causal Analysis of MEG and EEG
Zhang, Kun
2012-01-01
Separation of sources and analysis of their connectivity have long been important topics in EEG/MEG analysis. To solve this problem in an automatic manner, we propose a two-layer model in which the sources are conditionally uncorrelated from each other, but not independent; the dependence is caused by the causality in their time-varying variances (envelopes). The model is identified in two steps. We first propose a new source separation technique which takes into account the autocorrelations (which may be time-varying) and time-varying variances of the sources. The causality in the envelopes is then discovered by exploiting a special kind of multivariate GARCH (generalized autoregressive conditional heteroscedasticity) model. The resulting causal diagram gives the effective connectivity between the separated sources; in our experimental results on MEG data, sources with similar functions are grouped together, with negative influences between groups, and the groups are connected via some interesting sources.
Monaural separation of dependent audio sources based on a generalized Wiener filter
Ma, Guilin; Agerkvist, Finn T.; Luther, J.B.
2007-01-01
) coefficients of the dependent sources is modeled by complex Gaussian mixture models in the frequency domain from samples of individual sources to capture the properties of the sources and their correlation. During the second stage, the mixture is separated through a generalized Wiener filter, which takes...
Xu, Wanying; Zhou, Chuanbin; Lan, Yajun; Jin, Jiasheng; Cao, Aixin
2015-05-01
Municipal solid waste (MSW) management (MSWM) is most important and challenging in large urban communities. Sound community-based waste management systems normally include waste reduction and material recycling elements, often entailing the separation of recyclable materials by the residents. To increase the efficiency of source separation and recycling, an incentive-based source separation model was designed and this model was tested in 76 households in Guiyang, a city of almost three million people in southwest China. This model embraced the concepts of rewarding households for sorting organic waste, government funds for waste reduction, and introducing small recycling enterprises for promoting source separation. Results show that after one year of operation, the waste reduction rate was 87.3%, and the comprehensive net benefit under the incentive-based source separation model increased by 18.3 CNY tonne(-1) (2.4 Euros tonne(-1)), compared to that under the normal model. The stakeholder analysis (SA) shows that the centralized MSW disposal enterprises had minimum interest and may oppose the start-up of a new recycling system, while small recycling enterprises had a primary interest in promoting the incentive-based source separation model, but they had the least ability to make any change to the current recycling system. The strategies for promoting this incentive-based source separation model are also discussed in this study. PMID:25819930
Lockhart, K.; Harter, T.; Grote, M.; Young, M. B.; Eppich, G.; Deinhart, A.; Wimpenny, J.; Yin, Q. Z.
2014-12-01
Groundwater quality is a concern in alluvial aquifers underlying agricultural areas worldwide, an example of which is the San Joaquin Valley, California. Nitrate from land applied fertilizers or from animal waste can leach to groundwater and contaminate drinking water resources. Dairy manure and synthetic fertilizers are the major sources of nitrate in groundwater in the San Joaquin Valley, however, septic waste can be a major source in some areas. As in other such regions around the world, the rural population in the San Joaquin Valley relies almost exclusively on shallow domestic wells (≤150 m deep), of which many have been affected by nitrate. Consumption of water containing nitrate above the drinking water limit has been linked to major health effects including low blood oxygen in infants and certain cancers. Knowledge of the proportion of each of the three main nitrate sources (manure, synthetic fertilizer, and septic waste) contributing to individual well nitrate can aid future regulatory decisions. Nitrogen, oxygen, and boron isotopes can be used as tracers to differentiate between the three main nitrate sources. Mixing models quantify the proportional contributions of sources to a mixture by using the concentration of conservative tracers within each source as a source signature. Deterministic mixing models are common, but do not allow for variability in the tracer source concentration or overlap of tracer concentrations between sources. Bayesian statistics used in conjunction with mixing models can incorporate variability in the source signature. We developed a Bayesian mixing model on a pilot network of 32 private domestic wells in the San Joaquin Valley for which nitrate as well as nitrogen, oxygen, and boron isotopes were measured. Probability distributions for nitrogen, oxygen, and boron isotope source signatures for manure, fertilizer, and septic waste were compiled from the literature and from a previous groundwater monitoring project on several
Source Separation and Clustering of Phase-Locked Subspaces: Derivations and Proofs
Almeida, Miguel; Schleimer, Jan-Hendrik; Bioucas-Dias, José; Vigário, Ricardo
2011-01-01
Due to space limitations, our submission "Source Separation and Clustering of Phase-Locked Subspaces", accepted for publication on the IEEE Transactions on Neural Networks in 2011, presented some results without proof. Those proofs are provided in this paper.
Fate of pharmaceuticals in full-scale source separated sanitation system
Butkovskyi, A.; Hernandez Leal, L.; Rijnaarts, H.H.M.; Zeeman, G.
2015-01-01
Removal of 14 pharmaceuticals and 3 of their transformation products was studied in a full-scale source separated sanitation system with separate collection and treatment of black water and grey water. Black water is treated in an up-flow anaerobic sludge blanket (UASB) reactor followed by oxygen
Phase recovery in NMF for audio source separation: an insightful benchmark
Magron, Paul; Badeau, Roland; David, Bertrand
2016-01-01
Nonnegative Matrix Factorization (NMF) is a powerful tool for decomposing mixtures of audio signals in the Time-Frequency (TF) domain. In applications such as source separation, the phase recovery for each extracted component is a major issue since it often leads to audible artifacts. In this paper, we present a methodology for evaluating various NMF-based source separation techniques involving phase reconstruction. For each model considered, a comparison between two approaches (blind separat...
Semi-Blind Source Separation in a Multi-User Transmission System with Interference Alignment
FADLALLAH, Yasser; AISSA EL BEY, Abdeldjalil; Abed-Meraim, Karim; AMIS CAVALEC, Karine; Pyndiah, Ramesh
2013-01-01
In this paper we address the decoding problem in the K-user MIMO interference channel assuming an interference alignment (IA) design. We aim to decode robustly the desired signal without having a full Channel State Information (CSI) (i.e. precoders knowledge) at the receivers. We show the equivalency between the IA model and the Semi-Blind Source Separation model (SBSS). Then, we prove that this equivalence allows the use of techniques employed in source separation for extracting the desired ...
A cost evaluation method for transferring municipalities to solid waste source-separated system.
Lavee, Doron; Nardiya, Shlomit
2013-05-01
Most of Israel's waste is disposed of in landfills, threatening scarce land resources and posing environmental and health risks. The aim of this study is to estimate the expected costs of transferring municipalities to solid waste source separation in Israel, aimed at reducing the amount of waste directed to landfills and increasing the efficiency and amount of recycled waste. Information on the expected costs of operating a solid waste source separation system was gathered from 47 municipalities and compiled into a database, taking into consideration various factors such as costs of equipment, construction adjustments and waste collection and disposal. This database may serve as a model for estimating the costs of entering the waste source separation system for any municipality in Israel, while taking into consideration its specific characteristics, such as size and region. The model was used in Israel for determining municipalities' eligibility to receive a governmental grant for entering an accelerated process of solid waste source separation. This study presents a user-friendly and simple operational tool for assessing municipalities' costs of entering a process of waste source separation, providing policy makers with a powerful tool for diverting funds effectively to promote solid waste source separation. PMID:23465315
A Modified Infomax ICA Algorithm for fMRI Data Source Separation
Amir A. Khaliq
2013-05-01
This study presents a modified infomax model of Independent Component Analysis (ICA) for the source separation problem in fMRI data. Functional MRI data are processed by different blind source separation techniques, including Independent Component Analysis (ICA). ICA is a statistical decomposition method used for multivariate data source separation. The ICA algorithm is based on the independence of the extracted sources, for which different measures are used, such as kurtosis, negentropy, and information maximization. The infomax method of ICA extracts unknown sources from a number of mixtures by maximizing the negentropy, thus ensuring independence. In the proposed modified infomax model, a higher-order contrast function is used, which results in fast convergence and accuracy. The proposed algorithm is applied to simulated general signals and simulated fMRI data. Comparison of the correlation results of the proposed algorithm with those of the conventional infomax algorithm shows better performance.
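For reference, a minimal natural-gradient infomax ICA loop, in the conventional tanh-score form rather than the paper's higher-order contrast, on a toy two-source Laplacian mixture; the mixing matrix and sample sizes are invented for the sketch.

```python
import numpy as np

def whiten(X):
    """Decorrelate zero-mean mixtures and rescale them to unit variance."""
    d, E = np.linalg.eigh(np.cov(X))
    V = (E / np.sqrt(d)) @ E.T
    return V @ X, V

def infomax_ica(X, lr=0.02, n_iter=2000):
    """Natural-gradient infomax ICA with a tanh score (super-Gaussian sources)."""
    Z, V = whiten(X - X.mean(axis=1, keepdims=True))
    n, T = Z.shape
    W = np.eye(n)
    for _ in range(n_iter):
        Y = W @ Z
        # Natural-gradient update: dW = (I - 2 tanh(Y) Y^T / T) W
        W += lr * (np.eye(n) - 2.0 * np.tanh(Y) @ Y.T / T) @ W
    return W @ V                              # total unmixing matrix

rng = np.random.default_rng(1)
S = rng.laplace(size=(2, 20000))              # two super-Gaussian sources
A = np.array([[1.0, 0.6], [0.4, 1.0]])        # mixing matrix
X = A @ S
W = infomax_ica(X)
Y = W @ X
# |correlation| between each true source and each recovered component
corr = np.abs(np.corrcoef(np.vstack([S, Y]))[:2, 2:])
```

Each recovered row matches one source up to the usual sign and permutation ambiguity, which is why correlation against ground truth is the comparison metric used in the paper.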
Separation of Correlated Astrophysical Sources Using Multiple-Lag Data Covariance Matrices
Baccigalupi C
2005-01-01
This paper proposes a new strategy to separate astrophysical sources that are mutually correlated. This strategy is based on second-order statistics and exploits prior information about the possible structure of the mixing matrix. Unlike ICA blind separation approaches, where the sources are assumed mutually independent and no prior knowledge is assumed about the mixing matrix, our strategy allows the independence assumption to be relaxed and performs the separation of even significantly correlated sources. Besides the mixing matrix, our strategy is also capable of evaluating the source covariance functions at several lags. Moreover, once the mixing parameters have been identified, a simple deconvolution can be used to estimate the probability density functions of the source processes. To benchmark our algorithm, we used a database that simulates the one expected from the instruments that will operate onboard ESA's Planck Surveyor Satellite to measure the CMB anisotropies over the whole celestial sphere.
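Second-order separation from lagged covariances can be sketched with the classic AMUSE recipe (whiten, then eigendecompose one symmetrized lagged covariance). This is a simplification of the multiple-lag strategy in the paper, it assumes independent sources with distinct autocorrelations, and the AR(1) sources and mixing matrix below are invented.

```python
import numpy as np

def amuse(X, lag=1):
    """AMUSE-style second-order separation: whiten, then eigendecompose
    a symmetrized time-lagged covariance of the whitened mixtures."""
    X = X - X.mean(axis=1, keepdims=True)
    d, E = np.linalg.eigh(np.cov(X))
    V = (E / np.sqrt(d)) @ E.T                      # whitening matrix
    Z = V @ X
    C = Z[:, lag:] @ Z[:, :-lag].T / (Z.shape[1] - lag)
    C = (C + C.T) / 2                               # symmetrize the lagged covariance
    _, U = np.linalg.eigh(C)
    return U.T @ V                                  # unmixing matrix

# Two sources with distinct autocorrelation (different AR(1) coefficients)
rng = np.random.default_rng(0)
T = 50000
e = rng.standard_normal((2, T))
s1 = np.zeros(T); s2 = np.zeros(T)
for t in range(1, T):
    s1[t] = 0.9 * s1[t - 1] + e[0, t]
    s2[t] = -0.5 * s2[t - 1] + e[1, t]
S = np.stack([s1, s2])
A = np.array([[1.0, 0.8], [0.5, 1.0]])
X = A @ S
W = amuse(X)
Y = W @ X
corr = np.abs(np.corrcoef(np.vstack([S, Y]))[:2, 2:])
```

The eigenvalues of the lagged covariance are the sources' lag-1 autocorrelations, so separation works whenever those differ; the paper's contribution is to relax the independence assumption this simple version still requires.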
Bamber, J. L.; Schoen, N.; Zammit-Mangion, A.; Rougier, J.; Flament, T.; Luthcke, S. B.; Petrie, E. J.; Rémy, F.
2013-12-01
There remains considerable inconsistency between different methods and approaches for determining ice mass trends for Antarctica from satellite observations. Three approaches can provide near-global coverage for mass trends: altimetry, gravimetry and mass budget calculations. All three approaches suffer from a source separation problem, where other geophysical processes limit the capability of the method to resolve the origin and magnitude of a mass change. A fourth approach, GPS vertical motion, provides localised estimates of mass change due to elastic uplift and an indirect estimate of GIA. Each approach has different source separation issues and different spatio-temporal error characteristics. In principle, it should be possible to combine the data and process covariances to minimize the uncertainty in the solution and to produce robust posterior errors for the trends. In practice, this is a challenging problem in statistics because of the large number of degrees of freedom, the variable spatial and temporal sampling of the different observations, and the fact that some processes remain under-sampled, such as firn compaction. Here, we present a novel solution to this problem using the latest methods in statistical modelling of spatio-temporal processes. We use Bayesian hierarchical modelling and employ stochastic partial differential equations to capture our physical understanding of the key processes that influence our observations. Due to the huge number of observations involved (>10^8), methods are required to reduce the dimensionality of the problem, and care is required in the treatment of the observations as they are not independent. Here, we focus mainly on the results rather than the full suite of methods, and we present time-evolving fields of surface mass balance, ice dynamic-driven mass loss, and firn compaction for the period 2003-2009, derived from a combination of ICESat, ENVISAT, GRACE, InSAR, GPS and regional climate model output
Y. Yokoo
2014-01-01
This study compared a time source hydrograph separation method to a geographic source separation method, to assess whether the two methods produced similar results. The time source separation of the hydrograph was performed using a numerical filter method, and the geographic source separation was performed using an end-member mixing analysis employing hourly discharge, electric conductivity, and turbidity data. These data were collected in 2006 at the Kuroiwa monitoring ...
Prospects of Source-Separation-Based Sanitation Concepts: A Model-Based Study
Cees Buisman
2013-07-01
Separation of different domestic wastewater streams and targeted on-site treatment for resource recovery has been recognized as one of the most promising sanitation concepts to re-establish the balance in carbon, nutrient and water cycles. In this study a model was developed based on literature data to compare energy and water balance, nutrient recovery, chemical use, effluent quality and land area requirement in four different sanitation concepts: (1) centralized; (2) centralized with source-separation of urine; (3) source-separation of black water, kitchen refuse and grey water; and (4) source-separation of urine, feces, kitchen refuse and grey water. The highest primary energy consumption of 914 MJ/cap/year was attained within the centralized sanitation concept, and the lowest primary energy consumption of 437 MJ/cap/year was attained within source-separation of urine, feces, kitchen refuse and grey water. Grey water bio-flocculation and subsequent grey water sludge co-digestion decreased the primary energy consumption, but coupling it with grey water effluent reuse was not energetically favorable. Source-separation of urine improved the energy balance, nutrient recovery and effluent quality, but required a larger land area and higher chemical use in the centralized concept.
Lesaffre, Emmanuel
2012-01-01
The growth of biostatistics has been phenomenal in recent years and has been marked by considerable technical innovation in both methodology and computational practicality. One area that has experienced significant growth is Bayesian methods. The growing use of Bayesian methodology has taken place partly due to an increasing number of practitioners valuing the Bayesian paradigm as matching that of scientific discovery. In addition, computational advances have allowed for more complex models to be fitted routinely to realistic data sets. Through examples, exercises and a combination of introd
The single staged ECR source at the TRIUMF isotope separator TISOL
With its installation at the isotope separator TISOL, the single-stage ECR source has become operational in delivering radioactive species extracted from the production target, which is bombarded by 500 MeV protons. Among the radioactive species detected so far are He, C, N, Ne, Cl, Ar, Kr and Xe. The dependence of the ion currents/efficiencies on several source parameters is discussed, as well as the technical difficulties in connecting ECR sources to on-line isotope separators. (Author) (5 refs., tab., 6 figs.)
Applying the Background-Source separation algorithm to Chandra Deep Field South data
Guglielmetti, F; Fischer, R; Rosati, P; Tozzi, P
2012-01-01
A probabilistic two-component mixture model allows one to separate the diffuse background from the celestial sources within a one-step algorithm without data censoring. The background is modeled with a thin-plate spline combined with the satellite's exposure time. Source probability maps are created in a multi-resolution analysis for revealing faint and extended sources. All detected sources are automatically parametrized to produce a list of source positions, fluxes and morphological parameters. The present analysis is applied to the Chandra Deep Field South 2 Ms publicly released data. With its 1.884 Ms of exposure time and its angular resolution (0.984 arcsec), the Chandra Deep Field South data are particularly suited for testing the Background-Source separation algorithm.
Reverberant Audio Source Separation via Sparse and Low-Rank Modeling
Arberet, Simon; Vandergheynst, Pierre
2013-01-01
The performance of audio source separation from underdetermined convolutive mixtures, assuming known mixing filters, can be significantly improved by using an analysis sparse prior optimized by a reweighted l1 scheme and a wideband data-fidelity term, as demonstrated by a recent article. In this letter, we show that the performance can be improved even more significantly by exploiting a low-rank prior on the source spectrograms. We present a new algorithm to estimate the sources based on i) ...
Non-Stationary Brain Source Separation for Multi-Class Motor Imagery
Gouy-Pailler, Cedric; Congedo, Marco; Brunner, Clemens; Jutten, Christian; Pfurtscheller, Gert
2010-01-01
This article describes a method to recover task-related brain sources in the context of multi-class Brain-Computer Interfaces (BCIs) based on non-invasive electroencephalography (EEG). We extend the Joint Approximate Diagonalization (JAD) method for spatial filtering using a maximum likelihood framework. This generic formulation (1) bridges the gap between Common Spatial Patterns (CSP) and Blind Source Separation (BSS) of non-stationary sources, and (2) leads to ...
ICAR, a tool for Blind Source Separation using Fourth Order Statistics only
Albera, Laurent; Férreol, Anne; Chevalier, Pascal; Comon, Pierre
2005-01-01
The problem of blind separation of overdetermined mixtures of sources, that is, with fewer sources than (or as many sources as) sensors, is addressed in this paper. A new method, named ICAR (Independent Component Analysis using Redundancies in the quadricovariance), is proposed in order to process complex data. This method, without any whitening operation, only exploits some redundancies of a particular quadricovariance matrix of the data. Computer simulations demonstrate that ICAR offers in ...
Draper, D.
2001-01-01
Article Outline: Glossary; Definition of the Subject and Introduction; The Bayesian Statistical Paradigm; Three Examples; Comparison with the Frequentist Statistical Paradigm; Future Directions; Bibliography
Bayesian mixture models for Poisson astronomical images
Guglielmetti, Fabrizia; Dose, Volker
2012-01-01
Astronomical images in the Poisson regime are typically characterized by a spatially varying cosmic background, a large variety of source morphologies and intensities, data incompleteness, steep gradients in the data, and few photon counts per pixel. The Background-Source separation technique is developed with the aim of detecting faint and extended sources in astronomical images characterized by Poisson statistics. The technique employs Bayesian mixture models to reliably detect the background as well as the sources with their respective uncertainties. Background estimation and source detection are achieved in a single algorithm. A large variety of source morphologies is revealed. The technique is applied in the X-ray part of the electromagnetic spectrum to ROSAT and Chandra data sets, and a feasibility study is under way for the forthcoming eROSITA mission.
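The two-component mixture idea behind the Background-Source separation technique can be sketched in a few lines: each pixel's photon count is modeled as either background-only or background plus source, and Bayes' rule yields a per-pixel source probability. This is a toy sketch, not the paper's algorithm: all rates, priors and image sizes below are invented, and a flat background stands in for the thin-plate spline model.

```python
import numpy as np
from math import lgamma

rng = np.random.default_rng(0)

# Hypothetical 64x64 photon-count image: flat Poisson background plus one
# bright source patch (a stand-in for the spline background of the paper).
bg_rate, src_rate = 2.0, 20.0
img = rng.poisson(bg_rate, size=(64, 64))
img[30:34, 30:34] += rng.poisson(src_rate, size=(4, 4))

_lgamma = np.vectorize(lgamma)

def log_poisson(k, lam):
    # Elementwise log Poisson pmf: k*log(lam) - lam - log(k!)
    return k * np.log(lam) - lam - _lgamma(k + 1.0)

def source_probability(counts, bg, extra, prior_src=0.01):
    """Posterior probability that a pixel contains source emission under a
    two-component mixture: background-only vs background + source."""
    log_b = np.log1p(-prior_src) + log_poisson(counts, bg)
    log_s = np.log(prior_src) + log_poisson(counts, bg + extra)
    return 1.0 / (1.0 + np.exp(log_b - log_s))

prob = source_probability(img, bg_rate, src_rate)
print(prob[30:34, 30:34].mean())  # close to 1 inside the source patch
print(prob[:20, :20].mean())      # close to 0 in a pure-background region
```

Even with a small prior source probability, a pixel with counts well above the background rate gets a posterior source probability near one, which is how faint-source maps can be thresholded without censoring the data.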
Municipal solid waste source-separated collection in China: A comparative analysis
A pilot program focusing on municipal solid waste (MSW) source-separated collection was launched in eight major cities throughout China in 2000. Detailed investigations were carried out and a comprehensive system was constructed to evaluate the effects of the eight-year implementation in those cities. This paper provides an overview of the different methods of collection, transportation, and treatment of MSW in the eight cities, as well as a comparative analysis of MSW source-separated collection in China. Information about the quantity and composition of MSW shows that its characteristics are similar across cities: low calorific value, high moisture content and a high proportion of organic matter. Differences among the eight cities in municipal solid waste management (MSWM) are presented in this paper. Only Beijing and Shanghai demonstrated relatively effective implementation of MSW source-separated collection, while the six remaining cities performed poorly. Considering the current status of MSWM, source-separated collection should be a key priority, and a wider range of cities should participate in this program instead of merely the eight pilot cities. It is evident that an integrated MSWM system is urgently needed. Kitchen waste and recyclables are encouraged to be separated at the source. Stakeholders play an important role in MSWM, so their responsibilities should be clearly identified. Improvements in legislation, coordination mechanisms and public education are problematic issues that need to be addressed.
Rifai Chai; Naik, Ganesh R; Tran, Yvonne; Sai Ho Ling; Craig, Ashley; Nguyen, Hung T
2015-08-01
An electroencephalography (EEG)-based countermeasure device could be used for fatigue detection during driving. This paper explores the classification of fatigue and alert states using power spectral density (PSD) as a feature extractor and a fuzzy swarm-based artificial neural network (ANN) as a classifier. Independent component analysis by entropy rate bound minimization (ICA-ERBM) is investigated as a novel source separation technique for fatigue classification using EEG analysis. A comparison of the classification accuracy with and without the source separator is presented. Classification performance based on 43 participants without the source separator resulted in an overall sensitivity of 71.67%, a specificity of 75.63% and an accuracy of 73.65%. These results improved after the inclusion of a source separator module, to an overall sensitivity of 78.16%, a specificity of 79.60% and an accuracy of 78.88% (p < 0.05). PMID:26736312
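As a concrete illustration of PSD-based feature extraction for this kind of classifier, the sketch below computes a Welch-style averaged periodogram in plain NumPy and derives band-power features from a synthetic 10 Hz "alpha" component. The sampling rate, band edges and signal are invented, not taken from the study; a real pipeline would feed such band powers into the ANN.

```python
import numpy as np

def welch_psd(x, fs, nperseg=256):
    """Averaged-periodogram (Welch-style) PSD estimate with a Hann window
    and 50% segment overlap."""
    step = nperseg // 2
    win = np.hanning(nperseg)
    scale = fs * (win ** 2).sum()
    segs = [x[i:i + nperseg] for i in range(0, len(x) - nperseg + 1, step)]
    psd = np.mean([np.abs(np.fft.rfft(win * s)) ** 2 for s in segs], axis=0) / scale
    freqs = np.fft.rfftfreq(nperseg, 1.0 / fs)
    return freqs, psd

# Synthetic EEG-like toy signal: a 10 Hz alpha rhythm buried in noise
# (frequencies and band edges are illustrative only).
rng = np.random.default_rng(1)
fs = 128.0
t = np.arange(0, 8.0, 1.0 / fs)
x = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.normal(size=t.size)

freqs, psd = welch_psd(x, fs)
alpha_power = psd[(freqs >= 8) & (freqs <= 13)].sum()   # alpha band feature
theta_power = psd[(freqs >= 4) & (freqs < 8)].sum()     # theta band feature
print(freqs[np.argmax(psd)])  # the spectral peak sits near 10 Hz
```

Band-power ratios such as theta/alpha are common fatigue indicators; here the alpha feature dominates by construction.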
Bao, Le; Raftery, Adrian E.; Reddy, Amala
2015-01-01
In most countries in the world outside of sub-Saharan Africa, HIV is largely concentrated in sub-populations whose behavior puts them at higher risk of contracting and transmitting HIV, such as people who inject drugs, sex workers and men who have sex with men. Estimating the size of these sub-populations is important for assessing overall HIV prevalence and designing effective interventions. We present a Bayesian hierarchical model for estimating the sizes of local and national HIV key affec...
Gran, Bjørn Axel
2002-01-01
The objective of the research has been to investigate the possibility of transferring the requirements of a software safety standard into Bayesian belief networks (BBNs). The BBN methodology has mainly been developed and applied in the AI community, but more recently it has been proposed for the assessment of programmable systems. The relation to AI applications is relevant in the sense that the method reflects an assessor's way of thinking during the assessment process. Conceptually,...
Yunhan Luo; Houxin Cui; Xiaoyu Gu; Rong Liu; Kexin Xu
2005-01-01
Based on an analysis of the relation between mean penetration depth and source-detector separation in a three-layer model, using the method of Monte Carlo simulation, an optimal source-detector separation is derived from the mean penetration depth with reference to monitoring the change of chromophore concentration in the sandwiched layer. In order to verify the separation, we perform Monte Carlo simulations with varied absorption coefficients of the sandwiched layer. All these diffuse reflectances are used to construct a calibration model with the method of partial least squares (PLS). High correlation coefficients and a low root mean square error of prediction (RMSEP) at the optimal separation confirmed the correctness of the selection. This technique is expected to shed light on noninvasive diagnosis by near-infrared spectroscopy.
Life cycle assessment of grain production using source-separated human urine and mineral fertiliser
Tidåker, Pernilla
2003-01-01
Source-separation of human urine is one promising technique for closing the nutrient cycle, reducing nutrient discharge and increasing energy efficiency. Separated urine can be used as a valuable fertiliser in agriculture, replacing mineral fertiliser. However, a proper handling of the urine at farm level is crucial for the environmental performance of the whole system. This study started from an agricultural point of view, demonstrating how grain production systems using human urine might be...
Subband-based Single-channel Source Separation of Instantaneous Audio Mixtures
Taghia, Jalil; Doostari, Mohammad Ali
2009-01-01
In this paper, a new algorithm is developed to separate the audio sources from a single instantaneous mixture. The algorithm is based on subband decomposition and uses a hybrid system of Empirical Mode Decomposition (EMD) and Principal Component Analysis (PCA) to construct artificial observations from the single mixture. In the separation stage of the algorithm, we use Independent Component Analysis (ICA) to find independent components. At first the observed mixture is divided into a finite numbe...
Larsen, Anna Warberg; Astrup, Thomas
2011-01-01
CO2-loads from combustible waste are important inputs for national CO2 inventories and life-cycle assessments (LCA). CO2 emissions from waste incinerators are often expressed by emission factors in kg fossil CO2 emitted per GJ energy content of the waste. Various studies have shown considerable variations between emission factors for different incinerators, but the background for these variations has not been thoroughly examined. One important reason may be variations in the collection of recyclable materials, as source separation alters the composition of the residual waste incinerated. The objective of this study was to quantify the importance of source separation for the determination of emission factors for incineration of residual household waste. This was done by mimicking various source separation scenarios and, based on waste composition data, calculating the resulting emission factors for residual...
Blind source separation of ship-radiated noise based on generalized Gaussian model
Kong Wei; Yang Bin
2006-01-01
When the distribution of the sources cannot be estimated accurately, ICA algorithms fail to separate the mixtures blindly. The generalized Gaussian model (GGM) is introduced into the ICA algorithm since it can easily model the non-Gaussian statistical structure of different source signals. By inferring only one parameter, a wide class of statistical distributions can be characterized. Using a maximum likelihood (ML) approach and natural gradient descent, the learning rules of blind source separation (BSS) based on the GGM are presented. Experiments on ship-radiated noise demonstrate that the GGM can model the distributions of ship-radiated noise and sea noise efficiently, and that the GGM-based learning rules yield more successful separation results than several conventional methods, such as higher-order cumulants and Gaussian mixture density functions.
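A minimal sketch of the natural-gradient ML learning rule with a generalized Gaussian source prior, along the lines described above. The mixing matrix, sample count and learning rate are invented, and Laplacian noise (GGM exponent beta = 1) stands in for the ship-radiated and sea noise signals.

```python
import numpy as np

rng = np.random.default_rng(2)

# Two super-Gaussian (Laplacian) toy sources and an arbitrary mixing matrix.
n = 5000
S = rng.laplace(size=(2, n))
A = np.array([[1.0, 0.6], [0.5, 1.0]])
X = A @ S

def ggm_score(y, beta=1.0):
    """Score function psi(y) = sign(y) * |y|^(beta-1) of a generalized
    Gaussian with exponent beta (beta=1: Laplacian, beta=2: Gaussian)."""
    return np.sign(y) * np.abs(y) ** (beta - 1.0)

def natural_gradient_bss(X, beta=1.0, lr=0.05, iters=400):
    W = np.eye(X.shape[0])
    I = np.eye(X.shape[0])
    for _ in range(iters):
        Y = W @ X
        # Natural-gradient ML update: dW = (I - E[psi(y) y^T]) W
        W += lr * (I - (ggm_score(Y, beta) @ Y.T) / X.shape[1]) @ W
    return W @ X, W

Y, W = natural_gradient_bss(X)
# Each recovered row should match one source up to scale and permutation.
corr = np.abs(np.corrcoef(np.vstack([Y, S]))[:2, 2:])
print(corr.max(axis=1))  # both entries close to 1
```

The single shape parameter beta is what the abstract refers to: inferring it adapts the same update rule to a wide family of source distributions.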
Semi-blind Source Separation Using Head-Related Transfer Functions
Pedersen, Michael Syskind; Hansen, Lars Kai; Kjems, Ulrik; Rasmussen, Karsten Bo
An online blind source separation algorithm, a special case of the geometric algorithm by Parra and Fancourt, has been implemented for the purpose of separating sounds recorded at microphones placed at each side of the head. By using the assumption that the positions of the two sounds are k...... the separation improves by approximately 1 dB compared to when free-field is assumed. This indicates that the permutation ambiguity is solved more accurately than when free-field is assumed.
FPGA-based real-time blind source separation with principal component analysis
Wilson, Matthew; Meyer-Baese, Uwe
2015-05-01
Principal component analysis (PCA) is a popular technique for reducing the dimension of a large data set so that more informed conclusions can be made about the relationships between the values in the data set. Blind source separation (BSS) is one of the many applications of PCA, where it is used to separate linearly mixed signals into their source signals. This project attempts to implement a BSS system in hardware. Due to the unique characteristics of hardware implementation, the Generalized Hebbian Algorithm (GHA), a learning network model, is used. The FPGA used to compile and test the system is the Altera Cyclone III EP3C120F780I7.
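The GHA (Sanger's rule) mentioned above performs online PCA without ever forming a covariance matrix, which is what makes it attractive for hardware. The sketch below is a software illustration of the learning rule only, not the FPGA design; the toy data, learning rate and epoch count are invented.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy 2-D data with a known dominant direction u and a weak orthogonal
# direction v (illustrative only).
n = 4000
u = np.array([np.cos(0.3), np.sin(0.3)])
v = np.array([-np.sin(0.3), np.cos(0.3)])
X = np.outer(rng.normal(0, 3.0, n), u) + np.outer(rng.normal(0, 0.5, n), v)

def gha(X, n_components=2, lr=1e-3, epochs=5):
    """Sanger's Generalized Hebbian Algorithm: each sample updates the
    weights as dW = lr * (y x^T - lower_triangular(y y^T) W)."""
    d = X.shape[1]
    W = rng.normal(scale=0.1, size=(n_components, d))
    for _ in range(epochs):
        for x in X:
            y = W @ x
            W += lr * (np.outer(y, x) - np.tril(np.outer(y, y)) @ W)
    return W

W = gha(X)
w0 = W[0] / np.linalg.norm(W[0])
print(abs(w0 @ u))  # close to 1: the first row converges to the principal axis
```

The lower-triangular term implements the implicit deflation that makes successive rows converge to successive principal components, which maps naturally onto a streaming hardware pipeline.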
A treatment of EEG data by underdetermined blind source separation for motor imagery classification
Koldovský, Zbyněk; Phan, A. H.; Tichavský, Petr; Cichocki, A.
Bucharest: EURASIP, 2012, s. 1484-1488. ISBN 978-1-4673-1068-0. ISSN 2076-1465. [20th European Signal Processing Conference (EUSIPCO 2012). Bukurešť (RO), 27.08.2012-31.08.2012] Grant ostatní: GA ČR(CZ) GAP103/11/1947 Institutional support: RVO:67985556 Keywords : electroencephalogram * brain-computer Interface * underdetermined blind source separation Subject RIV: FH - Neurology http://library.utia.cas.cz/separaty/2012/SI/tichavsky-a treatment of eeg data by underdetermined blind source separation for motor imagery classification.pdf
黄晋英; 潘宏侠; 毕世华; 杨喜旺
2008-01-01
Blind source separation (BSS) technology was applied to the vibration signal processing of a gearbox for separating different fault vibration sources and enhancing fault information. An improved BSS algorithm based on particle swarm optimization (PSO) was proposed. It changes the traditional de-noising-based approach to fault enhancement, and it addresses the practical problems of fault location and low fault diagnosis rates in the early stage. It was applied to the vibration signals of a gearbox under three working states. The results prove that BSS greatly enhances fault information and provides a technical method for the diagnosis of weak faults.
Blind Source Separation with Conjugate Gradient Algorithm and Kurtosis Maximization Criterion
Sanjeev N Jain
2016-02-01
Blind source separation (BSS) is a technique for estimating individual source components from their mixtures at multiple sensors. It is called blind because no information other than the mixtures is used. Recently, blind source separation has received attention because of its potential applications in signal processing, such as speech recognition systems, telecommunications and medical signal processing. Blind source separation of super- and sub-Gaussian signals is proposed utilizing a conjugate gradient algorithm and a kurtosis maximization criterion. In our previous paper, the ABC algorithm was utilized for blind source separation; here, we improve the technique with changes to the fitness function and the scout bee phase. The fitness function is improved with the use of a kurtosis maximization criterion, and the scout bee phase is improved with the use of a conjugate gradient algorithm. The evaluation metrics used for performance evaluation are fitness function values and distance values. A comparative analysis is also carried out by comparing our proposed technique to other prominent techniques. The technique achieved an average distance of 38.39, an average fitness value of 6.94, an average Gaussian distance of 58.60 and an average Gaussian fitness of 5.02. The technique attained the lowest average distance value among all techniques and good values for all other evaluation metrics, which shows the effectiveness of the proposed technique.
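The kurtosis maximization criterion itself can be illustrated with a plain fixed-point iteration (the classical kurtosis-based FastICA update) rather than the paper's conjugate gradient / ABC scheme, which is swapped out here for brevity. The mixing matrix and sources below are invented.

```python
import numpy as np

rng = np.random.default_rng(4)

# Two sub-Gaussian toy sources (a square wave and uniform noise), mixed.
n = 4000
t = np.linspace(0, 1, n)
S = np.vstack([np.sign(np.sin(2 * np.pi * 7 * t)),   # square wave
               rng.uniform(-1, 1, n)])               # uniform noise
S = (S - S.mean(axis=1, keepdims=True)) / S.std(axis=1, keepdims=True)
X = np.array([[0.8, 0.4], [0.3, 0.9]]) @ S

def whiten(X):
    Xc = X - X.mean(axis=1, keepdims=True)
    d, E = np.linalg.eigh(np.cov(Xc))
    return (E / np.sqrt(d)) @ E.T @ Xc

def kurtosis_ica(Z, iters=100):
    """Deflationary fixed-point iteration maximizing |kurtosis|:
    w <- E[z (w^T z)^3] - 3 w, then orthogonalize and normalize."""
    W = []
    for _ in range(Z.shape[0]):
        w = rng.normal(size=Z.shape[0])
        w /= np.linalg.norm(w)
        for _ in range(iters):
            y = w @ Z
            w_new = (Z * y ** 3).mean(axis=1) - 3 * w
            for v in W:                  # deflation: stay orthogonal
                w_new -= (w_new @ v) * v
            w = w_new / np.linalg.norm(w_new)
        W.append(w)
    return np.array(W) @ Z

Y = kurtosis_ica(whiten(X))
corr = np.abs(np.corrcoef(np.vstack([Y, S]))[:2, 2:])
print(corr.max(axis=1))  # each recovered signal matches one source
```

On whitened data, extrema of the kurtosis of w^T z correspond to the independent components, for both super- and sub-Gaussian sources, which is the property the paper's fitness function exploits.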
Separation of beam and electrons in the spallation neutron source H- ion source
The Spallation Neutron Source (SNS) requires an ion source producing an H- beam with a peak current of 35 mA at a 6.2% duty factor. For the design of this ion source, extracted electrons must be transported and dumped without adversely affecting the H- beam optics. Two issues are considered: (1) electron containment, transport and controlled removal; and (2) first-order H- beam steering. For electron containment, various magnetic, geometric and electrode biasing configurations are analyzed. A kinetic description for the negative ions and electrons is employed, with self-consistent fields obtained from a steady-state solution to Poisson's equation. Guiding center electron trajectories are used when the gyroradius is sufficiently small. The magnetic fields used to control the transport of the electrons, and the asymmetric sheath produced by the gyrating electrons, steer the ion beam. Scenarios for correcting this steering by split acceleration and focusing electrodes are considered in some detail.
Multichannel audio signal source separation based on an Interchannel Loudness Vector Sum
Park, Taejin; Lee, Taejin
2015-01-01
In this paper, a Blind Source Separation (BSS) algorithm for multichannel audio content is proposed. Unlike common BSS algorithms targeting stereo audio content or microphone array signals, our technique is targeted at multichannel audio such as 5.1- and 7.1-channel audio. Since most multichannel audio object sources are panned using the Inter-channel Loudness Difference (ILD), we employ the Inter-channel Loudness Vector Sum (ILVS) concept to cluster common signals (such as background music) from ...
On-line isotope separation. Tests for targets and ion sources compatibility
We have performed a compilation of the influence of various parameters on suitable targets (composition, structure and nuclear constraints) for fission and spallation reactions induced by charged particles. In that case, targets are generally located near or inside the ionization chamber. A survey of typical ion sources and separators, particularly those used with heavy ion beams, is given. These sources are often fed either by a helium jet transport system or by a catcher foil.
Carabias Orti, Julio J; Cobos, Máximo; Vera Candeas, Pedro; Rodríguez Serrano, Francisco J
2013-01-01
Close-microphone techniques are extensively employed in many live music recordings, allowing for interference rejection and reducing the amount of reverberation in the resulting instrument tracks. However, despite the use of directional microphones, the recorded tracks are not completely free from source interference, a problem which is commonly known as microphone leakage. While source separation methods are potentially a solution to this problem, few approaches take into account the huge am...
Ahuja, Chaitanya; Nathwani, Karan; Rajesh M. Hegde
2014-01-01
Conventional NMF methods for source separation factorize the matrix of spectral magnitudes. Spectral Phase is not included in the decomposition process of these methods. However, phase of the speech mixture is generally used in reconstructing the target speech signal. This results in undesired traces of interfering sources in the target signal. In this paper the spectral phase is incorporated in the decomposition process itself. Additionally, the complex matrix factorization problem is reduce...
Takuya Isomura
2015-12-01
Blind source separation is the computation underlying the cocktail party effect: a partygoer can distinguish a particular talker's voice from the ambient noise. Early studies indicated that the brain might use blind source separation as a signal processing strategy for sensory perception, and numerous mathematical models have been proposed; however, it remains unclear how neural networks extract particular sources from a complex mixture of inputs. We discovered that neurons in cultures of dissociated rat cortical cells could learn to represent particular sources while filtering out other signals. Specifically, distinct classes of neurons in the culture learned to respond to the distinct sources after repeated training stimulation. Moreover, the neural network structures changed to reduce free energy, as predicted by the free-energy principle, a candidate unified theory of learning and memory, and by Jaynes' principle of maximum entropy. This implicit learning can only be explained by some form of Hebbian plasticity. These results are the first in vitro (as opposed to in silico) demonstration of neural networks performing blind source separation, and the first formal demonstration of neuronal self-organization under the free-energy principle.
Yuan, Yalin; Yabe, Mitsuyasu
2014-01-01
A source separation program for household kitchen waste has been in place in Beijing since 2010. However, the participation rate of residents is far from satisfactory. This study was carried out to identify residents' preferences regarding an improved management strategy for household kitchen waste source separation. We determine the preferences of residents in an ad hoc sample, according to their age level, for source separation services, and their marginal willingness to accept compensation for the service attributes. We used a multinomial logit model to analyze the data, collected from 394 residents in the Haidian and Dongcheng districts of Beijing through a choice experiment. The results show differences in preferences for the service attributes between young, middle-aged, and older residents. Low compensation is not a major factor in persuading young and middle-aged residents to accept the proposed separation services. However, on average, most of them prefer services with frequent collection, evening collection and plastic bag attributes, and without an instructor. This study indicates that there is potential for the local government to improve the current separation services accordingly. PMID:25546279
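A toy multinomial logit of the kind used in such choice experiments: choice probabilities are a softmax over linear utilities, and willingness to accept compensation for an attribute is the ratio of its coefficient to the compensation coefficient. All coefficient values and attribute names below are invented for illustration and are not the study's estimates.

```python
import numpy as np

# Invented MNL coefficients for a kitchen-waste separation service
# choice experiment (NOT the estimates reported in the study).
beta = {
    "frequent_collection": 0.8,
    "evening_slot": 0.5,
    "plastic_bag_allowed": 0.4,
    "instructor_present": -0.3,
    "compensation_yuan": 0.02,   # marginal utility of compensation
}

def utility(attrs):
    return sum(beta[k] * v for k, v in attrs.items())

def choice_probabilities(alternatives):
    """Multinomial logit: P(i) = exp(V_i) / sum_j exp(V_j)."""
    v = np.array([utility(a) for a in alternatives])
    e = np.exp(v - v.max())      # subtract max for numerical stability
    return e / e.sum()

# Willingness to accept for an attribute = -beta_attr / beta_compensation.
wta_instructor = -beta["instructor_present"] / beta["compensation_yuan"]

alts = [
    {"frequent_collection": 1, "evening_slot": 1, "plastic_bag_allowed": 1,
     "instructor_present": 0, "compensation_yuan": 10},
    {"frequent_collection": 0, "evening_slot": 0, "plastic_bag_allowed": 0,
     "instructor_present": 1, "compensation_yuan": 10},
]
p = choice_probabilities(alts)
print(p)                # ≈ [0.88, 0.12]: the frequent/evening/bag option wins
print(wta_instructor)   # 15.0 yuan to tolerate an instructor, in this toy
```

In an actual study the coefficients would be estimated by maximum likelihood from the observed choices, separately per age group.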
Co-Parenting: Sharing Your Child Equally. A Source Book for the Separated or Divorced Family.
Galper, Miriam
This source book introduces perspectives and skills which can contribute to successful "co-parenting" (joint custody, joint parenting, co-custody or shared custody) of preadolescent children after parents are separated or divorced. Chapter One introduces the concept of co-parenting. Chapter Two advances an approach to developing flexible…
Resource recovery from source separated domestic waste(water) streams; Full scale results
Zeeman, G.; Kujawa, K.
2011-01-01
A major fraction of the nutrients emitted from households is originally present in only 1% of the total wastewater volume. New sanitation concepts enable the recovery and reuse of these nutrients from feces and urine. Two possible sanitation concepts are presented, with varying degrees of source separation.
Micropollutant removal in an algal treatment system fed with source separated wastewater streams
Wilt, de H.A.; Butkovskyi, A.; Tuantet, K.; Hernandez Leal, L.; Fernandes, T.; Langenhoff, A.A.M.; Zeeman, G.
2016-01-01
Micropollutant removal in an algal treatment system fed with source separated wastewater streams was studied. Batch experiments with the microalgae Chlorella sorokiniana grown on urine, anaerobically treated black water and synthetic urine were performed to assess the removal of six spiked pharmaceu
Blind Separation of Nonstationary Sources Based on Spatial Time-Frequency Distributions
Zhang Yimin
2006-01-01
Blind source separation (BSS) based on spatial time-frequency distributions (STFDs) provides improved performance over blind source separation methods based on second-order statistics when dealing with signals that are localized in the time-frequency (t-f) domain. In this paper, we propose the use of STFD matrices for both whitening and recovery of the mixing matrix, which are two stages commonly required in many BSS methods, to provide BSS performance that is robust to noise. In addition, a simple method is proposed to select the auto- and cross-term regions of the time-frequency distribution (TFD). To further improve the BSS performance, t-f grouping techniques are introduced to reduce the number of signals under consideration and to allow the receiver array to separate more sources than the number of array sensors, provided that the sources have disjoint t-f signatures. With the use of one or more techniques proposed in this paper, improved performance in blind separation of nonstationary signals can be achieved.
ENVIRONMENTAL EFFICIENCY, SEPARABILITY AND ABATEMENT COSTS OF NON-POINT SOURCE POLLUTION
Wossink, Ada; Denaux, Zulal Sogutlu
2002-01-01
This paper presents a new framework for analyzing abatement costs of nonpoint-source pollution. Unlike previous studies, this framework treats production and pollution as non-separable and also recognizes that production inefficiency is a fundamental cause of pollution. The implications of this approach are illustrated using an empirical analysis for cotton producers.
RESEARCH OF QUANTUM GENETIC ALGORITH AND ITS APPLICATION IN BLIND SOURCE SEPARATION
Yang Junan; Li Bin; Zhuang Zhenquan
2003-01-01
This letter proposes two algorithms: a novel Quantum Genetic Algorithm (QGA) based on an improvement of Han's Genetic Quantum Algorithm (GQA), and a new Blind Source Separation (BSS) method based on QGA and Independent Component Analysis (ICA). The simulation results show that the efficiency of the new BSS method is markedly higher than that of the Conventional Genetic Algorithm (CGA).
Blind separation of sources in nonlinear convolved mixture based on a novel network
胡英; 杨杰; 沈利
2004-01-01
Blind separation of independent sources from their nonlinear convolved mixtures is a more realistic problem than separation from linear ones. A solution to this problem based on the entropy maximization principle is presented. First we propose a novel two-layer network as the de-mixing system to separate sources in nonlinear convolved mixtures. In the output layer of our network we use a feedback architecture to cope with convolved mixtures. Then we derive learning algorithms for the two-layer network by maximizing the information entropy. Comparison of computer simulation results shows that the proposed algorithm achieves better nonlinear convolved blind signal separation than H.H. Y.'s algorithm.
Naroznova, Irina; Møller, Jacob; Scheutz, Charlotte
2013-01-01
The environmental performance of two pretreatment technologies for source-separated organic waste was compared using life cycle assessment (LCA). An innovative pulping process, where source-separated organic waste is pulped with cold water forming a volatile-solid-rich biopulp, was compared to a more...... including a number of non-toxic and toxic impact categories were assessed. No big difference in the overall performance of the two technologies was observed. The difference for the separate life cycle steps was, however, more pronounced. More efficient material transfer in the scenario with waste pulping...... resulted in a higher biogas output and nutrient recovery and, thus, higher impact savings related to biogas production and digestate utilization. Meanwhile, a larger reject amount in the screw press scenario led to more savings obtained by utilization of the reject in this scenario.
Difficulties applying recent blind source separation techniques to EEG and MEG
Knuth, Kevin H
2015-01-01
High temporal resolution measurements of human brain activity can be performed by recording the electric potentials on the scalp surface (electroencephalography, EEG), or by recording the magnetic fields near the surface of the head (magnetoencephalography, MEG). The analysis of the data is problematic due to the fact that multiple neural generators may be simultaneously active and the potentials and magnetic fields from these sources are superimposed on the detectors. It is highly desirable to un-mix the data into signals representing the behaviors of the original individual generators. This general problem is called blind source separation and several recent techniques utilizing maximum entropy, minimum mutual information, and maximum likelihood estimation have been applied. These techniques have had much success in separating signals such as natural sounds or speech, but appear to be ineffective when applied to EEG or MEG signals. Many of these techniques implicitly assume that the source distributions hav...
Non-parametric Bayesian models of response function in dynamic image sequences
Tichý, Ondřej; Šmídl, Václav
-, - (2016). ISSN 1077-3142 R&D Projects: GA ČR GA13-29225S Institutional support: RVO:67985556 Keywords : Response function * Blind source separation * Dynamic medical imaging * Probabilistic models * Bayesian methods Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 1.540, year: 2014 http://library.utia.cas.cz/separaty/2016/AS/tichy-0456983.pdf
Public opinion about the source separation of municipal solid waste in Shanghai, China.
Zhang, Weiqian; Che, Yue; Yang, Kai; Ren, Xiangyu; Tai, Jun
2012-12-01
For decades the generation of municipal solid waste (MSW) in Shanghai has been increasing. Despite long-standing efforts at MSW management (MSWM), the disposal of MSW still performs poorly. Thus, an MSW minimisation plan for Shanghai was proposed in December 2010. In this study, direct face-to-face interviews and a structured questionnaire survey were used in four different Shanghai community types. We conducted an econometric analysis of the social factors that influence the willingness to pay for MSW separation and discussed the household waste characteristics, the daily waste generation and the current treatment of kitchen waste. The results suggested that the respondents are environmentally aware of separation, but practise only minimal separation. Negative neighbour effects, confusing classification of MSW, and mixed transportation and disposal are the dominant limitations of MSW source-separated collection. Most respondents are willing to pay for MSWM. Public support is influenced by household population, income and cost. The attitudes and behaviours of citizens are important for reducing the amount of MSW disposed of by 50% per capita by 2020 (relative to 2010). Concerted efforts should be taken to enlarge the pilot areas. In addition, the source separation of kitchen waste should be promoted. PMID:23045226
Tang, Gang; Luo, Ganggang; Zhang, Weihua; Yang, Caijin; Wang, Huaqing
2016-01-01
In the condition monitoring of roller bearings, the measured signals are often compounded due to the unknown multi-vibration sources and complex transfer paths. Moreover, the sensors are limited in particular locations and numbers. Thus, this is a problem of underdetermined blind source separation for the vibration sources estimation, which makes it difficult to extract fault features exactly by ordinary methods in running tests. To improve the effectiveness of compound fault diagnosis in roller bearings, the present paper proposes a new method to solve the underdetermined problem and to extract fault features based on variational mode decomposition. In order to surmount the shortcomings of inadequate signals collected through limited sensors, a vibration signal is firstly decomposed into a number of band-limited intrinsic mode functions by variational mode decomposition. Then, the demodulated signal with the Hilbert transform of these multi-channel functions is used as the input matrix for independent component analysis. Finally, the compound faults are separated effectively by carrying out independent component analysis, which enables the fault features to be extracted more easily and identified more clearly. Experimental results validate the effectiveness of the proposed method in compound fault separation, and a comparison experiment shows that the proposed method has higher adaptability and practicability in separating strong noise signals than the commonly-used ensemble empirical mode decomposition method. PMID:27322268
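The envelope step described in the abstract above (Hilbert demodulation of the band-limited modes before independent component analysis) can be sketched in isolation. The following is a minimal numpy illustration, not the authors' code; the 5 Hz modulation and 400 Hz carrier are made-up stand-ins for a bearing fault frequency and a resonance band:

```python
import numpy as np

def analytic_signal(x):
    """Analytic signal via the frequency-domain Hilbert transform."""
    n = len(x)
    spectrum = np.fft.fft(x)
    weights = np.zeros(n)
    weights[0] = 1.0
    if n % 2 == 0:
        weights[n // 2] = 1.0
        weights[1:n // 2] = 2.0   # double positive frequencies, zero negative ones
    else:
        weights[1:(n + 1) // 2] = 2.0
    return np.fft.ifft(spectrum * weights)

# Made-up bearing-style signal: a slow "fault" modulation on a fast resonance carrier.
t = np.linspace(0.0, 1.0, 4096, endpoint=False)
modulator = 1.0 + 0.5 * np.sin(2 * np.pi * 5 * t)    # hypothetical fault frequency
carrier = np.cos(2 * np.pi * 400 * t)                # hypothetical resonance band
envelope = np.abs(analytic_signal(modulator * carrier))
# The envelope recovers the modulator, exposing the 5 Hz fault signature.
```

It is these demodulated envelopes, one per band-limited mode, that form the input matrix for the ICA stage in the method above.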
Yan, Jun; Dong, Danan; Chen, Wen
2016-04-01
Due to the development of GNSS technology and the improvement of its positioning accuracy, observational data obtained by GNSS are widely used in space geodesy and geodynamics research. The GNSS time series of observation stations contain a wealth of information, including geographical space changes, deformation of the Earth, migration of subsurface material, instantaneous deformation of the Earth, weak deformation and other hidden ("blind") signals. In order to extract the instantaneous subsurface deformation, weak deformation and other hidden signals contained in GNSS time series, we apply Independent Component Analysis (ICA) to daily station coordinate time series of the Southern California Integrated GPS Network. ICA is based on the statistical characteristics of the observed signal: it exploits non-Gaussianity and independence to recover the source signals of the underlying geophysical events. As part of the post-processing of precise GNSS time series, this paper examines the series using both the principal component analysis (PCA) module of QOCA and an ICA algorithm to separate the source signals. We then compare these two signal separation techniques, PCA and ICA, for separating the original signals related to geophysical disturbances from the observed series. The analysis demonstrates that, when multiple factors are present, PCA suffers from ambiguity in the separation of source signals, i.e. the attribution of the results is not clear, whereas ICA performs better. This makes ICA the more suitable choice for GNSS time series in which the combination of source signals is unknown.
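The PCA ambiguity discussed above has a compact numerical illustration: whitening (the core of PCA) fixes the component scales but leaves an arbitrary rotation undetermined, which ICA resolves by maximizing non-Gaussianity. This is a hedged sketch with a made-up 2x2 mixing matrix and uniform sources, not the QOCA/SCIGN pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20000
# Two independent non-Gaussian (uniform) sources and a made-up mixing matrix.
s = rng.uniform(-1.0, 1.0, size=(2, n))
A = np.array([[0.8, 0.4], [0.3, 0.9]])
x = A @ s

# PCA step: whitening. This decorrelates and normalizes the data but leaves
# a rotation undetermined -- the "ambiguity" of PCA-only separation.
x = x - x.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(x))
z = (E / np.sqrt(d)).T @ x

# ICA step: resolve the rotation by maximizing non-Gaussianity (|kurtosis|).
def rot(t):
    return np.array([[np.cos(t), -np.sin(t)], [np.sin(t), np.cos(t)]])

def kurt(y):
    return np.mean(y ** 4, axis=1) - 3.0 * np.mean(y ** 2, axis=1) ** 2

angles = np.linspace(0.0, np.pi / 2, 361)
best = max(angles, key=lambda t: np.abs(kurt(rot(t) @ z)).sum())
y = rot(best) @ z   # ICA estimate of the sources (up to order and sign)
```

Any rotation of `z` has identity covariance, so second-order statistics alone cannot pick one; the kurtosis criterion is what singles out the independent sources.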
Blind Source Separation for Robot Audition using Fixed Beamforming with HRTFs
Maazaoui, Mounira; Grenier, Yves; Abed-Meraim, Karim
2011-01-01
We present a two stage blind source separation (BSS) algorithm for robot audition. The algorithm is based on a beamforming preprocessing and a BSS algorithm using a sparsity separation criterion. Before the BSS step, we filter the sensors outputs by beamforming filters to reduce the reverberation and the environmental noise. As we are in a robot audition context, the manifold of the sensor array in this case is hard to model, so we use pre-measured Head Related Transfer Functions (HRTFs) to e...
Wang, Fa-Yu; Chi, Chong-Yung; Chan, Tsung-Han; Wang, Yue
2010-05-01
Although significant efforts have been made in developing nonnegative blind source separation techniques, accurate separation of positive yet dependent sources remains a challenging task. In this paper, a joint correlation function of multiple signals is proposed to reveal and confirm that the observations after nonnegative mixing would have higher joint correlation than the original unknown sources. Accordingly, a new nonnegative least-correlated component analysis (n/LCA) method is proposed to design the unmixing matrix by minimizing the joint correlation function among the estimated nonnegative sources. In addition to a closed-form solution for unmixing two mixtures of two sources, the general algorithm of n/LCA for the multisource case is developed based on an iterative volume maximization (IVM) principle and linear programming. The source identifiability and required conditions are discussed and proven. The proposed n/LCA algorithm, denoted by n/LCA-IVM, is evaluated with both simulation data and real biomedical data to demonstrate its superior performance over several existing benchmark methods. PMID:20299711
Development of the high temperature ion-source for the Grenoble electromagnetic isotope separator
The production of high-purity stable or radioactive isotopes (≥ 99.99 per cent) by electromagnetic separation requires equipment with a high resolving power. Moreover, in order to collect rare or short half-life isotopes, the efficiency of the ion source must be high (η > 5 to 10 per cent). With this in view, the source built operates at high temperatures (2500-3000 C) and makes use of ionisation by electron bombardment or of thermo-ionisation. The first part of this work summarises the essential characteristics of isotope separator ion sources; a diagram of the principle of the source built is then given together with its characteristics. The second part gives the values of the resolving power and of the efficiency of the Grenoble isotope separator fitted with such a source. The resolving power measured at 10 per cent of the peak height is of the order of 200. At the first magnetic stage the efficiency is between 1 and 26 per cent for a range of elements evaporating between 200 and 3000 C. Thus equipped, the separator has, for example, given at the first stage 10 mg of 180Hf at (99.69 ± 0.1) per cent, corresponding to an enrichment coefficient of 580; more recently, 2 mg of 150Nd at (99.996 ± 0.002) per cent, corresponding to an enrichment coefficient of 4.2 x 105, was obtained at the second stage. (author)
Secondary sources of uranium include materials from which it is uneconomical to extract it as the main product using currently available technologies. Such sources are generated as co-product or by-product of processing feed materials for products other than uranium. The secondary sources can include industrial solid or liquid streams in which uranium concentration may be low, but in view of large amounts of feed-stock, the quantity of uranium recoverable could be significant. Examples include sedimentary phosphates (as well as products derived therefrom), coal ash, niobium-tantalum slag and even sea water. Monazite is a phosphatic secondary source where uranium is obtainable as a by-product of production of rare earths and thorium. The term secondary source also includes solid residues, slag, scraps etc generated as a waste product of fuel fabrication facilities. It also includes contaminated sites and equipment from conventional uranium mills that need to be decontaminated and decommissioned. In some of the secondary sources, it is possible that the concentration of uranium can be fairly high, but the processing is constrained by the complexity of the host matrix or the chemical form of uranium or presence of other elements. Recovery of uranium from secondary sources is an eco-friendly process as it serves to isolate the uranium from the environment and the future generations are thereby spared the burden of caring for such materials. It is possible that for many of the secondary sources, the concentration of uranium is below the safe limit set by currently applicable regulations. However the collective societal dose integrated over the long exposure times associated with the long half-life of uranium can be significant. In accordance with the 'as low as reasonably achievable' (ALARA) principle of radiation protection, uranium separation is desirable as a 'green' activity, This has been acknowledged in IAEA documents on the long term uranium supplies, which also
Bayesian analysis of exoplanet and binary orbits
Schulze-Hartung, Tim; Henning, Thomas
2012-01-01
We introduce BASE (Bayesian astrometric and spectroscopic exoplanet detection and characterisation tool), a novel program for the combined or separate Bayesian analysis of astrometric and radial-velocity measurements of potential exoplanet hosts and binary stars. The capabilities of BASE are demonstrated using all publicly available data of the binary Mizar A.
Y. Yokoo
2014-09-01
This study compared a time source hydrograph separation method to a geographic source separation method, to assess if the two methods produced similar results. The time source separation of a hydrograph was performed using a numerical filter method and the geographic source separation was performed using an end-member mixing analysis employing hourly discharge, electric conductivity, and turbidity data. These data were collected in 2006 at the Kuroiwa monitoring station on the Abukuma River, Japan. The results of the methods corresponded well in terms of both surface flow components and inter-flow components. In terms of the baseflow component, the result of the time source separation method corresponded with the moving average of the baseflow calculated by the geographic source separation method. These results suggest that the time source separation method is not only able to estimate numerical values for the discharge components, but that the estimates are also reasonable from a geographical viewpoint in the 3000 km2 watershed discussed in this study. The consistent results obtained using the time source and geographic source separation methods demonstrate that it is possible to characterize dominant runoff processes using hourly discharge data, thereby enhancing our capability to interpret the dominant runoff processes of a watershed using observed discharge data alone.
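The "numerical filter" style of time-source separation mentioned above is commonly implemented as a one-parameter recursive digital filter. A minimal sketch under that assumption (the Lyne-Hollick form; the synthetic hydrograph and the alpha value are illustrative, not the study's Abukuma River data):

```python
import numpy as np

def separate_hydrograph(q, alpha=0.925):
    """One-parameter recursive digital filter (Lyne-Hollick type) splitting
    streamflow q into a fast (quickflow) and a slow (baseflow) component.
    alpha is the filter parameter; 0.9-0.95 is a commonly quoted range."""
    q = np.asarray(q, dtype=float)
    quick = np.zeros_like(q)
    for i in range(1, len(q)):
        f = alpha * quick[i - 1] + 0.5 * (1.0 + alpha) * (q[i] - q[i - 1])
        quick[i] = min(max(f, 0.0), q[i])   # keep components physically valid
    return quick, q - quick

# Made-up hourly hydrograph: an exponential recession plus one storm peak.
t = np.arange(200.0)
q = 5.0 * np.exp(-t / 150.0) + 20.0 * np.exp(-0.5 * ((t - 80.0) / 6.0) ** 2)
quick, base = separate_hydrograph(q)
```

The filter needs only the discharge series itself, which is the point of the comparison above: its output can then be checked against a geographic (end-member mixing) separation.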
Blind source separation of multichannel electroencephalogram based on wavelet transform and ICA
You Rong-Yi; Chen Zhong
2005-01-01
Combination of the wavelet transform and independent component analysis (ICA) was employed for blind source separation (BSS) of multichannel electroencephalogram (EEG). After denoising the original signals by discrete wavelet transform, high-frequency components of some noises and artifacts were removed from the original signals. The denoised signals were then reconstructed for the purpose of ICA, such that the drawback that ICA cannot distinguish noises from source signals can be overcome effectively. The practical processing results showed that this method is an effective way to perform BSS of multichannel EEG. The method is in effect a combination of the wavelet transform with an adaptive neural network, so it is also useful for BSS of other complex signals.
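The wavelet denoising step described above (suppress high-frequency detail coefficients before running ICA) can be sketched with a one-level Haar transform. This is an illustrative stand-in, not the paper's exact wavelet or threshold choice, and the sine-plus-noise signal is made up:

```python
import numpy as np

def haar_denoise(x, thresh):
    """One-level Haar transform, hard-threshold the detail (high-frequency)
    coefficients, then reconstruct. Even-length input assumed."""
    a = (x[0::2] + x[1::2]) / np.sqrt(2.0)    # approximation coefficients
    d = (x[0::2] - x[1::2]) / np.sqrt(2.0)    # detail coefficients
    d = np.where(np.abs(d) > thresh, d, 0.0)  # high-frequency noise removed here
    y = np.empty_like(x)
    y[0::2] = (a + d) / np.sqrt(2.0)          # exact inverse Haar transform
    y[1::2] = (a - d) / np.sqrt(2.0)
    return y

rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 1024)
clean = np.sin(2.0 * np.pi * 4.0 * t)             # smooth "source" signal
noisy = clean + 0.3 * rng.standard_normal(1024)   # additive sensor noise
denoised = haar_denoise(noisy, thresh=0.5)
```

In the method above, each EEG channel would be denoised this way (with a deeper decomposition) and the reconstructed channels passed to ICA.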
Frequency Domain Blind Source Separation for Robot Audition Using a Parameterized Sparsity Criterion
Abed-Meraim, Karim; Grenier, Y.; Maazaoui, Mounira
2011-01-01
In this paper, we introduce a modified lp norm blind source separation criterion based on the source sparsity in the time-frequency domain. We study the effect of making the sparsity constraint harder through the optimization process, making the parameter p of the lp norm vary from 1 to nearly 0 according to a sigmoid function. The sigmoid introduces a smooth lp norm variation which avoids the divergence of the algorithm. We compared this algorithm to the regular l1 norm minimization and an IC...
Šembera, Ondřej; Tichavský, Petr; Koldovský, Z.
Piscataway: IEEE, 2016, s. 4323-4327. ISBN 978-1-4799-9987-3. [IEEE International Conference on Acoustics, Speech, and Signal Processing 2016 (ICASSP2016). Shanghai (CN), 20.03.2016-25.03.2016] R&D Projects: GA ČR(CZ) GA14-13713S Institutional support: RVO:67985556 Keywords : Autoregressive Processes * Cramer-Rao Bound * Blind Source Separation Subject RIV: BB - Applied Statistics, Operational Research http://library.utia.cas.cz/separaty/2016/SI/tichavsky-0458485.pdf
Estimating International Tourism Demand to Spain Separately by the Major Source Markets
Marcos Alvarez-Díaz; Manuel González-Gómez; Mª Soledad Otero-Giraldez
2012-01-01
The objective of this paper is to estimate international tourism demand to Spain separately by major source markets (Germany, United Kingdom, France, Italy and The Netherlands) that represent 67% of the international tourism to Spain. In order to investigate how the tourism demand reacts to price and income changes, we apply the bounds testing approach to cointegration and construct confidence intervals using the bootstrap technique. The results show differences in tourism behavior depending ...
Blind Source Separation Based of Brain Computer Interface System: A review
Ahmed Kareem Abdullah; Zhang Chao Zhu
2014-01-01
This study reviews the origin and development of the Brain Computer Interface (BCI) system and focuses on BCI system design based on Blind Source Separation (BSS) techniques. The study also presents recent trends and discusses some new ideas for BSS techniques in BCI architecture; articles discussing BCI system development were analysed, and the types of BCI systems and recent BCI designs were explored. Since 1970 when the research of BCI system began in the Californi...
Performance of Blind Source Separation Algorithms for FMRI Analysis using a Group ICA Method
Correa, Nicolle; Adali, Tülay; Calhoun, Vince D.
2006-01-01
Independent component analysis (ICA) is a popular blind source separation (BSS) technique that has proven to be promising for the analysis of functional magnetic resonance imaging (fMRI) data. A number of ICA approaches have been used for fMRI data analysis, and even more ICA algorithms exist, however the impact of using different algorithms on the results is largely unexplored. In this paper, we study the performance of four major classes of algorithms for spatial ICA, namely information max...
Congedo, Marco; Gouy-Pailler, Cedric; Jutten, Christian
2008-01-01
Over the last ten years blind source separation (BSS) has become a prominent processing tool in the study of human electroencephalography (EEG). Without relying on head modeling, BSS aims at estimating both the waveform and the scalp spatial pattern of the intracranial dipolar currents responsible for the observed EEG. In this review we begin by placing the BSS linear instantaneous model of EEG within the framework of brain volume conduction theory. We then review the concept and current practic...
Industrial applications of extended output-only Blind Source Separation techniques
Rutten, Christophe; Nguyen, Viet Ha; Golinval, Jean-Claude
2011-01-01
In the field of structural health monitoring or machine condition monitoring, most vibration-based methods reported in the literature require measuring responses at several locations on the structure. In machine condition monitoring, the number of available vibration sensors is often small and it is not unusual that only one single sensor is used to monitor a machine. This paper presents industrial applications of two possible extensions of output-only Blind Source Separation (BSS) techniqu...
RESEARCH OF QUANTUM GENETIC ALGORITHM AND ITS APPLICATION IN BLIND SOURCE SEPARATION
Yang Junan; Li Bin; et al.
2003-01-01
This letter proposes two algorithms: a novel Quantum Genetic Algorithm (QGA) based on an improvement of Han's Genetic Quantum Algorithm (GQA), and a new Blind Source Separation (BSS) method based on QGA and Independent Component Analysis (ICA). The simulation results show that the efficiency of the new BSS method is markedly higher than that of the Conventional Genetic Algorithm (CGA).
Kirstein, Roland
2005-01-01
This paper presents a modification of the inspection game: the "Bayesian Monitoring" model rests on the assumption that judges are interested in enforcing compliant behavior and making correct decisions. They may base their judgements on an informative but imperfect signal which can be generated costlessly. In the original inspection game, monitoring is costly and generates a perfectly informative signal. While the inspection game has only one mixed-strategy equilibrium, three Perfect Bayesia...
Role of the source to building lateral separation distance in petroleum vapor intrusion
Verginelli, Iason; Capobianco, Oriana; Baciocchi, Renato
2016-06-01
The adoption of source to building separation distances to screen sites that need further field investigation is becoming a common practice for the evaluation of the vapor intrusion pathway at sites contaminated by petroleum hydrocarbons. For the source to building vertical distance, the screening criteria for petroleum vapor intrusion have been thoroughly investigated in the recent literature and fully addressed in the recent guidelines issued by ITRC and U.S. EPA. Conversely, due to the lack of field and modeling studies, the source to building lateral distance has received relatively little attention. To address this issue, in this work we present a steady-state vapor intrusion analytical model incorporating a piecewise first-order aerobic biodegradation limited by oxygen availability that accounts for lateral source to building separation. The developed model can be used to evaluate the role and relevance of lateral vapor attenuation as well as to provide a site-specific assessment of the lateral screening distances needed to attenuate vapor concentrations to risk-based values. The simulation outcomes proved consistent with field data and 3-D numerical modeling results reported in previous studies and, for shallow sources, with the screening criteria recommended by U.S. EPA for the vertical separation distance. Indeed, although petroleum vapors can cover maximum lateral distances of up to 25-30 m, as highlighted by the comparison of model outputs with field evidence of vapor migration in the subsurface, simulation results from this new model indicated that, regardless of the source concentration and depth, lateral distances of 6 m and 7 m are sufficient to attenuate petroleum vapors below risk-based values for groundwater and soil sources, respectively. However, for deep sources (> 5 m) and for low to moderate source concentrations (benzene concentrations lower than 5 mg/L in groundwater and 0.5 mg/kg in soil) the above criteria were found to be extremely conservative as
Biollaz, S.; Ludwig, Ch.; Stucki, S. [Paul Scherrer Inst. (PSI), Villigen (Switzerland)
1999-08-01
A literature search was carried out to determine sources and speciation of heavy metals in MSW. A combination of thermal and mechanical separation techniques is necessary to achieve the required high degrees of metal separation. Metallic goods should be separated mechanically, chemically bound heavy metals by a thermal process. (author) 1 fig., 1 tab., 6 refs.
Zwart, Jonathan T L; Jarvis, Matt J
2015-01-01
Measuring radio source counts is critical for characterizing new extragalactic populations, brings a wealth of science within reach and will inform forecasts for SKA and its pathfinders. Yet there is currently great debate (and few measurements) about the behaviour of the 1.4-GHz counts in the microJy regime. One way to push the counts to these levels is via 'stacking', the covariance of a map with a catalogue at higher resolution and (often) a different wavelength. For the first time, we cast stacking in a fully Bayesian framework, applying it to (i) the SKADS simulation and (ii) VLA data stacked at the positions of sources from the VIDEO survey. In the former case, the algorithm recovers the counts correctly when applied to the catalogue, but is biased high when confusion comes into play. This needs to be accounted for in the analysis of data from any relatively low-resolution SKA pathfinders. For the latter case, the observed radio source counts remain flat below the 5-sigma level of 85 microJy as far as 4...
Resonance ionization laser ion sources for on-line isotope separators (invited)
A Resonance Ionization Laser Ion Source (RILIS) is today considered an essential component of the majority of Isotope Separator On Line (ISOL) facilities; there are seven laser ion sources currently operational at ISOL facilities worldwide and several more are under development. The ionization mechanism is a highly element selective multi-step resonance photo-absorption process that requires a specifically tailored laser configuration for each chemical element. For some isotopes, isomer selective ionization may even be achieved by exploiting the differences in hyperfine structures of an atomic transition for different nuclear spin states. For many radioactive ion beam experiments, laser resonance ionization is the only means of achieving an acceptable level of beam purity without compromising isotope yield. Furthermore, by performing element selection at the location of the ion source, the propagation of unwanted radioactivity downstream of the target assembly is reduced. Whilst advances in laser technology have improved the performance and reliability of laser ion sources and broadened the range of suitable commercially available laser systems, many recent developments have focused rather on the laser/atom interaction region in the quest for increased selectivity and/or improved spectral resolution. Much of the progress in this area has been achieved by decoupling the laser ionization from competing ionization processes through the use of a laser/atom interaction region that is physically separated from the target chamber. A new application of gas catcher laser ion source technology promises to expand the capabilities of projectile fragmentation facilities through the conversion of otherwise discarded reaction fragments into high-purity low-energy ion beams. A summary of recent RILIS developments and the current status of laser ion sources worldwide is presented
Present status of singly charged ion ECR sources at the SARA on-line separator
Various 2.45 GHz microwave electron cyclotron resonance (ECR) ion sources designed with quartz tubes and without a hexapole have been developed and tested for the production, transport and focusing of singly-charged ions. A first on-line attempt to separate radioactive isotopes in a He-jet coupled mode was realized with a capillary skimmer ion-source injection system parallel to the source plasma axis. In order to improve the coupling of an ECR source with the He-jet system, a new compact metallic-body ion source with a skimmer-catcher injection arrangement perpendicular to the plasma has been designed. The layout of this new metallic ion source is given. The ionization efficiencies have been measured as a function of gas pressure for a complete off-line regime with various support gases and for a dynamical regime induced with He-jet injection simulating the subsequent on-line coupled mode conditions. (orig.)
PWC-ICA: A Method for Stationary Ordered Blind Source Separation with Application to EEG
Bigdely-Shamlo, Nima; Mullen, Tim; Robbins, Kay
2016-01-01
Independent component analysis (ICA) is a class of algorithms widely applied to separate sources in EEG data. Most ICA approaches use optimization criteria derived from temporal statistical independence and are invariant with respect to the actual ordering of individual observations. We propose a method of mapping real signals into a complex vector space that takes into account the temporal order of signals and enforces certain mixing stationarity constraints. The resulting procedure, which we call Pairwise Complex Independent Component Analysis (PWC-ICA), performs the ICA in a complex setting and then reinterprets the results in the original observation space. We examine the performance of our candidate approach relative to several existing ICA algorithms for the blind source separation (BSS) problem on both real and simulated EEG data. On simulated data, PWC-ICA is often capable of achieving a better solution to the BSS problem than AMICA, Extended Infomax, or FastICA. On real data, the dipole interpretations of the BSS solutions discovered by PWC-ICA are physically plausible, are competitive with existing ICA approaches, and may represent sources undiscovered by other ICA methods. In conjunction with this paper, the authors have released a MATLAB toolbox that performs PWC-ICA on real, vector-valued signals. PMID:27340397
Bessiere, Pierre; Ahuactzin, Juan Manuel; Mekhnacha, Kamel
2013-01-01
Probability as an Alternative to Boolean Logic: While logic is the mathematical foundation of rational reasoning and the fundamental principle of computing, it is restricted to problems where information is both complete and certain. However, many real-world problems, from financial investments to email filtering, are incomplete or uncertain in nature. Probability theory and Bayesian computing together provide an alternative framework to deal with incomplete and uncertain data. Decision-Making Tools and Methods for Incomplete and Uncertain Data: Emphasizing probability as an alternative to Boolean
Rey, Valentine; Rey, Christian
2016-01-01
This article deals with the computation of guaranteed lower bounds of the error in the framework of finite element (FE) and domain decomposition (DD) methods. In addition to a fully parallel computation, the proposed lower bounds separate the algebraic error (due to the use of a DD iterative solver) from the discretization error (due to the FE), which enables the steering of the iterative solver by the discretization error. These lower bounds are also used to improve the goal-oriented error estimation in a substructured context. Assessments on 2D static linear mechanic problems illustrate the relevance of the separation of sources of error and the lower bounds' independence from the substructuring. We also steer the iterative solver by an objective of precision on a quantity of interest. This strategy consists in a sequence of solvings and takes advantage of adaptive remeshing and recycling of search directions.
AN EME BLIND SOURCE SEPARATION ALGORITHM BASED ON GENERALIZED EXPONENTIAL FUNCTION
Miao Hao; Li Xiaodong; Tian Jing
2008-01-01
This letter investigates an improved blind source separation algorithm based on the Maximum Entropy (ME) criterion. The original ME algorithm chooses a fixed exponential or sigmoid function as the nonlinear mapping function, which cannot match the original signal very well. A parameter estimation method is employed in this letter to approximate the probability density function of any signal with a parameter-steered generalized exponential function. An improved learning rule and a natural gradient update formula for the unmixing matrix are also presented. The algorithm of this letter can separate mixtures of super-Gaussian signals as well as mixtures of sub-Gaussian signals. The simulation experiment demonstrates the efficiency of the algorithm.
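The natural-gradient update referred to above has a standard form, W ← W + μ(I − E[φ(y)yᵀ])W, where φ is the nonlinear mapping (score) function. The following is a minimal sketch using a fixed tanh score, i.e. the fixed-nonlinearity baseline that the letter improves on, not its generalized exponential version; the Laplacian sources and 2x2 mixing matrix are made up:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000
s = rng.laplace(size=(2, n))                 # two super-Gaussian sources
A = np.array([[1.0, 0.6], [0.5, 1.0]])       # made-up mixing matrix
x = A @ s

W = np.eye(2)                                # unmixing matrix estimate
lr = 0.05
for _ in range(500):
    y = W @ x
    phi = np.tanh(y)                         # fixed sigmoid-like score function
    # Natural-gradient learning rule: W <- W + lr * (I - E[phi(y) y^T]) W
    W += lr * (np.eye(2) - (phi @ y.T) / n) @ W
y = W @ x                                    # separated signals (up to scale/order)
```

Replacing the fixed `tanh` with a parameter-steered generalized exponential score, re-estimated from the data, is what lets the letter's algorithm handle sub-Gaussian as well as super-Gaussian mixtures.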
Dutta, R.; Wang, T.; Jonsson, S.
2014-12-01
A shallow magnitude 6.6 strike-slip earthquake occurred offshore west of the Fukuoka prefecture, northern Kyushu Island, Japan, in 2005. We use InSAR stable point-target data and GPS to constrain the location and source parameters of the mainshock. We use a uniform slip model on a rectangular dislocation in a homogeneous elastic half-space and implement Bayesian estimation to obtain uncertainties for the derived model parameters. The offshore location of the earthquake makes the fault parameter estimation challenging, as the geodetic data only cover the area to the east of the earthquake. The marginal distributions of the source parameters show that the location of the western end of the fault is poorly constrained by the data whereas the eastern end, located closer to the shore, is better resolved. We use a Gaussian a priori constraint on the moment magnitude (Mw 6.6) and on the location of the fault with respect to the aftershock distribution of the earthquake, and find the amount of fault slip to be in the range from 1 m to 1.3 m with decreasing probability. We propagate the fault model uncertainties and calculate the variability of Coulomb failure stress changes for the Kego fault, located directly below Fukuoka city, showing that the mainshock increased stress on the fault and brought it closer to failure, a concern for the Fukuoka city authorities and inhabitants.
Bayesian demography 250 years after Bayes.
Bijak, Jakub; Bryant, John
2016-01-01
Bayesian statistics offers an alternative to classical (frequentist) statistics. It is distinguished by its use of probability distributions to describe uncertain quantities, which leads to elegant solutions to many difficult statistical problems. Although Bayesian demography, like Bayesian statistics more generally, is around 250 years old, only recently has it begun to flourish. The aim of this paper is to review the achievements of Bayesian demography, address some misconceptions, and make the case for wider use of Bayesian methods in population studies. We focus on three applications: demographic forecasts, limited data, and highly structured or complex models. The key advantages of Bayesian methods are the ability to integrate information from multiple sources and to describe uncertainty coherently. Bayesian methods also allow for including additional (prior) information next to the data sample. As such, Bayesian approaches are complementary to many traditional methods, which can be productively re-expressed in Bayesian terms. PMID:26902889
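The ability to combine prior information with limited data, emphasized above, can be illustrated by the standard conjugate Gamma-Poisson update for a demographic rate; the prior parameters and counts below are invented purely for illustration.

```python
# Gamma(a, b) prior for a death rate (deaths per person-year),
# Poisson likelihood for observed deaths: conjugate update.
a0, b0 = 4.0, 400.0           # illustrative prior: mean a0/b0 = 0.01
deaths, exposure = 3, 250.0   # limited local data (person-years of exposure)

# posterior is Gamma(a0 + deaths, b0 + exposure)
a1, b1 = a0 + deaths, b0 + exposure
post_mean = a1 / b1           # shrinks the crude rate toward the prior mean
post_sd = a1 ** 0.5 / b1      # coherent uncertainty statement
```

With so few events, the crude rate 3/250 = 0.012 is noisy; the posterior mean 7/650 ≈ 0.0108 borrows strength from the prior, which is exactly the behaviour the paper advocates for small-area or limited-data settings.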
Variational Blind Source Separation Toolbox and its Application to Hyperspectral Image Data
Tichý, Ondřej; Šmídl, Václav
Piscataway: IEEE Computer Society, 2015, s. 1336-1340. ISBN 978-0-9928626-4-0. ISSN 2076-1465. [23rd European Signal Processing Conference (EUSIPCO). Nice (FR), 31.08.2015-04.09.2015] R&D Projects: GA ČR GA13-29225S Institutional support: RVO:67985556 Keywords : Blind source separation * Variational Bayes method * Sparse prior * Hyperspectral image Subject RIV: BB - Applied Statistics, Operational Research http://library.utia.cas.cz/separaty/2015/AS/tichy-0447094.pdf
Blind Source Separation in Farsi Language by Using Hermitian Angle in Convolutive Environment
Atefeh Soltani
2013-04-01
This paper presents a T-F masking method for convolutive blind source separation based on the Hermitian angle concept. The Hermitian angle is calculated between the T-F domain mixture vector and a reference vector. Two different reference vectors are assumed for calculating two different Hermitian angles, and these angles are then clustered with the k-means or FCM method to estimate unmixing masks. The well-known permutation problem is solved by k-means clustering of the estimated masks, which are partitioned into small groups. The experimental results show an improvement in performance when using two different reference vectors compared to only one.
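A minimal sketch of the Hermitian-angle masking idea, under assumptions not taken from the paper (two microphones, sources active in disjoint T-F bins, a single reference vector, and a tiny hand-rolled 1-D k-means):

```python
import numpy as np

def hermitian_angle(X, ref):
    """Hermitian angle between each T-F mixture vector and a reference vector.

    X   : (n_mics, n_bins) complex STFT coefficients
    ref : (n_mics,) complex reference vector
    """
    num = np.abs(np.conj(ref) @ X)
    den = np.linalg.norm(ref) * np.linalg.norm(X, axis=0) + 1e-12
    return np.arccos(np.clip(num / den, 0.0, 1.0))

def kmeans_1d(v, k=2, n_iter=50):
    """Tiny 1-D k-means used to cluster the angles into source masks."""
    c = np.linspace(v.min(), v.max(), k)
    for _ in range(n_iter):
        lab = np.argmin(np.abs(v[:, None] - c[None, :]), axis=1)
        for j in range(k):
            if np.any(lab == j):
                c[j] = v[lab == j].mean()
    return lab

# toy: two sources occupying (mostly) disjoint T-F bins, two microphones
rng = np.random.default_rng(0)
a1 = np.array([1.0, 0.2 + 0.1j])   # mixing vector of source 1 (assumed)
a2 = np.array([0.3j, 1.0])         # mixing vector of source 2 (assumed)
S = rng.standard_normal((2, 400)) * (rng.random((2, 400)) > 0.5)
X = np.outer(a1, S[0]) + np.outer(a2, S[1])

theta = hermitian_angle(X, ref=np.array([1.0, 0.0]))
mask = kmeans_1d(theta)            # 0/1 label per T-F bin -> binary masks
```

Bins dominated by one source share (almost exactly) the same Hermitian angle, because the angle is invariant to the scalar source value, so clustering the angles recovers the binary masks.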
Time-domain Blind Audio Source Separation Using Advanced Component Clustering and Reconstruction
Koldovský, Zbyněk; Tichavský, Petr
Trento: IEEE, 2008, s. 216-219. ISBN 978-1-4244-2337-8; ISBN 978-1-4244-2338-5. [Hands-free Speech Communication and Microphone Arrays 2008. Trento (IT), 06.05.2008-08.05.2008] R&D Projects: GA MŠk 1M0572 Grant ostatní: GA ČR(CZ) GP102/07/P384 Institutional research plan: CEZ:AV0Z10750506 Keywords : blind source separation * audio signals Subject RIV: BI - Acoustics
Extension of EFICA Algorithm for Blind Separation of Piecewise Stationary Non-Gaussian Sources
Koldovský, Zbyněk; Málek, J.; Tichavský, Petr; Deville, Y.; Hosseini, S.
Bryan: Conference Management Services, 2008, s. 1913-1916. ISBN 978-1-4244-1483-3; ISBN 1-4244-1484-9. [ICASSP 2008, IEEE International Conference on Acoustics, Speech adn Signal Processing. Las Vegas (US), 30.03.2008-04.04.2008] R&D Projects: GA MŠk 1M0572 Grant ostatní: GA ČR(CZ) GP102/07/P384 Institutional research plan: CEZ:AV0Z10750506 Keywords : independent component analysis * piecewise stationary signals * blind source separation Subject RIV: BB - Applied Statistics, Operational Research
Kinetic Modeling of the Dynamic PET Brain Data Using Blind Source Separation Methods
Tichý, Ondřej; Šmídl, Václav
Dalian, China: IEEE press, 2014, s. 244-249. ISBN 978-1-4799-5837-5. [The 2014 7th International Conference on BioMedical Engineering and Informatics. Dalian (CN), 14.10.2014-16.10.2014] R&D Projects: GA ČR GA13-29225S Institutional support: RVO:67985556 Keywords : blind source separation * dynamic PET * input function * deconvolution Subject RIV: BB - Applied Statistics, Operational Research http://library.utia.cas.cz/separaty/2014/AS/tichy-0433424.pdf
Generic Uniqueness of a Structured Matrix Factorization and Applications in Blind Source Separation
Domanov, Ignat; Lathauwer, Lieven De
2016-06-01
Algebraic geometry, although little explored in signal processing, provides tools that are very convenient for investigating generic properties in a wide range of applications. Generic properties are properties that hold "almost everywhere". We present a set of conditions that are sufficient for demonstrating the generic uniqueness of a certain structured matrix factorization. This set of conditions may be used as a checklist for generic uniqueness in different settings. We discuss two particular applications in detail. We provide a relaxed generic uniqueness condition for joint matrix diagonalization that is relevant for independent component analysis in the underdetermined case. We present generic uniqueness conditions for a recently proposed class of deterministic blind source separation methods that rely on mild source models. For the interested reader we provide some intuition on how the results are connected to their algebraic geometric roots.
Thibodeau, C; Monette, F; Glaus, M; Laflamme, C B
2011-01-01
The black water and grey water source-separation sanitation system aims at efficient use of energy (biogas), water and nutrients, but currently lacks evidence of economic viability to be considered a credible alternative to the conventional system. This study intends to demonstrate economic viability, identify the main cost contributors and assess critical influencing factors. A techno-economic model was built based on a new neighbourhood in a Canadian context. Three implementation scales of the source-separation system are defined: 500, 5,000 and 50,000 inhabitants. The results show that the source-separation system is 33% to 118% more costly than the conventional system, with the largest cost differential at the smallest implementation scale. A sensitivity analysis demonstrates that reducing the vacuum toilet flow from 1.0 to 0.25 L/flush decreases the source-separation system cost by 23-27%. It also shows that high resource costs can be beneficial or unfavourable to the source-separation system depending on whether the vacuum toilet flow is low or normal. Therefore, the future of this configuration of the source-separation system lies mainly in vacuum toilet flow reduction or the introduction of new efficient effluent volume reduction processes (e.g. reverse osmosis). PMID:22170836
Escolano, Jose; Xiang, Ning; Perez-Lorenzo, Jose M; Cobos, Maximo; Lopez, Jose J
2014-02-01
Sound source localization using a two-microphone array is an active area of research, with considerable potential for use with video conferencing, mobile devices, and robotics. Based on the observed time-differences of arrival between sound signals, a probability distribution of the location of the sources is considered to estimate the actual source positions. However, these algorithms assume a given number of sound sources. This paper describes an updated research account on the solution presented in Escolano et al. [J. Acoust. Soc. Am. 132(3), 1257-1260 (2012)], where nested sampling is used to explore a probability distribution of the source position using a Laplacian mixture model, which allows both the number and position of speech sources to be inferred. This paper presents different experimental setups and scenarios to demonstrate the viability of the proposed method, which is compared with some of the most popular sampling methods, demonstrating that nested sampling is an accurate tool for speech localization. PMID:25234883
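Nested sampling, the exploration method referenced above, can be sketched on a 1-D toy problem: a single source position on a line with a Gaussian likelihood and a uniform prior. Rejection sampling from the prior, used here to draw a replacement point above the likelihood floor, is only viable in such a toy; real implementations use constrained MCMC or ellipsoidal sampling.

```python
import numpy as np

rng = np.random.default_rng(0)

def loglike(x):
    # toy: a single "source" at x = 0.3, Gaussian likelihood of width 0.05
    return -0.5 * ((x - 0.3) / 0.05) ** 2

N = 100                                  # number of live points
live = rng.uniform(0, 1, N)              # uniform prior on [0, 1]
logL = loglike(live)

Z, X_prev = 0.0, 1.0                     # evidence accumulator, prior volume
for i in range(1, 801):
    worst = np.argmin(logL)
    X_i = np.exp(-i / N)                 # expected prior-volume shrinkage
    Z += np.exp(logL[worst]) * (X_prev - X_i)
    X_prev = X_i
    # replace the worst point by a prior draw above the likelihood floor
    while True:
        x = rng.uniform(0, 1)
        if loglike(x) > logL[worst]:
            break
    live[worst], logL[worst] = x, loglike(x)

Z += np.exp(logL).mean() * X_prev        # contribution of the remaining live points
# analytic evidence here: integral of the Gaussian over [0, 1] ~ 0.05 * sqrt(2*pi)
```

The estimated evidence `Z` is what lets nested sampling compare models with different numbers of sources, which is how the paper infers the source count.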
Resource recovery from source separated domestic waste(water) streams; full scale results.
Zeeman, Grietje; Kujawa-Roeleveld, Katarzyna
2011-01-01
A major fraction of the nutrients emitted from households is originally present in only 1% of the total wastewater volume. New sanitation concepts enable the recovery and reuse of these nutrients from feces and urine. Two possible sanitation concepts are presented, with varying degrees of source separation leading to various recovery products. Separate vacuum collection and transport followed by anaerobic treatment of concentrated black water (BW), demonstrated at the scale of 32 houses, preserves 7.6 gN/p/d and 0.63 gP/p/d, amounting to 69% and 48%, respectively, of the N and P theoretically produced in the household; 95% of the retained P was shown to be recoverable via struvite precipitation. Reuse of the anaerobic sludge in agriculture can substantially increase the P recovery. Energy recovery in the form of biogas from anaerobic digestion of concentrated BW fits well in new concepts of sustainable, zero-energy buildings. Nutrient recovery from separately collected urine yields a lower percentage of nutrient recovery than BW but can, on the other hand, often be implemented in existing sanitation concepts. Theoretically, 11 gN/p/d and 1.0 gP/p/d are produced with urine, of which 38-63% and 34-61% were recovered in practice at scales of 8-160 inhabitants in Sweden. New sanitation concepts with resource recovery and reuse are being demonstrated worldwide and more and more experience is being gained. PMID:22105119
Separation of Radio-Frequency Sources and Localization of Partial Discharges in Noisy Environments
Guillermo Robles
2015-04-01
The detection of partial discharges (PD) can help in early-warning detection systems to protect critical assets in power systems. The radio-frequency emission of these events can be measured with antennas even when the equipment is in service, which dramatically reduces maintenance costs and favours the implementation of condition-based monitoring systems. The drawback of this type of measurement is the difficulty of having a reference signal with which to study the events in a classical phase-resolved partial discharge (PRPD) pattern. Therefore, in open-air substations and overhead lines, where interferences from radio and TV broadcasting and mobile communications are important sources of noise and other pulsed interferences from rectifiers or inverters can be present, it is difficult to identify whether there is partial discharge activity or not. This paper proposes a robust method to separate the events captured with the antennas, identify which of them are partial discharges and localize the piece of equipment that is having problems. The separation is done with power ratio (PR) maps based on the spectral characteristics of the signal, and the identification of the type of event is done by localizing the source with an array of four antennas. Several classical methods to calculate the time differences of arrival (TDOA) of the emission at the antennas have been tested, and the localization is done using particle swarm optimization (PSO) to minimize a distance function.
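The TDOA-plus-PSO localization step can be sketched as follows. The 2-D antenna layout, source position, and noise-free delays are assumptions for the toy; the paper works with measured radio-frequency signals and a four-antenna array in 3-D.

```python
import numpy as np

rng = np.random.default_rng(0)
C = 3e8                                                    # propagation speed (m/s)
ants = np.array([[0, 0], [5, 0], [0, 5], [5, 5]], float)   # assumed antenna array (m)
src = np.array([3.2, 1.7])                                 # hypothetical PD source (m)

def tdoa(p):
    """Time differences of arrival w.r.t. antenna 0 for a point p."""
    d = np.linalg.norm(ants - p, axis=1)
    return (d[1:] - d[0]) / C

meas = tdoa(src)                                           # noise-free toy measurements

def cost(p):
    # distance function: squared TDOA residuals
    return np.sum((tdoa(p) - meas) ** 2)

# particle swarm optimization over a 10 m x 10 m search area
n_part, w, c1, c2 = 30, 0.7, 1.5, 1.5
pos = rng.uniform(0, 10, (n_part, 2))
vel = np.zeros((n_part, 2))
pbest, pcost = pos.copy(), np.array([cost(p) for p in pos])
g = pbest[np.argmin(pcost)].copy()
for _ in range(200):
    r1, r2 = rng.random((n_part, 1)), rng.random((n_part, 1))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (g - pos)
    pos = pos + vel
    c = np.array([cost(p) for p in pos])
    better = c < pcost
    pbest[better], pcost[better] = pos[better], c[better]
    g = pbest[np.argmin(pcost)].copy()
# g: estimated source position, should converge close to src
```

With noise-free delays the cost minimum coincides with the true position, so the swarm's global best `g` lands near `src`.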
Micropollutant removal in an algal treatment system fed with source separated wastewater streams.
de Wilt, Arnoud; Butkovskyi, Andrii; Tuantet, Kanjana; Leal, Lucia Hernandez; Fernandes, Tânia V; Langenhoff, Alette; Zeeman, Grietje
2016-03-01
Micropollutant removal in an algal treatment system fed with source separated wastewater streams was studied. Batch experiments with the microalgae Chlorella sorokiniana grown on urine, anaerobically treated black water and synthetic urine were performed to assess the removal of six spiked pharmaceuticals (diclofenac, ibuprofen, paracetamol, metoprolol, carbamazepine and trimethoprim). Additionally, incorporation of these pharmaceuticals and three estrogens (estrone, 17β-estradiol and ethinylestradiol) into algal biomass was studied. Biodegradation and photolysis led to 60-100% removal of diclofenac, ibuprofen, paracetamol and metoprolol. Removal of carbamazepine and trimethoprim was incomplete and did not exceed 30% and 60%, respectively. Sorption to algal biomass accounted for less than 20% of the micropollutant removal. Furthermore, the presence of micropollutants did not inhibit C. sorokiniana growth at applied concentrations. Algal treatment systems allow simultaneous removal of micropollutants and recovery of nutrients from source separated wastewater. Nutrient rich algal biomass can be harvested and applied as fertilizer in agriculture, as lower input of micropollutants to soil is achieved when algal biomass is applied as fertilizer instead of urine. PMID:26546707
Carabias-Orti, Julio J.; Cobos, Máximo; Vera-Candeas, Pedro; Rodríguez-Serrano, Francisco J.
2013-12-01
Close-microphone techniques are extensively employed in many live music recordings, allowing for interference rejection and reducing the amount of reverberation in the resulting instrument tracks. However, despite the use of directional microphones, the recorded tracks are not completely free from source interference, a problem which is commonly known as microphone leakage. While source separation methods are potentially a solution to this problem, few approaches take into account the huge amount of prior information available in this scenario. In fact, besides the special properties of close-microphone tracks, the knowledge on the number and type of instruments making up the mixture can also be successfully exploited for improved separation performance. In this paper, a nonnegative matrix factorization (NMF) method making use of all the above information is proposed. To this end, a set of instrument models are learnt from a training database and incorporated into a multichannel extension of the NMF algorithm. Several options to initialize the algorithm are suggested, exploring their performance in multiple music tracks and comparing the results to other state-of-the-art approaches.
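The fixed-basis NMF idea (instrument models learnt in advance, activations estimated from the mixture) can be sketched with plain Euclidean multiplicative updates; the bases and dimensions below are invented, and the paper's multichannel extension is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

def nmf_activations(V, W, n_iter=200, eps=1e-9):
    """Estimate activations H for V ~ W H with the basis W held fixed
    (multiplicative updates for the Euclidean cost)."""
    H = rng.random((W.shape[1], V.shape[1]))
    for _ in range(n_iter):
        # multiplicative update keeps H nonnegative
        H *= (W.T @ V) / (W.T @ W @ H + eps)
    return H

# toy: two "instrument" spectral bases, assumed learnt from training data
W = np.array([[1.0, 0.0],
              [1.0, 0.5],
              [0.0, 1.0],
              [0.0, 1.0]])
H_true = np.abs(rng.random((2, 50)))
V = W @ H_true                  # mixture magnitude spectrogram (4 bands x 50 frames)

H = nmf_activations(V, W)       # per-instrument activations over time
```

Each column of `W` plays the role of a pre-trained instrument model; separation then amounts to reconstructing each source as `W[:, [k]] @ H[[k], :]`.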
Fate of pharmaceuticals in full-scale source separated sanitation system.
Butkovskyi, A; Hernandez Leal, L; Rijnaarts, H H M; Zeeman, G
2015-11-15
Removal of 14 pharmaceuticals and 3 of their transformation products was studied in a full-scale source separated sanitation system with separate collection and treatment of black water and grey water. Black water is treated in an up-flow anaerobic sludge blanket (UASB) reactor followed by oxygen-limited autotrophic nitrification-denitrification in a rotating biological contactor and struvite precipitation. Grey water is treated in an aerobic activated sludge process. Concentrations of 10 pharmaceuticals and 2 transformation products in black water ranged from low μg/l to low mg/l. Additionally, 5 pharmaceuticals were also present in grey water in the low μg/l range. Pharmaceutical influent loads were distributed over the two streams: 70% of the diclofenac load was present in grey water, while the other compounds were predominantly associated with black water. Removal in the UASB reactor fed with black water exceeded 70% for 9 of the 12 detected pharmaceuticals, with only two pharmaceuticals removed by sorption to sludge. Ibuprofen and the transformation product of naproxen, desmethylnaproxen, were removed in the rotating biological contactor. In contrast, only paracetamol removal exceeded 90% in the grey water treatment system, while removal of the other 7 pharmaceuticals was below 40% or even negative. The efficiency of pharmaceutical removal in the source separated sanitation system was compared with removal in conventional sewage treatment plants. Furthermore, effluent concentrations of the black water and grey water treatment systems were compared with predicted no-effect concentrations to assess toxicity of the effluent. Concentrations of diclofenac, ibuprofen and oxazepam in both effluents were higher than predicted no-effect concentrations, indicating the necessity of post-treatment. Ciprofloxacin, metoprolol and propranolol were found in UASB sludge in the μg/g range, while pharmaceutical concentrations in struvite did not exceed the detection limits. PMID:26364222
Valeriy Bekmuradov
2014-10-01
Production of biofuels such as ethanol from lignocellulosic biomass is a beneficial way to meet sustainability and energy security goals in the future. The main challenge in bioethanol conversion is the high cost of processing, in which enzymatic hydrolysis and fermentation are the major steps. One strategy to lower processing costs is to utilize both the glucose and xylose sugars present in biomass for conversion. An approach featuring enzymatic hydrolysis and fermentation steps, known as separate hydrolysis and fermentation (SHF), was used in this work. The proposed solution is to use "pre-processing" technologies, including the thermal screw press (TSP) and cellulose-organic-solvent based lignocellulose fractionation (COSLIF) pretreatments. Such treatments were conducted on a widely available feedstock, source separated organic waste (SSO), to liberate all sugars to be used in the fermentation process. Enzymatic hydrolysis was carried out with the commercially available enzyme Accellerase 1500. On average, the sugar yield from the TSP and COSLIF pretreatments followed by enzymatic hydrolysis was a remarkable 90%. In this work, ethanol yields from the SSO hydrolysate obtained after COSLIF pretreatment and enzymatic hydrolysis were compared by fermentation with two different recombinant strains: Zymomonas mobilis 8b and Saccharomyces cerevisiae DA2416. At 48 hours of fermentation, the ethanol yield was 0.48 g of ethanol per gram of SSO biomass for Z. mobilis 8b and 0.50 g of ethanol per gram of SSO biomass for S. cerevisiae DA2416. This study provides important insights into ethanol production from source-separated organic (SSO) waste by different strains and becomes a useful tool to facilitate future process optimization for pilot scale facilities.