Informed Source Separation: A Bayesian Tutorial
Knuth, Kevin
2013-01-01
Source separation problems are ubiquitous in the physical sciences; any situation where signals are superimposed calls for source separation to estimate the original signals. In this tutorial I will discuss the Bayesian approach to the source separation problem. This approach has a specific advantage in that it requires the designer to explicitly describe the signal model in addition to any other information or assumptions that go into the problem description. This leads naturally to the idea...
Low Complexity Bayesian Single Channel Source Separation
DEFF Research Database (Denmark)
Beierholm, Thomas; Pedersen, Brian Dam; Winther, Ole
We propose a simple Bayesian model for performing single channel speech separation using factorized source priors in a sliding window linearly transformed domain. Using a one-dimensional mixture of Gaussians to model each band source leads to fast tractable inference for the source signals... Simulations with separation of a male and a female speaker, using priors trained on the same speakers, show comparable performance with the blind separation approach of G.-J. Jang and T.-W. Lee (see NIPS, vol. 15, 2003) with an SNR improvement of 4.9 dB for both the male and female speaker. Mixing coefficients... keeping the complexity low using machine learning and CASA (computational auditory scene analysis) approaches (Jang and Lee, 2003; Roweis, S.T., 2001; Wang, D.L. and Brown, G.J., 1999; Hu, G. and Wang, D., 2003)...
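The factorized-prior inference the abstract describes can be sketched for a single transform-domain coefficient: with zero-mean Gaussian-mixture priors on each source and an observed sum x = s1 + s2, the posterior mean of s1 is exactly tractable. All the weights and variances below are illustrative, not the authors' trained priors.

```python
import numpy as np

def separate_sample(x, w1, v1, w2, v2):
    """Posterior-mean estimate of source 1 from one mixture sample
    x = s1 + s2, where each source has a zero-mean 1-D Gaussian-mixture
    prior (weights w, variances v)."""
    w1, v1, w2, v2 = map(np.asarray, (w1, v1, w2, v2))
    # Joint responsibility of each component pair (i, j):
    # p(i, j | x) ∝ w1[i] * w2[j] * N(x; 0, v1[i] + v2[j])
    V = v1[:, None] + v2[None, :]
    lik = np.exp(-0.5 * x**2 / V) / np.sqrt(2 * np.pi * V)
    resp = (w1[:, None] * w2[None, :]) * lik
    resp /= resp.sum()
    # Conditional on (i, j), E[s1 | x] is Wiener-style: x * v1[i] / (v1[i] + v2[j])
    cond_mean = x * v1[:, None] / V
    return float((resp * cond_mean).sum())
```

With single-component priors of equal variance the estimate halves the observation, as expected for symmetric sources.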
Sparsity in Bayesian Blind Source Separation and Deconvolution
Czech Academy of Sciences Publication Activity Database
Šmídl, Václav; Tichý, Ondřej
Berlin Heidelberg: Springer, 2013, s. 548-563. (Lecture Notes in Computer Science. vol. 8189. part II). ISBN 978-3-642-40990-5. ISSN 0302-9743. [The European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases (ECMLPKDD 2013). Praha (CZ), 24.09.2013-26.09.2013] R&D Projects: GA ČR GA13-29225S Keywords : Blind Source Separation * Deconvolution * Sparsity * Scintigraphy Subject RIV: BB - Applied Statistics, Operational Research http://library.utia.cas.cz/separaty/2013/AS/tichy-sparsity in bayesian blind source separation and deconvolution.pdf
Bayesian Blind Source Separation of Positive Non Stationary Sources
Ichir, Mahieddine M.; Mohammad-Djafari, Ali
2004-11-01
In this contribution, we address the problem of blind non-negative source separation, which finds application in many fields of data analysis. We propose a novel approach based on Gamma mixture priors: Gamma densities constrain the unobserved sources to lie on the positive half-plane, and a mixture density with a first-order Markov model on the associated hidden variables accounts for possible non-stationarity of the sources. Posterior mean estimates are obtained via appropriate Markov chain Monte Carlo sampling.
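A toy, one-dimensional stand-in for the Gamma-prior positivity constraint and MCMC posterior-mean estimation described above; the observation model, prior parameters, and proposal scale are all assumptions of this sketch, not the paper's Gamma-mixture model.

```python
import numpy as np

def posterior_mean_positive(x, a=1.0, noise_sd=0.5, shape=2.0, rate=1.0,
                            n_iter=20000, seed=0):
    """Metropolis sampler for one nonnegative source value s with a
    Gamma(shape, rate) prior and observation x = a*s + Gaussian noise."""
    rng = np.random.default_rng(seed)

    def log_post(s):
        if s <= 0:                       # Gamma prior confines s to s > 0
            return -np.inf
        return ((shape - 1) * np.log(s) - rate * s
                - 0.5 * ((x - a * s) / noise_sd) ** 2)

    s, samples = 1.0, []
    for _ in range(n_iter):
        prop = s + rng.normal(0, 0.3)    # random-walk proposal
        if np.log(rng.uniform()) < log_post(prop) - log_post(s):
            s = prop
        samples.append(s)
    return float(np.mean(samples[n_iter // 2:]))   # discard burn-in
```

Negative proposals are rejected outright, so every retained sample respects the positivity constraint by construction.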
Reconstruction of Zeff profiles at TEXTOR through Bayesian source separation
International Nuclear Information System (INIS)
We describe a work in progress on the reconstruction of radial profiles for the ion effective charge Zeff on the TEXTOR tokamak, using statistical data analysis techniques. We introduce our diagnostic for the measurement of Bremsstrahlung emissivity signals. Zeff profiles can be determined by Abel inversion of line-integrated measurements of the Bremsstrahlung emissivity (εff) from the plasma and the plasma electron density (ne) and temperature (Te). However, at the plasma edge only estimated values are routinely used for ne and Te, which are moreover determined at different toroidal locations. These various uncertainties hinder the interpretation of a Zeff profile outside the central plasma. In order to circumvent this problem, we propose several scenarios meant to allow the extraction by (Bayesian) Blind Source Separation techniques of either (line-integrated) Zeff wave shapes or absolutely calibrated signals from (line-integrated) emissivity signals, using also density and temperature signals, as required. (authors)
Bayesian Blind Source Separation with Unknown Prior Covariance
Czech Academy of Sciences Publication Activity Database
Tichý, Ondřej; Šmídl, Václav
Cham : Springer, 2015 - (Vincent, E.; Yeredor, A.; Koldovský, Z.; Tichavský, P.), s. 352-359 ISBN 978-3-319-22481-7. ISSN 0302-9743. - (Lecture Notes in Computer Science. 9237). [12th International Conference on Latent Variable Analysis and Signal Separation. Liberec (CZ), 25.08.2015-28.08.2015] R&D Projects: GA ČR GA13-29225S Institutional support: RVO:67985556 Keywords : Blind source separation * Covariance model * Variational Bayes approximation * Non-negative matrix factorization Subject RIV: BB - Applied Statistics, Operational Research http://library.utia.cas.cz/separaty/2015/AS/tichy-0447092.pdf
Bayesian Source Separation Applied to Identifying Complex Organic Molecules in Space
Knuth, Kevin H; Choinsky, Joshua; Maunu, Haley A; Carbon, Duane F
2014-01-01
Emission from a class of benzene-based molecules known as Polycyclic Aromatic Hydrocarbons (PAHs) dominates the infrared spectrum of star-forming regions. The observed emission appears to arise from the combined emission of numerous PAH species, each with its unique spectrum. Linear superposition of the PAH spectra identifies this problem as a source separation problem. It is, however, of a formidable class of source separation problems given that different PAH sources potentially number in the hundreds, even thousands, and there is only one measured spectral signal for a given astrophysical site. Fortunately, the source spectra of the PAHs are known, but the signal is also contaminated by other spectral sources. We describe our ongoing work in developing Bayesian source separation techniques relying on nested sampling in conjunction with an ON/OFF mechanism enabling simultaneous estimation of the probability that a particular PAH species is present and its contribution to the spectrum.
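The ON/OFF idea can be illustrated for a single known spectral template: an indicator variable switches the species in or out, and marginalising over its amplitude gives the posterior probability of presence. The grid marginalisation below is a crude stand-in for the nested-sampling integral, and the noise level, amplitude grid, and prior are invented for the sketch.

```python
import numpy as np

def prob_species_present(y, template, noise_sd=0.1, amps=None, prior_on=0.5):
    """Posterior probability that one known spectral template is present
    in the measured spectrum y, using an ON/OFF indicator and a grid
    marginalisation over the template amplitude."""
    if amps is None:
        amps = np.linspace(0.0, 2.0, 201)

    def log_lik(resid):
        return -0.5 * np.sum((resid / noise_sd) ** 2)

    # Evidence with the species ON: average likelihood over the amplitude grid
    lik_on = np.mean([np.exp(log_lik(y - a * template)) for a in amps])
    lik_off = np.exp(log_lik(y))          # species OFF: template absent
    post_on = prior_on * lik_on
    return post_on / (post_on + (1 - prior_on) * lik_off)
```

When the data contain a clear copy of the template, the OFF model's likelihood collapses and the presence probability approaches one.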
Convergent Bayesian formulations of blind source separation and electromagnetic source estimation
Knuth, Kevin H.; Vaughan Jr, Herbert G.
2015-01-01
We consider two areas of research that have been developing in parallel over the last decade: blind source separation (BSS) and electromagnetic source estimation (ESE). BSS deals with the recovery of source signals when only mixtures of signals can be obtained from an array of detectors and the only prior knowledge consists of some information about the nature of the source signals. On the other hand, ESE utilizes knowledge of the electromagnetic forward problem to assign source signals to th...
On Sparsity in Bayesian Blind Source Separation for Dynamic Medical Imaging
Czech Academy of Sciences Publication Activity Database
Tichý, Ondřej
Praha : Katedra matematiky, FSv ČVUT, 2014, s. 20-21. [Rektorysova Soutěž. Praha (CZ), 3.12.2014] R&D Projects: GA ČR GA13-29225S Institutional support: RVO:67985556 Keywords : blind source separation * dynamic medical imaging * sparsity constraint Subject RIV: BB - Applied Statistics, Operational Research http://library.utia.cas.cz/separaty/2014/AS/tichy-0436843.pdf
Estimation of Input Function from Dynamic PET Brain Data Using Bayesian Blind Source Separation
Czech Academy of Sciences Publication Activity Database
Tichý, Ondřej; Šmídl, Václav
2015-01-01
Roč. 12, č. 4 (2015), s. 1273-1287. ISSN 1820-0214 R&D Projects: GA ČR GA13-29225S Institutional support: RVO:67985556 Keywords : blind source separation * Variational Bayes method * dynamic PET * input function * deconvolution Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.477, year: 2014 http://library.utia.cas.cz/separaty/2015/AS/tichy-0450509.pdf
Bayesian kinematic earthquake source models
Minson, S. E.; Simons, M.; Beck, J. L.; Genrich, J. F.; Galetzka, J. E.; Chowdhury, F.; Owen, S. E.; Webb, F.; Comte, D.; Glass, B.; Leiva, C.; Ortega, F. H.
2009-12-01
Most coseismic, postseismic, and interseismic slip models are based on highly regularized optimizations that yield a single solution satisfying the data under a particular set of regularizing constraints. This regularization hampers our ability to answer basic questions, such as whether seismic and aseismic slip overlap or instead rupture separate portions of the fault zone. We present a Bayesian methodology for generating kinematic earthquake source models, with a focus on large subduction zone earthquakes. Unlike classical optimization approaches, Bayesian techniques sample the ensemble of all acceptable models, presented as an a posteriori probability density function (PDF), and thus we can explore the entire solution space to determine, for example, which model parameters are well determined and which are not, or what the likelihood is that two slip distributions overlap in space. Bayesian sampling also has the advantage that all a priori knowledge of the source process can be used to mold the a posteriori ensemble of models. Although very powerful, Bayesian methods have up to now been of limited use in geophysical modeling because they are computationally feasible only for problems with a small number of free parameters, due to what is called the "curse of dimensionality." However, our methodology can successfully sample solution spaces of many hundreds of parameters, which is sufficient to produce finite fault kinematic earthquake models. Our algorithm is a modification of the tempered Markov chain Monte Carlo (tempered MCMC or TMCMC) method. In our algorithm, we sample a "tempered" a posteriori PDF using many MCMC simulations running in parallel, with evolutionary computation in which models that fit the data poorly are preferentially eliminated in favor of models that better predict the data. We present results both for synthetic test problems and for the 2007 Mw 7.7 Tocopilla, Chile earthquake, the latter of which is constrained by InSAR and local high-rate GPS data.
Single channel signal component separation using Bayesian estimation
Institute of Scientific and Technical Information of China (English)
Cai Quanwei; Wei Ping; Xiao Xianci
2007-01-01
A Bayesian estimation method to separate multicomponent signals from a single channel observation is presented in this paper. By using basis function projection, component separation becomes a problem of estimating a limited set of parameters. A Bayesian model for estimating the parameters is then set up, and the reversible jump MCMC (Markov chain Monte Carlo) algorithm is adopted to perform the Bayesian computation. The method can jointly estimate the parameters of each component and the number of components. Simulation results demonstrate that the method has a low SNR threshold and good performance.
Informed source separation: source coding meets source separation
Ozerov, Alexey; Liutkus, Antoine; Badeau, Roland; Richard, Gaël
2011-01-01
We consider the informed source separation (ISS) problem where, given the sources and the mixtures, any kind of side-information can be computed during a so-called encoding stage. This side-information is then used to assist source separation, given the mixtures only, at the so-called decoding stage. State of the art ISS approaches do not really consider ISS as a coding problem and rely on some purely source separation-inspired strategies, leading to performances that can at best reach those ...
A Bayesian approach to earthquake source studies
Minson, Sarah
Bayesian sampling has several advantages over conventional optimization approaches to solving inverse problems. It produces the distribution of all possible models sampled proportionally to how much each model is consistent with the data and the specified prior information, and thus images the entire solution space, revealing the uncertainties and trade-offs in the model. Bayesian sampling is applicable to both linear and non-linear modeling, and the values of the model parameters being sampled can be constrained based on the physics of the process being studied and do not have to be regularized. However, these methods are computationally challenging for high-dimensional problems. Until now the computational expense of Bayesian sampling has been too great for it to be practicable for most geophysical problems. I present a new parallel sampling algorithm called CATMIP for Cascading Adaptive Tempered Metropolis In Parallel. This technique, based on Transitional Markov chain Monte Carlo, makes it possible to sample distributions in many hundreds of dimensions, if the forward model is fast, or to sample computationally expensive forward models in smaller numbers of dimensions. The design of the algorithm is independent of the model being sampled, so CATMIP can be applied to many areas of research. I use CATMIP to produce a finite fault source model for the 2007 Mw 7.7 Tocopilla, Chile earthquake. Surface displacements from the earthquake were recorded by six interferograms and twelve local high-rate GPS stations. Because of the wealth of near-fault data, the source process is well-constrained. I find that the near-field high-rate GPS data have significant resolving power above and beyond the slip distribution determined from static displacements. The location and magnitude of the maximum displacement are resolved. The rupture almost certainly propagated at sub-shear velocities. The full posterior distribution can be used not only to calculate source parameters but also
Bayesian Kinematic Finite Fault Source Models (Invited)
Minson, S. E.; Simons, M.; Beck, J. L.
2010-12-01
Finite fault earthquake source models are inherently under-determined: there is no unique solution to the inverse problem of determining the rupture history at depth as a function of time and space when our data are only limited observations at the Earth's surface. Traditional inverse techniques rely on model constraints and regularization to generate one model from the possibly broad space of all possible solutions. However, Bayesian methods allow us to determine the ensemble of all possible source models which are consistent with the data and our a priori assumptions about the physics of the earthquake source. Until now, Bayesian techniques have been of limited utility because they are computationally intractable for problems with as many free parameters as kinematic finite fault models. We have developed a methodology called Cascading Adaptive Tempered Metropolis In Parallel (CATMIP) which allows us to sample very high-dimensional problems in a parallel computing framework. The CATMIP algorithm combines elements of simulated annealing and genetic algorithms with the Metropolis algorithm to dynamically optimize the algorithm's efficiency as it runs. We will present synthetic performance tests of finite fault models made with this methodology as well as a kinematic source model for the 2007 Mw 7.7 Tocopilla, Chile earthquake. This earthquake was well recorded by multiple ascending and descending interferograms and a network of high-rate GPS stations whose records can be used as near-field seismograms.
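A heavily simplified, one-parameter sketch of the tempering-plus-resampling idea behind CATMIP: anneal from the prior to the posterior through intermediate targets p_beta ∝ prior × likelihood^beta, reweighting and resampling particles at each step and rejuvenating them with a Metropolis move. The fixed schedule, particle count, and proposal scale are illustrative; the real algorithm adapts these dynamically and runs in parallel.

```python
import numpy as np

def tempered_sample(log_lik, log_prior, sample_prior, n=2000, n_steps=10,
                    step_sd=0.5, seed=0):
    """Minimal tempering-with-resampling sampler for one scalar parameter."""
    rng = np.random.default_rng(seed)
    theta = np.array([sample_prior(rng) for _ in range(n)])
    betas = np.linspace(0.0, 1.0, n_steps + 1)
    for b0, b1 in zip(betas[:-1], betas[1:]):
        # Bridge from prior*lik^b0 to prior*lik^b1 via importance weights
        w = np.exp((b1 - b0) * np.array([log_lik(t) for t in theta]))
        w /= w.sum()
        theta = theta[rng.choice(n, size=n, p=w)]        # resample
        for i in range(n):                               # rejuvenate: 1 MH move
            prop = theta[i] + rng.normal(0, step_sd)
            d = (b1 * (log_lik(prop) - log_lik(theta[i]))
                 + log_prior(prop) - log_prior(theta[i]))
            if np.log(rng.uniform()) < d:
                theta[i] = prop
    return theta
```

On a toy problem with a diffuse Gaussian prior and a Gaussian likelihood centred at 3, the final ensemble concentrates near the analytic posterior mean.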
A Bayesian method for microseismic source inversion
Pugh, D. J.; White, R. S.; Christie, P. A. F.
2016-08-01
Earthquake source inversion is highly dependent on location determination and velocity models. Uncertainties in both the model parameters and the observations need to be rigorously incorporated into an inversion approach. Here, we show a probabilistic Bayesian method that allows formal inclusion of the uncertainties in the moment tensor inversion. This method allows the combination of different sets of far-field observations, such as P-wave and S-wave polarities and amplitude ratios, into one inversion. Additional observations can be included by deriving a suitable likelihood function from the uncertainties. This inversion produces samples from the source posterior probability distribution, including a best-fitting solution for the source mechanism and associated probability. The inversion can be constrained to the double-couple space or allowed to explore the gamut of moment tensor solutions, allowing volumetric and other non-double-couple components. The posterior probability of the double-couple and full moment tensor source models can be evaluated from the Bayesian evidence, using samples from the likelihood distributions for the two source models, producing an estimate of whether or not a source is double-couple. Such an approach is ideally suited to microseismic studies where there are many sources of uncertainty and it is often difficult to produce reliable estimates of the source mechanism, although this can be true of many other cases. Using full-waveform synthetic seismograms, we also show the effects of noise, location, network distribution and velocity model uncertainty on the source probability density function. The noise has the largest effect on the results, especially as it can affect other parts of the event processing. This uncertainty can lead to erroneous non-double-couple source probability distributions, even when no other uncertainties exist. Although including amplitude ratios can improve the constraint on the source probability
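The evidence-based comparison of competing source models can be illustrated with the crudest possible estimator: for each model, approximate the evidence Z = E_prior[likelihood] by simple Monte Carlo over prior draws, then normalise. This is a pedagogical stand-in, not the sampling scheme used in the paper.

```python
import numpy as np

def model_probability(log_lik, prior_samplers, n=5000, seed=0):
    """Posterior model probabilities (equal model priors) from crude
    Monte Carlo evidence estimates: Z_k ≈ mean likelihood over prior draws."""
    rng = np.random.default_rng(seed)
    Z = []
    for sample in prior_samplers:
        draws = [sample(rng) for _ in range(n)]
        Z.append(np.mean([np.exp(log_lik(t)) for t in draws]))
    Z = np.array(Z)
    return Z / Z.sum()
```

In the toy test, a narrowly constrained model (analogous to restricting to the double-couple space) that still contains the data-preferred parameter wins the evidence comparison over a diffuse unconstrained model, which is the Occam effect the abstract exploits.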
Bayesian Separation of Non-Stationary Mixtures of Dependent Gaus
National Aeronautics and Space Administration — In this work, we propose a novel approach to perform Dependent Component Analysis (DCA). DCA can be thought of as the separation of latent, dependent sources from...
Convolutive Blind Source Separation Methods
DEFF Research Database (Denmark)
Pedersen, Michael Syskind; Larsen, Jan; Kjems, Ulrik;
2008-01-01
During the past decades, much attention has been given to the separation of mixed sources, in particular for the blind case where both the sources and the mixing process are unknown and only recordings of the mixtures are available. In several situations it is desirable to recover all sources from...
Le Cam, Steven; Caune, Vairis; Ranta, Radu; Korats, Gundars; Louis-Dorr, Valerie
2015-08-01
The brain source localization problem has been extensively studied in the past years, yielding a large panel of methodologies, each with its own strengths and weaknesses. Combining several of these approaches might help enhance their respective performance. Our study is carried out in the particular context of intracranial recordings, with the objective of explaining the measurements with a reduced number of dipolar activities. We take advantage of the sparse nature of Bayesian approaches to separate the noise from the source space, and to distinguish between several source contributions on the electrodes. This first step provides accurate estimates of the dipole projections, which can be used as input to an equivalent current dipole fitting procedure. We demonstrate on simulations that the localization results are significantly enhanced by this post-processing step when up to five dipoles are activated simultaneously. PMID:26736344
A localization model to localize multiple sources using Bayesian inference
Dunham, Joshua Rolv
Accurate localization of a sound source in a room setting is important in both psychoacoustics and architectural acoustics. Binaural models have been proposed to explain how the brain processes and utilizes the interaural time differences (ITDs) and interaural level differences (ILDs) of sound waves arriving at the ears of a listener in determining source location. Recent work shows that applying Bayesian methods to this problem is proving fruitful. In this thesis, pink noise samples are convolved with head-related transfer functions (HRTFs) and compared to combinations of one and two anechoic speech signals convolved with different HRTFs or binaural room impulse responses (BRIRs) to simulate room positions. Through exhaustive calculation of Bayesian posterior probabilities and using a maximum-likelihood approach, model selection will determine the number of sources present, and parameter estimation will yield the azimuthal direction of the source(s).
Source separation as an exercise in logical induction
Knuth, Kevin H.
2002-01-01
We examine the relationship between the Bayesian and information-theoretic formulations of source separation algorithms. This work makes use of the relationship between the work of Claude E. Shannon and the "Recent Contributions" by Warren Weaver (Shannon & Weaver 1949) as clarified by Richard T. Cox (1979) and expounded upon by Robert L. Fry (1996) as a duality between a logic of assertions and a logic of questions. Working with the logic of assertions requires the use of probability as a me...
Compressing Data by Source Separation
Schmidt, A.; Tréguier, E.; Schmidt, F.; Moussaoui, S.
2012-04-01
We interpret source separation of hyperspectral data as a form of lossy compression. In settings where datacubes can be interpreted as a linear combination of source spectra and their abundances, and the number of sources is small, we try to quantify the trade-offs and benefits of source separation and its implementation with non-negative matrix factorisation. Various methods to implement non-negative matrix factorisation have been used successfully for factoring hyperspectral images into physically meaningful sources which combine linearly to an approximation of the original image; this is useful for modelling the processes which make up the image. At the same time, the approximation opens up the potential for a significant reduction of the data by keeping only the sources and their corresponding abundances, instead of the original complete data cube. This presentation will try to explore the potential of the idea and also to establish limits of its use. Formally, the setting is as follows: we consider P pixels of a hyperspectral image which are acquired at L frequency bands and which are represented as a P×L data matrix X. Each row of this matrix represents a spectrum at a pixel with spatial index p = 1..P; this implies that the original topology may be disregarded. Since we work under the assumption of linear mixing, the p-th spectrum, 1 ≤ p ≤ P, can be expressed as a linear combination of R source spectra indexed by r, 1 ≤ r ≤ R. Thus X = AS + E, where E is an error matrix to be minimised, and X, A, and S have only non-negative entries. The rows of matrix S are the estimates of the R source spectra, and each entry of A expresses the contribution of the r-th component to the pixel with spatial index p. There are applications where we may interpret the rows of S as physical sources which can be combined using the columns of A to approximate the original data. If the source signals are few and strong (but not even necessarily meaningful), the data volume that has to
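A minimal sketch of the compression idea under the stated linear-mixing assumptions, using standard Lee-Seung multiplicative updates (one of many ways to implement the factorisation; the rank R, iteration count, and initialisation are illustrative choices, not the authors'):

```python
import numpy as np

def nmf_compress(X, R, n_iter=500, seed=0):
    """Factor a nonnegative P-by-L matrix X into abundances A (P-by-R)
    and source spectra S (R-by-L) so that X ≈ A @ S; storing only A and S
    in place of X gives the lossy compression discussed above."""
    rng = np.random.default_rng(seed)
    P, L = X.shape
    A = rng.uniform(0.1, 1.0, (P, R))
    S = rng.uniform(0.1, 1.0, (R, L))
    eps = 1e-12
    for _ in range(n_iter):
        S *= (A.T @ X) / (A.T @ A @ S + eps)   # multiplicative updates keep
        A *= (X @ S.T) / (A @ S @ S.T + eps)   # A and S nonnegative throughout
    ratio = (P * L) / (R * (P + L))            # stored values: before / after
    return A, S, ratio
```

For a rank-1 nonnegative datacube the factorisation is essentially exact, and even this tiny 4×3 example already yields a compression ratio of 12/7.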
A Bayesian analysis of regularised source inversions in gravitational lensing
Suyu, S H; Hobson, M P; Marshall, P J
2006-01-01
Strong gravitational lens systems with extended sources are of special interest because they provide additional constraints on the models of the lens systems. To use a gravitational lens system for measuring the Hubble constant, one would need to determine the lens potential and the source intensity distribution simultaneously. A linear inversion method to reconstruct a pixellated source distribution of a given lens potential model was introduced by Warren and Dye. In the inversion process, a regularisation on the source intensity is often needed to ensure a successful inversion with a faithful resulting source. In this paper, we use Bayesian analysis to determine the optimal regularisation constant (strength of regularisation) of a given form of regularisation and to objectively choose the optimal form of regularisation given a selection of regularisations. We consider and compare quantitatively three different forms of regularisation previously described in the literature for source inversions in gravitatio...
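For a linear-Gaussian toy version of such an inversion, the evidence is available in closed form, which makes the "optimal regularisation constant" selection easy to illustrate. The identity (zeroth-order) regularisation and unit noise variance below are assumptions of this sketch, not the paper's setup.

```python
import numpy as np

def log_evidence(d, B, lam, noise_var=1.0):
    """Log-evidence of a linear inversion d = B s + n with Gaussian noise
    and a Gaussian regularising prior s ~ N(0, I/lam); maximising this
    over lam picks the regularisation strength objectively."""
    N = B.shape[0]
    # Marginal covariance of d: noise_var*I + B B^T / lam
    Cd = noise_var * np.eye(N) + (B @ B.T) / lam
    sign, logdet = np.linalg.slogdet(Cd)
    return -0.5 * (N * np.log(2 * np.pi) + logdet
                   + d @ np.linalg.solve(Cd, d))

def best_lambda(d, B, grid):
    """Grid search for the evidence-maximising regularisation constant."""
    return grid[int(np.argmax([log_evidence(d, B, lam) for lam in grid]))]
```

When the data variance matches a prior variance of 1 (lam = 1), the evidence correctly prefers that regularisation strength over much weaker or much stronger alternatives.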
A Bayesian Approach to Detection of Small Low Emission Sources
Xun, Xiaolei; Carroll, Raymond J; Kuchment, Peter
2011-01-01
The article addresses the problem of detecting the presence and location of a small low emission source inside an object when the background noise dominates. This problem arises, for instance, in some homeland security applications. The goal is to reach signal-to-noise ratio (SNR) levels on the order of $10^{-3}$. A Bayesian approach to this problem is implemented in 2D. The method allows inference not only about the existence of the source, but also about its location. We derive Bayes factors for model selection and estimation of location based on Markov chain Monte Carlo (MCMC) simulation. A simulation study shows that with a sufficiently high total emission level, our method can effectively locate the source.
Dirichlet Methods for Bayesian Source Detection in Radio Astronomy Images
Friedlander, A. M.
2014-02-01
The sheer volume of data to be produced by the next generation of radio telescopes - exabytes of data on hundreds of millions of objects - makes automated methods for the detection of astronomical objects ("sources") essential. Of particular importance are low surface brightness objects, which are not well found by current automated methods. This thesis explores Bayesian methods for source detection that use Dirichlet or multinomial models for pixel intensity distributions in discretised radio astronomy images. A novel image discretisation method that incorporates uncertainty about how the image should be discretised is developed. Latent Dirichlet allocation - a method originally developed for inferring latent topics in document collections - is used to estimate source and background distributions in radio astronomy images. A new Dirichlet-multinomial ratio, indicating how well a region conforms to a well-specified model of background versus a loosely-specified model of foreground, is derived. Finally, latent Dirichlet allocation and the Dirichlet-multinomial ratio are combined for source detection in astronomical images. The methods developed in this thesis perform source detection well in comparison to two widely-used source detection packages and, importantly, find dim sources not well found by other algorithms.
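The Dirichlet-multinomial ratio idea can be sketched directly from the marginal likelihood of binned pixel counts: compare a sharply specified background model (a large, peaked Dirichlet concentration) against a loosely specified foreground model (a flat one). The concentration parameters below are invented for illustration.

```python
import numpy as np
from math import lgamma

def log_dm(counts, alpha):
    """Log Dirichlet-multinomial marginal likelihood of binned counts
    under a Dirichlet(alpha) prior on the bin probabilities."""
    counts = np.asarray(counts, dtype=float)
    alpha = np.asarray(alpha, dtype=float)
    n, A = counts.sum(), alpha.sum()
    return (lgamma(A) - lgamma(A + n)
            + sum(lgamma(a + c) - lgamma(a) for a, c in zip(alpha, counts)))

def dm_ratio(counts, bg_alpha, fg_alpha):
    """Log ratio of a vague 'foreground' model to a well-specified
    'background' model; positive values flag a region as source-like."""
    return log_dm(counts, fg_alpha) - log_dm(counts, bg_alpha)
```

A region whose intensities fall in bins the background model considers rare scores positive; a region matching the background's peaked bin scores negative.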
Nitrate source apportionment in a subtropical watershed using Bayesian model
Energy Technology Data Exchange (ETDEWEB)
Yang, Liping; Han, Jiangpei; Xue, Jianlong; Zeng, Lingzao [College of Environmental and Natural Resource Sciences, Zhejiang Provincial Key Laboratory of Subtropical Soil and Plant Nutrition, Zhejiang University, Hangzhou, 310058 (China); Shi, Jiachun, E-mail: jcshi@zju.edu.cn [College of Environmental and Natural Resource Sciences, Zhejiang Provincial Key Laboratory of Subtropical Soil and Plant Nutrition, Zhejiang University, Hangzhou, 310058 (China); Wu, Laosheng, E-mail: laowu@zju.edu.cn [College of Environmental and Natural Resource Sciences, Zhejiang Provincial Key Laboratory of Subtropical Soil and Plant Nutrition, Zhejiang University, Hangzhou, 310058 (China); Jiang, Yonghai [State Key Laboratory of Environmental Criteria and Risk Assessment, Chinese Research Academy of Environmental Sciences, Beijing, 100012 (China)
2013-10-01
Nitrate (NO3−) pollution in aquatic systems is a worldwide problem. The temporal distribution pattern and sources of nitrate are of great concern for water quality. The nitrogen (N) cycling processes in a subtropical watershed located in Changxing County, Zhejiang Province, China were greatly influenced by the temporal variations of precipitation and temperature during the study period (September 2011 to July 2012). The highest NO3− concentration in water was in May (wet season, mean ± SD = 17.45 ± 9.50 mg L−1) and the lowest concentration occurred in December (dry season, mean ± SD = 10.54 ± 6.28 mg L−1). Nevertheless, no water sample in the study area exceeded the WHO drinking water limit of 50 mg L−1 NO3−. Four sources of NO3− (atmospheric deposition, AD; soil N, SN; synthetic fertilizer, SF; manure and sewage, M and S) were identified using both hydrochemical characteristics [Cl−, NO3−, HCO3−, SO42−, Ca2+, K+, Mg2+, Na+, dissolved oxygen (DO)] and a dual isotope approach (δ15N–NO3− and δ18O–NO3−). Both chemical and isotopic characteristics indicated that denitrification was not the main N cycling process in the study area. Using a Bayesian model (stable isotope analysis in R, SIAR), the contribution of each source was apportioned. Source apportionment results showed that source contributions differed significantly between the dry and wet seasons: AD and M and S contributed more in December than in May, whereas SN and SF contributed more NO3− to the water in May than in December. M and S and SF were the major contributors in December and May, respectively. Moreover, the shortcomings and uncertainties of SIAR were discussed to provide implications for future work. With the assessment of temporal variation and sources of NO3−, better
Nitrate source apportionment in a subtropical watershed using Bayesian model
International Nuclear Information System (INIS)
Nitrate (NO3−) pollution in aquatic systems is a worldwide problem. The temporal distribution pattern and sources of nitrate are of great concern for water quality. The nitrogen (N) cycling processes in a subtropical watershed located in Changxing County, Zhejiang Province, China were greatly influenced by the temporal variations of precipitation and temperature during the study period (September 2011 to July 2012). The highest NO3− concentration in water was in May (wet season, mean ± SD = 17.45 ± 9.50 mg L−1) and the lowest concentration occurred in December (dry season, mean ± SD = 10.54 ± 6.28 mg L−1). Nevertheless, no water sample in the study area exceeded the WHO drinking water limit of 50 mg L−1 NO3−. Four sources of NO3− (atmospheric deposition, AD; soil N, SN; synthetic fertilizer, SF; manure and sewage, M and S) were identified using both hydrochemical characteristics [Cl−, NO3−, HCO3−, SO42−, Ca2+, K+, Mg2+, Na+, dissolved oxygen (DO)] and a dual isotope approach (δ15N–NO3− and δ18O–NO3−). Both chemical and isotopic characteristics indicated that denitrification was not the main N cycling process in the study area. Using a Bayesian model (stable isotope analysis in R, SIAR), the contribution of each source was apportioned. Source apportionment results showed that source contributions differed significantly between the dry and wet seasons: AD and M and S contributed more in December than in May, whereas SN and SF contributed more NO3− to the water in May than in December. M and S and SF were the major contributors in December and May, respectively. Moreover, the shortcomings and uncertainties of SIAR were discussed to provide implications for future work. With the assessment of temporal variation and sources of NO3−, better agricultural management practices and sewage disposal programs can be implemented to sustain water quality in subtropical watersheds. - Highlights: • Nitrate concentration in water displayed
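The general shape of an isotope mixing model like SIAR can be illustrated with a crude accept/reject (ABC-style) stand-in: draw source proportions from a flat Dirichlet prior and keep draws whose predicted dual-isotope signature lies near the observation. The source signatures and tolerance below are invented, and the sketch ignores the fractionation and variance terms a real SIAR run includes.

```python
import numpy as np

def source_proportions(obs, src_means, n=100000, tol=0.5, seed=0):
    """Posterior-mean source proportions for an observed [d15N, d18O]
    signature, given one mean signature per source (rows of src_means),
    via accept/reject under a flat Dirichlet prior."""
    rng = np.random.default_rng(seed)
    props = rng.dirichlet(np.ones(len(src_means)), size=n)   # prior draws
    pred = props @ np.asarray(src_means)                     # mixed signatures
    keep = np.linalg.norm(pred - np.asarray(obs), axis=1) < tol
    return props[keep].mean(axis=0)
```

With two sources whose signatures bracket the observation symmetrically, the apportionment comes out near 50/50, as it should.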
Bayesian Source Attribution of Salmonellosis in South Australia.
Glass, K; Fearnley, E; Hocking, H; Raupach, J; Veitch, M; Ford, L; Kirk, M D
2016-03-01
Salmonellosis is a significant cause of foodborne gastroenteritis in Australia, and rates of illness have increased over recent years. We adopt a Bayesian source attribution model to estimate the contribution of different animal reservoirs to illness due to Salmonella spp. in South Australia between 2000 and 2010, together with 95% credible intervals (CrI). We excluded known travel associated cases and those of rare subtypes (fewer than 20 human cases or fewer than 10 isolates from included sources over the 11-year period), and the remaining 76% of cases were classified as sporadic or outbreak associated. Source-related parameters were included to allow for different handling and consumption practices. We attributed 35% (95% CrI: 20-49) of sporadic cases to chicken meat and 37% (95% CrI: 23-53) of sporadic cases to eggs. Of outbreak-related cases, 33% (95% CrI: 20-62) were attributed to chicken meat and 59% (95% CrI: 29-75) to eggs. A comparison of alternative model assumptions indicated that biases due to possible clustering of samples from sources had relatively minor effects on these estimates. Analysis of source-related parameters showed higher risk of illness from contaminated eggs than from contaminated chicken meat, suggesting that consumption and handling practices potentially play a bigger role in illness due to eggs, considering low Salmonella prevalence on eggs. Our results strengthen the evidence that eggs and chicken meat are important vehicles for salmonellosis in South Australia. PMID:26133008
Fast Bayesian optimal experimental design for seismic source inversion
Long, Quan
2015-07-01
We develop a fast method for optimally designing experiments in the context of statistical seismic source inversion. In particular, we efficiently compute the optimal number and locations of the receivers or seismographs. The seismic source is modeled by a point moment tensor multiplied by a time-dependent function. The parameters include the source location, moment tensor components, and start time and frequency in the time function. The forward problem is modeled by elastodynamic wave equations. We show that the Hessian of the cost functional, which is usually defined as the square of the weighted L
Blind source separation dependent component analysis
Xiang, Yong; Yang, Zuyuan
2015-01-01
This book provides readers with a complete and self-contained body of knowledge about dependent source separation, including the latest developments in this field. The book gives an overview of blind source separation, in which three promising blind separation techniques that can tackle mutually correlated sources are presented. The book further focuses on non-negativity-based methods, time-frequency-analysis-based methods, and pre-coding-based methods.
Improved Bayesian Infrasonic Source Localization for regional infrasound
Blom, Philip S.; Marcillo, Omar; Arrowsmith, Stephen J.
2015-12-01
The mathematical framework used in the Bayesian Infrasonic Source Localization (BISL) methodology is examined and simplified providing a generalized method of estimating the source location and time for an infrasonic event. The likelihood function describing an infrasonic detection used in BISL has been redefined to include the von Mises distribution developed in directional statistics and propagation-based, physically derived celerity-range and azimuth deviation models. Frameworks for constructing propagation-based celerity-range and azimuth deviation statistics are presented to demonstrate how stochastic propagation modelling methods can be used to improve the precision and accuracy of the posterior probability density function describing the source localization. Infrasonic signals recorded at a number of arrays in the western United States produced by rocket motor detonations at the Utah Test and Training Range are used to demonstrate the application of the new mathematical framework and to quantify the improvement obtained by using the stochastic propagation modelling methods. Using propagation-based priors, the spatial and temporal confidence bounds of the source decreased by more than 40 per cent in all cases and by as much as 80 per cent in one case. Further, the accuracy of the estimates remained high, keeping the ground truth within the 99 per cent confidence bounds for all cases.
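The redefined likelihood can be sketched as a grid evaluation that combines a von Mises term for back-azimuth with a Gaussian celerity term. The array geometry, concentration parameter, and celerity statistics below are invented for illustration, and the origin time is fixed at zero rather than handled as in the full BISL framework:

```python
import numpy as np

def vonmises_logpdf(theta, mu, kappa):
    # log of the von Mises density; np.i0 is the modified Bessel function I0
    return kappa * np.cos(theta - mu) - np.log(2 * np.pi * np.i0(kappa))

# Hypothetical geometry (km): four infrasound arrays, true source at (2, 3)
arrays = np.array([[-20.0, 0.0], [20.0, 5.0], [0.0, -25.0], [5.0, 30.0]])
src = np.array([2.0, 3.0])
celerity_mean, celerity_sd, kappa = 0.30, 0.02, 200.0   # km/s; assumed stats

d = np.linalg.norm(arrays - src, axis=1)
az_obs = np.arctan2(src[1] - arrays[:, 1], src[0] - arrays[:, 0]) + 0.01
t_obs = d / celerity_mean          # origin time taken as zero for simplicity

# Posterior (flat prior) evaluated on a spatial grid
xs = np.arange(-10, 10.25, 0.25)
X, Y = np.meshgrid(xs, xs)
logp = np.zeros_like(X)
for (ax, ay), az, t in zip(arrays, az_obs, t_obs):
    dx, dy = X - ax, Y - ay
    r = np.hypot(dx, dy)
    pred_az = np.arctan2(dy, dx)
    pred_cel = r / t
    logp += vonmises_logpdf(az, pred_az, kappa)
    logp += -0.5 * ((pred_cel - celerity_mean) / celerity_sd) ** 2
i, j = np.unravel_index(np.argmax(logp), logp.shape)
print("MAP source location:", X[i, j], Y[i, j])
```

In the paper the celerity-range and azimuth-deviation statistics come from stochastic propagation modelling rather than the fixed Gaussian and von Mises parameters assumed here.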
Albert, Carlo; Ulzega, Simone; Stoop, Ruedi
2016-04-01
Parameter inference is a fundamental problem in data-driven modeling. Given observed data that is believed to be a realization of some parameterized model, the aim is to find parameter values that are able to explain the observed data. In many situations, the dominant sources of uncertainty must be included into the model for making reliable predictions. This naturally leads to stochastic models. Stochastic models render parameter inference much harder, as the aim then is to find a distribution of likely parameter values. In Bayesian statistics, which is a consistent framework for data-driven learning, this so-called posterior distribution can be used to make probabilistic predictions. We propose a novel, exact, and very efficient approach for generating posterior parameter distributions for stochastic differential equation models calibrated to measured time series. The algorithm is inspired by reinterpreting the posterior distribution as a statistical mechanics partition function of an object akin to a polymer, where the measurements are mapped on heavier beads compared to those of the simulated data. To arrive at distribution samples, we employ a Hamiltonian Monte Carlo approach combined with a multiple time-scale integration. A separation of time scales naturally arises if either the number of measurement points or the number of simulation points becomes large. Furthermore, at least for one-dimensional problems, we can decouple the harmonic modes between measurement points and solve the fastest part of their dynamics analytically. Our approach is applicable to a wide range of inference problems and is highly parallelizable.
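The Hamiltonian Monte Carlo machinery at the core of the approach can be illustrated on a toy target. The sketch below is a plain single-scale HMC sampler for a one-dimensional standard normal, without the multiple-time-scale integration or the polymer reinterpretation the paper develops:

```python
import numpy as np

def hmc(log_prob, grad_log_prob, q0, n_samples, eps=0.2, n_steps=20, seed=0):
    """Minimal 1-D Hamiltonian Monte Carlo sampler (sketch)."""
    rng = np.random.default_rng(seed)
    q = q0
    out = np.empty(n_samples)
    for k in range(n_samples):
        p = rng.normal()                    # resample momentum
        qn, pn = q, p
        # leapfrog integration of the Hamiltonian dynamics
        pn += 0.5 * eps * grad_log_prob(qn)
        for step in range(n_steps):
            qn += eps * pn
            if step < n_steps - 1:
                pn += eps * grad_log_prob(qn)
        pn += 0.5 * eps * grad_log_prob(qn)
        # Metropolis accept/reject on the total energy
        dh = (-log_prob(q) + 0.5 * p**2) - (-log_prob(qn) + 0.5 * pn**2)
        if np.log(rng.random()) < dh:
            q = qn
        out[k] = q
    return out

samples = hmc(lambda q: -0.5 * q**2, lambda q: -q, 0.0, 5000)
print(np.mean(samples), np.std(samples))
```

The paper's contribution is precisely what this sketch lacks: exploiting the separation of time scales between measurement and simulation points to integrate the fast modes cheaply.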
Zhang, Le; Karakci, Ata; Korotkov, Andrei; Sutter, P M; Timbie, Peter T; Tucker, Gregory S; Wandelt, Benjamin D
2016-01-01
We present in this paper a new Bayesian semi-blind approach for foreground removal in observations of the 21-cm signal with interferometers. The technique, which we call HIEMICA (HI Expectation-Maximization Independent Component Analysis), is an extension of the Independent Component Analysis (ICA) technique developed for two-dimensional (2D) CMB maps to three-dimensional (3D) 21-cm cosmological signals measured by interferometers. This technique provides a fully Bayesian inference of power spectra and maps and separates the foregrounds from the signal based on the diversity of their power spectra. Relying only on the statistical independence of the components, this approach can jointly estimate the 3D power spectrum of the 21-cm signal as well as the 2D angular power spectrum and the frequency dependence of each foreground component, without any prior assumptions about the foregrounds. This approach has been tested extensively by applying it to mock data from interferometric 21-cm intensity mapping observations. Based on ...
Audio Source Separation Using a Deep Autoencoder
Jang, Giljin; Kim, Han-Gyu; Oh, Yung-Hwan
2014-01-01
This paper proposes a novel framework for unsupervised audio source separation using a deep autoencoder. The characteristics of the unknown source signals in the mixed input are automatically captured by properly configured autoencoders implemented as a network with many layers, and the sources are separated by clustering the coefficient vectors in the code layer. By investigating the weight vectors to the final target, the representation layer, the primitive components of the audio signals in the frequency domain are o...
Blind Source Separation Using Hessian Evaluation
Jyothirmayi M; Elavaar Kuzhali S; Sethu Selvi S
2012-01-01
This paper focuses on blind image separation using sparse representation for natural images. The statistics of natural images are based on one particular statistical property called sparseness, which is closely related to the super-Gaussian distribution. Since natural images can have both Gaussian and non-Gaussian distributions, the original infomax algorithm cannot be directly used for source separation, as it is better suited to estimating super-Gaussian sources. Hence, we explore the...
Transform domain steganography with blind source separation
Jouny, Ismail
2015-05-01
This paper applies blind source separation or independent component analysis to images that may contain mixtures of text, audio, or other images for steganography purposes. The paper focuses on separating mixtures in a transform domain such as the Fourier domain or the wavelet domain. The study addresses the effectiveness of steganography when using linear mixtures of multimedia components and the ability of standard blind source separation techniques to discern hidden multimedia messages. Mixing in the space, frequency, and wavelet (scale) domains is compared. Effectiveness is measured using the mean square error between original and recovered images.
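A standard BSS baseline of the kind evaluated here can be sketched with a numpy-only FastICA (symmetric orthogonalization, tanh contrast). The two "hidden" components and the mixing matrix below are illustrative stand-ins for embedded multimedia signals:

```python
import numpy as np

def fastica(X, n_iter=200, seed=0):
    """Symmetric FastICA with a tanh nonlinearity (numpy-only sketch).
    X: (n_sources, n_samples) array of linearly mixed signals."""
    rng = np.random.default_rng(seed)
    X = X - X.mean(axis=1, keepdims=True)
    d, E = np.linalg.eigh(X @ X.T / X.shape[1])
    Xw = (E / np.sqrt(d)).T @ X               # whitened data
    W = rng.normal(size=(X.shape[0], X.shape[0]))
    for _ in range(n_iter):
        G = np.tanh(W @ Xw)
        W = G @ Xw.T / Xw.shape[1] - np.diag((1 - G**2).mean(axis=1)) @ W
        U, _, Vt = np.linalg.svd(W)
        W = U @ Vt                            # symmetric decorrelation
    return W @ Xw

# Two independent "hidden" components mixed linearly (illustrative)
rng = np.random.default_rng(1)
t = np.linspace(0, 8, 2000)
S = np.vstack([np.sin(2 * np.pi * t), rng.uniform(-1, 1, t.size)])
A = np.array([[1.0, 0.6], [0.4, 1.0]])        # hypothetical mixing matrix
Y = fastica(A @ S)
```

Up to the usual permutation and sign ambiguities of ICA, the rows of `Y` should closely match the original components when the mixture really is linear, which is the condition the paper probes.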
Removal of micropollutants in source separated sanitation
Butkovskyi, A.
2015-01-01
Source separated sanitation is an innovative sanitation method designed for minimizing use of energy and clean drinking water, and maximizing reuse of water, organics and nutrients from waste water. This approach is based on separate collection and treatment of toilet wastewater (black water) and the rest of the domestic wastewater (grey water). Different characteristics of wastewater streams facilitate recovery of energy, nutrients and fresh water. To ensure agricultural or ecological reuse ...
Rigid Structure from Motion from a Blind Source Separation Perspective
Fortuna, Jeff
2013-01-01
We present an information theoretic approach to define the problem of structure from motion (SfM) as a blind source separation one. Given that for almost all practical joint densities of shape points, the marginal densities are non-Gaussian, we show how higher-order statistics can be used to provide improvements in shape estimates over the methods of factorization via Singular Value Decomposition (SVD), bundle adjustment and Bayesian approaches. Previous techniques have either explicitly or implicitly used only second-order statistics in models of shape or noise. A further advantage of viewing SfM as a blind source problem is that it easily allows for the inclusion of noise and shape models, resulting in Maximum Likelihood (ML) or Maximum a Posteriori (MAP) shape and motion estimates. A key result is that the blind source separation approach has the ability to recover the motion and shape matrices without the need to explicitly know the motion or shape pdf. We demonstrate that it suffices to know whether the pdf is sub- or super-Gaussian (i.e., semi-parametric estimation) and derive a simple formulation to determine this from the data. We provide extensive experimental results on synthetic and real tracked points in order to quantify the improvement obtained from this technique. PMID:23682206
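The sub- versus super-Gaussian determination mentioned above reduces to the sign of the excess kurtosis, which is straightforward to estimate from data. The distributions below are illustrative stand-ins for shape-point densities:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
uniform = rng.uniform(-1, 1, n)   # sub-Gaussian (excess kurtosis -1.2)
laplace = rng.laplace(0, 1, n)    # super-Gaussian (excess kurtosis +3)

def excess_kurtosis(x):
    # fourth standardized moment minus 3; negative = sub-, positive = super-Gaussian
    x = x - x.mean()
    return np.mean(x**4) / np.mean(x**2) ** 2 - 3.0

print(excess_kurtosis(uniform), excess_kurtosis(laplace))
```

In semi-parametric ICA this sign is all that is needed to pick the right contrast nonlinearity, which is the simplification the paper exploits.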
Blind source separation theory and applications
Yu, Xianchuan; Xu, Jindong
2013-01-01
A systematic exploration of both classic and contemporary algorithms in blind source separation with practical case studies The book presents an overview of Blind Source Separation, a relatively new signal processing method. Due to the multidisciplinary nature of the subject, the book has been written so as to appeal to an audience from very different backgrounds. Basic mathematical skills (e.g. on matrix algebra and foundations of probability theory) are essential in order to understand the algorithms, although the book is written in an introductory, accessible style. This book offers
Predicting cytotoxicity from heterogeneous data sources with Bayesian learning
Directory of Open Access Journals (Sweden)
Langdon Sarah R
2010-12-01
Full Text Available Abstract Background We collected data from over 80 different cytotoxicity assays from Pfizer in-house work as well as from public sources and investigated the feasibility of using these datasets, which come from a variety of assay formats (having, for instance, different measured endpoints, incubation times and cell types), to derive a general cytotoxicity model. Our main aim was to derive a computational model based on this data that can highlight potentially cytotoxic series early in the drug discovery process. Results We developed Bayesian models for each assay using Scitegic FCFP_6 fingerprints together with the default physical property descriptors. Pairs of assays that are mutually predictive were identified by calculating the ROC score of the model derived from one predicting the experimental outcome of the other, and vice versa. The prediction pairs were visualised in a network where nodes are assays and edges are drawn for ROC scores >0.60 in both directions. We observed that, if assay pairs (A, B) and (B, C) were mutually predictive, this was often not the case for the pair (A, C). The results from 48 assays connected to each other were merged in one training set of 145590 compounds and a general cytotoxicity model was derived. The model has been cross-validated as well as validated with a set of 89 FDA approved drug compounds. Conclusions We have generated a predictive model for general cytotoxicity which could speed up the drug discovery process in multiple ways. Firstly, this analysis has shown that the outcomes of different assay formats can be mutually predictive, thus removing the need to submit a potentially toxic compound to multiple assays. Furthermore, this analysis enables selection of (a) the easiest-to-run assay as corporate standard, or (b) the most descriptive panel of assays by including assays whose outcomes are not mutually predictive. The model is no replacement for a cytotoxicity assay but opens the opportunity to be
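The modelling pipeline (a Bayesian classifier over binary fingerprints, evaluated by ROC score) can be sketched as follows. The Bernoulli naive Bayes and the synthetic "fingerprints" below are a simplified stand-in for the Scitegic FCFP_6 setup described in the abstract, with the label depending noisily on a few hypothetical "structural alert" bits:

```python
import numpy as np

rng = np.random.default_rng(0)
n, n_bits = 2000, 64
X = rng.integers(0, 2, size=(n, n_bits))
# toxicity label depends (noisily) on three "structural alert" bits
y = ((X[:, 0] + X[:, 1] + X[:, 2] >= 2) ^ (rng.random(n) < 0.1)).astype(int)
train, test = slice(0, 1500), slice(1500, None)

def fit_predict(Xtr, ytr, Xte):
    """Bernoulli naive Bayes with Laplace smoothing; returns log-odds scores."""
    log_prior = np.log(np.bincount(ytr) / len(ytr))
    scores = np.zeros(len(Xte))
    for c, sign in ((1, 1.0), (0, -1.0)):
        p = (Xtr[ytr == c].sum(axis=0) + 1) / ((ytr == c).sum() + 2)
        scores += sign * (Xte @ np.log(p) + (1 - Xte) @ np.log(1 - p))
    return scores + log_prior[1] - log_prior[0]

scores = fit_predict(X[train], y[train], X[test])

def roc_auc(y_true, s):
    """ROC AUC via the rank-sum (Mann-Whitney) identity."""
    order = np.argsort(s)
    ranks = np.empty(len(s))
    ranks[order] = np.arange(1, len(s) + 1)
    n_pos = y_true.sum()
    n_neg = len(y_true) - n_pos
    return (ranks[y_true == 1].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

print("AUC:", roc_auc(y[test], scores))
```

Computing such AUCs in both directions for each assay pair is what builds the mutual-predictivity network the abstract describes.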
Blind Source Separation: the Sparsity Revolution
Bobin, J.; Starck, Jean-Luc; Moudden, Y.; Fadili, Jalal M.
2008-01-01
Over the last few years, the development of multi-channel sensors motivated interest in methods for the coherent processing of multivariate data. Some specific issues have already been addressed as testified by the wide literature on the so-called blind source separation (BSS) problem. In this context, as clearly emphasized by previous work, it is fundamental that the sources to be retrieved present some quantitatively measurable diversity. Recently, sparsity and morphological diversity have ...
Grading learning for blind source separation
Institute of Scientific and Technical Information of China (English)
张贤达; 朱孝龙; 保铮
2003-01-01
By generalizing the learning rate parameter to a learning rate matrix, this paper proposes a grading learning algorithm for blind source separation. The whole learning process is divided into three stages: initial stage, capturing stage and tracking stage. In different stages, different learning rates are used for each output component, which is determined by its dependency on other output components. It is shown that the grading learning algorithm is equivariant and can keep the separating matrix from becoming singular. Simulations show that the proposed algorithm can achieve faster convergence, better steady-state performance and higher numerical robustness, as compared with the existing algorithms using fixed, time-descending and adaptive learning rates.
Blind source separation problem in GPS time series
Gualandi, A.; Serpelloni, E.; Belardinelli, M. E.
2016-04-01
A critical point in the analysis of ground displacement time series, such as those recorded by space geodetic techniques, is the development of data-driven methods that allow the different sources of deformation to be discerned and characterized in the space and time domains. Multivariate statistics includes several approaches that can be considered part of the data-driven methods. A widely used technique is principal component analysis (PCA), which allows us to reduce the dimensionality of the data space while keeping most of the variance of the dataset explained. However, PCA does not perform well in finding the solution to the so-called blind source separation (BSS) problem, i.e., in recovering and separating the original sources that generate the observed data. This is mainly due to the fact that PCA minimizes the misfit calculated using an L2 norm (χ²), looking for a new Euclidean space where the projected data are uncorrelated. Independent component analysis (ICA) is a popular technique adopted to approach the BSS problem. However, the independence condition is not easy to impose, and it is often necessary to introduce some approximations. To work around this problem, we test the use of a modified variational Bayesian ICA (vbICA) method to recover the multiple sources of ground deformation even in the presence of missing data. The vbICA method models the probability density function (pdf) of each source signal using a mix of Gaussian distributions, allowing for more flexibility in the description of the pdf of the sources with respect to standard ICA, and giving a more reliable estimate of them. Here we present its application to synthetic global positioning system (GPS) position time series, generated by simulating deformation near an active fault, including inter-seismic, co-seismic, and post-seismic signals, plus seasonal signals and noise, and an additional time-dependent volcanic source. We evaluate the ability of the PCA and ICA decomposition
Validi, AbdoulAhad
2013-01-01
This study introduces a non-intrusive approach in the context of low-rank separated representation to construct a surrogate of high-dimensional stochastic functions, e.g., PDEs/ODEs, in order to decrease the computational cost of Markov Chain Monte Carlo simulations in Bayesian inference. The surrogate model is constructed via a regularized alternating least-squares regression with Tikhonov regularization, using a roughening matrix computing the gradient of the solution, in conjunction with a perturbation-based error indicator to detect optimal model complexities. The model approximates a vector of a continuous solution at discrete values of a physical variable. The required number of random realizations to achieve a successful approximation linearly depends on the function dimensionality. The computational cost of the model construction is quadratic in the number of random inputs, which potentially tackles the curse of dimensionality in high-dimensional stochastic functions. Furthermore, this vector valued sep...
Blind Source Separation for Speaker Recognition Systems
Unverdorben, Michael; Rothbucher, Martin; Diepold, Klaus
2014-01-01
In this thesis, a combined blind source separation (BSS) and speaker recognition approach for teleconferences is studied. Using a microphone array consisting of eight microphones, different methods to perform overdetermined independent vector analysis (IVA) are compared. One method is to select a subset of microphones, or all available microphones, to perform IVA. The second method, the so-called subspace method, utilizes a principal component analysis (PCA) for dimensionality reductio...
Directory of Open Access Journals (Sweden)
Zhujie Chu
2016-02-01
Full Text Available Municipal household solid waste (MHSW) has become a serious problem in China over the course of the last two decades, resulting in significant side effects on the environment. Therefore, effective management of MHSW has attracted wide attention from both researchers and practitioners. Separate collection, the first and crucial step in solving the MHSW problem, has however not been thoroughly studied to date. An empirical survey was conducted among 387 households in Harbin, China in this study. We use a Bayesian Belief Networks model to determine the influencing factors on separate collection. Four types of factors are identified: political, economic, social-cultural and technological, based on the PEST (political, economic, social and technological) analytical method. In addition, we further analyze the influential power of different factors, based on the network structure and probability changes obtained by the Netica software. Results indicate that the technological dimension has the greatest impact on MHSW separate collection, followed by the political dimension and the economic dimension; the social-cultural dimension impacts MHSW the least.
International Nuclear Information System (INIS)
This study introduces a non-intrusive approach in the context of low-rank separated representation to construct a surrogate of high-dimensional stochastic functions, e.g., PDEs/ODEs, in order to decrease the computational cost of Markov Chain Monte Carlo simulations in Bayesian inference. The surrogate model is constructed via a regularized alternating least-squares regression with Tikhonov regularization, using a roughening matrix computing the gradient of the solution, in conjunction with a perturbation-based error indicator to detect optimal model complexities. The model approximates a vector of a continuous solution at discrete values of a physical variable. The required number of random realizations to achieve a successful approximation linearly depends on the function dimensionality. The computational cost of the model construction is quadratic in the number of random inputs, which potentially tackles the curse of dimensionality in high-dimensional stochastic functions. Furthermore, this vector-valued separated representation-based model, in comparison to the available scalar-valued case, leads to a significant reduction in the cost of approximation by an order of magnitude equal to the vector size. The performance of the method is studied through its application to three numerical examples including a 41-dimensional elliptic PDE and a 21-dimensional cavity flow
International Nuclear Information System (INIS)
We present, in this paper, a new unsupervised method for joint image super-resolution and separation between smooth and point sources. For this purpose, we propose a Bayesian approach with a Markovian model for the smooth part and Student’s t-distribution for point sources. All model and noise parameters are considered unknown and should be estimated jointly with the images. However, joint estimators (joint MAP or posterior mean) are intractable and an approximation is needed. Therefore, a new gradient-like variational Bayesian method is applied to approximate the true posterior by a free-form separable distribution. A parametric form is obtained by approximating the marginals, but with form parameters that are mutually dependent. Their optimal values are achieved by iterating them until convergence. The method was tested on model-generated data and a real dataset from the Herschel space observatory. (paper)
Blind Source Separation For Ion Mobility Spectra
International Nuclear Information System (INIS)
Miniaturization is a powerful trend for smart chemical instrumentation in a diversity of applications. It is known that miniaturization in IMS leads to a degradation of the system characteristics. For the present work, we are interested in signal processing solutions to mitigate the limitations introduced by a limited drift tube length, which basically involve a loss of chemical selectivity. While blind source separation (BSS) techniques are popular in other domains, their application to smart chemical instrumentation is limited. However, under some conditions, basically linearity, BSS may fully recover the concentration time evolution and the pure spectra with few underlying hypotheses. This is extremely helpful in conditions where non-expected chemical interferents may appear, or unwanted perturbations may pollute the spectra. SIMPLISMA has been advocated by Harrington et al. in several papers. However, more modern BSS methods for bilinear decomposition with a positivity restriction have appeared in the last decade. In order to explore and compare the performance of these methods, a series of experiments was performed.
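The positivity-constrained bilinear decompositions mentioned above are typified by non-negative matrix factorization. A minimal Lee-Seung multiplicative-update sketch on synthetic spectra (illustrative, not real IMS data) looks like this:

```python
import numpy as np

def nmf(V, rank, n_iter=2000, seed=0):
    """Lee-Seung multiplicative updates for V ≈ W H with W, H ≥ 0."""
    rng = np.random.default_rng(seed)
    n, m = V.shape
    W = rng.uniform(0.1, 1.0, (n, rank))
    H = rng.uniform(0.1, 1.0, (rank, m))
    for _ in range(n_iter):
        H *= (W.T @ V) / (W.T @ W @ H + 1e-12)
        W *= (V @ H.T) / (W @ H @ H.T + 1e-12)
    return W, H

# Synthetic data: 3 pure spectra mixed with positive concentration profiles
rng = np.random.default_rng(1)
spectra = rng.uniform(0, 1, (3, 40))    # hypothetical pure spectra
conc = rng.uniform(0, 1, (60, 3))       # hypothetical concentration profiles
V = conc @ spectra
W, H = nmf(V, rank=3)
err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
print("relative reconstruction error:", err)
```

Here the rows of `H` play the role of the pure spectra and the columns of `W` the concentration time evolutions; the multiplicative form guarantees both stay non-negative throughout.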
Bayesian Blind Separation and Deconvolution of Dynamic Image Sequences Using Sparsity Priors
Czech Academy of Sciences Publication Activity Database
Tichý, Ondřej; Šmídl, Václav
2015-01-01
Roč. 34, č. 1 (2015), s. 258-266. ISSN 0278-0062 R&D Projects: GA ČR GA13-29225S Keywords : Functional imaging * Blind source separation * Computer-aided detection and diagnosis * Probabilistic and statistical methods Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 3.390, year: 2014 http://library.utia.cas.cz/separaty/2014/AS/tichy-0431090.pdf
Characterizing the Aperiodic Variability of 3XMM Sources using Bayesian Blocks
Salvetti, D.; De Luca, A.; Belfiore, A.; Marelli, M.
2016-06-01
I will present the Bayesian blocks algorithm and its application to XMM sources, the statistical properties of the entire 3XMM sample, and a few interesting cases. While XMM-Newton is the instrument best suited for characterizing X-ray source variability, its most recent catalogue (3XMM) reports light curves only for the brightest sources and excludes periods of background flares from its analysis. One aim of the EXTraS ("Exploring the X-ray Transient and variable Sky") project is the characterization of the aperiodic variability of as many 3XMM sources as possible on time scales shorter than the XMM observation. We adapted the original Bayesian blocks algorithm to account for background contamination, including soft proton flares. In addition, we characterized the short-term aperiodic variability by performing a number of statistical tests on all the Bayesian blocks light curves. The EXTraS catalogue and products will be released to the community in 2017, together with tools that will allow the user to replicate EXTraS results and extend them through the next decade.
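For reference, the original (background-free) Bayesian blocks algorithm for event data can be written compactly as a dynamic program, following Scargle's formulation; the EXTraS background-aware adaptation is not reproduced here:

```python
import numpy as np

def bayesian_blocks(t, p0=0.05):
    """Scargle-style Bayesian blocks for event (point) data; returns block edges."""
    t = np.sort(np.asarray(t, dtype=float))
    n = len(t)
    edges = np.concatenate([[t[0]], 0.5 * (t[1:] + t[:-1]), [t[-1]]])
    block_length = t[-1] - edges
    # empirical prior penalty on the number of change points (Scargle 2013)
    ncp_prior = 4.0 - np.log(73.53 * p0 * n ** -0.478)
    best = np.zeros(n)
    last = np.zeros(n, dtype=int)
    for k in range(n):
        width = block_length[: k + 1] - block_length[k + 1]
        count = np.arange(k + 1, 0, -1)           # events in block i..k
        fitness = count * (np.log(count) - np.log(width)) - ncp_prior
        fitness[1:] += best[:k]
        last[k] = np.argmax(fitness)
        best[k] = fitness[last[k]]
    # backtrack the optimal segmentation
    change_points, ind = [], n
    while ind > 0:
        change_points.append(ind)
        ind = last[ind - 1]
    change_points.append(0)
    return edges[np.array(change_points[::-1])]

# Toy light curve: a ten-fold rate increase at t = 1
rng = np.random.default_rng(0)
events = np.concatenate([rng.uniform(0.0, 1.0, 50), rng.uniform(1.0, 2.0, 500)])
edges = bayesian_blocks(events)
print("block edges:", edges)
```

Each returned edge marks a statistically significant rate change, which is what makes the representation attractive for aperiodic X-ray variability.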
Kopka, P.; Wawrzynczak, A.; Borysiewicz, M.
2015-09-01
In many areas of application, a central problem is the solution of an inverse problem, especially the estimation of unknown model parameters to model the underlying dynamics of a physical system precisely. In this situation, Bayesian inference is a powerful tool for combining observed data with prior knowledge to obtain the probability distribution of the searched parameters. We have applied the modern methodology named Sequential Approximate Bayesian Computation (S-ABC) to the problem of tracing an atmospheric contaminant source. ABC is a technique commonly used in the Bayesian analysis of complex models and dynamic systems. Sequential methods can significantly increase the efficiency of the ABC. In the presented algorithm, the input data are the on-line arriving concentrations of the released substance registered by a distributed sensor network from the OVER-LAND ATMOSPHERIC DISPERSION (OLAD) experiment. The algorithm outputs are the probability distributions of the contamination source parameters, i.e., its location, release rate, speed and direction of movement, start time and duration. The stochastic approach presented in this paper is completely general and can be used in other fields where the parameters of the model best fitted to the observable data must be found.
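The rejection flavour of ABC underlying such samplers can be sketched in a few lines. The inverse-square "dispersion" kernel, sensor layout, and priors below are toy assumptions for illustration, not the OLAD setup or a real dispersion model:

```python
import numpy as np

rng = np.random.default_rng(0)
sensors = np.arange(10.0)                  # hypothetical sensor positions

def simulate(x0, q):
    # toy dispersion kernel (illustrative, not a real plume model)
    return q / (1.0 + (sensors - x0) ** 2)

true_x0, true_q = 4.2, 3.0
obs = simulate(true_x0, true_q) + rng.normal(0, 0.05, sensors.size)

# Rejection ABC: sample from the prior, keep the draws whose simulated
# concentrations lie closest to the observations
n = 200_000
x0s = rng.uniform(0, 10, n)                # prior on source location
qs = rng.uniform(0, 10, n)                 # prior on release rate
sims = qs[:, None] / (1.0 + (sensors - x0s[:, None]) ** 2)
dist = np.sqrt(np.mean((sims - obs) ** 2, axis=1))
keep = np.argsort(dist)[: n // 1000]       # accept the closest 0.1 per cent
x0_est, q_est = x0s[keep].mean(), qs[keep].mean()
print("estimated location and rate:", x0_est, q_est)
```

The sequential variant in the paper replaces this one-shot prior sampling with a sequence of intermediate distributions and shrinking tolerances, which is what makes ABC affordable for realistic dispersion models.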
Blind source separation using second-order cyclostationary statistics
Abed-Meraim, Karim; Xiang, Yong; Manton, Jonathan H.; Hua, Yingbo
2001-01-01
This paper studies the blind source separation (BSS) problem with the assumption that the source signals are cyclostationary. Identifiability and separability criteria based on second-order cyclostationary statistics (SOCS) alone are derived. The identifiability condition is used to define an appropriate contrast function. An iterative algorithm (ATH2) is derived to minimize this contrast function. This algorithm separates the sou...
Contaminant source reconstruction by empirical Bayes and Akaike's Bayesian Information Criterion
Zanini, Andrea; Woodbury, Allan D.
2016-02-01
The objective of the paper is to present an empirical Bayesian method combined with Akaike's Bayesian Information Criterion (ABIC) to estimate the contaminant release history of a groundwater source from a few concentration measurements in space and/or time. From the Bayesian point of view, the ABIC considers prior information on the unknown function, such as the prior distribution (assumed Gaussian) and the covariance function. The unknown statistical quantities, such as the noise variance and the covariance function parameters, are computed through the process; moreover, the method also quantifies the estimation error through confidence intervals. The methodology was successfully tested on three test cases: the classic Skaggs and Kabala release function, three sharp releases (both cases concern transport in a one-dimensional homogeneous medium), and data collected from laboratory equipment consisting of a two-dimensional homogeneous unconfined aquifer. The performance of the method was tested with two different covariance functions (Gaussian and exponential) and also with large measurement errors. The results obtained are discussed and compared to the geostatistical approach of Kitanidis (1995).
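The linear-inverse core of release-history estimation can be illustrated with a ridge (Tikhonov, i.e., Gaussian-prior) deconvolution. The transfer kernel and release function below are illustrative rather than a calibrated transport model, and the regularization weight is fixed by hand instead of being selected by ABIC:

```python
import numpy as np

rng = np.random.default_rng(0)
nt = 100
t = np.arange(nt, dtype=float)

# Transfer function: response at the well to a unit release at a given lag
# (illustrative Gaussian-shaped kernel, not a calibrated transport model)
lag = np.arange(nt)
kernel = np.exp(-0.5 * ((lag - 20) / 6.0) ** 2)
kernel /= kernel.sum()
H = np.array([[kernel[i - j] if i >= j else 0.0 for j in range(nt)]
              for i in range(nt)])            # causal convolution matrix

# Smooth two-peak "true" release history and noisy observations
f_true = np.exp(-0.5 * ((t - 25) / 5) ** 2) + 0.6 * np.exp(-0.5 * ((t - 55) / 8) ** 2)
y = H @ f_true + rng.normal(0, 0.005, nt)

# Ridge solution, equivalent to a zero-mean Gaussian prior on f
lam = 1e-2
f_hat = np.linalg.solve(H.T @ H + lam * np.eye(nt), H.T @ y)
err = np.linalg.norm(f_hat - f_true) / np.linalg.norm(f_true)
print("relative error:", err)
```

In the empirical Bayes / ABIC framework, `lam` and the noise variance are not fixed a priori but estimated from the data by maximizing the marginal likelihood, and the posterior covariance supplies the confidence intervals the abstract mentions.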
International Nuclear Information System (INIS)
The occurrence of hazardous accidents in nuclear power plants and industrial units usually leads to the release of radioactive materials and pollutants into the environment. These materials and pollutants can be transported far downstream by the wind flow. In this paper, we implemented an atmospheric dispersion code to solve the inverse problem. Having received and detected the pollutants in one region, we may estimate the rate and location of the unknown source. The modeling requires a code capable of atmospheric dispersion calculations; furthermore, a mathematical approach is required to infer the source location and the related rates. In this paper, the AERMOD software and Bayesian inference along with Markov Chain Monte Carlo have been applied. Applying the Bayesian approach and Markov Chain Monte Carlo to this subject is not new, but coupling these methods with AERMOD, a well-known regulatory model, is, and it enhances the reliability of the outcomes. To evaluate the method, an example is considered by defining pollutant concentrations in a specific region and then obtaining the source location and intensity by a direct calculation. The result of the calculation estimates the average source location at a distance of 7 km with an accuracy of 5 m, which is good enough to support the ability of the proposed algorithm.
Miller, Erin A.; Robinson, Sean M.; Anderson, Kevin K.; McCall, Jonathon D.; Prinke, Amanda M.; Webster, Jennifer B.; Seifert, Carolyn E.
2015-06-01
We present a novel technique for the localization of radiological sources in urban or rural environments from an aerial platform. The technique is based on a Bayesian approach to localization, in which measured count rates in a time series are compared with predicted count rates from a series of pre-calculated test sources to define likelihood. This technique is expanded by using a localized treatment with a limited field of view (FOV), coupled with a likelihood ratio reevaluation, allowing for real-time computation on commodity hardware for arbitrarily complex detector models and terrain. In particular, detectors with inherent asymmetry of response (such as those employing internal collimation or self-shielding for enhanced directional awareness) are leveraged by this approach to provide improved localization. Results from the localization technique are shown for simulated flight data using monolithic as well as directionally-aware detector models, and the capability of the methodology to locate radioisotopes is estimated for several test cases. This localization technique is shown to facilitate urban search by allowing quick and adaptive estimates of source location, in many cases from a single flyover near a source. In particular, this method represents a significant advancement from earlier methods like full-field Bayesian likelihood, which is not generally fast enough to allow for broad-field search in real time, and highest-net-counts estimation, which has a localization error that depends strongly on flight path and cannot generally operate without exhaustive search.
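The grid-based Bayesian localization idea (compare a measured count-rate time series against predictions from candidate source positions) can be sketched with a Poisson likelihood. The flight path, source strength, and background below are invented for illustration, and the detector is isotropic, unlike the directionally-aware models discussed in the abstract:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-leg flight path over a ground grid (metres)
leg1 = np.stack([np.linspace(-200, 200, 41), np.full(41, 30.0)], axis=1)
leg2 = np.stack([np.full(41, -120.0), np.linspace(-200, 200, 41)], axis=1)
path = np.concatenate([leg1, leg2])
src = np.array([40.0, -50.0])
strength = 1.0e6                       # illustrative source intensity

def expected_counts(points, s):
    # inverse-square signal (with a 15 m altitude offset) plus flat background
    r2 = np.sum((points - s) ** 2, axis=1) + 15.0 ** 2
    return strength / r2 + 20.0

counts = rng.poisson(expected_counts(path, src))

# Bayesian grid search: Poisson log-likelihood for each candidate location
xs = np.arange(-150, 151, 5.0)
ys = np.arange(-150, 151, 5.0)
logL = np.zeros((ys.size, xs.size))
for i, y0 in enumerate(ys):
    for j, x0 in enumerate(xs):
        mu = expected_counts(path, np.array([x0, y0]))
        logL[i, j] = np.sum(counts * np.log(mu) - mu)   # up to a constant
i, j = np.unravel_index(np.argmax(logL), logL.shape)
print("MAP estimate:", xs[j], ys[i])
```

The limited-FOV treatment in the paper amounts to evaluating this likelihood only over candidate cells near the current aircraft position, which is what makes the computation feasible in real time.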
Energy Technology Data Exchange (ETDEWEB)
Miller, Erin A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Robinson, Sean M. [Pacific Northwest National Lab. (PNNL), Seattle, WA (United States); Anderson, Kevin K. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); McCall, Jonathon D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Prinke, Amanda M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Webster, Jennifer B. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Seifert, Carolyn E. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)
2015-01-19
A Bayesian approach to quantify the contribution of animal-food sources to human salmonellosis
DEFF Research Database (Denmark)
Hald, Tine; Vose, D.; Wegener, Henrik Caspar; Koupeev, T.
2004-01-01
Based on the data from the integrated Danish Salmonella surveillance in 1999, we developed a mathematical model for quantifying the contribution of each of the major animal-food sources to human salmonellosis. The model was set up to calculate the number of domestic and sporadic cases caused...... by different Salmonella sero and phage types as a function of the prevalence of these Salmonella types in the animal-food sources and the amount of food source consumed. A multiparameter prior accounting for the presumed but unknown differences between serotypes and food sources with respect to causing human...... salmonellosis was also included. The joint posterior distribution was estimated by fitting the model to the reported number of domestic and sporadic cases per Salmonella type in a Bayesian framework using Markov Chain Monte Carlo simulation. The number of domestic and sporadic cases was obtained by subtracting...
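The kind of inference described above — fitting source contributions to reported case counts with MCMC — can be sketched with a toy random-walk Metropolis sampler. The prevalence matrix, consumption amounts, and model form below are invented for illustration and are far simpler than the Danish surveillance model.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative data (not the Danish surveillance data): rows = Salmonella types,
# columns = food sources; prev[i, j] = prevalence of type i in source j.
prev = np.array([[0.02, 0.10],
                 [0.08, 0.01],
                 [0.05, 0.05]])
consumed = np.array([1000.0, 800.0])   # amount of each food source consumed
true_q = np.array([0.6, 0.2])          # per-unit ability of each source to cause illness
cases = rng.poisson(prev @ (true_q * consumed))  # simulated reported cases per type

def loglik(q):
    """Poisson log-likelihood of the reported cases given contributions q."""
    lam = prev @ (q * consumed)
    return np.sum(cases * np.log(lam) - lam)

# Random-walk Metropolis on log(q) with a vague scale prior (flat on log q).
q = np.ones(2)
ll = loglik(q)
samples = []
for it in range(20000):
    prop = q * np.exp(0.1 * rng.standard_normal(2))
    ll_prop = loglik(prop)
    if np.log(rng.random()) < ll_prop - ll:  # symmetric proposal in log space
        q, ll = prop, ll_prop
    if it >= 5000:                           # discard burn-in
        samples.append(q)
post_mean = np.mean(samples, axis=0)
```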
Application of evidence theory in information fusion of multiple sources in bayesian analysis
Institute of Scientific and Technical Information of China (English)
周忠宝; 蒋平; 武小悦
2004-01-01
How to obtain a proper prior distribution is one of the most critical problems in Bayesian analysis. In many practical cases the prior information comes from different sources, and the form of the prior distribution is often known while its parameters are hard to determine. In this paper, based on evidence theory, a new method is presented to fuse the information from multiple sources and determine the parameters of the prior distribution when its form is known. The prior distributions derived from the individual sources are converted into corresponding mass functions, which are combined by the Dempster-Shafer (D-S) method; from the combined mass function, representative points of the fused prior distribution are obtained. These points are then fitted to the given distribution form to determine its parameters, yielding the fused prior distribution on which Bayesian analysis can be performed. How to convert the prior distributions into mass functions properly and how to obtain the representative points of the fused prior distribution are the central questions addressed in this paper. A simulation example shows that the proposed method is effective.
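The fusion step named above, Dempster's rule of combination, can be sketched as follows. The frame of discernment and the two expert mass functions are invented for illustration.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions over frozenset focal elements (Dempster's rule)."""
    combined = {}
    conflict = 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb          # mass assigned to the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: sources cannot be combined")
    k = 1.0 - conflict                   # normalization constant
    return {s: w / k for s, w in combined.items()}

# Two sources expressing belief over candidate prior-parameter ranges;
# the frame {low, mid, high} is purely illustrative.
A, B, C = "low", "mid", "high"
m1 = {frozenset({A}): 0.6, frozenset({A, B}): 0.3, frozenset({A, B, C}): 0.1}
m2 = {frozenset({B}): 0.5, frozenset({A, B}): 0.4, frozenset({A, B, C}): 0.1}

fused = dempster_combine(m1, m2)
```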
Magnetic source separation in Earth's outer core.
Hoffman, Kenneth A; Singer, Brad S
2008-09-26
We present evidence that the source of Earth's axial dipole field is largely independent from the sources responsible for the rest of the geomagnetic field, the so-called nonaxial dipole (NAD) field. Support for this claim comes from correlations between the structure of the historic field and the behavior of the paleomagnetic field recorded in precisely dated lavas at those times when the axial dipole was especially weak or nearly absent. It is argued that a "stratification" of magnetic sources exists in the fluid core such that the axial dipole is the only observed field component that is nearly immune from the influence exerted by the lowermost mantle. It follows that subsequent work on spherical harmonic-based field descriptions may now incorporate an understanding of a dichotomy of spatial-temporal dynamo processes. PMID:18818352
On merging rainfall data from diverse sources using a Bayesian approach
Bhattacharya, Biswa; Tarekegn, Tegegne
2014-05-01
Numerous studies have presented comparisons of satellite rainfall products, such as from the Tropical Rainfall Measuring Mission (TRMM), with rain gauge data and have concluded, in general, that the two sources of data are comparable at suitable space and time scales. The comparison is not straightforward, as the products employ different measurement techniques and depend on very different space-time scales of measurement. The number of available gauges in a catchment also influences the comparability and thus adds to the complexity. TRMM rainfall data have also been used directly in hydrological modelling; as the space-time scale becomes finer, the accuracy of these models degrades. Combining two or more sources of rainfall data can therefore greatly benefit hydrological studies. Because of differences in their space-time structure, the various rainfall products together contain information about the spatio-temporal distribution of rainfall that is not available from any single source. To harness this benefit, we have developed a method of merging two (or more) rainfall products within the framework of the Bayesian Data Fusion (BDF) principle. By applying this principle, the rainfall data from the various sources can be combined into a single rainfall time series. The usefulness of the approach has been explored in a case study on the Lake Tana Basin of the Upper Blue Nile Basin in Ethiopia. A 'leave one rain gauge out' cross-validation technique was employed to evaluate the accuracy of rainfall interpolated from rain gauge data using Inverse Distance Weighting (IDW), of TRMM, and of the fused data (BDF). The results showed that the BDF prediction was better than TRMM and IDW. The three rainfall estimates were further evaluated by their capability to predict observed stream flow using NAM, a lumped conceptual rainfall-runoff model. Visual inspection of the
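At a single location, the simplest Bayesian fusion of two independent Gaussian rainfall estimates is precision weighting, sketched below with made-up gauge-interpolated (IDW) and TRMM values and assumed error variances. The full BDF method handles space-time structure and is not reproduced here.

```python
import numpy as np

def fuse_gaussian(means, variances):
    """Precision-weighted fusion of independent Gaussian estimates.

    Returns the fused mean and variance; the fused variance is always
    smaller than the smallest input variance.
    """
    means = np.asarray(means, float)
    prec = 1.0 / np.asarray(variances, float)  # precisions = inverse variances
    var = 1.0 / prec.sum()
    return var * np.sum(prec * means), var

# Hypothetical daily rainfall (mm) at one grid cell: IDW and TRMM estimates
# with assumed (illustrative) error variances.
mean_f, var_f = fuse_gaussian([12.0, 18.0], [4.0, 9.0])
```

The fused estimate sits closer to the more precise (gauge) value, which is the intuition behind merging the two products.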
Mustac, M.; Kim, S.; Tkalcic, H.; Rhie, J.; Chen, Y.; Ford, S. R.; Sebastian, N.
2015-12-01
Conventional approaches to inverse problems suffer from non-linearity and non-uniqueness in estimations of seismic structures and source properties. Estimated results and associated uncertainties are often biased by applied regularizations and additional constraints, which are commonly introduced to solve such problems. Bayesian methods, however, provide statistically meaningful estimations of models and their uncertainties constrained by data information. In addition, hierarchical and trans-dimensional (trans-D) techniques are inherently implemented in the Bayesian framework to account for involved error statistics and model parameterizations, and, in turn, allow more rigorous estimations of the same. Here, we apply Bayesian methods throughout the entire inference process to estimate seismic structures and source properties in Northeast Asia including east China, the Korean peninsula, and the Japanese islands. Ambient noise analysis is first performed to obtain a base three-dimensional (3-D) heterogeneity model using continuous broadband waveforms from more than 300 stations. As for the tomography of surface wave group and phase velocities in the 5-70 s band, we adopt a hierarchical and trans-D Bayesian inversion method using Voronoi partition. The 3-D heterogeneity model is further improved by joint inversions of teleseismic receiver functions and dispersion data using a newly developed high-efficiency Bayesian technique. The obtained model is subsequently used to prepare 3-D structural Green's functions for the source characterization. A hierarchical Bayesian method for point source inversion using regional complete waveform data is applied to selected events from the region. The seismic structure and source characteristics with rigorously estimated uncertainties from the novel Bayesian methods provide enhanced monitoring and discrimination of seismic events in northeast Asia.
Application of hierarchical Bayesian unmixing models in river sediment source apportionment
Blake, Will; Smith, Hugh; Navas, Ana; Bodé, Samuel; Goddard, Rupert; Kuzyk, Zou Zou; Lennard, Amy; Lobb, David; Owens, Phil; Palazon, Leticia; Petticrew, Ellen; Gaspar, Leticia; Stock, Brian; Boeckx, Pascal; Semmens, Brice
2016-04-01
Fingerprinting and unmixing concepts are used widely across environmental disciplines for forensic evaluation of pollutant sources. In aquatic and marine systems, this includes tracking the source of organic and inorganic pollutants in water and linking problem sediment to soil erosion and land use sources. It is, however, the particular complexity of ecological systems that has driven creation of the most sophisticated mixing models, primarily to (i) evaluate diet composition in complex ecological food webs, (ii) inform population structure and (iii) explore animal movement. In the context of the new hierarchical Bayesian unmixing model, MixSIAR, developed to characterise intra-population niche variation in ecological systems, we evaluate the linkage between ecological 'prey' and 'consumer' concepts and river basin sediment 'source' and sediment 'mixtures' to exemplify the value of ecological modelling tools to river basin science. Recent studies have outlined advantages presented by Bayesian unmixing approaches in handling complex source and mixture datasets while dealing appropriately with uncertainty in parameter probability distributions. MixSIAR is unique in that it allows individual fixed and random effects associated with mixture hierarchy, i.e. factors that might exert an influence on model outcome for mixture groups, to be explored within the source-receptor framework. This offers new and powerful ways of interpreting river basin apportionment data. In this contribution, key components of the model are evaluated in the context of common experimental designs for sediment fingerprinting studies, namely simple, nested and distributed catchment sampling programmes. Illustrative examples using geochemical and compound-specific stable isotope datasets are presented and used to discuss best practice with specific attention to (1) the tracer selection process, (2) incorporation of fixed effects relating to sample timeframe and sediment type in the modelling
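The deterministic core of tracer unmixing, before any Bayesian treatment of uncertainty, is a constrained least-squares mixing problem. A minimal two-source sketch with invented tracer values follows; MixSIAR itself fits a full hierarchical Bayesian model with fixed and random effects rather than this point estimate.

```python
import numpy as np

# Illustrative tracer concentrations (e.g., a geochemical fingerprint of three
# tracers) for two soil sources and a downstream sediment mixture; all numbers
# are made up for the sketch.
source_A = np.array([12.0, 3.5, 40.0])   # tracer means, source A
source_B = np.array([4.0, 9.0, 10.0])    # tracer means, source B
mixture = np.array([8.5, 6.0, 23.0])     # measured sediment mixture

# With two sources, mixture ~ p*A + (1-p)*B; the least-squares proportion p
# has a closed form, clipped to the physically meaningful range [0, 1].
d = source_A - source_B
p = np.clip(np.dot(mixture - source_B, d) / np.dot(d, d), 0.0, 1.0)
apportionment = {"source_A": p, "source_B": 1.0 - p}
```

With more sources the same idea becomes a constrained optimization over the simplex, and the Bayesian treatment replaces the point estimate with a posterior over proportions.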
Dosso, Stan E; Wilmut, Michael J; Nielsen, Peter L
2010-07-01
This paper applies Bayesian source tracking in an uncertain environment to Mediterranean Sea data, and investigates the resulting tracks and track uncertainties as a function of data information content (number of data time-segments, number of frequencies, and signal-to-noise ratio) and of prior information (environmental uncertainties and source-velocity constraints). To track low-level sources, acoustic data recorded for multiple time segments (corresponding to multiple source positions along the track) are inverted simultaneously. Environmental uncertainty is addressed by including unknown water-column and seabed properties as nuisance parameters in an augmented inversion. Two approaches are considered: Focalization-tracking maximizes the posterior probability density (PPD) over the unknown source and environmental parameters. Marginalization-tracking integrates the PPD over environmental parameters to obtain a sequence of joint marginal probability distributions over source coordinates, from which the most-probable track and track uncertainties can be extracted. Both approaches apply track constraints on the maximum allowable vertical and radial source velocity. The two approaches are applied for towed-source acoustic data recorded at a vertical line array at a shallow-water test site in the Mediterranean Sea where previous geoacoustic studies have been carried out. PMID:20649202
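The distinction between focalization (maximize the PPD jointly over source and environment) and marginalization (integrate the environment out first) can be illustrated on a toy posterior over source range and one nuisance environmental parameter. The Gaussian PPD below is purely illustrative and not the paper's acoustic model.

```python
import numpy as np

# Toy posterior over source range r (km) and a nuisance environment parameter
# c (seabed sound speed, m/s); the correlated Gaussian shape is invented.
r = np.linspace(0.0, 10.0, 101)
c = np.linspace(1500.0, 1600.0, 51)
R, C = np.meshgrid(r, c, indexing="ij")

# Best-fitting range trades off against the assumed sound speed.
ppd = np.exp(-((R - 5.0 - 0.01 * (C - 1550.0)) ** 2) / 0.5
             - ((C - 1560.0) ** 2) / 800.0)
ppd /= ppd.sum()

# Focalization: joint maximum over source and environmental parameters.
i_foc, _ = np.unravel_index(np.argmax(ppd), ppd.shape)
r_focal = r[i_foc]

# Marginalization: integrate out the environment, then locate the source;
# the marginal also yields track uncertainty, not just a point estimate.
marg = ppd.sum(axis=1)
r_marg = r[np.argmax(marg)]
```

On well-behaved posteriors the two estimates agree; marginalization additionally propagates environmental uncertainty into the source-location uncertainty.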
Adaptive blind source separation with HRTFs beamforming preprocessing
Maazaoui, Mounira; Abed-Meraim, Karim; Grenier, Yves
2012-01-01
We propose an adaptive blind source separation algorithm in the context of robot audition using a microphone array. Our algorithm presents two steps: a fixed beamforming step to reduce the reverberation and the background noise and a source separation step. In the fixed beamforming preprocessing, we build the beamforming filters using the Head Related Transfer Functions (HRTFs) which allows us to take into consideration the effect of the robot's head on the near acoustic field. In the source ...
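A fixed beamforming step can be sketched with a generic narrowband delay-and-sum beamformer on a uniform linear array. This is not the paper's HRTF-based design (which encodes the robot head's acoustic effect in the filters); the array geometry, frequency, and source direction below are assumptions.

```python
import numpy as np

f0, c = 1000.0, 343.0                    # source frequency (Hz), sound speed (m/s)
mics = np.arange(4) * 0.05               # 4 microphones, 5 cm spacing (m)
theta_true = np.deg2rad(30.0)            # true direction of arrival

t = np.arange(1024) / 16000.0
delay = mics * np.sin(theta_true) / c    # per-mic propagation delay (s)
# Analytic (complex) narrowband signals observed at each microphone.
x = np.exp(2j * np.pi * f0 * (t[None, :] - delay[:, None]))

def steered_power(theta):
    """Delay-and-sum output power when steering toward angle theta."""
    w = np.exp(2j * np.pi * f0 * mics * np.sin(theta) / c)  # steering vector
    y = (w[:, None] * x).mean(axis=0)    # phase-align channels, then average
    return np.mean(np.abs(y) ** 2)

angles = np.deg2rad(np.arange(-90.0, 90.5, 1.0))
est = np.rad2deg(angles[np.argmax([steered_power(a) for a in angles])])
```

Output power peaks when the steering delays match the true propagation delays, which is the mechanism a fixed beamformer uses to suppress reverberation and noise from other directions.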
Using Bayesian Belief Network (BBN) modelling for Rapid Source Term Prediction. RASTEP Phase 1
Energy Technology Data Exchange (ETDEWEB)
Knochenhauer, M.; Swaling, V.H.; Alfheim, P. [Scandpower AB, Sundbyberg (Sweden)
2012-09-15
The project is connected to the development of RASTEP, a computerized source term prediction tool aimed at providing a basis for improving off-site emergency management. RASTEP uses Bayesian belief networks (BBN) to model severe accident progression in a nuclear power plant in combination with pre-calculated source terms (i.e., amount, timing, and pathway of released radio-nuclides). The output is a set of possible source terms with associated probabilities. In the NKS project, a number of complex issues associated with the integration of probabilistic and deterministic analyses are addressed. This includes issues related to the method for estimating source terms, signal validation, and sensitivity analysis. One major task within Phase 1 of the project addressed the problem of how to make the source term module flexible enough to give reliable and valid output throughout the accident scenario. Of the alternatives evaluated, it is recommended that RASTEP is connected to a fast running source term prediction code, e.g., MARS, with a possibility of updating source terms based on real-time observations. (Author)
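BBN inference of the kind RASTEP performs — updating release probabilities from plant observations — can be illustrated with a tiny three-node network evaluated by enumeration. The states and probabilities below are invented and bear no relation to the real RASTEP model, which is far larger and built from plant-specific analyses.

```python
# Toy accident-progression network: damage -> sprays_fail -> large_release.
# All numbers are illustrative, not plant data.
p_damage = 0.01
p_sprays_fail = {True: 0.3, False: 0.05}          # conditional on core damage
p_release = {(True, True): 0.6, (True, False): 0.05,
             (False, True): 0.0, (False, False): 0.0}

# Observation: containment sprays have failed. Infer P(large release | obs)
# by enumerating the joint distribution over the hidden damage state.
num = 0.0   # P(large_release and sprays_fail)
den = 0.0   # P(sprays_fail)
for damage in (True, False):
    pd = p_damage if damage else 1.0 - p_damage
    joint = pd * p_sprays_fail[damage]
    den += joint
    num += joint * p_release[(damage, True)]
p_release_given_obs = num / den
```

Enumeration is exponential in the number of nodes; real BBN tools use junction-tree or similar algorithms, but the conditioning logic is the same.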
Gehrmann, Romina A. S.; Schwalenberg, Katrin; Riedel, Michael; Spence, George D.; Spieß, Volkhard; Dosso, Stan E.
2016-01-01
This paper applies nonlinear Bayesian inversion to marine controlled source electromagnetic (CSEM) data collected near two sites of the Integrated Ocean Drilling Program (IODP) Expedition 311 on the northern Cascadia Margin to investigate subseafloor resistivity structure related to gas hydrate deposits and cold vents. The Cascadia margin, off the west coast of Vancouver Island, Canada, has a large accretionary prism where sediments are under pressure due to convergent plate boundary tectonics. Gas hydrate deposits and cold vent structures have previously been investigated by various geophysical methods and seabed drilling. Here, we invert time-domain CSEM data collected at Sites U1328 and U1329 of IODP Expedition 311 using Bayesian methods to derive subsurface resistivity model parameters and uncertainties. The Bayesian information criterion is applied to determine the amount of structure (number of layers in a depth-dependent model) that can be resolved by the data. The parameter space is sampled with the Metropolis-Hastings algorithm in principal-component space, utilizing parallel tempering to ensure wider and efficient sampling and convergence. Nonlinear inversion allows analysis of uncertain acquisition parameters such as time delays between receiver and transmitter clocks as well as input electrical current amplitude. Marginalizing over these instrument parameters in the inversion accounts for their contribution to the geophysical model uncertainties. One-dimensional inversion of time-domain CSEM data collected at measurement sites along a survey line allows interpretation of the subsurface resistivity structure. The data sets can be generally explained by models with 1 to 3 layers. Inversion results at U1329, at the landward edge of the gas hydrate stability zone, indicate a sediment unconformity as well as potential cold vents which were previously unknown. The resistivities generally increase upslope due to sediment erosion along the slope. Inversion
Separating More Sources Than Sensors Using Time-Frequency Distributions
Directory of Open Access Journals (Sweden)
Linh-Trung, Nguyen; Belouchrani, Adel; Abed-Meraim, Karim; Boashash, Boualem
2005-01-01
We examine the problem of blind separation of nonstationary sources in the underdetermined case, where there are more sources than sensors. Since time-frequency (TF) signal processing provides effective tools for dealing with nonstationary signals, we propose a new separation method that is based on time-frequency distributions (TFDs). The underlying assumption is that the original sources are disjoint in the time-frequency (TF) domain. The successful method recovers the sources by performing the following four main procedures. First, the spatial time-frequency distribution (STFD) matrices are computed from the observed mixtures. Next, the auto-source TF points are separated from cross-source TF points thanks to the special structure of these mixture STFD matrices. Then, the vectors that correspond to the selected auto-source points are clustered into different classes according to the spatial directions which differ among different sources; each class, now containing the auto-source points of only one source, gives an estimation of the TFD of this source. Finally, the source waveforms are recovered from their TFD estimates using TF synthesis. Simulated experiments indicate the success of the proposed algorithm in different scenarios. We also contribute with two other modified versions of the algorithm to better deal with auto-source point selection.
Single channel blind source separation based on ICA feature extraction
Institute of Scientific and Technical Information of China (English)
Anonymous
2007-01-01
A new technique is proposed to solve the blind source separation (BSS) problem given only a single channel observation. The basis functions and the density of the coefficients of the source signals learned by ICA are used as prior knowledge. Based on the learned prior information, the learning rules of single channel BSS are presented by maximizing the joint log likelihood of the mixed sources to obtain the source signals from the single observation, in which the posterior density of the given measurements is maximized. The experimental results exhibit successful separation performance for mixtures of speech and music signals.
Novel blind source separation algorithm using Gaussian mixture density function
Institute of Scientific and Technical Information of China (English)
孔薇; 杨杰; 周越
2004-01-01
Blind source separation (BSS) is an important task for numerous applications in signal processing, communications and array processing. But for many complex sources, blind separation algorithms are not efficient because the probability distribution of the sources cannot be estimated accurately. So in this paper, to justify the ME (maximum entropy) approach, the relation between ME and MMI (minimum mutual information) is elucidated first. Then a novel algorithm that uses a Gaussian mixture density to approximate the probability distribution of the sources is presented, based on the ME approach. An experiment on the BSS of ship-radiated noise demonstrates that the proposed algorithm is valid and efficient.
DEFF Research Database (Denmark)
Oh, Geok Lian
This PhD study examines the use of seismic technology for the problem of detecting underground facilities, whereby a seismic source such as a sledgehammer is used to generate seismic waves through the ground, sensed by an array of seismic sensors on the ground surface, and recorded by the digital...... device. The concept is similar to the techniques used in exploration seismology, in which explosions (that occur at or below the surface) or vibration wave-fronts generated at the surface reflect and refract off structures at the ground depth, so as to generate the ground profile of the elastic material...... density values of the discretized ground medium, which leads to time-consuming computations and instability behaviour of the inversion process. In addition, the geophysics inverse problem is generally ill-posed due to non-exact forward model that introduces errors. The Bayesian inversion method through...
Using Bayesian Belief Network (BBN) modelling for rapid source term prediction. Final report
International Nuclear Information System (INIS)
The project presented in this report deals with a number of complex issues related to the development of a tool for rapid source term prediction (RASTEP), based on a plant model represented as a Bayesian belief network (BBN) and a source term module which is used for assigning relevant source terms to BBN end states. Thus, RASTEP uses a BBN to model severe accident progression in a nuclear power plant in combination with pre-calculated source terms (i.e., amount, composition, timing, and release path of released radionuclides). The output is a set of possible source terms with associated probabilities. One major issue has been the integration of probabilistic and deterministic analyses, i.e., the challenge of making the source term determination flexible enough to give reliable and valid output throughout the accident scenario. The potential for connecting RASTEP to a fast running source term prediction code has been explored, as well as alternative ways of improving the deterministic connections of the tool. As part of the investigation, a comparison of two deterministic severe accident analysis codes has been performed. A second important task has been to develop a general method for including experts' beliefs in a systematic way when defining the conditional probability tables (CPTs) in the BBN. Using this iterative method results in a reliable BBN even though expert judgements, with their associated uncertainties, have been used. It also simplifies verification and validation of the considerable amounts of quantitative data included in a BBN. (Author)
Using Bayesian Belief Network (BBN) modelling for rapid source term prediction. Final report
Energy Technology Data Exchange (ETDEWEB)
Knochenhauer, M.; Swaling, V.H.; Dedda, F.D.; Hansson, F.; Sjoekvist, S.; Sunnegaerd, K. [Lloyd' s Register Consulting AB, Sundbyberg (Sweden)
2013-10-15
Anezaki, Katsunori; Nakano, Takeshi; Kashiwagi, Nobuhisa
2016-01-19
Using the chemical mass balance method, and considering the presence of unidentified sources, we estimated the origins of PCB contamination in surface sediments of Muroran Port, Japan. It was assumed that these PCBs originated from four types of Kanechlor products (KC300, KC400, KC500, and KC600), combustion, and two kinds of pigments (azo and phthalocyanine). The characteristics of these congener patterns were summarized by principal component analysis, and explanatory variables were determined. A Bayesian semifactor model (CMBK2) was applied to the explanatory variables to analyze the sources of PCBs in the sediments. The resulting estimates of the contribution ratios for each sediment indicate that the existence of unidentified sources can be ignored and that the assumed seven sources are adequate to account for the contamination. Within the port, the contribution ratio of KC500 and KC600 (used as paints for ship hulls) was extremely high, but outside the port, the influence of azo pigments was observable to a limited degree. This indicates that environmental PCBs not derived from technical PCBs are present at levels that cannot be ignored. PMID:26716388
Source separation of household waste: A case study in China
International Nuclear Information System (INIS)
A pilot program concerning source separation of household waste was launched in Hangzhou, the capital city of Zhejiang province, China. Detailed investigations of the composition and properties of household waste in the experimental communities revealed that high water content and a high percentage of food waste are the main limiting factors in the recovery of recyclables, especially paper, from household waste, and the main contributors to the high cost and low efficiency of waste disposal. On the basis of the investigation, a novel source separation method, according to which household waste was classified as food waste, dry waste and harmful waste, was proposed and performed in four selected communities. In addition, a corresponding household waste management system that involves all stakeholders, a recovery system and a mechanical dehydration system for food waste were established to promote source separation. Performance data and questionnaire survey results showed that the active support and investment of a real estate company and a community residential committee play important roles in enhancing public participation and awareness of the importance of waste source separation. In comparison with the conventional mixed collection and transportation system for household waste, the established source separation and management system is cost-effective. It could be extended to the entire city and used by other cities in China as a source of reference
Separation of synchronous sources through phase locked matrix factorization.
Almeida, Miguel S B; Vigário, Ricardo; Bioucas-Dias, José
2014-10-01
In this paper, we study the separation of synchronous sources (SSS) problem, which deals with the separation of sources whose phases are synchronous. This problem cannot be addressed through independent component analysis methods because synchronous sources are statistically dependent. We present a two-step algorithm, called phase locked matrix factorization (PLMF), to perform SSS. We also show that SSS is identifiable under some assumptions and that any global minimum of PLMF's cost function is a desirable solution for SSS. We extensively study the algorithm on simulated data and conclude that it can perform SSS with various numbers of sources and sensors and with various phase lags between the sources, both in the ideal (i.e., perfectly synchronous and noise-free) case, and with various levels of additive noise in the observed signals and of phase jitter in the sources. PMID:25291741
Prasad, Sudhakar
2014-01-01
We present an asymptotic analysis of the minimum probability of error (MPE) in inferring the correct hypothesis in a Bayesian multi-hypothesis testing (MHT) formalism using many pixels of data that are corrupted by signal-dependent shot noise, sensor read noise, and background illumination. We perform this error analysis for a variety of combined noise and background statistics, including a pseudo-Gaussian distribution that can be employed to treat approximately the photon-counting statistics of signal and background as well as purely Gaussian sensor read-out noise and more general, exponentially peaked distributions. We subsequently apply the MPE asymptotics to characterize the minimum conditions needed to localize a point source in three dimensions by means of a rotating-PSF imager and compare its performance with that of a conventional imager in the presence of background and sensor-noise fluctuations. In a separate paper, we apply the formalism to the related but qualitatively different problem of 2D supe...
Abban, B.; (Thanos) Papanicolaou, A. N.; Cowles, M. K.; Wilson, C. G.; Abaci, O.; Wacha, K.; Schilling, K.; Schnoebelen, D.
2016-06-01
An enhanced revision of the Fox and Papanicolaou (hereafter referred to as "F-P") (2008a) Bayesian, Markov Chain Monte Carlo fingerprinting framework for estimating sediment source contributions and their associated uncertainties is presented. The F-P framework included two key deterministic parameters, α and β, that, respectively, reflected the spatial origin attributes of sources and the time history of eroded material delivered to and collected at the watershed outlet. However, the deterministic treatment of α and β is limited to cases with well-defined spatial partitioning of sources, high sediment delivery, and relatively short travel times with little variability in transport within the watershed. For event-based studies in intensively managed landscapes, this may be inadequate since landscape heterogeneity results in variabilities in source contributions, their pathways, delivery times, and storage within the watershed. Thus, probabilistic treatments of α and β are implemented in the enhanced framework to account for these variabilities. To evaluate the effects of the treatments of α and β on source partitioning, both frameworks are applied to the South Amana Subwatershed (SASW) in the U.S. Midwest. The enhanced framework is found to estimate mean source contributions that are in good agreement with estimates from other studies in SASW. The enhanced framework is also able to produce expected trends in uncertainty during the study period, unlike the F-P framework, which does not perform as expected. Overall, the enhanced framework is found to be less sensitive to changes in α and β than the F-P framework, and, therefore, is more robust and desirable from a management standpoint.
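The core of such Bayesian fingerprinting frameworks is sampling source contributions from a posterior by Markov chain Monte Carlo. A minimal random-walk Metropolis sketch for a two-source, one-tracer toy problem follows; all numbers are invented and the model is far simpler than the F-P or enhanced frameworks:

```python
# Toy Bayesian sediment fingerprinting: sediment tracer concentration is
# p * src_a + (1 - p) * src_b plus Gaussian noise; sample the posterior of
# the mixing proportion p (uniform prior) with random-walk Metropolis.
import math
import random

random.seed(1)
src_a, src_b, sigma = 10.0, 2.0, 1.0          # tracer means, noise sd (invented)
true_p = 0.7
data = [true_p * src_a + (1 - true_p) * src_b + random.gauss(0, sigma)
        for _ in range(50)]

def log_post(p):
    if not 0.0 < p < 1.0:                     # uniform prior on (0, 1)
        return -math.inf
    mu = p * src_a + (1 - p) * src_b
    return -sum((x - mu) ** 2 for x in data) / (2 * sigma ** 2)

p, samples = 0.5, []
for it in range(6000):
    prop = p + random.gauss(0, 0.05)          # random-walk proposal
    if math.log(random.random()) < log_post(prop) - log_post(p):
        p = prop                              # accept
    if it >= 1000:                            # discard burn-in
        samples.append(p)

mean_p = sum(samples) / len(samples)          # posterior mean, near 0.7
```

The posterior spread of `samples` is the uncertainty estimate that frameworks like the one above report for each source contribution.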
Directory of Open Access Journals (Sweden)
Rasheda Arman Chowdhury
Localizing the generators of epileptic activity in the brain using Electro-EncephaloGraphy (EEG) or Magneto-EncephaloGraphy (MEG) signals is of particular interest during the pre-surgical investigation of epilepsy. Epileptic discharges can be detectable from background brain activity, provided they are associated with spatially extended generators. Using realistic simulations of epileptic activity, this study evaluates the ability of distributed source localization methods to accurately estimate the location of the generators and their sensitivity to the spatial extent of such generators when using MEG data. Source localization methods based on two types of realistic models have been investigated: (i) brain activity may be modeled using cortical parcels and (ii) brain activity is assumed to be locally smooth within each parcel. A Data Driven Parcellization (DDP) method was used to segment the cortical surface into non-overlapping parcels, and diffusion-based spatial priors were used to model local spatial smoothness within parcels. These models were implemented within the Maximum Entropy on the Mean (MEM) and the Hierarchical Bayesian (HB) source localization frameworks. We proposed new methods in this context and compared them with other standard ones using Monte Carlo simulations of realistic MEG data involving sources of several spatial extents and depths. Detection accuracy of each method was quantified using Receiver Operating Characteristic (ROC) analysis and localization error metrics. Our results showed that methods implemented within the MEM framework were sensitive to all spatial extents of the sources, ranging from 3 cm² to 30 cm², regardless of the number and size of the parcels defining the model. To reach a similar level of accuracy within the HB framework, a model using parcels larger than the size of the sources should be considered.
Yee, Eugene
2007-04-01
Although a great deal of research effort has been focused on the forward prediction of the dispersion of contaminants (e.g., chemical and biological warfare agents) released into the turbulent atmosphere, much less work has been directed toward the inverse prediction of agent source location and strength from the measured concentration, even though the importance of this problem for a number of practical applications is obvious. In general, the inverse problem of source reconstruction is ill-posed and unsolvable without additional information. It is demonstrated that a Bayesian probabilistic inferential framework provides a natural and logically consistent method for source reconstruction from a limited number of noisy concentration data. In particular, the Bayesian approach permits one to incorporate prior knowledge about the source as well as additional information regarding both model and data errors. The latter enables a rigorous determination of the uncertainty in the inference of the source parameters (e.g., spatial location, emission rate, release time, etc.), hence extending the potential of the methodology as a tool for quantitative source reconstruction. A model (or, source-receptor relationship) that relates the source distribution to the concentration data measured by a number of sensors is formulated, and Bayesian probability theory is used to derive the posterior probability density function of the source parameters. A computationally efficient methodology for determination of the likelihood function for the problem, based on an adjoint representation of the source-receptor relationship, is described. Furthermore, we describe the application of efficient stochastic algorithms based on Markov chain Monte Carlo (MCMC) for sampling from the posterior distribution of the source parameters, the latter of which is required to undertake the Bayesian computation. The Bayesian inferential methodology for source reconstruction is validated against real
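The Bayesian inversion described above, a posterior over source parameters obtained from a forward source-receptor model plus noisy sensor data, can be illustrated on a toy 1-D problem with a grid posterior. The forward model and all numbers below are invented and are far simpler than an adjoint source-receptor relationship:

```python
# Toy Bayesian source reconstruction: a made-up decay-with-distance forward
# model maps a 1-D source location to sensor concentrations; Bayes' rule with
# a flat prior over a location grid then yields a posterior over location.
import math
import random

random.seed(0)
sensors = [0.0, 2.0, 4.0, 6.0, 8.0]   # sensor positions (invented)
sigma, true_loc = 0.05, 3.0           # measurement noise sd, true source

def forward(loc, x):
    return 1.0 / (1.0 + (x - loc) ** 2)

data = [forward(true_loc, x) + random.gauss(0, sigma) for x in sensors]

grid = [i * 0.1 for i in range(81)]   # candidate locations 0.0 .. 8.0
log_like = [-sum((d - forward(g, x)) ** 2 for d, x in zip(data, sensors))
            / (2 * sigma ** 2) for g in grid]
m = max(log_like)
post = [math.exp(ll - m) for ll in log_like]   # flat prior on the grid
z = sum(post)
post = [p / z for p in post]                   # normalized posterior
map_loc = grid[post.index(max(post))]          # posterior mode, near 3.0
```

The width of `post` around the mode plays the role of the rigorous uncertainty quantification the abstract describes; real applications replace the grid with MCMC over location, emission rate, and release time.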
Extended Nonnegative Tensor Factorisation Models for Musical Sound Source Separation
Directory of Open Access Journals (Sweden)
Derry FitzGerald
2008-01-01
Recently, shift-invariant tensor factorisation algorithms have been proposed for the purposes of sound source separation of pitched musical instruments. However, in practice, existing algorithms require the use of log-frequency spectrograms to allow shift invariance in frequency which causes problems when attempting to resynthesise the separated sources. Further, it is difficult to impose harmonicity constraints on the recovered basis functions. This paper proposes a new additive synthesis-based approach which allows the use of linear-frequency spectrograms as well as imposing strict harmonic constraints, resulting in an improved model. Further, these additional constraints allow the addition of a source filter model to the factorisation framework, and an extended model which is capable of separating mixtures of pitched and percussive instruments simultaneously.
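The factorisation building block underlying such models is nonnegative matrix (or tensor) factorisation: approximate a nonnegative spectrogram V as WH, with W holding spectral basis functions and H their activations. A minimal pure-Python sketch with Lee-Seung multiplicative updates on an invented toy "spectrogram" (not the shift-invariant, harmonically constrained model of the paper):

```python
# Minimal NMF via multiplicative updates: V (4x6, nonnegative) ~= W (4x2) @ H (2x6).
import random

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def transpose(A):
    return [list(r) for r in zip(*A)]

# Toy "spectrogram": one source active in columns 0-2, another in 3-5.
V = [[2, 2, 2, 0, 0, 0],
     [4, 4, 4, 0, 0, 0],
     [0, 0, 0, 3, 3, 3],
     [0, 0, 0, 1, 1, 1]]

random.seed(3)
r, n, m = 2, 4, 6
W = [[random.uniform(0.5, 1.5) for _ in range(r)] for _ in range(n)]
H = [[random.uniform(0.5, 1.5) for _ in range(m)] for _ in range(r)]

eps = 1e-9
for _ in range(500):
    # H <- H * (W^T V) / (W^T W H), elementwise
    WH, Wt = matmul(W, H), transpose(W)
    num, den = matmul(Wt, V), matmul(Wt, WH)
    H = [[H[i][j] * num[i][j] / (den[i][j] + eps) for j in range(m)]
         for i in range(r)]
    # W <- W * (V H^T) / (W H H^T), elementwise
    WH, Ht = matmul(W, H), transpose(H)
    num, den = matmul(V, Ht), matmul(WH, Ht)
    W = [[W[i][j] * num[i][j] / (den[i][j] + eps) for j in range(r)]
         for i in range(n)]

WH = matmul(W, H)
err = sum((V[i][j] - WH[i][j]) ** 2 for i in range(n) for j in range(m))
```

Because the updates are multiplicative, W and H stay nonnegative throughout; the paper's contribution is to constrain the columns of W to harmonic templates and to make the factorisation shift-invariant.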
International Nuclear Information System (INIS)
We have developed a Bayesian approach to the analysis of neural electromagnetic (MEG/EEG) data that can incorporate or fuse information from other imaging modalities and addresses the ill-posed inverse problem by sampling the many different solutions which could have produced the given data. From these samples one can draw probabilistic inferences about regions of activation. Our source model assumes a variable number of variable-size cortical regions of stimulus-correlated activity. An active region consists of locations on the cortical surface, within a sphere centered on some location in cortex. The number and radii of active regions can vary up to defined maximum values. The goal of the analysis is to determine the posterior probability distribution for the set of parameters that govern the number, location, and extent of active regions. Markov Chain Monte Carlo is used to generate a large sample of sets of parameters distributed according to the posterior distribution. This sample is representative of the many different source distributions that could account for the given data, and allows identification of probable (i.e. consistent) features across solutions. Examples of the use of this analysis technique with both simulated and empirical MEG data are presented
Energy Technology Data Exchange (ETDEWEB)
George, J.S.; Schmidt, D.M.; Wood, C.C.
1999-02-01
We have developed a Bayesian approach to the analysis of neural electromagnetic (MEG/EEG) data that can incorporate or fuse information from other imaging modalities and addresses the ill-posed inverse problem by sampling the many different solutions which could have produced the given data. From these samples one can draw probabilistic inferences about regions of activation. Our source model assumes a variable number of variable-size cortical regions of stimulus-correlated activity. An active region consists of locations on the cortical surface, within a sphere centered on some location in cortex. The number and radii of active regions can vary up to defined maximum values. The goal of the analysis is to determine the posterior probability distribution for the set of parameters that govern the number, location, and extent of active regions. Markov Chain Monte Carlo is used to generate a large sample of sets of parameters distributed according to the posterior distribution. This sample is representative of the many different source distributions that could account for the given data, and allows identification of probable (i.e. consistent) features across solutions. Examples of the use of this analysis technique with both simulated and empirical MEG data are presented.
Czech Academy of Sciences Publication Activity Database
Koldovský, Zbyněk; Tichavský, Petr; Málek, J.
Vol. 6365. Heidelberg : Springer-Verlag, 2010 - (Gavrilova, M.; Kumar, V.; Mun, Y.; Tan, C.; Gervasi, O.), s. 17-24 ISBN 978-3-642-15994-7. [Latent Variable Analysis and Signal Separation. St. Malo (FR), 27.09.2010-30.09.2010] R&D Projects: GA MŠk 1M0572; GA ČR GA102/09/1278 Institutional research plan: CEZ:AV0Z10750506 Keywords : blind source separation * audio * convolutive mixture Subject RIV: BB - Applied Statistics, Operational Research http://library.utia.cas.cz/separaty/2010/SI/tichavsky-time-domain blind audio source separation method producing separating filters of generalized feedforward structure.pdf
Hosseini-zad, K.; Stähler, S. C.; Sigloch, K.; Scheingraber, C.
2012-04-01
Seismic tomography has made great progress in the last decade. This has been due to improvements in the method, which allow the high information content of waveform modeling to be combined with the mathematically sound methods of tomographic inversion. The second factor is the vast growth of digitally available broadband seismograms. Both factors together require efficient processing schemes for seismic waveforms, which reduce the necessary manual interaction to a minimum. Since the data growth has mainly taken place in traditionally well-instrumented regions, many areas are still sparsely instrumented, so the processing scheme should treat all data with the highest care. Our processing scheme "No data left behind", which is implemented in Python and incorporated into the seismology package ObsPy, automates the steps for global or regional body wave tomography: 1. Data retrieval: downloading of event-based seismic waveforms from ORFEUS and IRIS. This way around 1600 stations are available globally. Data from other sources can be added manually. 2. Preprocessing: deconvolution of instrument responses, recognition of bad recordings, and automated correction where possible. No rejection is done at this stage. 3. Cutting of time windows around body wave phases, decomposition of the signals into 6 frequency bands (20 s to 1 Hz), and individual determination of SNR and similarity to synthetic waveforms. 4. Rejection of bad windows. Since the rejection is done based on SNR or CC with synthetics independently for each of the 6 frequency bands, even very noisy stations like ocean islands are not discarded completely. 5. Bayesian source inversion: the source parameters, including depth, CMT, and source time function, are determined in a probabilistic way using a wavelet basis and P- and SH-waveforms. The whole algorithm is modular, and additional modules (e.g. for OBS preprocessing) can be selected individually.
Separation of core and crustal magnetic field sources
Shure, L.; Parker, R. L.; Langel, R. A.
1985-01-01
Fluid motions in the electrically conducting core and magnetized crustal rocks are the two major sources of the magnetic field observed on or slightly above the Earth's surface. The exact separation of these two contributions is not possible without imposing a priori assumptions about the internal source distribution. Nonetheless, models of this kind have been developed for hundreds of years. Gauss' method, a least-squares analysis with a truncated spherical harmonic expansion, was the method of choice for more than 100 years, although Gauss himself did not address the separation of core and crustal sources, but rather internal versus external ones. Using some arbitrary criterion for the appropriate truncation level, we now extrapolate core field models downward through the (approximately) insulating mantle. Unfortunately, our view can change dramatically depending on the degree of truncation used to describe core sources.
Sparsity and Morphological Diversity in Blind Source Separation
Bobin, Jérome; Starck, Jean-Luc; Fadili, Jalal M.; Moudden, Yassir
2007-01-01
Over the last few years, the development of multichannel sensors motivated interest in methods for the coherent processing of multivariate data. Some specific issues have already been addressed as testified by the wide literature on the so-called blind source separation (BSS) problem. In this context, as clearly emphasized by previous work, it is fundamental that the sources to be retrieved present some quantitatively measurable diversity. Recently, sparsity and morphological diversity have e...
Underdetermined Blind Audio Source Separation Using Modal Decomposition
Abdeldjalil Aïssa-El-Bey; Karim Abed-Meraim; Yves Grenier
2007-01-01
This paper introduces new algorithms for the blind separation of audio sources using modal decomposition. Indeed, audio signals and, in particular, musical signals can be well approximated by a sum of damped sinusoidal (modal) components. Based on this representation, we propose a two-step approach consisting of a signal analysis (extraction of the modal components) followed by a signal synthesis (grouping of the components belonging to the same source) using vector clustering. For the signa...
Jutten, Christian; Karhunen, Juha
2004-10-01
In this paper, we review recent advances in blind source separation (BSS) and independent component analysis (ICA) for nonlinear mixing models. After a general introduction to BSS and ICA, we discuss in more detail uniqueness and separability issues, presenting some new results. A fundamental difficulty in the nonlinear BSS problem and even more so in the nonlinear ICA problem is that they provide non-unique solutions without extra constraints, which are often implemented by using a suitable regularization. In this paper, we explore two possible approaches. The first one is based on structural constraints. Especially, post-nonlinear mixtures are an important special case, where a nonlinearity is applied to linear mixtures. For such mixtures, the ambiguities are essentially the same as for the linear ICA or BSS problems. The second approach uses Bayesian inference methods for estimating the best statistical parameters, under almost unconstrained models in which priors can be easily added. In the later part of this paper, various separation techniques proposed for post-nonlinear mixtures and general nonlinear mixtures are reviewed. PMID:15593377
Underdetermined Blind Audio Source Separation Using Modal Decomposition
Directory of Open Access Journals (Sweden)
Aïssa-El-Bey Abdeldjalil
2007-01-01
This paper introduces new algorithms for the blind separation of audio sources using modal decomposition. Indeed, audio signals and, in particular, musical signals can be well approximated by a sum of damped sinusoidal (modal) components. Based on this representation, we propose a two-step approach consisting of a signal analysis (extraction of the modal components) followed by a signal synthesis (grouping of the components belonging to the same source) using vector clustering. For the signal analysis, two existing algorithms are considered and compared: namely, the EMD (empirical mode decomposition) algorithm and a parametric estimation algorithm using the ESPRIT technique. A major advantage of the proposed method resides in its validity for both instantaneous and convolutive mixtures and its ability to separate more sources than sensors. Simulation results are given to compare and assess the performance of the proposed algorithms.
Underdetermined Blind Audio Source Separation Using Modal Decomposition
Directory of Open Access Journals (Sweden)
Abdeldjalil Aïssa-El-Bey
2007-03-01
This paper introduces new algorithms for the blind separation of audio sources using modal decomposition. Indeed, audio signals and, in particular, musical signals can be well approximated by a sum of damped sinusoidal (modal) components. Based on this representation, we propose a two-step approach consisting of a signal analysis (extraction of the modal components) followed by a signal synthesis (grouping of the components belonging to the same source) using vector clustering. For the signal analysis, two existing algorithms are considered and compared: namely, the EMD (empirical mode decomposition) algorithm and a parametric estimation algorithm using the ESPRIT technique. A major advantage of the proposed method resides in its validity for both instantaneous and convolutive mixtures and its ability to separate more sources than sensors. Simulation results are given to compare and assess the performance of the proposed algorithms.
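The synthesis step described above, grouping modal components into sources by clustering, can be sketched with a toy 1-D k-means on component frequencies. The component parameters are invented, and real systems cluster richer feature vectors than frequency alone:

```python
# Toy modal-decomposition synthesis step: damped-sinusoid components,
# described here by (frequency Hz, damping) pairs, are grouped into two
# sources by a simple 1-D k-means on frequency, then each source is
# resynthesised by summing its components.
import math

comps = [(438.0, 2.1), (441.5, 1.9), (444.0, 2.3),   # "source 0" modes
         (876.0, 3.0), (882.0, 2.8), (879.5, 3.2)]   # "source 1" modes

def kmeans_1d(vals, c0, c1, iters=20):
    for _ in range(iters):
        g0 = [v for v in vals if abs(v - c0) <= abs(v - c1)]
        g1 = [v for v in vals if abs(v - c0) > abs(v - c1)]
        c0, c1 = sum(g0) / len(g0), sum(g1) / len(g1)
    return c0, c1

freqs = [f for f, _ in comps]
c0, c1 = kmeans_1d(freqs, min(freqs), max(freqs))
labels = [0 if abs(f - c0) <= abs(f - c1) else 1 for f in freqs]

def mode(f, d, t):                       # one damped sinusoidal component
    return math.exp(-d * t) * math.cos(2 * math.pi * f * t)

t = 0.001                                # sample source 0 at one instant
source0 = sum(mode(f, d, t) for (f, d), lab in zip(comps, labels) if lab == 0)
```

In the papers above the analysis step (EMD or ESPRIT) estimates such modal parameters from the mixture itself; here they are given, to isolate the clustering idea.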
How Many Separable Sources? Model Selection In Independent Components Analysis
DEFF Research Database (Denmark)
Woods, Roger P.; Hansen, Lars Kai; Strother, Stephen
2015-01-01
Unlike mixtures consisting solely of non-Gaussian sources, mixtures including two or more Gaussian components cannot be separated using standard independent components analysis methods that are based on higher order statistics and independent observations. The mixed Independent Components Analysis...... computationally intensive alternative for model selection. Application of the algorithm is illustrated using Fisher’s iris data set and Howells’ craniometric data set. Mixed ICA/PCA is of potential interest in any field of scientific investigation where the authenticity of blindly separated non-Gaussian sources...... might otherwise be questionable. Failure of the Akaike Information Criterion in model selection also has relevance in traditional independent components analysis where all sources are assumed non-Gaussian....
Troldborg, M.; Nowak, W.; Binning, P. J.; Bjerg, P. L.
2012-12-01
Estimates of mass discharge (mass/time) are increasingly being used when assessing risks of groundwater contamination and designing remedial systems at contaminated sites. Mass discharge estimates are, however, prone to rather large uncertainties as they integrate uncertain spatial distributions of both concentration and groundwater flow velocities. For risk assessments or any other decisions that are being based on mass discharge estimates, it is essential to address these uncertainties. We present a novel Bayesian geostatistical approach for quantifying the uncertainty of the mass discharge across a multilevel control plane. The method decouples the flow and transport simulation and has the advantage of avoiding the heavy computational burden of three-dimensional numerical flow and transport simulation coupled with geostatistical inversion. It may therefore be of practical relevance to practitioners compared to existing methods that are either too simple or computationally demanding. The method is based on conditional geostatistical simulation and accounts for i) heterogeneity of both the flow field and the concentration distribution through Bayesian geostatistics (including the uncertainty in covariance functions), ii) measurement uncertainty, and iii) uncertain source zone geometry and transport parameters. The method generates multiple equally likely realizations of the spatial flow and concentration distribution, which all honour the measured data at the control plane. The flow realizations are generated by analytical co-simulation of the hydraulic conductivity and the hydraulic gradient across the control plane. These realizations are made consistent with measurements of both hydraulic conductivity and head at the site. An analytical macro-dispersive transport solution is employed to simulate the mean concentration distribution across the control plane, and a geostatistical model of the Box-Cox transformed concentration data is used to simulate observed
International Nuclear Information System (INIS)
A well-tested and validated Gibbs sampling code, that performs component separation and cosmic microwave background (CMB) power spectrum estimation, was applied to the WMAP five-year data. Using a simple model consisting of CMB, noise, monopoles, and dipoles, a 'per pixel' low-frequency power-law (fitting for both amplitude and spectral index), and a thermal dust template with a fixed spectral index, we found that the low-l (l < 50) CMB power spectrum is in good agreement with the published WMAP5 results. Residual monopoles and dipoles were found to be small (≲3 μK) or negligible in the five-year data. We comprehensively tested the assumptions that were made about the foregrounds (e.g., dust spectral index, power-law spectral index prior, templates), and found that the CMB power spectrum was insensitive to these choices. We confirm the asymmetry of power between the north and south ecliptic hemispheres, which appears to be robust against foreground modeling. The map of low-frequency spectral indices indicates a steeper spectrum on average (β = -2.97 ± 0.21) relative to those found at low (∼GHz) frequencies.
Jank, Anna; Müller, Wolfgang; Schneider, Irene; Gerke, Frederic; Bockreis, Anke
2015-05-01
An efficient biological treatment of source-separated organic waste from household kitchens and gardens (biowaste) requires an adequate upfront mechanical preparation, possibly including hand sorting for the separation of contaminants. In this work, untreated biowaste from households and gardens and the screen overflow >60 mm of the same waste were mechanically treated by a Waste Separation Press (WSP). The WSP separates the waste into a wet fraction for biological treatment and a fraction of dry contaminants for incineration. The results show that it is possible to replace the hand sorting of contaminants, the milling, and the screening of organic waste before biological treatment by using the WSP. A special focus was put on contaminant separation. The separation rate of plastic film from the untreated biowaste was 67%, and the separation rate of glass was about 92%. About 90% of the organics were transferred to the fraction for further biological treatment. When treating the screen overflow >60 mm with the WSP, 86% of the plastic film and 88% of the glass were transferred to the contaminants fraction. 32% of the organics were transferred to the contaminants fraction and thereby lost to further biological treatment. Additionally, it was calculated that national standards for glass contaminants in compost can be met when using the WSP to mechanically treat the total biowaste. The loss of biogas caused by transferring biodegradable organics to the contaminants fraction was about 11% when preparing the untreated biowaste with the WSP. PMID:25761398
FREQUENCY OVERLAPPED SIGNAL IDENTIFICATION USING BLIND SOURCE SEPARATION
Institute of Scientific and Technical Information of China (English)
WANG Junfeng; SHI Tielin; HE Lingsong; YANG Shuzi
2006-01-01
The concepts, principles, and usage of principal component analysis (PCA) and independent component analysis (ICA) are reviewed. The algorithm and methodology of ICA-based blind source separation (BSS), in which PCA-based pre-whitening of the observed signals is used, are then investigated. For mixture signals whose frequency components overlap one another, a simulation separating this type of mixture using the BSS theory and approach has been carried out. The result shows that BSS has advantages that the traditional methodology of frequency analysis lacks.
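The PCA-based pre-whitening step mentioned above can be sketched directly: estimate the covariance of the observed mixtures, rotate into its eigenbasis, and scale each direction to unit variance. The signals and the mixing matrix below are invented:

```python
# PCA pre-whitening of two mixed signals: after whitening, the outputs are
# uncorrelated with unit variance, which is the standard preprocessing
# before an ICA stage in BSS pipelines.
import math
import random

random.seed(7)
n = 4000
s1 = [random.uniform(-1, 1) for _ in range(n)]      # sub-Gaussian source
s2 = [math.sin(0.07 * k) for k in range(n)]         # deterministic source
x1 = [0.8 * a + 0.6 * b for a, b in zip(s1, s2)]    # observed mixtures
x2 = [0.3 * a - 0.7 * b for a, b in zip(s1, s2)]

def cov(u, v):
    mu, mv = sum(u) / n, sum(v) / n
    return sum((a - mu) * (b - mv) for a, b in zip(u, v)) / n

a, b, c = cov(x1, x1), cov(x1, x2), cov(x2, x2)
# Eigendecomposition of the 2x2 covariance [[a, b], [b, c]]:
disc = math.sqrt((a - c) ** 2 + 4 * b * b)
l1, l2 = (a + c + disc) / 2, (a + c - disc) / 2     # eigenvalues
theta = 0.5 * math.atan2(2 * b, a - c)              # eigenbasis rotation
ct, st = math.cos(theta), math.sin(theta)

# Whitened signals: rotate into the eigenbasis, divide by sqrt(eigenvalue).
z1 = [(ct * p + st * q) / math.sqrt(l1) for p, q in zip(x1, x2)]
z2 = [(-st * p + ct * q) / math.sqrt(l2) for p, q in zip(x1, x2)]
```

After this step z1 and z2 have identity covariance, so the remaining ICA stage only has to find a rotation, which is what makes pre-whitening worthwhile.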
Source separation and clustering of phase-locked subspaces.
Almeida, Miguel; Schleimer, Jan-Hendrik; Bioucas-Dias, José Mario; Vigário, Ricardo
2011-09-01
It has been proven that there are synchrony (or phase-locking) phenomena present in multiple oscillating systems such as electrical circuits, lasers, chemical reactions, and human neurons. If the measurements of these systems cannot detect the individual oscillators but rather a superposition of them, as in brain electrophysiological signals (electro- and magnetoencephalogram), spurious phase locking will be detected. Current source-extraction techniques attempt to undo this superposition by assuming properties of the data, which are not valid when the underlying sources are phase-locked. Statistical independence of the sources is one such invalid assumption, as phase-locked sources are dependent. In this paper, we introduce methods for source separation and clustering which make adequate assumptions for data where synchrony is present, and show with simulated data that they perform well even in cases where independent component analysis and other well-known source-separation methods fail. The results in this paper provide a proof of concept that synchrony-based techniques are useful for low-noise applications. PMID:21791409
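The synchrony such methods exploit is commonly quantified by the phase-locking value (PLV): the magnitude of the mean unit phasor of the phase difference between two oscillators. Perfectly locked oscillators give a PLV of 1; independent phases give a value near 0. A small self-contained check with invented phase sequences:

```python
# Phase-locking value between two phase time series: |mean(exp(i*(a - b)))|.
import cmath
import math
import random

random.seed(2)
n = 5000
phi1 = [0.05 * k for k in range(n)]                        # oscillator 1 phase
locked = [p + 0.8 + random.gauss(0, 0.05) for p in phi1]   # locked, small jitter
unlocked = [random.uniform(0, 2 * math.pi) for _ in range(n)]

def plv(pa, pb):
    return abs(sum(cmath.exp(1j * (a - b)) for a, b in zip(pa, pb)) / len(pa))

plv_locked = plv(phi1, locked)       # close to 1: constant lag, small jitter
plv_unlocked = plv(phi1, unlocked)   # close to 0: no phase relationship
```

Note the locked pair has a constant phase lag of 0.8 rad yet still gives a PLV near 1; synchrony concerns the stability of the phase difference, not its value.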
A source separation approach to enhancing marine mammal vocalizations.
Gur, M Berke; Niezrecki, Christopher
2009-12-01
A common problem in passive acoustic based marine mammal monitoring is the contamination of vocalizations by a noise source, such as a surface vessel. The conventional approach in improving the vocalization signal to noise ratio (SNR) is to suppress the unwanted noise sources by beamforming the measurements made using an array. In this paper, an alternative approach to multi-channel underwater signal enhancement is proposed. Specifically, a blind source separation algorithm that extracts the vocalization signal from two-channel noisy measurements is derived and implemented. The proposed algorithm uses a robust decorrelation criterion to separate the vocalization from background noise, and hence is suitable for low SNR measurements. To overcome the convergence limitations resulting from temporally correlated recordings, the supervised affine projection filter update rule is adapted to the unsupervised source separation framework. The proposed method is evaluated using real West Indian manatee (Trichechus manatus latirostris) vocalizations and watercraft emitted noise measurements made within a typical manatee habitat in Florida. The results suggest that the proposed algorithm can improve the detection range of a passive acoustic detector five times on average (for input SNR between -10 and 5 dB) using only two receivers. PMID:20000920
An autonomous surveillance system for blind sources localization and separation
Wu, Sean; Kulkarni, Raghavendra; Duraiswamy, Srikanth
2013-05-01
This paper aims at developing a new technology that will enable one to conduct autonomous and silent surveillance to monitor sound sources, stationary or moving, in 3D space, and a blind separation of target acoustic signals. The underlying principle of this technology is a hybrid approach that uses: 1) a passive sonic detection and ranging method that consists of iterative triangulation and redundant checking to locate the Cartesian coordinates of arbitrary sound sources in 3D space, 2) advanced signal processing to sanitize the measured data and enhance the signal-to-noise ratio, and 3) short-time source localization and separation to extract the target acoustic signals from the directly measured mixed ones. A prototype based on this technology has been developed, and its hardware includes six B and K 1/4-in condenser microphones, Type 4935, two 4-channel data acquisition units, Type NI-9234, with a maximum sampling rate of 51.2 kS/s per channel, one NI cDAQ-9174 chassis, a thermometer to measure the air temperature, a camera to view the relative positions of located sources, and a laptop to control data acquisition and post-processing. Test results for locating arbitrary sound sources emitting continuous, random, impulsive, and transient signals, and for blind separation of signals in various non-ideal environments, are presented. This system is invisible to any anti-surveillance device since it uses the acoustic signal emitted by a target source. It can be mounted on a robot or an unmanned vehicle to perform various covert operations, including intelligence gathering in an open or a confined field, or to carry out rescue missions to search for people trapped inside ruins or buried under wreckage.
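The passive ranging idea above can be sketched with a toy 2-D version: time differences of arrival (TDOA) at known microphone positions constrain the source location, recovered here by a brute-force grid search. The geometry, sound speed, and noise levels are invented, and the real system iterates the triangulation with redundancy checks in 3D:

```python
# Toy TDOA source localization: pick the grid point whose predicted TDOAs
# (relative to microphone 0) best match the "measured" ones.
import math
import random

random.seed(5)
c = 343.0                                    # speed of sound, m/s
mics = [(0, 0), (10, 0), (0, 10), (10, 10)]  # microphone positions, m
true_src = (3.0, 7.0)                        # source to be recovered

def dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

# Measured TDOAs relative to mic 0, with a little timing noise.
tdoa = [(dist(true_src, m) - dist(true_src, mics[0])) / c
        + random.gauss(0, 1e-5) for m in mics]

def mismatch(p):
    pred = [(dist(p, m) - dist(p, mics[0])) / c for m in mics]
    return sum((a - b) ** 2 for a, b in zip(pred, tdoa))

# Grid search over a 10 m x 10 m area at 0.1 m resolution.
best = min(((x * 0.1, y * 0.1) for x in range(101) for y in range(101)),
           key=mismatch)
```

The measured cross-channel delays that feed `tdoa` would come from cross-correlating microphone pairs; with four receivers the problem is overdetermined, which is what makes the redundant checking in the paper possible.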
Evaluating source separation of plastic waste using conjoint analysis.
Nakatani, Jun; Aramaki, Toshiya; Hanaki, Keisuke
2008-11-01
Using conjoint analysis, we estimated the willingness to pay (WTP) of households for source separation of plastic waste and the improvement of related environmental impacts, the residents' loss of life expectancy (LLE), the landfill capacity, and the CO2 emissions. Unreliable respondents were identified and removed from the sample based on their answers to follow-up questions. It was found that the utility associated with reducing LLE and with the landfill capacity were both well expressed by logarithmic functions, but that residents were indifferent to the level of CO2 emissions even though they approved of CO2 reduction. In addition, residents derived utility from the act of separating plastic waste, irrespective of its environmental impacts; that is, they were willing to practice the separation of plastic waste at home in anticipation of its "invisible effects", such as the improvement of citizens' attitudes toward solid waste issues. PMID:18207727
Blind source separation advances in theory, algorithms and applications
Wang, Wenwu
2014-01-01
Blind Source Separation intends to report the new results of the efforts on the study of Blind Source Separation (BSS). The book collects novel research ideas and some training in BSS, independent component analysis (ICA), artificial intelligence and signal processing applications. Furthermore, the research results previously scattered in many journals and conferences worldwide are methodically edited and presented in a unified form. The book is likely to be of interest to university researchers, R&D engineers and graduate students in computer science and electronics who wish to learn the core principles, methods, algorithms, and applications of BSS. Dr. Ganesh R. Naik works at University of Technology, Sydney, Australia; Dr. Wenwu Wang works at University of Surrey, UK.
Blind source separation for robot audition using fixed HRTF beamforming
Maazaoui, Mounira; Abed-Meraim, Karim; Grenier, Yves
2012-12-01
In this article, we present a two-stage blind source separation (BSS) algorithm for robot audition. The first stage consists of a fixed beamforming preprocessing step to reduce reverberation and environmental noise. Since we are in a robot audition context, the manifold of the sensor array is hard to model due to the presence of the head of the robot, so we use pre-measured head-related transfer functions (HRTFs) to estimate the beamforming filters. Using the HRTFs to estimate the beamformers makes it possible to capture the effect of the head on the manifold of the microphone array. The second stage is a BSS algorithm based on a sparsity criterion, namely the minimization of the l1 norm of the sources. We present different configurations of our algorithm and show that it yields promising results and that the fixed beamforming preprocessing improves the separation results.
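The benefit of a fixed beamforming stage can be illustrated with the simplest fixed beamformer, delay-and-sum on a multi-microphone array, as a stand-in for the paper's HRTF-based filters; the array size, signal, and noise level below are invented:

```python
import numpy as np

# Toy delay-and-sum beamformer: M microphones observe the same broadside
# source plus independent noise, so the steering delays are all zero and
# the beamformer reduces to an average across channels.
rng = np.random.default_rng(4)
fs, n_mics = 16000, 6
t = np.arange(2048) / fs
target = np.sin(2 * np.pi * 500 * t)          # source at broadside

X = np.stack([target + 0.5 * rng.normal(size=t.size) for _ in range(n_mics)])
y = X.mean(axis=0)                            # delay(0)-and-sum output

snr_in = np.var(target) / np.var(X[0] - target)
snr_out = np.var(target) / np.var(y - target)
print(snr_out / snr_in)  # ~6: array gain equals the number of microphones
```

A head-mounted array breaks the free-field geometry this sketch assumes, which is exactly why the paper replaces the analytic steering delays with measured HRTFs.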
Energy Technology Data Exchange (ETDEWEB)
Blidholm, O.; Wiklund, S.E. [AaF-Energikonsult (Sweden); Bauer, A.C. [Energikonsult A. Bauer (Sweden)
1997-02-01
The basic idea of this project is to study the possibilities of using source-separated combustible material for energy conversion in conventional solid fuel boilers (i.e. not municipal waste incineration plants). The project has been carried out in three phases. During phases 1 and 2, a number of fuel analyses of different fractions were carried out. During phase 3, two combustion tests were carried out: (1) a grate-fired boiler equipped with a cyclone, an electrostatic precipitator and a flue gas condenser, and (2) a bubbling fluidized bed boiler with an electrostatic precipitator and a flue gas condenser. During the tests, source-separated paper and plastic packaging was co-fired with biomass fuels; the mixing rate of packaging was approximately 15%. This study reports the results of phase 3 and the conclusions of the whole project. The technical conditions for using packaging as fuel are good: the technique is available for shredding both paper and plastic packaging, and the material can be co-fired with biomass. The economic conditions for using source-separated packaging for energy conversion can be very advantageous, but can also present obstacles; the outcome is determined to a high degree by how the fuel is collected, transported, reduced in size and handled at the combustion plant. The results of the combustion tests show that the environmental conditions for using source-separated packaging for energy conversion are good. The emissions of heavy metals into the atmosphere are very low, well below the emission standards for waste incineration plants. 35 figs, 13 tabs, 8 appendices
Meillier, Céline; Chatelain, Florent; Michel, Olivier; Bacon, Roland; Piqueras, Laure; Bacher, Raphael; Ayasso, Hacheme
2016-04-01
We present SELFI, the Source Emission Line FInder, a new Bayesian method optimized for the detection of faint galaxies in Multi Unit Spectroscopic Explorer (MUSE) deep fields. MUSE is the new panoramic integral field spectrograph at the Very Large Telescope (VLT) that has unique capabilities for spectroscopic investigation of the deep sky. It has provided data cubes with 324 million voxels over a single 1 arcmin2 field of view. To address the challenge of faint-galaxy detection in these large data cubes, we developed a new method that processes 3D data either for modeling or for estimation and extraction of source configurations. This object-based approach yields a natural sparse representation of the sources in massive data fields, such as MUSE data cubes. In the Bayesian framework, the parameters that describe the observed sources are considered random variables. The Bayesian model leads to a general and robust algorithm where the parameters are estimated in a fully data-driven way. This detection algorithm was applied to the MUSE observation of Hubble Deep Field-South. With 27 h total integration time, these observations provide a catalog of 189 sources of various categories with secure redshifts. The algorithm retrieved 91% of the galaxies with only 9% false detections. This method also allowed the discovery of three new Lyα emitters and one [OII] emitter, all without any Hubble Space Telescope counterpart. We analyzed the reasons for failure for some targets, and found that the most important limitation of the method arises when faint sources are located in the vicinity of bright spatially resolved galaxies that cannot be approximated by the Sérsic elliptical profile. The software and its documentation are available on the MUSE science web service (muse-vlt.eu/science).
International Nuclear Information System (INIS)
Acoustic emission (AE) is a well-established nondestructive testing method for assessing the condition of liquid-filled tanks. Often the tank can be tested without the need for accurate location of AE sources. But sometimes accurate location is required, such as in follow-up inspections after AE has indicated a significant defect. Traditional computed location techniques that consider only the wave traveling through the shell of the tank have not proved reliable when applied to liquid-filled tanks. This is because AE sensors often respond to liquid-borne waves, which are not considered in the traditional algorithms. This paper describes an approach for locating AE sources on the wall of liquid-filled tanks that includes two novel aspects: (i) the use of liquid-borne waves, and (ii) the use of a probabilistic algorithm. The proposed algorithm is developed within a Bayesian framework that considers uncertainties in the wave velocities and the times of arrival. A Markov chain Monte Carlo method is used to estimate the distribution of the AE source location. The approach was applied to a 102 inch diameter (29 000 gal) railroad tank car by estimating source locations from pencil-lead breaks with recorded waveforms. Results show that the proposed Bayesian approach to source location can be used to calculate the most probable region of the tank wall where the AE source is located. (paper)
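The record's combination of a Bayesian posterior over source position and wave speed with Markov chain Monte Carlo can be sketched in one dimension. The geometry, priors, and noise levels below are invented for illustration, not taken from the paper:

```python
import math, random

# Toy 1-D Metropolis sampler: locate an AE source on a line between two
# sensors from their arrival-time difference, with an uncertain wave speed.
random.seed(0)
L = 10.0                      # sensor spacing, m (invented)
v_mean, v_sd = 3.0e3, 1.0e2   # wave-speed prior, m/s (invented)
sigma_t = 2e-5                # arrival-time noise sd, s (invented)
true_x = 4.0
dt_obs = (true_x - (L - true_x)) / v_mean   # simulated time difference

def log_post(x, v):
    # flat prior on x in (0, L), Gaussian prior on v, Gaussian likelihood
    if not 0.0 < x < L or v <= 0.0:
        return -math.inf
    dt = (x - (L - x)) / v
    return (-0.5 * ((dt_obs - dt) / sigma_t) ** 2
            - 0.5 * ((v - v_mean) / v_sd) ** 2)

x, v = 5.0, v_mean
samples = []
for i in range(20000):        # random-walk Metropolis over (x, v)
    xp, vp = x + random.gauss(0.0, 0.2), v + random.gauss(0.0, 20.0)
    if math.log(random.random()) < log_post(xp, vp) - log_post(x, v):
        x, v = xp, vp
    if i >= 5000:             # discard burn-in
        samples.append(x)

posterior_mean = sum(samples) / len(samples)
print(round(posterior_mean, 1))  # close to the true position, 4.0
```

The spread of `samples` plays the role of the "most probable region" of the tank wall in the paper, with the velocity uncertainty widening it.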
Defining the force between separated sources on a light front
International Nuclear Information System (INIS)
The Newtonian character of gauge theories on a light front requires that the longitudinal momentum P+, which plays the role of Newtonian mass, be conserved. This requirement conflicts with the standard definition of the force between two sources in terms of the minimal energy of quantum gauge fields in the presence of a quark and anti-quark pinned to points separated by a distance R. We propose that, on a light front, the force be defined by minimizing the energy of gauge fields in the presence of a quark and an anti-quark pinned to lines (1-branes) oriented in the longitudinal direction singled out by the light front and separated by a transverse distance R. Such sources will have a limited 1+1 dimensional dynamics. We study this proposal for weak coupling gauge theories by showing how it leads to the Coulomb force law. For QCD we also show how asymptotic freedom emerges by evaluating the S matrix through one loop for the scattering of a particle in the Nc representation of color SU(Nc) on a 1-brane by a particle in the bar Nc representation of color on a parallel 1-brane separated from the first by a distance RQCD. Potential applications to the problem of confinement on a light front are discussed. copyright 1999 The American Physical Society
International Nuclear Information System (INIS)
To identify different NO3− sources in surface water and to estimate their proportional contributions to the nitrate mixture, a dual isotope approach and a Bayesian isotope mixing model were applied to six different surface waters affected by agriculture, greenhouses in an agricultural area, and households. Annual mean δ15N–NO3− values were between 8.0 and 19.4‰, while annual mean δ18O–NO3− values ranged from 4.5 to 30.7‰. SIAR was used to estimate the proportional contributions of five potential NO3− sources (NO3− in precipitation, NO3− fertilizer, NH4+ in fertilizer and rain, soil N, and manure and sewage). SIAR showed that "manure and sewage" contributed the most, "soil N", "NO3− fertilizer" and "NH4+ in fertilizer and rain" contributed intermediate amounts, and "NO3− in precipitation" contributed the least. The SIAR output can be considered a "fingerprint" of the NO3− source contributions. However, the wide range of isotope values observed in surface water and in the NO3− sources limits its applicability. - Highlights: ► The dual isotope approach (δ15N- and δ18O–NO3−) identifies dominant nitrate sources in 6 surface waters. ► The SIAR model estimates proportional contributions for 5 nitrate sources. ► SIAR is a reliable approach to assess temporal and spatial variations of different NO3− sources. ► The wide range of isotope values observed in surface water and in the nitrate sources limits its applicability. - This paper successfully applied a dual isotope approach and a Bayesian isotopic mixing model to identify and quantify 5 potential nitrate sources in surface water.
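The proportional-contribution estimates that SIAR produces can be mimicked on a toy scale with importance sampling over the simplex of source proportions. The two-isotope source signatures, the observed mixture, and the error model below are all invented for illustration, and plain importance sampling stands in for SIAR's MCMC:

```python
import math, random

random.seed(1)
sources = [(0.0, 20.0), (8.0, 5.0), (15.0, 2.0)]   # (δ15N, δ18O) per source
obs = (9.0, 6.0)                                   # observed mixture values
sigma = 1.0                                        # measurement sd, per mille

def log_lik(p):
    # Gaussian likelihood of the observed dual-isotope mixture given
    # proportions p, with the mixture signature as a convex combination
    mix = [sum(pi * s[k] for pi, s in zip(p, sources)) for k in (0, 1)]
    return -0.5 * sum(((o - m) / sigma) ** 2 for o, m in zip(obs, mix))

def rand_simplex():
    # a flat Dirichlet(1,1,1) draw over the three source proportions
    w = [random.expovariate(1.0) for _ in sources]
    t = sum(w)
    return [x / t for x in w]

draws = [rand_simplex() for _ in range(20000)]
wts = [math.exp(log_lik(p)) for p in draws]
Z = sum(wts)
post_mean = [sum(w * p[i] for w, p in zip(wts, draws)) / Z for i in range(3)]
print([round(m, 2) for m in post_mean])  # posterior mean proportions, sum 1
```

As in SIAR, the output is a full posterior over proportions rather than a point estimate, which is what allows "fingerprint"-style statements about dominant sources.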
Blind Separation of Piecewise Stationary NonGaussian Sources
Czech Academy of Sciences Publication Activity Database
Koldovský, Zbyněk; Málek, J.; Tichavský, Petr; Deville, Y.; Hosseini, S.
2009-01-01
Roč. 89, č. 12 (2009), s. 2570-2584. ISSN 0165-1684 R&D Projects: GA MŠk 1M0572; GA ČR GA102/09/1278 Grant ostatní: GA ČR(CZ) GA102/07/P384 Institutional research plan: CEZ:AV0Z10750506 Keywords : Independent component analysis * blind source separation * Cramer-Rao lower bound Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 1.135, year: 2009 http://library.utia.cas.cz/separaty/2009/SI/tichavsky-blind separationofpiecewisestationarynon-gaussiansources.pdf
DEFF Research Database (Denmark)
Pires, Sara Monteiro; Hald, Tine
2010-01-01
Salmonella is a major cause of human gastroenteritis worldwide. To prioritize interventions and assess the effectiveness of efforts to reduce illness, it is important to attribute salmonellosis to the responsible sources. Studies have suggested that some Salmonella subtypes have a higher health...... impact than others. Likewise, some food sources appear to have a higher impact than others. Knowledge of variability in the impact of subtypes and sources may provide valuable added information for research, risk management, and public health strategies. We developed a Bayesian model that attributes...... disease. These differences presumably represent multiple factors, such as differences in survivability through the food chain and/or pathogenicity. The relative importance of the source-dependent factors varied considerably over the years, reflecting, among others, variability in the surveillance programs......
The Leuven isotope separator on-line laser ion source
International Nuclear Information System (INIS)
An element-selective laser ion source has been used to produce beams of exotic radioactive nuclei and to study their decay properties. The operational principle of the ion source is based on selective resonant laser ionization of nuclear reaction products thermalized and neutralized in a noble gas at high pressure. The ion source has been installed at the Leuven Isotope Separator On-Line (LISOL), which is coupled on-line to the cyclotron accelerator at Louvain-la-Neuve. 54,55Ni and 54,55Co isotopes were produced in light-ion-induced fusion reactions. Exotic nickel, cobalt and copper nuclei were produced in proton-induced fission of 238U. The β decay of the 68-74Ni, 67-70Co, 70-75Cu and 110-114Rh isotopes has been studied by means of β-γ and γ-γ spectroscopy. Recently, the laser ion source has been used to produce neutron-deficient rhodium and ruthenium isotopes (91-95Rh, 98Rh, 90,91Ru) near the N=Z line in heavy ion-induced fusion reactions
Compressive Source Separation: Theory and Methods for Hyperspectral Imaging
Golbabaee, Mohammad; Arberet, Simon; Vandergheynst, Pierre
2013-12-01
With the development of numerous high-resolution data acquisition systems and the global requirement to lower energy consumption, the development of efficient sensing techniques becomes critical. Recently, Compressed Sampling (CS) techniques, which exploit the sparsity of signals, have made it possible to reconstruct signals and images with fewer measurements than the traditional Nyquist sensing approach. However, multichannel signals like hyperspectral images (HSI) have additional structure, such as inter-channel correlations, that is not taken into account in the classical CS scheme. In this paper we exploit the linear mixture of sources model, that is, the assumption that the multichannel signal is composed of a linear combination of sources, each of them having its own spectral signature, and propose new sampling schemes exploiting this model to considerably decrease the number of measurements needed for acquisition and source separation. Moreover, we give theoretical lower bounds on the number of measurements required to reconstruct both the multichannel signal and its sources. We also propose optimization algorithms and report extensive experiments on our target application, HSI, showing that our approach recovers HSI with far fewer measurements and less computational effort than traditional CS approaches.
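The key structural assumption, that the multichannel data are a linear mixture of a few sources, can be shown to commute with random spatial projections, which is what makes separation in the compressed domain possible. The dimensions, signatures, and measurement matrix below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
n_pix, n_bands, k = 400, 32, 3
S = rng.random((n_pix, k))        # abundance maps (the sources), invented
A = rng.random((k, n_bands))      # spectral signatures, invented
X = S @ A                         # flattened hyperspectral cube, X = S A

m = 60                            # m << n_pix spatial measurements per band
Phi = rng.normal(size=(m, n_pix)) / np.sqrt(m)
Y = Phi @ X                       # compressed acquisition

# Mixing acts on the spectral axis, so it commutes with spatial projections:
# unmixing the compressed data recovers the *compressed* sources Phi @ S.
S_comp = Y @ np.linalg.pinv(A)
err = np.linalg.norm(S_comp - Phi @ S) / np.linalg.norm(Phi @ S)
print(err < 1e-8)  # True
```

Recovering the full-resolution sources from `S_comp` then requires the sparsity machinery and measurement bounds that the paper develops; this sketch only shows why far fewer than `n_pix * n_bands` measurements carry the mixture information.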
The Leuven isotope separator on-line laser ion source
Kudryavtsev, Y; Franchoo, S; Huyse, M; Gentens, J; Kruglov, K; Müller, W F; Prasad, N V S; Raabe, R; Reusen, I; Van den Bergh, P; Van Duppen, P; Van Roosbroeck, J; Vermeeren, L; Weissman, L
2002-01-01
An element-selective laser ion source has been used to produce beams of exotic radioactive nuclei and to study their decay properties. The operational principle of the ion source is based on selective resonant laser ionization of nuclear reaction products thermalized and neutralized in a noble gas at high pressure. The ion source has been installed at the Leuven Isotope Separator On-Line (LISOL), which is coupled on-line to the cyclotron accelerator at Louvain-la-Neuve. 54,55Ni and 54,55Co isotopes were produced in light-ion-induced fusion reactions. Exotic nickel, cobalt and copper nuclei were produced in proton-induced fission of 238U. The β decay of the 68-74Ni, 67-70Co, 70-75Cu and 110-114Rh isotopes has been studied by means of β-γ and γ-γ spectroscopy. Recently, the laser ion source has been used to produce neutron-d...
Bayesian mixture models for Poisson astronomical images
Guglielmetti, Fabrizia; Fischer, Rainer; Dose, Volker
2012-01-01
Astronomical images in the Poisson regime are typically characterized by a spatially varying cosmic background, large variety of source morphologies and intensities, data incompleteness, steep gradients in the data, and few photon counts per pixel. The Background-Source separation technique is developed with the aim to detect faint and extended sources in astronomical images characterized by Poisson statistics. The technique employs Bayesian mixture models to reliably detect the background as...
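The background/source separation idea for Poisson data can be reduced to a single-pixel toy: compare a background-only Poisson rate against a background-plus-source rate under a prior on source presence. The rates and prior below are invented, and real mixture models would estimate them from the image rather than fix them:

```python
import math

lam_bg, lam_src, prior_src = 2.0, 8.0, 0.1   # invented rates and prior

def pois(lam, n):
    # Poisson probability mass function
    return math.exp(-lam) * lam ** n / math.factorial(n)

def p_source(counts):
    # posterior probability that a pixel with `counts` photons hosts a source
    a = prior_src * pois(lam_bg + lam_src, counts)
    b = (1.0 - prior_src) * pois(lam_bg, counts)
    return a / (a + b)

print(round(p_source(12), 3), round(p_source(2), 3))  # bright vs faint pixel
```

The paper's mixture model does this jointly across all pixels with a spatially varying background, which is what lets it pick up faint extended sources that no single pixel would flag on its own.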
Exploiting Narrowband Efficiency for Broadband Convolutive Blind Source Separation
Directory of Open Access Journals (Sweden)
Aichner Robert
2007-01-01
Full Text Available Based on a recently presented generic broadband blind source separation (BSS algorithm for convolutive mixtures, we propose in this paper a novel algorithm combining advantages of broadband algorithms with the computational efficiency of narrowband techniques. By selective application of the Szegö theorem which relates properties of Toeplitz and circulant matrices, a new normalization is derived as a special case of the generic broadband algorithm. This results in a computationally efficient and fast converging algorithm without introducing typical narrowband problems such as the internal permutation problem or circularity effects. Moreover, a novel regularization method for the generic broadband algorithm is proposed and subsequently also derived for the proposed algorithm. Experimental results in realistic acoustic environments show improved performance of the novel algorithm compared to previous approximations.
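As background to the narrowband/broadband discussion, here is a minimal instantaneous-mixture BSS sketch (whitening followed by a kurtosis-based rotation search). It illustrates the generic BSS setting only; it is not the convolutive broadband algorithm of the paper, and the signals and mixing matrix are invented:

```python
import numpy as np

rng = np.random.default_rng(2)
T = 50000
S = rng.laplace(size=(2, T))               # independent super-Gaussian sources
A = np.array([[1.0, 0.6], [0.4, 1.0]])     # invented mixing matrix
X = A @ S

# Whitening: after this step only an unknown rotation remains.
d, E = np.linalg.eigh(np.cov(X))
Z = E @ np.diag(d ** -0.5) @ E.T @ X

def kurt(y):
    # excess kurtosis, a classic non-Gaussianity contrast
    return np.mean(y ** 4) / np.mean(y ** 2) ** 2 - 3.0

# Grid-search the rotation angle that maximises total non-Gaussianity.
best = max(np.linspace(0.0, np.pi / 2, 181),
           key=lambda a: kurt(np.cos(a) * Z[0] + np.sin(a) * Z[1]) ** 2
                       + kurt(np.cos(a) * Z[1] - np.sin(a) * Z[0]) ** 2)
R = np.array([[np.cos(best), np.sin(best)],
              [-np.sin(best), np.cos(best)]])
Y = R @ Z

# Each separated output should correlate with exactly one original source.
c = np.abs(np.corrcoef(np.vstack([Y, S]))[:2, 2:])
print(c.max(axis=1))  # both entries near 1
```

Convolutive mixtures as treated in the paper replace the scalar mixing matrix with filters, which is what introduces the narrowband permutation and circularity issues the proposed normalization avoids.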
Iterative compressive sampling for hyperspectral images via source separation
Kamdem Kuiteing, S.; Barni, Mauro
2014-03-01
Compressive Sensing (CS) is receiving increasing attention as a way to lower storage and compression requirements for on-board acquisition of remote-sensing images. In the case of multi- and hyperspectral images, however, exploiting the spectral correlation poses severe computational problems. Yet, exploiting such a correlation would provide significantly better performance in terms of reconstruction quality. In this paper, we build on a recently proposed 2D CS scheme based on blind source separation to develop a computationally simple, yet accurate, prediction-based scheme for acquisition and iterative reconstruction of hyperspectral images in a CS setting. Preliminary experiments carried out on different hyperspectral images show that our approach yields a dramatic reduction of computational time while ensuring reconstruction performance similar to those of much more complicated 3D reconstruction schemes.
Phan, Kevin; Xie, Ashleigh; Kumar, Narendra; Wong, Sophia; Medi, Caroline; La Meir, Mark; Yan, Tristan D
2015-08-01
Simplified maze procedures involving radiofrequency, cryoenergy and microwave energy sources have been increasingly utilized for the surgical treatment of atrial fibrillation (AF) as an alternative to the traditional cut-and-sew approach. In the absence of direct comparisons, a Bayesian network meta-analysis is an alternative way to assess the relative effect of different treatments, using indirect evidence. A Bayesian meta-analysis of indirect evidence was performed using 16 published randomized trials identified from 6 databases. Rank probability analysis was used to rank each intervention in terms of its probability of having the best outcome. Sinus rhythm prevalence beyond the 12-month follow-up was similar between the cut-and-sew, microwave and radiofrequency approaches, which were all ranked better than cryoablation (respectively, 39, 36, and 25 vs 1%). The cut-and-sew maze was ranked worst in terms of mortality outcomes compared with microwave, radiofrequency and cryoenergy (2 vs 19, 34, and 24%, respectively). The cut-and-sew maze procedure was associated with significantly lower stroke rates compared with microwave ablation [odds ratio <0.01; 95% confidence interval 0.00, 0.82], and ranked the best in terms of pacemaker requirements compared with microwave, radiofrequency and cryoenergy (81 vs 14, 1 and <0.01%, respectively). Bayesian rank probability analysis shows that the cut-and-sew approach is associated with the best outcomes in terms of sinus rhythm prevalence and stroke outcomes, and remains the gold standard approach for AF treatment. Given the limitations of indirect comparison analysis, these results should be viewed with caution and not over-interpreted. PMID:25391388
Blind source separation with unknown and dynamically changing number of source signals
Institute of Scientific and Technical Information of China (English)
YE Jimin; ZHANG Xianda; ZHU Xiaolong
2006-01-01
The contrast function remains an open problem in blind source separation (BSS) when the number of source signals is unknown and/or changes dynamically. This paper studies the problem and proves that the mutual information is still a contrast function for BSS if the mixing matrix is of full column rank. The mutual information reaches its minimum at the separation points, where the random outputs of the BSS system are the scaled and permuted source signals, while the remaining outputs are zero. Using the property that the transpose of the mixing matrix and a matrix composed of m observed signals have an identical null space with probability one, a practical method is proposed that can detect the unknown number of source signals n and then track dynamic changes in the number of sources using only a small amount of data. The effectiveness of the proposed theory and the developed algorithm is verified by adaptive BSS simulations with an unknown and dynamically changing number of source signals.
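The paper's premise, that the number of sources is detectable when the mixing matrix has full column rank, can be illustrated through the eigenvalues of the observation covariance (a simpler criterion than the null-space method the paper develops). The sizes, signals, and threshold below are invented:

```python
import numpy as np

rng = np.random.default_rng(0)
n_src, n_obs, T = 3, 6, 5000
S = rng.laplace(size=(n_src, T))                         # non-Gaussian sources
Amix = np.linalg.qr(rng.normal(size=(n_obs, n_src)))[0]  # full column rank
X = Amix @ S + 1e-3 * rng.normal(size=(n_obs, T))        # mixtures + noise

# With full-column-rank mixing, cov(X) has exactly n_src dominant
# eigenvalues; the remaining ones are at the (tiny) noise floor.
eigvals = np.linalg.eigvalsh(np.cov(X))[::-1]            # sorted descending
n_detected = int(np.sum(eigvals > 1e-2 * eigvals[0]))
print(n_detected)  # 3
```

Tracking a dynamically changing source count, as the paper does, amounts to repeating such a detection over short data windows as new observations arrive.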
Decentralized modal identification using sparse blind source separation
International Nuclear Information System (INIS)
Popular ambient vibration-based system identification methods process information collected from a dense array of sensors centrally to yield the modal properties. In such methods, the need for a centralized processing unit capable of satisfying large memory and processing demands is unavoidable. With the advent of wireless smart sensor networks, it is now possible to process information locally at the sensor level, instead. The information at the individual sensor level can then be concatenated to obtain the global structure characteristics. A novel decentralized algorithm based on wavelet transforms to infer global structure mode information using measurements obtained using a small group of sensors at a time is proposed in this paper. The focus of the paper is on algorithmic development, while the actual hardware and software implementation is not pursued here. The problem of identification is cast within the framework of under-determined blind source separation invoking transformations of measurements to the time–frequency domain resulting in a sparse representation. The partial mode shape coefficients so identified are then combined to yield complete modal information. The transformations are undertaken using stationary wavelet packet transform (SWPT), yielding a sparse representation in the wavelet domain. Principal component analysis (PCA) is then performed on the resulting wavelet coefficients, yielding the partial mixing matrix coefficients from a few measurement channels at a time. This process is repeated using measurements obtained from multiple sensor groups, and the results so obtained from each group are concatenated to obtain the global modal characteristics of the structure
The Effects of Environmental Management Systems on Source Separation in the Work and Home Settings
Directory of Open Access Journals (Sweden)
Chris von Borgstede
2012-06-01
Full Text Available Measures that challenge the generation of waste are needed to address the global problem of the increasing volumes of waste that are generated in both private homes and workplaces. Source separation at the workplace is commonly implemented by environmental management systems (EMS. In the present study, the relationship between source separation at work and at home was investigated. A questionnaire that maps psychological and behavioural predictors of source separation was distributed to employees at different workplaces. The results show that respondents with awareness of EMS report higher levels of source separation at work, stronger environmental concern, personal and social norms, and perceive source separation to be less difficult. Furthermore, the results support the notion that after the adoption of EMS at the workplace, source separation at work spills over into source separation in the household. The potential implications for environmental management systems are discussed.
Using the FASST source separation toolbox for noise robust speech recognition
Ozerov, Alexey; Vincent, Emmanuel
2011-01-01
We describe our submission to the 2011 CHiME Speech Separation and Recognition Challenge. Our speech separation algorithm was built using the Flexible Audio Source Separation Toolbox (FASST) we developed recently. This toolbox is an implementation of a general flexible framework based on a library of structured source models that enable the incorporation of prior knowledge about a source separation problem via user-specifiable constraints. We show how to use FASST to develop an efficient spee...
Residents’ Household Solid Waste (HSW) Source Separation Activity: A Case Study of Suzhou, China
Directory of Open Access Journals (Sweden)
Hua Zhang
2014-09-01
Full Text Available Though the Suzhou government has provided household solid waste (HSW) source separation since 2000, the program remains largely ineffective. Between January and March 2014, the authors conducted an intercept survey in five different community groups in Suzhou, and 505 valid surveys were completed. Based on the survey, the authors used an ordered probit regression to study residents’ HSW source separation activities, both for Suzhou as a whole and for the five community groups. Results showed that 43% of the respondents in Suzhou thought they knew how to source-separate HSW, and 29% of them had source-separated HSW accurately. The results also suggest that the current HSW source separation pilot program in Suzhou is effective, as both the HSW source separation facilities and residents’ separation behavior improved as the program was implemented. The main determinants of residents’ HSW source separation behavior are residents’ age, HSW source separation facilities and government preferential policies. Accessibility of waste management services is particularly important. Attitudes and willingness do not have significant impacts on residents’ HSW source separation behavior.
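For readers unfamiliar with the ordered probit model used here: the probability of each ordinal response category is a difference of normal CDFs evaluated at estimated cutpoints. The linear index and cutpoints below are made-up numbers, not estimates from the Suzhou survey:

```python
from math import erf, sqrt

def norm_cdf(z):
    # standard normal CDF via the error function
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def ordered_probit_probs(xb, cuts):
    # P(y = j | x) = Phi(c_j - x*beta) - Phi(c_{j-1} - x*beta)
    edges = [float("-inf")] + list(cuts) + [float("inf")]
    return [norm_cdf(edges[j + 1] - xb) - norm_cdf(edges[j] - xb)
            for j in range(len(edges) - 1)]

# invented linear index and cutpoints for a 4-level separation-behaviour scale
p = ordered_probit_probs(0.5, [-1.0, 0.0, 1.0])
print([round(q, 3) for q in p], round(sum(p), 6))
```

Estimation then chooses the coefficients and cutpoints maximizing the likelihood of the observed responses; covariates such as age or facility access shift the index `xb` and hence the category probabilities.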
Wood ash as a magnesium source for phosphorus recovery from source-separated urine.
Sakthivel, S Ramesh; Tilley, Elizabeth; Udert, Kai M
2012-03-01
Struvite precipitation is a simple technology for phosphorus recovery from source-separated urine. However, production costs can be high if expensive magnesium salts are used as precipitants. Therefore, waste products can be interesting alternatives to industrially-produced magnesium salts. We investigated the technical and financial feasibility of wood ash as a magnesium source in India. In batch experiments with source-separated urine, we could precipitate 99% of the phosphate with a magnesium dosage of 2.7 mol Mg per mol P. The availability of the magnesium from the wood ash used in our experiment was only about 50% but this could be increased by burning the wood at temperatures well above 600 °C. Depending on the wood ash used, the precipitate can contain high concentrations of heavy metals. This could be problematic if the precipitate were used as fertilizer depending on the applicable fertilizer regulations. The financial study revealed that wood ash is considerably cheaper than industrially-produced magnesium sources and even cheaper than bittern. However, the solid precipitated with wood ash is not pure struvite. Due to the high calcite and the low phosphorus content (3%), the precipitate would be better used as a phosphorus-enhanced conditioner for acidic soils. The estimated fertilizer value of the precipitate was actually slightly lower than wood ash, because 60% of the potassium dissolved into solution during precipitation and was not present in the final product. From a financial point of view and due to the high heavy metal content, wood ash is not a very suitable precipitant for struvite production. Phosphate precipitation from urine with wood ash can be useful if (1) a strong need for a soil conditioner that also contains phosphate exists, (2) potassium is abundant in the soil and (3) no other cheap precipitant, such as bittern or magnesium oxide, is available. PMID:22297249
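The reported dosage of 2.7 mol Mg per mol P translates directly into a magnesium requirement per litre of urine. The phosphorus concentration assumed below is an illustrative placeholder, not a figure from the paper:

```python
M_P, M_MG = 30.97, 24.31   # molar masses of P and Mg, g/mol
p_conc = 0.200             # g P per litre of stored urine (assumed value)
dosage = 2.7               # mol Mg per mol P, from the abstract

mol_p = p_conc / M_P                  # mol P per litre
mg_needed = mol_p * dosage * M_MG     # g Mg per litre of urine
print(round(mg_needed, 3))            # ~0.42 g Mg per litre
```

Since only about half of the magnesium in the tested ash was available, the actual ash dose would be correspondingly larger, which is part of why the financial case against wood ash in the study is as stated.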
A Bayesian Approach to Discovering Truth from Conflicting Sources for Data Integration
Zhao, Bo; Gemmell, Jim; Han, Jiawei
2012-01-01
In practical data integration systems, it is common for the data sources being integrated to provide conflicting information about the same entity. Consequently, a major challenge for data integration is to derive the most complete and accurate integrated records from diverse and sometimes conflicting sources. We term this challenge the truth finding problem. We observe that some sources are generally more reliable than others, and therefore a good model of source quality is the key to solving the truth finding problem. In this work, we propose a probabilistic graphical model that can automatically infer true records and source quality without any supervision. In contrast to previous methods, our principled approach leverages a generative process of two types of errors (false positive and false negative) by modeling two different aspects of source quality. In so doing, ours is also the first approach designed to merge multi-valued attribute types. Our method is scalable, due to an efficient sampling-based inf...
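A heavily simplified, non-probabilistic cousin of the paper's truth-finding idea: alternate between inferring truths by trust-weighted voting and re-estimating each source's trust from its agreement rate. The claims and sources below are invented, and this sketch models neither the two error types nor multi-valued attributes from the paper:

```python
# entity -> {source: claimed value}; all data invented for illustration
claims = {
    "e1": {"s1": "a", "s2": "a", "s3": "b"},
    "e2": {"s1": "x", "s2": "y", "s3": "y"},
    "e3": {"s1": "m", "s2": "m", "s3": "m"},
}
trust = {s: 0.5 for s in ("s1", "s2", "s3")}   # start from uniform trust

for _ in range(10):
    # infer a truth per entity by trust-weighted voting
    truths = {}
    for e, votes in claims.items():
        score = {}
        for s, v in votes.items():
            score[v] = score.get(v, 0.0) + trust[s]
        truths[e] = max(score, key=score.get)
    # re-estimate each source's trust as its agreement rate with the truths
    for s in trust:
        trust[s] = sum(claims[e][s] == truths[e] for e in claims) / len(claims)

print(truths, {s: round(t, 2) for s, t in trust.items()})
```

Here `s2`, which always agrees with the consensus, ends up most trusted, and its vote breaks the tie on `e2`; the paper's graphical model reaches analogous conclusions with principled posterior inference instead of this fixed-point iteration.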
2014-01-01
Time series studies have suggested that air pollution can negatively impact health. These studies have typically focused on the total mass of fine particulate matter air pollution or the individual chemical constituents that contribute to it, and not source-specific contributions to air pollution. Source-specific contribution estimates are useful from a regulatory standpoint by allowing regulators to focus limited resources on reducing emissions from sources that are major cont...
Synthesis of blind source separation algorithms on reconfigurable FPGA platforms
Du, Hongtao; Qi, Hairong; Szu, Harold H.
2005-03-01
Recent advances in intelligence technology have boosted the development of micro-Unmanned Air Vehicles (UAVs), including Silver Fox, Shadow, and ScanEagle, for various surveillance and reconnaissance applications. These affordable and reusable devices have to fit a series of size, weight, and power constraints. Cameras used on such micro-UAVs are therefore mounted directly at a fixed angle without any motion-compensated gimbals. This mounting scheme results in the so-called jitter effect, where jitter is defined as sub-pixel or small-amplitude vibrations. The jitter blur caused by the jitter effect needs to be corrected before any other processing algorithms can be practically applied. Jitter restoration has been addressed by various optimization techniques, including Wiener approximation, maximum a posteriori probability (MAP), etc. However, these algorithms normally assume a spatially invariant blur model, which is not the case with jitter blur. Szu et al. developed a smart real-time algorithm based on auto-regression (AR), with its natural generalization to unsupervised artificial neural network (ANN) learning, to achieve restoration accuracy at the sub-pixel level. This algorithm resembles the capability of the human visual system, in which an agreement between the pair of eyes indicates "signal"; otherwise, jitter noise. Using this non-statistical method, a deterministic blind source separation (BSS) process can be carried out independently for each single pixel, based on a deterministic minimum of the Helmholtz free energy with a generalization of Shannon's information theory applied to open dynamic systems. From a hardware implementation point of view, the process of jitter restoration of an image using Szu's algorithm can be optimized by pixel-based parallelization. In our previous work, a parallelly structured independent component analysis (ICA) algorithm has been implemented on both Field Programmable Gate Array (FPGA) and Application
Korth, F.; Deutsch, B.; Frey, C.; Moros, C.; Voss, M.
2014-09-01
Nitrate (NO3-) is the major nutrient responsible for coastal eutrophication worldwide and its production is related to intensive food production and fossil-fuel combustion. In the Baltic Sea NO3- inputs have increased 4-fold over recent decades and now remain constantly high. NO3- source identification is therefore an important consideration in environmental management strategies. In this study focusing on the Baltic Sea, we used a method to estimate the proportional contributions of NO3- from atmospheric deposition, N2 fixation, and runoff from pristine soils as well as from agricultural land. Our approach combines data on the dual isotopes of NO3- (δ15N-NO3- and δ18O-NO3-) in winter surface waters with a Bayesian isotope mixing model (Stable Isotope Analysis in R, SIAR). Based on data gathered from 47 sampling locations over the entire Baltic Sea, the majority of the NO3- in the southern Baltic was shown to derive from runoff from agricultural land (33-100%), whereas in the northern Baltic, i.e. the Gulf of Bothnia, NO3- originates from nitrification in pristine soils (34-100%). Atmospheric deposition accounts for only a small percentage of NO3- levels in the Baltic Sea, except for contributions from northern rivers, where the levels of atmospheric NO3- are higher. An additional important source in the central Baltic Sea is N2 fixation by diazotrophs, which contributes 49-65% of the overall NO3- pool at this site. The results obtained with this method are in good agreement with source estimates based upon δ15N values in sediments and a three-dimensional ecosystem model, ERGOM. We suggest that this approach can be easily modified to determine NO3- sources in other marginal seas or larger near-coastal areas where NO3- is abundant in winter surface waters when fractionation processes are minor.
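The core of such a Bayesian isotope mixing model can be sketched in a few lines: a Metropolis sampler over source proportions on the simplex, with a Gaussian likelihood on the mixture isotope signature. The source signatures, uncertainty, and observed value below are illustrative placeholders, not the values used in the study.

```python
import math
import random

# Sketch of a Bayesian isotope mixing model: Metropolis sampling of
# source proportions with a Gaussian likelihood on the mixture d15N.
# Signatures, sigma, and the observation are hypothetical placeholders.
sources = {"agriculture": 8.0, "pristine_soil": 2.0, "atmosphere": -2.0}
names = list(sources)
sigma = 1.0        # assumed measurement/model standard deviation
observed = 6.0     # hypothetical mixture d15N value

def log_like(f):
    mix = sum(fi * sources[n] for fi, n in zip(f, names))
    return -0.5 * ((observed - mix) / sigma) ** 2

def propose(f, rng, step=0.1):
    # Jitter on the simplex, then renormalize to keep proportions valid.
    g = [max(1e-6, fi + rng.gauss(0.0, step)) for fi in f]
    total = sum(g)
    return [gi / total for gi in g]

rng = random.Random(42)
f = [1.0 / 3.0] * 3
ll = log_like(f)
samples = []
for it in range(5000):
    f_new = propose(f, rng)
    ll_new = log_like(f_new)
    if rng.random() < math.exp(min(0.0, ll_new - ll)):  # Metropolis accept
        f, ll = f_new, ll_new
    if it >= 1000:                                      # discard burn-in
        samples.append(f)

means = [sum(s[i] for s in samples) / len(samples) for i in range(3)]
print({n: round(m, 2) for n, m in zip(names, means)})
```

A full SIAR-style model would add Dirichlet priors and per-source variance terms; this sketch only shows the sampling skeleton.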
International Nuclear Information System (INIS)
A full-scope method is constructed to reveal source term uncertainties and to identify influential inputs during a severe accident at a nuclear power plant (NPP). An integrated severe accident code, MELCOR Ver. 1.8.5, is used as a tool to simulate an accident similar to that which occurred at Unit 2 of the Fukushima Daiichi NPP. In order to determine how much radioactive material is released from the containment to the environment during the accident, Monte Carlo based uncertainty analysis is performed. Generally, in order to evaluate the influence of uncertain inputs on the output, a large number of code runs are required for global sensitivity analysis. To avoid the laborious computational cost of global sensitivity analysis via MELCOR, a surrogate stochastic model is built using a Bayesian nonparametric approach, the Dirichlet process. Probability distributions derived from uncertainty analysis using MELCOR and the stochastic model show good agreement. The appropriateness of the stochastic model is cross-validated through comparison with MELCOR results. The importance measures of uncertain input variables are calculated according to their influences on the uncertainty distribution, as first-order and total effects. The validity of the present methodology is demonstrated through an example with three uncertain input variables. - Highlights: • A method of source term uncertainty and sensitivity analysis is proposed. • The source term in the Fukushima Daiichi NPP severe accident is demonstrated. • Uncertainty distributions of source terms show non-standard shapes. • A surrogate model for the integrated code is constructed using a Dirichlet process. • An importance ranking of influential input variables is obtained.
Source Separation and Composting of Organic Municipal Solid Waste.
Gould, Mark; And Others
1992-01-01
Describes a variety of composting techniques that may be utilized in a municipal level solid waste management program. Suggests how composting system designers should determine the amount and type of organics in the waste stream, evaluate separation approaches and assess collection techniques. Outlines the advantages of mixed waste composting and…
Directory of Open Access Journals (Sweden)
Lin Wang
2010-01-01
Frequency-domain blind source separation (BSS) performs poorly under high reverberation because the independence assumption collapses in each frequency bin as the number of bins increases. To improve the separation result, this paper proposes a method that combines two techniques, using beamforming as a preprocessor for blind source separation. With the sound source locations assumed known, the mixed signals are dereverberated and enhanced by beamforming; the beamformed signals are then further separated by blind source separation. To implement the proposed method, a superdirective fixed beamformer is designed for beamforming, and an interfrequency dependence-based permutation alignment scheme is presented for frequency-domain blind source separation. With beamforming shortening the mixing filters and reducing noise before blind source separation, the combined method works better in reverberation. The performance of the proposed method is investigated by separating up to 4 sources in different environments with reverberation times from 100 ms to 700 ms. Simulation results verify that the proposed method outperforms beamforming or blind source separation used alone. Analysis demonstrates that the proposed method is computationally efficient and appropriate for real-time processing.
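The beamforming preprocessing stage can be illustrated with a minimal frequency-domain delay-and-sum sketch for a two-microphone array; the spacing, steering direction, and single-frequency snapshot below are toy assumptions, not the superdirective design of the paper.

```python
import cmath
import math

# Minimal delay-and-sum beamformer sketch in the frequency domain,
# assuming a 2-microphone array and a known source direction.
c = 343.0                   # speed of sound (m/s)
d = 0.05                    # assumed mic spacing (m)
theta = math.radians(30)    # assumed source direction
tau = d * math.sin(theta) / c   # inter-microphone delay

def beamform_bin(x1, x2, freq):
    # Align mic 2 to mic 1 by compensating the propagation delay,
    # then average; signals from the steered direction add coherently.
    w = cmath.exp(2j * math.pi * freq * tau)
    return 0.5 * (x1 + w * x2)

# Simulated single-frequency snapshot arriving from direction theta:
freq = 1000.0
s = 1.0 + 0.0j
x1 = s
x2 = s * cmath.exp(-2j * math.pi * freq * tau)
y = beamform_bin(x1, x2, freq)
print(abs(y))
```

For the steered direction the output magnitude equals the source magnitude; off-direction arrivals add with a phase mismatch and are attenuated.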
Improving the perceptual quality of single-channel blind audio source separation.
Stokes, Tobias W.
2015-01-01
Given a mixture of audio sources, a blind audio source separation (BASS) tool is required to extract audio relating to one specific source whilst attenuating that related to all others. This thesis answers the question “How can the perceptual quality of BASS be improved for broadcasting applications?” The most common source separation scenario, particularly in the field of broadcasting, is single channel, and this is particularly challenging as only a limited set of cues is available. Broadcas...
Sparse Reverberant Audio Source Separation via Reweighted Analysis
Arberet, Simon; Vandergheynst, Pierre; Carrillo, Rafael; Thiran, Jean-Philippe; Wiaux, Yves
2013-01-01
We propose a novel algorithm for source signal estimation from an underdetermined convolutive mixture assuming known mixing filters. Most state-of-the-art methods deal with anechoic or short reverberant mixtures, assuming a synthesis sparse prior in the time-frequency domain and a narrowband approximation of the convolutive mixing process. In this paper, we address the source estimation of convolutive mixtures with a new algorithm based on i) an analysis sparse prior, ii) a rewe...
Bayesian Independent Component Analysis
DEFF Research Database (Denmark)
Winther, Ole; Petersen, Kaare Brandt
2007-01-01
In this paper we present an empirical Bayesian framework for independent component analysis. The framework provides estimates of the sources, the mixing matrix and the noise parameters, and is flexible with respect to choice of source prior and the number of sources and sensors. Inside the engine...
Hierarchical Bayesian Model for Simultaneous EEG Source and Forward Model Reconstruction (SOFOMORE)
DEFF Research Database (Denmark)
Stahlhut, Carsten; Mørup, Morten; Winther, Ole; Hansen, Lars Kai
In this paper we propose an approach to handle forward model uncertainty for EEG source reconstruction. A stochastic forward model is motivated by the many uncertain contributions that form the forward propagation model including the tissue conductivity distribution, the cortical surface, and...
Single-channel source separation using non-negative matrix factorization
DEFF Research Database (Denmark)
Schmidt, Mikkel Nørgaard
which a number of methods for single-channel source separation based on non-negative matrix factorization are presented. In the papers, the methods are applied to separating audio signals such as speech and musical instruments and separating different types of tissue in chemical shift imaging....
From Binaural to Multichannel Blind Source Separation using Fixed Beamforming with HRTFs
Maazaoui, Mounira; Grenier, Yves; Abed-Meraim, Karim
2012-01-01
In this article we are interested in the problem of blind source separation (BSS) for robot audition; we study the performance of blind source separation with a varying number of sensors in a microphone array placed in the head of an infant-sized dummy. We propose a two-stage blind source separation algorithm based on a fixed beamforming preprocessing using the head related transfer functions (HRTF) of the dummy and a separation algorithm using a sparsity criterion. We show that in the ca...
30 CFR 57.6404 - Separation of blasting circuits from power source.
2010-07-01
... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Separation of blasting circuits from power... NONMETAL MINES Explosives Electric Blasting-Surface and Underground § 57.6404 Separation of blasting circuits from power source. (a) Switches used to connect the power source to a blasting circuit shall...
30 CFR 56.6404 - Separation of blasting circuits from power source.
2010-07-01
... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Separation of blasting circuits from power... MINES Explosives Electric Blasting § 56.6404 Separation of blasting circuits from power source. (a) Switches used to connect the power source to a blasting circuit shall be locked in the open position...
Sound fields separation and reconstruction of irregularly shaped sources
Totaro, N.; Vigoureux, D.; Leclère, Q.; Lagneaux, J.; Guyader, J. L.
2015-02-01
Nowadays, the need for source identification methods is still growing and application cases are more and more complex. As a consequence, it is necessary to develop methods that allow us to reconstruct sound fields on irregularly shaped sources in reverberant or confined acoustic environments. The inverse Patch Transfer Functions (iPTF) method is suitable for achieving these objectives. Indeed, as the iPTF method is based on Green's identity and double measurements of pressure and particle velocity on a surface surrounding the source, it is independent of the acoustic environment. In addition, the finite element solver used to compute the patch transfer functions permits us to handle sources with 3D irregular shapes. In the present paper, two experimental applications, on a flat plate and an oil pan, have been carried out to show the performance of the method on real applications. As with all ill-posed problems, it is shown that the crucial point of this method is the choice of the parameter of the Tikhonov regularization, one of the most widely used in the literature. The classical L-curve strategy sometimes fails to choose the best solution. This issue is clearly explained and an adapted strategy combining the L-curve and acoustic power conservation is proposed. The efficiency of this strategy is demonstrated on both applications and compared to results obtained with the Generalized Cross Validation (GCV) technique.
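The regularization step at the heart of such inverse problems can be illustrated with a minimal Tikhonov sketch on a small ill-conditioned system; the matrix, right-hand side, and λ sweep below are toy assumptions, showing the L-curve trade-off between residual norm and solution norm.

```python
import math

# Tikhonov-regularized solution x_lam = (A^T A + lam I)^{-1} A^T b
# for a toy, nearly singular 2x2 system, swept over lambda as in an
# L-curve analysis (values are illustrative only).
A = [[1.0, 1.0], [1.0, 1.0001]]
b = [2.0, 2.0001]

def tikhonov(lam):
    # Form A^T A + lam I and A^T b, then solve the 2x2 system by Cramer.
    ata = [[sum(A[k][i] * A[k][j] for k in range(2)) + (lam if i == j else 0.0)
            for j in range(2)] for i in range(2)]
    atb = [sum(A[k][i] * b[k] for k in range(2)) for i in range(2)]
    det = ata[0][0] * ata[1][1] - ata[0][1] * ata[1][0]
    x = [(ata[1][1] * atb[0] - ata[0][1] * atb[1]) / det,
         (-ata[1][0] * atb[0] + ata[0][0] * atb[1]) / det]
    res = math.sqrt(sum((sum(A[i][j] * x[j] for j in range(2)) - b[i]) ** 2
                        for i in range(2)))          # residual norm ||Ax-b||
    norm = math.sqrt(sum(xi * xi for xi in x))       # solution norm ||x||
    return x, res, norm

for lam in (1e-8, 1e-4, 1e-1):
    x, res, norm = tikhonov(lam)
    print(lam, round(res, 6), round(norm, 6))
```

Sweeping λ traces the L-curve: the residual norm grows and the solution norm shrinks as λ increases, and the regularization parameter is chosen near the corner of that curve.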
Overcomplete Blind Source Separation by Combining ICA and Binary Time-Frequency Masking
DEFF Research Database (Denmark)
Pedersen, Michael Syskind; Wang, DeLiang; Larsen, Jan;
2005-01-01
A limitation in many source separation tasks is that the number of source signals has to be known in advance. Further, in order to achieve good performance, the number of sources cannot exceed the number of sensors. In many real-world applications these limitations are too strict. We propose a...... novel method for over-complete blind source separation. Two powerful source separation techniques have been combined, independent component analysis and binary time-frequency masking. Hereby, it is possible to iteratively extract each speech signal from the mixture. By using merely two microphones we...... can separate up to six mixed speech signals under anechoic conditions. The number of source signals is not assumed to be known in advance. It is also possible to maintain the extracted signals as stereo signals...
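The binary time-frequency masking stage can be illustrated in isolation: each TF cell of the mixture is assigned to whichever source estimate dominates it. The toy spectrograms below stand in for the ICA-derived estimates of the paper.

```python
# Binary time-frequency masking sketch: assign each TF cell of the
# mixture to the dominant source estimate. A toy illustration of the
# masking stage only, not the full iterative ICA+masking scheme.

# Toy magnitude "spectrograms" (2 time frames x 3 frequency bins),
# standing in for the two source estimates produced by ICA.
est1 = [[3.0, 0.2, 1.0], [0.1, 2.5, 0.3]]
est2 = [[0.4, 2.0, 0.9], [1.5, 0.2, 2.0]]
mix = [[e1 + e2 for e1, e2 in zip(r1, r2)] for r1, r2 in zip(est1, est2)]

# Binary mask: 1 where source 1 dominates the cell, 0 elsewhere.
mask1 = [[1 if a > b else 0 for a, b in zip(r1, r2)]
         for r1, r2 in zip(est1, est2)]

# Apply the mask to the mixture to extract the source-1 estimate.
sep1 = [[m * v for m, v in zip(mr, vr)] for mr, vr in zip(mask1, mix)]
print(mask1)
print(sep1)
```

Each cell of the mixture is kept or zeroed, which is what lets the method extract more sources than sensors one at a time.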
Quantum Rate Distortion, Reverse Shannon Theorems, and Source-Channel Separation
Datta, Nilanjana; Hsieh, Min-Hsiu; Wilde, Mark M.
2013-01-01
We derive quantum counterparts of two key theorems of classical information theory, namely, the rate distortion theorem and the source-channel separation theorem. The rate-distortion theorem gives the ultimate limits on lossy data compression, and the source-channel separation theorem implies that a two-stage protocol consisting of compression and channel coding is optimal for transmitting a memoryless source over a memoryless channel. In spite of their importance in the classical domain, the...
Student support and perceptions of urine source separation in a university community.
Ishii, Stephanie K L; Boyer, Treavor H
2016-09-01
Urine source separation, i.e., the collection and treatment of human urine as a separate waste stream, has the potential to improve many aspects of water resource management and wastewater treatment. However, social considerations must be taken into consideration for successful implementation of this alternative wastewater system. This work evaluated the perceptions of urine source separation held by students living on-campus at a major university in the Southeastern region of the United States. Perceptions were evaluated in the context of the Theory of Planned Behavior. The survey population represents one group within a community type (universities) that is expected to be an excellent testbed for urine source separation. Overall, respondents reported high levels of support for urine source separation after watching a video on expected benefits and risks, e.g., 84% indicated that they would vote in favor of urine source separation in residence halls. Support was less apparent when measured by willingness to pay, as 33% of respondents were unwilling to pay for the implementation of urine source separation and 40% were only willing to pay $1 to $10 per semester. Water conservation was largely identified as the most important benefit of urine source separation and there was little concern reported about the use of urine-based fertilizers. Statistical analyses showed that one's environmental attitude, environmental behavior, perceptions of support within the university community, and belief that student opinions have an impact on university decision makers were significantly correlated with one's support for urine source separation. This work helps identify community characteristics that lend themselves to acceptance of urine source separation, such as those related to environmental attitudes/behaviors and perceptions of behavioral control and subjective norm. Critical aspects of these alternative wastewater systems that require attention in order to foster public
Objective Bayesian analysis of counting experiments with correlated sources of background
Casadei, Diego
2015-01-01
Searches for faint signals in counting experiments are often encountered in particle physics and astrophysics, as well as in other fields. Many problems can be reduced to the case of a model with independent and Poisson distributed signal and background. Often several background contributions are present at the same time, possibly correlated. We provide the analytic solution of the statistical inference problem of estimating the signal in the presence of multiple backgrounds, in the framework of objective Bayes statistics. The model can be written in the form of a product of a single Poisson distribution with a multinomial distribution. The former is related to the total number of events, whereas the latter describes the fraction of events coming from each individual source. Correlations among different backgrounds can be included in the inference problem by a suitable choice of the priors.
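The factorization underlying this model is easy to verify numerically: a product of independent Poisson distributions equals a single Poisson on the total count times a multinomial over the individual sources. The rates and counts below are arbitrary illustrative values.

```python
import math

# Numerical check of the model structure: a product of independent
# Poisson distributions factorizes as Poisson(total) x multinomial.
# Rates (signal plus two backgrounds) and counts are arbitrary examples.
def poisson(n, lam):
    return math.exp(-lam) * lam ** n / math.factorial(n)

lams = [1.2, 0.7, 2.1]   # per-source rates
counts = [2, 1, 3]       # observed counts per source

# Left-hand side: product of independent Poisson probabilities.
lhs = 1.0
for n, lam in zip(counts, lams):
    lhs *= poisson(n, lam)

# Right-hand side: Poisson on the total times a multinomial over sources.
N, L = sum(counts), sum(lams)
coeff = math.factorial(N)
for n in counts:
    coeff //= math.factorial(n)      # multinomial coefficient (integer)
rhs = poisson(N, L) * coeff
for n, lam in zip(counts, lams):
    rhs *= (lam / L) ** n

print(lhs, rhs)
```

This identity is what lets the inference be split into a total-rate part and a source-fraction part, with priors placed on each.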
Blind Source Separation Based on Covariance Ratio and Artificial Bee Colony Algorithm
Directory of Open Access Journals (Sweden)
Lei Chen
2014-01-01
The computational cost of blind source separation based on bio-inspired intelligent optimization is high. In order to solve this problem, we propose an effective blind source separation algorithm based on the artificial bee colony algorithm. In the proposed algorithm, the covariance ratio of the signals is utilized as the objective function, and the artificial bee colony algorithm is used to optimize it. The source signal component that is separated out is then removed from the mixtures using the deflation method. All the source signals can be recovered successfully by repeating the separation process. Simulation experiments demonstrate that significant improvements in computational cost and separation quality are achieved by the proposed algorithm when compared to previous algorithms.
Empirical Study on Factors Influencing Residents' Behavior of Separating Household Wastes at Source
Institute of Scientific and Technical Information of China (English)
Qu Ying; Zhu Qinghua; Murray Haight
2007-01-01
Source separation is the basic premise for making effective use of household wastes. In eight cities in China, however, several pilot projects of source separation ultimately failed because of the poor participation rate of residents. In order to solve this problem, identifying the factors that influence residents' source-separation behavior becomes crucial. By means of a questionnaire survey, we conducted descriptive analysis and exploratory factor analysis. The results show that trouble-feeling, moral notion, environment protection, public education, environment value and knowledge deficiency are the main factors that play an important role in residents' decisions to separate their household wastes. Also, according to the contribution percentage of the six main factors to the total behavior of source separation, their influencing power is analyzed, which will provide suggestions on household waste management for policy makers and decision makers in China.
Granade, Christopher; Cory, D G
2015-01-01
In recent years, Bayesian methods have been proposed as a solution to a wide range of issues in quantum state and process tomography. State-of-the-art Bayesian tomography solutions suffer from three problems: numerical intractability, a lack of informative prior distributions, and an inability to track time-dependent processes. Here, we solve all three problems. First, we use modern statistical methods, as pioneered by Huszár and Houlsby and by Ferrie, to make Bayesian tomography numerically tractable. Our approach allows for practical computation of Bayesian point and region estimators for quantum states and channels. Second, we propose the first informative priors on quantum states and channels. Finally, we develop a method that allows online tracking of time-dependent states and estimates the drift and diffusion processes affecting a state. We provide source code and animated visual examples for our methods.
Source Separation and Higher-Order Causal Analysis of MEG and EEG
Zhang, Kun
2012-01-01
Separation of the sources and analysis of their connectivity have been an important topic in EEG/MEG analysis. To solve this problem in an automatic manner, we propose a two-layer model, in which the sources are conditionally uncorrelated from each other, but not independent; the dependence is caused by the causality in their time-varying variances (envelopes). The model is identified in two steps. We first propose a new source separation technique which takes into account the autocorrelations (which may be time-varying) and time-varying variances of the sources. The causality in the envelopes is then discovered by exploiting a special kind of multivariate GARCH (generalized autoregressive conditional heteroscedasticity) model. The resulting causal diagram gives the effective connectivity between the separated sources; in our experimental results on MEG data, sources with similar functions are grouped together, with negative influences between groups, and the groups are connected via some interesting sources.
Monaural separation of dependent audio sources based on a generalized Wiener filter
DEFF Research Database (Denmark)
Ma, Guilin; Agerkvist, Finn T.; Luther, J.B.
2007-01-01
) coefficients of the dependent sources is modeled by complex Gaussian mixture models in the frequency domain from samples of individual sources to capture the properties of the sources and their correlation. During the second stage, the mixture is separated through a generalized Wiener filter, which takes...
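The generalized Wiener filtering stage can be illustrated for a single time-frequency cell; the mixture coefficient and source powers below are toy stand-ins for the variances that the paper models with complex Gaussian mixtures.

```python
# Generalized Wiener filter for a single time-frequency cell: each
# source estimate is the mixture scaled by its share of the total power.
# The mixture value and source powers are toy assumptions standing in
# for the GMM-modeled variances of the paper.
x = 4.0 + 3.0j        # mixture TF coefficient (toy value)
p1, p2 = 6.0, 2.0     # assumed source powers in this cell

s1 = (p1 / (p1 + p2)) * x   # source 1 gets its power share of x
s2 = (p2 / (p1 + p2)) * x   # source 2 gets the remainder
print(s1, s2)
```

The two estimates sum back to the mixture by construction, a conservative property of Wiener-style masks that keeps the reconstruction consistent.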
Xu, Wanying; Zhou, Chuanbin; Lan, Yajun; Jin, Jiasheng; Cao, Aixin
2015-05-01
Municipal solid waste (MSW) management (MSWM) is most important and challenging in large urban communities. Sound community-based waste management systems normally include waste reduction and material recycling elements, often entailing the separation of recyclable materials by the residents. To increase the efficiency of source separation and recycling, an incentive-based source separation model was designed and this model was tested in 76 households in Guiyang, a city of almost three million people in southwest China. This model embraced the concepts of rewarding households for sorting organic waste, government funds for waste reduction, and introducing small recycling enterprises for promoting source separation. Results show that after one year of operation, the waste reduction rate was 87.3%, and the comprehensive net benefit under the incentive-based source separation model increased by 18.3 CNY tonne(-1) (2.4 Euros tonne(-1)), compared to that under the normal model. The stakeholder analysis (SA) shows that the centralized MSW disposal enterprises had minimum interest and may oppose the start-up of a new recycling system, while small recycling enterprises had a primary interest in promoting the incentive-based source separation model, but they had the least ability to make any change to the current recycling system. The strategies for promoting this incentive-based source separation model are also discussed in this study. PMID:25819930
Lockhart, K.; Harter, T.; Grote, M.; Young, M. B.; Eppich, G.; Deinhart, A.; Wimpenny, J.; Yin, Q. Z.
2014-12-01
Groundwater quality is a concern in alluvial aquifers underlying agricultural areas worldwide, an example of which is the San Joaquin Valley, California. Nitrate from land applied fertilizers or from animal waste can leach to groundwater and contaminate drinking water resources. Dairy manure and synthetic fertilizers are the major sources of nitrate in groundwater in the San Joaquin Valley, however, septic waste can be a major source in some areas. As in other such regions around the world, the rural population in the San Joaquin Valley relies almost exclusively on shallow domestic wells (≤150 m deep), of which many have been affected by nitrate. Consumption of water containing nitrate above the drinking water limit has been linked to major health effects including low blood oxygen in infants and certain cancers. Knowledge of the proportion of each of the three main nitrate sources (manure, synthetic fertilizer, and septic waste) contributing to individual well nitrate can aid future regulatory decisions. Nitrogen, oxygen, and boron isotopes can be used as tracers to differentiate between the three main nitrate sources. Mixing models quantify the proportional contributions of sources to a mixture by using the concentration of conservative tracers within each source as a source signature. Deterministic mixing models are common, but do not allow for variability in the tracer source concentration or overlap of tracer concentrations between sources. Bayesian statistics used in conjunction with mixing models can incorporate variability in the source signature. We developed a Bayesian mixing model on a pilot network of 32 private domestic wells in the San Joaquin Valley for which nitrate as well as nitrogen, oxygen, and boron isotopes were measured. Probability distributions for nitrogen, oxygen, and boron isotope source signatures for manure, fertilizer, and septic waste were compiled from the literature and from a previous groundwater monitoring project on several
Source Separation and Clustering of Phase-Locked Subspaces: Derivations and Proofs
Almeida, Miguel; Schleimer, Jan-Hendrik; Bioucas-Dias, José; Vigário, Ricardo
2011-01-01
Due to space limitations, our submission "Source Separation and Clustering of Phase-Locked Subspaces", accepted for publication on the IEEE Transactions on Neural Networks in 2011, presented some results without proof. Those proofs are provided in this paper.
Fate of pharmaceuticals in full-scale source separated sanitation system
Butkovskyi, A.; Hernandez Leal, L.; Rijnaarts, H.H.M.; Zeeman, G.
2015-01-01
Removal of 14 pharmaceuticals and 3 of their transformation products was studied in a full-scale source separated sanitation system with separate collection and treatment of black water and grey water. Black water is treated in an up-flow anaerobic sludge blanket (UASB) reactor followed by oxygen
Phase recovery in NMF for audio source separation: an insightful benchmark
Magron, Paul; Badeau, Roland; David, Bertrand
2016-01-01
Nonnegative Matrix Factorization (NMF) is a powerful tool for decomposing mixtures of audio signals in the Time-Frequency (TF) domain. In applications such as source separation, the phase recovery for each extracted component is a major issue since it often leads to audible artifacts. In this paper, we present a methodology for evaluating various NMF-based source separation techniques involving phase reconstruction. For each model considered, a comparison between two approaches (blind separat...
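The NMF decomposition that precedes any phase-recovery step can be sketched with the classic Lee-Seung multiplicative updates for the Euclidean cost; the toy "spectrogram" and dimensions below are illustrative only, and real audio use would operate on STFT magnitudes.

```python
import random

# Minimal NMF via multiplicative updates (Euclidean cost) on a toy
# nonnegative matrix; the phase-recovery step discussed in the paper
# would come after such a magnitude factorization and is not shown.
rng = random.Random(0)
F, T, K = 4, 5, 2    # frequency bins, time frames, components

V = [[abs(rng.gauss(1.0, 0.3)) for _ in range(T)] for _ in range(F)]
W = [[rng.random() + 0.1 for _ in range(K)] for _ in range(F)]
H = [[rng.random() + 0.1 for _ in range(T)] for _ in range(K)]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def transpose(A):
    return [list(r) for r in zip(*A)]

def cost():
    WH = matmul(W, H)
    return sum((V[i][j] - WH[i][j]) ** 2 for i in range(F) for j in range(T))

c0 = cost()
for _ in range(100):
    # H update: H <- H * (W^T V) / (W^T W H)
    WH, Wt = matmul(W, H), transpose(W)
    num, den = matmul(Wt, V), matmul(Wt, WH)
    H = [[H[k][j] * num[k][j] / (den[k][j] + 1e-12) for j in range(T)]
         for k in range(K)]
    # W update: W <- W * (V H^T) / (W H H^T)
    WH, Ht = matmul(W, H), transpose(H)
    num, den = matmul(V, Ht), matmul(WH, Ht)
    W = [[W[i][k] * num[i][k] / (den[i][k] + 1e-12) for k in range(K)]
         for i in range(F)]
print(c0, cost())
```

The multiplicative updates keep W and H nonnegative and never increase the cost, which is why they are a common baseline for audio decompositions.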
Semi-Blind Source Separation in a Multi-User Transmission System with Interference Alignment
FADLALLAH, Yasser; AISSA EL BEY, Abdeldjalil; Abed-Meraim, Karim; AMIS CAVALEC, Karine; Pyndiah, Ramesh
2013-01-01
In this paper we address the decoding problem in the K-user MIMO interference channel assuming an interference alignment (IA) design. We aim to decode robustly the desired signal without having a full Channel State Information (CSI) (i.e. precoders knowledge) at the receivers. We show the equivalency between the IA model and the Semi-Blind Source Separation model (SBSS). Then, we prove that this equivalence allows the use of techniques employed in source separation for extracting the desired ...
A cost evaluation method for transferring municipalities to solid waste source-separated system.
Lavee, Doron; Nardiya, Shlomit
2013-05-01
Most of Israel's waste is disposed of in landfills, threatening scarce land resources and posing environmental and health risks. The aim of this study is to estimate the expected costs of transferring municipalities to solid waste source separation in Israel, aimed at reducing the amount of waste directed to landfills and increasing the efficiency and amount of recycled waste. Information on the expected costs of operating a solid waste source separation system was gathered from 47 municipalities and compiled into a database, taking into consideration various factors such as costs of equipment, construction adjustments and waste collection and disposal. This database may serve as a model for estimating the costs of entering the waste source separation system for any municipality in Israel, while taking into consideration its specific characteristics, such as size and region. The model was used in Israel for determining municipalities' eligibility to receive a governmental grant for entering an accelerated process of solid waste source separation. This study presents a simple, user-friendly operational tool for assessing municipalities' costs of entering a process of waste source separation, providing policy makers with a powerful tool for diverting funds effectively to promote solid waste source separation. PMID:23465315
A Modified Infomax ICA Algorithm for fMRI Data Source Separation
Directory of Open Access Journals (Sweden)
Amir A. Khaliq
2013-05-01
This study presents a modified infomax model of Independent Component Analysis (ICA) for the source separation problem in fMRI data. Functional MRI data are processed by different blind source separation techniques, including Independent Component Analysis (ICA). ICA is a statistical decomposition method used for multivariate data source separation. ICA algorithms are based on the independence of the extracted sources, for which different measures are used, such as kurtosis, negentropy and information maximization. The infomax method of ICA extracts unknown sources from a number of mixtures by maximizing the negentropy, thus ensuring independence. In the proposed modified infomax model, a higher-order contrast function is used, which results in faster convergence and higher accuracy. The proposed algorithm is applied to general simulated signals and simulated fMRI data. Comparison of the correlation results of the proposed algorithm with the conventional infomax algorithm shows better performance.
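A minimal sketch of the conventional infomax baseline (the Bell-Sejnowski natural-gradient update with a logistic nonlinearity) on a toy 2x2 mixture is shown below; the paper's higher-order contrast modification is not reproduced, and the source model, mixing matrix, and learning rate are assumptions for illustration.

```python
import math
import random

# Infomax ICA sketch: Bell-Sejnowski natural-gradient updates,
# W <- W + lr * (I + (1 - 2*g(y)) y^T) W with logistic g, on a toy
# 2x2 mixture of super-Gaussian (Laplacian) sources.
rng = random.Random(3)
n = 2000

def laplacian():
    return [rng.expovariate(1.0) * (1 if rng.random() < 0.5 else -1)
            for _ in range(n)]

s1, s2 = laplacian(), laplacian()                       # sources
x = [[u + 0.5 * v, 0.5 * u + v] for u, v in zip(s1, s2)]  # mixtures

W = [[1.0, 0.0], [0.0, 1.0]]
lr = 0.05
for _ in range(150):
    G = [[0.0, 0.0], [0.0, 0.0]]   # accumulates E[(1 - 2g(y)) y^T]
    for xi in x:
        y = [W[0][0] * xi[0] + W[0][1] * xi[1],
             W[1][0] * xi[0] + W[1][1] * xi[1]]
        phi = [1.0 - 2.0 / (1.0 + math.exp(-v)) for v in y]  # = -tanh(y/2)
        for i in range(2):
            for j in range(2):
                G[i][j] += phi[i] * y[j] / n
    M = [[(1.0 if i == j else 0.0) + G[i][j] for j in range(2)]
         for i in range(2)]
    W = [[sum(M[i][k] * W[k][j] for k in range(2)) for j in range(2)]
         for i in range(2)]

def corr(a, b):
    ma, mb = sum(a) / n, sum(b) / n
    num = sum((u - ma) * (v - mb) for u, v in zip(a, b))
    return num / math.sqrt(sum((u - ma) ** 2 for u in a) *
                           sum((v - mb) ** 2 for v in b))

y1 = [W[0][0] * xi[0] + W[0][1] * xi[1] for xi in x]
y2 = [W[1][0] * xi[0] + W[1][1] * xi[1] for xi in x]
print(corr(y1, y2))
```

After training, the unmixed outputs should be far less correlated than the raw mixtures; the modified algorithm of the paper replaces the contrast while keeping this update structure.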
Separation of Correlated Astrophysical Sources Using Multiple-Lag Data Covariance Matrices
Directory of Open Access Journals (Sweden)
Baccigalupi C
2005-01-01
This paper proposes a new strategy to separate astrophysical sources that are mutually correlated. This strategy is based on second-order statistics and exploits prior information about the possible structure of the mixing matrix. Unlike ICA blind separation approaches, where the sources are assumed mutually independent and no prior knowledge is assumed about the mixing matrix, our strategy allows the independence assumption to be relaxed and performs the separation of even significantly correlated sources. Besides the mixing matrix, our strategy is also capable of evaluating the source covariance functions at several lags. Moreover, once the mixing parameters have been identified, a simple deconvolution can be used to estimate the probability density functions of the source processes. To benchmark our algorithm, we used a database that simulates the one expected from the instruments that will operate onboard ESA's Planck Surveyor Satellite to measure the CMB anisotropies over the entire celestial sphere.
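A toy second-order separation in the same spirit (AMUSE-style: whitening followed by eigendecomposition of a single symmetrized lagged covariance) can be sketched as follows. Unlike the multi-lag, correlated-source method of the paper, this baseline assumes uncorrelated sources with distinct autocorrelations; all signals and the mixing matrix are synthetic.

```python
import math
import random

# AMUSE-style sketch: separate two mixtures using second-order
# statistics only - whitening plus eigendecomposition of the lag-1
# covariance. Assumes uncorrelated sources with distinct autocorrelations.
rng = random.Random(7)
n = 20000

def ar1(a):
    s, out = 0.0, []
    for _ in range(n):
        s = a * s + rng.gauss(0.0, 1.0)
        out.append(s)
    return out

s1, s2 = ar1(0.9), ar1(-0.5)       # distinct lag-1 autocorrelations
A = [[1.0, 0.6], [0.4, 1.0]]       # synthetic mixing matrix
x1 = [A[0][0] * u + A[0][1] * v for u, v in zip(s1, s2)]
x2 = [A[1][0] * u + A[1][1] * v for u, v in zip(s1, s2)]

def cov(a, b, lag=0):
    m = len(a) - lag
    return sum(a[i + lag] * b[i] for i in range(m)) / m

def sym_eig2(c11, c12, c22):
    # Rotation angle diagonalizing a symmetric 2x2 matrix.
    th = 0.5 * math.atan2(2.0 * c12, c11 - c22)
    return math.cos(th), math.sin(th)

# 1) Whiten the mixtures via the zero-lag covariance.
c11, c12, c22 = cov(x1, x1), cov(x1, x2), cov(x2, x2)
ct, st = sym_eig2(c11, c12, c22)
e1 = ct * ct * c11 + 2 * ct * st * c12 + st * st * c22
e2 = st * st * c11 - 2 * ct * st * c12 + ct * ct * c22
z1 = [(ct * a + st * b) / math.sqrt(e1) for a, b in zip(x1, x2)]
z2 = [(-st * a + ct * b) / math.sqrt(e2) for a, b in zip(x1, x2)]

# 2) Rotate to diagonalize the symmetrized lag-1 covariance.
r11, r22 = cov(z1, z1, 1), cov(z2, z2, 1)
r12 = 0.5 * (cov(z1, z2, 1) + cov(z2, z1, 1))
ct, st = sym_eig2(r11, r12, r22)
y1 = [ct * a + st * b for a, b in zip(z1, z2)]
y2 = [-st * a + ct * b for a, b in zip(z1, z2)]

def abscorr(a, b):
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    num = sum((u - ma) * (v - mb) for u, v in zip(a, b))
    da = math.sqrt(sum((u - ma) ** 2 for u in a))
    db = math.sqrt(sum((v - mb) ** 2 for v in b))
    return abs(num / (da * db))

print(max(abscorr(y1, s1), abscorr(y1, s2)))
```

Each output aligns with one source up to order and sign; the paper's method generalizes this idea to several lags and a structured, correlated-source model.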
Bamber, J. L.; Schoen, N.; Zammit-Mangion, A.; Rougier, J.; Flament, T.; Luthcke, S. B.; Petrie, E. J.; Rémy, F.
2013-12-01
There remains considerable inconsistency between different methods and approaches for determining ice mass trends for Antarctica from satellite observations. There are three approaches that can provide near global coverage for mass trends: altimetry, gravimetry and mass budget calculations. All three approaches suffer from a source separation problem where other geophysical processes limit the capability of the method to resolve the origin and magnitude of a mass change. A fourth approach, GPS vertical motion, provides localised estimates of mass change due to elastic uplift and an indirect estimate of GIA. Each approach has different source separation issues and different spatio-temporal error characteristics. In principle, it should be possible to combine the data and process covariances to minimize the uncertainty in the solution and to produce robust, posterior errors for the trends. In practice, this is a challenging problem in statistics because of the large number of degrees of freedom, the variable spatial and temporal sampling between the different observations and the fact that some processes remain under-sampled, such as firn compaction. Here, we present a novel solution to this problem using the latest methods in statistical modelling of spatio-temporal processes. We use Bayesian hierarchical modelling and employ stochastic partial differential equations to capture our physical understanding of the key processes that influence our observations. Due to the huge number of observations involved (> 10^8) methods are required to reduce the dimensionality of the problem and care is required in treatment of the observations as they are not independent. Here, we focus mainly on the results rather than the full suite of methods and we present time evolving fields of surface mass balance, ice dynamic-driven mass loss, and firn compaction for the period 2003-2009, derived from a combination of ICESat, ENVISAT, GRACE, InSAR, GPS and regional climate model output
Y. Yokoo
2014-01-01
This study compared a time-source hydrograph separation method with a geographic-source separation method to assess whether the two methods produced similar results. The time-source separation of a hydrograph was performed using a numerical filter method, and the geographic-source separation was performed using an end-member mixing analysis employing hourly discharge, electrical conductivity, and turbidity data. These data were collected in 2006 at the Kuroiwa monitoring ...
Prospects of Source-Separation-Based Sanitation Concepts: A Model-Based Study
Directory of Open Access Journals (Sweden)
Cees Buisman
2013-07-01
Separation of different domestic wastewater streams and targeted on-site treatment for resource recovery has been recognized as one of the most promising sanitation concepts for re-establishing the balance in carbon, nutrient and water cycles. In this study a model was developed based on literature data to compare energy and water balance, nutrient recovery, chemical use, effluent quality and land area requirement in four different sanitation concepts: (1) centralized; (2) centralized with source-separation of urine; (3) source-separation of black water, kitchen refuse and grey water; and (4) source-separation of urine, feces, kitchen refuse and grey water. The highest primary energy consumption of 914 MJ per capita (cap) per year was attained within the centralized sanitation concept, and the lowest primary energy consumption of 437 MJ/cap/year was attained within source-separation of urine, feces, kitchen refuse and grey water. Grey water bio-flocculation and subsequent grey water sludge co-digestion decreased the primary energy consumption, but coupling this with grey water effluent reuse was not energetically favorable. Source-separation of urine improved the energy balance, nutrient recovery and effluent quality, but required a larger land area and higher chemical use than the centralized concept.
Lesaffre, Emmanuel
2012-01-01
The growth of biostatistics has been phenomenal in recent years and has been marked by considerable technical innovation in both methodology and computational practicality. One area that has experienced significant growth is Bayesian methods. The growing use of Bayesian methodology has taken place partly due to an increasing number of practitioners valuing the Bayesian paradigm as matching that of scientific discovery. In addition, computational advances have allowed for more complex models to be fitted routinely to realistic data sets. Through examples, exercises and a combination of introd...
The single staged ECR source at the TRIUMF isotope separator TISOL
International Nuclear Information System (INIS)
With its installation at the isotope separator TISOL the single staged ECR source has now become operational in delivering radioactive species extracted from the production target which is bombarded by 500 MeV protons. Among the radioactive species detected so far are He, C, N, Ne, Cl, Ar, Kr and Xe. The dependency of the ion currents/efficiencies on several source parameters is discussed as well as the technical difficulties in connecting ECR sources to on-line isotope separators. (Author) (5 refs., tab., 6 figs.)
Applying the Background-Source separation algorithm to Chandra Deep Field South data
Guglielmetti, F; Fischer, R; Rosati, P; Tozzi, P
2012-01-01
A probabilistic two-component mixture model allows one to separate the diffuse background from the celestial sources within a one-step algorithm without data censoring. The background is modeled with a thin-plate spline combined with the satellite's exposure time. Source probability maps are created in a multi-resolution analysis for revealing faint and extended sources. All detected sources are automatically parametrized to produce a list of source positions, fluxes and morphological parameters. The present analysis is applied to the publicly released Chandra Deep Field South 2 Ms data. With its 1.884 Ms of exposure time and its angular resolution (0.984 arcsec), the Chandra Deep Field South data are particularly suited for testing the Background-Source separation algorithm.
Reverberant Audio Source Separation via Sparse and Low-Rank Modeling
Arberet, Simon; Vandergheynst, Pierre
2013-01-01
The performance of audio source separation from underdetermined convolutive mixtures assuming known mixing filters can be significantly improved by using an analysis sparse prior optimized by a reweighted l1 scheme and a wideband data-fidelity term, as demonstrated by a recent article. In this letter, we show that the performance can be improved even more significantly by exploiting a low-rank prior on the source spectrograms. We present a new algorithm to estimate the sources based on i) ...
Non-Stationary Brain Source Separation for Multi-Class Motor Imagery
Gouy-Pailler, Cedric; Congedo, Marco; Brunner, Clemens; Jutten, Christian; Pfurtscheller, Gert
2010-01-01
This article describes a method to recover task-related brain sources in the context of multi-class Brain-Computer Interfaces (BCIs) based on non-invasive electroencephalography (EEG). We extend the Joint Approximate Diagonalization (JAD) method for spatial filtering using a maximum likelihood framework. This generic formulation (1) bridges the gap between Common Spatial Patterns (CSP) and Blind Source Separation (BSS) of non-stationary sources, and (2) leads to ...
ICAR, a tool for Blind Source Separation using Fourth Order Statistics only
Albera, Laurent; Férreol, Anne; Chevalier, Pascal; Comon, Pierre
2005-01-01
The problem of blind separation of overdetermined mixtures of sources, that is, with fewer sources than (or as many sources as) sensors, is addressed in this paper. A new method, named ICAR (Independent Component Analysis using Redundancies in the quadricovariance), is proposed in order to process complex data. This method, without any whitening operation, only exploits some redundancies of a particular quadricovariance matrix of the data. Computer simulations demonstrate that ICAR offers in ...
Draper, D.
2001-01-01
© 2012 Springer Science+Business Media, LLC. All rights reserved. Article Outline: Glossary; Definition of the Subject and Introduction; The Bayesian Statistical Paradigm; Three Examples; Comparison with the Frequentist Statistical Paradigm; Future Directions; Bibliography
Bayesian mixture models for Poisson astronomical images
Guglielmetti, Fabrizia; Dose, Volker
2012-01-01
Astronomical images in the Poisson regime are typically characterized by a spatially varying cosmic background, a large variety of source morphologies and intensities, data incompleteness, steep gradients in the data, and few photon counts per pixel. The Background-Source separation technique is developed with the aim of detecting faint and extended sources in astronomical images characterized by Poisson statistics. The technique employs Bayesian mixture models to reliably detect the background as well as the sources with their respective uncertainties. Background estimation and source detection are achieved in a single algorithm. A large variety of source morphologies is revealed. The technique is applied in the X-ray part of the electromagnetic spectrum to ROSAT and Chandra data sets, and it is under a feasibility study for the forthcoming eROSITA mission.
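The two-component Poisson mixture at the heart of the Background-Source separation technique can be illustrated with a per-pixel posterior computation. The sketch below is a deliberately minimal stand-in: it assumes known, spatially constant background and source rates and an illustrative prior source fraction, whereas the actual technique models the background with a thin-plate spline and estimates all quantities jointly.

```python
import numpy as np

def source_probability(counts, bg_rate, src_rate, prior_src=0.1):
    """Posterior probability that each pixel hosts a source under a
    two-component Poisson mixture (background vs. source).

    bg_rate / src_rate are expected counts per pixel and prior_src is
    the prior source fraction -- all illustrative assumptions here."""
    counts = np.asarray(counts, dtype=float)
    # Poisson log-likelihoods; the log(k!) term cancels in the ratio
    ll_bg = counts * np.log(bg_rate) - bg_rate
    ll_src = counts * np.log(src_rate) - src_rate
    # Bayes' rule over the two mixture components
    log_odds = np.log(prior_src / (1.0 - prior_src)) + ll_src - ll_bg
    return 1.0 / (1.0 + np.exp(-log_odds))

# a pixel with 15 counts stands out against a background of ~1.5 counts
probs = source_probability([1, 2, 15, 0], bg_rate=1.5, src_rate=10.0)
```

Thresholding such a posterior map (or keeping it as a probability map, as in the multi-resolution analysis above) gives source detection and background estimation in one step.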
Municipal solid waste source-separated collection in China: A comparative analysis
International Nuclear Information System (INIS)
A pilot program focusing on municipal solid waste (MSW) source-separated collection was launched in eight major cities throughout China in 2000. Detailed investigations were carried out and a comprehensive system was constructed to evaluate the effects of the eight-year implementation in those cities. This paper provides an overview of the different methods of collection, transportation, and treatment of MSW in the eight cities, as well as a comparative analysis of MSW source-separated collection in China. Information about the quantity and composition of MSW shows that the characteristics of MSW are similar: low calorific value, high moisture content and a high proportion of organics. Differences among the eight cities in municipal solid waste management (MSWM) are presented in this paper. Only Beijing and Shanghai demonstrated relatively effective results in the implementation of MSW source-separated collection, while the six remaining cities performed poorly. Considering the current status of MSWM, source-separated collection should be a key priority, and a wider range of cities should participate in this program instead of merely the eight pilot cities. It is evident that an integrated MSWM system is urgently needed. Kitchen waste and recyclables are encouraged to be separated at the source. The stakeholders involved play an important role in MSWM, so their responsibilities should be clearly identified. Improvement in legislation, coordination mechanisms and public education are problematic issues that need to be addressed.
Rifai Chai; Naik, Ganesh R; Tran, Yvonne; Sai Ho Ling; Craig, Ashley; Nguyen, Hung T
2015-08-01
An electroencephalography (EEG)-based countermeasure device could be used for fatigue detection during driving. This paper explores the classification of fatigue and alert states using power spectral density (PSD) as a feature extractor and a fuzzy swarm-based artificial neural network (ANN) as a classifier. Independent component analysis with entropy rate bound minimization (ICA-ERBM) is investigated as a novel source separation technique for fatigue classification using EEG analysis. A comparison of the classification accuracy with and without the source separator is presented. Classification performance based on 43 participants without the source separator resulted in an overall sensitivity of 71.67%, a specificity of 75.63% and an accuracy of 73.65%. However, these results were improved after the inclusion of the source separator module, resulting in an overall sensitivity of 78.16%, a specificity of 79.60% and an accuracy of 78.88% (p < 0.05). PMID:26736312
Bao, Le; Raftery, Adrian E.; Reddy, Amala
2015-01-01
In most countries in the world outside of sub-Saharan Africa, HIV is largely concentrated in sub-populations whose behavior puts them at higher risk of contracting and transmitting HIV, such as people who inject drugs, sex workers and men who have sex with men. Estimating the size of these sub-populations is important for assessing overall HIV prevalence and designing effective interventions. We present a Bayesian hierarchical model for estimating the sizes of local and national HIV key affec...
Gran, Bjørn Axel
2002-01-01
The objective of the research has been to investigate the possibility of transferring the requirements of a software safety standard into Bayesian belief networks (BBNs). The BBN methodology has mainly been developed and applied in the AI community, but more recently it has been proposed for the assessment of programmable systems. The relation to AI applications is relevant in the sense that the method reflects an assessor's way of thinking during the assessment process. Conceptually,...
Institute of Scientific and Technical Information of China (English)
Yunhan Luo; Houxin Cui; Xiaoyu Gu; Rong Liu; Kexin Xu
2005-01-01
Based on an analysis of the relation between mean penetration depth and source-detector separation in a three-layer model using Monte-Carlo simulation, an optimal source-detector separation is derived from the mean penetration depth with reference to monitoring the change in chromophore concentration of the sandwiched layer. In order to verify the separation, we perform Monte-Carlo simulations with varied absorption coefficients of the sandwiched layer. All these diffuse reflectances are used to construct a calibration model with the method of partial least squares (PLS). High correlation coefficients and a low root mean square error of prediction (RMSEP) at the optimal separation confirm the correctness of the selection. This technique is expected to shed light on noninvasive diagnosis with near-infrared spectroscopy.
Life cycle assessment of grain production using source-separated human urine and mineral fertiliser
Tidåker, Pernilla
2003-01-01
Source-separation of human urine is one promising technique for closing the nutrient cycle, reducing nutrient discharge and increasing energy efficiency. Separated urine can be used as a valuable fertiliser in agriculture, replacing mineral fertiliser. However, a proper handling of the urine at farm level is crucial for the environmental performance of the whole system. This study started from an agricultural point of view, demonstrating how grain production systems using human urine might be...
Subband-based Single-channel Source Separation of Instantaneous Audio Mixtures
Taghia, Jalil; Doostari, Mohammad Ali
2009-01-01
In this paper, a new algorithm is developed to separate the audio sources from a single instantaneous mixture. The algorithm is based on subband decomposition and uses a hybrid system of Empirical Mode Decomposition (EMD) and Principal Component Analysis (PCA) to construct artificial observations from the single mixture. In the separation stage of the algorithm, we use Independent Component Analysis (ICA) to find independent components. At first the observed mixture is divided into a finite numbe...
DEFF Research Database (Denmark)
Larsen, Anna Warberg; Astrup, Thomas
2011-01-01
CO2-loads from combustible waste are important inputs for national CO2 inventories and life-cycle assessments (LCA). CO2 emissions from waste incinerators are often expressed by emission factors in kg fossil CO2 emitted per GJ energy content of the waste. Various studies have shown considerable variations between emission factors for different incinerators, but the background for these variations has not been thoroughly examined. One important reason may be variations in collection of recyclable materials, as source separation alters the composition of the residual waste incinerated. The objective of this study was to quantify the importance of source separation for determination of emission factors for incineration of residual household waste. This was done by mimicking various source separation scenarios and, based on waste composition data, calculating resulting emission factors for residual...
Blind source separation of ship-radiated noise based on generalized Gaussian model
Institute of Scientific and Technical Information of China (English)
Kong Wei; Yang Bin
2006-01-01
When the distribution of the sources cannot be estimated accurately, ICA algorithms fail to separate the mixtures blindly. The generalized Gaussian model (GGM) is introduced into the ICA algorithm since it can easily model the non-Gaussian statistical structure of different source signals. By inferring only one parameter, a wide class of statistical distributions can be characterized. Using the maximum likelihood (ML) approach and natural gradient descent, learning rules for blind source separation (BSS) based on the GGM are presented. An experiment with ship-radiated noise demonstrates that the GGM can model the distributions of ship-radiated noise and sea noise efficiently, and that the GGM-based learning rules give more successful separation results than several conventional methods such as higher-order cumulants and Gaussian mixture density functions.
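A minimal sketch of ML blind source separation with a natural-gradient update may help fix ideas. The score function below, phi(y) = sign(y)|y|^(beta-1), is the generalized-Gaussian score up to scale; the fixed beta, step size, iteration count and toy sources are illustrative assumptions, not values from the paper (which infers the shape parameter for each source).

```python
import numpy as np

def natural_gradient_bss(X, beta=4.0, mu=0.01, iters=2000):
    """Natural-gradient ML blind source separation with a
    generalized-Gaussian score phi(y) = sign(y)|y|^(beta-1).
    beta is fixed here for illustration (beta > 2 suits
    sub-Gaussian sources, beta < 2 super-Gaussian ones)."""
    n, T = X.shape
    W = np.eye(n)
    for _ in range(iters):
        Y = W @ X
        phi = np.sign(Y) * np.abs(Y) ** (beta - 1.0)
        # natural-gradient update: W <- W + mu (I - E[phi(y) y^T]) W
        W += mu * (np.eye(n) - (phi @ Y.T) / T) @ W
    return W

# toy demo: two uniform (sub-Gaussian) sources and a fixed mixing matrix
rng = np.random.default_rng(1)
S = rng.uniform(-1.0, 1.0, size=(2, 5000))
A = np.array([[1.0, 0.6], [0.4, 1.0]])
W = natural_gradient_bss(A @ S)
G = W @ A  # close to a scaled permutation when separation succeeds
```

The global matrix G = WA approaching a scaled permutation is the usual success criterion for such learning rules.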
Semi-blind Source Separation Using Head-Related Transfer Functions
DEFF Research Database (Denmark)
Pedersen, Michael Syskind; Hansen, Lars Kai; Kjems, Ulrik; Rasmussen, Karsten Bo
An online blind source separation algorithm, a special case of the geometric algorithm by Parra and Fancourt, has been implemented for the purpose of separating sounds recorded at microphones placed at each side of the head. By using the assumption that the positions of the two sounds are known, incorporating head-related transfer functions improves the separation by approximately 1 dB compared to when free-field propagation is assumed. This indicates that the permutation ambiguity is resolved more accurately than when free-field propagation is assumed.
FPGA-based real-time blind source separation with principal component analysis
Wilson, Matthew; Meyer-Baese, Uwe
2015-05-01
Principal component analysis (PCA) is a popular technique in reducing the dimension of a large data set so that more informed conclusions can be made about the relationship between the values in the data set. Blind source separation (BSS) is one of the many applications of PCA, where it is used to separate linearly mixed signals into their source signals. This project attempts to implement a BSS system in hardware. Due to unique characteristics of hardware implementation, the Generalized Hebbian Algorithm (GHA), a learning network model, is used. The FPGA used to compile and test the system is the Altera Cyclone III EP3C120F780I7.
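The Generalized Hebbian Algorithm mentioned above is simple enough to sketch in software before committing it to an FPGA. The following is an illustrative NumPy version of Sanger's rule with assumed learning-rate and epoch settings; a hardware implementation such as the one described would use fixed-point arithmetic instead.

```python
import numpy as np

def gha(X, n_components, eta=0.002, epochs=50, seed=0):
    """Sanger's Generalized Hebbian Algorithm: learns the leading
    principal components of zero-mean data X (samples x features)
    with online updates. Learning rate and epochs are illustrative."""
    rng = np.random.default_rng(seed)
    W = 0.1 * rng.standard_normal((n_components, X.shape[1]))
    for _ in range(epochs):
        for x in X:
            y = W @ x
            # Hebbian term minus lower-triangular decorrelation term
            W += eta * (np.outer(y, x) - np.tril(np.outer(y, y)) @ W)
    return W

# toy data: 2-D Gaussian cloud elongated along the first axis
rng = np.random.default_rng(1)
X = rng.standard_normal((2000, 2)) * np.array([3.0, 0.5])
W = gha(X, n_components=1)
```

The learned row converges to (approximately) the unit-norm leading eigenvector of the data covariance, which is what makes the rule attractive for streaming, hardware-friendly PCA.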
A treatment of EEG data by underdetermined blind source separation for motor imagery classification
Czech Academy of Sciences Publication Activity Database
Koldovský, Zbyněk; Phan, A. H.; Tichavský, Petr; Cichocki, A.
Bucharest: EURASIP, 2012, s. 1484-1488. ISBN 978-1-4673-1068-0. ISSN 2076-1465. [20th European Signal Processing Conference (EUSIPCO 2012). Bukurešť (RO), 27.08.2012-31.08.2012] Grant ostatní: GA ČR(CZ) GAP103/11/1947 Institutional support: RVO:67985556 Keywords : electroencephalogram * brain-computer Interface * underdetermined blind source separation Subject RIV: FH - Neurology http://library.utia.cas.cz/separaty/2012/SI/tichavsky-a treatment of eeg data by underdetermined blind source separation for motor imagery classification.pdf
Institute of Scientific and Technical Information of China (English)
黄晋英; 潘宏侠; 毕世华; 杨喜旺
2008-01-01
Blind source separation (BSS) technology was applied to vibration signal processing of a gearbox for separating different fault vibration sources and enhancing fault information. An improved BSS algorithm based on particle swarm optimization (PSO) is proposed. It changes the traditional de-noising-based approach to fault enhancement, and it addresses the practical difficulties of fault location and low fault diagnosis rates in the early stage. It was applied to the vibration signal of a gearbox under three working states. The results prove that BSS greatly enhances fault information and supplies a technological method for the diagnosis of weak faults.
Blind Source Separation with Conjugate Gradient Algorithm and Kurtosis Maximization Criterion
Directory of Open Access Journals (Sweden)
Sanjeev N Jain
2016-02-01
Blind source separation (BSS) is a technique for estimating individual source components from their mixtures at multiple sensors. It is called blind because no information beyond the mixtures themselves is used. Recently, blind source separation has received attention because of its potential applications in signal processing, such as speech recognition systems, telecommunications and medical signal processing. Blind source separation of super- and sub-Gaussian signals is proposed here, utilizing a conjugate gradient algorithm and a kurtosis maximization criterion. In our previous paper, the ABC algorithm was applied to blind source separation; here, we improve the technique with changes to the fitness function and the scout bee phase. The fitness function is improved with the use of the kurtosis maximization criterion, and the scout bee phase is improved with the use of the conjugate gradient algorithm. The evaluation metrics used for performance evaluation are fitness function values and distance values. A comparative analysis is also carried out against other prominent techniques. The technique achieved an average distance of 38.39, an average fitness value of 6.94, an average Gaussian distance of 58.60 and an average Gaussian fitness of 5.02. It attained the lowest average distance value among all techniques and good values for all other evaluation metrics, which shows the effectiveness of the proposed technique.
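For readers unfamiliar with kurtosis-driven separation, the sketch below shows a standard deflationary fixed-point iteration on the kurtosis contrast after whitening. It is a plain baseline standing in for the paper's conjugate-gradient/swarm optimizer; the toy sources and all parameter values are illustrative assumptions.

```python
import numpy as np

def whiten(X):
    """Zero-mean, unit-covariance transform of X (sources x samples)."""
    X = X - X.mean(axis=1, keepdims=True)
    d, E = np.linalg.eigh(X @ X.T / X.shape[1])
    return E @ np.diag(d ** -0.5) @ E.T @ X

def kurtosis_bss(X, iters=100, seed=0):
    """Deflationary separation driven by the kurtosis contrast, via the
    classical fixed-point update w <- E[z (w^T z)^3] - 3w."""
    Z = whiten(X)
    n = Z.shape[0]
    rng = np.random.default_rng(seed)
    W = np.zeros((n, n))
    for i in range(n):
        w = rng.standard_normal(n)
        w /= np.linalg.norm(w)
        for _ in range(iters):
            w = (Z * (w @ Z) ** 3).mean(axis=1) - 3.0 * w
            w -= W[:i].T @ (W[:i] @ w)  # deflation: stay orthogonal
            w /= np.linalg.norm(w)
        W[i] = w
    return W @ Z  # estimated sources (up to sign and permutation)

# toy mixture of one sub-Gaussian and one super-Gaussian source
rng = np.random.default_rng(2)
S = np.vstack([np.sign(rng.standard_normal(4000)),
               rng.laplace(size=4000)])
Y = kurtosis_bss(np.array([[1.0, 0.7], [0.3, 1.0]]) @ S)
```

Because the contrast is |kurtosis|, the same iteration handles super- and sub-Gaussian sources, matching the mixed-kurtosis setting the paper targets.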
Separation of beam and electrons in the spallation neutron source H- ion source
International Nuclear Information System (INIS)
The Spallation Neutron Source (SNS) requires an ion source producing an H- beam with a peak current of 35 mA at a 6.2% duty factor. For the design of this ion source, extracted electrons must be transported and dumped without adversely affecting the H- beam optics. Two issues are considered: (1) electron containment, transport and controlled removal; and (2) first-order H- beam steering. For electron containment, various magnetic, geometric and electrode biasing configurations are analyzed. A kinetic description for the negative ions and electrons is employed with self-consistent fields obtained from a steady-state solution to Poisson's equation. Guiding center electron trajectories are used when the gyroradius is sufficiently small. The magnetic fields used to control the transport of the electrons and the asymmetric sheath produced by the gyrating electrons steer the ion beam. Scenarios for correcting this steering by split acceleration and focusing electrodes will be considered in some detail.
Multichannel audio signal source separation based on an Interchannel Loudness Vector Sum
Park, Taejin; Lee, Taejin
2015-01-01
In this paper, a Blind Source Separation (BSS) algorithm for multichannel audio contents is proposed. Unlike common BSS algorithms targeting stereo audio contents or microphone array signals, our technique is targeted at multichannel audio such as 5.1ch and 7.1ch audio. Since most multichannel audio object sources are panned using the Inter-channel Loudness Difference (ILD), we employ the Inter-channel Loudness Vector Sum (ILVS) concept to cluster common signals (such as background music) from ...
On-line isotope separation. Tests for targets and ion sources compatibility
International Nuclear Information System (INIS)
We have performed a compilation of the influence of various parameters on suitable targets (composition, structure and nuclear constraints) for fission and spallation reactions induced by charged particles. In that case, targets are generally located near or inside the ionization chamber. A survey of typical ion sources and separators, particularly those used with heavy-ion beams, is given. These sources are often fed either by a helium-jet transport system or by a catcher foil.
Carabias Orti, Julio J; Cobos, Máximo; Vera Candeas, Pedro; Rodríguez Serrano, Francisco J
2013-01-01
Close-microphone techniques are extensively employed in many live music recordings, allowing for interference rejection and reducing the amount of reverberation in the resulting instrument tracks. However, despite the use of directional microphones, the recorded tracks are not completely free from source interference, a problem which is commonly known as microphone leakage. While source separation methods are potentially a solution to this problem, few approaches take into account the huge am...
Ahuja, Chaitanya; Nathwani, Karan; Rajesh M. Hegde
2014-01-01
Conventional NMF methods for source separation factorize the matrix of spectral magnitudes; spectral phase is not included in the decomposition process. However, the phase of the speech mixture is generally used in reconstructing the target speech signal, which leaves undesired traces of interfering sources in the target signal. In this paper the spectral phase is incorporated into the decomposition process itself. Additionally, the complex matrix factorization problem is reduce...
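As background, the conventional magnitude-only NMF that such methods extend can be sketched with standard Lee-Seung multiplicative updates under the Euclidean cost. The rank, iteration count and toy data below are illustrative assumptions; the paper's contribution, incorporating phase via complex matrix factorization, is not reproduced here.

```python
import numpy as np

def nmf(V, rank, iters=500, seed=0):
    """Multiplicative-update NMF under the Euclidean cost: V ~= W @ H
    with all factors non-negative (Lee-Seung updates)."""
    rng = np.random.default_rng(seed)
    W = rng.uniform(0.1, 1.0, (V.shape[0], rank))
    H = rng.uniform(0.1, 1.0, (rank, V.shape[1]))
    eps = 1e-9  # guards against division by zero
    for _ in range(iters):
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

# toy "spectrogram": two non-negative spectral templates with
# distinct activation patterns over 50 frames
rng = np.random.default_rng(3)
V = (np.outer([1.0, 0.0, 2.0, 0.0], rng.uniform(0.0, 1.0, 50))
     + np.outer([0.0, 3.0, 0.0, 1.0], rng.uniform(0.0, 1.0, 50)))
W, H = nmf(V, rank=2)
```

In separation use, each column of W acts as a source template and each row of H as its activation; the reconstruction W @ H carries no phase, which is exactly the limitation the abstract addresses.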
Directory of Open Access Journals (Sweden)
Takuya Isomura
2015-12-01
Blind source separation is the computation underlying the cocktail party effect: a partygoer can distinguish a particular talker's voice from the ambient noise. Early studies indicated that the brain might use blind source separation as a signal processing strategy for sensory perception, and numerous mathematical models have been proposed; however, it remains unclear how neural networks extract particular sources from a complex mixture of inputs. We discovered that neurons in cultures of dissociated rat cortical cells could learn to represent particular sources while filtering out other signals. Specifically, distinct classes of neurons in the culture learned to respond to the distinct sources after repeated training stimulation. Moreover, the neural network structures changed to reduce free energy, as predicted by the free-energy principle, a candidate unified theory of learning and memory, and by Jaynes' principle of maximum entropy. This implicit learning can only be explained by some form of Hebbian plasticity. These results are the first in vitro (as opposed to in silico) demonstration of neural networks performing blind source separation, and the first formal demonstration of neuronal self-organization under the free-energy principle.
Yuan, Yalin; Yabe, Mitsuyasu
2014-01-01
A source separation program for household kitchen waste has been in place in Beijing since 2010. However, the participation rate of residents is far from satisfactory. This study was carried out to identify residents' preferences based on an improved management strategy for household kitchen waste source separation. We determine the preferences of residents in an ad hoc sample, according to their age level, for source separation services and their marginal willingness to accept compensation for the service attributes. We used a multinomial logit model to analyze the data, collected from 394 residents in Haidian and Dongcheng districts of Beijing City through a choice experiment. The results show that there are differences in preferences for the service attributes among young, middle-aged, and older residents. Low compensation is not a major factor in persuading young and middle-aged residents to accept the proposed separation services. However, on average, most of them prefer services with the frequent-collection, evening and plastic-bag attributes, and without an instructor. This study indicates that there is potential for local government to improve the current separation services accordingly. PMID:25546279
Directory of Open Access Journals (Sweden)
Yalin Yuan
2014-12-01
A source separation program for household kitchen waste has been in place in Beijing since 2010. However, the participation rate of residents is far from satisfactory. This study was carried out to identify residents' preferences based on an improved management strategy for household kitchen waste source separation. We determine the preferences of residents in an ad hoc sample, according to their age level, for source separation services and their marginal willingness to accept compensation for the service attributes. We used a multinomial logit model to analyze the data, collected from 394 residents in Haidian and Dongcheng districts of Beijing City through a choice experiment. The results show that there are differences in preferences for the service attributes among young, middle-aged, and older residents. Low compensation is not a major factor in persuading young and middle-aged residents to accept the proposed separation services. However, on average, most of them prefer services with the frequent-collection, evening and plastic-bag attributes, and without an instructor. This study indicates that there is potential for local government to improve the current separation services accordingly.
Co-Parenting: Sharing Your Child Equally. A Source Book for the Separated or Divorced Family.
Galper, Miriam
This source book introduces perspectives and skills which can contribute to successful "co-parenting" (joint custody, joint parenting, co-custody or shared custody) of preadolescent children after parents are separated or divorced. Chapter One introduces the concept of co-parenting. Chapter Two advances an approach to developing flexible…
Resource recovery from source separated domestic waste(water) streams; Full scale results
Zeeman, G.; Kujawa, K.
2011-01-01
A major fraction of the nutrients emitted from households are originally present in only 1% of the total wastewater volume. New sanitation concepts enable the recovery and reuse of these nutrients from feces and urine. Two possible sanitation concepts are presented, with varying degrees of source separation.
Micropollutant removal in an algal treatment system fed with source separated wastewater streams
Wilt, de H.A.; Butkovskyi, A.; Tuantet, K.; Hernandez Leal, L.; Fernandes, T.; Langenhoff, A.A.M.; Zeeman, G.
2016-01-01
Micropollutant removal in an algal treatment system fed with source separated wastewater streams was studied. Batch experiments with the microalgae Chlorella sorokiniana grown on urine, anaerobically treated black water and synthetic urine were performed to assess the removal of six spiked pharmaceu
Blind Separation of Nonstationary Sources Based on Spatial Time-Frequency Distributions
Directory of Open Access Journals (Sweden)
Zhang Yimin
2006-01-01
Blind source separation (BSS) based on spatial time-frequency distributions (STFDs) provides improved performance over blind source separation methods based on second-order statistics when dealing with signals that are localized in the time-frequency (t-f) domain. In this paper, we propose the use of STFD matrices for both whitening and recovery of the mixing matrix, two stages commonly required in many BSS methods, to make BSS performance robust to noise. In addition, a simple method is proposed to select the auto- and cross-term regions of the time-frequency distribution (TFD). To further improve BSS performance, t-f grouping techniques are introduced to reduce the number of signals under consideration and to allow the receiver array to separate more sources than the number of array sensors, provided that the sources have disjoint t-f signatures. With the use of one or more of the techniques proposed in this paper, improved blind separation of nonstationary signals can be achieved.
ENVIRONMENTAL EFFICIENCY, SEPARABILITY AND ABATEMENT COSTS OF NON-POINT SOURCE POLLUTION
Wossink, Ada; Denaux, Zulal Sogutlu
2002-01-01
This paper presents a new framework for analyzing abatement costs of nonpoint-source pollution. Unlike previous studies, this framework treats production and pollution as non-separable and also recognizes that production inefficiency is a fundamental cause of pollution. The implications of this approach are illustrated using an empirical analysis for cotton producers.
RESEARCH OF QUANTUM GENETIC ALGORITHM AND ITS APPLICATION IN BLIND SOURCE SEPARATION
Institute of Scientific and Technical Information of China (English)
Yang Junan; Li Bin; Zhuang Zhenquan
2003-01-01
This letter proposes two algorithms: a novel Quantum Genetic Algorithm (QGA) based on an improvement of Han's Genetic Quantum Algorithm (GQA), and a new Blind Source Separation (BSS) method based on QGA and Independent Component Analysis (ICA). The simulation results show that the efficiency of the new BSS method is obviously higher than that of the Conventional Genetic Algorithm (CGA).
Blind separation of sources in nonlinear convolved mixture based on a novel network
Institute of Scientific and Technical Information of China (English)
胡英; 杨杰; 沈利
2004-01-01
Blind separation of independent sources from their nonlinear convolved mixtures is a more realistic problem than separation from linear ones. A solution to this problem based on the entropy maximization principle is presented. First we propose a novel two-layer network as the de-mixing system to separate sources in a nonlinear convolved mixture; in the output layer of the network we use a feedback architecture to cope with convolved mixtures. Then we derive learning algorithms for the two-layer network by maximizing the information entropy. Comparison of computer simulation results shows that the proposed algorithm achieves better blind separation of nonlinear convolved mixtures than H.H. Yang's algorithm.
DEFF Research Database (Denmark)
Naroznova, Irina; Møller, Jacob; Scheutz, Charlotte
2013-01-01
The environmental performance of two pretreatment technologies for source-separated organic waste was compared using life cycle assessment (LCA). An innovative pulping process where source-separated organic waste is pulped with cold water forming a volatile solid rich biopulp was compared to a more...... including a number of non-toxic and toxic impact categories were assessed. No big difference in the overall performance of the two technologies was observed. The difference for the separate life cycle steps was, however, more pronounced. More efficient material transfer in the scenario with waste pulping...... resulted in a higher biogas output and nutrient recovery and, thus, the higher impact savings related to biogas production and digest utilization. Meanwhile, larger reject amount in the scenario with screw press led to more savings obtained by utilization of the reject in this scenario....
Difficulties applying recent blind source separation techniques to EEG and MEG
Knuth, Kevin H
2015-01-01
High temporal resolution measurements of human brain activity can be performed by recording the electric potentials on the scalp surface (electroencephalography, EEG), or by recording the magnetic fields near the surface of the head (magnetoencephalography, MEG). The analysis of the data is problematic due to the fact that multiple neural generators may be simultaneously active and the potentials and magnetic fields from these sources are superimposed on the detectors. It is highly desirable to un-mix the data into signals representing the behaviors of the original individual generators. This general problem is called blind source separation and several recent techniques utilizing maximum entropy, minimum mutual information, and maximum likelihood estimation have been applied. These techniques have had much success in separating signals such as natural sounds or speech, but appear to be ineffective when applied to EEG or MEG signals. Many of these techniques implicitly assume that the source distributions hav...
Non-parametric Bayesian models of response function in dynamic image sequences
Czech Academy of Sciences Publication Activity Database
Tichý, Ondřej; Šmídl, Václav
-, - (2016). ISSN 1077-3142 R&D Projects: GA ČR GA13-29225S Institutional support: RVO:67985556 Keywords : Response function * Blind source separation * Dynamic medical imaging * Probabilistic models * Bayesian methods Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 1.540, year: 2014 http://library.utia.cas.cz/separaty/2016/AS/tichy-0456983.pdf
Public opinion about the source separation of municipal solid waste in Shanghai, China.
Zhang, Weiqian; Che, Yue; Yang, Kai; Ren, Xiangyu; Tai, Jun
2012-12-01
For decades the generation of municipal solid waste (MSW) in Shanghai has been increasing. Despite long-standing efforts at MSW management (MSWM), MSW disposal has performed poorly. Thus, an MSW minimisation plan for Shanghai was proposed in December 2010. In this study, direct face-to-face interviews and a structured questionnaire survey were used in four different Shanghai community types. We conducted an econometric analysis of the social factors that influence the willingness to pay for MSW separation and discussed the household waste characteristics, the daily waste generation and the current treatment of kitchen wastes. The results suggested that the respondents are environmentally aware of separation, but practise only minimal separation. Negative neighbour effects, confused classification of MSW, and mixed transportation and disposal are the dominant limitations of MSW source-separated collection. Most respondents are willing to pay for MSWM. Public support is influenced by household population, income and cost. The attitudes and behaviours of citizens are important for reducing the amount of MSW disposal by 50% per capita by 2020 (relative to 2010). Concerted efforts should be made to enlarge the pilot areas. In addition, the source separation of kitchen wastes should be promoted. PMID:23045226
Directory of Open Access Journals (Sweden)
Gang Tang
2016-06-01
In the condition monitoring of roller bearings, the measured signals are often compounded due to unknown multiple vibration sources and complex transfer paths, while the sensors are limited in both placement and number. Estimating the vibration sources is therefore an underdetermined blind source separation problem, which makes it difficult for ordinary methods to extract fault features accurately in running tests. To improve the effectiveness of compound fault diagnosis in roller bearings, this paper proposes a new method, based on variational mode decomposition, to solve the underdetermined problem and extract fault features. To overcome the shortcoming of inadequate signals collected through limited sensors, a vibration signal is first decomposed into a number of band-limited intrinsic mode functions by variational mode decomposition. The demodulated signal obtained with the Hilbert transform of these multi-channel functions is then used as the input matrix for independent component analysis. Finally, the compound faults are separated effectively by independent component analysis, which enables the fault features to be extracted more easily and identified more clearly. Experimental results validate the effectiveness of the proposed method in compound fault separation, and a comparison experiment shows that the proposed method has higher adaptability and practicability in separating strong noise signals than the commonly used ensemble empirical mode decomposition method.
Tang, Gang; Luo, Ganggang; Zhang, Weihua; Yang, Caijin; Wang, Huaqing
2016-01-01
In the condition monitoring of roller bearings, the measured signals are often compounded due to unknown multiple vibration sources and complex transfer paths, while the sensors are limited in both placement and number. Estimating the vibration sources is therefore an underdetermined blind source separation problem, which makes it difficult for ordinary methods to extract fault features accurately in running tests. To improve the effectiveness of compound fault diagnosis in roller bearings, this paper proposes a new method, based on variational mode decomposition, to solve the underdetermined problem and extract fault features. To overcome the shortcoming of inadequate signals collected through limited sensors, a vibration signal is first decomposed into a number of band-limited intrinsic mode functions by variational mode decomposition. The demodulated signal obtained with the Hilbert transform of these multi-channel functions is then used as the input matrix for independent component analysis. Finally, the compound faults are separated effectively by independent component analysis, which enables the fault features to be extracted more easily and identified more clearly. Experimental results validate the effectiveness of the proposed method in compound fault separation, and a comparison experiment shows that the proposed method has higher adaptability and practicability in separating strong noise signals than the commonly used ensemble empirical mode decomposition method. PMID:27322268
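The VMD-Hilbert-ICA pipeline described in this abstract can be sketched roughly as follows; since variational mode decomposition is not available in SciPy, a plain band-pass filter bank stands in for the VMD step here, and the two synthetic "fault" signals, band edges and sampling rate are all invented for illustration:

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
fs = 2000.0
t = np.arange(0, 2.0, 1 / fs)
# Two synthetic impulsive "fault" signatures at different carrier frequencies
s1 = np.sin(2 * np.pi * 120 * t) * (np.sin(2 * np.pi * 5 * t) > 0.9)
s2 = np.sin(2 * np.pi * 480 * t) * (np.sin(2 * np.pi * 7 * t) > 0.9)
x = s1 + s2 + 0.05 * rng.standard_normal(t.size)  # one compound channel

# Stand-in for VMD: split the single channel into band-limited "modes"
modes = []
for lo, hi in [(60, 200), (300, 600)]:
    sos = butter(4, [lo, hi], btype="bandpass", fs=fs, output="sos")
    modes.append(sosfiltfilt(sos, x))

# Demodulate each mode with the Hilbert envelope, then unmix with ICA
envelopes = np.abs(hilbert(np.vstack(modes), axis=1))
est = FastICA(n_components=2, random_state=0).fit_transform(envelopes.T)
print(est.shape)  # one separated envelope per column
```

The key idea the abstract exploits is that decomposing one channel into several band-limited functions manufactures the multiple "observations" that ICA needs, sidestepping the underdetermined single-sensor setting.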
Yan, Jun; Dong, Danan; Chen, Wen
2016-04-01
With the development of GNSS technology and the improvement of its positioning accuracy, observational data obtained by GNSS are widely used in space geodesy and geodynamics research. GNSS time series from observation stations contain a wealth of information, including geographic changes, deformation of the Earth, migration of subsurface material, instantaneous deformation, weak deformation and other blind signals. In order to extract the instantaneous underground deformation, weak deformation and other blind signals hidden in GNSS time series, we apply Independent Component Analysis (ICA) to daily station coordinate time series of the Southern California Integrated GPS Network. ICA is based on the statistical characteristics of the observed signal: it uses non-Gaussianity and independence to process the time series and obtain the source signals of the underlying geophysical events. As part of the post-processing of precise GNSS time-series data, this paper examines GNSS time series using the principal component analysis (PCA) module of QOCA and an ICA algorithm to separate the source signals. We then compare these two signal separation techniques, PCA and ICA, for separating the original signals related to geophysical disturbances from the observed signals. The analysis demonstrates that when multiple factors are present, PCA suffers from ambiguity in the separation of source signals, i.e. the attribution of the results is unclear, whereas ICA performs better; ICA is therefore the more suitable choice for GNSS time series in which the combination of signal sources is unknown.
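A minimal synthetic illustration of the PCA/ICA contrast described above (the "annual" and "transient" sources, the 8-station mixing and the noise level are assumptions for the sketch, not SCIGN data):

```python
import numpy as np
from sklearn.decomposition import PCA, FastICA

rng = np.random.default_rng(1)
t = np.arange(1000)
annual = np.sin(2 * np.pi * t / 365.25)          # seasonal signal
transient = np.zeros(t.size)
transient[400:430] = 1.0                          # short deformation episode

# Each "station" records its own mixture of the two sources plus noise
A = rng.standard_normal((8, 2))
X = np.outer(A[:, 0], annual) + np.outer(A[:, 1], transient)
X += 0.05 * rng.standard_normal(X.shape)

S_pca = PCA(n_components=2).fit_transform(X.T)    # orthogonal, variance-ordered
S_ica = FastICA(n_components=2, random_state=1).fit_transform(X.T)

def best_corr(S, s):
    # highest |correlation| of any estimated component with the true source
    return max(abs(np.corrcoef(S[:, k], s)[0, 1]) for k in range(S.shape[1]))

print(best_corr(S_pca, transient), best_corr(S_ica, transient))
```

Because the transient is strongly non-Gaussian, the ICA component typically tracks it almost perfectly, while the leading PCA components, constrained to be orthogonal variance directions, tend to mix it with the seasonal term.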
Blind Source Separation for Robot Audition using Fixed Beamforming with HRTFs
Maazaoui, Mounira; Grenier, Yves; Abed-Meraim, Karim
2011-01-01
We present a two stage blind source separation (BSS) algorithm for robot audition. The algorithm is based on a beamforming preprocessing and a BSS algorithm using a sparsity separation criterion. Before the BSS step, we filter the sensors outputs by beamforming filters to reduce the reverberation and the environmental noise. As we are in a robot audition context, the manifold of the sensor array in this case is hard to model, so we use pre-measured Head Related Transfer Functions (HRTFs) to e...
Wang, Fa-Yu; Chi, Chong-Yung; Chan, Tsung-Han; Wang, Yue
2010-05-01
Although significant efforts have been made in developing nonnegative blind source separation techniques, accurate separation of positive yet dependent sources remains a challenging task. In this paper, a joint correlation function of multiple signals is proposed to reveal and confirm that the observations after nonnegative mixing would have higher joint correlation than the original unknown sources. Accordingly, a new nonnegative least-correlated component analysis (n/LCA) method is proposed to design the unmixing matrix by minimizing the joint correlation function among the estimated nonnegative sources. In addition to a closed-form solution for unmixing two mixtures of two sources, the general algorithm of n/LCA for the multisource case is developed based on an iterative volume maximization (IVM) principle and linear programming. The source identifiability and required conditions are discussed and proven. The proposed n/LCA algorithm, denoted by n/LCA-IVM, is evaluated with both simulation data and real biomedical data to demonstrate its superior performance over several existing benchmark methods. PMID:20299711
Development of the high temperature ion-source for the Grenoble electromagnetic isotope separator
International Nuclear Information System (INIS)
The production of high-purity stable or radioactive isotopes (≥ 99.99 per cent) by electromagnetic separation requires equipment with a high resolving power. Besides, in order to collect rare or short half-life isotopes, the efficiency of the ion source must be high (η > 5 to 10 per cent). With this in view, the source built operates at high temperatures (2500-3000 C) and makes use of ionisation by electron bombardment or of thermo-ionisation. A summary is given in the first part of this work of the essential characteristics of isotope separator ion sources; a diagram of the principle of the source built is then given together with its characteristics. In the second part are given the values of the resolving power and of the efficiency of the Grenoble isotope separator fitted with such a source. The resolving power measured at 10 per cent of the peak height is of the order of 200. At the first magnetic stage the efficiency is between 1 and 26 per cent for a range of elements evaporating between 200 and 3000 C. Thus equipped, the separator has for example given, at the first stage, 10 mg of 180Hf at (99.69 ± 0.1) per cent, corresponding to an enrichment coefficient of 580; recently 2 mg of 150Nd at (99.996 ± 0.002) per cent, corresponding to an enrichment coefficient of 4.2 x 10^5, has been obtained at the second stage. (author)
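As a sanity check on the quoted figures, the enrichment coefficient can be read as the isotope-ratio gain between product and feed, K = (p/(1-p)) / (f/(1-f)), with p the product purity and f the natural abundance. Taking the standard natural abundance of 150Nd (about 5.6 per cent, an assumed value not stated in the abstract) reproduces the 4.2 x 10^5 quoted above:

```python
def enrichment(p, f):
    """Enrichment coefficient: product isotope ratio over feed isotope ratio."""
    return (p / (1 - p)) / (f / (1 - f))

# 150Nd collected at 99.996 % purity from ~5.6 % natural abundance
print(f"{enrichment(0.99996, 0.056):.3g}")  # ~4.2e+05
```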
International Nuclear Information System (INIS)
Secondary sources of uranium include materials from which it is uneconomical to extract it as the main product using currently available technologies. Such sources are generated as co-product or by-product of processing feed materials for products other than uranium. The secondary sources can include industrial solid or liquid streams in which uranium concentration may be low, but in view of large amounts of feed-stock, the quantity of uranium recoverable could be significant. Examples include sedimentary phosphates (as well as products derived therefrom), coal ash, niobium-tantalum slag and even sea water. Monazite is a phosphatic secondary source where uranium is obtainable as a by-product of production of rare earths and thorium. The term secondary source also includes solid residues, slag, scraps etc generated as a waste product of fuel fabrication facilities. It also includes contaminated sites and equipment from conventional uranium mills that need to be decontaminated and decommissioned. In some of the secondary sources, it is possible that the concentration of uranium can be fairly high, but the processing is constrained by the complexity of the host matrix or the chemical form of uranium or presence of other elements. Recovery of uranium from secondary sources is an eco-friendly process as it serves to isolate the uranium from the environment, and the future generations are thereby spared the burden of caring for such materials. It is possible that for many of the secondary sources, the concentration of uranium is below the safe limit set by currently applicable regulations. However the collective societal dose integrated over the long exposure times associated with the long half-life of uranium can be significant. In accordance with the 'as low as reasonably achievable' (ALARA) principle of radiation protection, uranium separation is desirable as a 'green' activity. This has been acknowledged in IAEA documents on the long term uranium supplies, which also
Bayesian analysis of exoplanet and binary orbits
Schulze-Hartung, Tim; Henning, Thomas
2012-01-01
We introduce BASE (Bayesian astrometric and spectroscopic exoplanet detection and characterisation tool), a novel program for the combined or separate Bayesian analysis of astrometric and radial-velocity measurements of potential exoplanet hosts and binary stars. The capabilities of BASE are demonstrated using all publicly available data of the binary Mizar A.
Directory of Open Access Journals (Sweden)
Y. Yokoo
2014-09-01
This study compared a time source hydrograph separation method to a geographic source separation method, to assess if the two methods produced similar results. The time source separation of a hydrograph was performed using a numerical filter method and the geographic source separation was performed using an end-member mixing analysis employing hourly discharge, electric conductivity, and turbidity data. These data were collected in 2006 at the Kuroiwa monitoring station on the Abukuma River, Japan. The results of the methods corresponded well in terms of both surface flow components and inter-flow components. In terms of the baseflow component, the result of the time source separation method corresponded with the moving average of the baseflow calculated by the geographic source separation method. These results suggest that the time source separation method is not only able to estimate numerical values for the discharge components, but that the estimates are also reasonable from a geographical viewpoint in the 3000 km2 watershed discussed in this study. The consistent results obtained using the time source and geographic source separation methods demonstrate that it is possible to characterize dominant runoff processes using hourly discharge data, thereby enhancing our capability to interpret the dominant runoff processes of a watershed using observed discharge data alone.
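One common choice of "numerical filter" for time-source separation is the Lyne-Hollick recursive digital filter; the sketch below applies it to a made-up hourly discharge series (the filter parameter, the number of passes and the hydrograph itself are illustrative assumptions, not necessarily the authors' exact setup):

```python
import numpy as np

def lyne_hollick(q, alpha=0.925, passes=3):
    """Recursive digital filter separating quickflow from baseflow."""
    qf = q.astype(float).copy()
    for p in range(passes):
        src = qf if p % 2 == 0 else qf[::-1]      # alternate filter direction
        out = np.zeros_like(src)
        for i in range(1, len(src)):
            out[i] = alpha * out[i - 1] + 0.5 * (1 + alpha) * (src[i] - src[i - 1])
        out = np.clip(out, 0.0, src)              # quickflow within [0, streamflow]
        qf = out if p % 2 == 0 else out[::-1]
    return qf, q - qf                             # (quickflow, baseflow)

# Synthetic hourly discharge: slow recession plus two storm peaks
t = np.arange(500.0)
q = 5 * np.exp(-t / 400) + 8 * np.exp(-((t - 100) / 10) ** 2) + 6 * np.exp(-((t - 300) / 12) ** 2)
quick, base = lyne_hollick(q)
print(base.min() >= 0, (base <= q).all())
```

The clipping step guarantees a physically sensible split: baseflow stays non-negative and never exceeds total streamflow at any time step.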
Blind source separation of multichannel electroencephalogram based on wavelet transform and ICA
Institute of Scientific and Technical Information of China (English)
You Rong-Yi; Chen Zhong
2005-01-01
A combination of the wavelet transform and independent component analysis (ICA) was employed for blind source separation (BSS) of multichannel electroencephalogram (EEG) signals. After denoising the original signals by discrete wavelet transform, high frequency components of some noises and artifacts were removed from the original signals. The denoised signals were then reconstructed for ICA, thereby effectively overcoming the drawback that ICA cannot distinguish noise from source signals. The practical processing results showed that this method is an effective approach to BSS of multichannel EEG. The method is in effect a combination of the wavelet transform with an adaptive neural network, so it is also useful for BSS of other complex signals.
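A rough sketch of the denoise-then-ICA scheme (synthetic four-channel data; the `db4` wavelet, the decomposition level and the universal soft threshold are common defaults assumed here, not necessarily the authors' choices):

```python
import numpy as np
import pywt
from sklearn.decomposition import FastICA

rng = np.random.default_rng(2)
n = 1024
t = np.linspace(0, 1, n, endpoint=False)
# Two synthetic "EEG-like" sources mixed into four noisy channels
S = np.vstack([np.sin(2 * np.pi * 10 * t), np.sign(np.sin(2 * np.pi * 3 * t))])
X = rng.standard_normal((4, 2)) @ S + 0.3 * rng.standard_normal((4, n))

def wavelet_denoise(x, wavelet="db4", level=4):
    coeffs = pywt.wavedec(x, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745          # robust noise estimate
    thr = sigma * np.sqrt(2 * np.log(len(x)))               # universal threshold
    coeffs = [coeffs[0]] + [pywt.threshold(c, thr, "soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(x)]

X_dn = np.vstack([wavelet_denoise(ch) for ch in X])          # denoise per channel
S_est = FastICA(n_components=2, random_state=0).fit_transform(X_dn.T)
print(S_est.shape)
```

Thresholding the detail coefficients removes broadband noise before unmixing, which is exactly the step that spares ICA from having to model the noise as extra sources.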
Frequency Domain Blind Source Separation for Robot Audition Using a Parameterized Sparsity Criterion
Abed-Meraim, Karim; Grenier, Y.; Maazaoui, Mounira
2011-01-01
In this paper, we introduce a modified lp norm blind source separation criterion based on source sparsity in the time-frequency domain. We study the effect of making the sparsity constraint harder through the optimization process, making the parameter p of the lp norm vary from 1 to nearly 0 according to a sigmoid function. The sigmoid introduces a smooth lp norm variation which avoids the divergence of the algorithm. We compared this algorithm to the regular l1 norm minimization and an IC...
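The hardening schedule can be sketched as a sigmoid-driven exponent (the steepness and midpoint are hypothetical parameters; the lp "norm" below is the usual sum of |x|^p used as a sparsity measure):

```python
import numpy as np

def p_schedule(n_iter, steepness=10.0, midpoint=0.5):
    """p decays smoothly from ~1 toward ~0, hardening the sparsity constraint."""
    u = np.linspace(0.0, 1.0, n_iter)
    return 1.0 / (1.0 + np.exp(steepness * (u - midpoint)))

def lp_measure(x, p):
    return np.sum(np.abs(x) ** p)

p = p_schedule(100)
x = np.array([0.0, 0.0, 1.0, 0.01])
# As p shrinks, the measure approaches a count of the non-zero entries
print(lp_measure(x, p[0]), lp_measure(x, p[-1]))
```

The smoothness matters: jumping straight to a tiny p makes the objective nearly piecewise-constant and the optimization unstable, which is the divergence the sigmoid schedule is meant to avoid.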
Czech Academy of Sciences Publication Activity Database
Šembera, Ondřej; Tichavský, Petr; Koldovský, Z.
Piscataway: IEEE, 2016, s. 4323-4327. ISBN 978-1-4799-9987-3. [IEEE International Conference on Acoustics, Speech, and Signal Processing 2016 (ICASSP2016). Shanghai (CN), 20.03.2016-25.03.2016] R&D Projects: GA ČR(CZ) GA14-13713S Institutional support: RVO:67985556 Keywords : Autoregressive Processes * Cramer-Rao Bound * Blind Source Separation Subject RIV: BB - Applied Statistics, Operational Research http://library.utia.cas.cz/separaty/2016/SI/tichavsky-0458485.pdf
Estimating International Tourism Demand to Spain Separately by the Major Source Markets
Marcos Alvarez-Díaz; Manuel González-Gómez; Mª Soledad Otero-Giraldez
2012-01-01
The objective of this paper is to estimate international tourism demand to Spain separately by major source markets (Germany, United Kingdom, France, Italy and The Netherlands) that represent 67% of the international tourism to Spain. In order to investigate how the tourism demand reacts to price and income changes, we apply the bounds testing approach to cointegration and construct confidence intervals using the bootstrap technique. The results show differences in tourism behavior depending ...
Blind Source Separation Based of Brain Computer Interface System: A review
Ahmed Kareem Abdullah; Zhang Chao Zhu
2014-01-01
This study reviews the origin and development of the Brain Computer Interface (BCI) system and focuses on BCI system design based on Blind Source Separation (BSS) techniques. The study also covers recent trends and discusses some new ideas for BSS techniques in BCI architecture; articles discussing BCI system development were analysed, and types of BCI systems and recent BCI designs were explored. Since 1970, when research on BCI systems began in the Californi...
Performance of Blind Source Separation Algorithms for FMRI Analysis using a Group ICA Method
Correa, Nicolle; Adali, Tülay; Calhoun, Vince D.
2006-01-01
Independent component analysis (ICA) is a popular blind source separation (BSS) technique that has proven to be promising for the analysis of functional magnetic resonance imaging (fMRI) data. A number of ICA approaches have been used for fMRI data analysis, and even more ICA algorithms exist; however, the impact of using different algorithms on the results is largely unexplored. In this paper, we study the performance of four major classes of algorithms for spatial ICA, namely information max...
Congedo, Marco; Gouy-Pailler, Cedric; Jutten, Christian
2008-01-01
Over the last ten years blind source separation (BSS) has become a prominent processing tool in the study of human electroencephalography (EEG). Without relying on head modeling, BSS aims at estimating both the waveform and the scalp spatial pattern of the intracranial dipolar currents responsible for the observed EEG. In this review we begin by placing the BSS linear instantaneous model of EEG within the framework of brain volume conduction theory. We then review the concept and current practic...
Industrial applications of extended output-only Blind Source Separation techniques
Rutten, Christophe; Nguyen, Viet Ha; Golinval, Jean-Claude
2011-01-01
In the field of structural health monitoring or machine condition monitoring, most vibration-based methods reported in the literature require measuring responses at several locations on the structure. In machine condition monitoring, the number of available vibration sensors is often small, and it is not unusual that only one single sensor is used to monitor a machine. This paper presents industrial applications of two possible extensions of output-only Blind Source Separation (BSS) techniqu...
RESEARCH OF QUANTUM GENETIC ALGORITH AND ITS APPLICATION IN BLIND SOURCE SEPARATION
Institute of Scientific and Technical Information of China (English)
Yang Junan; Li Bin; et al.
2003-01-01
This letter proposes two algorithms: a novel Quantum Genetic Algorithm (QGA) based on the improvement of Han's Genetic Quantum Algorithm (GQA) and a new Blind Source Separation (BSS) method based on QGA and Independent Component Analysis (ICA). The simulation result shows that the efficiency of the new BSS method is markedly higher than that of the Conventional Genetic Algorithm (CGA).
Kirstein, Roland
2005-01-01
This paper presents a modification of the inspection game: the 'Bayesian Monitoring' model rests on the assumption that judges are interested in enforcing compliant behavior and making correct decisions. They may base their judgements on an informative but imperfect signal which can be generated costlessly. In the original inspection game, monitoring is costly and generates a perfectly informative signal. While the inspection game has only one mixed strategy equilibrium, three Perfect Bayesia...
Role of the source to building lateral separation distance in petroleum vapor intrusion
Verginelli, Iason; Capobianco, Oriana; Baciocchi, Renato
2016-06-01
The adoption of source to building separation distances to screen sites needing further field investigation is becoming common practice in evaluating the vapor intrusion pathway at sites contaminated by petroleum hydrocarbons. The screening criteria for the source to building vertical distance have been thoroughly investigated in the recent literature and are fully addressed in recent guidelines issued by ITRC and U.S. EPA. Conversely, owing to the lack of field and modeling studies, the source to building lateral distance has received relatively little attention. To address this issue, in this work we present a steady-state vapor intrusion analytical model incorporating piecewise first-order aerobic biodegradation limited by oxygen availability that accounts for lateral source to building separation. The developed model can be used to evaluate the role and relevance of lateral vapor attenuation and to provide a site-specific assessment of the lateral screening distances needed to attenuate vapor concentrations to risk-based values. The simulation outcomes were consistent with field data and 3-D numerical modeling results reported in previous studies and, for shallow sources, with the screening criteria recommended by U.S. EPA for the vertical separation distance. Indeed, although petroleum vapors can cover maximum lateral distances of up to 25-30 m, as highlighted by the comparison of model outputs with field evidence of vapor migration in the subsurface, simulation results from this new model indicate that, regardless of the source concentration and depth, lateral distances of 6 m and 7 m are sufficient to attenuate petroleum vapors below risk-based values for groundwater and soil sources, respectively. However, for deep sources (> 5 m) and for low to moderate source concentrations (benzene concentrations lower than 5 mg/L in groundwater and 0.5 mg/kg in soil) the above criteria were found extremely conservative as
Energy Technology Data Exchange (ETDEWEB)
Biollaz, S.; Ludwig, Ch.; Stucki, S. [Paul Scherrer Inst. (PSI), Villigen (Switzerland)
1999-08-01
A literature search was carried out to determine sources and speciation of heavy metals in MSW. A combination of thermal and mechanical separation techniques is necessary to achieve the required high degrees of metal separation. Metallic goods should be separated mechanically, chemically bound heavy metals by a thermal process. (author) 1 fig., 1 tab., 6 refs.
Zwart, Jonathan T L; Jarvis, Matt J
2015-01-01
Measuring radio source counts is critical for characterizing new extragalactic populations, brings a wealth of science within reach and will inform forecasts for SKA and its pathfinders. Yet there is currently great debate (and few measurements) about the behaviour of the 1.4-GHz counts in the microJy regime. One way to push the counts to these levels is via 'stacking', the covariance of a map with a catalogue at higher resolution and (often) a different wavelength. For the first time, we cast stacking in a fully Bayesian framework, applying it to (i) the SKADS simulation and (ii) VLA data stacked at the positions of sources from the VIDEO survey. In the former case, the algorithm recovers the counts correctly when applied to the catalogue, but is biased high when confusion comes into play. This needs to be accounted for in the analysis of data from any relatively low-resolution SKA pathfinders. For the latter case, the observed radio source counts remain flat below the 5-sigma level of 85 microJy as far as 4...
Resonance ionization laser ion sources for on-line isotope separators (invited)
International Nuclear Information System (INIS)
A Resonance Ionization Laser Ion Source (RILIS) is today considered an essential component of the majority of Isotope Separator On Line (ISOL) facilities; there are seven laser ion sources currently operational at ISOL facilities worldwide and several more are under development. The ionization mechanism is a highly element selective multi-step resonance photo-absorption process that requires a specifically tailored laser configuration for each chemical element. For some isotopes, isomer selective ionization may even be achieved by exploiting the differences in hyperfine structures of an atomic transition for different nuclear spin states. For many radioactive ion beam experiments, laser resonance ionization is the only means of achieving an acceptable level of beam purity without compromising isotope yield. Furthermore, by performing element selection at the location of the ion source, the propagation of unwanted radioactivity downstream of the target assembly is reduced. Whilst advances in laser technology have improved the performance and reliability of laser ion sources and broadened the range of suitable commercially available laser systems, many recent developments have focused rather on the laser/atom interaction region in the quest for increased selectivity and/or improved spectral resolution. Much of the progress in this area has been achieved by decoupling the laser ionization from competing ionization processes through the use of a laser/atom interaction region that is physically separated from the target chamber. A new application of gas catcher laser ion source technology promises to expand the capabilities of projectile fragmentation facilities through the conversion of otherwise discarded reaction fragments into high-purity low-energy ion beams. A summary of recent RILIS developments and the current status of laser ion sources worldwide is presented
Present status of singly charged ion ECR sources at the SARA on-line separator
International Nuclear Information System (INIS)
Various 2.45 GHz microwave electron cyclotron resonance (ECR) ion-sources designed with quartz tubes and without hexapole have been developed and tested for production, transport and focalization of singly-charged ions. A first on-line endeavour to separate radioactive isotopes in a He-jet coupled mode has been realized with a capillary skimmer ion-source injection system parallel to the source plasma axis. In order to improve the coupling of a ECR source with the He-jet system, a new compact metallic body ion-source with a skimmer-catcher injection arrangement perpendicular to the plasma has been designed. The layout of this new metallic ion-source is given. The ionization efficiencies have been measured as a function of gas pressure for a complete off-line regime with various support gases and for a dynamical regime induced with an He-jet injection simulating the subsequent on-line coupled mode conditions. (orig.)
PWC-ICA: A Method for Stationary Ordered Blind Source Separation with Application to EEG
Bigdely-Shamlo, Nima; Mullen, Tim; Robbins, Kay
2016-01-01
Independent component analysis (ICA) is a class of algorithms widely applied to separate sources in EEG data. Most ICA approaches use optimization criteria derived from temporal statistical independence and are invariant with respect to the actual ordering of individual observations. We propose a method of mapping real signals into a complex vector space that takes into account the temporal order of signals and enforces certain mixing stationarity constraints. The resulting procedure, which we call Pairwise Complex Independent Component Analysis (PWC-ICA), performs the ICA in a complex setting and then reinterprets the results in the original observation space. We examine the performance of our candidate approach relative to several existing ICA algorithms for the blind source separation (BSS) problem on both real and simulated EEG data. On simulated data, PWC-ICA is often capable of achieving a better solution to the BSS problem than AMICA, Extended Infomax, or FastICA. On real data, the dipole interpretations of the BSS solutions discovered by PWC-ICA are physically plausible, are competitive with existing ICA approaches, and may represent sources undiscovered by other ICA methods. In conjunction with this paper, the authors have released a MATLAB toolbox that performs PWC-ICA on real, vector-valued signals. PMID:27340397
Bessiere, Pierre; Ahuactzin, Juan Manuel; Mekhnacha, Kamel
2013-01-01
Probability as an Alternative to Boolean Logic. While logic is the mathematical foundation of rational reasoning and the fundamental principle of computing, it is restricted to problems where information is both complete and certain. However, many real-world problems, from financial investments to email filtering, are incomplete or uncertain in nature. Probability theory and Bayesian computing together provide an alternative framework to deal with incomplete and uncertain data. Decision-Making Tools and Methods for Incomplete and Uncertain Data. Emphasizing probability as an alternative to Boolean
Rey, Valentine; Rey, Christian
2016-01-01
This article deals with the computation of guaranteed lower bounds of the error in the framework of finite element (FE) and domain decomposition (DD) methods. In addition to a fully parallel computation, the proposed lower bounds separate the algebraic error (due to the use of a DD iterative solver) from the discretization error (due to the FE), which enables the steering of the iterative solver by the discretization error. These lower bounds are also used to improve the goal-oriented error estimation in a substructured context. Assessments on 2D static linear mechanics problems illustrate the relevance of the separation of sources of error and the lower bounds' independence from the substructuring. We also steer the iterative solver by an objective of precision on a quantity of interest. This strategy consists of a sequence of solves and takes advantage of adaptive remeshing and the recycling of search directions.
AN EME BLIND SOURCE SEPARATION ALGORITHM BASED ON GENERALIZED EXPONENTIAL FUNCTION
Institute of Scientific and Technical Information of China (English)
Miao Hao; Li Xiaodong; Tian Jing
2008-01-01
This letter investigates an improved blind source separation algorithm based on the Maximum Entropy (ME) criterion. The original ME algorithm chooses a fixed exponential or sigmoid function as the nonlinear mapping function, which cannot match the original signal very well. A parameter estimation method is employed in this letter to approximate the probability density function of any signal with a parameter-steered generalized exponential function. An improved learning rule and a natural gradient update formula for the unmixing matrix are also presented. The proposed algorithm can separate mixtures of super-Gaussian signals as well as mixtures of sub-Gaussian signals. Simulation experiments demonstrate the efficiency of the algorithm.
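The natural-gradient update for the unmixing matrix mentioned above has the standard form W ← W + μ(I − φ(Y)Yᵀ)W, where φ is the score function of the source model. A minimal sketch, with tanh standing in for the letter's parameter-steered generalized-exponential score (an assumption of this sketch, suitable for super-Gaussian sources):

```python
import numpy as np

rng = np.random.default_rng(0)
T = 20000
S = rng.laplace(size=(2, T))                  # super-Gaussian sources
A = np.array([[1.0, 0.6], [0.4, 1.0]])        # mixing matrix
X = A @ S                                     # observed mixtures

# Natural-gradient ME/Infomax rule: W <- W + mu * (I - phi(Y) Y^T / T) W.
# tanh plays the role of the generalized-exponential score function here.
W = np.eye(2)
mu = 0.05
for _ in range(400):
    Y = W @ X
    W = W + mu * (np.eye(2) - np.tanh(Y) @ Y.T / T) @ W

# After convergence, W @ A is close to a scaled permutation matrix,
# i.e. each output is dominated by one source.
P = np.abs(W @ A)
```

For sub-Gaussian mixtures one would swap in a score function matched to sub-Gaussian densities, which is exactly what the parameterized family in the letter automates.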
Dutta, R.; Wang, T.; Jonsson, S.
2014-12-01
A shallow magnitude 6.6 strike-slip earthquake occurred offshore west of the Fukuoka prefecture, Northern Kyushu Island, Japan, in 2005. We use InSAR stable point-target data and GPS to constrain the location and source parameters of the mainshock. We use a uniform slip model on a rectangular dislocation in a homogeneous elastic half-space and implement Bayesian estimation to obtain uncertainties for the derived model parameters. The offshore location of the earthquake makes the fault parameter estimation challenging, as the geodetic data only cover the area to the east of the earthquake. The marginal distributions of the source parameters show that the location of the western end of the fault is poorly constrained by the data, whereas the eastern end, located closer to the shore, is better resolved. We use a Gaussian a priori constraint on the moment magnitude (Mw 6.6) and on the location of the fault with respect to the aftershock distribution of the earthquake, and find the amount of fault slip to be in the range from 1 m to 1.3 m with decreasing probability. We propagate the fault model uncertainties and calculate the variability of Coulomb failure stress changes for the Kego fault, located directly below Fukuoka city, showing that the mainshock increased stress on the fault and brought it closer to failure, a concern for the Fukuoka city authorities and inhabitants.
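Bayesian estimation of a slip parameter under a Gaussian a priori constraint can be sketched with a toy one-dimensional forward model and a random-walk Metropolis sampler. All numbers below are invented for illustration; this is not the paper's InSAR/GPS dislocation model:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy forward model: surface displacement d = g * slip + noise.
g = 2.0                     # illustrative Green's-function scalar
true_slip = 1.15            # metres (illustrative)
sigma_d = 0.2
d = g * true_slip + rng.normal(0, sigma_d, size=50)

prior_mean, prior_sd = 1.15, 0.3    # Gaussian a priori constraint on slip

def log_post(u):
    ll = -0.5 * np.sum((d - g * u) ** 2) / sigma_d**2   # Gaussian likelihood
    lp = -0.5 * (u - prior_mean) ** 2 / prior_sd**2     # Gaussian prior
    return ll + lp

# Random-walk Metropolis sampling of the posterior over slip.
u, samples = 1.0, []
for _ in range(20000):
    prop = u + rng.normal(0, 0.05)
    if np.log(rng.uniform()) < log_post(prop) - log_post(u):
        u = prop
    samples.append(u)
post = np.array(samples[5000:])     # discard burn-in
```

The retained samples approximate the marginal posterior of slip, from which uncertainties (and derived quantities such as Coulomb stress changes) can be propagated.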
Bayesian demography 250 years after Bayes.
Bijak, Jakub; Bryant, John
2016-01-01
Bayesian statistics offers an alternative to classical (frequentist) statistics. It is distinguished by its use of probability distributions to describe uncertain quantities, which leads to elegant solutions to many difficult statistical problems. Although Bayesian demography, like Bayesian statistics more generally, is around 250 years old, only recently has it begun to flourish. The aim of this paper is to review the achievements of Bayesian demography, address some misconceptions, and make the case for wider use of Bayesian methods in population studies. We focus on three applications: demographic forecasts, limited data, and highly structured or complex models. The key advantages of Bayesian methods are the ability to integrate information from multiple sources and to describe uncertainty coherently. Bayesian methods also allow for including additional (prior) information next to the data sample. As such, Bayesian approaches are complementary to many traditional methods, which can be productively re-expressed in Bayesian terms. PMID:26902889
Variational Blind Source Separation Toolbox and its Application to Hyperspectral Image Data
Czech Academy of Sciences Publication Activity Database
Tichý, Ondřej; Šmídl, Václav
Piscataway: IEEE Computer Society, 2015, s. 1336-1340. ISBN 978-0-9928626-4-0. ISSN 2076-1465. [23rd European Signal Processing Conference (EUSIPCO). Nice (FR), 31.08.2015-04.09.2015] R&D Projects: GA ČR GA13-29225S Institutional support: RVO:67985556 Keywords : Blind source separation * Variational Bayes method * Sparse prior * Hyperspectral image Subject RIV: BB - Applied Statistics, Operational Research http://library.utia.cas.cz/separaty/2015/AS/tichy-0447094.pdf
Blind Source Separation in Farsi Language by Using Hermitian Angle in Convolutive Environment
Directory of Open Access Journals (Sweden)
Atefeh Soltani
2013-04-01
This paper presents a T-F masking method for convolutive blind source separation based on the Hermitian angle concept. The Hermitian angle is calculated between the T-F-domain mixture vector and a reference vector. Two different reference vectors are assumed for calculating two different Hermitian angles, and these angles are then clustered with the k-means or FCM method to estimate unmixing masks. The well-known permutation problem is solved by k-means clustering of the estimated masks, which are partitioned into small groups. The experimental results show an improvement in performance when using two different reference vectors compared to only one.
Time-domain Blind Audio Source Separation Using Advanced Component Clustering and Reconstruction
Czech Academy of Sciences Publication Activity Database
Koldovský, Zbyněk; Tichavský, Petr
Trento: IEEE, 2008, s. 216-219. ISBN 978-1-4244-2337-8; ISBN 978-1-4244-2338-5. [Hands-free Speech Communication and Microphone Arrays 2008. Trento (IT), 06.05.2008-08.05.2008] R&D Projects: GA MŠk 1M0572 Grant ostatní: GA ČR(CZ) GP102/07/P384 Institutional research plan: CEZ:AV0Z10750506 Keywords : blind source separation * audio signals Subject RIV: BI - Acoustics
Extension of EFICA Algorithm for Blind Separation of Piecewise Stationary Non-Gaussian Sources
Czech Academy of Sciences Publication Activity Database
Koldovský, Zbyněk; Málek, J.; Tichavský, Petr; Deville, Y.; Hosseini, S.
Bryan: Conference Management Services, 2008, s. 1913-1916. ISBN 978-1-4244-1483-3; ISBN 1-4244-1484-9. [ICASSP 2008, IEEE International Conference on Acoustics, Speech and Signal Processing. Las Vegas (US), 30.03.2008-04.04.2008] R&D Projects: GA MŠk 1M0572 Grant ostatní: GA ČR(CZ) GP102/07/P384 Institutional research plan: CEZ:AV0Z10750506 Keywords : independent component analysis * piecewise stationary signals * blind source separation Subject RIV: BB - Applied Statistics, Operational Research
Kinetic Modeling of the Dynamic PET Brain Data Using Blind Source Separation Methods
Czech Academy of Sciences Publication Activity Database
Tichý, Ondřej; Šmídl, Václav
Dalian, China: IEEE press, 2014, s. 244-249. ISBN 978-1-4799-5837-5. [The 2014 7th International Conference on BioMedical Engineering and Informatics. Dalian (CN), 14.10.2014-16.10.2014] R&D Projects: GA ČR GA13-29225S Institutional support: RVO:67985556 Keywords : blind source separation * dynamic PET * input function * deconvolution Subject RIV: BB - Applied Statistics, Operational Research http://library.utia.cas.cz/separaty/2014/AS/tichy-0433424.pdf
Generic Uniqueness of a Structured Matrix Factorization and Applications in Blind Source Separation
Domanov, Ignat; Lathauwer, Lieven De
2016-06-01
Algebraic geometry, although little explored in signal processing, provides tools that are very convenient for investigating generic properties in a wide range of applications. Generic properties are properties that hold "almost everywhere". We present a set of conditions that are sufficient for demonstrating the generic uniqueness of a certain structured matrix factorization. This set of conditions may be used as a checklist for generic uniqueness in different settings. We discuss two particular applications in detail. We provide a relaxed generic uniqueness condition for joint matrix diagonalization that is relevant for independent component analysis in the underdetermined case. We present generic uniqueness conditions for a recently proposed class of deterministic blind source separation methods that rely on mild source models. For the interested reader we provide some intuition on how the results are connected to their algebraic geometric roots.
Thibodeau, C; Monette, F; Glaus, M; Laflamme, C B
2011-01-01
The black water and grey water source-separation sanitation system aims at efficient use of energy (biogas), water and nutrients, but currently lacks evidence of economic viability to be considered a credible alternative to the conventional system. This study intends to demonstrate economic viability, identify the main cost contributors and assess critical influencing factors. A techno-economic model was built based on a new neighbourhood in a Canadian context. Three implementation scales of the source-separation system are defined: 500, 5,000 and 50,000 inhabitants. The results show that the source-separation system is 33% to 118% more costly than the conventional system, with the larger cost differential obtained at the smaller implementation scales. A sensitivity analysis demonstrates that reducing the vacuum toilet flow from 1.0 to 0.25 L/flush decreases the source-separation system cost by 23 to 27%. It also shows that high resource costs can be beneficial or unfavourable to the source-separation system depending on whether the vacuum toilet flow is low or normal. Therefore, the future of this configuration of the source-separation system lies mainly in vacuum toilet flow reduction or the introduction of new efficient effluent volume reduction processes (e.g. reverse osmosis). PMID:22170836
Escolano, Jose; Xiang, Ning; Perez-Lorenzo, Jose M; Cobos, Maximo; Lopez, Jose J
2014-02-01
Sound source localization using a two-microphone array is an active area of research, with considerable potential for use in video conferencing, mobile devices, and robotics. Based on the observed time-differences of arrival between sound signals, a probability distribution of the location of the sources is considered to estimate the actual source positions. However, these algorithms assume a given number of sound sources. This paper describes an updated research account of the solution presented in Escolano et al. [J. Acoust. Soc. Am. 132(3), 1257-1260 (2012)], where nested sampling is used to explore a probability distribution of the source position using a Laplacian mixture model, which allows both the number and position of speech sources to be inferred. This paper presents different experimental setups and scenarios to demonstrate the viability of the proposed method, which is compared with some of the most popular sampling methods, demonstrating that nested sampling is an accurate tool for speech localization. PMID:25234883
Resource recovery from source separated domestic waste(water) streams; full scale results.
Zeeman, Grietje; Kujawa-Roeleveld, Katarzyna
2011-01-01
A major fraction of the nutrients emitted from households is originally present in only 1% of the total wastewater volume. New sanitation concepts enable the recovery and reuse of these nutrients from feces and urine. Two possible sanitation concepts are presented, with varying degrees of source separation leading to various recovery products. Separate vacuum collection and transport followed by anaerobic treatment of concentrated black water (BW), demonstrated at a scale of 32 houses, conserve 7.6 gN/p/d and 0.63 gP/p/d, amounting to 69 and 48%, respectively, of the theoretically produced N and P in the household, and 95% of the retained P was shown to be recoverable via struvite precipitation. Reuse of the anaerobic sludge in agriculture can substantially increase the P recovery. Energy recovery in the form of biogas from anaerobic digestion of concentrated BW fits well in new concepts of sustainable, zero-energy buildings. Nutrient recovery from separately collected urine lowers the percentage of nutrient recovery in comparison with BW but can, on the other hand, often be implemented in existing sanitation concepts. Theoretically, 11 gN/p/d and 1.0 gP/p/d are produced with urine, of which 38-63 and 34-61% were recovered in practice at scales of 8-160 inhabitants in Sweden. New sanitation concepts with resource recovery and reuse are being demonstrated worldwide and more and more experience is being gained. PMID:22105119
Separation of Radio-Frequency Sources and Localization of Partial Discharges in Noisy Environments
Directory of Open Access Journals (Sweden)
Guillermo Robles
2015-04-01
The detection of partial discharges (PD) can help in early-warning detection systems to protect critical assets in power systems. The radio-frequency emission of these events can be measured with antennas even while the equipment is in service, which reduces maintenance costs dramatically and favours the implementation of condition-based monitoring systems. The drawback of this type of measurement is the difficulty of having a reference signal with which to study the events in a classical phase-resolved partial discharge pattern (PRPD). Therefore, in open-air substations and overhead lines, where interferences from radio and TV broadcasting and mobile communications are important sources of noise and other pulsed interferences from rectifiers or inverters can be present, it is difficult to identify whether there is partial discharge activity or not. This paper proposes a robust method to separate the events captured with the antennas, identify which of them are partial discharges and localize the piece of equipment that is having problems. The separation is done with power ratio (PR) maps based on the spectral characteristics of the signal, and the identification of the type of event is done by localizing the source with an array of four antennas. Several classical methods to calculate the time differences of arrival (TDOA) of the emission at the antennas have been tested, and the localization is done using particle swarm optimization (PSO) to minimize a distance function.
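The classical TDOA estimate underlying such localization schemes is the lag of the cross-correlation peak between two antenna signals. A minimal sketch with a synthetic pulsed emission (sampling rate, pulse shape and delay are all illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)
fs = 1_000_000                        # 1 MHz sampling (illustrative)
n = 2048
true_delay = 17                       # relative delay in samples

# The same Gaussian pulse arrives at two antennas with a relative delay.
pulse = np.exp(-0.5 * ((np.arange(200) - 100) / 15.0) ** 2)
x1 = np.zeros(n); x1[500:700] = pulse
x2 = np.zeros(n); x2[500 + true_delay:700 + true_delay] = pulse
x1 += 0.05 * rng.normal(size=n)       # additive measurement noise
x2 += 0.05 * rng.normal(size=n)

# Classical TDOA estimate: lag of the cross-correlation peak.
xc = np.correlate(x2, x1, mode="full")
lag = xc.argmax() - (n - 1)           # zero lag sits at index n-1
tdoa = lag / fs                       # seconds
```

With four antennas, three such pairwise TDOAs constrain the source position, which is then refined by an optimizer such as PSO.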
Micropollutant removal in an algal treatment system fed with source separated wastewater streams.
de Wilt, Arnoud; Butkovskyi, Andrii; Tuantet, Kanjana; Leal, Lucia Hernandez; Fernandes, Tânia V; Langenhoff, Alette; Zeeman, Grietje
2016-03-01
Micropollutant removal in an algal treatment system fed with source separated wastewater streams was studied. Batch experiments with the microalgae Chlorella sorokiniana grown on urine, anaerobically treated black water and synthetic urine were performed to assess the removal of six spiked pharmaceuticals (diclofenac, ibuprofen, paracetamol, metoprolol, carbamazepine and trimethoprim). Additionally, incorporation of these pharmaceuticals and three estrogens (estrone, 17β-estradiol and ethinylestradiol) into algal biomass was studied. Biodegradation and photolysis led to 60-100% removal of diclofenac, ibuprofen, paracetamol and metoprolol. Removal of carbamazepine and trimethoprim was incomplete and did not exceed 30% and 60%, respectively. Sorption to algal biomass accounted for less than 20% of the micropollutant removal. Furthermore, the presence of micropollutants did not inhibit C. sorokiniana growth at applied concentrations. Algal treatment systems allow simultaneous removal of micropollutants and recovery of nutrients from source separated wastewater. Nutrient rich algal biomass can be harvested and applied as fertilizer in agriculture, as lower input of micropollutants to soil is achieved when algal biomass is applied as fertilizer instead of urine. PMID:26546707
Carabias-Orti, Julio J.; Cobos, Máximo; Vera-Candeas, Pedro; Rodríguez-Serrano, Francisco J.
2013-12-01
Close-microphone techniques are extensively employed in many live music recordings, allowing for interference rejection and reducing the amount of reverberation in the resulting instrument tracks. However, despite the use of directional microphones, the recorded tracks are not completely free from source interference, a problem which is commonly known as microphone leakage. While source separation methods are potentially a solution to this problem, few approaches take into account the huge amount of prior information available in this scenario. In fact, besides the special properties of close-microphone tracks, the knowledge on the number and type of instruments making up the mixture can also be successfully exploited for improved separation performance. In this paper, a nonnegative matrix factorization (NMF) method making use of all the above information is proposed. To this end, a set of instrument models are learnt from a training database and incorporated into a multichannel extension of the NMF algorithm. Several options to initialize the algorithm are suggested, exploring their performance in multiple music tracks and comparing the results to other state-of-the-art approaches.
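With the spectral basis learnt in advance from training data, the separation stage reduces to estimating activations with the basis held fixed. A minimal sketch of supervised NMF using the standard Euclidean multiplicative update (a generic single-channel formulation, not the paper's multichannel extension; the basis here is random rather than learnt):

```python
import numpy as np

rng = np.random.default_rng(4)

# Stand-in "learnt" spectral basis: one column per instrument template.
W = np.abs(rng.normal(size=(64, 3))) + 0.1
H_true = np.abs(rng.normal(size=(3, 100)))
V = W @ H_true                        # observed magnitude spectrogram

# Supervised NMF: keep W fixed, update activations H with the standard
# Euclidean multiplicative rule  H <- H * (W^T V) / (W^T W H).
H = np.full((3, 100), 0.5)
for _ in range(300):
    H *= (W.T @ V) / (W.T @ W @ H + 1e-12)

# Relative reconstruction error; each row of H is one instrument's
# activation, from which per-instrument spectrograms W[:, k:k+1] @ H[k:k+1]
# can be resynthesized.
err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
```

The multiplicative form keeps H nonnegative by construction, which is why it is the standard choice for NMF-based separation.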
Fate of pharmaceuticals in full-scale source separated sanitation system.
Butkovskyi, A; Hernandez Leal, L; Rijnaarts, H H M; Zeeman, G
2015-11-15
Removal of 14 pharmaceuticals and 3 of their transformation products was studied in a full-scale source separated sanitation system with separate collection and treatment of black water and grey water. Black water is treated in an up-flow anaerobic sludge blanket (UASB) reactor followed by oxygen-limited autotrophic nitrification-denitrification in a rotating biological contactor and struvite precipitation. Grey water is treated in an aerobic activated sludge process. Concentrations of 10 pharmaceuticals and 2 transformation products in black water ranged from the low μg/l to the low mg/l level. Additionally, 5 pharmaceuticals were also present in grey water in the low μg/l range. Pharmaceutical influent loads were distributed over the two streams: 70% of the diclofenac load was present in grey water, while the other compounds were predominantly associated with black water. Removal in the UASB reactor fed with black water exceeded 70% for 9 of the 12 pharmaceuticals detected, with only two pharmaceuticals removed by sorption to sludge. Ibuprofen and the transformation product of naproxen, desmethylnaproxen, were removed in the rotating biological contactor. In contrast, only paracetamol removal exceeded 90% in the grey water treatment system, while removal of the other 7 pharmaceuticals was below 40% or even negative. The efficiency of pharmaceutical removal in the source separated sanitation system was compared with removal in conventional sewage treatment plants. Furthermore, effluent concentrations of the black water and grey water treatment systems were compared with predicted no-effect concentrations to assess the toxicity of the effluent. Concentrations of diclofenac, ibuprofen and oxazepam in both effluents were higher than predicted no-effect concentrations, indicating the necessity of post-treatment. Ciprofloxacin, metoprolol and propranolol were found in UASB sludge in the μg/g range, while pharmaceutical concentrations in struvite did not exceed the detection limits. PMID:26364222
Directory of Open Access Journals (Sweden)
Valeriy Bekmuradov
2014-10-01
Production of biofuels such as ethanol from lignocellulosic biomass is a beneficial way to meet sustainability and energy security goals in the future. The main challenge in bioethanol conversion is the high cost of processing, in which enzymatic hydrolysis and fermentation are the major steps. Among the strategies to lower processing costs is utilizing both the glucose and xylose sugars present in biomass for conversion. An approach featuring enzymatic hydrolysis and fermentation steps, identified as separate hydrolysis and fermentation (SHF), was used in this work. The proposed solution is to use "pre-processing" technologies, including the thermal screw press (TSP) and cellulose-organic-solvent based lignocellulose fractionation (COSLIF) pretreatments. Such treatments were conducted on a widely available feedstock, source separated organic (SSO) waste, to liberate all sugars for use in the fermentation process. Enzymatic hydrolysis was performed with the addition of a commercially available enzyme, Accellerase 1500. On average, the sugar yield from the TSP and COSLIF pretreatments followed by enzymatic hydrolysis was a remarkable 90%. In this work, the effect of the SSO hydrolysate obtained from the COSLIF and enzymatic hydrolysis pretreatments on ethanol yields was evaluated by comparing fermentation results with two different recombinant strains: Zymomonas mobilis 8b and Saccharomyces cerevisiae DA2416. At 48 hours of fermentation, the ethanol yield was equivalent to 0.48 g of ethanol produced per gram of SSO biomass by Z. mobilis 8b and 0.50 g per gram of SSO biomass by S. cerevisiae DA2416. This study provides important insights into ethanol production from SSO waste by different strains and becomes a useful tool to facilitate future process optimization for pilot-scale facilities.
Zwart, Jonathan T. L.; Santos, Mario; Jarvis, Matt J.
2015-10-01
Measuring radio source counts is critical for characterizing new extragalactic populations, brings a wealth of science within reach and will inform forecasts for SKA and its pathfinders. Yet there is currently great debate (and few measurements) about the behaviour of the 1.4-GHz counts in the μJy regime. One way to push the counts to these levels is via `stacking', the covariance of a map with a catalogue at higher resolution and (often) a different wavelength. For the first time, we cast stacking in a fully Bayesian framework, applying it to (i) the Square Kilometre Array Design Study (SKADS) simulation and (ii) Very Large Array (VLA) data stacked at the positions of sources from the VISTA Infra-red Deep Extragalactic Observations (VIDEO) survey. In the former case, the algorithm recovers the counts correctly when applied to the catalogue, but is biased high when confusion comes into play. This needs to be accounted for in the analysis of data from any relatively low-resolution Square Kilometre Array (SKA) pathfinders. For the latter case, the observed radio source counts remain flat below the 5-σ level of 85 μJy as far as 40 μJy, then fall off earlier than the flux hinted at by the SKADS simulations and a recent P(D) analysis (which is the only other measurement from the literature at these flux-density levels, itself extrapolated in frequency). Division into galaxy type via spectral-energy distribution reveals that normal spiral galaxies dominate the counts at these fluxes.
Welesameil, Mengstab Tilahun
2012-01-01
Filtration performance of three different non-woven geotextiles (polypropylene and jute wool) on highly concentrated source separated black wastewater influent was evaluated at laboratory scale, aiming to optimize the treatment process when the geotextiles are used as pretreatment.
Tungkasthan, Anucha; Jongsawat, Nipat; Poompuang, Pittaya; Intarasema, Sarayut; Premchaiswadi, Wichian
2010-01-01
This paper presented a practical framework for automating the building of diagnostic BN models from data sources obtained from the WWW and demonstrates the use of a SMILE web-based interface to represent them. The framework consists of the following components: RSS agent, transformation/conversion tool, core reasoning engine, and the SMILE web-based interface. The RSS agent automatically collects and reads the provided RSS feeds according to the agent's predefined URLs. A transformation/conve...
Ichimaru, Satoshi; Hatayama, Masatoshi; Ohchi, Tadayuki; Gullikson, Eric M; Oku, Satoshi
2016-02-10
We describe the design and fabrication of a ruthenium beam separator used to simultaneously attenuate infrared light and reflect soft x rays. Measurements in the infrared and soft x-ray regions showed the beam separator to have a reflectivity of 50%-85% in the wavelength region from 6 to 10 nm at a grazing incidence angle of 7.5 deg, and of 4.3% at 800 nm at the same grazing incidence angle, indicating that the amount of attenuation is 0.05-0.09. These results show that this beam separator could provide an effective means of separating IR light from soft x rays in light generated by high-order harmonic generation sources. PMID:26906363
Pires, Carlos A. L.; Ribeiro, Andreia F. S.
2016-04-01
We develop an expansion of space-distributed time series into statistically independent uncorrelated subspaces (statistical sources) of low dimension, exhibiting enhanced non-Gaussian probability distributions with geometrically simple chosen shapes (projection pursuit rationale). The method relies upon a generalization of principal component analysis, which is optimal for Gaussian mixed signals, and of independent component analysis (ICA), optimized to split non-Gaussian scalar sources. The proposed method, supported by information theory concepts and methods, is independent subspace analysis (ISA), which looks for multi-dimensional, intrinsically synergetic subspaces such as dyads (2D) and triads (3D) that are not separable by ICA. Basically, we optimize rotated variables maximizing certain nonlinear correlations (contrast functions) coming from the non-Gaussianity of the joint distribution. As a by-product, it provides nonlinear variable changes 'unfolding' the subspaces into nearly Gaussian scalars that are easier to post-process. Moreover, the new variables still work as nonlinear data exploratory indices of the non-Gaussian variability of the analysed climatic and geophysical fields. The method (ISA, followed by nonlinear unfolding) is tested on three datasets. The first one comes from the Lorenz'63 three-dimensional chaotic model, showing a clear separation into a non-Gaussian dyad plus an independent scalar. The second one is a mixture of propagating waves of random correlated phases in which the emergence of triadic wave resonances imprints a statistical signature in terms of a non-Gaussian non-separable triad. Finally, the method is applied to the monthly variability of a high-dimensional quasi-geostrophic (QG) atmospheric model of the Northern Hemispheric winter. We find that the quite enhanced non-Gaussian dyads of parabolic shape perform much better than the unrotated variables as regards the separation of the model's four centroid regimes.
Instantaneous and Frequency-Warped Signal Processing Techniques for Auditory Source Separation.
Wang, Avery Li-Chun
This thesis summarizes several contributions to the areas of signal processing and auditory source separation. The philosophy of Frequency-Warped Signal Processing is introduced as a means for separating the AM and FM contributions to the bandwidth of a complex-valued, frequency-varying sinusoid p(n), transforming it into a signal with slowly-varying parameters. This transformation facilitates the removal of p(n) from an additive mixture while minimizing the amount of damage done to other signal components. The average winding rate of a complex-valued phasor is explored as an estimate of the instantaneous frequency. Theorems are provided showing the robustness of this measure. To implement frequency tracking, a Frequency-Locked Loop algorithm is introduced which uses the complex winding error to update its frequency estimate. The input signal is dynamically demodulated and filtered to extract the envelope. This envelope may then be remodulated to reconstruct the target partial, which may be subtracted from the original signal mixture to yield a new, quickly-adapting form of notch filtering. Enhancements to the basic tracker are made which, under certain conditions, attain the Cramér-Rao bound for the instantaneous frequency estimate. To improve tracking, the novel idea of Harmonic-Locked Loop tracking, using N harmonically constrained trackers, is introduced for tracking signals such as voices and certain musical instruments. The estimated fundamental frequency is computed from a maximum-likelihood weighting of the N tracking estimates, making it highly robust. The result is that harmonic signals, such as voices, can be isolated from complex mixtures in the presence of other spectrally overlapping signals. Additionally, since phase information is preserved, the resynthesized harmonic signals may be removed from the original mixtures with relatively little damage to the residual signal. Finally, a new methodology is given for designing linear-phase FIR filters.
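The average-winding-rate estimator of instantaneous frequency can be sketched directly: the mean phase increment between successive samples of a complex phasor, scaled by the sample rate. The sample rate, tone frequency and noise level below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(5)
fs = 8000.0
n = np.arange(1024)
f0 = 440.0
z = np.exp(2j * np.pi * f0 * n / fs)                 # complex phasor
z += 0.02 * (rng.normal(size=n.size) + 1j * rng.normal(size=n.size))

# Average winding rate: the mean phase increment between successive
# samples, scaled by the sample rate, estimates the instantaneous frequency.
dphi = np.angle(z[1:] * np.conj(z[:-1]))
f_est = dphi.mean() * fs / (2 * np.pi)
# f_est is close to 440 Hz despite the additive noise
```

A Frequency-Locked Loop turns this batch average into a recursive update, using each new winding error to correct its running frequency estimate.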
The Doppler Effect based acoustic source separation for a wayside train bearing monitoring system
Zhang, Haibin; Zhang, Shangbin; He, Qingbo; Kong, Fanrang
2016-01-01
Wayside acoustic condition monitoring and fault diagnosis for train bearings depend on the acquired acoustic signals, which consist of mixed signals from different train bearings with obvious Doppler distortion as well as background noise. This study proposes a novel scheme to overcome these difficulties, especially the multi-source problem, in a wayside acoustic diagnosis system. In the method, a time-frequency data fusion (TFDF) strategy is applied to weaken the Heisenberg uncertainty limit and obtain a signal time-frequency distribution (TFD) of high resolution. Due to the Doppler Effect, the signals from different bearings have different time centers even at the same frequency. A Doppler feature matching search (DFMS) algorithm is then put forward to locate the time centers of the different bearings in the TFD spectrogram. With the determined time centers, time-frequency filters (TFF) are designed with thresholds to separate the acoustic signals in the time-frequency domain. Then the inverse STFT (ISTFT) is taken and the signals are recovered and filtered for each sound source. Subsequently, a dynamical resampling method is utilized to remove the Doppler Effect. Finally, accurate diagnosis of train bearing faults can be achieved by applying conventional spectrum analysis techniques to the resampled data. The performance of the proposed method is verified in both simulated and experimental cases, showing that it can effectively detect and diagnose multiple defective bearings even though they produce multi-source acoustic signals.
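The final dynamical-resampling step can be sketched as interpolation of the received waveform back onto a uniform grid in emission time, given a known time-warping function. The warp below is invented for illustration; in the actual system it would be derived from the train's pass-by kinematics:

```python
import numpy as np

fs = 8000.0
t = np.arange(0, 1.0, 1 / fs)         # reception time grid

# Pass-by toy model: the received waveform is the source waveform read out
# at warped (emission) times tau(t), producing Doppler frequency modulation.
f0 = 500.0
tau = t - 0.02 * np.sin(2 * np.pi * 0.5 * t)   # illustrative, monotonic warp
received = np.sin(2 * np.pi * f0 * tau)

# Dynamical resampling: interpolate back onto a uniform grid in emission
# time, which removes the frequency modulation and restores a steady tone.
tau_uniform = np.linspace(tau[0], tau[-1], t.size)
restored = np.interp(tau_uniform, tau, received)
```

After resampling, conventional envelope or spectrum analysis sees a stationary bearing signature instead of a Doppler-smeared one.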
Optimizing source detector separation for an implantable perfusion and oxygenation sensor
Akl, T. J.; King, T. J.; Long, R.; Baba, J. S.; McShane, M. J.; Ericson, M. N.; Wilson, M. A.; Coté, G. L.
2011-03-01
Each year thousands of patients are added to the waiting list for liver transplants. The first 7-10 days after transplant have proven to be the most critical in patient recovery and it is hypothesized that monitoring organ vital signals in this period can increase patient and graft survival rates. An implantable sensor to monitor the organ perfusion and oxygenation signals following surgery is being developed by our group. The sensor operates based on measuring diffuse reflection from three light emitting diodes (735, 805 and 940 nm). In this work the optimal source detector spacing to maximize oxygenation signal level is investigated for a portal vein model. Monte Carlo simulations provided signal levels and corresponding penetration depths as a function of separation between a point optical source and detector. The modeling results indicated a rapid decay in the optical signal with increasing distance. Through further analysis, it was found that there exists an optimal range of point source to detector spacing, between roughly 1 and 2 mm, in which the blood signal from the simulated portal vein was maximized. Overall, these results are being used to guide the placement and configuration of our probe for in vivo animal studies.
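The rapid decay of the detected signal with source-detector separation can be illustrated with a minimal 2-D photon random walk. The optical properties, isotropic scattering, and surface-exit tally below are crude assumptions for illustration, not the group's Monte Carlo tissue model:

```python
import numpy as np

rng = np.random.default_rng(7)

# Minimal 2-D photon random walk in a semi-infinite scattering medium
# (z > 0), tallying where diffusely reflected photons exit the surface.
mu_t = 1.0          # total interaction coefficient, 1/mm (illustrative)
albedo = 0.9        # probability of scattering (vs absorption) per event
n_photons = 20000

exit_x = []
for _ in range(n_photons):
    x, z = 0.0, 0.0
    ux, uz = 0.0, 1.0                        # launched straight down
    while True:
        step = rng.exponential(1.0 / mu_t)   # free path to next event
        x, z = x + ux * step, z + uz * step
        if z < 0.0:                          # photon re-crossed the surface
            exit_x.append(abs(x))
            break
        if rng.uniform() > albedo:           # absorbed in the medium
            break
        theta = rng.uniform(0.0, 2 * np.pi)  # isotropic scattering
        ux, uz = np.cos(theta), np.sin(theta)

# Counts of reflected photons per 1 mm band of source-detector separation:
# the detected signal falls off quickly with increasing separation.
counts, _ = np.histogram(exit_x, bins=[0, 1, 2, 3])
```

Weighing this decay against the penetration depth needed to reach the vessel is what produces an optimal spacing window in the full simulation.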
Sukholthaman, Pitchayanin; Sharp, Alice
2016-06-01
Municipal solid waste has been considered one of the most immediate and serious problems confronting urban governments in most developing and transitional economies. Solid waste service performance depends heavily on the effectiveness of the waste collection and transportation process. Generally, this process involves large expenditures and has very complex and dynamic operational problems. Source separation has a major impact on the effectiveness of a waste management system, as it causes significant changes in the quantity and quality of waste reaching final disposal. To evaluate the impact of effective source separation on waste collection and transportation, this study adopts a decision support tool to comprehend the cause-and-effect interactions of different variables in the waste management system. A system dynamics model that envisages the relationships between source separation and the effectiveness of waste management in Bangkok, Thailand is presented. Influential factors that affect waste separation attitudes are addressed, and the effect of changes in perception of waste separation is explained. The impacts of different separation rates on the effectiveness of the provided collection service are compared in six scenarios. Scenario 5 offers the most promising opportunities, with 40% of residents willing to conduct organic and recyclable waste separation. The results show that better waste collection and transportation service, lower monthly expenses, extended landfill life, and a satisfactory efficiency of the provided service of 60.48% will be achieved at the end of the simulation period. Implications for how to get the public involved in conducting source separation are proposed. PMID:27026497
Residents' Household Solid Waste (HSW) Source Separation Activity: A Case Study of Suzhou, China
Hua Zhang; Zong-Guo Wen
2014-01-01
Though the Suzhou government has provided household solid waste (HSW) source separation since 2000, the program remains largely ineffective. Between January and March 2014, the authors conducted an intercept survey in five different community groups in Suzhou, and 505 valid surveys were completed. Based on the survey, the authors used an ordered probit regression to study residents' HSW source separation activities for both Suzhou and for the five community groups. Results showed that 43% o...
DEFF Research Database (Denmark)
Oh, Geok Lian; Brunskog, Jonas
2014-01-01
Techniques have been studied for the localization of an underground source with seismic interrogation signals. Much of the work has involved fitting either a P-wave acoustic model or a dispersive surface wave model to the received signal and applying time-delay processing and frequency-wavenumber processing to determine the location of the underground tunnel. Considering the case of determining the location of an underground tunnel, this paper proposes two physical models: the acoustic approximation ray tracing model and the finite difference time domain three-dimensional (3D) ...
Non-contact time-resolved diffuse reflectance imaging at null source-detector separation.
Mazurenka, M; Jelzow, A; Wabnitz, H; Contini, D; Spinelli, L; Pifferi, A; Cubeddu, R; Mora, A Dalla; Tosi, A; Zappa, F; Macdonald, R
2012-01-01
We report results of the proof-of-principle tests of a novel non-contact tissue imaging system. The system utilizes a quasi-null source-detector separation approach for time-domain near-infrared spectroscopy, taking advantage of an innovative state-of-the-art fast-gated single photon counting detector. Measurements on phantoms demonstrate the feasibility of the non-contact approach for the detection of optically absorbing perturbations buried up to a few centimeters beneath the surface of a tissue-like turbid medium. The measured depth sensitivity and spatial resolution of the new system are close to the values predicted by Monte Carlo simulations for the inhomogeneous medium and an ideal fast-gated detector, thus proving the feasibility of the non-contact approach for high density diffuse reflectance measurements on tissue. Potential applications of the system are also discussed. PMID:22274351
Time-Domain Convolutive Blind Source Separation Employing Selective-Tap Adaptive Algorithms
Directory of Open Access Journals (Sweden)
Pan Qiongfeng
2007-01-01
We investigate novel algorithms to improve the convergence and reduce the complexity of time-domain convolutive blind source separation (BSS) algorithms. First, we propose the MMax partial-update time-domain convolutive BSS (MMax BSS) algorithm. We demonstrate that the partial-update scheme applied in the MMax LMS algorithm for a single channel can be extended to multichannel time-domain convolutive BSS with little deterioration in performance and a possible computational complexity saving. Next, we propose an exclusive-maximum selective-tap time-domain convolutive BSS algorithm (XM BSS) that reduces the interchannel coherence of the tap-input vectors and improves the conditioning of the autocorrelation matrix, resulting in an improved convergence rate and reduced misalignment. Moreover, the computational complexity is reduced since only half of the tap inputs are selected for updating. Simulation results show a significant improvement in convergence rate compared to existing techniques.
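A minimal single-channel sketch of the MMax partial-update idea, updating only the M taps whose inputs have the largest magnitude each iteration, shown here on plain LMS identification of a short FIR system (the filter length, step size, and system coefficients are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
h_true = np.array([0.5, -0.3, 0.2, 0.1])  # unknown FIR system (assumed)
N, L, M, mu = 4000, 4, 2, 0.05            # samples, filter length, taps updated, step

x = rng.standard_normal(N)                             # white excitation
d = np.convolve(x, h_true)[:N] + 0.01 * rng.standard_normal(N)  # noisy system output

w = np.zeros(L)
for n in range(L, N):
    u = x[n - L + 1:n + 1][::-1]       # tap-input vector [x[n], ..., x[n-L+1]]
    e = d[n] - w @ u                   # a priori error
    idx = np.argsort(np.abs(u))[-M:]   # MMax: the M largest-magnitude inputs
    w[idx] += mu * e * u[idx]          # update only those taps
```

Only M of L taps are touched per step, which is where the complexity saving comes from; the abstract's point is that convergence degrades only slightly.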
Designing of a lead ion model source for plasma separation of spent nuclear fuel
Antonov, N. N.; Vorona, N. A.; Gavrikov, A. V.; Samokhin, A. A.; Smirnov, V. P.
2016-02-01
Plasma sources of model substances are required for solving problems associated with the development of a plasma separation method for spent nuclear fuel (SNF). Lead is chosen as the substance simulating the kinetics and dynamics of the heavy SNF component. We report the results of an analysis of the discharge in lead vapor with a concentration of 10^12-10^13 cm^-3. Ionization is produced by an electron beam (with electron energy up to 500 eV) in the centimeter gap between planar electrodes. The discharge is simulated using the hydrodynamic and one-particle approximations. The current-voltage characteristics and efficiencies of single ionization depending on the vapor concentration and thermoelectron current are obtained. The experimentally determined ion currents on the order of 100 μA for an ionization efficiency on the order of 0.1% are consistent with the simulation results.
Saline sewage treatment and source separation of urine for more sustainable urban water management.
Ekama, G A; Wilsenach, J A; Chen, G H
2011-01-01
While energy consumption and its associated carbon emissions should be minimized in wastewater treatment, this has a much lower priority than human and environmental health, which are both closely related to efficient water quality management. Conservation of surface water quality and quantity is therefore more important for sustainable development than greenhouse gas (GHG) emissions per se. In this paper, two urban water management strategies to conserve fresh water quality and quantity are considered: (1) source separation of urine for improved water quality and (2) saline (e.g., sea) water toilet flushing for reduced fresh water consumption in coastal and mining cities. The former holds promise for simpler and shorter-sludge-age activated sludge wastewater treatment plants (no nitrification and denitrification), nutrient (Mg, K, P) recovery, and improved effluent quality (reduced endocrine disruptor and environmental oestrogen concentrations); the latter for significantly reduced fresh water consumption, sludge production, and oxygen demand (through using anaerobic bioprocesses) and hence energy consumption. Combining source separation of urine and saline water toilet flushing can reduce sewer crown corrosion and effluent P concentrations. Realizing the advantages of these two approaches will require significant changes in urban water management, in that both need dual (fresh and saline) water distribution and (yellow and grey/brown) wastewater collection systems. While considerable work is still required to evaluate these new approaches and quantify their advantages and disadvantages, it would appear that the investment in dual water distribution and wastewater collection systems may be worth making to unlock their benefits for more sustainable urban development. PMID:22214085
Magnetic source separation in the outer core. Introducing the SCOR-field
International Nuclear Information System (INIS)
Complete text of publication follows. We present evidence that the primary source of Earth's axial dipole (AD) is physically distinct from sources responsible for the rest of the geomagnetic field. Support for this claim comes from correlations between the structure of the historic non-axial dipole (NAD) field and transitional paleomagnetic behavior recorded in lavas during the early Brunhes Chron. 40Ar/39Ar age determinations of lavas from West Eifel, Germany, indicate the recording of five excursions spanning ∼200 kyr, including the Big Lost Event (∼580 ka). Transitional lavas from Tahiti also record the Big Lost as well as the Matuyama-Brunhes reversal. Virtual geomagnetic poles (VGPs) recorded at West Eifel are spread across Eurasia, while those recorded on Tahiti during the two events are associated with the same tightly clustered location west of Australia - the site of the most intense NAD flux feature since direct field measurements started some 400 years ago. The differing locations and amounts of spread of transitional VGPs match - at both sites - virtual poles determined for the historic NAD-field. We contend that (1) the field generated by deep convective columns near the tangent cylinder is the primary source for the AD; and (2) the field arising from flux concentrations held and controlled by lower mantle conditions is the primary source for the NAD. Since there most certainly is a small contribution to the AD term (g10) associated with mantle-held sources, we define this field as the Shallow-Core-Generated (SCOR) field. Paleomagnetic data from Tahiti and Australia strongly suggest that the Australasian flux feature is long-lived, regionally dominating the field when the strength of the main AD had significantly weakened or vanished. We argue that recurrence of transitional VGPs observed over geologic time indicates that (1) the entire field does not reverse as a single unit, and (2) field sources exist in the core that are sufficiently separated
Directory of Open Access Journals (Sweden)
Hongyun Han
2016-07-01
This paper examines how and to what degree government policies of garbage fees and voluntary source separation programs, with free indoor containers and garbage bags, can affect the effectiveness of municipal solid waste (MSW) management, in the sense of achieving a desirable reduction of per capita MSW generation. Based on city-level panel data for the years 1998-2012 in China, our empirical analysis indicates that per capita MSW generation increases with per capita disposable income, average household size, household education levels, and lagged per capita MSW. While garbage fees and source separation programs have each led to reductions in per capita waste generation, the interaction of the two policies has resulted in an increase in per capita waste generation due to the following crowding-out effects: first, the positive effect of income dominates the negative effect of the garbage fee; second, the mandatory charging system and the subsidized voluntary source separation crowd each other out with respect to per capita MSW generation; third, small subsidies and tax punishments have reduced the intrinsic motivation for voluntary source separation of MSW. Thus, a compatible fee-charging system, higher levels of subsidies, and well-designed public information and education campaigns are required to promote household waste source separation and reduction.
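The crowding-out finding hinges on the sign of an interaction term in a regression of per capita waste on policy indicators; a simulated OLS sketch (all coefficients, the sample size, and the noise level are invented) shows how such an interaction effect is estimated:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 2000
income = rng.normal(0.0, 1.0, n)                 # per capita disposable income (standardized)
fee = rng.integers(0, 2, n).astype(float)        # garbage-fee policy indicator
sep = rng.integers(0, 2, n).astype(float)        # source-separation program indicator

# Each policy alone reduces waste (negative betas); their interaction increases it.
beta = np.array([1.0, 0.5, -0.3, -0.4, 0.6])     # [const, income, fee, sep, fee*sep]
X = np.column_stack([np.ones(n), income, fee, sep, fee * sep])
y = X @ beta + 0.1 * rng.normal(0.0, 1.0, n)     # simulated per capita MSW

est, *_ = np.linalg.lstsq(X, y, rcond=None)      # OLS estimates
```

A positive estimated interaction coefficient alongside negative main effects is exactly the pattern the abstract describes as crowding out.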
System identification through nonstationary data using Time-Frequency Blind Source Separation
Guo, Yanlin; Kareem, Ahsan
2016-06-01
Classical output-only system identification (SI) methods are based on the assumption of stationarity of the system response. However, the measured response of buildings and bridges is usually non-stationary due to strong winds (e.g., typhoons and thunderstorms), earthquakes, and time-varying vehicle motions. Accordingly, the response data may have time-varying frequency content and/or overlapping modal frequencies due to non-stationary colored excitation. This renders traditional methods problematic for modal separation and identification. To address these challenges, a new SI technique based on Time-Frequency Blind Source Separation (TFBSS) is proposed. By selectively utilizing "effective" information in local regions of the time-frequency plane, where only one mode contributes to energy, the proposed technique can successfully identify mode shapes and recover modal responses from the non-stationary response where traditional SI methods often encounter difficulties. This technique can also handle responses with closely spaced modes, which is a well-known challenge for the identification of large-scale structures. Based on the separated modal responses, frequency and damping can easily be identified using SI methods based on a single-degree-of-freedom (SDOF) system. In addition to the exclusive advantage of handling non-stationary data and closely spaced modes, the proposed technique also benefits from the absence of end effects and low sensitivity to noise in modal separation. The efficacy of the proposed technique is demonstrated in several simulation-based studies and compared to the popular Second-Order Blind Identification (SOBI) scheme. It is also noted that even some non-stationary response data can be analyzed by the stationary method SOBI. This paper also delineates non-stationary cases where SOBI and the proposed scheme perform comparably and highlights cases where the proposed approach is more advantageous. Finally, the performance of the
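The second-order machinery that SOBI-type methods rely on can be sketched in a few lines: whiten the mixtures, then diagonalize a symmetrized time-lagged covariance. This is an AMUSE-style single-lag simplification with invented signals, not the full multi-lag SOBI or the paper's TFBSS:

```python
import numpy as np

rng = np.random.default_rng(2)
t = np.arange(5000) / 1000.0
# Two synthetic "modal" sources with distinct lagged autocorrelations.
S = np.vstack([np.sin(2 * np.pi * 5 * t),
               np.sign(np.sin(2 * np.pi * 13 * t))])
A = np.array([[1.0, 0.6], [0.5, 1.0]])  # assumed mixing (mode-shape) matrix
X = A @ S                               # observed mixtures

# Whitening: make the zero-lag covariance the identity.
X = X - X.mean(axis=1, keepdims=True)
C0 = X @ X.T / X.shape[1]
lam, E = np.linalg.eigh(C0)
W = E @ np.diag(lam ** -0.5) @ E.T
Z = W @ X

# Diagonalize a symmetrized lagged covariance; its eigenvectors give the rotation.
lag = 3
C1 = Z[:, :-lag] @ Z[:, lag:].T / (Z.shape[1] - lag)
C1 = (C1 + C1.T) / 2
_, V = np.linalg.eigh(C1)
Y = V.T @ Z  # recovered sources, up to sign, scale, and permutation
```

Full SOBI jointly diagonalizes many lags for robustness; TFBSS instead picks time-frequency regions dominated by a single mode.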
Jaatinen, Sanna; Kivistö, Anniina; Palmroth, Marja R T; Karp, Matti
2016-09-01
The objective was to demonstrate that a microbial whole-cell biosensor, the bioluminescent yeast Saccharomyces cerevisiae (BMAEREluc/ERα), can be applied to detect overall estrogenic activity in fresh and stored human urine. The use of source-separated urine in agriculture removes a human-originated estrogen source from wastewater influents, subsequently enabling nutrient recycling. Estrogenic activity in urine should be diminished prior to urine usage in agriculture in order to prevent its migration to soil. A storage period of 6 months is required for hygienic reasons; therefore, estrogenic activity monitoring is of interest. The method measured cumulative female hormone-like activity. Calibration curves were prepared for estrone, 17β-estradiol, 17α-ethinylestradiol and estriol. Estrogen concentrations of 0.29-29,640 μg L(-1) were detectable, while the limit of detection corresponded to 0.28-35 μg L(-1) of estrogens. The yeast sensor responded well to fresh and stored urine and gave high signals corresponding to 0.38-3,804 μg L(-1) of estrogens in different urine samples. Estrogenic activity decreased during storage, but was still higher than in fresh urine, implying insufficient storage length. The biosensor was suitable for monitoring hormonal activity in urine and can be used in screening anthropogenic estrogen-like compounds interacting with the receptor. PMID:26804108
The source regions of solar energetic particles detected by widely separated spacecraft
Energy Technology Data Exchange (ETDEWEB)
Park, Jinhye; Moon, Y.-J. [School of Space Research, Kyung Hee University, Yongin 446-701 (Korea, Republic of); Innes, D. E.; Bucik, R., E-mail: jinhye@khu.ac.kr [Max Planck Institute for Solar System Research, D-37191 Katlenburg-Lindau (Germany)
2013-12-20
We studied the source regions of 12 solar energetic particle (SEP) events seen between 2010 August and 2012 January at STEREO-A, B, and/or Earth (Advanced Composition Explorer/Solar and Heliospheric Observatory/GOES), when the two STEREO spacecraft were separated by about 180°. All events were associated with flares (C1 to X6) and fast coronal mass ejections and, except for one, accompanied by type II radio bursts. We have determined the arrival times of the SEPs at the three positions. Extreme ultraviolet (EUV) waves, observed in the 195 Å and 193 Å channels of STEREO and the Solar Dynamics Observatory, are tracked across the Sun to determine their arrival time at the photospheric source of open field lines connecting to the spacecraft. There is a good correlation between the EUV wave arrival times at the connecting footpoints and the SEP onset times. The delay time between electron onset and the EUV wave reaching the connecting footpoint is independent of distance from the flare site. The proton delay time increases with distance from the flare site. In three of the events, secondary flare sites may have also contributed to the wide longitudinal spread of SEPs.
Detection of sudden structural damage using blind source separation and time-frequency approaches
Morovati, V.; Kazemi, M. T.
2016-05-01
Seismic signal processing is one of the most reliable methods of detecting structural damage during earthquakes. In this paper, the use of a hybrid method of blind source separation (BSS) and time-frequency analysis (TFA) is explored to detect changes in structural response data. The combination of BSS and TFA is applied to seismic signals due to their non-stationary nature. First, the second-order blind identification technique is used to decompose the structural vibration response signal into modal coordinate signals, which are mono-components for TFA. Then each mono-component signal is analyzed to extract the instantaneous frequency of the structure. Numerical simulations and a real-world seismic-excited structure with time-varying frequencies show the accuracy and robustness of the developed algorithm. TFA of the extracted sources shows that the method can be successfully applied to structural damage detection. The results also demonstrate that the combined method can identify the time instant of structural damage occurrence more sharply and effectively than TFA alone.
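Once a mono-component modal signal is extracted, its instantaneous frequency can be obtained from the analytic signal; a sketch on a synthetic component whose frequency drops abruptly at a simulated damage instant (all frequencies and the change point are invented):

```python
import numpy as np
from scipy.signal import hilbert

fs = 1000
t = np.arange(0, 2, 1 / fs)
# Modal frequency drops from 10 Hz to 8 Hz at t = 1 s, mimicking sudden damage.
f_true = np.where(t < 1.0, 10.0, 8.0)
phase = 2 * np.pi * np.cumsum(f_true) / fs
x = np.sin(phase)

analytic = hilbert(x)  # analytic signal of the mono-component
inst_freq = np.diff(np.unwrap(np.angle(analytic))) * fs / (2 * np.pi)
```

The abrupt step in `inst_freq` marks the damage instant; in the paper this analysis is applied to each source separated by second-order blind identification.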
Bayesian artificial intelligence
Korb, Kevin B
2010-01-01
Updated and expanded, Bayesian Artificial Intelligence, Second Edition provides a practical and accessible introduction to the main concepts, foundation, and applications of Bayesian networks. It focuses on both the causal discovery of networks and Bayesian inference procedures. Adopting a causal interpretation of Bayesian networks, the authors discuss the use of Bayesian networks for causal modeling. They also draw on their own applied research to illustrate various applications of the technology. New to the Second Edition: new chapter on Bayesian network classifiers; new section on object-oriente
DEFF Research Database (Denmark)
Jensen, Finn Verner; Nielsen, Thomas Dyhre
2016-01-01
Mathematically, a Bayesian graphical model is a compact representation of the joint probability distribution for a set of variables. The most frequently used type of Bayesian graphical model is the Bayesian network. The structural part of a Bayesian graphical model is a graph consisting of nodes and edges. ... Their popularity is largely due to the availability of efficient inference algorithms for answering probabilistic queries about the states of the variables in the network. Furthermore, to support the construction of Bayesian network models, learning algorithms are also available. We give an overview of the Bayesian network...
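The probabilistic queries such networks answer can be computed by brute-force enumeration in small models; a toy rain-sprinkler-grass network with made-up conditional probability table (CPT) values:

```python
# Toy Bayesian network: Rain -> Sprinkler, and (Sprinkler, Rain) -> WetGrass.
# All CPT values are invented for illustration.
P_RAIN = {True: 0.2, False: 0.8}
P_SPRINKLER = {True: 0.01, False: 0.4}            # P(Sprinkler=true | Rain)
P_WET = {(True, True): 0.99, (True, False): 0.9,  # P(Wet=true | Sprinkler, Rain)
         (False, True): 0.8, (False, False): 0.0}

def posterior_rain_given_wet():
    """P(Rain=true | WetGrass=true) by summing out the full joint."""
    num = den = 0.0
    for rain in (True, False):
        for sprinkler in (True, False):
            p = P_RAIN[rain]
            p *= P_SPRINKLER[rain] if sprinkler else 1.0 - P_SPRINKLER[rain]
            p *= P_WET[(sprinkler, rain)]
            den += p
            if rain:
                num += p
    return num / den
```

Enumeration is exponential in the number of variables; the "efficient inference algorithms" the abstract refers to (e.g., junction-tree methods) exploit the graph structure to avoid this blow-up.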
Bayesian Stable Isotope Mixing Models
Parnell, Andrew C.; Phillips, Donald L.; Bearhop, Stuart; Semmens, Brice X.; Ward, Eric J.; Moore, Jonathan W.; Andrew L Jackson; Inger, Richard
2012-01-01
In this paper we review recent advances in Stable Isotope Mixing Models (SIMMs) and place them into an over-arching Bayesian statistical framework which allows for several useful extensions. SIMMs are used to quantify the proportional contributions of various sources to a mixture. The most widely used application is quantifying the diet of organisms based on the food sources they have been observed to consume. At the centre of the multivariate statistical model we propose is a compositional m...
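The core of a Bayesian SIMM, a prior over source proportions combined with a likelihood on the observed mixture signature, can be sketched with simple importance sampling; the two source δ-values, the observed mixture value, and its uncertainty are all invented:

```python
import numpy as np

rng = np.random.default_rng(3)
mu_src = np.array([-25.0, -15.0])  # isotope signatures of two food sources (assumed)
obs, sd = -17.0, 0.5               # observed mixture signature and its uncertainty

# Dirichlet(1, 1) prior over the two source proportions.
P = rng.dirichlet([1.0, 1.0], size=20000)
pred = P @ mu_src                              # predicted mixture signature
w = np.exp(-0.5 * ((obs - pred) / sd) ** 2)    # Gaussian likelihood weights
w /= w.sum()
post_mean = w @ P                              # posterior mean proportions
```

With the numbers above, the mixture value -17 sits 20% of the way from -15 to -25, so the posterior mean proportion of the first source concentrates near 0.2; real SIMMs add multiple isotopes, trophic fractionation, and hierarchical structure via MCMC.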
Gelman, Andrew; Stern, Hal S; Dunson, David B; Vehtari, Aki; Rubin, Donald B
2013-01-01
FUNDAMENTALS OF BAYESIAN INFERENCEProbability and InferenceSingle-Parameter Models Introduction to Multiparameter Models Asymptotics and Connections to Non-Bayesian ApproachesHierarchical ModelsFUNDAMENTALS OF BAYESIAN DATA ANALYSISModel Checking Evaluating, Comparing, and Expanding ModelsModeling Accounting for Data Collection Decision AnalysisADVANCED COMPUTATION Introduction to Bayesian Computation Basics of Markov Chain Simulation Computationally Efficient Markov Chain Simulation Modal and Distributional ApproximationsREGRESSION MODELS Introduction to Regression Models Hierarchical Linear
Yuan, Ying; MacKinnon, David P.
2009-01-01
This article proposes Bayesian analysis of mediation effects. Compared to conventional frequentist mediation analysis, the Bayesian approach has several advantages. First, it allows researchers to incorporate prior information into the mediation analysis, thus potentially improving the efficiency of estimates. Second, under the Bayesian mediation analysis, inference is straightforward and exact, which makes it appealing for studies with small samples. Third, the Bayesian approach is conceptua...
The distressed brain: a group blind source separation analysis on tinnitus.
Directory of Open Access Journals (Sweden)
Dirk De Ridder
BACKGROUND: Tinnitus, the perception of a sound without an external sound source, can lead to variable amounts of distress. METHODOLOGY: In a group of tinnitus patients with variable amounts of tinnitus-related distress, as measured by the Tinnitus Questionnaire (TQ), an electroencephalogram (EEG) is recorded, evaluating the patients' resting-state electrical brain activity. This resting-state electrical activity is compared with a control group and between patients with low (N = 30) and high distress (N = 25). The groups are homogeneous for tinnitus type, tinnitus duration and tinnitus laterality. A group blind source separation (BSS) analysis is performed using a large normative sample (N = 84), generating seven normative components to which high- and low-distress tinnitus patients are compared. A correlation analysis of the obtained normative components' relative power and distress is performed. Furthermore, the functional connectivity, as reflected by lagged phase synchronization, is analyzed between the brain areas defined by the components. Finally, a group BSS analysis on the tinnitus group as a whole is performed. CONCLUSIONS: Tinnitus can be characterized by at least four BSS components, two of which are based on the posterior cingulate, one on the subgenual anterior cingulate and one on the parahippocampus. Only the subgenual component correlates with distress. When performed on a normative sample, group BSS reveals that distress is characterized by two anterior cingulate based components. Spectral analysis of these components demonstrates that distress in tinnitus is related to alpha and beta changes in a network consisting of the subgenual anterior cingulate cortex extending to the pregenual and dorsal anterior cingulate cortex as well as the ventromedial prefrontal cortex/orbitofrontal cortex, insula, and parahippocampus. This network overlaps partially with brain areas implicated in distress in patients suffering from pain, functional somatic
Bayesian Games with Intentions
Bjorndahl, Adam; Halpern, Joseph Y.; Pass, Rafael
2016-01-01
We show that standard Bayesian games cannot represent the full spectrum of belief-dependent preferences. However, by introducing a fundamental distinction between intended and actual strategies, we remove this limitation. We define Bayesian games with intentions, generalizing both Bayesian games and psychological games, and prove that Nash equilibria in psychological games correspond to a special class of equilibria as defined in our setting.
Powerline noise elimination in biomedical signals via blind source separation and wavelet analysis
Directory of Open Access Journals (Sweden)
Samuel Akwei-Sekyere
2015-07-01
The distortion of biomedical signals by powerline noise from recording biomedical devices has the potential to reduce the quality and convolute the interpretation of the data. Usually, powerline noise in biomedical recordings is suppressed via band-stop filters. However, due to the instability of biomedical signals, the distribution of the signals filtered out may not be centered at 50/60 Hz. As a result, self-correction methods are needed to optimize the performance of these filters. Since powerline noise is additive in nature, it is intuitive to model the powerline noise in a raw recording and subtract it from the raw data in order to obtain a relatively clean signal. This paper proposes a method that utilizes this approach by decomposing the recorded signal and extracting powerline noise via blind source separation and wavelet analysis. The performance of this algorithm was compared with that of a 4th-order band-stop Butterworth filter, empirical mode decomposition, independent component analysis, and a combination of empirical mode decomposition with independent component analysis. The proposed method was able to expel sinusoidal signals within the powerline noise frequency range with higher fidelity than the mentioned techniques, especially at low signal-to-noise ratios.
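Because powerline hum is additive, the model-and-subtract idea can be illustrated in its simplest form: a least-squares fit of a sine/cosine pair at the nominal mains frequency, then subtraction. The paper's BSS/wavelet pipeline targets the harder case of unstable, drifting hum; all frequencies and amplitudes below are invented:

```python
import numpy as np

fs = 500.0
t = np.arange(0, 2, 1 / fs)
clean = np.sin(2 * np.pi * 7 * t)                       # stand-in biomedical rhythm
noisy = clean + 0.8 * np.sin(2 * np.pi * 50 * t + 0.3)  # additive 50 Hz hum

# Model the hum as a * sin(2*pi*50*t) + b * cos(2*pi*50*t) and fit by least squares.
H = np.column_stack([np.sin(2 * np.pi * 50 * t), np.cos(2 * np.pi * 50 * t)])
coef, *_ = np.linalg.lstsq(H, noisy, rcond=None)
denoised = noisy - H @ coef                             # subtract the fitted hum
```

Unlike a band-stop filter, nothing is removed at neighboring frequencies; when the hum frequency wanders off-nominal, model-based approaches like the paper's BSS/wavelet method take over.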
Musafere, F.; Sadhu, A.; Liu, K.
2016-01-01
In the last few decades, structural health monitoring (SHM) has been an indispensable subject in the field of vibration engineering. With the aid of modern sensing technology, SHM has garnered significant attention towards diagnosis and risk management of large-scale civil structures and mechanical systems. In SHM, system identification is one of major building blocks through which unknown system parameters are extracted from vibration data of the structures. Such system information is then utilized to detect the damage instant, and its severity to rehabilitate and prolong the existing health of the structures. In recent years, blind source separation (BSS) algorithm has become one of the newly emerging advanced signal processing techniques for output-only system identification of civil structures. In this paper, a novel damage detection technique is proposed by integrating BSS with the time-varying auto-regressive modeling to identify the instant and severity of damage. The proposed method is validated using a suite of numerical studies and experimental models followed by a full-scale structure.
Insects associated with the composting process of solid urban waste separated at the source
Directory of Open Access Journals (Sweden)
Gladis Estela Morales
2010-01-01
Sarcosaprophagous macroinvertebrates (earthworms, termites and a number of Diptera larvae) enhance changes in the physical and chemical properties of organic matter during the degradation and stabilization processes of composting, causing a decrease in the molecular weights of compounds. This activity makes these organisms excellent recyclers of organic matter. This article evaluates the succession of insects associated with the decomposition of solid urban waste separated at the source. The study was carried out in the city of Medellin, Colombia. A total of 11,732 individuals were determined, belonging to the classes Insecta and Arachnida. Species of three orders of Insecta were identified: Diptera, Coleoptera and Hymenoptera. Diptera, corresponding to 98.5% of the total, was the most abundant and diverse group, with 16 families (Calliphoridae, Drosophilidae, Psychodidae, Fanniidae, Muscidae, Milichiidae, Ulidiidae, Scatopsidae, Sepsidae, Sphaeroceridae, Heleomyzidae, Stratiomyidae, Syrphidae, Phoridae, Tephritidae and Curtonotidae), followed by Coleoptera with five families (Carabidae, Staphylinidae, Ptiliidae, Hydrophilidae and Phalacridae). Three stages were observed during the composting process, allowing species associated with each stage to be identified. Other species were also present throughout the whole process. In terms of number of species, Diptera was the most important group observed, particularly Ornidia obesa, considered a highly invasive species, and Hermetia illucens, both reported as beneficial for the decomposition of organic matter.
Sun, Y; Finlayson-Pitts, B J; Xin, J
2011-01-01
Differential optical absorption spectroscopy (DOAS) is a powerful tool for detecting and quantifying trace gases in atmospheric chemistry (Platt and Stutz, 2008). DOAS spectra consist of a linear combination of complex multi-peak, multi-scale structures. Most DOAS analysis routines in use today are based on least-squares techniques; for example, the approach developed in the 1970s uses polynomial fits to remove a slowly varying background, and known reference spectra to retrieve the identity and concentrations of reference gases. An open problem is to identify unknown gases in the fitting residuals for complex atmospheric mixtures. In this work, we develop a novel three-step semi-blind source separation method. The first step uses a multi-resolution analysis to remove the slowly varying and fast-varying components in the DOAS spectral data matrix X. The second step decomposes the preprocessed data X̂ from the first step into a linear combination of the reference spectra plus a remainder, or X̂ = AS + ...
High acceptance of urine source separation in seven European countries: a review.
Lienert, Judit; Larsen, Tove A
2010-01-15
Urine source separation (NoMix-technology) is a promising innovation aiming at a resource-oriented, decentralized approach in urban water management. However, NoMix-technology has a sensitive end-point: people's bathrooms. NoMix-technology is increasingly applied in European pilot projects, but the success from a user point-of-view has rarely been systematically monitored. We aim at closing this gap. We review surveys on acceptance, including reuse of human urine as fertilizer, from 38 NoMix-projects in 7 Northern and Central European countries with 2700 respondents. Additionally, we identify explanatory variables with logistic regression of a representative Swiss library survey. NoMix-technology is well accepted; around 80% of users liked the idea, 75-85% were satisfied with design, hygiene, smell, and seating comfort of NoMix-toilets, 85% regarded urine-fertilizers as a good idea (50% of farmers), and 70% would purchase such food. However, 60% of users encountered problems; NoMix-toilets need further development. We found few differences among countries, but systematic differences between public and private settings, where people seem more critical. Information was positively correlated with acceptance, as were, e.g., a good mood or environmentally friendly behavior. For the future success of NoMix-projects, we recommend authorities follow an integral strategy. Lay people will then find the NoMix-concept appealing and support this promising bathroom innovation. PMID:20000706
Feature Selection and Blind Source Separation in an EEG-Based Brain-Computer Interface
Directory of Open Access Journals (Sweden)
Michael H. Thaut
2005-11-01
Full Text Available Most EEG-based BCI systems make use of well-studied patterns of brain activity. However, those systems involve tasks that indirectly map to simple binary commands such as "yes" or "no" or require many weeks of biofeedback training. We hypothesized that signal processing and machine learning methods can be used to discriminate EEG in a direct "yes"/"no" BCI from a single session. Blind source separation (BSS) and spectral transformations of the EEG produced a 180-dimensional feature space. We used a modified genetic algorithm (GA) wrapped around a support vector machine (SVM) classifier to search the space of feature subsets. The GA-based search found feature subsets that outperform full feature sets and random feature subsets. Also, BSS transformations of the EEG outperformed the original time series, particularly in conjunction with a subset search of both spaces. The results suggest that BSS and feature selection can be used to improve the performance of even a "direct," single-session BCI.
Directory of Open Access Journals (Sweden)
Farkhan Rosi
2013-09-01
Full Text Available In underwater communication systems, the signal received by a sensor is often the result of the source signal mixing with other acoustic signals in the underwater environment, so the acquired signal no longer matches the desired one. The Blind Source Separation (BSS) technique is used here to separate these mixed signals. In this final project, acoustic signals are separated using Natural Gradient ICA based on a Generalized Gaussian Model derived from the non-Gaussian distribution characteristics of the acoustic sources, namely ship-radiated noise and sea ambient noise. The acoustic signal separation was performed three ways: by simulation, with the ICALABS V3 toolbox, and on real measurement data. The simulation results show that separation with the Natural Gradient ICA algorithm based on the Generalized Gaussian Model performs well, as indicated by SIR values of 48.9946 dB for shrimp.wav and 46.9309 dB for ferry.wav, and average MSE values of 1.2605 x 10-5 for shrimp.wav and 2.0272 x 10-5 for ferry.wav.
DEFF Research Database (Denmark)
Hansen, Trine Lund; Svärd, Å; Angelidaki, Irini;
2003-01-01
A research project has investigated the biogas potential of pre-screened source-separated organic waste. Wastes from five Danish cities have been pre-treated by three methods: screw press; disc screen; and shredder and magnet. This paper outlines the sampling procedure used, the chemical composition of the wastes and the estimated methane potentials.
Institute of Scientific and Technical Information of China (English)
Kazi Takpaya; Wei Gang
2003-01-01
Blind identification-blind equalization for Finite Impulse Response (FIR) Multiple Input-Multiple Output (MIMO) channels can be reformulated as the problem of blind source separation. It has been shown that blind identification via the decorrelating sub-channels method could recover the input sources. The Blind Identification via Decorrelating Sub-channels (BIDS) algorithm first constructs a set of decorrelators, which decorrelate the output signals of the subchannels, then estimates the channel matrix using the transfer functions of the decorrelators, and finally recovers the input signal using the estimated channel matrix. In this paper, a new approximation of the input source for FIR-MIMO channels based on the maximum likelihood source separation method is proposed. The proposed method outperforms BIDS in the presence of additive white Gaussian noise.
International Nuclear Information System (INIS)
The data on 151Sm purification from rare earths, actinides and other elements present in the fraction which is a waste of fission products processing with the use of extraction-chromatography separation technique (hdehp is extracting agent, teflon is the carrier), are presented. Results of γ-spectrometric analysis of purified preparations are considered, as well as some problems of preparing 151Sm-based Moessbauer sources, namely, the choice of the chemical form of active material and source design
Deep Transform: Cocktail Party Source Separation via Probabilistic Re-Synthesis
Simpson, Andrew J. R.
2015-01-01
In cocktail party listening scenarios, the human brain is able to separate competing speech signals. However, the signal processing implemented by the brain to perform cocktail party listening is not well understood. Here, we trained two separate convolutive autoencoder deep neural networks (DNN) to separate monaural and binaural mixtures of two concurrent speech streams. We then used these DNNs as convolutive deep transform (CDT) devices to perform probabilistic re-synthesis. The CDTs operat...
Abban, B. K.; Papanicolaou, T.; Wilson, C. G.; Abaci, O.; Wacha, K.; Schnoebelen, D. J.; Rhoads, B. L.; Yu, M.
2015-12-01
We apply an enhanced revision of a Bayesian un-mixing framework for estimating sediment source contributions to an intensively managed watershed in the US Midwest that is characterized by spatiotemporal heterogeneity. The framework has two key parameters, namely a and b, that account for spatial origin attributes and the time history of sediments delivered to the watershed outlet, respectively. The probabilistic treatment of these parameters incorporates variability in source erosion processes, as well as the delivery times and storage of eroded material within the watershed. Uncertainty in source contribution estimates in our approach is quantified naturally through the use of Markov Chain Monte Carlo simulations for estimating the posterior probability density functions. The studied watershed is the 26 km² South Amana Sub-Watershed located within the Clear Creek Watershed (CCW), IA, which is part of the Critical Zone Observatory for Intensively Managed Landscapes (IML-CZO). Utilizing stable isotopes of C and N, the Bayesian framework predicted trends in mean source contributions and uncertainty that were in agreement with observations from other studies. Terrestrial sources were found to dominate sediment contributions in the earlier stages of the growing season, when land cover was small and the hydrologic forcing was large. Instream sources became more dominant during the latter stages, when land cover was well-established and more extensive. Also, the effects of spatial heterogeneity and sediment travel time and delivery on uncertainty in source contribution estimates were adequately captured. The results clearly showed a reduction in uncertainty when watershed connectivity was greatest and considerable amounts of material were delivered to the collection point at the outlet. Ongoing application of the framework to the Upper Sangamon River Basin (USRB), which is also part of the IML-CZO and has distinct features from CCW, is expected to shed more light on the
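The core idea of Bayesian tracer un-mixing with MCMC can be sketched in a few lines. This is not the paper's two-parameter framework; it is a generic random-walk Metropolis sampler for the mixing fraction of two sources with known tracer signatures, and all numbers (signatures, noise level, sample size) are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical tracer signatures (e.g. delta-13C) of two sediment sources,
# and noisy observations of a mixture with true fraction 0.7 from source A.
mu_a, mu_b, sigma = -28.0, -14.0, 1.0
f_true = 0.7
obs = f_true * mu_a + (1 - f_true) * mu_b + rng.normal(0, sigma, size=50)

def log_post(f):
    if not 0.0 <= f <= 1.0:          # uniform prior on [0, 1]
        return -np.inf
    mu = f * mu_a + (1 - f) * mu_b   # tracer value implied by fraction f
    return -0.5 * np.sum((obs - mu) ** 2) / sigma**2

# Random-walk Metropolis sampling of the posterior over f
samples, f = [], 0.5
lp = log_post(f)
for _ in range(20_000):
    prop = f + rng.normal(0, 0.05)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        f, lp = prop, lp_prop
    samples.append(f)
post = np.array(samples[5_000:])  # discard burn-in
print(post.mean())  # posterior mean close to the true fraction 0.7
```

The spread of `post` directly quantifies the uncertainty in the source contribution, which is the property the abstract emphasizes.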
Directory of Open Access Journals (Sweden)
Saswati Swapna Dash
2014-07-01
Full Text Available This paper presents an overall study of feedback control of a Z-source converter fed separately excited DC motor with a centrifugal pump set. The Z-source converter can be used in both voltage buck and boost modes using an LC impedance network. In this paper the dynamic modeling of the Z-source converter with motor load and centrifugal pump set is carried out with new findings. The compensators for the speed feedback loop are designed using average state-space analysis and a small-signal model of the system. The feedback loop is designed by classical control methods. The experiment is done in the MATLAB environment and the result is verified by simulation.
Saswati Swapna Dash; Byamakesh Nayak; Subrat Kumar
2014-01-01
This paper presents an overall study of Feedback Control of Z-Source Converter Fed Separately excited DC motor with centrifugal Pump Set. Z-source converter can be used for both voltage buck and boost mode using LC impedance network. In this paper the dynamic modeling of Z-source with motor load and centrifugal pump set is carried out with new findings. The compensators for speed feedback loop are designed by taking average state space analysis and small signal model of the system. The feedba...
Directory of Open Access Journals (Sweden)
He Peng Ju
2016-01-01
Full Text Available Nowadays, detecting the fetal ECG from an abdominal signal is a commonly used method, but the fetal ECG signal is affected by the maternal ECG. Current FECG extraction algorithms are mainly aimed at multi-channel signals; they often assume there is only one fetus and do not consider multiple births. This paper proposes a single-channel blind source separation (SCBSS) algorithm based on source-number estimation using multi-algorithm fusion to process a single abdominal signal. The method decomposes the collected single-channel signal into multiple intrinsic mode functions (IMFs) using Empirical Mode Decomposition (EMD), mapping the single channel into multiple channels. Four multi-channel source-number estimation (MCSNE) methods (Bootstrap, Hough, AIC and PCA) were fused with weights to estimate the source number accurately, and the particle swarm optimization (PSO) algorithm was employed to determine the weighting coefficients. According to the source number and the IMFs, a nonnegative matrix was constructed and nonnegative matrix factorization (NMF) was employed to separate the mixed signals. Experiments used a single-channel signal mixed from four synthetic signals and a single-channel ECG mixed from two ECGs to verify the proposed algorithm. Results showed that the proposed algorithm could determine the number of independent signals in a single acquired signal, that the FECG could be extracted from a single-channel observed signal, and that the algorithm can be used to separate the MECG and FECG.
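The source-number estimation step can be illustrated with the PCA-style criterion alone (the paper fuses four estimators with PSO weights; here only an eigenvalue count is shown, and the "IMF channels" are simulated mixtures rather than a real EMD output):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000
t = np.arange(n) / 1000.0

# Two hypothetical underlying sources (stand-ins for maternal/fetal rhythms)
s1 = np.sin(2 * np.pi * 1.2 * t)            # ~72 bpm sinusoid
s2 = np.sign(np.sin(2 * np.pi * 2.3 * t))   # ~138 bpm square wave

# Pretend the EMD step already produced six IMF "channels": each is some
# mixture of the two sources plus a little noise.
A = rng.normal(size=(6, 2))
imfs = A @ np.vstack([s1, s2]) + 0.05 * rng.normal(size=(6, n))

# PCA-style source-number estimation: count eigenvalues of the channel
# covariance that carry a non-negligible share (here >1%) of total variance.
evals = np.linalg.eigvalsh(np.cov(imfs))[::-1]   # descending order
n_sources = int(np.sum(evals / evals.sum() > 0.01))
print(n_sources)  # two dominant components survive the threshold
```

The 1% threshold is an arbitrary choice for this toy example; the fused estimator in the paper replaces such a hand-tuned cutoff.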
Yang, Yang; Li, Xiukun
2016-06-01
Separation of the components of rigid acoustic scattering by underwater objects is essential in obtaining the structural characteristics of such objects. To overcome the problem of rigid structures appearing to have the same spectral structure in the time domain, time-frequency Blind Source Separation (BSS) can be used in combination with image morphology to separate the rigid scattering components of different objects. Based on a highlight model, the separation of the rigid scattering structure of objects with time-frequency distribution is deduced. Using a morphological filter, the different characteristics observed in a Wigner-Ville Distribution (WVD) for single auto-terms and cross-terms can be exploited to remove cross-term interference. By selecting the time and frequency points of the auto-term signals, the accuracy of BSS can be improved. An experimental simulation has been used, with changes in the pulse width of the transmitted signal, the relative amplitude and the time delay parameter, in order to analyze the feasibility of this new method. Simulation results show that the new method is not only able to separate rigid scattering components, but can also separate the components when elastic scattering and rigid scattering exist at the same time. Experimental results confirm that the new method can be used in separating the rigid scattering structure of underwater objects.
Pathogens and pharmaceuticals in source-separated urine in eThekwini, South Africa.
Bischel, Heather N; Özel Duygan, Birge D; Strande, Linda; McArdell, Christa S; Udert, Kai M; Kohn, Tamar
2015-11-15
In eThekwini, South Africa, the production of agricultural fertilizers from human urine collected from urine-diverting dry toilets is being evaluated at a municipality scale as a way to help finance a decentralized, dry sanitation system. The present study aimed to assess a range of human and environmental health hazards in source-separated urine, which was presumed to be contaminated with feces, by evaluating the presence of human pathogens, pharmaceuticals, and an antibiotic resistance gene. Composite urine samples from households enrolled in a urine collection trial were obtained from urine storage tanks installed in three regions of eThekwini. Polymerase chain reaction (PCR) assays targeted 9 viral and 10 bacterial human pathogens transmitted by the fecal-oral route. The most frequently detected viral pathogens were JC polyomavirus, rotavirus, and human adenovirus in 100%, 34% and 31% of samples, respectively. Aeromonas spp. and Shigella spp. were frequently detected gram negative bacteria, in 94% and 61% of samples, respectively. The gram positive bacterium, Clostridium perfringens, which is known to survive for extended times in urine, was found in 72% of samples. A screening of 41 trace organic compounds in the urine facilitated selection of 12 priority pharmaceuticals for further evaluation. The antibiotics sulfamethoxazole and trimethoprim, which are frequently prescribed as prophylaxis for HIV-positive patients, were detected in 95% and 85% of samples, reaching maximum concentrations of 6800 μg/L and 1280 μg/L, respectively. The antiretroviral drug emtricitabine was also detected in 40% of urine samples. A sulfonamide antibiotic resistance gene (sul1) was detected in 100% of urine samples. By coupling analysis of pathogens and pharmaceuticals in geographically dispersed samples in eThekwini, this study reveals a range of human and environmental health hazards in urine intended for fertilizer production. Collection of urine offers the benefit of
Single channel source separation of radar fuze mixed signal based on phase difference analysis
Institute of Scientific and Technical Information of China (English)
Hang ZHU; Shu-ning ZHANG; Hui-chang ZHAO
2014-01-01
A new method based on phase difference analysis is proposed for separating the single-channel mixed signal of a radar fuze. This method is used to estimate the mixing coefficients of de-noised signals through the cumulants of the mixed signals, solve the candidate data set from the mixing coefficients and the analytical form of the signals, and resolve the problem of vector ambiguity by analyzing the phase differences. The signal separation is realized by exchanging data between the solutions. The waveform similarity coefficients are calculated, and the time-frequency distributions of the separated signals are analyzed. The results show that the proposed method is effective.
Nonnegative Matrix Factor 2-D Deconvolution for Blind Single Channel Source Separation
DEFF Research Database (Denmark)
Schmidt, Mikkel N.; Mørup, Morten
2006-01-01
We present a novel method for blind separation of instruments in polyphonic music based on a non-negative matrix factor 2-D deconvolution algorithm. Using a model which is convolutive in both time and frequency, we factorize a spectrogram representation of the music into components corresponding to individual instruments. Based on this factorization we separate the instruments using spectrogram masking. The proposed algorithm has applications in computational auditory scene analysis, music information retrieval, and automatic music transcription.
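The factorize-then-mask pipeline can be sketched with plain NMF (the paper's 2-D convolutive variant generalizes this to time and frequency shifts). The "spectrogram" below is a synthetic toy with two spectrally disjoint components, not real audio:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy magnitude "spectrogram": two spectrally distinct components
# (stand-ins for two instruments), plus a small positive floor.
F, T, K = 30, 40, 2
W_true = np.zeros((F, K)); W_true[2:6, 0] = 1.0; W_true[15:20, 1] = 1.0
H_true = np.abs(rng.normal(size=(K, T)))
V = W_true @ H_true + 1e-3

# Plain NMF via Euclidean multiplicative updates
W = np.abs(rng.normal(size=(F, K))) + 0.1
H = np.abs(rng.normal(size=(K, T))) + 0.1
for _ in range(300):
    H *= (W.T @ V) / (W.T @ W @ H + 1e-9)
    W *= (V @ H.T) / (W @ H @ H.T + 1e-9)

# Spectrogram masking: soft (Wiener-like) mask for the first component
V_hat = W @ H + 1e-9
mask0 = np.outer(W[:, 0], H[0]) / V_hat
source0 = mask0 * V  # masked spectrogram attributed to component 0

rel_err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
print(rel_err)  # small reconstruction residual on this easy toy problem
```

In a real system `V` would be the magnitude STFT of the mixture, and the masked spectrograms would be inverted back to waveforms with the mixture phase.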
Rubin, Donald B.
1981-01-01
The Bayesian bootstrap is the Bayesian analogue of the bootstrap. Instead of simulating the sampling distribution of a statistic estimating a parameter, the Bayesian bootstrap simulates the posterior distribution of the parameter; operationally and inferentially the methods are quite similar. Because both methods of drawing inferences are based on somewhat peculiar model assumptions and the resulting inferences are generally sensitive to these assumptions, neither method should be applied wit...
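Rubin's scheme is simple to state operationally: instead of resampling the data with replacement, each replicate draws a flat Dirichlet weight vector over the observed values and computes the weighted statistic. A minimal sketch (the data values are invented):

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.array([2.1, 3.4, 1.9, 5.0, 4.2, 2.8])  # observed sample (made-up)

# Bayesian bootstrap: draw Dirichlet(1, ..., 1) weights over the observed
# values and compute the weighted mean; repeating this simulates the
# posterior distribution of the mean rather than its sampling distribution.
draws = 10_000
w = rng.dirichlet(np.ones(len(x)), size=draws)   # (draws, n) weight vectors
posterior_mean = w @ x                            # one weighted mean per draw

print(posterior_mean.mean(), x.mean())  # posterior centered on sample mean
```

The classical bootstrap replaces the Dirichlet weights with multinomial counts divided by n, which is why the two procedures behave so similarly.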
Bayesian statistics an introduction
Lee, Peter M
2012-01-01
Bayesian Statistics is the school of thought that combines prior beliefs with the likelihood of a hypothesis to arrive at posterior beliefs. The first edition of Peter Lee’s book appeared in 1989, but the subject has moved ever onwards, with increasing emphasis on Monte Carlo based techniques. This new fourth edition looks at recent techniques such as variational methods, Bayesian importance sampling, approximate Bayesian computation and Reversible Jump Markov Chain Monte Carlo (RJMCMC), providing a concise account of the way in which the Bayesian approach to statistics develops as wel
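The prior-to-posterior updating described here is easiest to see in the standard conjugate Beta-Binomial example (the counts below are invented for illustration):

```python
# Conjugate Beta-Binomial update: a Beta(a, b) prior on a success
# probability, combined with binomial data, yields a Beta posterior.
a_prior, b_prior = 1.0, 1.0   # flat prior belief
heads, flips = 7, 10          # hypothetical observed data

a_post = a_prior + heads               # add observed successes
b_post = b_prior + (flips - heads)     # add observed failures
post_mean = a_post / (a_post + b_post)

print(post_mean)  # posterior mean 8/12 = 0.666..., pulled toward the data
```

For non-conjugate models no such closed form exists, which is exactly where the Monte Carlo techniques the later editions emphasize (importance sampling, RJMCMC, variational methods) come in.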
Understanding Computational Bayesian Statistics
Bolstad, William M
2011-01-01
A hands-on introduction to computational statistics from a Bayesian point of view Providing a solid grounding in statistics while uniquely covering the topics from a Bayesian perspective, Understanding Computational Bayesian Statistics successfully guides readers through this new, cutting-edge approach. With its hands-on treatment of the topic, the book shows how samples can be drawn from the posterior distribution when the formula giving its shape is all that is known, and how Bayesian inferences can be based on these samples from the posterior. These ideas are illustrated on common statistic
DEFF Research Database (Denmark)
Fernandez Grande, Efren; Jacobsen, Finn
2010-01-01
A method of estimating the sound field radiated by a source under non-anechoic conditions has been examined. The method uses near field acoustic holography based on a combination of pressure and particle velocity measurements in a plane near the source for separating outgoing and ingoing wave components. The outgoing part of the sound field is composed of both radiated and scattered waves. The method compensates for the scattered components of the outgoing field on the basis of the boundary condition of the problem, exploiting the fact that the sound field is reconstructed very close to the source. Thus the radiated free-field component is estimated simultaneously with solving the inverse problem of reconstructing the sound field near the source. The method is particularly suited to cases in which the overall contribution of reflected sound in the measurement plane is significant.
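The principle behind pressure-velocity wave separation is visible already in one dimension: for plane waves, the outgoing and ingoing parts follow directly from pressure and particle velocity measured in the same plane. This is only a 1-D single-frequency illustration of that principle, not the holographic method itself, and the amplitudes are invented:

```python
import numpy as np

rho_c = 413.0  # characteristic impedance of air near 20 C [Pa*s/m]

# An outgoing plane wave of amplitude 1 plus a reflected (ingoing) wave of
# amplitude 0.4, observed at position z as complex amplitudes.
k = 2 * np.pi * 1000 / 343.0   # wavenumber at 1 kHz
z = 0.05
p_plus, p_minus = 1.0, 0.4
p = p_plus * np.exp(-1j * k * z) + p_minus * np.exp(1j * k * z)
u = (p_plus * np.exp(-1j * k * z) - p_minus * np.exp(1j * k * z)) / rho_c

# Pressure-velocity separation of the two propagation directions
p_out = 0.5 * (p + rho_c * u)
p_in = 0.5 * (p - rho_c * u)
print(abs(p_out), abs(p_in))  # recovers the amplitudes 1.0 and 0.4
```

The holographic method generalizes this decomposition to the full wavenumber spectrum of the measured plane, which is what allows it to handle non-planar fields near a real source.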
Tactile length contraction as Bayesian inference.
Tong, Jonathan; Ngo, Vy; Goldreich, Daniel
2016-08-01
To perceive, the brain must interpret stimulus-evoked neural activity. This is challenging: The stochastic nature of the neural response renders its interpretation inherently uncertain. Perception would be optimized if the brain used Bayesian inference to interpret inputs in light of expectations derived from experience. Bayesian inference would improve perception on average but cause illusions when stimuli violate expectation. Intriguingly, tactile, auditory, and visual perception are all prone to length contraction illusions, characterized by the dramatic underestimation of the distance between punctate stimuli delivered in rapid succession; the origin of these illusions has been mysterious. We previously proposed that length contraction illusions occur because the brain interprets punctate stimulus sequences using Bayesian inference with a low-velocity expectation. A novel prediction of our Bayesian observer model is that length contraction should intensify if stimuli are made more difficult to localize. Here we report a tactile psychophysical study that tested this prediction. Twenty humans compared two distances on the forearm: a fixed reference distance defined by two taps with 1-s temporal separation and an adjustable comparison distance defined by two taps with temporal separation t ≤ 1 s. We observed significant length contraction: As t was decreased, participants perceived the two distances as equal only when the comparison distance was made progressively greater than the reference distance. Furthermore, the use of weaker taps significantly enhanced participants' length contraction. These findings confirm the model's predictions, supporting the view that the spatiotemporal percept is a best estimate resulting from a Bayesian inference process. PMID:27121574
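A stripped-down Gaussian version of such an observer already reproduces the qualitative effect: with a zero-mean low-velocity prior, the posterior-mean length shrinks as the temporal separation t decreases, and shrinks more when localization noise is larger (weaker taps). This is a sketch of the idea only, not the published model, and all parameter values are invented:

```python
def perceived_length(l_measured, t, sigma_pos=1.0, sigma_v=10.0):
    """Posterior-mean length under a zero-mean low-velocity prior.

    Likelihood: measured length ~ N(true length, 2*sigma_pos^2),
    from two independent noisy tap localizations.
    Prior: true length = v*t with v ~ N(0, sigma_v^2), so the implied
    prior on length is N(0, (sigma_v * t)^2).
    The product of the two Gaussians gives the shrinkage factor below.
    """
    var_like = 2.0 * sigma_pos**2
    var_prior = (sigma_v * t) ** 2
    return l_measured * var_prior / (var_prior + var_like)

for t in (1.0, 0.5, 0.1):
    print(t, perceived_length(10.0, t))  # shrinks as t decreases
```

Shorter t makes the low-velocity prior imply a tighter prior on length, so the posterior is pulled harder toward zero, which is the length contraction the psychophysics measured.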
2015-01-01
Distributed temperature sensing (DTS) provides important technological support for the earth-rock junctions of dike projects (ERJD), which are the binding sites between culverts, gates and pipes and the dike body and foundation. In this study, a blind source separation model is used for the identification of leakages based on the temperature data of DTS in leakage monitoring of ERJD. First, a denoising method is established based on the temperature monitoring data of distributed optical fiber in...
Bai, Mingsian R; Yao, Yueh Hua; Lai, Chang-Sheng; Lo, Yi-Yang
2016-03-01
In this paper, four delay-and-sum (DAS) beamformers formulated in the modal domain and the space domain for open and solid spherical apertures are examined through numerical simulations. The resulting beampatterns reveal that the mainlobe of the solid spherical DAS array is only slightly narrower than that of the open array, whereas the sidelobes of the modal domain array are more significant than those of the space domain array due to the discrete approximation of continuous spherical Fourier transformation. To verify the theory experimentally, a three-dimensionally printed spherical array on which 32 micro-electro-mechanical system microphones are mounted is utilized for localization and separation of sound sources. To overcome the basis mismatch problem in signal separation, source localization is first carried out using minimum variance distortionless response beamformer. Next, Tikhonov regularization (TIKR) and compressive sensing (CS) are employed to extract the source signal amplitudes. Simulations and experiments are conducted to validate the proposed spherical array system. Objective perceptual evaluation of speech quality test and a subjective listening test are undertaken in performance evaluation. The experimental results demonstrate better separation quality achieved by the CS approach than by the TIKR approach at the cost of computational complexity. PMID:27036243
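The delay-and-sum principle the paper builds on can be sketched for a simple uniform line array (the spherical-aperture formulations add spherical harmonic machinery; the geometry, frequency and source direction below are invented):

```python
import numpy as np

c, f = 343.0, 2000.0
k = 2 * np.pi * f / c
mics = np.linspace(-0.15, 0.15, 8)  # 8-element line array, 30 cm aperture

def steering(theta):
    # Far-field phase delays for direction theta (radians from broadside)
    return np.exp(-1j * k * mics * np.sin(theta))

src_dir = np.deg2rad(20.0)
x = steering(src_dir)  # noise-free narrowband snapshot from a unit source

# Delay-and-sum: phase-align toward each look direction and average;
# the output power peaks when the look direction matches the source.
look = np.deg2rad(np.arange(-90, 91, 1))
power = np.array([np.abs(np.conj(steering(th)) @ x) ** 2 / len(mics) ** 2
                  for th in look])
est = np.rad2deg(look[np.argmax(power)])
print(est)  # beam power peaks at the source direction, 20 degrees
```

MVDR localization and the CS/TIKR amplitude extraction in the paper replace this fixed weighting with data-dependent and sparsity-promoting solutions, respectively.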
Frühwirth-Schnatter, Sylvia
1990-01-01
In the paper at hand we apply fuzzy set theory to Bayesian statistics to obtain "Fuzzy Bayesian Inference". In the subsequent sections we will discuss a fuzzy-valued likelihood function, Bayes' theorem for both fuzzy data and fuzzy priors, a fuzzy Bayes' estimator, fuzzy predictive densities and distributions, and fuzzy H.P.D. regions. (author's abstract)
Yuan, Ying; MacKinnon, David P.
2009-01-01
In this article, we propose Bayesian analysis of mediation effects. Compared with conventional frequentist mediation analysis, the Bayesian approach has several advantages. First, it allows researchers to incorporate prior information into the mediation analysis, thus potentially improving the efficiency of estimates. Second, under the Bayesian…
International Nuclear Information System (INIS)
The use of the non-invasively obtained fetal electrocardiogram (ECG) in fetal monitoring is complicated by the low signal-to-noise ratio (SNR) of ECG signals. Even after removal of the predominant interference (i.e. the maternal ECG), the SNR is generally too low for medical diagnostics, and hence additional signal processing is still required. To this end, several methods for exploiting the spatial correlation of multi-channel fetal ECG recordings from the maternal abdomen have been proposed in the literature, of which principal component analysis (PCA) and independent component analysis (ICA) are the most prominent. Both PCA and ICA, however, suffer from the drawback that they are blind source separation (BSS) techniques and as such suboptimum in that they do not consider a priori knowledge on the abdominal electrode configuration and fetal heart activity. In this paper we propose a source separation technique that is based on the physiology of the fetal heart and on the knowledge of the electrode configuration. This technique operates by calculating the spatial fetal vectorcardiogram (VCG) and approximating the VCG for several overlayed heartbeats by an ellipse. By subsequently projecting the VCG onto the long axis of this ellipse, a source signal of the fetal ECG can be obtained. To evaluate the developed technique, its performance is compared to that of both PCA and ICA and to that of augmented versions of these techniques (aPCA and aICA; PCA and ICA applied on preprocessed signals) in generating a fetal ECG source signal with enhanced SNR that can be used to detect fetal QRS complexes. The evaluation shows that the developed source separation technique performs slightly better than aPCA and aICA and outperforms PCA and ICA and has the main advantage that, with respect to aPCA/PCA and aICA/ICA, it performs more robustly. This advantage renders it favorable for employment in automated, real-time fetal monitoring applications
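The ellipse-projection step can be illustrated with a PCA approximation: the leading eigenvector of the VCG covariance is the major axis of the fitted ellipse, and projecting onto it yields a single source signal. The loop below is synthetic (a rotated noisy ellipse standing in for overlaid fetal heartbeats), and PCA is a simplification of the paper's ellipse fit:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 2-D VCG loop: an ellipse with axes 3 and 1, rotated 30
# degrees, plus measurement noise (a stand-in for overlaid heartbeats).
t = np.linspace(0, 2 * np.pi, 500)
loop = np.vstack([3.0 * np.cos(t), 1.0 * np.sin(t)])
ang = np.deg2rad(30.0)
R = np.array([[np.cos(ang), -np.sin(ang)], [np.sin(ang), np.cos(ang)]])
vcg = R @ loop + 0.05 * rng.normal(size=(2, t.size))

# Major axis via PCA: the eigenvector of the covariance with the largest
# eigenvalue; projecting the centered VCG onto it gives the source signal.
evals, evecs = np.linalg.eigh(np.cov(vcg))
major = evecs[:, -1]
source = major @ (vcg - vcg.mean(axis=1, keepdims=True))

angle_est = np.rad2deg(np.arctan2(major[1], major[0])) % 180
print(angle_est)  # close to the 30-degree major-axis orientation
```

Because the projection direction is fixed by the loop geometry rather than learned blindly, it carries over robustly between recordings, which mirrors the robustness advantage the abstract reports over PCA/ICA.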
Method and apparatus for suppressing electron generation in a vapor source for isotope separation
International Nuclear Information System (INIS)
A system for applying accelerating forces to ionized particles of a vapor in a manner to suppress the flow of electron current from the vapor source. The accelerating forces are applied as an electric field in a configuration orthogonal to a magnetic field. The electric field is applied between one or more anodes in the plasma and one or more cathodes operated as electron emitting surfaces. The circuit for applying the electric field floats the cathodes with respect to the vapor source, thereby removing the vapor source from the circuit of electron flow through the plasma and suppressing the flow of electrons from the vapor source. The potential of other conducting structures contacting the plasma is controlled at or permitted to seek a level which further suppresses the flow of electron currents from the vapor source. Reducing the flow of electrons from the vapor source is particularly useful where the vapor is ionized with isotopic selectivity because it avoids superenergization of the vapor by the electron current
Quednau, Philipp; Trommer, Ralph; Schmidt, Lorenz-Peter
2016-03-01
Wireless transmission systems in smart metering networks share the advantage of lower installation costs due to the expandability of separate infrastructure but suffer from transmission problems. In this paper the issue of interference of wireless transmitted smart meter data with third party systems and data from other meters is investigated and an approach for solving the problem is presented. A multi-channel wireless m-bus receiver was developed to separate the desired data from unwanted interferers by spatial filtering. The according algorithms are presented and the influence of different antenna types on the spatial filtering is investigated. The performance of the spatial filtering is evaluated by extensive measurements in a realistic surrounding with several hundreds of active wireless m-bus transponders. These measurements correspond to the future environment for data-collectors as they took place in rural and urban areas with smart gas meters equipped with wireless m-bus transponders installed in almost all surrounding buildings.
International Nuclear Information System (INIS)
The Danube Delta-Black Sea region of Romania is an important wetland, and this preliminary study evaluates the significance of this region as a source of atmospheric CH4. Measurements of the mixing ratio and δ13C in CH4 are reported from air and water samples collected at eight sites in the Danube Delta. High mixing ratios of CH4 were found in air (2500-14,000 ppb) and dissolved in water samples (∼1-10 μmol L-1), demonstrating that the Danube Delta is an important natural source of CH4. The intercepts on Keeling plots of about -62 per mille show that the main source of CH4 in this region is microbial, probably resulting primarily from acetate fermentation. Atmospheric CH4 and CO data from the NOAA/ESRL (National Oceanic and Atmospheric Administration/Earth System Research Laboratory) were used to make a preliminary estimate of biogenic CH4 at the Black Sea sampling site at Constanta (BSC). These data were used to calculate ratios of CH4/CO in air samples, and using an assumed CH4/CO anthropogenic emissions ratio of 0.6, fossil fuel emissions at BSC were estimated. Biogenic CH4 emissions were then estimated by a simple mass balance approach. Keeling plots of well-mixed air from the BSC site suggested a stronger wetland source in summer and a stronger fossil fuel source in winter
International Nuclear Information System (INIS)
An alkali-metal ion source working without a store of alkali-metals is described. The alkali-metal ions are produced by evaporation of alkali salts and ionization in a low-voltage arc discharge stabilized with a noble gas plasma or in the case of small alkali-metal ion currents on the base of the well known thermic ionization at a hot tungsten wire. The source is very simple in construction and produces a stable ion current of 0.3 μA for more than 100 h. It is possible to change the ion species in a short time. This source is applicable to all SIMS equipments using mass separation for primary ions. (author)
Time-domain beamforming and blind source separation speech input in the car environment
Bourgeois, Julien
2009-01-01
The development of computer and telecommunication technologies led to a revolution in the way that people work and communicate with each other. One of the results is that large amounts of information will increasingly be held in a form that is natural for users, as speech in natural language. In the presented work, we investigate the speech signal capture problem, which includes the separation of multiple interfering speakers using microphone arrays. Adaptive beamforming is a classical approach which has been developed since the seventies. However it requires a double-talk detector (DTD) that interrupts th
Institute of Scientific and Technical Information of China (English)
苏宏升
2008-01-01
To give the conventional Bayesian optimal classifier the ability to handle fuzzy information and to automate its reasoning process, a new Bayesian optimal classifier with embedded fuzzy information is proposed. It can not only handle fuzzy information effectively but also retain the learning properties of the Bayesian optimal classifier. In addition, following the evolution of fuzzy set theory, vague sets are also embedded into it to generate a vague Bayesian optimal classifier, which can simultaneously simulate the twofold characteristics of fuzzy information from the positive and negative directions. Further, a set-pair Bayesian optimal classifier is also proposed that considers the threefold characteristics of fuzzy information from the positive, negative, and indeterminate sides. Finally, a knowledge-based artificial neural network (KBANN) is presented to realize automatic reasoning of the Bayesian optimal classifier. It not only reduces the computational cost of the Bayesian optimal classifier but also improves its classification learning quality.
The Distressed Brain: A Group Blind Source Separation Analysis on Tinnitus
De Ridder, Dirk; Vanneste, Sven; Congedo, Marco
2011-01-01
Background: Tinnitus, the perception of a sound without an external sound source, can lead to variable amounts of distress. Methodology: In a group of tinnitus patients with variable amounts of tinnitus-related distress, as measured by the Tinnitus Questionnaire (TQ), electroencephalography (EEG) is performed, evaluating the patients' resting-state electrical brain activity. This resting-state electrical activity is compared with a control group and between patients with low (N = 30) and h...
Degani, D.
1983-01-01
A numerical study of the conjugated problem of a separated supersonic flow field and a conductive solid wall with an embedded heat source is presented. Implicit finite-difference schemes were used to solve the two-dimensional time-dependent compressible Navier-Stokes equations and the time-dependent heat-conduction equation for the solid in both general coordinate systems. A detailed comparison between the thin-layer and Navier-Stokes models was made for steady and unsteady supersonic flow and showed insignificant differences. Steady-state and transient cases were computed and the results show that a temperature pulse at the solid-fluid interface can be used to detect the flow direction near the wall in the vicinity of separation without significant distortion of the flow field.
Directory of Open Access Journals (Sweden)
Kellermann Walter
2007-01-01
Full Text Available We address the problem of underdetermined blind source separation (BSS). While most previous approaches are designed for instantaneous mixtures, we propose a time-frequency-domain algorithm for convolutive mixtures. We adopt a two-step method based on a general maximum a posteriori (MAP) approach. In the first step, we estimate the mixing matrix based on hierarchical clustering, assuming that the source signals are sufficiently sparse. The algorithm works directly on the complex-valued data in the time-frequency domain and shows better convergence than algorithms based on self-organizing maps. The assumption of Laplacian priors for the source signals in the second step leads to an algorithm for estimating the source signals. It involves the ℓ1-norm minimization of complex numbers because of the use of the time-frequency-domain approach. We compare a combinatorial approach initially designed for real numbers with a second-order cone programming (SOCP) approach designed for complex numbers. We found that although the former approach is not theoretically justified for complex numbers, its results are comparable to, or even better than, the SOCP solution. The advantage is a lower computational cost for problems with low input/output dimensions.
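The combinatorial minimum-ℓ1 estimator compared above can be sketched as follows, assuming the mixing matrix is already known from the clustering step: with M mixtures and N > M sources, each candidate solution keeps only M sources active (one per chosen column of the mixing matrix), solves the resulting square system exactly, and the candidate with the smallest ℓ1 norm is returned. The 2×3 complex mixing matrix and source vector are toy values, not from the paper:

```python
# Hedged sketch of a combinatorial minimum-l1 source estimator for an
# underdetermined complex system (2 mixtures, 3 sources). Toy data only.
from itertools import combinations

def solve2x2(a, b, c, d, y1, y2):
    """Solve the complex 2x2 system [[a, b], [c, d]] @ s = (y1, y2) by Cramer's rule."""
    det = a * d - b * c
    return ((y1 * d - y2 * b) / det, (a * y2 - c * y1) / det)

def min_l1_sources(A, y):
    """A: 2xN complex mixing matrix (list of two rows); y: the two mixtures."""
    n = len(A[0])
    best, best_s = None, None
    for i, j in combinations(range(n), 2):
        s_i, s_j = solve2x2(A[0][i], A[0][j], A[1][i], A[1][j], y[0], y[1])
        s = [0j] * n
        s[i], s[j] = s_i, s_j               # only two sources active
        l1 = sum(abs(v) for v in s)         # l1 norm of complex vector
        if best is None or l1 < best:
            best, best_s = l1, s
    return best_s

A = [[1, 1, 0.5j], [0.5, 1j, 1]]            # toy 2x3 complex mixing matrix
true_s = [2 + 1j, 0, -1j]                    # sparse source vector (one zero entry)
y = [sum(A[r][k] * true_s[k] for k in range(3)) for r in range(2)]
print(min_l1_sources(A, y))                  # recovers the sparse source in this toy case
```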
Directory of Open Access Journals (Sweden)
Gary E Strangman
Full Text Available Understanding the spatial and depth sensitivity of non-invasive near-infrared spectroscopy (NIRS) measurements to brain tissue, i.e., near-infrared neuromonitoring (NIN), is essential for designing experiments as well as interpreting research findings. However, a thorough characterization of such sensitivity in realistic head models has remained unavailable. In this study, we conducted 3,555 Monte Carlo (MC) simulations to densely cover the scalp of a well-characterized, adult male template brain (Colin27). We sought to evaluate: (i) the spatial sensitivity profile of NIRS to brain tissue as a function of source-detector separation, (ii) the NIRS sensitivity to brain tissue as a function of depth in this realistic and complex head model, and (iii) the effect of NIRS instrument sensitivity on detecting brain activation. We found that increasing the source-detector (SD) separation from 20 to 65 mm provides monotonic increases in sensitivity to brain tissue. For every 10 mm increase in SD separation (up to ~45 mm), sensitivity to gray matter increased an additional 4%. Our analyses also demonstrate that sensitivity in depth (S) decreases exponentially, with a "rule-of-thumb" formula S = 0.75 × 0.85^depth. Thus, while the depth sensitivity of NIRS is not strictly limited, NIN signals in adult humans are strongly biased towards the outermost 10-15 mm of intracranial space. These general results, along with the detailed quantitation of sensitivity estimates around the head, can provide detailed guidance for interpreting the likely sources of NIRS signals, as well as help NIRS investigators design and plan better NIRS experiments, head probes and instruments.
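Reading the rule-of-thumb above as S = 0.75 × 0.85^depth, with depth in mm (the exponent form is an assumption about the garbled notation), the strong near-surface bias can be reproduced directly:

```python
# Illustrative evaluation of the quoted depth-sensitivity rule of thumb.
# The exponent form and the depth unit (mm) are assumptions, not verified.
def nirs_sensitivity(depth_mm):
    """Rule-of-thumb NIRS depth sensitivity, S = 0.75 * 0.85**depth."""
    return 0.75 * 0.85 ** depth_mm

# Sensitivity drops roughly fivefold over the first 10 mm of depth:
print(nirs_sensitivity(0) / nirs_sensitivity(10))
```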
Noise-Source Separation Using Internal and Far-Field Sensors for a Full-Scale Turbofan Engine
Hultgren, Lennart S.; Miles, Jeffrey H.
2009-01-01
Noise-source separation techniques for the extraction of the sub-dominant combustion noise from the total noise signatures obtained in static-engine tests are described. Three methods are applied to data from a static, full-scale engine test. Both 1/3-octave and narrow-band results are discussed. The results are used to assess the combustion-noise prediction capability of the Aircraft Noise Prediction Program (ANOPP). A new additional phase-angle-based discriminator for the three-signal method is also introduced.
International Nuclear Information System (INIS)
The variation with time of the induced radioactivity and gamma-ray intensity for the target materials of an on-line isotope separator target-source system irradiated by a proton beam with an energy of 100 MeV and an intensity of 200 μA was calculated with the LCS + CBURN code. This work will provide a reference for the design, exchange and disposal of targets. Tritium was produced after irradiation by proton beams in all target materials; in particular, 131I appears in the lead target material. (authors)
Zeeman, Grietje; Kujawa, Katarzyna; de Mes, Titia; Hernandez, Lucia; de Graaff, Marthe; Abu-Ghunmi, Lina; Mels, Adriaan; Meulman, Brendo; Temmink, Hardy; Buisman, Cees; van Lier, Jules; Lettinga, Gatze
2008-01-01
Based on results of pilot-scale research with source-separated black water (BW) and grey water (GW), a new sanitation concept is proposed. BW and GW are both treated in a UASB (-septic tank) for recovery of CH4 gas. Kitchen waste is added to the anaerobic BW treatment to double the biogas production. Post-treatment of the effluent provides recovery of phosphorus and removal of remaining COD and nitrogen. The total energy saving of the new sanitation concept amounts to 200 MJ/year in comparison with conventional sanitation; moreover, 0.14 kg P/p/year and 90 litres of potentially reusable water are produced. PMID:18469391
DEFF Research Database (Denmark)
Naroznova, Irina; Møller, Jacob; Larsen, Bjarne;
2016-01-01
A new technology for pre-treating source-separated organic household waste prior to anaerobic digestion was assessed, and its performance was compared to existing alternative pre-treatment technologies. This pre-treatment technology is based on waste pulping with water, using a specially developed... ...pulping technology showed higher biodegradable material recovery, lower electricity consumption and comparable water consumption. The higher material recovery achieved with the technology was associated with greater transfer of nutrients (N and P) and carbon (total and biogenic), but also heavy metals (except Pb), to the...
Directory of Open Access Journals (Sweden)
Goto Masataka
2010-01-01
Full Text Available We describe a novel query-by-example (QBE) approach in music information retrieval that allows a user to customize query examples by directly modifying the volume of different instrument parts. The underlying hypothesis of this approach is that the musical mood of retrieved results changes in relation to the volume balance of different instruments. On the basis of this hypothesis, we aim to clarify the relationship between the change in the volume balance of a query and the genre of the retrieved pieces, called genre classification shift. Such an understanding would allow us to instruct users in how to generate alternative queries without finding other appropriate pieces. Our QBE system first separates all instrument parts from the audio signal of a piece with the help of its musical score, and then it allows users to remix these parts to change the acoustic features that represent the musical mood of the piece. Experimental results showed that the genre classification shift was actually caused by the volume changes in the vocal, guitar, and drum parts.
Bayesian exploratory factor analysis
Gabriella Conti; Sylvia Frühwirth-Schnatter; James Heckman; Rémi Piatek
2014-01-01
This paper develops and applies a Bayesian approach to Exploratory Factor Analysis that improves on ad hoc classical approaches. Our framework relies on dedicated factor models and simultaneously determines the number of factors, the allocation of each measurement to a unique factor, and the corresponding factor loadings. Classical identifi cation criteria are applied and integrated into our Bayesian procedure to generate models that are stable and clearly interpretable. A Monte Carlo study c...
Carbonetto, Peter; Kisynski, Jacek; De Freitas, Nando; Poole, David L
2012-01-01
The Bayesian Logic (BLOG) language was recently developed for defining first-order probability models over worlds with unknown numbers of objects. It handles important problems in AI, including data association and population estimation. This paper extends BLOG by adopting generative processes over function spaces - known as nonparametrics in the Bayesian literature. We introduce syntax for reasoning about arbitrary collections of objects, and their properties, in an intuitive manner. By expl...
Bayesian default probability models
Andrlíková, Petra
2014-01-01
This paper proposes a methodology for default probability estimation for low default portfolios, where the statistical inference may become troublesome. The author suggests using logistic regression models with the Bayesian estimation of parameters. The piecewise logistic regression model and Box-Cox transformation of credit risk score is used to derive the estimates of probability of default, which extends the work by Neagu et al. (2009). The paper shows that the Bayesian models are more acc...
The separation of control variables in an H⁻ ion source
International Nuclear Information System (INIS)
This paper describes a successful methodology which was used to classify a series of waveforms taken from a 100 mA H⁻ ion source at Los Alamos. The series of 260 waveforms was divided into a "training" set and a "test" set. A sequence of mathematical transformations was performed on the "training" waveform data, which were then subjected to discriminant analysis. The analysis generates a set of filters which allow classification of an unknown waveform in the "test" set as being either stable or unstable; if stable, whether optimal or not; and if not optimal, which of the six control parameters should be adjusted to bring it to an optimal condition. We have found that the probability of successful classification using this methodology is 91.5%. 3 refs., 4 figs., 2 tabs
Impact of organic polyelectrolytes on coagulation of source-separated black water.
Kozminykh, Pavlo; Heistad, Arve; Ratnaweera, Harsha C; Todt, Daniel
2016-07-01
Household wastewater originates from everyday domestic activities and has a potentially harmful impact on the environment if discharged directly without proper treatment. Toilet wastewater, or black water (BW), contains urine, faeces, toilet paper and flushing water, and carries the majority of the pollutants generated by a single household. In this study, the focus was on BW treatment using chemical methods. The main goal of the current research was to define the possibility and applicability of conventional coagulants and flocculants in the direct chemical treatment of vacuum-collected BW to remove particles, organic matter and phosphorus. After defining dosing ranges, based on equivalent doses in conventional municipal and industrial wastewater treatment data, aluminium and iron coagulants, organic polyelectrolytes (polymers with anionic, neutral and cationic charge and different molecular weights) and their various combinations were tested using the well-known jar-test laboratory method to study aggregation and solid-liquid separation processes in raw BW. The most important process parameter during coagulation was the pH level, dependent on the type and doses of metal salts. Some side processes were found to occur while using iron-based coagulants. Dosing of either single coagulants or single polymers did not give satisfactory results, while a combination of aluminium salts and cationic polymers showed high removal rates for total suspended solids, total chemical oxygen demand and ortho-phosphates, reaching 97.8%, 92% and 98.6%, respectively, with the optimal doses of chemicals. Cationic polymers with the lowest molecular weight and highest charge density were the most efficient in combination with aluminium coagulants. PMID:26672384
Hirayama, Y.; Watanabe, Y. X.; Imai, N.; Ishiyama, H.; Jeong, S. C.; Jung, H. S.; Miyatake, H.; Oyaizu, M.; Kimura, S.; Mukai, M.; Kim, Y. H.; Sonoda, T.; Wada, M.; Huyse, M.; Kudryavtsev, Yu.; Van Duppen, P.
2016-06-01
The KEK Isotope Separation System (KISS) has been developed at RIKEN to produce neutron-rich isotopes with N = 126 in order to study their β-decay properties for application to astrophysics. KISS is an element-selective mass-separation system which consists of an argon gas-cell-based laser ion source for atomic-number selection and an ISOL mass-separation system. The argon gas cell of KISS is a key component to stop and collect the unstable nuclei produced in a multi-nucleon transfer reaction, where the isotopes of interest are selectively ionized using laser resonance ionization. We have performed off- and on-line experiments to study the basic properties of the gas cell as well as of the KISS. We successfully extracted laser-ionized stable 56Fe atoms (direct implantation of a 56Fe beam into the gas cell) and 198Pt atoms (emitted from a 198Pt target by elastic scattering with a 136Xe beam) from the KISS during the commissioning on-line experiments. We furthermore extracted laser-ionized unstable 199Pt atoms and confirmed that the measured half-life was in good agreement with the reported value.
Korucu, M Kemal; Kaplan, Özgür; Büyük, Osman; Güllü, M Kemal
2016-10-01
In this study, we investigate the usability of sound recognition for the source separation of packaging wastes in reverse vending machines (RVMs). For this purpose, an experimental setup equipped with a sound recording mechanism was prepared. Packaging waste sounds generated by three physical impacts, namely free falling, pneumatic hitting and hydraulic crushing, were separately recorded using two different microphones. To classify the waste types and sizes based on the sound features of the wastes, support vector machine (SVM)- and hidden Markov model (HMM)-based sound classification systems were developed. In the basic experimental setup, in which only the free-falling impact type was considered, the SVM and HMM systems provided 100% classification accuracy for both microphones. In the expanded experimental setup, which includes all three impact types, material type classification accuracies were 96.5% for the dynamic microphone and 97.7% for the condenser microphone. When both the material type and the size of the wastes were classified, the accuracy was 88.6% for the microphones. The modeling studies indicated that hydraulic crushing impact recordings were too noisy for an effective sound recognition application. In the detailed analysis of the recognition errors, it was observed that most of the errors occurred for the hitting impact type. According to the experimental results, it can be said that the proposed novel approach to the separation of packaging wastes could provide a high classification performance for RVMs. PMID:27378630
Study of a single-charged ions ECR source matching of the extracted beam to an isotope separator
International Nuclear Information System (INIS)
A new ECR ion source has been designed and studied for single-charged ion beams. A very stable regime has been obtained with an ion source made of two identical stages in cascade. The RF power supplies consist of two 2.45 GHz magnetrons. The discharge chamber is made of two coaxial Pyrex tubes; the external one ensures vacuum and HT insulation. The tubes are aligned inside the two multi-mode cavities, axially limited by three magnetic coils. The ion beam is extracted at 20 kV and focused with electric lenses. For argon and xenon, 1 mA single-charged ion currents have been extracted. The influence of various parameters has been progressively studied with a set-up including a 60° analyzing magnet and with the 120° on-line isotope separator at SARA. From the emittances and images observed, it appears difficult to compensate space-charge effects. Suggestions and future developments are proposed to improve the quality of the isotopic separation
Naroznova, Irina; Møller, Jacob; Larsen, Bjarne; Scheutz, Charlotte
2016-04-01
A new technology for pre-treating source-separated organic household waste prior to anaerobic digestion was assessed, and its performance was compared to existing alternative pre-treatment technologies. This pre-treatment technology is based on waste pulping with water, using a specially developed screw mechanism. The pre-treatment technology rejects more than 95% (wet weight) of non-biodegradable impurities in waste collected from households and generates biopulp ready for anaerobic digestion. Overall, 84-99% of biodegradable material (on a dry weight basis) in the waste was recovered in the biopulp. The biochemical methane potential for the biopulp was 469 ± 7 mL CH4/g ash-free mass. Moreover, all Danish and European Union requirements regarding the content of hazardous substances in biomass intended for land application were fulfilled. Compared to other pre-treatment alternatives, the screw-pulping technology showed higher biodegradable material recovery, lower electricity consumption and comparable water consumption. The higher material recovery achieved with the technology was associated with greater transfer of nutrients (N and P) and carbon (total and biogenic), but also heavy metals (except Pb), to the produced biomass. The data generated in this study could be used for the environmental assessment of the technology and thus help in selecting the best pre-treatment technology for source-separated organic household waste. PMID:26868847
Zhu, G. F.; Li, X.; Su, Y. H.; Zhang, K.; Bai, Y.; Ma, J. Z.; Li, C. B.; Hu, X. L.; He, J. H.
2014-07-01
Based on direct measurements of half-hourly canopy evapotranspiration (ET; W m-2) using the eddy covariance (EC) system and daily soil evaporation (E; mm day-1) using microlysimeters over a crop ecosystem in arid northwestern China from 27 May to 14 September 2013, a Bayesian method was used to simultaneously parameterize the soil surface and canopy resistances in the Shuttleworth-Wallace (S-W) model. Four of the six parameters showed relatively large uncertainty reductions (> 50%), and their posterior distributions became approximately symmetric with distinctive modes. There was a moderately good agreement between measured and simulated values of half-hourly ET and daily E, with linear regressions of y = 0.84 x + 0.18 (R2 = 0.83) and y = 1.01 x + 0.01 (R2 = 0.82), respectively. The underestimation of ET by the S-W model was possibly attributable to microscale advection, which can contribute added energy in the form of downward sensible heat fluxes to the ET process. Therefore, the advection process should be taken into account in simulating ET over heterogeneous land surfaces. Underestimations were also observed on or shortly after rainy days, which may be due to direct evaporation of liquid water intercepted in the canopy. Thus, the canopy interception model should be coupled to the S-W model in long-term ET simulation.
Computationally efficient Bayesian inference for inverse problems.
Energy Technology Data Exchange (ETDEWEB)
Marzouk, Youssef M.; Najm, Habib N.; Rahn, Larry A.
2007-10-01
Bayesian statistics provides a foundation for inference from noisy and incomplete data, a natural mechanism for regularization in the form of prior information, and a quantitative assessment of uncertainty in the inferred results. Inverse problems - representing indirect estimation of model parameters, inputs, or structural components - can be fruitfully cast in this framework. Complex and computationally intensive forward models arising in physical applications, however, can render a Bayesian approach prohibitive. This difficulty is compounded by high-dimensional model spaces, as when the unknown is a spatiotemporal field. We present new algorithmic developments for Bayesian inference in this context, showing strong connections with the forward propagation of uncertainty. In particular, we introduce a stochastic spectral formulation that dramatically accelerates the Bayesian solution of inverse problems via rapid evaluation of a surrogate posterior. We also explore dimensionality reduction for the inference of spatiotemporal fields, using truncated spectral representations of Gaussian process priors. These new approaches are demonstrated on scalar transport problems arising in contaminant source inversion and in the inference of inhomogeneous material or transport properties. We also present a Bayesian framework for parameter estimation in stochastic models, where intrinsic stochasticity may be intermingled with observational noise. Evaluation of a likelihood function may not be analytically tractable in these cases, and thus several alternative Markov chain Monte Carlo (MCMC) schemes, operating on the product space of the observations and the parameters, are introduced.
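The surrogate-accelerated inversion idea above can be illustrated with a deliberately tiny toy: a quadratic surrogate fitted on three runs of an "expensive" forward model, then random-walk Metropolis sampling on the surrogate posterior. Nothing here reflects the authors' stochastic spectral formulation; it is only a schematic, and every numeric choice is illustrative:

```python
# Toy sketch: replace a forward model by a cheap polynomial surrogate,
# then run a random-walk Metropolis chain on the surrogate posterior.
import math
import random

def forward(theta):                 # stand-in for an expensive forward model
    return theta + 0.1 * theta ** 2

# Fit a quadratic surrogate a + b*t + c*t^2 through three forward runs.
ts = [-1.0, 0.0, 1.0]
ys = [forward(t) for t in ts]
c = (ys[2] + ys[0]) / 2 - ys[1]
b = (ys[2] - ys[0]) / 2
a = ys[1]
surrogate = lambda t: a + b * t + c * t * t

def log_post(t, data=0.6, noise=0.1):
    """Gaussian likelihood with a flat prior, evaluated on the surrogate."""
    r = data - surrogate(t)
    return -0.5 * (r / noise) ** 2

random.seed(1)
t, samples = 0.0, []
for _ in range(5000):               # random-walk Metropolis on the surrogate
    prop = t + random.gauss(0.0, 0.2)
    if math.log(random.random()) < log_post(prop) - log_post(t):
        t = prop
    samples.append(t)
mean = sum(samples[1000:]) / len(samples[1000:])
print(round(mean, 2))               # near the root of forward(t) = 0.6
```

The point of the sketch is only the structure: the chain evaluates the cheap surrogate thousands of times while the expensive model was run just three times.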
Heimann, I.; Bright, V. B.; McLeod, M. W.; Mead, M. I.; Popoola, O. A. M.; Stewart, G. B.; Jones, R. L.
2015-07-01
To carry out detailed source attribution for air quality assessment it is necessary to distinguish pollutant contributions that arise from local emissions from those attributable to non-local or regional emission sources. Frequently this requires the use of complex models and inversion methods, prior knowledge or assumptions regarding the pollution environment. In this paper we demonstrate how high spatial density and fast response measurements from low-cost sensor networks may facilitate this separation. A purely measurement-based approach to extract underlying pollution levels (baselines) from the measurements is presented exploiting the different relative frequencies of local and background pollution variations. This paper shows that if high spatial and temporal coverage of air quality measurements are available, the different contributions to the total pollution levels, namely the regional signal as well as near and far field local sources, can be quantified. The advantage of using high spatial resolution observations, as can be provided by low-cost sensor networks, lies in the fact that no prior assumptions about pollution levels at individual deployment sites are required. The methodology we present here, utilising measurements of carbon monoxide (CO), has wide applicability, including additional gas phase species and measurements obtained using reference networks. While similar studies have been performed, this is the first study using networks at this density, or using low cost sensor networks.
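The frequency-based separation of background and local contributions described above can be sketched with a rolling low-percentile filter: a slow-moving baseline tracks the regional signal, and the fast residual is attributed to local sources. The window length and percentile below are illustrative choices, not the paper's method:

```python
# Sketch of measurement-only baseline extraction: estimate the slowly
# varying background as a rolling low quantile, attribute the fast
# residual to local sources. Window and quantile are illustrative.
def rolling_baseline(series, window=25, q=0.1):
    """Rolling q-quantile baseline of a 1-D series (list of floats)."""
    half = window // 2
    out = []
    for i in range(len(series)):
        seg = sorted(series[max(0, i - half):i + half + 1])
        out.append(seg[int(q * (len(seg) - 1))])
    return out

co = [100.0] * 50          # flat regional CO background (illustrative units)
co[20] += 80.0             # a short spike from a nearby local source
base = rolling_baseline(co)
local = [c - b for c, b in zip(co, base)]
print(max(local))          # the spike survives in the local residual: 80.0
```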
Bayesian inference of mass segregation of open clusters
Shao, Zhengyi; Chen, Li; Lin, Chien-Cheng; Zhong, Jing; Hou, Jinliang
2015-08-01
Based on the Bayesian inference (BI) method, the mixture-modeling approach is improved to combine all kinematic data, including the coordinate position, proper motion (PM) and radial velocity (RV), to separate the motion of the cluster from field stars in its area, as well as to describe the intrinsic kinematic status. Meanwhile, the membership probabilities of individual stars are determined as by-products. This method has been tested on simulations of toy models, and it was found that the joint usage of multiple kinematic data can significantly reduce the missing rate of membership determination, from ~15% for a single data type to ~1% when all position, proper motion and radial velocity data are used. By combining kinematic data from multiple photometric and redshift surveys, such as WIYN and APOGEE, M67 and NGC188 are revisited. Mass segregation is identified clearly for both of these old open clusters, in both position and PM spaces, since the Bayesian evidence (BE) of the model including the segregation parameters is much larger than that of the model without them. Ongoing work is applying this method to the LAMOST released data, which contain a large number of RVs covering ~200 nearby open clusters. If the coming Gaia data can be used, the accuracy of tangential velocities, which are usually less than 1 km/s, will be largely improved and the intrinsic kinematics of open clusters can be well investigated.
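The mixture-model membership probability at the heart of such analyses can be sketched in a single proper-motion coordinate: a narrow Gaussian for cluster members plus a broad Gaussian for field stars. All parameters below are illustrative, not fitted values from the paper:

```python
# Toy sketch of cluster/field membership probability from a two-component
# Gaussian mixture in one proper-motion coordinate. Parameters are
# illustrative: a narrow cluster component and a broad field component.
import math

def gauss(x, mu, sigma):
    """Gaussian probability density."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def membership_prob(pm, f_c=0.3, mu_c=0.0, s_c=1.0, mu_f=0.0, s_f=10.0):
    """P(cluster | pm): posterior membership probability for one star."""
    pc = f_c * gauss(pm, mu_c, s_c)          # cluster component, weight f_c
    pf = (1.0 - f_c) * gauss(pm, mu_f, s_f)  # broad field component
    return pc / (pc + pf)

print(round(membership_prob(0.0), 3))   # near the cluster mean: high probability
print(round(membership_prob(8.0), 6))   # deep in the field wings: essentially zero
```

In a full analysis the same posterior ratio is evaluated jointly over position, PM and RV, which is what drives the missing-rate reduction quoted above.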
DEFF Research Database (Denmark)
Naroznova, Irina; Møller, Jacob; Scheutz, Charlotte
2016-01-01
This study is dedicated to characterising the chemical composition and biochemical methane potential (BMP) of individual material fractions in untreated Danish source-separated organic household waste (SSOHW). First, data on SSOHW in different countries, available in the literature, were evaluated... ...) and material degradability (BMP from laboratory incubation tests divided by TBMP) were expressed. Moreover, the degradability of lignocellulose biofibres (the share of volatile lignocellulose biofibre solids degraded in laboratory incubation tests) was calculated. Finally, BMP for average SSOHW composition in Denmark (untreated) was calculated, and the BMP contribution of the individual material fractions was then evaluated. Material fractions of the two general waste types, defined as "food waste" and "fibre-rich waste," were found to be anaerobically degradable with considerable BMP. Material...
Jimenez, Jose; Bott, Charles; Love, Nancy; Bratby, John
2015-12-01
Municipal wastewater contains a mixture of brown (feces and toilet paper), yellow (urine), and gray (kitchen, bathroom and wash) waters. Urine contributes approximately 70-80% of the nitrogen (N) load, 50-70% of the phosphorus (P) load and 60-70% of the pharmaceutical residues in normal domestic sewage. This study evaluated the impact of different levels of source separation of urine on an existing biological nutrient removal (BNR) process, using a process model of an existing BNR plant. Increasing the amount of urine diverted from the water reclamation facility has little impact on the effluent ammonia (NH₃-N) concentration, but the effluent nitrate (NO₃-N) concentration decreases. If nitrification is necessary, then no reduction in the sludge age can be realized. However, a point is reached where the remaining influent nitrogen load matches the nitrogen requirements for biomass growth, and no residual nitrogen needs to be nitrified. That allows a significant reduction in sludge age, implying reduced process volume requirements. In situations where nitrification is required, lower effluent nitrate concentrations were realized due to both the lower influent nitrogen content in the wastewater and a more favorable nitrogen-to-carbon ratio for denitrification. The external carbon requirement for denitrification decreases as the urine separation efficiency increases, for the same reasons. The effluent phosphorus concentration decreases when the amount of urine sent to the water reclamation facility is decreased, due to the lower influent phosphorus concentrations. In the case of chemical phosphate removal, urine separation reduces the amount of chemicals required. PMID:26652123
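The urine-diversion arithmetic above can be sketched as a one-line mass balance. The 75% nitrogen share is a mid-range assumption taken from the 70-80% figure quoted in the abstract; the influent concentration is illustrative:

```python
# Back-of-envelope urine-diversion nitrogen mass balance (illustrative).
URINE_N_SHARE = 0.75   # assumed fraction of influent N carried by urine (mid-range of 70-80%)

def remaining_n_load(total_n, urine_diversion):
    """Influent N load remaining after diverting a fraction of urine at source.

    total_n: influent N load or concentration; urine_diversion: 0..1.
    """
    return total_n * (1.0 - URINE_N_SHARE * urine_diversion)

print(remaining_n_load(40.0, 0.5))  # e.g. mg/L: 40 * (1 - 0.375) = 25.0
```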
Pires, Carlos; Ribeiro, Andreia
2016-04-01
An efficient nonlinear method of statistical source separation of space-distributed non-Gaussian data is proposed. The method relies on so-called Independent Subspace Analysis (ISA) and is tested on a long time-series of the stream-function field of an atmospheric quasi-geostrophic 3-level model (QG3) simulating the winter monthly variability of the Northern Hemisphere. ISA generalizes Independent Component Analysis (ICA) by looking for multidimensional, minimally dependent, uncorrelated and non-Gaussian statistical sources among the rotated projections or subspaces of the multivariate probability distribution of the leading principal components of the working field, whereas ICA is restricted to scalar sources. The rationale of the technique relies upon projection pursuit, looking for data projections of enhanced interest. In order to accomplish the decomposition, we maximize measures of the sources' non-Gaussianity through contrast functions given by squares of nonlinear, cross-cumulant-based correlations involving the variables spanning the sources. Sources are therefore sought that match certain nonlinear data structures. The maximized contrast function is built in such a way that it provides the minimization of the mean square of the residuals of certain nonlinear regressions. The issuing residuals, followed by spherization, provide a new set of nonlinear variable changes that are at once uncorrelated, quasi-independent and quasi-Gaussian, an advantage with respect to the Independent Components (scalar sources) obtained by ICA, where the non-Gaussianity is concentrated into the non-Gaussian scalar sources. The new scalar sources obtained by the above process encompass the attractor's curvature, thus providing improved nonlinear model indices of the low-frequency atmospheric variability, which is useful since large circulation indices are nonlinearly correlated. The non-Gaussian tested sources (dyads and
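The scalar-source baseline that ISA generalizes can be sketched in a few lines of numpy. The following is an illustrative reconstruction of a plain ICA separation (a symmetric FastICA-style fixed point with a tanh contrast), not the authors' ISA code; the two sources and the mixing matrix are invented for the demo:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000
# Two invented non-Gaussian sources: sub-Gaussian (uniform) and super-Gaussian (Laplace)
S = np.vstack([rng.uniform(-1, 1, n), rng.laplace(0, 1, n)])
A = np.array([[1.0, 0.6], [0.4, 1.0]])           # unknown mixing matrix
X = A @ S                                        # observed mixtures

# Whiten the mixtures (zero mean, identity covariance)
X = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(X))
Z = E @ np.diag(d ** -0.5) @ E.T @ X

# Symmetric FastICA fixed-point iteration with tanh contrast
W = rng.standard_normal((2, 2))
for _ in range(200):
    G = np.tanh(W @ Z)
    W1 = G @ Z.T / n - np.diag((1 - G ** 2).mean(axis=1)) @ W
    U, _, Vt = np.linalg.svd(W1)                 # decorrelate: (W1 W1^T)^(-1/2) W1
    W = U @ Vt
Y = W @ Z                                        # recovered sources (order and sign ambiguous)
```

ISA replaces the scalar unmixed rows `Y` with minimally dependent multidimensional subspaces, but the whitening-plus-rotation structure is the same.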
Directory of Open Access Journals (Sweden)
Duarte L.T.
2014-03-01
Full Text Available The development of chemical sensor arrays based on Blind Source Separation (BSS) provides a promising solution to overcome the interference problem associated with Ion-Selective Electrodes (ISE). The main motivation behind this new approach is to ease the time-demanding calibration stage. While the first works on this problem only considered the case in which the ions under analysis have equal valences, the present work aims at developing a BSS technique that works when the ions have different charges. In this situation, the resulting mixing model belongs to a particular class of nonlinear systems that have never been studied in the BSS literature. In order to tackle this sort of mixing process, we adopted a recurrent network as the separating system. Moreover, concerning the BSS learning strategy, we develop a mutual information minimization approach based on the notion of the differential of the mutual information. The method requires batch operation and thus can be used to perform off-line analysis. The validity of our approach is supported by experiments in which the mixing model parameters were extracted from actual data.
Jaatinen, Sanna T; Palmroth, Marja R T; Rintala, Jukka A; Tuhkanen, Tuula A
2016-09-01
The behaviour of pharmaceuticals related to human immunodeficiency virus treatment was studied in the liquid phase of source-separated urine during six-month storage at 20°C. Six months is the recommended time for hygienization and use of urine as fertilizer. Compounds were spiked in urine at concentrations calculated to appear in urine. Assays were performed with separate compounds and as therapeutic groups of antivirals, antibiotics and anti-tuberculotics. In addition, urine was amended either with faeces or urease inhibitor. The pharmaceutical concentrations were monitored from filtered samples with solid phase extraction and liquid chromatography. The concentration reductions of the studied compounds as such or with amendments ranged from less than 1% to more than 99% after six-month storage. The reductions without amendments were 41.9-99% for anti-tuberculotics; <52% for antivirals (except with 3TC 75.6%) and <50% for antibiotics. In assays with amendments, the reductions were all <50%. Faeces amendment resulted in similar or lower reduction than without it, even though bacterial activity should have increased. The urease inhibitor prevented ureolysis and pH rise but did not affect pharmaceutical removal. In conclusion, removal during storage might not be enough to reduce risks associated with the studied pharmaceuticals, in which case other feasible treatment practices or urine utilization means should be considered. PMID:26804243
Meng, Ying-ying; Feng, Cang; Li, Tian; Wang, Ling
2009-12-01
Dry-weather flow quantity and quality of three representative separate storm sewer systems in Shanghai (H, G and N) were studied. Based on a survey of the operating status of the pumping stations as well as the characteristics of the drainage systems, the interception sewage volumes per unit area in the three systems were found to be 3610, 1550 and 2970 m3/(km2 x d) respectively, of which sanitary wastewater accounted for 25%, 85% and 71% respectively. The interception volume of H was mainly composed of infiltrated underground water, so its dry-weather flow pollution was slight, while the interception volumes of G and N were both mainly composed of sanitary wastewater, so their dry-weather flow pollution was relatively serious. The water characteristics of the potential illicit discharge sources of dry-weather flow (grey water, black water and underground water) were preliminarily explored, and three parameters (LAS/NH4+ -N, NH4+ -N/K and Mg/K) were put forward as tracer parameters of grey water, black water and underground water. Moreover, the water characteristics of grey water and of sanitary wastewater including black water were summarized: the feature of grey water was LAS/NH4+ -N > 0.2, NH4+ -N/K 1. On this basis, the applications of the flow chart method and the CMBM method in dry-weather flow detection of the monitored storm systems were preliminarily discussed, and the results were basically the same as those obtained in the comprehensive analysis of flow quantity and quality. The research results and methods can provide guidance for the analysis and diagnosis of dry-weather flow sources and subsequent reconstruction projects in similar separate storm sewer systems in China. PMID:20187382
Bayesian least squares deconvolution
Asensio Ramos, A.; Petit, P.
2015-11-01
Aims: We develop a fully Bayesian least squares deconvolution (LSD) that can be applied to the reliable detection of magnetic signals in noise-limited stellar spectropolarimetric observations using multiline techniques. Methods: We consider LSD under the Bayesian framework and we introduce a flexible Gaussian process (GP) prior for the LSD profile. This prior allows the result to automatically adapt to the presence of signal. We exploit several linear algebra identities to accelerate the calculations. The final algorithm can deal with thousands of spectral lines in a few seconds. Results: We demonstrate the reliability of the method with synthetic experiments and we apply it to real spectropolarimetric observations of magnetic stars. We are able to recover the magnetic signals using a small number of spectral lines, together with the uncertainty at each velocity bin. This allows the user to consider if the detected signal is reliable. The code to compute the Bayesian LSD profile is freely available.
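Because LSD is linear in the common profile and the prior is a GP, the posterior has the closed linear-Gaussian form. The toy below (an invented five-line spectrum, not the authors' code or data) shows the core computation: a line-pattern matrix `M` maps the velocity-space profile `z` to the spectrum, and the GP-prior posterior mean is `K Mᵀ (M K Mᵀ + σ²I)⁻¹ y`:

```python
import numpy as np

rng = np.random.default_rng(1)
nv, nlam = 21, 300
v = np.linspace(-10, 10, nv)
true_z = 0.8 * np.exp(-0.5 * (v / 2.5) ** 2)      # hidden common line profile

# M: each spectral line is a weighted, shifted copy of the profile (toy line list)
weights, centers = [0.9, 0.5, 0.7, 0.4, 0.6], [40, 90, 150, 200, 255]
M = np.zeros((nlam, nv))
for w, c in zip(weights, centers):
    M[c:c + nv, :] += w * np.eye(nv)

sigma = 0.05
y = M @ true_z + sigma * rng.standard_normal(nlam)  # noisy observed spectrum

# Squared-exponential GP prior over the LSD profile, and its Gaussian posterior
K = np.exp(-0.5 * (v[:, None] - v[None, :]) ** 2 / 2.0 ** 2)
Sy = M @ K @ M.T + sigma ** 2 * np.eye(nlam)
zhat = K @ M.T @ np.linalg.solve(Sy, y)             # posterior mean profile
cov = K - K @ M.T @ np.linalg.solve(Sy, M @ K)      # posterior covariance
```

The diagonal of `cov` gives the per-velocity-bin uncertainty the abstract mentions; the linear-algebra identities the paper exploits accelerate exactly these solves for thousands of lines.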
Bayesian least squares deconvolution
Asensio Ramos, A.
2015-01-01
Aims. To develop a fully Bayesian least squares deconvolution (LSD) that can be applied to the reliable detection of magnetic signals in noise-limited stellar spectropolarimetric observations using multiline techniques. Methods. We consider LSD under the Bayesian framework and we introduce a flexible Gaussian Process (GP) prior for the LSD profile. This prior allows the result to automatically adapt to the presence of signal. We exploit several linear algebra identities to accelerate the calculations. The final algorithm can deal with thousands of spectral lines in a few seconds. Results. We demonstrate the reliability of the method with synthetic experiments and we apply it to real spectropolarimetric observations of magnetic stars. We are able to recover the magnetic signals using a small number of spectral lines, together with the uncertainty at each velocity bin. This allows the user to consider if the detected signal is reliable. The code to compute the Bayesian LSD profile is freely available.
Loredo, T J
2004-01-01
I describe a framework for adaptive scientific exploration based on iterating an Observation--Inference--Design cycle that allows adjustment of hypotheses and observing protocols in response to the results of observation on-the-fly, as data are gathered. The framework uses a unified Bayesian methodology for the inference and design stages: Bayesian inference to quantify what we have learned from the available data and predict future data, and Bayesian decision theory to identify which new observations would teach us the most. When the goal of the experiment is simply to make inferences, the framework identifies a computationally efficient iterative ``maximum entropy sampling'' strategy as the optimal strategy in settings where the noise statistics are independent of signal properties. Results of applying the method to two ``toy'' problems with simulated data--measuring the orbit of an extrasolar planet, and locating a hidden one-dimensional object--show the approach can significantly improve observational eff...
Bayesian and frequentist inequality tests
David M. Kaplan; Zhuo, Longhao
2016-01-01
Bayesian and frequentist criteria are fundamentally different, but often posterior and sampling distributions are asymptotically equivalent (and normal). We compare Bayesian and frequentist hypothesis tests of inequality restrictions in such cases. For finite-dimensional parameters, if the null hypothesis is that the parameter vector lies in a certain half-space, then the Bayesian test has (frequentist) size $\alpha$; if the null hypothesis is any other convex subspace, then the Bayesian test...
Disentangling Overlapping Astronomical Sources using Spatial and Spectral Information
Jones, David E; van Dyk, David A
2014-01-01
We present a powerful new algorithm that combines both spatial information (event locations and the point spread function) and spectral information (photon energies) to separate photons from overlapping sources. We use Bayesian statistical methods to simultaneously infer the number of overlapping sources, to probabilistically separate the photons among the sources, and to fit the parameters describing the individual sources. Using the Bayesian joint posterior distribution, we are able to coherently quantify the uncertainties associated with all these parameters. The advantages of combining spatial and spectral information are demonstrated through a simulation study. The utility of the approach is then illustrated by analysis of observations of FK Aqr and FL Aqr with the XMM-Newton Observatory and the central region of the Orion Nebula Cluster with the Chandra X-ray Observatory.
Disentangling Overlapping Astronomical Sources Using Spatial and Spectral Information
Jones, David E.; Kashyap, Vinay L.; van Dyk, David A.
2015-08-01
We present a powerful new algorithm that combines both spatial information (event locations and the point-spread function) and spectral information (photon energies) to separate photons from overlapping sources. We use Bayesian statistical methods to simultaneously infer the number of overlapping sources, to probabilistically separate the photons among the sources, and to fit the parameters describing the individual sources. Using the Bayesian joint posterior distribution, we are able to coherently quantify the uncertainties associated with all these parameters. The advantages of combining spatial and spectral information are demonstrated through a simulation study. The utility of the approach is then illustrated by analysis of observations of FK Aqr and FL Aqr with the XMM-Newton Observatory and the central region of the Orion Nebula Cluster with the Chandra X-ray Observatory.
Bayesian multiple target tracking
Streit, Roy L
2013-01-01
This second edition has undergone substantial revision from the 1999 first edition, recognizing that a lot has changed in the multiple target tracking field. One of the most dramatic changes is in the widespread use of particle filters to implement nonlinear, non-Gaussian Bayesian trackers. This book views multiple target tracking as a Bayesian inference problem. Within this framework it develops the theory of single target tracking, multiple target tracking, and likelihood ratio detection and tracking. In addition to providing a detailed description of a basic particle filter that implements
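The particle-filter machinery the book builds on can be illustrated with a minimal bootstrap filter for a single 1-D target (an invented constant-velocity scenario, not an example from the book): propagate particles through the dynamics, weight by the measurement likelihood, resample.

```python
import numpy as np

rng = np.random.default_rng(6)
T, N = 50, 2000
# Hypothetical 1-D target: state = [position, velocity]
x_true = np.zeros((T, 2))
x_true[0] = [0.0, 1.0]
q, r = 0.05, 0.5                                  # process / measurement noise std
for t in range(1, T):
    x_true[t, 1] = x_true[t - 1, 1] + q * rng.standard_normal()
    x_true[t, 0] = x_true[t - 1, 0] + x_true[t, 1]
z = x_true[:, 0] + r * rng.standard_normal(T)     # noisy position measurements

# Bootstrap particle filter
p = np.zeros((N, 2))
p[:, 1] = rng.normal(1.0, 0.5, N)                 # prior over velocity
est = []
for t in range(T):
    if t > 0:
        p[:, 1] += q * rng.standard_normal(N)     # propagate dynamics
        p[:, 0] += p[:, 1]
    w = np.exp(-0.5 * ((z[t] - p[:, 0]) / r) ** 2)  # Gaussian likelihood weights
    w /= w.sum()
    est.append(w @ p[:, 0])                       # posterior-mean position estimate
    p = p[rng.choice(N, N, p=w)]                  # multinomial resampling
est = np.array(est)
```

Multiple-target trackers layer data association on top of this single-target core, which is why the book treats the single-target filter first.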
Bayesian Exploratory Factor Analysis
DEFF Research Database (Denmark)
Conti, Gabriella; Frühwirth-Schnatter, Sylvia; Heckman, James J.;
2014-01-01
This paper develops and applies a Bayesian approach to Exploratory Factor Analysis that improves on ad hoc classical approaches. Our framework relies on dedicated factor models and simultaneously determines the number of factors, the allocation of each measurement to a unique factor, and the...... corresponding factor loadings. Classical identification criteria are applied and integrated into our Bayesian procedure to generate models that are stable and clearly interpretable. A Monte Carlo study confirms the validity of the approach. The method is used to produce interpretable low dimensional aggregates...
Explanation mode for Bayesian automatic object recognition
Hazlett, Thomas L.; Cofer, Rufus H.; Brown, Harold K.
1992-09-01
One of the more useful techniques to emerge from AI is the provision of an explanation modality used by the researcher to understand and subsequently tune the reasoning of an expert system. Such a capability, missing in the arena of statistical object recognition, is not that difficult to provide. Long-standing results show that the paradigm of Bayesian object recognition is truly optimal in a minimum probability of error sense. To a large degree, the Bayesian paradigm achieves optimality through adroit fusion of a wide range of lower informational data sources to give a higher quality decision--a very 'expert system' like capability. When various sources of incoming data are represented by C++ classes, it becomes possible to automatically backtrack the Bayesian data fusion process, assigning relative weights to the more significant datums and their combinations. A C++ object oriented engine is then able to synthesize 'English' like textual description of the Bayesian reasoning suitable for generalized presentation. Key concepts and examples are provided based on an actual object recognition problem.
Bayesian Geostatistical Design
DEFF Research Database (Denmark)
Diggle, Peter; Lophaven, Søren Nymand
2006-01-01
locations to, or deletion of locations from, an existing design, and prospective design, which consists of choosing positions for a new set of sampling locations. We propose a Bayesian design criterion which focuses on the goal of efficient spatial prediction whilst allowing for the fact that model...
Czech Academy of Sciences Publication Activity Database
Krejsa, Jiří; Věchet, S.
Bratislava: Slovak University of Technology in Bratislava, 2010, s. 217-222. ISBN 978-80-227-3353-3. [Robotics in Education . Bratislava (SK), 16.09.2010-17.09.2010] Institutional research plan: CEZ:AV0Z20760514 Keywords : mobile robot localization * bearing only beacons * Bayesian filters Subject RIV: JD - Computer Applications, Robotics
DEFF Research Database (Denmark)
Antoniou, Constantinos; Harrison, Glenn W.; Lau, Morten I.;
2015-01-01
A large literature suggests that many individuals do not apply Bayes’ Rule when making decisions that depend on them correctly pooling prior information and sample data. We replicate and extend a classic experimental study of Bayesian updating from psychology, employing the methods of experimenta...
Noncausal Bayesian Vector Autoregression
DEFF Research Database (Denmark)
Lanne, Markku; Luoto, Jani
We propose a Bayesian inferential procedure for the noncausal vector autoregressive (VAR) model that is capable of capturing nonlinearities and incorporating effects of missing variables. In particular, we devise a fast and reliable posterior simulator that yields the predictive distribution as a...
Loredo, Thomas J.
2004-04-01
I describe a framework for adaptive scientific exploration based on iterating an Observation-Inference-Design cycle that allows adjustment of hypotheses and observing protocols in response to the results of observation on-the-fly, as data are gathered. The framework uses a unified Bayesian methodology for the inference and design stages: Bayesian inference to quantify what we have learned from the available data and predict future data, and Bayesian decision theory to identify which new observations would teach us the most. When the goal of the experiment is simply to make inferences, the framework identifies a computationally efficient iterative ``maximum entropy sampling'' strategy as the optimal strategy in settings where the noise statistics are independent of signal properties. Results of applying the method to two ``toy'' problems with simulated data-measuring the orbit of an extrasolar planet, and locating a hidden one-dimensional object-show the approach can significantly improve observational efficiency in settings that have well-defined nonlinear models. I conclude with a list of open issues that must be addressed to make Bayesian adaptive exploration a practical and reliable tool for optimizing scientific exploration.
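In the linear-Gaussian setting the maximum entropy sampling strategy reduces to "observe where the predictive variance is largest". The sketch below is a hedged illustration of that idea, not Loredo's code: a Bayesian cubic regression (invented model and coefficients) that iterates the Observation-Inference-Design cycle by always querying the most uncertain candidate point.

```python
import numpy as np

rng = np.random.default_rng(2)

def phi(x):                          # cubic feature map (assumed model class)
    return np.stack([np.ones_like(x), x, x ** 2, x ** 3], axis=-1)

true_w = np.array([0.5, -1.0, 0.3, 0.8])     # hidden coefficients (invented)
sigma, grid = 0.1, np.linspace(-1, 1, 41)    # noise level, candidate designs

A = np.eye(4)                        # posterior precision (prior: w ~ N(0, I))
b = np.zeros(4)                      # precision-weighted mean
picked = []
for _ in range(8):                   # Observation-Inference-Design cycle
    cov = np.linalg.inv(A)
    pv = np.einsum('ij,jk,ik->i', phi(grid), cov, phi(grid))  # predictive variance
    x = grid[np.argmax(pv)]          # maximum entropy sampling: most uncertain point
    f = phi(np.array([x]))[0]
    y = f @ true_w + sigma * rng.standard_normal()            # simulate observation
    A += np.outer(f, f) / sigma ** 2                          # Bayesian update
    b += f * y / sigma ** 2
    picked.append(x)
w_hat = np.linalg.solve(A, b)        # posterior mean of the coefficients
```

The design loop automatically spreads the queries over the informative parts of the interval rather than sampling uniformly, which is the efficiency gain the abstract reports for the orbit and object-location toy problems.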
Bayesian logistic regression analysis
Van Erp, H.R.N.; Van Gelder, P.H.A.J.M.
2012-01-01
In this paper we present a Bayesian logistic regression analysis. It is found that if one wishes to derive the posterior distribution of the probability of some event, then, together with the traditional Bayes Theorem and the integrating out of nuissance parameters, the Jacobian transformation is an
Kernel Approximate Bayesian Computation for Population Genetic Inferences
Nakagome, Shigeki; Fukumizu, Kenji; Mano, Shuhei
2012-01-01
Approximate Bayesian computation (ABC) is a likelihood-free approach for Bayesian inferences based on a rejection algorithm method that applies a tolerance of dissimilarity between summary statistics from observed and simulated data. Although several improvements to the algorithm have been proposed, none of these improvements avoid the following two sources of approximation: 1) lack of sufficient statistics: sampling is not from the true posterior density given data but from an approximate po...
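The basic rejection-ABC algorithm the abstract builds on fits in a few lines. This is a generic textbook sketch (estimating a Gaussian mean with invented data), not the kernel-ABC method of the paper:

```python
import numpy as np

rng = np.random.default_rng(3)
obs = rng.normal(3.0, 1.0, size=100)             # "observed" data, true mean 3
s_obs = obs.mean()                               # summary statistic

# Rejection ABC: draw mu from the prior, simulate a data set of the same size,
# keep mu only if the simulated summary is within eps of the observed summary
mu = rng.uniform(-10, 10, size=50_000)           # flat prior over the mean
s_sim = rng.normal(mu[:, None], 1.0, size=(50_000, 100)).mean(axis=1)
eps = 0.05                                       # tolerance of dissimilarity
post = mu[np.abs(s_sim - s_obs) < eps]           # approximate posterior sample
```

The two approximations the abstract names are visible here: the comparison uses a summary statistic rather than the full data, and the tolerance `eps` is nonzero; kernel ABC replaces the hard accept/reject with a smooth weighting of the summaries.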
Institute of Scientific and Technical Information of China (English)
陈霞; 田荣艳
2012-01-01
Based on analysis of the mechanism by which pyrotechnically actuated separation devices generate noise when separating underwater, and while satisfying the required separation margin, a special pyrotechnic separation device was studied to explore the relationship between charge weight and the sound-source level of the separation noise, and comparative underwater noise tests were carried out. The results provide a basis for the noise-reduction design of underwater weapons during underwater separation.
Jiménez-Gonzalez, Aída; James, Christopher J.
2013-11-01
Today, it is generally accepted that current methods for biophysical antenatal surveillance do not facilitate a comprehensive and reliable assessment of foetal well-being and thus, that continuing research into alternative methods is necessary to improve antenatal monitoring procedures. Here, attention has been paid to the abdominal phonogram, a signal that is recorded by positioning an acoustic sensor on the maternal womb and contains valuable information about foetal status, but which is hidden by maternal and environmental sources. To recover such information, this work describes single-channel independent component analysis (SCICA) as an alternative signal processing approach for analyzing the abdominal phonogram. The approach, based on the method of delays, the Temporal Decorrelation Source SEParation implementation (TDSEP) of Independent Component Analysis (ICA), and an automatic grouping algorithm, has managed to successfully retrieve estimates of: (1) the foetal cardiac activity (in the form of the foetal phonocardiogram, FPCG), (2) the maternal cardiovascular activity (in the form of the maternal phonocardiogram, MPCG, and/or pulse wave), (3) the maternal respiratory activity (in the form of the maternal respirogram, MResp), and (4) noise (N). These results have been obtained from a dataset of 25 single-channel phonograms and point at the possibilities of using SCICA to address a fundamental problem faced in antenatal surveillance, i.e. the extraction of information from a non-invasive signal like the abdominal phonogram. Future work will test the possibility of using SCICA to recover information regarding the foetal breathing movements (FBM), another physiological parameter of interest in foetal surveillance.
State Information in Bayesian Games
Cuff, Paul
2009-01-01
Two-player zero-sum repeated games are well understood. Computing the value of such a game is straightforward. Additionally, if the payoffs are dependent on a random state of the game known to one, both, or neither of the players, the resulting value of the game has been analyzed under the framework of Bayesian games. This investigation considers the optimal performance in a game when a helper is transmitting state information to one of the players. Encoding information for an adversarial setting (game) requires a different result than rate-distortion theory provides. Game theory has accentuated the importance of randomization (mixed strategy), which does not find a significant role in most communication modems and source coding codecs. Higher rates of communication, used in the right way, allow the message to include the necessary random component useful in games.
Isomer separation of $^{70g}Cu$ and $^{70m}Cu$ with a resonance ionization laser ion source
Köster, U; Mishin, V I; Weissman, L; Huyse, M; Kruglov, K; Müller, W F; Van Duppen, P; Van Roosbroeck, J; Thirolf, P G; Thomas, H C; Weisshaar, D W; Schulze, W; Borcea, R; La Commara, M; Schatz, H; Schmidt, K; Röttger, S; Huber, G; Sebastian, V; Kratz, K L; Catherall, R; Georg, U; Lettry, Jacques; Oinonen, M; Ravn, H L; Simon, H
2000-01-01
Radioactive copper isotopes were ionized with the resonance ionization laser ion source at the on-line isotope separator ISOLDE (CERN). Using the different hyperfine structure in the 3d^{10}4s ^{2}S_{1/2} - 3d^{10}4p ^{2}P^{0}_{1/2} transition, the low- and high-spin isomers of ^{70}Cu were selectively enhanced by tuning the laser wavelength. The light was provided by a narrow-bandwidth dye laser pumped by copper vapor lasers and frequency doubled in a BBO crystal. The ground-state to isomeric-state intensity ratio could be varied by a factor of 30, allowing gamma transitions to be assigned unambiguously to the decay of the individual isomers. It is shown that the method can also be used to determine magnetic moments. In a first experiment, a magnetic moment of (+)1.8(3) μ_N was deduced for the 1^{+} ground state of ^{70}Cu, and (±)1.2(3) μ_N for the high-spin isomer. (20 refs).
Graf, John; Taylor, Dale; Martinez, James
2014-01-01
Combined with a mechanical compressor, a Solid Electrolyte Oxygen Separator (SEOS) should be capable of producing ABO grade oxygen at pressures >2400 psia, on the space station. Feasibility tests using a SEOS integrated with a mechanical compressor identified an unexpected contaminant in the oxygen: water vapour was found in the oxygen product, sometimes at concentrations higher than 40 ppm (the ABO limit for water vapour is 7 ppm). If solid electrolyte membranes are really "infinitely selective" to oxygen as they are reported to be, where did the water come from? If water is getting into the oxygen, what other contaminants might get into the oxygen? Microscopic analyses of wafers, welds, and oxygen delivery tubes were performed in an attempt to find the source of the water vapour contamination. Hot and cold pressure decay tests were performed. Measurements of water vapour as a function of O2 delivery rate, O2 delivery pressure, and process air humidity levels were the most instructive in finding the source of water contamination (Fig 3). Water contamination was directly affected by oxygen delivery rate (doubling the oxygen production rate cut the water level in half). Water was affected by process air humidity levels and delivery pressure in a way that indicates the water was diffusing into the oxygen delivery system.
Liao, Wei; Hua, Xue-Ming; Zhang, Wang; Li, Fang
2014-05-01
In the present paper, the authors calculated the plasma's peak electron temperatures under different heat source separation distances in laser-pulse GMAW hybrid welding based on Boltzmann spectrometry. Plasma peak electron densities under the corresponding conditions were also calculated using the Stark width of the plasma spectrum. Combined with high-speed photography, the effect of heat source separation distance on electron temperature and electron density was studied. The results show that with increasing heat source separation distance, the electron temperatures and electron densities of the laser plasma did not change significantly. However, the electron temperatures of the arc plasma decreased, and the electron densities of the arc plasma first increased and then decreased. PMID:25095401
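The Boltzmann-plot method behind the temperature estimates is a straight-line fit: for lines of one species, ln(Iλ/gA) is linear in the upper-level energy with slope -1/(k_B T). A noiseless synthetic sketch (the line list and temperature below are invented, not the paper's data):

```python
import numpy as np

kB = 8.617e-5                       # Boltzmann constant, eV/K
T_true = 12000.0                    # assumed plasma temperature, K
# Hypothetical line data: upper-level energy E (eV), statistical weight times
# transition probability gA (s^-1), and wavelength lam (nm)
E   = np.array([2.5, 3.2, 3.9, 4.6, 5.3])
gA  = np.array([1.2e8, 3.0e8, 8.0e7, 2.2e8, 5.0e7])
lam = np.array([400.0, 420.0, 450.0, 480.0, 510.0])

# Emission intensities follow I ∝ (gA / lam) * exp(-E / (kB * T))
I = (gA / lam) * np.exp(-E / (kB * T_true))

# Boltzmann plot: ln(I * lam / gA) vs E has slope -1 / (kB * T)
slope, _ = np.polyfit(E, np.log(I * lam / gA), 1)
T_est = -1.0 / (kB * slope)
```

With real spectra the scatter of the points about the fitted line sets the temperature uncertainty; here the recovery is exact because no noise was added.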
International Nuclear Information System (INIS)
This work intended to explain the challenges of the fingerprints based source apportionment method for polycyclic aromatic hydrocarbons (PAH) in the aquatic environment, and to illustrate a practical and robust solution. The PAH data detected in the sediment cores from the Illinois River provide the basis of this study. Principal component analysis (PCA) separates PAH compounds into two groups reflecting their possible airborne transport patterns; but it is not able to suggest specific sources. Not all positive matrix factorization (PMF) determined sources are distinguishable due to the variability of source fingerprints. However, they constitute useful suggestions for inputs for a Bayesian chemical mass balance (CMB) analysis. The Bayesian CMB analysis takes into account the measurement errors as well as the variations of source fingerprints, and provides a credible source apportionment. Major PAH sources for Illinois River sediments are traffic (35%), coke oven (24%), coal combustion (18%), and wood combustion (14%). - Highlights: • Fingerprint variability poses challenges in PAH source apportionment analysis. • PCA can be used to group compounds or cluster measurements. • PMF requires results validation but is useful for source suggestion. • Bayesian CMB provide practical and credible solution. - A Bayesian CMB model combined with PMF is a practical and credible fingerprints based PAH source apportionment method
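The core CMB idea is a least-squares unmixing of the measured compound profile against the source fingerprints; the Bayesian variant additionally propagates fingerprint variability. The sketch below is a crude Monte Carlo stand-in for that uncertainty propagation, not the paper's Bayesian CMB model, and the fingerprints and contributions are invented:

```python
import numpy as np

rng = np.random.default_rng(4)
# Hypothetical fingerprints: fraction of each of 6 PAH compounds per source
# (columns: traffic, coke oven, coal combustion)
F = np.array([[0.30, 0.25, 0.15, 0.10, 0.12, 0.08],
              [0.05, 0.10, 0.35, 0.30, 0.10, 0.10],
              [0.15, 0.15, 0.10, 0.10, 0.30, 0.20]]).T
true_contrib = np.array([5.0, 3.0, 2.0])         # source masses (invented)
obs = F @ true_contrib                           # measured compound profile

# Propagate fingerprint and measurement variability: perturb, re-solve, summarize
draws = []
for _ in range(500):
    Fp = F * (1 + 0.05 * rng.standard_normal(F.shape))    # fingerprint variability
    yp = obs * (1 + 0.02 * rng.standard_normal(obs.shape))  # measurement error
    c, *_ = np.linalg.lstsq(Fp, yp, rcond=None)
    draws.append(np.clip(c, 0, None))            # contributions are nonnegative
draws = np.array(draws)
mean_contrib = draws.mean(axis=0)
share = 100 * mean_contrib / mean_contrib.sum()  # percentage apportionment
```

The spread of `draws` plays the role of the credible intervals a full Bayesian CMB would deliver; the paper instead places explicit priors (informed by PMF) on the fingerprints and samples the posterior.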
Probability and Bayesian statistics
1987-01-01
This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics" which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985 the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are published especially by the relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows growing importance of probability as coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability across psychological aspects of formulating subjective probability statements, abstract measure theoretical considerations, contributions to theoretical statistics an...
Bayesian Magic in Asteroseismology
Kallinger, T.
2015-09-01
Only a few years ago asteroseismic observations were so rare that scientists had plenty of time to work on individual data sets. They could tune their algorithms in any possible way to squeeze out the last bit of information. Nowadays this is impossible. With missions like MOST, CoRoT, and Kepler we basically drown in new data every day. To handle this in a sufficient way statistical methods become more and more important. This is why Bayesian techniques started their triumph march across asteroseismology. I will go with you on a journey through Bayesian Magic Land, that brings us to the sea of granulation background, the forest of peakbagging, and the stony alley of model comparison.
Bayesian Nonparametric Graph Clustering
Banerjee, Sayantan; Akbani, Rehan; Baladandayuthapani, Veerabhadran
2015-01-01
We present clustering methods for multivariate data exploiting the underlying geometry of the graphical structure between variables. As opposed to standard approaches that assume known graph structures, we first estimate the edge structure of the unknown graph using Bayesian neighborhood selection approaches, wherein we account for the uncertainty of graphical structure learning through model-averaged estimates of the suitable parameters. Subsequently, we develop a nonparametric graph cluster...
Approximate Bayesian recursive estimation
Czech Academy of Sciences Publication Activity Database
Kárný, Miroslav
2014-01-01
Roč. 285, č. 1 (2014), s. 100-111. ISSN 0020-0255 R&D Projects: GA ČR GA13-13502S Institutional support: RVO:67985556 Keywords : Approximate parameter estimation * Bayesian recursive estimation * Kullback–Leibler divergence * Forgetting Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 4.038, year: 2014 http://library.utia.cas.cz/separaty/2014/AS/karny-0425539.pdf
Bayesian Benchmark Dose Analysis
Fang, Qijun; Piegorsch, Walter W.; Barnes, Katherine Y.
2014-01-01
An important objective in environmental risk assessment is estimation of minimum exposure levels, called Benchmark Doses (BMDs) that induce a pre-specified Benchmark Response (BMR) in a target population. Established inferential approaches for BMD analysis typically involve one-sided, frequentist confidence limits, leading in practice to what are called Benchmark Dose Lower Limits (BMDLs). Appeal to Bayesian modeling and credible limits for building BMDLs is far less developed, however. Indee...
Bayesian Generalized Rating Curves
Helgi Sigurðarson 1985
2014-01-01
A rating curve is a curve or a model that describes the relationship between water elevation, or stage, and discharge in an observation site in a river. The rating curve is fit from paired observations of stage and discharge. The rating curve then predicts discharge given observations of stage and this methodology is applied as stage is substantially easier to directly observe than discharge. In this thesis a statistical rating curve model is proposed working within the framework of Bayesian...
Heteroscedastic Treed Bayesian Optimisation
Assael, John-Alexander M.; Wang, Ziyu; Shahriari, Bobak; De Freitas, Nando
2014-01-01
Optimising black-box functions is important in many disciplines, such as tuning machine learning models, robotics, finance and mining exploration. Bayesian optimisation is a state-of-the-art technique for the global optimisation of black-box functions which are expensive to evaluate. At the core of this approach is a Gaussian process prior that captures our belief about the distribution over functions. However, in many cases a single Gaussian process is not flexible enough to capture non-stat...
Efficient Bayesian Phase Estimation
Wiebe, Nathan; Granade, Chris
2016-07-01
We introduce a new method called rejection filtering that we use to perform adaptive Bayesian phase estimation. Our approach has several advantages: it is classically efficient, easy to implement, achieves Heisenberg limited scaling, resists depolarizing noise, tracks time-dependent eigenstates, recovers from failures, and can be run on a field programmable gate array. It also outperforms existing iterative phase estimation algorithms such as Kitaev's method.
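The rejection-filtering update described above can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: it assumes the standard sinusoidal likelihood P(0 | φ; θ, M) = (1 + cos(M(φ − θ)))/2 for outcome 0, maintains a Gaussian posterior, and refits it from the accepted samples; the variance floor is an ad hoc guard against premature collapse.

```python
import numpy as np

def rejection_update(mu, sigma, outcome, theta, M, n=20000, rng=None):
    """One approximate Bayesian update of a Gaussian phase posterior.

    Samples from the current N(mu, sigma^2) posterior, keeps each sample with
    probability equal to the measurement likelihood (the rejection step), and
    refits a Gaussian to the survivors.
    """
    rng = rng or np.random.default_rng(0)
    phi = rng.normal(mu, sigma, n)                # draws from current posterior
    lik = 0.5 * (1 + np.cos(M * (phi - theta)))   # P(outcome = 0 | phi)
    if outcome == 1:
        lik = 1 - lik
    kept = phi[rng.uniform(size=n) < lik]         # rejection filtering
    return kept.mean(), max(kept.std(), 0.05)     # ad hoc variance floor
```

With prior N(0, 1) and an outcome-0 measurement at θ = 1, M = 1, the refit posterior mean moves from 0 toward 1 and the standard deviation shrinks, as a Bayesian update should.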
Brody, Samuel; Lapata, Mirella
2009-01-01
Sense induction seeks to automatically identify word senses directly from a corpus. A key assumption underlying previous work is that the context surrounding an ambiguous word is indicative of its meaning. Sense induction is thus typically viewed as an unsupervised clustering problem where the aim is to partition a word’s contexts into different classes, each representing a word sense. Our work places sense induction in a Bayesian context by modeling the contexts of the ambiguous word as samp...
Bayesian Neural Word Embedding
Barkan, Oren
2016-01-01
Recently, several works in the domain of natural language processing presented successful methods for word embedding. Among them, the Skip-gram (SG) with negative sampling, known also as Word2Vec, advanced the state-of-the-art of various linguistics tasks. In this paper, we propose a scalable Bayesian neural word embedding algorithm that can be beneficial to general item similarity tasks as well. The algorithm relies on a Variational Bayes solution for the SG objective and a detailed step by ...
Wiegerinck, Wim; Schoenaker, Christiaan; Duane, Gregory
2016-04-01
Recently, methods for model fusion by dynamically combining model components in an interactive ensemble have been proposed. In these proposals, fusion parameters have to be learned from data. One can view these systems as parametrized dynamical systems. We address the question of learnability of dynamical systems with respect to both short term (vector field) and long term (attractor) behavior. In particular we are interested in learning in the imperfect model class setting, in which the ground truth has a higher complexity than the models, e.g. due to unresolved scales. We take a Bayesian point of view and we define a joint log-likelihood that consists of two terms, one is the vector field error and the other is the attractor error, for which we take the L1 distance between the stationary distributions of the model and the assumed ground truth. In the context of linear models (like so-called weighted supermodels), and assuming a Gaussian error model in the vector fields, vector field learning leads to a tractable Gaussian solution. This solution can then be used as a prior for the next step, Bayesian attractor learning, in which the attractor error is used as a log-likelihood term. Bayesian attractor learning is implemented by elliptical slice sampling, a sampling method for systems with a Gaussian prior and a non Gaussian likelihood. Simulations with a partially observed driven Lorenz 63 system illustrate the approach.
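Elliptical slice sampling, used above for Bayesian attractor learning, is short enough to sketch in full. This is the generic algorithm of Murray, Adams and MacKay (2010) for a zero-mean Gaussian prior, not the supermodel-specific code; `log_lik` stands for any log-likelihood.

```python
import numpy as np

def elliptical_slice(x, sample_prior, log_lik, rng):
    """One transition of elliptical slice sampling.

    x: current state; sample_prior: draws from the N(0, Sigma) prior;
    log_lik: log-likelihood function. Returns the next state.
    """
    nu = sample_prior()                            # auxiliary prior draw
    log_y = log_lik(x) + np.log(rng.uniform())     # slice level
    theta = rng.uniform(0.0, 2.0 * np.pi)
    lo, hi = theta - 2.0 * np.pi, theta            # initial angle bracket
    while True:
        xp = x * np.cos(theta) + nu * np.sin(theta)  # point on the ellipse
        if log_lik(xp) > log_y:
            return xp
        # shrink the bracket towards theta = 0 (the current state)
        if theta < 0.0:
            lo = theta
        else:
            hi = theta
        theta = rng.uniform(lo, hi)
```

For a N(0, 1) prior with likelihood N(x; 2, 1) the posterior is N(1, 1/2), so a chain of a few thousand transitions should reproduce a mean of about 1.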
Bayesian theory and applications
Dellaportas, Petros; Polson, Nicholas G; Stephens, David A
2013-01-01
The development of hierarchical models and Markov chain Monte Carlo (MCMC) techniques forms one of the most profound advances in Bayesian analysis since the 1970s and provides the basis for advances in virtually all areas of applied and theoretical Bayesian statistics. This volume guides the reader along a statistical journey that begins with the basic structure of Bayesian theory, and then provides details on most of the past and present advances in this field. The book has a unique format. There is an explanatory chapter devoted to each conceptual advance followed by journal-style chapters that provide applications or further advances on the concept. Thus, the volume is both a textbook and a compendium of papers covering a vast range of topics. It is appropriate for a well-informed novice interested in understanding the basic approach, methods and recent applications. Because of its advanced chapters and recent work, it is also appropriate for a more mature reader interested in recent applications and devel...
Unbounded Bayesian Optimization via Regularization
Shahriari, Bobak; Bouchard-Côté, Alexandre; De Freitas, Nando
2015-01-01
Bayesian optimization has recently emerged as a popular and efficient tool for global optimization and hyperparameter tuning. Currently, the established Bayesian optimization practice requires a user-defined bounding box which is assumed to contain the optimizer. However, when little is known about the probed objective function, it can be difficult to prescribe such bounds. In this work we modify the standard Bayesian optimization framework in a principled way to allow automatic resizing of t...
Bayesian optimization for materials design
Frazier, Peter I.; Wang, Jialei
2015-01-01
We introduce Bayesian optimization, a technique developed for optimizing time-consuming engineering simulations and for fitting machine learning models on large datasets. Bayesian optimization guides the choice of experiments during materials design and discovery to find good material designs in as few experiments as possible. We focus on the case when materials designs are parameterized by a low-dimensional vector. Bayesian optimization is built on a statistical technique called Gaussian pro...
Naroznova, Irina; Møller, Jacob; Scheutz, Charlotte
2016-04-01
This study is dedicated to characterising the chemical composition and biochemical methane potential (BMP) of individual material fractions in untreated Danish source-separated organic household waste (SSOHW). First, data on SSOHW in different countries, available in the literature, were evaluated and then, secondly, laboratory analyses for eight organic material fractions comprising Danish SSOHW were conducted. No data were found in the literature that fully covered the objectives of the present study. Based on laboratory analyses, all fractions were assigned according to their specific properties in relation to BMP, protein content, lipids, lignocellulose biofibres and easily degradable carbohydrates (carbohydrates other than lignocellulose biofibres). The three components in lignocellulose biofibres, i.e. lignin, cellulose and hemicellulose, were differentiated, and theoretical BMP (TBMP) and material degradability (BMP from laboratory incubation tests divided by TBMP) were expressed. Moreover, the degradability of lignocellulose biofibres (the share of volatile lignocellulose biofibre solids degraded in laboratory incubation tests) was calculated. Finally, BMP for average SSOHW composition in Denmark (untreated) was calculated, and the BMP contribution of the individual material fractions was then evaluated. Material fractions of the two general waste types, defined as "food waste" and "fibre-rich waste," were found to be anaerobically degradable with considerable BMP. Material degradability of material fractions such as vegetation waste, moulded fibres, animal straw, dirty paper and dirty cardboard, however, was constrained by lignin content. BMP for overall SSOHW (untreated) was 404mL CH4 per g VS, which might increase if the relative content of material fractions, such as animal and vegetable food waste, kitchen tissue and dirty paper in the waste, becomes larger. PMID:26878771
A high-resolution direction-of-arrival estimation based on Bayesian method
Institute of Scientific and Technical Information of China (English)
HUANG Jianguo; SUN Yi; XU Pu; LU Ying; LIU Kewei
2004-01-01
A Bayesian high-resolution direction-of-arrival (DOA) estimator is proposed based on the maximum a posteriori principle. The statistical performance of the Bayesian high-resolution DOA estimator is also investigated. Comparison with MUSIC and the maximum likelihood estimator (MLE) shows that the Bayesian method has higher resolution and more accurate estimates for both incoherent and coherent sources. It is also more robust in the case of low SNR.
Institute of Scientific and Technical Information of China (English)
谭清磊; 陈国明; 付建民
2012-01-01
In a high-sulfur gas gathering station, risk analysis of the wellhead separator is a key element of safe operation and management. The traditional cause-effect diagram method of risk evaluation is concise, intuitive and logically rigorous, but it has certain limitations: it is restricted to binary event states and fixed logical relationships, and can therefore produce ineffective or inaccurate assessments. The Bayesian network is a newer method of system risk analysis that better expresses the uncertain relationships among variables, supports two-way uncertainty reasoning, and can describe multiple event states, although it is less visually intuitive than the cause-effect diagram. In this paper, a separator low-liquid-level accident is analysed with both the cause-effect diagram and the Bayesian network, and the results are compared so as to exploit the advantages of each method. The accident caused by a low liquid level in the separator is first modelled with the cause-effect diagram; the failure of the wellhead separator's safety barriers and the likely causes of the accident are then tested and analysed in order to reduce the likely causes to the minimum cut set. The results show that combining the two methods yields a better risk analysis of the separator.
Energy Technology Data Exchange (ETDEWEB)
Karim Ghani, Wan Azlina Wan Ab., E-mail: wanaz@eng.upm.edu.my [Department of Chemical and Environmental Engineering, Faculty of Engineering, University Putra Malaysia, 43400 Serdang, Selangor Darul Ehsan (Malaysia); Rusli, Iffah Farizan, E-mail: iffahrusli@yahoo.com [Department of Chemical and Environmental Engineering, Faculty of Engineering, University Putra Malaysia, 43400 Serdang, Selangor Darul Ehsan (Malaysia); Biak, Dayang Radiah Awang, E-mail: dayang@eng.upm.edu.my [Department of Chemical and Environmental Engineering, Faculty of Engineering, University Putra Malaysia, 43400 Serdang, Selangor Darul Ehsan (Malaysia); Idris, Azni, E-mail: azni@eng.upm.edu.my [Department of Chemical and Environmental Engineering, Faculty of Engineering, University Putra Malaysia, 43400 Serdang, Selangor Darul Ehsan (Malaysia)
2013-05-15
Highlights: ► The theory of planned behaviour (TPB) was applied to identify the factors influencing participation in source separation of food waste, using self-administered questionnaires. ► The findings suggest several implications for the development and implementation of a waste separation at home programme. ► The analysis indicates that attitude towards waste separation is the main predictor, which in turn is a significant predictor of respondents' actual food waste separation behaviour. ► To date, no similar findings have been reported elsewhere, and this finding will be beneficial to local authorities as an indicator in designing campaigns that promote waste separation programmes and reinforce positive attitudes. - Abstract: Tremendous increases in biodegradable (food) waste generation significantly impact the local authorities, who are responsible for managing, treating and disposing of this waste. Separation of food waste at its generation source is identified as an effective means of reducing the amount of food waste sent to landfill; the separated waste can be reused as feedstock for downstream treatment processes, namely composting or anaerobic digestion. However, these efforts will only succeed with positive public attitudes and high participation rates in the scheme. Thus, a social survey (using questionnaires) analysing the public's views and the factors influencing participation in source separation of food waste in households, based on the theory of planned behaviour (TPB), was performed in June and July 2011 among selected staff at Universiti Putra Malaysia, Serdang, Selangor. The survey demonstrates that the public has a positive intention to participate provided that the opportunities, facilities and knowledge on waste separation at source are adequately prepared by the respective local authorities. Furthermore, good moral values and situational factors such as storage convenience and
Karim Ghani, Wan Azlina Wan Ab; Rusli, Iffah Farizan; Biak, Dayang Radiah Awang; Idris, Azni
2013-05-01
Tremendous increases in biodegradable (food) waste generation significantly impact the local authorities, who are responsible for managing, treating and disposing of this waste. Separation of food waste at its generation source is identified as an effective means of reducing the amount of food waste sent to landfill; the separated waste can be reused as feedstock for downstream treatment processes, namely composting or anaerobic digestion. However, these efforts will only succeed with positive public attitudes and high participation rates in the scheme. Thus, a social survey (using questionnaires) analysing the public's views and the factors influencing participation in source separation of food waste in households, based on the theory of planned behaviour (TPB), was performed in June and July 2011 among selected staff at Universiti Putra Malaysia, Serdang, Selangor. The survey demonstrates that the public has a positive intention to participate provided that the opportunities, facilities and knowledge on waste separation at source are adequately prepared by the respective local authorities. Furthermore, good moral values and situational factors such as storage convenience and collection times also encourage the public's involvement and, consequently, the participation rate. The findings from this study may provide a useful indicator to the waste management authorities in Malaysia in identifying mechanisms for the future development and implementation of food waste source separation activities in household programmes, and for communication campaigns which advocate the use of these programmes. PMID:23415709
Ma, Fangbing; Li, Chunhui; Wang, Xuan; Yang, Zhifeng; Sun, Chengchun; Liang, Peiyu
2014-06-01
The Danjiangkou Reservoir is the water source for the middle route of the South-to-North Water Diversion Project in China. Thus, its water quality status is of great concern. Five water quality indicators (dissolved oxygen, permanganate index, ammonia nitrogen, total nitrogen, and total phosphorus) were measured at three monitoring sites (the Danjiangkou Reservoir dam, the Hejiawan, and the Jiangbei bridge) to investigate changing trends and spatiotemporal characteristics of water quality in the Danjiangkou Reservoir area from January 2006 to May 2012. We then applied a Bayesian statistical method to evaluate the water quality comprehensively. The normal distribution sampling method was used to calculate the likelihood, and the entropy weight method was used to determine indicator weights for the variables of interest in the study. The results indicated that concentrations of all five indicators increased during the last six years. In addition, the water quality in the reservoir was worse during the wet season (from May to October) than during the dry season (from November to April of the next year). Overall, the probability of the water belonging to quality category II, according to the environmental quality standards for surface water in China, was 27.7%-33.7%, higher than the probability of its belonging to any of the other four water quality categories. The increasing concentrations of nutrients could result in eutrophication of the Danjiangkou Reservoir. This method reduced the subjectivity that is commonly associated with determining indicator weights and artificial classifications, achieving more reliable results. These results indicate that it is important for the interbasin water diversion project to implement integrated water quality management in the Danjiangkou Reservoir area.
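The entropy weight step used above is mechanical and easy to reproduce. A minimal sketch under the usual convention (indicator columns normalized over samples; weights proportional to 1 − entropy); the variable names are illustrative, not taken from the paper.

```python
import numpy as np

def entropy_weights(X):
    """Entropy weights for the indicator columns of X (samples x indicators).

    An indicator whose values are spread unevenly across samples has low
    entropy and therefore receives a high weight; a constant indicator
    carries no information and gets weight zero.
    """
    X = np.asarray(X, dtype=float)
    P = X / X.sum(axis=0)                        # share of each sample per indicator
    n = X.shape[0]
    PlogP = np.where(P > 0, P * np.log(P), 0.0)  # convention: 0 * log(0) = 0
    e = -PlogP.sum(axis=0) / np.log(n)           # normalized entropy in [0, 1]
    d = 1.0 - e                                  # degree of diversification
    return d / d.sum()                           # weights sum to one
```

For a two-indicator matrix where the first column is constant and the second varies, all the weight goes to the second indicator.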
Bayesian parameter estimation for effective field theories
Wesolowski, S; Furnstahl, R J; Phillips, D R; Thapaliya, A
2015-01-01
We present procedures based on Bayesian statistics for effective field theory (EFT) parameter estimation from data. The extraction of low-energy constants (LECs) is guided by theoretical expectations that supplement such information in a quantifiable way through the specification of Bayesian priors. A prior for natural-sized LECs reduces the possibility of overfitting, and leads to a consistent accounting of different sources of uncertainty. A set of diagnostic tools are developed that analyze the fit and ensure that the priors do not bias the EFT parameter estimation. The procedures are illustrated using representative model problems and the extraction of LECs for the nucleon mass expansion in SU(2) chiral perturbation theory from synthetic lattice data.
Bayesian parameter estimation for effective field theories
Wesolowski, S.; Klco, N.; Furnstahl, R. J.; Phillips, D. R.; Thapaliya, A.
2016-07-01
We present procedures based on Bayesian statistics for estimating, from data, the parameters of effective field theories (EFTs). The extraction of low-energy constants (LECs) is guided by theoretical expectations in a quantifiable way through the specification of Bayesian priors. A prior for natural-sized LECs reduces the possibility of overfitting, and leads to a consistent accounting of different sources of uncertainty. A set of diagnostic tools is developed that analyzes the fit and ensures that the priors do not bias the EFT parameter estimation. The procedures are illustrated using representative model problems, including the extraction of LECs for the nucleon-mass expansion in SU(2) chiral perturbation theory from synthetic lattice data.
Applications of Bayesian spectrum representation in acoustics
Botts, Jonathan M.
framework. The application to reflection data is useful for representing frequency-dependent impedance boundaries in finite difference acoustic simulations. Furthermore, since the filter transfer function is a parametric model, it can be modified to incorporate arbitrary frequency weighting and account for the band-limited nature of measured reflection spectra. Finally, the model is modified, via the filter design process, to compensate for dispersive error in the finite difference simulation. Stemming from the filter boundary problem, the implementation of pressure sources in finite difference simulation is addressed in order to ensure that schemes properly converge. A class of parameterized source functions is proposed and shown to offer straightforward control of residual error in the simulation. Guided by the notion that the solution to be approximated affects the approximation error, sources are designed which reduce residual dispersive error to the size of round-off errors. The early part of a room impulse response can be characterized by a series of isolated plane waves. Measured with an array of microphones, plane waves map to a directional response of the array, or spatial intensity map. Probabilistic inversion of this response results in estimates of the number and directions of image source arrivals. The model-based inversion is shown to avoid ambiguities associated with peak-finding or inspection of the spatial intensity map. For this problem, determining the number of arrivals in a given frame is critical for properly inferring the state of the sound field. This analysis is effectively a compression of the spatial room response, which is useful for analysis or encoding of the spatial sound field. Parametric, model-based formulations of these problems enhance the solution in all cases, and a Bayesian interpretation provides a principled approach to model comparison and parameter estimation.
Ri, Yong-Wu; Im, Song-Jin
2014-01-01
The modified Beer-Lambert law (MBL) and spatially resolved spectroscopy are used to measure tissue oxygenation in muscles and the brain by continuous-wave near-infrared spectroscopy. Spatially resolved spectroscopy predicts the change in the concentration of the absorber by measuring the slope of the attenuation data as a function of source-detector separation, and by calculating the absorption coefficients of tissue on the basis of this slope at separation distances where the slope is linear. This study analysed the appropriate source-detector separation distance by using the diffusion approximation solution for photon migration when predicting the absorption coefficient by spatially resolved spectroscopy on the basis of the reflective image of the tissue. We constructed the 3-dimensional attenuation image with the absorption coefficient, reduced scattering coefficient and separation distance as its axes, and obtained the attenuation data cube by calculating the attenuation on a certain interva...
Zou, Yonghong; Wang, Lixia; Christensen, Erik R
2015-10-01
This work intended to explain the challenges of the fingerprints based source apportionment method for polycyclic aromatic hydrocarbons (PAH) in the aquatic environment, and to illustrate a practical and robust solution. The PAH data detected in the sediment cores from the Illinois River provide the basis of this study. Principal component analysis (PCA) separates PAH compounds into two groups reflecting their possible airborne transport patterns; but it is not able to suggest specific sources. Not all positive matrix factorization (PMF) determined sources are distinguishable due to the variability of source fingerprints. However, they constitute useful suggestions for inputs for a Bayesian chemical mass balance (CMB) analysis. The Bayesian CMB analysis takes into account the measurement errors as well as the variations of source fingerprints, and provides a credible source apportionment. Major PAH sources for Illinois River sediments are traffic (35%), coke oven (24%), coal combustion (18%), and wood combustion (14%). PMID:26208321
Bayesian Inference of Kinematics and Memberships of Open Cluster
Shao, Z. Y.; Chen, L.; Zhong, J.; Hou, J. L.
2014-07-01
Based on the Bayesian inference (BI) method, the multiple-modelling approach is improved to combine positional coordinates, proper motions (PMs) and radial velocities (RVs), in order to separate the motion of the open cluster from that of the field stars, as well as to describe the intrinsic kinematic status of the cluster.
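The separation of cluster members from field stars rests on a finite mixture model, and the membership probabilities are the E-step responsibilities. A one-dimensional EM sketch on simulated proper motions illustrates this (two Gaussian components standing in for cluster and field; the paper's multiple-modelling of positions, PMs and RVs is richer):

```python
import numpy as np

rng = np.random.default_rng(1)
# simulated 1-D proper motions: a tight cluster plus a broad field population
v = np.concatenate([rng.normal(0.0, 1.0, 300), rng.normal(5.0, 3.0, 700)])

w = np.array([0.5, 0.5])           # mixing weights
mu = np.array([-1.0, 6.0])         # component means (rough starting guesses)
sd = np.array([2.0, 2.0])          # component standard deviations

for _ in range(200):
    # E-step: membership probability of each star for each component
    pdf = w * np.exp(-0.5 * ((v[:, None] - mu) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))
    r = pdf / pdf.sum(axis=1, keepdims=True)
    # M-step: responsibility-weighted parameter estimates
    n = r.sum(axis=0)
    w = n / len(v)
    mu = (r * v[:, None]).sum(axis=0) / n
    sd = np.sqrt((r * (v[:, None] - mu) ** 2).sum(axis=0) / n)
```

The first component should recover the cluster (mean near 0) and the second the field (mean near 5); each row of `r` gives a star's membership probabilities.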
Bayesian nonparametric data analysis
Müller, Peter; Jara, Alejandro; Hanson, Tim
2015-01-01
This book reviews nonparametric Bayesian methods and models that have proven useful in the context of data analysis. Rather than providing an encyclopedic review of probability models, the book’s structure follows a data analysis perspective. As such, the chapters are organized by traditional data analysis problems. In selecting specific nonparametric models, simpler and more traditional models are favored over specialized ones. The discussed methods are illustrated with a wealth of examples, including applications ranging from stylized examples to case studies from recent literature. The book also includes an extensive discussion of computational methods and details on their implementation. R code for many examples is included in on-line software pages.
Decentralized Distributed Bayesian Estimation
Czech Academy of Sciences Publication Activity Database
Dedecius, Kamil; Sečkárová, Vladimíra
Praha: ÚTIA AVČR, v.v.i, 2011 - (Janžura, M.; Ivánek, J.). s. 16-16 [7th International Workshop on Data–Algorithms–Decision Making. 27.11.2011-29.11.2011, Mariánská] R&D Projects: GA ČR 102/08/0567; GA ČR GA102/08/0567 Institutional research plan: CEZ:AV0Z10750506 Keywords : estimation * distributed estimation * model Subject RIV: BB - Applied Statistics, Operational Research http://library.utia.cas.cz/separaty/2011/AS/dedecius-decentralized distributed bayesian estimation.pdf
Congdon, Peter
2014-01-01
This book provides an accessible approach to Bayesian computing and data analysis, with an emphasis on the interpretation of real data sets. Following in the tradition of the successful first edition, this book aims to make a wide range of statistical modeling applications accessible using tested code that can be readily adapted to the reader's own applications. The second edition has been thoroughly reworked and updated to take account of advances in the field. A new set of worked examples is included. The novel aspect of the first edition was the coverage of statistical modeling using WinBU
Computationally efficient Bayesian tracking
Aughenbaugh, Jason; La Cour, Brian
2012-06-01
In this paper, we describe the progress we have achieved in developing a computationally efficient, grid-based Bayesian fusion tracking system. In our approach, the probability surface is represented by a collection of multidimensional polynomials, each computed adaptively on a grid of cells representing state space. Time evolution is performed using a hybrid particle/grid approach and knowledge of the grid structure, while sensor updates use a measurement-based sampling method with a Delaunay triangulation. We present an application of this system to the problem of tracking a submarine target using a field of active and passive sonar buoys.
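A grid-based Bayesian tracker is conceptually simple even though the polynomial surface representation above is not. A minimal 1-D sketch with a Gaussian motion model and a Gaussian measurement likelihood (a generic illustration, not the authors' sonar system) makes the predict/update cycle concrete:

```python
import numpy as np

grid = np.linspace(0.0, 20.0, 201)             # state space discretized into cells
belief = np.full(grid.size, 1.0 / grid.size)   # uniform prior over cells

# transition matrix: target drifts +1.0 per step with process noise 0.5
T = np.exp(-0.5 * ((grid[None, :] - grid[:, None] - 1.0) / 0.5) ** 2)
T /= T.sum(axis=1, keepdims=True)

for t in range(1, 6):                          # true target position is t
    belief = belief @ T                        # predict (Chapman-Kolmogorov)
    lik = np.exp(-0.5 * ((float(t) - grid) / 1.0) ** 2)  # measurement likelihood
    belief *= lik                              # Bayes update
    belief /= belief.sum()

estimate = grid[np.argmax(belief)]             # MAP position, close to 5.0
```

After five predict/update cycles the posterior mode sits at the true final position; the same two-step structure underlies the polynomial-on-grid system described in the abstract.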
Improved iterative Bayesian unfolding
D'Agostini, G
2010-01-01
This paper reviews the basic ideas behind a Bayesian unfolding published some years ago and improves their implementation. In particular, uncertainties are now treated at all levels by probability density functions and their propagation is performed by Monte Carlo integration. Thus, small numbers are better handled and the final uncertainty does not rely on the assumption of normality. Theoretical and practical issues concerning the iterative use of the algorithm are also discussed. The new program, implemented in the R language, is freely available, together with sample scripts to play with toy models.
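The core iteration of D'Agostini-style unfolding is compact enough to sketch; this reproduces only the basic estimator, without the Monte Carlo uncertainty propagation and refinements the paper adds.

```python
import numpy as np

def bayes_unfold(effects, R, n_iter=200):
    """Iterative Bayesian unfolding.

    effects: observed effect spectrum n(E); R[i, j] = P(E_j | C_i), whose
    row sums are the detection efficiencies. Starts from a flat prior on
    the causes and repeatedly applies Bayes' theorem.
    """
    n_causes = R.shape[0]
    causes = np.full(n_causes, effects.sum() / n_causes)  # flat starting prior
    eff = R.sum(axis=1)                                   # efficiencies
    for _ in range(n_iter):
        folded = causes @ R                               # predicted effect spectrum
        theta = R * causes[:, None] / folded[None, :]     # P(C_i | E_j) via Bayes
        causes = (theta @ effects) / eff                  # re-estimated causes
    return causes
```

With an invertible 2x2 response matrix and exact (noise-free) effects, the iteration converges to the true cause spectrum.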
Dongliang Zhang; Guangqing Huang; Xiaoling Yin; Qinghua Gong
2015-01-01
Understanding the factors that affect residents’ waste separation behaviors helps in constructing effective environmental campaigns for a community. Using the theory of planned behavior (TPB), this study examines factors associated with waste separation behaviors by analyzing responses to questionnaires distributed in Guangzhou, China. Data drawn from 208 of 1000-field questionnaires were used to assess socio-demographic factors and the TPB constructs (i.e., attitudes, subjective norms, perc...
Bayesian Inference on Gravitational Waves
Directory of Open Access Journals (Sweden)
Asad Ali
2015-12-01
The Bayesian approach is increasingly becoming popular among the astrophysics data analysis communities. However, the Pakistan statistics communities are unaware of this fertile interaction between the two disciplines. Bayesian methods have been in use to address astronomical problems since the very birth of Bayesian probability in the eighteenth century. Today, Bayesian methods for the detection and parameter estimation of gravitational waves have solid theoretical grounds and strong promise for realistic applications. This article aims to introduce the Pakistan statistics communities to the applications of Bayesian Monte Carlo methods in the analysis of gravitational wave data, with an overview of Bayesian signal detection and estimation methods and a demonstration through a couple of simplified examples.
A Bayesian Matrix Factorization Model for Relational Data
Singh, Ajit P
2012-01-01
Relational learning can be used to augment one data source with other correlated sources of information, to improve predictive accuracy. We frame a large class of relational learning problems as matrix factorization problems, and propose a hierarchical Bayesian model. Training our Bayesian model using random-walk Metropolis-Hastings is impractically slow, and so we develop a block Metropolis-Hastings sampler which uses the gradient and Hessian of the likelihood to dynamically tune the proposal. We demonstrate that a predictive model of brain response to stimuli can be improved by augmenting it with side information about the stimuli.
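Gradient-informed Metropolis-Hastings proposals of the kind mentioned above can be sketched with the Metropolis-adjusted Langevin algorithm (MALA). This generic 1-D sketch uses only the gradient, whereas the paper's block sampler additionally exploits the Hessian.

```python
import numpy as np

def mala(log_p, grad_log_p, x0, eps, n_steps, rng):
    """Metropolis-adjusted Langevin: drift each proposal along the gradient
    of the log-target, then correct with a Metropolis-Hastings accept step."""
    x, lx, gx = x0, log_p(x0), grad_log_p(x0)
    samples = np.empty(n_steps)
    for i in range(n_steps):
        z = rng.normal()
        xp = x + 0.5 * eps**2 * gx + eps * z               # Langevin proposal
        lp, gp = log_p(xp), grad_log_p(xp)
        # log proposal densities for the asymmetric MH correction
        fwd = -(xp - x - 0.5 * eps**2 * gx) ** 2 / (2 * eps**2)
        rev = -(x - xp - 0.5 * eps**2 * gp) ** 2 / (2 * eps**2)
        if np.log(rng.uniform()) < lp + rev - lx - fwd:    # accept/reject
            x, lx, gx = xp, lp, gp
        samples[i] = x
    return samples
```

Targeting a standard normal (log_p = −x²/2, gradient −x) reproduces its first two moments, which is a quick sanity check for any such sampler.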
Adaptive Dynamic Bayesian Networks
Energy Technology Data Exchange (ETDEWEB)
Ng, B M
2007-10-26
A discrete-time Markov process can be compactly modeled as a dynamic Bayesian network (DBN)--a graphical model with nodes representing random variables and directed edges indicating causality between variables. Each node has a probability distribution, conditional on the variables represented by the parent nodes. A DBN's graphical structure encodes fixed conditional dependencies between variables. But in real-world systems, conditional dependencies between variables may be unknown a priori or may vary over time. Model errors can result if the DBN fails to capture all possible interactions between variables. Thus, we explore the representational framework of adaptive DBNs, whose structure and parameters can change from one time step to the next: a distribution's parameters and its set of conditional variables are dynamic. This work builds on recent work in nonparametric Bayesian modeling, such as hierarchical Dirichlet processes, infinite-state hidden Markov networks and structured priors for Bayes net learning. In this paper, we will explain the motivation for our interest in adaptive DBNs, show how popular nonparametric methods are combined to formulate the foundations for adaptive DBNs, and present preliminary results.
Bayesian analysis toolkit - BAT
International Nuclear Information System (INIS)
Statistical treatment of data is an essential part of any data analysis and interpretation. Different statistical methods and approaches can be used, however the implementation of these approaches is complicated and at times inefficient. The Bayesian analysis toolkit (BAT) is a software package developed in C++ framework that facilitates the statistical analysis of the data using Bayesian theorem. The tool evaluates the posterior probability distributions for models and their parameters using Markov Chain Monte Carlo which in turn provide straightforward parameter estimation, limit setting and uncertainty propagation. Additional algorithms, such as simulated annealing, allow extraction of the global mode of the posterior. BAT sets a well-tested environment for flexible model definition and also includes a set of predefined models for standard statistical problems. The package is interfaced to other software packages commonly used in high energy physics, such as ROOT, Minuit, RooStats and CUBA. We present a general overview of BAT and its algorithms. A few physics examples are shown to introduce the spectrum of its applications. In addition, new developments and features are summarized.
Using literature and data to learn Bayesian networks as clinical models of ovarian tumors
DEFF Research Database (Denmark)
Antal, P.; Fannes, G.; Timmerman, D.; Moreau, Yves; Moor, B.
2004-01-01
Thanks to its increasing availability, electronic literature has become a potential source of information for the development of complex Bayesian networks (BN), when human expertise is missing or data is scarce or contains much noise. This opportunity raises the question of how to integrate infor...... performance of a Bayesian network for the classification of ovarian tumors from clinical data....
Modelling LGD for unsecured retail loans using Bayesian methods
Bijak, Katarzyna; Thomas, Lyn C.
2015-01-01
Loss Given Default (LGD) is the loss borne by the bank when a customer defaults on a loan. LGD for unsecured retail loans is often difficult to model. In the frequentist (non-Bayesian) two-step approach, two separate regression models are estimated independently, which is potentially problematic when trying to combine them to make predictions about LGD. The result is a point estimate of LGD for each loan. Alternatively, LGD can be modelled using Bayesian methods. In the B...
Zhang, Dongliang; Huang, Guangqing; Yin, Xiaoling; Gong, Qinghua
2015-01-01
Understanding the factors that affect residents' waste separation behaviors helps in constructing effective environmental campaigns for a community. Using the theory of planned behavior (TPB), this study examines factors associated with waste separation behaviors by analyzing responses to questionnaires distributed in Guangzhou, China. Data drawn from 208 of the 1,000 field questionnaires were used to assess socio-demographic factors and the TPB constructs (i.e., attitudes, subjective norms, perceived behavioral control, intentions, and situational factors). The questionnaire data revealed that attitudes, subjective norms, perceived behavioral control, intentions, and situational factors significantly predicted household waste behaviors in Guangzhou, China. Through a structural equation modeling analysis, we concluded that campaigns targeting moral obligations may be particularly effective for increasing the participation rate in waste separation behaviors. PMID:26274969
Book review: Bayesian analysis for population ecology
Link, William A.
2011-01-01
Brian Dennis described the field of ecology as “fertile, uncolonized ground for Bayesian ideas.” He continued: “The Bayesian propagule has arrived at the shore. Ecologists need to think long and hard about the consequences of a Bayesian ecology. The Bayesian outlook is a successful competitor, but is it a weed? I think so.” (Dennis 2004)
Fedosseev, V; Marsh, B A; CERN. Geneva. AB Department
2006-01-01
At the ISOLDE on-line isotope separation facility, the resonance ionization laser ion source (RILIS) can be used to ionize reaction products as they effuse from the target. The RILIS process of laser step-wise resonance ionization of atoms in a hot metal cavity provides a highly element selective stage in the preparation of the radioactive ion beam. As a result, the ISOLDE mass separators can provide beams of a chosen isotope with greatly reduced isobaric contamination. The number of elements available at RILIS has been extended to 26, with the addition of a new three-step ionization scheme for gold. The optimal ionization scheme was determined during an extensive study of the atomic energy levels and auto-ionizing states of gold, carried out by means of in-source resonance ionization spectroscopy. Details of the ionization scheme and a summary of the spectroscopy study are presented.
DEFF Research Database (Denmark)
Hartelius, Karsten; Carstensen, Jens Michael
2003-01-01
A method for locating distorted grid structures in images is presented. The method is based on the theories of template matching and Bayesian image restoration. The grid is modeled as a deformable template. Prior knowledge of the grid is described through a Markov random field (MRF) model which represents the spatial coordinates of the grid nodes. Knowledge of how grid nodes are depicted in the observed image is described through the observation model. The prior consists of a node prior and an arc (edge) prior, both modeled as Gaussian MRFs. The node prior models variations in the positions of grid nodes and the arc prior models variations in row and column spacing across the grid. Grid matching is done by placing an initial rough grid over the image and applying an ensemble annealing scheme to maximize the posterior distribution of the grid. The method can be applied to noisy images with missing...
MCMC joint separation and segmentation of hidden Markov fields
Snoussi, H; Snoussi, Hichem; Mohammad-Djafari, Ali
2002-01-01
In this contribution, we consider the problem of the blind separation of noisy, instantaneously mixed images. The images are modeled by hidden Markov fields with unknown parameters. Given the observed images, we give a Bayesian formulation and propose to solve the resulting data augmentation problem by implementing a Markov chain Monte Carlo (MCMC) procedure. We separate the unknown variables into two categories: 1. The parameters of interest, which are the mixing matrix, the noise covariance and the parameters of the source distributions. 2. The hidden variables, which are the unobserved sources and the unobserved pixel classification labels. In the stationary regime, the proposed algorithm provides samples drawn from the posterior distributions of all the variables involved in the problem, leading to flexibility in the choice of cost function. We discuss and characterize some problems of non-identifiability and degeneracies of the parameter likelihood and the behavior of the MCMC algorithm in this case. F...
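The data-augmentation pattern described here, alternating between sampling hidden variables and sampling parameters, can be illustrated with a much simpler Gibbs sampler: a two-component 1D Gaussian mixture with hidden labels, known unit variances and equal weights. Everything below (data, priors, seed) is an invented toy setup, not the paper's image model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: two Gaussian "sources" with hidden labels (unit variances).
true_means = np.array([0.0, 5.0])
z_true = rng.integers(0, 2, size=400)
x = rng.normal(true_means[z_true], 1.0)

# Gibbs sampler: alternate between hidden labels and parameters, the same
# data-augmentation idea as in the paper (here with equal mixture weights
# and a weak N(0, 10^2) prior on each mean).
mu = np.array([-1.0, 1.0])  # initial guess for the component means
for _ in range(200):
    # 1. Sample hidden labels z given mu and x
    logp = -0.5 * (x[:, None] - mu[None, :]) ** 2
    p = np.exp(logp - logp.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)
    z = (rng.random(len(x)) < p[:, 1]).astype(int)
    # 2. Sample means mu given z and x (conjugate normal update)
    for k in (0, 1):
        xk = x[z == k]
        var = 1.0 / (len(xk) + 1.0 / 100.0)
        mu[k] = rng.normal(var * xk.sum(), np.sqrt(var))

print(np.sort(mu))  # close to the true means 0 and 5
```

In the stationary regime, successive `mu` draws are samples from the posterior over the means, exactly the property the abstract highlights for the full image-separation model.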
A Bayesian test for periodic signals in red noise
Vaughan, S
2009-01-01
Many astrophysical sources, especially compact accreting sources, show strong, random brightness fluctuations with broad power spectra in addition to periodic or quasi-periodic oscillations (QPOs) that have narrower spectra. The random nature of the dominant source of variance greatly complicates the process of searching for possible weak periodic signals. We have addressed this problem using the tools of Bayesian statistics; in particular using Markov chain Monte Carlo techniques to approxim...
Current trends in Bayesian methodology with applications
Upadhyay, Satyanshu K; Dey, Dipak K; Loganathan, Appaia
2015-01-01
Collecting Bayesian material scattered throughout the literature, Current Trends in Bayesian Methodology with Applications examines the latest methodological and applied aspects of Bayesian statistics. The book covers biostatistics, econometrics, reliability and risk analysis, spatial statistics, image analysis, shape analysis, Bayesian computation, clustering, uncertainty assessment, high-energy astrophysics, neural networking, fuzzy information, objective Bayesian methodologies, empirical Bayes methods, small area estimation, and many more topics. Each chapter is self-contained and focuses on
Ravat, D.; Kirkham, K.; Hildenbrand, T.G.
2002-01-01
An overview is given on the benefits of applying the Euler method on derivatives of anomalies to enhance the location of shallow and deep sources. Used properly, the method is suitable for characterizing sources from all potential-field data and/or their derivative, as long as the data can be regarded mathematically as "continuous". Furthermore, the reasons why the use of the Euler method on derivatives of anomalies is particularly helpful in the analysis and interpretation of shallow features are explained.
Bloomfield, John; Ward, Rob; Garcia-Bajo, Marieta; Hart, Alwyn
2014-05-01
A number of potential pathways can be identified for the migration of methane and contaminants associated with the shale gas extraction process to aquifers. These include the possible movement of contaminants from shale gas reservoirs that have been hydraulically fractured to overlying aquifers. The risk of contamination of an overlying aquifer is a function of i.) the separation of the potential shale gas source rock and the aquifer, ii.) the hydraulic characteristics (e.g. hydraulic conductivity, storage and hydrogeochemistry) of the rocks in the intervening interval, and iii.) regional and local physico-chemical gradients. Here we report on a national-scale study from the UK to assess the first of these, i.e. the vertical separation between potential shale gas source rocks and major aquifers, as a contribution to more informed management of the risks associated with shale gas development if and when it takes place in the UK. Eleven aquifers are considered in the study. These are aquifers that have been designated by the environment agencies of England (Environment Agency) and Wales (Natural Resources Wales) under the EU Water Framework Directive as being nationally important (Principal Aquifers). The shale gas source rocks have been defined on the best publicly available evidence for potential gas productivity and include both shales and clay formations. Based on a national geological fence diagram consisting of ~80 geological sections, totalling ~12,000 km in length, down to >5 km in depth, and with a typical spacing of 30 km, the lower surfaces of each aquifer unit and the upper surfaces of each shale/clay unit have been estimated at a spatial resolution of 3x3 km. These surfaces have then been used to estimate vertical separations between pairs of shale/clay and aquifer units. The modelling process will be described and the aquifer, shale and separation maps presented and discussed. The aquifers are defined by geological units and since these geological units may be found at
Directory of Open Access Journals (Sweden)
Chengjie Li
2016-01-01
In a Passive Radar System, obtaining the mixed weak object signal in the presence of a much stronger signal (jamming) is still a challenging task. In this paper, a novel framework based on a Passive Radar System is designed for weak object signal separation. Firstly, we propose an Interference Cancellation algorithm (IC-algorithm) to extract the mixed weak object signals from the strong jamming. Then, an improved FastICA algorithm with k-means clustering is designed to separate each weak signal from the mixed weak object signals. Finally, we discuss the performance of the proposed method and verify it with several simulations. The experimental results demonstrate the effectiveness of the proposed method.
A New Method of Blind Source Separation Using Single-Channel ICA Based on Higher-Order Statistics
Guangkuo Lu; Manlin Xiao; Ping Wei; Huaguo Zhang
2015-01-01
Methods utilizing independent component analysis (ICA) give little guidance about practical considerations for separating single-channel real-world data, most of which are nonlinear, nonstationary, and even chaotic in many fields. To solve this problem, a three-step method is provided in this paper. In the first step, the measured signal, which is assumed to be a piecewise higher-order stationary time series, is introduced and divided into a series of higher-order stationary segments b...
Bayesian Methods and Universal Darwinism
Campbell, John
2010-01-01
Bayesian methods since the time of Laplace have been understood by their practitioners as closely aligned to the scientific method. Indeed a recent champion of Bayesian methods, E. T. Jaynes, titled his textbook on the subject Probability Theory: the Logic of Science. Many philosophers of science including Karl Popper and Donald Campbell have interpreted the evolution of Science as a Darwinian process consisting of a 'copy with selective retention' algorithm abstracted from Darwin's theory of...
Portfolio Allocation for Bayesian Optimization
Brochu, Eric; Hoffman, Matthew W.; De Freitas, Nando
2010-01-01
Bayesian optimization with Gaussian processes has become an increasingly popular tool in the machine learning community. It is efficient and can be used when very little is known about the objective function, making it popular in expensive black-box optimization scenarios. It uses Bayesian methods to sample the objective efficiently using an acquisition function which incorporates the model's estimate of the objective and the uncertainty at any given point. However, there are several differen...
Neuroanatomy, neurology and Bayesian networks
Bielza Lozoya, Maria Concepcion
2014-01-01
Bayesian networks are data mining models with clear semantics and a sound theoretical foundation. In this keynote talk we will pinpoint a number of neuroscience problems that can be addressed using Bayesian networks. In neuroanatomy, we will show computer simulation models of dendritic trees and classification of neuron types, both based on morphological features. In neurology, we will present the search for genetic biomarkers in Alzheimer's disease and the prediction of health-related qualit...
Bayesian Networks and Influence Diagrams
DEFF Research Database (Denmark)
Kjærulff, Uffe Bro; Madsen, Anders Læsø
Probabilistic networks, also known as Bayesian networks and influence diagrams, have become one of the most promising technologies in the area of applied artificial intelligence, offering intuitive, efficient, and reliable methods for diagnosis, prediction, decision making, classification, troubleshooting, and data mining under uncertainty. Bayesian Networks and Influence Diagrams: A Guide to Construction and Analysis provides a comprehensive guide for practitioners who wish to understand, construct, and analyze intelligent systems for decision support based on probabilistic networks. Intended...
Stan: A Probabilistic Programming Language for Bayesian Inference and Optimization
Gelman, Andrew; Lee, Daniel; Guo, Jiqiang
2015-01-01
Stan is a free and open-source C++ program that performs Bayesian inference or optimization for arbitrary user-specified models and can be called from the command line, R, Python, Matlab, or Julia and has great promise for fitting large and complex statistical models in many areas of application. We discuss Stan from users' and developers'…
Dale Poirier
2008-01-01
This paper provides Bayesian rationalizations for White's heteroskedasticity-consistent (HC) covariance estimator and various modifications of it. An informed Bayesian bootstrap provides the statistical framework.
Energy Technology Data Exchange (ETDEWEB)
Baba, Justin S [ORNL; Akl, Tony [Texas A&M University; Cote, Gerard L. [Texas A&M University; Wilson, Mark A. [University of Pittsburgh School of Medicine, Pittsburgh PA; Ericson, Milton Nance [ORNL
2011-01-01
An implanted system is being developed to monitor transplanted liver health during the critical 7-10 day period posttransplantation. The unit will monitor organ perfusion and oxygen consumption using optically based probes placed on both the inflow and outflow blood vessels, and on the liver parenchymal surface. Sensing probes are based on a 3-wavelength LED source and a photodiode detector. Sample diffuse reflectance is measured at 735, 805, and 940 nm. To ascertain optimal source-to-photodetector spacing for perfusion measurement in blood vessels, an ex vivo study was conducted. In this work, a dye mixture simulating 80% blood oxygen saturation was developed and perfused through excised porcine arteries while collecting data for various preset probe source-to-photodetector spacings. The results from this study demonstrate a decrease in the optical signal with decreasing LED drive current and a reduction in perfusion index signal with increasing probe spacing. They also reveal a 2- to 4-mm optimal range for blood vessel perfusion probe source-to-photodetector spacing that allows for sufficient perfusion signal modulation depth with maximized signal-to-noise ratio (SNR). These findings are currently being applied to guide electronic configuration and probe placement for in vivo liver perfusion porcine model studies.
Uncertainty Modeling Based on Bayesian Network in Ontology Mapping
Institute of Scientific and Technical Information of China (English)
LI Yuhua; LIU Tao; SUN Xiaolin
2006-01-01
How to deal with uncertainty is crucial in exact concept mapping between ontologies. This paper presents a new framework for modeling uncertainty in ontologies based on Bayesian networks (BN). In our approach, the Web Ontology Language (OWL) is extended to add probabilistic markups for attaching probability information; the source and target ontologies (expressed in the extended OWL) are translated into Bayesian networks; and the mapping between the two ontologies is derived by constructing the conditional probability tables (CPTs) of the BN using an improved algorithm, named I-IPFP, based on the iterative proportional fitting procedure (IPFP). The basic idea of this framework and the algorithm are validated by positive results from computer experiments.
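The IPFP routine underlying the I-IPFP algorithm mentioned here can be sketched in a few lines: repeatedly rescale a joint probability table so its row and column sums match target marginals. The table and marginals below are made-up toy values, not anything from the paper.

```python
import numpy as np

# Iterative proportional fitting (IPFP): rescale a joint table until its
# marginals match given targets. Toy values, chosen for illustration only.
P = np.array([[0.25, 0.25],
              [0.25, 0.25]])           # initial joint over two binary variables
row_target = np.array([0.7, 0.3])     # desired marginal of variable A
col_target = np.array([0.4, 0.6])     # desired marginal of variable B

for _ in range(50):
    P *= (row_target / P.sum(axis=1))[:, None]   # match row marginals
    P *= (col_target / P.sum(axis=0))[None, :]   # match column marginals

print(P.sum(axis=1), P.sum(axis=0))  # both marginals now match the targets
```

In the CPT-construction setting of the paper, the same scaling idea is applied to conditional tables rather than to a single joint table.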
Bayesian Inference in the Modern Design of Experiments
DeLoach, Richard
2008-01-01
This paper provides an elementary tutorial overview of Bayesian inference and its potential for application in aerospace experimentation in general and wind tunnel testing in particular. Bayes Theorem is reviewed and examples are provided to illustrate how it can be applied to objectively revise prior knowledge by incorporating insights subsequently obtained from additional observations, resulting in new (posterior) knowledge that combines information from both sources. A logical merger of Bayesian methods and certain aspects of Response Surface Modeling is explored. Specific applications to wind tunnel testing, computational code validation, and instrumentation calibration are discussed.
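The kind of prior-to-posterior revision reviewed here reduces, for a single binary hypothesis, to one line of arithmetic. The prior and likelihood numbers below are invented purely for illustration:

```python
# Bayes' theorem for a binary hypothesis H given data D:
# posterior = likelihood * prior / evidence. Numbers are illustrative only.
prior_H = 0.5               # P(H): belief before the observation
lik_D_given_H = 0.8         # P(D | H)
lik_D_given_notH = 0.2      # P(D | not H)

evidence = lik_D_given_H * prior_H + lik_D_given_notH * (1 - prior_H)
posterior_H = lik_D_given_H * prior_H / evidence
print(posterior_H)  # 0.8: the observation shifts belief from 0.5 to 0.8
```

Repeating the update with each new observation is exactly the "objective revision of prior knowledge" that the tutorial applies to wind tunnel testing and calibration.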
Comparing Bayesian models for multisensory cue combination without mandatory integration
Beierholm, Ulrik R.; Shams, Ladan; Kording, Konrad P; Ma, Wei Ji
2009-01-01
Bayesian models of multisensory perception traditionally address the problem of estimating an underlying variable that is assumed to be the cause of the two sensory signals. The brain, however, has to solve a more general problem: it also has to establish which signals come from the same source and should be integrated, and which ones do not and should be segregated. In the last couple of years, a few models have been proposed to solve this problem in a Bayesian fashion. One of these ha...
Theory-independent limits on correlations from generalized Bayesian networks
International Nuclear Information System (INIS)
Bayesian networks provide a powerful tool for reasoning about probabilistic causation, used in many areas of science. They are, however, intrinsically classical. In particular, Bayesian networks naturally yield the Bell inequalities. Inspired by this connection, we generalize the formalism of classical Bayesian networks in order to investigate non-classical correlations in arbitrary causal structures. Our framework of ‘generalized Bayesian networks’ replaces latent variables with the resources of any generalized probabilistic theory, most importantly quantum theory, but also, for example, Popescu–Rohrlich boxes. We obtain three main sets of results. Firstly, we prove that all of the observable conditional independences required by the classical theory also hold in our generalization; to obtain this, we extend the classical d-separation theorem to our setting. Secondly, we find that the theory-independent constraints on probabilities can go beyond these conditional independences. For example we find that no probabilistic theory predicts perfect correlation between three parties using only bipartite common causes. Finally, we begin a classification of those causal structures, such as the Bell scenario, that may yield a separation between classical, quantum and general-probabilistic correlations. (paper)
Separation and quantification of frequency-coupled noise sources of a submarine cabin
Institute of Scientific and Technical Information of China (English)
李思纯; 宫元彬; 时胜国; 于树华; 韩闯
2016-01-01
Traditional methods do not effectively handle the separation and quantification of coupled vibration noise sources in submarines. A multivariate statistical analysis method, partial least squares regression (PLS), is therefore presented, which can be used to separate and quantify frequency-coupled noise sources. PLS simultaneously extracts principal input/output components that carry maximum information and maximum input-output correlation, and supports regression modeling even when there are multiple correlations among the variables. Simulation and cabin model experiments show that, when there is frequency coupling between multiple excitation sources, PLS is capable of ranking the energy contributions of internal noise sources to the submarine hull, of the hull to the underwater acoustic field, and of the noise sources to the underwater acoustic field. The feasibility of PLS for frequency-coupled source separation and quantification is thus demonstrated, providing a basis for the control of the main noise sources.
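The component-extraction step at the heart of PLS, finding a direction in the inputs whose projection has maximal covariance with the output, can be sketched as follows. This is a bare-bones illustration on synthetic data, not the paper's full PLS regression or its cabin-model analysis.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy setting: the output Y depends mainly on one latent direction of X.
X = rng.normal(size=(100, 4))
Y = X @ np.array([[2.0], [0.0], [0.0], [0.0]]) + 0.1 * rng.normal(size=(100, 1))

# First PLS weight vector: the direction w maximizing cov(Xw, Y), i.e. the
# leading left singular vector of the cross-covariance X^T Y.
Xc, Yc = X - X.mean(0), Y - Y.mean(0)
w = np.linalg.svd(Xc.T @ Yc, full_matrices=False)[0][:, 0]
print(np.abs(w))  # dominated by the first coordinate, the true latent direction
```

Full PLS then deflates X and Y and repeats, yielding the ordered components whose contributions the paper uses to rank noise sources.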
Energy Technology Data Exchange (ETDEWEB)
Chen, Yi-Ren; Chou, Li-Chang; Yang, Ying-Jay [Graduate Institute of Electronics Engineering, National Taiwan University, Taipei 10617, Taiwan (China); Lin, Hao-Hsiung, E-mail: hhlin@ntu.edu.tw [Graduate Institute of Electronics Engineering, National Taiwan University, Taipei 10617, Taiwan (China); Department of Electrical Engineering and Graduate Institute of Photonics and Optoelectronics, National Taiwan University, Taipei 10617, Taiwan (China)
2012-04-30
This work describes a regular solution model that considers the free energy of the surface monolayer to explain the orientation-dependent phase separation in GaAsSb. In the proposed model, only the interaction between the second-nearest-neighboring atoms sitting on the same monolayer contributes to the interaction parameter. Consequently, the parameter reduces to Ω/2 and Ω/3 for (111)B GaAsSb and (100) GaAsSb, respectively, where Ω denotes the parameter of bulk GaAsSb. By including the strain effect, the proposed model thoroughly elucidates the immiscibility behavior of (111)B GaAsSb and (100) GaAsSb. Highlights: (111)B GaAsSb exhibits more severe phase separation than (100) GaAsSb. We propose a model to calculate the monolayer free energy of different planes. The monolayer model suggests that (111)B GaAsSb has a larger interaction parameter. The monolayer model including strain explains the immiscibility of GaAsSb well.
Bayesian inference tools for inverse problems
Mohammad-Djafari, Ali
2013-08-01
In this paper, the basics of Bayesian inference with a parametric model of the data are first presented. Then, the extensions needed when dealing with inverse problems are given, in particular for linear models such as deconvolution or image reconstruction in computed tomography (CT). The main point discussed next is the prior modeling of signals and images. A classification of these priors is presented: first into separable and Markovian models, and then into simple models or hierarchical models with hidden variables. For practical applications, we also need to consider the estimation of the hyperparameters. Finally, we see that we have to infer simultaneously the unknowns, the hidden variables and the hyperparameters. Very often, the expression of this joint posterior law is too complex to be handled directly; rarely can we obtain analytical solutions for point estimators such as the maximum a posteriori (MAP) or the posterior mean (PM). Three main tools can then be used: Laplace approximation (LAP), Markov chain Monte Carlo (MCMC) and Bayesian variational approximations (BVA). To illustrate all these aspects, we consider a deconvolution problem where we know that the input signal is sparse and propose a Student-t prior for it. To handle the Bayesian computations with this model, we use the property that the Student-t distribution can be expressed as an infinite mixture of Gaussians, thus introducing hidden variables which are the variances. The expression of the joint posterior of the input signal samples, the hidden variables (here the inverse variances of those samples) and the hyperparameters of the problem (for example, the variance of the noise) is then given. From this point, we present the joint maximization by alternate optimization and the three possible approximation methods. Finally, the proposed methodology is applied in different applications such as mass spectrometry, spectrum estimation of quasi-periodic biological signals and
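The key identity exploited above, that a Student-t density is an infinite scale mixture of Gaussians with Gamma-distributed precisions, can be checked numerically. The degrees of freedom and evaluation point below are arbitrary choices for the check.

```python
import math
import numpy as np

# Verify numerically that the Student-t density equals a mixture of Gaussians
# whose precisions lam follow Gamma(nu/2, rate=nu/2): the property that makes
# the sparse Student-t prior tractable via hidden variance variables.
nu, x = 3.0, 1.7  # arbitrary degrees of freedom and evaluation point

# Analytic Student-t density at x
t_pdf = (math.gamma((nu + 1) / 2) / (math.gamma(nu / 2) * math.sqrt(nu * math.pi))
         * (1 + x**2 / nu) ** (-(nu + 1) / 2))

# Mixture: integrate N(x | 0, 1/lam) * Gamma(lam | nu/2, nu/2) over lam
lam = np.linspace(1e-6, 60, 400000)
a = b = nu / 2
gamma_pdf = b**a / math.gamma(a) * lam**(a - 1) * np.exp(-b * lam)
normal_pdf = np.sqrt(lam / (2 * np.pi)) * np.exp(-0.5 * lam * x**2)
integrand = normal_pdf * gamma_pdf
mix_pdf = float(np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(lam)))

print(t_pdf, mix_pdf)  # the two densities agree
```

Conditioning on the hidden precisions turns the heavy-tailed prior into a Gaussian one, which is exactly what makes the MCMC and variational schemes in the paper workable.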
Vesselinov, V. V.; Alexandrov, B.
2014-12-01
The identification of the physical sources causing spatial and temporal fluctuations of state variables such as river stage levels and aquifer hydraulic heads is challenging. The fluctuations can be caused by variations in natural and anthropogenic sources such as precipitation events, infiltration, groundwater pumping, barometric pressure, etc. Source identification and separation can be crucial for conceptualization of the hydrological conditions and characterization of system properties. If the original signals that cause the observed state-variable transients can be successfully "unmixed", decoupled physics models may then be applied to analyze the propagation of each signal independently. We propose a new model-free inverse analysis of transient data based on the Non-negative Matrix Factorization (NMF) method for Blind Source Separation (BSS), coupled with the k-means clustering algorithm, which we call NMFk. NMFk is capable of identifying a set of unique sources from a set of experimentally measured mixed signals, without any information about the sources, their transients, or the physical mechanisms and properties controlling signal propagation through the system. A classical BSS conundrum is the so-called "cocktail-party" problem, where several microphones record the sounds in a ballroom (music, conversations, noise, etc.). Each microphone records a mixture of the sounds, and the goal of BSS is to "unmix" and reconstruct the original sounds from the microphone records. Similarly to the "cocktail-party" problem, our model-free analysis only requires information about state-variable transients at a number of observation points, m, where m > r and r is the number of unknown unique sources causing the observed fluctuations. We apply the analysis to a dataset from the Los Alamos National Laboratory (LANL) site, identify barometric pressure and water-supply pumping effects as sources, and estimate their impacts. We also estimate the
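The factorization step at the core of NMFk can be sketched with plain multiplicative-update NMF on synthetic nonnegative mixtures; the repeated NMF runs and k-means clustering of candidate solutions that distinguish NMFk (and the estimation of r itself) are omitted. All data here are invented.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy BSS setting: observed mixtures V (observation points x time) are
# nonnegative combinations of r = 2 nonnegative source transients.
t = np.linspace(0, 1, 200)
S_true = np.vstack([np.abs(np.sin(8 * np.pi * t)), t])  # two source signals
A_true = rng.uniform(0.5, 2.0, size=(6, 2))             # mixing matrix
V = A_true @ S_true                                     # 6 mixed records

# Plain NMF via Lee-Seung multiplicative updates; NMFk would additionally
# re-run this from many starts and cluster the solutions with k-means.
r = 2
W = rng.uniform(0.1, 1.0, size=(V.shape[0], r))
H = rng.uniform(0.1, 1.0, size=(r, V.shape[1]))
eps = 1e-9
err0 = np.linalg.norm(V - W @ H)
for _ in range(300):
    H *= (W.T @ V) / (W.T @ W @ H + eps)
    W *= (V @ H.T) / (W @ H @ H.T + eps)
err = np.linalg.norm(V - W @ H)
print(err0, err)  # reconstruction error drops sharply
```

The rows of `H` play the role of the unmixed source transients; clustering solutions across restarts is what lets NMFk judge whether a given r yields stable, unique sources.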
Nonparametric Bayesian Classification
Coram, M A
2002-01-01
A Bayesian approach to the classification problem is proposed in which random partitions play a central role. It is argued that the partitioning approach has the capacity to take advantage of a variety of large-scale spatial structures, if they are present in the unknown regression function $f_0$. An idealized one-dimensional problem is considered in detail. The proposed nonparametric prior uses random split points to partition the unit interval into a random number of pieces. This prior is found to provide a consistent estimate of the regression function in the $L^p$ topology, for any $1 \leq p < \infty$, and for arbitrary measurable $f_0:[0,1] \rightarrow [0,1]$. A Markov chain Monte Carlo (MCMC) implementation is outlined and analyzed. Simulation experiments are conducted to show that the proposed estimate compares favorably with a variety of conventional estimators. A striking resemblance between the posterior mean estimate and the bagged CART estimate is noted and discussed. For higher dimensions, a ...
BAT - Bayesian Analysis Toolkit
International Nuclear Information System (INIS)
One of the most vital steps in any data analysis is the statistical analysis and comparison with the prediction of a theoretical model. The many uncertainties associated with the theoretical model and the observed data require a robust statistical analysis tool. The Bayesian Analysis Toolkit (BAT) is a powerful statistical analysis software package based on Bayes' theorem, developed to evaluate the posterior probability distribution for models and their parameters. It implements Markov chain Monte Carlo to obtain the full posterior probability distribution, which in turn provides straightforward parameter estimation, limit setting and uncertainty propagation. Additional algorithms, such as simulated annealing, allow evaluation of the global mode of the posterior. BAT is developed in C++ and allows for a flexible definition of models. A set of predefined models covering standard statistical cases is also included in BAT. It has been interfaced to other commonly used software packages such as ROOT, Minuit, RooStats and CUBA. An overview of the software and its algorithms is provided along with several physics examples to cover a range of applications of this statistical tool. Future plans, new features and recent developments are briefly discussed.
International Nuclear Information System (INIS)
The resonance ionization laser ion source (RILIS) of the ISOLDE on-line isotope separation facility is based on the method of laser stepwise resonance ionization of atoms in a hot metal cavity. The atomic selectivity of the RILIS complements the mass selection process of the ISOLDE separator magnets to provide beams of a chosen isotope with greatly reduced isobaric contamination. Using a system of dye lasers pumped by copper vapor lasers, ion beams of 22 elements have been generated at ISOLDE with ionization efficiencies in the range of 0.5%-30%. As part of the ongoing RILIS development, recent off-line resonance ionization spectroscopy studies have determined the optimal three-step ionization schemes for yttrium, scandium, and antimony
Catherall, R.; Fedosseev, V. N.; Köster, U.; Lettry, J.; Suberlucq, G.; Marsh, B. A.; Tengborn, E.
2004-05-01
The resonance ionization laser ion source (RILIS) of the ISOLDE on-line isotope separation facility is based on the method of laser stepwise resonance ionization of atoms in a hot metal cavity. The atomic selectivity of the RILIS complements the mass selection process of the ISOLDE separator magnets to provide beams of a chosen isotope with greatly reduced isobaric contamination. Using a system of dye lasers pumped by copper vapor lasers, ion beams of 22 elements have been generated at ISOLDE with ionization efficiencies in the range of 0.5%-30%. As part of the ongoing RILIS development, recent off-line resonance ionization spectroscopy studies have determined the optimal three-step ionization schemes for yttrium, scandium, and antimony.
Directory of Open Access Journals (Sweden)
Hiroshi Sawada
2007-01-01
Full Text Available We address the problem of underdetermined BSS. While most previous approaches are designed for instantaneous mixtures, we propose a time-frequency-domain algorithm for convolutive mixtures. We adopt a two-step method based on a general maximum a posteriori (MAP) approach. In the first step, we estimate the mixing matrix based on hierarchical clustering, assuming that the source signals are sufficiently sparse. The algorithm works directly on the complex-valued data in the time-frequency domain and shows better convergence than algorithms based on self-organizing maps. The assumption of Laplacian priors for the source signals in the second step leads to an algorithm for estimating the source signals. It involves the ℓ1-norm minimization of complex numbers because of the use of the time-frequency-domain approach. We compare a combinatorial approach initially designed for real numbers with a second-order cone programming (SOCP) approach designed for complex numbers. We found that although the former approach is not theoretically justified for complex numbers, its results are comparable to, or even better than, the SOCP solution. The advantage is a lower computational cost for problems with low input/output dimensions.
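The "combinatorial approach" mentioned above can be made concrete for real-valued data. In this toy sketch (hypothetical 2×3 mixing matrix; the paper operates on complex-valued time-frequency data), the Laplacian-prior MAP estimate is found by solving the square subsystem for every pair of mixing-matrix columns and keeping the solution with the smallest ℓ1 norm:

```python
import itertools
import numpy as np

def l1_map_sources(A, x):
    """Pick the pair of mixing-matrix columns whose exact solution of
    A[:, pair] @ s = x has the smallest l1 norm (Laplacian MAP)."""
    m, n = A.shape
    best = None
    for cols in itertools.combinations(range(n), m):
        sub = A[:, cols]
        if abs(np.linalg.det(sub)) < 1e-12:
            continue                      # skip singular subsystems
        s = np.zeros(n)
        s[list(cols)] = np.linalg.solve(sub, x)
        if best is None or np.abs(s).sum() < np.abs(best).sum():
            best = s
    return best

A = np.array([[1.0, 0.5, 0.2],
              [0.3, 1.0, 0.8]])          # hypothetical mixing matrix
s_true = np.array([2.0, 0.0, -1.0])      # sparse: only two active sources
x = A @ s_true                           # one observed mixture sample
s_hat = l1_map_sources(A, x)
```

Because the true source vector is sparse, the minimum-ℓ1 combination recovers it exactly here; with noise or complex data the SOCP formulation discussed in the abstract would be used instead.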
Fernandes, Ricardo; Millard, Andrew R.; Brabec, Marek; Nadeau, Marie-Josée; Grootes, Pieter
2014-01-01
Human and animal diet reconstruction studies that rely on tissue chemical signatures aim at providing estimates on the relative intake of potential food groups. However, several sources of uncertainty need to be considered when handling data. Bayesian mixing models provide a natural platform to handle diverse sources of uncertainty while allowing the user to contribute with prior expert information. The Bayesian mixing model FRUITS (Food Reconstruction Using Isotopic Transferred Signals) was ...
Energy Technology Data Exchange (ETDEWEB)
Naresh Shah; Frank E. Huggins; Gerald P. Huffman
2006-07-31
Coal combustion is generally viewed as a major source of PM2.5 emissions into the atmosphere. For some time, toxicologists have been asking for an exposure environment enriched with coal-combustion-specific PM2.5 in order to conduct meaningful exposure studies and better understand the mechanisms of the adverse health effects of coal-combustion-specific PM2.5 in the ambient environment. Primary PM generated from coal combustion has several unique characteristics. In this research project, an attempt has been made to exploit some of the unique properties of PM generated from coal-fired power plants to preferentially separate it from the rest of the primary and secondary PM in the ambient environment. An existing FRM sampler, used for monitoring the amount of PM2.5 in the ambient air, was modified to incorporate an electrostatic field. A DC corona charging device was also installed at the ambient air inlet to impart positive or negative charge to the PM. Visual Basic software was written to simulate the lateral movement of PM as it passes through the electrostatic separator under varying operating conditions. The PM samples collected on polycarbonate filters under varying operating conditions were extensively examined for clustering and/or separation of PM in the direction parallel to the electric field. No systematic PM separation was observed under any of the operating conditions. A solution to overcome this turbulence-caused remixing has been offered. However, due to major programmatic changes in the DOE UCR program, there are no venues available to further pursue this research.
Magnetic Separation in Romania
Rezlescu, Nicolae; Bradu, Elena-Brandusa; Iacob, Gheorghe; Badescu, Vasile; Iacob, Lavinia
1986-01-01
The utilization of magnetic separators of foreign and Romanian origin is presented, and the most important achievements in research, engineering design and manufacturing activity concerning magnetic separation in Romania are reviewed.
Catherall, Richard; Köster, U; Lettry, Jacques; Suberlucq, Guy; Marsh, Bruce A; Tengborn, Elisabeth
2004-01-01
The production of radioactive ion beams using the resonance ionization laser ion source (RILIS) of the ISOLDE on-line isotope separation facility was investigated. The RILIS setup included three dye lasers, and ionization schemes employing three resonant transitions were used. The RILIS efficiency can be reduced by nuclear effects such as hyperfine splitting and isotope shifts. Off-line resonance ionization spectroscopy determined optimal three-step ionization schemes for yttrium, scandium and antimony. The results show that the best ionization scheme for Y provided a gain factor of 15 with respect to surface ionization. (Edited abstract) 8 Refs.
Case studies in Bayesian microbial risk assessments
Directory of Open Access Journals (Sweden)
Turner Joanne
2009-12-01
Full Text Available Abstract Background The quantification of uncertainty and variability is a key component of quantitative risk analysis. Recent advances in Bayesian statistics make this framework ideal for integrating multiple sources of information, of different types and quality, and providing a realistic estimate of the combined uncertainty in the final risk estimates. Methods We present two case studies related to foodborne microbial risks. In the first, we combine models to describe the sequence of events resulting in illness from consumption of milk contaminated with VTEC O157. We used Monte Carlo simulation to propagate uncertainty in some of the inputs to computer models describing the farm and pasteurisation process. The resulting simulated contamination levels were then assigned to consumption events from a dietary survey. Finally, we accounted for uncertainty in the dose-response relationship and uncertainty due to limited incidence data to derive uncertainty about yearly incidences of illness in young children. Options for altering the risk were considered by running the model with different hypothetical policy-driven exposure scenarios. In the second case study we illustrate an efficient Bayesian sensitivity analysis for identifying the most important parameters of a complex computer code that simulated VTEC O157 prevalence within a managed dairy herd. This was carried out in two stages: first to screen out the unimportant inputs, then to perform a more detailed analysis on the remaining inputs. The method works by building a Bayesian statistical approximation to the computer code using a number of known code input/output pairs (training runs). Results We estimated that the expected total number of children aged 1.5-4.5 years who become ill due to VTEC O157 in milk is 8.6 per year, with a 95% uncertainty interval of (0, 11.5). The most extreme policy we considered was banning on-farm pasteurisation of milk, which reduced the estimate to 6.4 with a 95% interval of (0, 11). In the second
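The Monte Carlo uncertainty propagation used in the first case study can be sketched generically. The distributions and constants below are purely hypothetical placeholders, not the paper's farm or pasteurisation models:

```python
import random

# Toy two-stage microbial risk model: uncertainty in the contamination
# level and in the dose-response slope is propagated by Monte Carlo.
# All distributions and constants are hypothetical.
random.seed(0)
N = 100_000

risks = []
for _ in range(N):
    conc = random.lognormvariate(0.0, 1.0)          # organisms per serving
    slope = random.betavariate(2, 5)                # dose-response uncertainty
    risks.append(min(1.0, 1e-3 * conc * slope))     # P(illness | serving)

mean_risk = sum(risks) / N
risks.sort()
lo, hi = risks[int(0.025 * N)], risks[int(0.975 * N)]   # 95% interval
```

Each draw samples every uncertain input once and pushes it through the model, so the spread of the output sample directly quantifies the combined uncertainty in the risk estimate.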
International Nuclear Information System (INIS)
We report on a two-photon interference experiment in a quantum relay configuration using two picosecond regime periodically poled lithium niobate (PPLN) waveguide based sources emitting paired photons at 1550 nm. The results show that the picosecond regime associated with a guided-wave scheme should have important repercussions for quantum relay implementations in real conditions, essential for improving both the working distance and the efficiency of quantum cryptography and networking systems. In contrast to already reported regimes, namely, femtosecond and CW, it allows achieving a 99% net visibility two-photon interference while maintaining a high effective photon pair rate using only standard telecom components and detectors.
International Nuclear Information System (INIS)
The experiment automation system is supposed to be developed for experimental facility for material science at ITEP, based on a Bernas ion source. The program CAMFT is assumed to be involved into the program of the experiment automation. CAMFT is developed to simulate the intense charged particle bunch motion in the external magnetic fields with arbitrary geometry by means of the accurate solution of the particle motion equation. Program allows the consideration of the bunch intensity up to 10^10 ppb. Preliminary calculations are performed at ITEP supercomputer. The results of the simulation of the beam pre-acceleration and following turn in magnetic field are presented for different initial conditions
Energy Technology Data Exchange (ETDEWEB)
Barminova, H. Y., E-mail: barminova@bk.ru; Saratovskyh, M. S. [National Research Nuclear University MEPhI, Kashirskoye sh. 31, Moscow 115409 (Russian Federation)]
2016-02-15
The experiment automation system is supposed to be developed for experimental facility for material science at ITEP, based on a Bernas ion source. The program CAMFT is assumed to be involved into the program of the experiment automation. CAMFT is developed to simulate the intense charged particle bunch motion in the external magnetic fields with arbitrary geometry by means of the accurate solution of the particle motion equation. Program allows the consideration of the bunch intensity up to 10^10 ppb. Preliminary calculations are performed at ITEP supercomputer. The results of the simulation of the beam pre-acceleration and following turn in magnetic field are presented for different initial conditions.
BayesWave: Bayesian Inference for Gravitational Wave Bursts and Instrument Glitches
Cornish, Neil J
2014-01-01
A central challenge in Gravitational Wave Astronomy is identifying weak signals in the presence of non-stationary and non-Gaussian noise. The separation of gravitational wave signals from noise requires good models for both. When accurate signal models are available, such as for binary neutron star systems, it is possible to make robust detection statements even when the noise is poorly understood. In contrast, searches for "un-modeled" transient signals are strongly impacted by the methods used to characterize the noise. Here we take a Bayesian approach and introduce a multi-component, variable dimension, parameterized noise model that explicitly accounts for non-stationarity and non-Gaussianity in data from interferometric gravitational wave detectors. Instrumental transients (glitches) and burst sources of gravitational waves are modeled using a Morlet-Gabor continuous wavelet basis. The number and placement of the wavelets is determined by a trans-dimensional Reversible Jump Markov Chain Monte Carlo algor...
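The Morlet-Gabor (sine-Gaussian) wavelets used to model glitches and bursts are easy to write down; the parameterization below is a common convention and may differ in detail from BayesWave's own:

```python
import numpy as np

def morlet_gabor(t, t0, f0, q, amp=1.0, phi=0.0):
    """Sine-Gaussian (Morlet-Gabor) waveform: a sinusoid at frequency f0
    under a Gaussian envelope centred at t0 with quality factor q."""
    tau = q / (2.0 * np.pi * f0)                 # envelope width from Q
    env = amp * np.exp(-((t - t0) ** 2) / (2.0 * tau ** 2))
    return env * np.cos(2.0 * np.pi * f0 * (t - t0) + phi)

fs = 4096.0                                      # sample rate (Hz)
t = np.arange(0, 1.0, 1.0 / fs)
h = morlet_gabor(t, t0=0.5, f0=200.0, q=8.0)     # one wavelet in the basis

# A glitch or burst model is a sum of such wavelets; their number and
# placement would be explored by the trans-dimensional MCMC described above.
peak = np.abs(h).max()
```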
Energy Technology Data Exchange (ETDEWEB)
Bouriant, M. [Commissariat a l'Energie Atomique, Grenoble (France). Centre d'Etudes Nucleaires]
1967-12-01
The production of high-purity stable or radioactive isotopes (≥ 99.99 per cent) by electromagnetic separation requires equipment with a high resolving power. Moreover, in order to collect rare or short half-life isotopes, the efficiency of the ion source must be high (η > 5 to 10 per cent). With this in view, the source built operates at high temperatures (2500-3000 C) and makes use of ionisation by electron bombardment or of thermo-ionisation. The first part of this work summarises the essential characteristics of isotope separator ion sources; a diagram of the principle of the source built is then given together with its characteristics. The second part gives the values of the resolving power and of the efficiency of the Grenoble isotope separator fitted with such a source. The resolving power measured at 10 per cent of the peak height is of the order of 200. At the first magnetic stage the efficiency is between 1 and 26 per cent for a range of elements evaporating between 200 and 3000 C. Thus equipped, the separator has, for example, given at the first stage 10 mg of 180Hf at (99.69 ± 0.1) per cent, corresponding to an enrichment coefficient of 580; recently 2 mg of 150Nd at (99.996 ± 0.002) per cent, corresponding to an enrichment coefficient of 4.2 × 10^5, was obtained at the second stage. (author)
Bayesian seismic AVO inversion
Energy Technology Data Exchange (ETDEWEB)
Buland, Arild
2002-07-01
A new linearized AVO inversion technique is developed in a Bayesian framework. The objective is to obtain posterior distributions for P-wave velocity, S-wave velocity and density. Distributions for other elastic parameters can also be assessed, for example acoustic impedance, shear impedance and P-wave to S-wave velocity ratio. The inversion algorithm is based on the convolutional model and a linearized weak contrast approximation of the Zoeppritz equation. The solution is represented by a Gaussian posterior distribution with explicit expressions for the posterior expectation and covariance, hence exact prediction intervals for the inverted parameters can be computed under the specified model. The explicit analytical form of the posterior distribution provides a computationally fast inversion method. Tests on synthetic data show that all inverted parameters were almost perfectly retrieved when the noise approached zero. With realistic noise levels, acoustic impedance was the best determined parameter, while the inversion provided practically no information about the density. The inversion algorithm has also been tested on a real 3-D dataset from the Sleipner Field. The results show good agreement with well logs but the uncertainty is high. The stochastic model includes uncertainties of both the elastic parameters, the wavelet and the seismic and well log data. The posterior distribution is explored by Markov chain Monte Carlo simulation using the Gibbs sampler algorithm. The inversion algorithm has been tested on a seismic line from the Heidrun Field with two wells located on the line. The uncertainty of the estimated wavelet is low. In the Heidrun examples the effect of including uncertainty of the wavelet and the noise level was marginal with respect to the AVO inversion results. We have developed a 3-D linearized AVO inversion method with spatially coupled model parameters where the objective is to obtain posterior distributions for P-wave velocity, S
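The closed-form Gaussian posterior underlying such a linearized inversion can be sketched for a generic linear model d = Gm + e. The matrices below are synthetic stand-ins, not a Zoeppritz-based forward operator:

```python
import numpy as np

def gaussian_posterior(G, d, mu0, S0, Se):
    """Posterior mean and covariance for d = G m + e with Gaussian prior
    m ~ N(mu0, S0) and Gaussian noise e ~ N(0, Se): both are explicit."""
    K = S0 @ G.T @ np.linalg.inv(G @ S0 @ G.T + Se)   # gain matrix
    return mu0 + K @ (d - G @ mu0), S0 - K @ G @ S0

rng = np.random.default_rng(0)
m_true = np.array([1.0, -2.0])                   # toy "elastic parameters"
G = rng.normal(size=(50, 2))                     # linearized forward operator
d = G @ m_true + 0.1 * rng.normal(size=50)       # noisy observations
mu_post, S_post = gaussian_posterior(G, d, np.zeros(2), np.eye(2), 0.01 * np.eye(50))
```

Because the posterior is Gaussian with explicit mean and covariance, exact prediction intervals follow directly from the diagonal of `S_post`, which is what makes this class of inversion computationally fast.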
Flood quantile estimation at ungauged sites by Bayesian networks
Mediero, L.; Santillán, D.; Garrote, L.
2012-04-01
Estimating flood quantiles at a site for which no observed measurements are available is essential for water resources planning and management. Ungauged sites have no observations of flood magnitudes, but some site and basin characteristics are known. The most common technique is multiple regression analysis, which relates physical and climatic basin characteristics to flood quantiles. Regression equations are fitted from flood frequency data and basin characteristics at gauged sites. Regression is a rigid technique that assumes linear relationships between variables and cannot take measurement errors into account. In addition, the prediction intervals are estimated in a very simplistic way from the variance of the residuals of the estimated model. Bayesian networks are a probabilistic computational structure taken from the field of Artificial Intelligence; they have been widely and successfully applied to many scientific fields, such as medicine and informatics, but their application to hydrology is recent. Bayesian networks infer the joint probability distribution of several related variables from observations through nodes, which represent random variables, and links, which represent causal dependencies between them. A Bayesian network is more flexible than regression equations, as it captures non-linear relationships between variables. In addition, the probabilistic nature of Bayesian networks allows the different sources of estimation uncertainty to be taken into account, as they give a probability distribution as a result. A homogeneous region in the Tagus Basin was selected as a case study. A regression equation was fitted taking the basin area, the annual maximum 24-hour rainfall for a given recurrence interval and the mean height as explanatory variables. Flood quantiles at ungauged sites were estimated by Bayesian networks. Bayesian networks need to be learnt from a sufficiently large data set. As observational data are reduced, a
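A minimal discrete Bayesian network illustrates the inference step: variables as nodes, causal links as conditional probability tables, and queries answered by summing out unobserved variables. The variables and probabilities below are invented for illustration, not the fitted Tagus Basin model:

```python
# Tiny discrete Bayesian network, inference by enumeration.
# Nodes: R = heavy rain, B = large basin, F = flood (all binary).
# All probabilities are hypothetical.
p_heavy_rain = 0.3                       # P(R=1)
p_flood = {                              # CPT: P(F=1 | R, B)
    (0, 0): 0.01, (0, 1): 0.05,
    (1, 0): 0.20, (1, 1): 0.60,
}

def p_flood_given_basin(b):
    """P(F=1 | B=b), summing out the unobserved rain variable R."""
    total = 0.0
    for r, pr in ((1, p_heavy_rain), (0, 1.0 - p_heavy_rain)):
        total += pr * p_flood[(r, b)]
    return total

risk_large = p_flood_given_basin(1)      # 0.7*0.05 + 0.3*0.60 = 0.215
risk_small = p_flood_given_basin(0)      # 0.7*0.01 + 0.3*0.20 = 0.067
```

The query returns a full probability, not a point estimate, which is the property the abstract contrasts with regression-based quantile estimation.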
Saito, Yoshiyuki; Yasuhara, Masakatsu; Mabuchi, Yuichi; Matsushima, Tohlu; Hisakado, Takashi; Wada, Osami
An EMC macro-model for LSIs, named the LECCS-core model, is under development for simulating high-frequency noise in power supply currents. In this paper, the conventional LECCS-core model is extended by adding resistances in the ground connection of an LSI, in order to separate the core block and the analog block. The model parameters are identified using symbolic analysis and least-squares optimization. Using this new model, the transfer impedances between different power supply pins can be simulated accurately. Additionally, we derived the equivalent internal current sources using the model and confirmed that their estimates were improved. In conclusion, we confirmed that the configuration of the linear equivalent circuit and our modeling method can be applied widely to microcontrollers with the same block configuration.
A New Method of Blind Source Separation Using Single-Channel ICA Based on Higher-Order Statistics
Directory of Open Access Journals (Sweden)
Guangkuo Lu
2015-01-01
Full Text Available Methods utilizing independent component analysis (ICA) give little practical guidance for separating single-channel real-world data, most of which are nonlinear, nonstationary, and even chaotic in many fields. To solve this problem, a three-step method is provided in this paper. In the first step, the measured signal, which is assumed to be a piecewise higher-order stationary time series, is divided into a series of higher-order stationary segments by applying a modified segmentation algorithm. Then the state space is reconstructed and the single-channel signal is transformed into a pseudo multiple input multiple output (MIMO) mode using a method of nonlinear analysis based on higher-order statistics (HOS). In the last step, ICA is performed on the pseudo MIMO data to decompose the single-channel recording into its underlying independent components (ICs), and the ICs of interest are then extracted. Finally, the effectiveness of the higher-order single-channel ICA (SCICA) method is validated on measured data through experiments. The proposed method is also shown, via explicit formulae and simulations, to be more robust across different SNRs and embedding dimensions.
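The key preprocessing step — turning one channel into a pseudo-MIMO signal by state-space reconstruction — can be sketched with time-delay embedding. The segmentation and HOS-based parameter selection of the paper are omitted, and the embedding dimension and delay here are arbitrary:

```python
import numpy as np

def delay_embed(x, dim, tau=1):
    """Reconstruct a pseudo multichannel (MIMO) signal from a single
    channel by time-delay embedding: row k is x delayed by k*tau samples."""
    n = len(x) - (dim - 1) * tau
    return np.stack([x[k * tau : k * tau + n] for k in range(dim)])

# Toy single-channel mixture of two narrowband components (synthetic data).
t = np.arange(2000) / 1000.0
x = np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 40 * t)

X = delay_embed(x, dim=4, tau=3)   # 4 pseudo-channels, ready for ICA
```

ICA would then be run on the rows of `X` as if they were simultaneously recorded channels, which is the "single-channel ICA" idea the abstract builds on.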
International Nuclear Information System (INIS)
This paper discusses the sources of radiation in the narrow perspective of radioactivity, and in the even narrower perspective of those sources that concern environmental management and restoration activities at DOE facilities, as well as a few related sources: sources of irritation, sources of inflammatory jingoism, and sources of information. First, the sources of irritation fall into three categories: no reliable scientific ombudsman to speak without bias and prejudice for the public good; technical jargon with unclear definitions within the radioactive nomenclature; and a scientific community that keeps a low profile with regard to public information. The next area of personal concern is the sources of inflammation. These include: plutonium being described as the most dangerous substance known to man; the amount of plutonium required to make a bomb; talk of transuranic waste containing plutonium and its health effects; TMI-2 and Chernobyl being described as Siamese twins; inadequate information on low-level disposal sites and current regulatory requirements under 10 CFR 61; and enhanced engineered waste disposal not being presented to the public accurately. Also discussed are the numerous sources of disinformation regarding low-level and high-level radiation, the elusive nature of the scientific community, the Federal and State health agencies' resources to address comparative risk, and regulatory agencies speaking out without the support of the scientific community
Shashilov, V. A.; Lednev, I. K.
2007-11-01
Amyloid fibrils are associated with many neurodegenerative diseases. The application of conventional biophysical techniques including solution NMR and X-ray crystallography for structural characterization of fibrils is limited because they are neither crystalline nor soluble. The Bayesian approach was utilized for extracting the deep UV resonance Raman (DUVRR) spectrum of the lysozyme fibrillar β-sheet based on the hydrogen-deuterium exchange spectral data. The problem was shown to be unsolvable when using blind source separation or conventional chemometrics methods because of the 100% correlation of the concentration profiles of the species under study. Information about the mixing process was incorporated by forcing the columns of the concentration matrix to be proportional to the expected concentration profiles. The ill-conditioning of the matrix was removed by concatenating it to the diagonal matrix with entries corresponding to the known pure spectra (sources). Prior information about the spectral features and characteristic bands of the spectra was taken into account using the Bayesian signal dictionary approach. The extracted DUVRR spectrum of the cross-β sheet core exhibited sharp bands indicating the highly ordered structure. Well resolved sub-bands in Amide I and Amide III regions enabled us to assign the fibril core structure to anti-parallel β-sheet and estimate the amide group facial angle Ψ in the cross-β structure. The elaborated Bayesian approach was demonstrated to be applicable for studying correlated biochemical processes.
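The stabilisation trick described above — an ill-conditioned unmixing problem made solvable by appending heavily weighted equations that tie one component to a known pure spectrum — can be sketched with synthetic data. The spectra, profiles, and weight below are invented for illustration and are not the DUVRR data of the paper:

```python
import numpy as np

# Spectral unmixing D = C @ S with a rank-deficient concentration matrix:
# the two concentration profiles are 100% correlated (proportional), so S
# cannot be recovered by least squares alone. Appending a heavily weighted
# row that pins component 0 to a known pure spectrum removes the degeneracy.
rng = np.random.default_rng(1)
n_ch = 200
s_known = np.abs(rng.normal(size=n_ch))       # known pure spectrum
s_target = np.abs(rng.normal(size=n_ch))      # spectrum to be extracted

c1 = np.linspace(1.0, 0.1, 20)                # concentration profile
C = np.column_stack([c1, 0.5 * c1])           # fully correlated columns
D = np.outer(c1, s_known) + np.outer(0.5 * c1, s_target)

w = 10.0                                      # constraint weight
C_aug = np.vstack([C, [w, 0.0]])              # extra row: w * S[0] = w * s_known
D_aug = np.vstack([D, w * s_known])
S_hat, *_ = np.linalg.lstsq(C_aug, D_aug, rcond=None)
```

With the augmentation, the least-squares solution pins the first row of `S_hat` to the known spectrum and the second row recovers the hidden one exactly in this noise-free toy.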
Directory of Open Access Journals (Sweden)
Hawkins Steve AC
2006-10-01
Full Text Available Abstract Background Given the theoretical proposal that bovine spongiform encephalopathy (BSE could have originated from sheep scrapie, this study investigated the pathogenicity for cattle, by intracerebral (i.c. inoculation, of two pools of scrapie agents sourced in Great Britain before and during the BSE epidemic. Two groups of ten cattle were each inoculated with pools of brain material from sheep scrapie cases collected prior to 1975 and after 1990. Control groups comprised five cattle inoculated with sheep brain free from scrapie, five cattle inoculated with saline, and for comparison with BSE, naturally infected cattle and cattle i.c. inoculated with BSE brainstem homogenate from a parallel study. Phenotypic characterisation of the disease forms transmitted to cattle was conducted by morphological, immunohistochemical, biochemical and biological methods. Results Disease occurred in 16 cattle, nine inoculated with the pre-1975 inoculum and seven inoculated with the post-1990 inoculum, with four cattle still alive at 83 months post challenge (as at June 2006. The different inocula produced predominantly two different disease phenotypes as determined by histopathological, immunohistochemical and Western immunoblotting methods and biological characterisation on transmission to mice, neither of which was identical to BSE. Whilst the disease presentation was uniform in all scrapie-affected cattle of the pre-1975 group, the post-1990 inoculum produced a more variable disease, with two animals sharing immunohistochemical and molecular profile characteristics with animals in the pre-1975 group. Conclusion The study has demonstrated that cattle inoculated with different pooled scrapie sources can develop different prion disease phenotypes, which were not consistent with the phenotype of BSE of cattle and whose isolates did not have the strain typing characteristics of the BSE agent on transmission to mice.
International Nuclear Information System (INIS)
Composting and digestion are important waste management strategies. However, the resulting products can contain significant amounts of organic pollutants such as polychlorinated biphenyls (PCBs) and polycyclic aromatic hydrocarbons (PAHs). In this study we followed the concentration changes of PCBs and PAHs during composting and digestion on field-scale for the first time. Concentrations of low-chlorinated PCBs increased during composting (about 30%), whereas a slight decrease was observed for the higher chlorinated congeners (about 10%). Enantiomeric fractions of atropisomeric PCBs were essentially racemic and stable over time. Levels of low-molecular-weight PAHs declined during composting (50-90% reduction), whereas high-molecular-weight compounds were stable. The PCBs and PAHs concentrations did not seem to vary during digestion. Source apportionment by applying characteristic PAH ratios and molecular markers in input material did not give any clear results. Some of these parameters changed considerably during composting. Hence, their diagnostic potential for finished compost must be questioned. - During field-scale composting, low molecular weight PCBs and PAHs increased and decreased, respectively, whereas high molecular weight compounds remained stable
Bayesian modeling using WinBUGS
Ntzoufras, Ioannis
2009-01-01
A hands-on introduction to the principles of Bayesian modeling using WinBUGS Bayesian Modeling Using WinBUGS provides an easily accessible introduction to the use of WinBUGS programming techniques in a variety of Bayesian modeling settings. The author provides an accessible treatment of the topic, offering readers a smooth introduction to the principles of Bayesian modeling with detailed guidance on the practical implementation of key principles. The book begins with a basic introduction to Bayesian inference and the WinBUGS software and goes on to cover key topics, including: Markov Chain Monte Carlo algorithms in Bayesian inference Generalized linear models Bayesian hierarchical models Predictive distribution and model checking Bayesian model and variable evaluation Computational notes and screen captures illustrate the use of both WinBUGS and R software to apply the discussed techniques. Exercises at the end of each chapter allow readers to test their understanding of the presented concepts and all ...
A Bayesian estimation of the helioseismic solar age
Bonanno, Alfio
2015-01-01
The helioseismic determination of the solar age has been a subject of several studies because it provides us with an independent estimation of the age of the solar system. We present the Bayesian estimates of the helioseismic age of the Sun, which are determined by means of calibrated solar models that employ different equations of state and nuclear reaction rates. We use 17 frequency separation ratios $r_{02}(n)=(\nu_{n,0}-\nu_{n-1,2})/(\nu_{n,1}-\nu_{n-1,1})$.
Probability biases as Bayesian inference
Directory of Open Access Journals (Sweden)
André C. R. Martins
2006-11-01
Full Text Available In this article, I will show how several observed biases in human probabilistic reasoning can be partially explained as good heuristics for making inferences in an environment where probabilities have uncertainties associated with them. Previous results show that the weight functions and the observed violations of coalescing and stochastic dominance can be understood from a Bayesian point of view. We will review those results and see that Bayesian methods should also be used as part of the explanation behind other known biases. That means that, although the observed errors are still errors, they can be understood as adaptations to the solution of real-life problems. Heuristics that allow fast evaluations and mimic a Bayesian inference would be an evolutionary advantage, since they would give us an efficient way of making decisions.
Bayesian Methods and Universal Darwinism
Campbell, John
2010-01-01
Bayesian methods since the time of Laplace have been understood by their practitioners as closely aligned to the scientific method. Indeed a recent champion of Bayesian methods, E. T. Jaynes, titled his textbook on the subject Probability Theory: the Logic of Science. Many philosophers of science including Karl Popper and Donald Campbell have interpreted the evolution of Science as a Darwinian process consisting of a 'copy with selective retention' algorithm abstracted from Darwin's theory of Natural Selection. Arguments are presented for an isomorphism between Bayesian Methods and Darwinian processes. Universal Darwinism, as the term has been developed by Richard Dawkins, Daniel Dennett and Susan Blackmore, is the collection of scientific theories which explain the creation and evolution of their subject matter as due to the operation of Darwinian processes. These subject matters span the fields of atomic physics, chemistry, biology and the social sciences. The principle of Maximum Entropy states that system...
Le Grandois, Julie; Marchioni, Eric; Zhao, Minjie; Giuffrida, Francesca; Ennahar, Saïd; Bindler, Françoise
2009-07-22
This study is a contribution to the exploration of natural phospholipid (PL) sources rich in long-chain polyunsaturated fatty acids (LC-PUFAs) with nutritional interest. Phosphatidylcholines (PCs) were purified from total lipid extracts of different food matrices, and their molecular species were separated and identified by liquid chromatography-electrospray ionization-tandem mass spectrometry (LC-ESI-MS(2)). Fragmentation of lithiated adducts allowed for the identification of fatty acids linked to the glycerol backbone. Soy PC was particularly rich in species containing essential fatty acids, such as (18:2-18:2)PC (34.0%), (16:0-18:2)PC (20.8%), and (18:1-18:2)PC (16.3%). PC from animal sources (ox liver and egg yolk) contained major molecular species, such as (16:0-18:2)PC, (16:0-18:1)PC, (18:0-18:2)PC, or (18:0-18:1)PC. Finally, marine source (krill oil), which was particularly rich in (16:0-20:5)PC and (16:0-22:6)PC, appeared to be an interesting potential source for food supplementation with LC-PUFA-PLs, particularly eicosapentaenoic acid (EPA) and docosahexaenoic acid (DHA). PMID:19545117
DOA estimation method based on blind source separation algorithm
Institute of Scientific and Technical Information of China (English)
徐先峰; 刘义艳; 段晨东
2012-01-01
A new DOA (direction-of-arrival) estimation method based on an algorithm for fast blind source separation (FBSS-DOA) is proposed in this paper. A group of correlation matrices possessing a diagonal structure is generated, and a joint-diagonalization cost function for blind source separation is introduced. A fast multiplicative iterative algorithm in the complex-valued domain is used to minimize this cost function, yielding an estimate of the demixing matrix from which the DOA estimates are obtained. Compared with familiar algorithms, the proposed algorithm has wider applicability and better estimation performance. Simulation results verify its fast convergence and superior estimation performance.
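Once a blind source separation stage has estimated the mixing matrix, each of its columns acts as a steering vector from which a DOA follows. The sketch below shows only that final step for a uniform linear array, assuming far-field narrowband sources and half-wavelength spacing; the FBSS joint-diagonalization stage itself is not reproduced:

```python
import numpy as np

def doa_from_column(a, d_over_lambda=0.5):
    """DOA (degrees) from one estimated mixing-matrix column of a uniform
    linear array: the inter-sensor phase slope is -2*pi*(d/lambda)*sin(theta)."""
    phase = np.unwrap(np.angle(a * np.conj(a[0])))   # phase relative to sensor 0
    slope = np.polyfit(np.arange(len(a)), phase, 1)[0]
    return np.degrees(np.arcsin(-slope / (2.0 * np.pi * d_over_lambda)))

M = 8                                            # array sensors
theta_true = 25.0                                # degrees (hypothetical source)
k = -2.0 * np.pi * 0.5 * np.sin(np.radians(theta_true))
a = np.exp(1j * k * np.arange(M))                # ideal steering column
theta_hat = doa_from_column(a)
```

In practice `a` would be a column of the (noisy) mixing-matrix estimate produced by the separation algorithm rather than an ideal steering vector.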
Bayesian methods for proteomic biomarker development
Directory of Open Access Journals (Sweden)
Belinda Hernández
2015-12-01
In this review we provide an introduction to Bayesian inference and demonstrate some of the advantages of using a Bayesian framework. We summarize how Bayesian methods have been used previously in proteomics and other areas of bioinformatics. Finally, we describe some popular and emerging Bayesian models from the statistical literature and provide a worked tutorial including code snippets to show how these methods may be applied for the evaluation of proteomic biomarkers.
Bayesian variable selection with spherically symmetric priors
De Kock, M B
2014-01-01
We propose that Bayesian variable selection for linear parametrisations with Gaussian iid likelihoods be based on the spherical symmetry of the diagonalised parameter space. This reduces the multidimensional parameter space problem to one dimension without the need for conjugate priors. Combining this likelihood with what we call the r-prior results in a framework in which we can derive closed forms for the evidence, posterior and characteristic function for four different r-priors, including the hyper-g prior and the Zellner-Siow prior, which are shown to be special cases of our r-prior. Two scenarios of a single variable dispersion parameter and of fixed dispersion are studied separately, and asymptotic forms comparable to the traditional information criteria are derived. In a simple simulation exercise, we find that model comparison based on our uniform r-prior appears to fare better than the current model comparison schemes.
Bayesian test and Kuhn's paradigm
Institute of Scientific and Technical Information of China (English)
Chen Xiaoping
2006-01-01
Kuhn's theory of paradigm reveals a pattern of scientific progress in which normal science alternates with scientific revolution. But Kuhn greatly underrated the function of scientific testing in his pattern, because he focused all his attention on the hypothetico-deductive schema instead of the Bayesian schema. This paper employs the Bayesian schema to re-examine Kuhn's theory of paradigm, to uncover its logical and rational components, and to illustrate the tension between logic and belief, rationality and irrationality, in the process of scientific revolution.
3D Bayesian contextual classifiers
DEFF Research Database (Denmark)
Larsen, Rasmus
2000-01-01
We extend a series of multivariate Bayesian 2-D contextual classifiers to 3-D by specifying a simultaneous Gaussian distribution for the feature vectors as well as a prior distribution of the class variables of a pixel and its 6 nearest 3-D neighbours.
Bayesian Model Averaging for Propensity Score Analysis
Kaplan, David; Chen, Jianshen
2013-01-01
The purpose of this study is to explore Bayesian model averaging in the propensity score context. Previous research on Bayesian propensity score analysis does not take into account model uncertainty. In this regard, an internally consistent Bayesian framework for model building and estimation must also account for model uncertainty. The…
Bayesian networks and food security - An introduction
Stein, A.
2004-01-01
This paper gives an introduction to Bayesian networks. Networks are defined and put into a Bayesian context. Directed acyclic graphs play a crucial role here. Two simple examples from food security are addressed. Possible uses of Bayesian networks for implementation and further use in decision support are discussed.
Bayesian variable order Markov models: Towards Bayesian predictive state representations
C. Dimitrakakis
2009-01-01
We present a Bayesian variable order Markov model that shares many similarities with predictive state representations. The resulting models are compact and much easier to specify and learn than classical predictive state representations. Moreover, we show that they significantly outperform a more st
Phycas: software for Bayesian phylogenetic analysis.
Lewis, Paul O; Holder, Mark T; Swofford, David L
2015-05-01
Phycas is open source, freely available Bayesian phylogenetics software written primarily in C++ but with a Python interface. Phycas specializes in Bayesian model selection for nucleotide sequence data, particularly the estimation of marginal likelihoods, central to computing Bayes Factors. Marginal likelihoods can be estimated using newer methods (Thermodynamic Integration and Generalized Steppingstone) that are more accurate than the widely used Harmonic Mean estimator. In addition, Phycas supports two posterior predictive approaches to model selection: Gelfand-Ghosh and Conditional Predictive Ordinates. The General Time Reversible family of substitution models, as well as a codon model, are available, and data can be partitioned with all parameters unlinked except tree topology and edge lengths. Phycas provides for analyses in which the prior on tree topologies allows polytomous trees as well as fully resolved trees, and provides for several choices for edge length priors, including a hierarchical model as well as the recently described compound Dirichlet prior, which helps avoid overly informative induced priors on tree length. PMID:25577605
Merging Digital Surface Models Implementing Bayesian Approaches
Sadeq, H.; Drummond, J.; Li, Z.
2016-06-01
In this research, DSMs from different sources have been merged. The merging is based on a probabilistic model using a Bayesian approach. The input data were sourced from very high resolution satellite imagery sensors (e.g., WorldView-1 and Pleiades). A Bayesian approach is preferable when the data obtained from the sensors are limited and it is difficult or very costly to obtain many measurements; the lack of data can then be compensated by introducing a priori estimates. To infer the prior, the roofs of the buildings are assumed to be smooth, and for that purpose local entropy has been implemented. In addition to the a priori estimates, GNSS RTK measurements have been collected in the field and used as check points to assess the quality of the DSMs and to validate the merging result. The model has been applied to the West End of Glasgow, which contains different kinds of buildings, such as flat-roofed and hipped-roofed buildings. Both quantitative and qualitative methods have been employed to validate the merged DSM. The validation results show that the model successfully improved the quality of the DSMs and some of their characteristics, such as the roof surfaces, leading to better representations. In addition, the developed model has been compared with the well-established maximum likelihood model and showed similar quantitative statistical results and better qualitative results. Although the proposed model has been applied to DSMs derived from satellite imagery, it can be applied to DSMs from any other source.
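The paper's entropy-based prior is not reproduced here, but the core of Bayesian merging of independent Gaussian height estimates is a precision-weighted average. A minimal sketch, with hypothetical heights, variances, and prior:

```python
def fuse(measurements, prior_mean, prior_var):
    """Precision-weighted Bayesian fusion of independent Gaussian estimates."""
    precision = 1.0 / prior_var
    weighted = prior_mean / prior_var
    for value, var in measurements:
        weighted += value / var
        precision += 1.0 / var
    return weighted / precision, 1.0 / precision

# Two DSM heights (m) for the same cell with their error variances, plus a
# smoothness-based prior guess; all numbers are hypothetical.
mean, var = fuse([(102.4, 1.0), (101.8, 0.25)], prior_mean=102.0, prior_var=4.0)
print(round(mean, 2), round(var, 3))  # 101.92 0.19
```

The fused estimate leans toward the more precise measurement, and its variance is smaller than any single input's, which is the quantitative sense in which merging "improves the quality" of the DSMs.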
Bayesian Analysis of Experimental Data
Directory of Open Access Journals (Sweden)
Lalmohan Bhar
2013-10-01
The analysis of experimental data from a Bayesian point of view is considered, and an appropriate methodology is developed for application to designed experiments. A Normal-Gamma distribution is used as the prior distribution. The developed methodology is applied to real experimental data taken from long-term fertilizer experiments.
Bayesian image restoration, using configurations
DEFF Research Database (Denmark)
Thorarinsdottir, Thordis Linda
2006-01-01
configurations are expressed in terms of the mean normal measure of the random set. These probabilities are used as prior probabilities in a Bayesian image restoration approach. Estimation of the remaining parameters in the model is outlined for the salt and pepper noise. The inference in the model is discussed...
ANALYSIS OF BAYESIAN CLASSIFIER ACCURACY
Directory of Open Access Journals (Sweden)
Felipe Schneider Costa
2013-01-01
The naïve Bayes classifier is considered one of the most effective classification algorithms today, competing with more modern and sophisticated classifiers. Despite being based on the unrealistic (naïve) assumption that all variables are independent given the output class, the classifier provides proper results. However, depending on the scenario (network structure, number of samples or training cases, number of variables), the network may not provide appropriate results. This study uses a variable-selection process based on the chi-squared test to verify the existence of dependence between variables in the data model, in order to identify the reasons that prevent a Bayesian network from performing well. A detailed analysis of the data is also proposed, unlike in other existing work, as well as adjustments for values on the boundary between two adjacent classes. Furthermore, variable weights, computed with the mutual information function, are used in the calculation of the posterior probabilities. Tests were applied to both a naïve Bayesian network and a hierarchical Bayesian network. After testing, a significant reduction in the error rate was observed: the naïve Bayesian network's error rate dropped from twenty-five percent to five percent relative to the initial classification results, while in the hierarchical network the error rate not only dropped by fifteen percent but reached zero.
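The study's weighting scheme and datasets are not reproduced here; what follows is only a minimal sketch of the two ingredients it combines, chi-squared screening of categorical features followed by a Laplace-smoothed naïve Bayes, on an invented toy dataset:

```python
import math
from collections import Counter, defaultdict
from itertools import product

def chi2_stat(column, labels):
    """Chi-squared statistic of independence between one feature column and the class."""
    n = len(labels)
    joint = Counter(zip(column, labels))
    fx, fy = Counter(column), Counter(labels)
    return sum((joint.get((x, y), 0) - fx[x] * fy[y] / n) ** 2 / (fx[x] * fy[y] / n)
               for x, y in product(fx, fy))

class NaiveBayes:
    """Categorical naive Bayes with Laplace smoothing."""
    def fit(self, rows, labels):
        self.n, self.classes = len(labels), Counter(labels)
        self.counts = defaultdict(Counter)   # (feature index, class) -> value counts
        self.values = defaultdict(set)       # feature index -> seen values
        for row, y in zip(rows, labels):
            for i, v in enumerate(row):
                self.counts[(i, y)][v] += 1
                self.values[i].add(v)
        return self

    def predict(self, row):
        def log_post(c):
            score = math.log(self.classes[c] / self.n)
            for i, v in enumerate(row):
                score += math.log((self.counts[(i, c)][v] + 1)
                                  / (self.classes[c] + len(self.values[i])))
            return score
        return max(self.classes, key=log_post)

# Toy data: feature 0 determines the class, feature 1 is noise.
rows = [("red", "x"), ("red", "y"), ("blue", "x"), ("blue", "y")]
labels = ["spam", "spam", "ham", "ham"]

# Chi-squared screening: keep columns whose statistic exceeds a (hypothetical) threshold.
columns = list(zip(*rows))
kept = [i for i, col in enumerate(columns) if chi2_stat(col, labels) > 1.0]
model = NaiveBayes().fit([[row[i] for i in kept] for row in rows], labels)
print(kept, model.predict(["red"]))
```

Here the uninformative column scores 0 on the chi-squared statistic and is dropped before training, which is the dependence check the abstract describes; the 1.0 threshold is an arbitrary stand-in for a proper critical value.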
Bayesian Agglomerative Clustering with Coalescents
Teh, Yee Whye; Daumé III, Hal; Roy, Daniel
2009-01-01
We introduce a new Bayesian model for hierarchical clustering based on a prior over trees called Kingman's coalescent. We develop novel greedy and sequential Monte Carlo inferences which operate in a bottom-up agglomerative fashion. We show experimentally the superiority of our algorithms over others, and demonstrate our approach in document clustering and phylolinguistics.
Bayesian Networks and Influence Diagrams
DEFF Research Database (Denmark)
Kjærulff, Uffe Bro; Madsen, Anders Læsø
Bayesian Networks and Influence Diagrams: A Guide to Construction and Analysis, Second Edition, provides a comprehensive guide for practitioners who wish to understand, construct, and analyze intelligent systems for decision support based on probabilistic networks. This new edition contains six new...
Topics in Bayesian statistics and maximum entropy
International Nuclear Information System (INIS)
Notions of Bayesian decision theory and maximum entropy methods are reviewed with particular emphasis on probabilistic inference and Bayesian modeling. The axiomatic approach is considered as the best justification of Bayesian analysis and maximum entropy principle applied in natural sciences. Particular emphasis is put on solving the inverse problem in digital image restoration and Bayesian modeling of neural networks. Further topics addressed briefly include language modeling, neutron scattering, multiuser detection and channel equalization in digital communications, genetic information, and Bayesian court decision-making. (author)
2015-01-01
Sources: Fundación Pablo Iglesias, Alcalá de Henares. Sections: private archives of Manuel Arija; external archives; archives of the FNJS de España; press. General Archives of the Administration, Alcalá de Henares. Sections: opposition to Francoism; Sig. 653; Sig TOP 82/68.103-68.602; Índice de las cartas colectivas, Relaciones, Cartas al Ministro de Información de Marzo de 1965, c.662. Film sources: National Film Archive of Spain, NO.DO. No. 1157C, 08/03/1965; Aguirre Javier, Blanco vertical....
Bayesian analysis of rare events
Straub, Daniel; Papaioannou, Iason; Betz, Wolfgang
2016-06-01
In many areas of engineering and science there is an interest in predicting the probability of rare events, in particular in applications related to safety and security. Increasingly, such predictions are made through computer models of physical systems in an uncertainty quantification framework. Additionally, with advances in IT, monitoring and sensor technology, an increasing amount of data on the performance of the systems is collected. This data can be used to reduce uncertainty, improve the probability estimates and consequently enhance the management of rare events and associated risks. Bayesian analysis is the ideal method to include the data into the probabilistic model. It ensures a consistent probabilistic treatment of uncertainty, which is central in the prediction of rare events, where extrapolation from the domain of observation is common. We present a framework for performing Bayesian updating of rare event probabilities, termed BUS. It is based on a reinterpretation of the classical rejection-sampling approach to Bayesian analysis, which enables the use of established methods for estimating probabilities of rare events. By drawing upon these methods, the framework makes use of their computational efficiency. These methods include the First-Order Reliability Method (FORM), tailored importance sampling (IS) methods and Subset Simulation (SuS). In this contribution, we briefly review these methods in the context of the BUS framework and investigate their applicability to Bayesian analysis of rare events in different settings. We find that, for some applications, FORM can be highly efficient and is surprisingly accurate, enabling Bayesian analysis of rare events with just a few model evaluations. In a general setting, BUS implemented through IS and SuS is more robust and flexible.
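The BUS framework pairs this reinterpretation with FORM, importance sampling, or Subset Simulation; a bare-bones version of the underlying rejection-sampling idea can be sketched on a toy one-dimensional Gaussian model. All numbers (prior, noise level, observation, rare-event threshold) are illustrative assumptions:

```python
import math
import random

# Toy model: prior theta ~ N(0, 1); one noisy observation y = theta + eps,
# eps ~ N(0, 0.5^2), with y = 1.0 observed. Rare event of interest: theta > 2.5.

def likelihood(theta, y=1.0, sigma=0.5):
    # Unnormalized Gaussian likelihood, bounded above by c = 1.
    return math.exp(-0.5 * ((y - theta) / sigma) ** 2)

random.seed(0)
posterior = []
for _ in range(100_000):
    theta = random.gauss(0.0, 1.0)              # sample from the prior
    if random.random() < likelihood(theta):     # accept with probability L / c
        posterior.append(theta)

mean = sum(posterior) / len(posterior)          # near the analytic value 0.8
p_rare = sum(t > 2.5 for t in posterior) / len(posterior)
```

The accepted samples follow the posterior, so the rare-event probability is estimated by a simple count; the inefficiency of plain rejection sampling for very rare events is exactly what motivates replacing this loop with FORM, IS, or SuS in BUS.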
Bayesian methods for measures of agreement
Broemeling, Lyle D
2009-01-01
Using WinBUGS to implement Bayesian inferences of estimation and testing hypotheses, Bayesian Methods for Measures of Agreement presents useful methods for the design and analysis of agreement studies. It focuses on agreement among the various players in the diagnostic process.The author employs a Bayesian approach to provide statistical inferences based on various models of intra- and interrater agreement. He presents many examples that illustrate the Bayesian mode of reasoning and explains elements of a Bayesian application, including prior information, experimental information, the likelihood function, posterior distribution, and predictive distribution. The appendices provide the necessary theoretical foundation to understand Bayesian methods as well as introduce the fundamentals of programming and executing the WinBUGS software.Taking a Bayesian approach to inference, this hands-on book explores numerous measures of agreement, including the Kappa coefficient, the G coefficient, and intraclass correlation...
Plug & Play object oriented Bayesian networks
DEFF Research Database (Denmark)
Bangsø, Olav; Flores, J.; Jensen, Finn Verner
2003-01-01
Object oriented Bayesian networks have proven themselves useful in recent years. The idea of applying an object oriented approach to Bayesian networks has extended their scope to larger domains that can be divided into autonomous but interrelated entities. Object oriented Bayesian networks have been shown to be quite suitable for dynamic domains as well. However, processing object oriented Bayesian networks in practice does not take advantage of their modular structure. Normally the object oriented Bayesian network is transformed into a Bayesian network, and inference is performed by constructing a junction tree from this network. In this paper we propose a method for translating directly from object oriented Bayesian networks to junction trees, avoiding the intermediate translation. We pursue two main purposes: firstly, to maintain the original structure organized in an instance tree...
Energy Technology Data Exchange (ETDEWEB)
Amirov, R. Kh., E-mail: ravus46@yandex.ru; Vorona, N. A.; Gavrikov, A. V.; Lizyakin, G. D.; Polishchuk, V. P.; Samoilov, I. S.; Smirnov, V. P.; Usmanov, R. A.; Yartsev, I. M. [Russian Academy of Sciences, Joint Institute for High Temperatures (Russian Federation)
2015-10-15
Results from experimental studies of a vacuum arc with a distributed cathode spot on the heated cathode are presented. Such an arc can be used as a plasma source for plasma separation of spent nuclear fuel and radioactive waste. The experiments were performed with a gadolinium cathode, the properties of which are similar to those of a uranium arc cathode. The heat flux from the plasma to the cathode (and its volt equivalent) at discharge voltages of 4-15 V and discharge currents of 44-81 A, the radial distribution of the emission intensity of gadolinium atoms and singly charged ions in the arc channel at a voltage of 4.3 V, and the plasma electron temperature behind the anode were measured. The average charge of plasma ions at arc voltages of 3.5-8 V and a discharge current of 52 A and the average rate of gadolinium evaporation in the discharge were also determined.
Schmidt, M; Peng, H; Zschornack, G; Sykora, S
2009-06-01
A Wien filter was designed for and tested with a room temperature electron beam ion source (EBIS). Xenon charge state spectra up to the charge state Xe46+ were resolved, as were the isotopes of krypton, using apertures of different sizes. The complete setup consisting of an EBIS and a Wien filter has a length of less than 1 m, replacing a complete classical beamline setup. The Wien filter is equipped with removable permanent magnets, so total beam current measurements are possible by simply removing the permanent magnets. Depending on the required resolution, a weak (0.2 T) or a strong (0.5 T) magnet setup can be used. In this paper the principle of operation and the design of the Wien filter meeting the requirements of an EBIS are briefly discussed. The first ion beam extraction and separation experiments with a Dresden EBIS are presented. PMID:19566197
Tsyganov, Y S
2015-01-01
The general philosophy behind detecting rare events in the recent experiments with a 48Ca projectile at the Dubna Gas-Filled Recoil Separator (DGFRS), aimed at the synthesis of superheavy elements (SHE), is reviewed. Specific instruments and methods are considered, as are some historical sources of the successful experiments for Z = 112-118. Special attention is paid to the application of the method of active correlations in heavy-ion-induced complete-fusion nuclear reactions, with an example of its application in the Z = 115 experiment. A brief description of the 243Am + 48Ca -> 291-x115 + xn experiment is also presented. Some attention is paid to the role of chemical experiments in the discoveries of SHEs. The DGFRS detection/monitoring system is presented here in full for the first time.
Blind source separation of ship-radiated noise using a single observing channel
Institute of Scientific and Technical Information of China (English)
刘佳; 杨士莪; 朴胜春; 黄益旺
2011-01-01
A method of blind source separation for ship-radiated noise based on a single observing channel is proposed. According to the spatial characteristics of ship-radiated noise in the far field, virtual channels are constructed from the observed channel by time delay and filtering, converting the single channel into multiple channels and overcoming the limitation on the number of channels. Simulation and experimental results show that the correlation coefficients of the separated signals improve consistently at different signal-to-noise ratios, indicating that the method can, to a certain extent, separate ship-radiated noise from the ocean ambient noise background using a single observing channel; the experimental data also show that the method has some effect in separating two target ships.
Amirov, R. Kh.; Vorona, N. A.; Gavrikov, A. V.; Liziakin, G. D.; Polistchook, V. P.; Samoylov, I. S.; Smirnov, V. P.; Usmanov, R. A.; Yartsev, I. M.
2015-12-01
One of the key problems in the development of plasma separation technology is designing a plasma source which uses condensed spent nuclear fuel (SNF) or nuclear wastes as a raw material. This paper covers the experimental study of the evaporation and ionization of model materials (gadolinium, niobium oxide, and titanium oxide). For these purposes, a vacuum arc with a heated cathode on the studied material was initiated and its parameters in different regimes were studied. During the experiment, the cathode temperature, arc current, arc voltage, and plasma radiation spectra were measured, and also probe measurements were carried out. It was found that the increase in the cathode heating power leads to the decrease in the arc voltage (to 3 V). This fact makes it possible to reduce the electron energy and achieve singly ionized plasma with a high degree of ionization to fulfill one of the requirements for plasma separation of SNF. This finding is supported by the analysis of the plasma radiation spectrum and the results of the probe diagnostics.
Huang, Pei; Mukherji, Sachiyo T; Wu, Sha; Muller, James; Goel, Ramesh
2016-10-15
Recently, research on source separation followed by the treatment of urine and/or resource recovery from human urine has shown promise as an emerging management strategy. Despite contributing only 1% of the total volume of wastewater, human urine contributes about 80% of the nitrogen, 70% of the potassium, and up to 50% of the total phosphorus in wastewater. It is also a known fact that many of the micropollutants, especially selected estrogens, get into municipal wastewater through urine excretion. In this research, we investigated the fate of 17β-estradiol (E2) as a model estrogen during struvite precipitation from synthetic urine followed by the treatment of urine using a partial nitritation-anammox (PN/A) system. Single-stage and two-stage suspended growth PN/A configurations were used to remove the nitrogen in urine after struvite precipitation. The results showed an almost 95% phosphorous and 5% nitrogen recovery/removal from the synthetic urine due to struvite precipitation. The single and two stage PN/A processes were able to remove around 50% and 75% of ammonia and nitrogen present in the post struvite urine solution, respectively. After struvite precipitation, more than 95% of the E2 remained in solution and the transformation of E2 to E1 happened during urine storage. Most of the E2 removal that occurred during the PN/A process was due to sorption on the biomass and biodegradation (transformation of E2 to E1, and slow degradation of E1 to other metabolites). These results demonstrate that a combination of chemical and biological unit processes will be needed to recover and manage nutrients in source separated urine. PMID:27566951
Bayesian calibration of simultaneity in audiovisual temporal order judgments.
Directory of Open Access Journals (Sweden)
Shinya Yamamoto
After repeated exposures to two successive audiovisual stimuli presented in one frequent order, participants eventually perceive a pair separated by some lag time in the same order as occurring simultaneously (lag adaptation). In contrast, we previously found that perceptual changes occurred in the opposite direction in response to tactile stimuli, conforming to Bayesian integration theory (Bayesian calibration). We further showed, in theory, that the effect of Bayesian calibration cannot be observed when the lag adaptation is fully operational. This led to the hypothesis that Bayesian calibration affects judgments regarding the order of audiovisual stimuli, but that this effect is concealed behind the lag adaptation mechanism. In the present study, we showed that lag adaptation is pitch-insensitive using two sounds at 1046 and 1480 Hz. This enabled us to cancel lag adaptation by associating one pitch with sound-first stimuli and the other with light-first stimuli. When we presented each type of stimulus (high- or low-tone) in a different block, the point of simultaneity shifted to "sound-first" for the pitch associated with sound-first stimuli, and to "light-first" for the pitch associated with light-first stimuli. These results are consistent with lag adaptation. In contrast, when we delivered each type of stimulus in a randomized order, the point of simultaneity shifted to "light-first" for the pitch associated with sound-first stimuli, and to "sound-first" for the pitch associated with light-first stimuli. The results clearly show that Bayesian calibration is pitch-specific and is at work behind pitch-insensitive lag adaptation during temporal order judgment of audiovisual stimuli.
Confirmation via Analogue Simulation: A Bayesian Analysis
Dardashti, Radin; Thebault, Karim P Y; Winsberg, Eric
2016-01-01
Analogue simulation is a novel mode of scientific inference found increasingly within modern physics, and yet all but neglected in the philosophical literature. Experiments conducted upon a table-top 'source system' are taken to provide insight into features of an inaccessible 'target system', based upon a syntactic isomorphism between the relevant modelling frameworks. An important example is the use of acoustic 'dumb hole' systems to simulate gravitational black holes. In a recent paper it was argued that there exist circumstances in which confirmation via analogue simulation can obtain; in particular when the robustness of the isomorphism is established via universality arguments. The current paper supports these claims via an analysis in terms of Bayesian confirmation theory.
Flexible Bayesian Nonparametric Priors and Bayesian Computational Methods
Zhu, Weixuan
2016-01-01
The definition of vectors of dependent random probability measures is a topic of interest in Bayesian nonparametrics. They represent dependent nonparametric prior distributions that are useful for modelling observables for which specific covariate values are known. Our first contribution is the introduction of novel multivariate vectors of two-parameter Poisson-Dirichlet processes. The dependence is induced by applying a Lévy copula to the marginal Lévy intensities. Our attenti...
Bayesian decision making in human collectives with binary choices
Eguíluz, Víctor M; Fernández-Gracia, J
2015-01-01
Here we focus on the description of the mechanisms behind the process of information aggregation and decision making, a basic step to understand emergent phenomena in society, such as trends, information spreading or the wisdom of crowds. In many situations, agents choose between discrete options. We analyze experimental data on binary opinion choices in humans. The data consists of two separate experiments in which humans answer questions with a binary response, where one is correct and the other is incorrect. The questions are answered without and with information on the answers of some previous participants. We find that a Bayesian approach captures the probability of choosing one of the answers. The influence of peers is uncorrelated with the difficulty of the question. The data is inconsistent with Weber's law, which states that the probability of choosing an option depends on the proportion of previous answers choosing that option and not on the total number of those answers. Last, the present Bayesian ...
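The paper's fitted model is not reproduced here, but the Bayesian combination of a private signal with the answers of previous participants can be sketched under simple invented assumptions: each peer is independently correct with probability `peer_acc`, and the private signal with probability `private_acc` (both values hypothetical):

```python
def posterior_correct(k, n, private_says_A, peer_acc=0.6, private_acc=0.7, prior=0.5):
    """P(option A is correct | private signal and k of n previous answers chose A).

    Assumes conditionally independent peers and private signal; all reliability
    parameters are illustrative, not estimates from the experimental data.
    """
    like_A = (peer_acc ** k * (1 - peer_acc) ** (n - k)
              * (private_acc if private_says_A else 1 - private_acc))
    like_B = ((1 - peer_acc) ** k * peer_acc ** (n - k)
              * ((1 - private_acc) if private_says_A else private_acc))
    return prior * like_A / (prior * like_A + (1 - prior) * like_B)

# With no peers, only the private signal counts (~0.7); enough opposing
# peers can overturn it.
print(posterior_correct(0, 0, True))   # ~0.7
print(posterior_correct(0, 5, True))   # below 0.5: peers outweigh the signal
```

Note that this posterior depends on both k and n, not merely on the proportion k/n, which matches the abstract's observation that the data are inconsistent with a proportion-only (Weber-type) rule.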